Batch download files from multiple different folders in the same S3 bucket


I have a CSV file with a column specifying the names of the files that I would like to download. These files are stored in multiple different folders in the same S3 bucket. Is there a way I could download these files by passing the file names (either in a text file or a CSV column), have AWS search all the folders inside this S3 bucket, and download the needed files? I have also already configured the AWS CLI, in case there's an option there. Thanks!

1 Answer

Here is a link to the documentation for downloading a file from S3 using Python: https://boto3.amazonaws.com/v1/documentation/api/1.9.42/guide/s3-example-download-file.html Try it out and see if you are able to download single files. You can then pass a text file to a Python program and, for each filename in the text file, run the above code to download the file. Hope it helps.
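A minimal sketch of that approach is below. The bucket name, CSV path, and column name are placeholders, and it assumes the CSV lists bare filenames without folder prefixes, so it first lists the bucket once to map each filename to its full S3 key (this extends the linked single-file example to cover files spread across folders):

```python
import csv

import boto3

# Placeholder names -- replace with your own bucket, CSV file, and column.
BUCKET_NAME = "mybucket"
CSV_PATH = "files_to_download.csv"
FILENAME_COLUMN = "filename"

s3 = boto3.resource("s3")
bucket = s3.Bucket(BUCKET_NAME)

# Build a lookup of bare filename -> full S3 key by listing every object
# in the bucket, so a file can be found regardless of which folder it sits in.
key_by_name = {}
for obj in bucket.objects.all():
    name = obj.key.rsplit("/", 1)[-1]  # strip the folder prefix
    key_by_name[name] = obj.key

# Read the filenames from the CSV column and download each matching object.
with open(CSV_PATH, newline="") as f:
    for row in csv.DictReader(f):
        name = row[FILENAME_COLUMN]
        key = key_by_name.get(name)
        if key is None:
            print(f"Not found in bucket: {name}")
            continue
        bucket.download_file(key, name)  # saves into the current directory
        print(f"Downloaded {key} -> {name}")
```

Listing the bucket once and resolving keys up front avoids calling download_file with an incomplete key (which fails with a 404 HeadObject error); for very large buckets the initial listing can take a while.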

Answered 2 years ago
  • Thank you for your answer, Sandeep. I tried specifying my bucket, key, and filename just as shown in this example - s3.Bucket('mybucket').download_file('hello.txt', '/tmp/hello.txt') - but I kept running into this: ClientError: An error occurred (404) when calling the HeadObject operation: Not Found. So the 'hello.txt' in their example is a single file, stored in the folder 'tmp' inside the bucket called 'mybucket', correct? Just trying to understand the file structure here.
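For reference, in Bucket.download_file the first argument is the S3 object key (which must include any folder prefix inside the bucket) and the second is the local destination path on your machine. A minimal sketch, with a hypothetical key and bucket name:

```python
import boto3

s3 = boto3.resource("s3")

# First argument: the S3 object key, including any folder ("prefix") inside
# the bucket -- e.g. "reports/2021/hello.txt" if the file sits in that folder.
# Second argument: the local path where the downloaded file is saved.
# A missing or misspelled key is what produces the 404 HeadObject error.
s3.Bucket("mybucket").download_file("reports/2021/hello.txt", "/tmp/hello.txt")
```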
