Batch download files from multiple different folders in the same S3 bucket


I have a CSV file with a column listing the names of the files I would like to download. These files are stored in multiple different folders in the same S3 bucket. Is there a way to pass the file names (either in a text file or a CSV column), have AWS search all the folders inside this bucket, and download the files needed? I have also already configured the AWS CLI, if there's an option there. Thanks!

1 Answer

Here is a link showing how to download a file from S3 using Python: https://boto3.amazonaws.com/v1/documentation/api/1.9.42/guide/s3-example-download-file.html Try it out and see if you are able to download single files. You can then pass a text file to the Python program and, for each filename in the text file, run the above code to download the file. Hope it helps.
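For illustration, here is a minimal sketch of that approach (the bucket name, CSV path, column name, and destination directory are assumptions to adjust): it lists every object in the bucket once, indexes the keys by base file name so a file can be matched in any folder, and then downloads each file named in the CSV column.

```python
import csv
import os

import boto3

# All names below are placeholders for illustration: adjust the bucket,
# CSV path, column name, and destination directory to your setup.
BUCKET = "mybucket"
CSV_PATH = "files.csv"
COLUMN = "filename"
DEST_DIR = "downloads"

s3 = boto3.client("s3")

# List every object in the bucket once and index keys by base file name,
# so a bare name from the CSV can be matched to its full key in any folder.
index = {}
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        index.setdefault(os.path.basename(obj["Key"]), []).append(obj["Key"])

os.makedirs(DEST_DIR, exist_ok=True)

with open(CSV_PATH, newline="") as f:
    for row in csv.DictReader(f):
        name = row[COLUMN]
        keys = index.get(name)
        if not keys:
            print(f"Not found in bucket: {name}")
            continue
        for key in keys:
            # If the same name exists in several folders, the last copy
            # downloaded wins; include the key's prefix in the local path
            # if you need to keep them all.
            s3.download_file(BUCKET, key, os.path.join(DEST_DIR, name))
```

Listing the whole bucket up front avoids a per-file search (and a failed HeadObject for every miss); for very large buckets you may prefer to narrow the listing with a Prefix.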

answered 2 years ago
  • Thank you for your answer, Sandeep. I tried specifying my bucket, key, and filename exactly as in the example (s3.Bucket('mybucket').download_file('hello.txt', '/tmp/hello.txt')), but I kept running into this: ClientError: An error occurred (404) when calling the HeadObject operation: Not Found. So the 'hello.txt' in their example is a single file, stored in the folder 'tmp' inside the bucket called 'mybucket', correct? Just trying to understand the file structure here.
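  • For reference on that 404: in the example, the second argument, '/tmp/hello.txt', is the local path the file is saved to, not a folder inside the bucket. The first argument is the object's full key, so if the file sits inside a folder in the bucket, the key must include that prefix, or HeadObject returns 404. A minimal sketch, using 'reports/hello.txt' as a hypothetical key under a folder:

```python
import boto3

s3 = boto3.resource("s3")
# First argument: the full S3 object key, including any folder prefix.
# Second argument: the local path to save the file to.
# 'mybucket' and 'reports/hello.txt' are placeholders for illustration.
s3.Bucket("mybucket").download_file("reports/hello.txt", "/tmp/hello.txt")
```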
