Batch download files from multiple different folders in the same S3 bucket


I have a CSV file with a column listing the names of the files I would like to download. These files are stored in multiple different folders within the same S3 bucket. Is there a way to download them by passing the file names (either in a text file or a CSV column), have AWS search all the folders inside the bucket, and download the matching files? I have also already configured the AWS CLI, if there is an option there. Thanks!

1 Answer

Here is a link showing how to download a file from S3 using Python: https://boto3.amazonaws.com/v1/documentation/api/1.9.42/guide/s3-example-download-file.html Try it out and see if you can download single files. You can then pass a text file to a Python program and, for each filename in the text file, run the above code to download the file. Hope it helps.
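A minimal sketch of that approach, assuming the bucket is named mybucket and the CSV has a column called filename (both placeholders). Since only the file names are known, not their folder prefixes, the sketch lists the bucket once and indexes the keys by base name:

```python
import csv
import os

import boto3

BUCKET = "mybucket"      # placeholder: your bucket name
CSV_PATH = "files.csv"   # placeholder: CSV with a 'filename' column
LOCAL_DIR = "downloads"  # local folder to save into

s3 = boto3.client("s3")

# Collect the wanted file names from the CSV column.
with open(CSV_PATH, newline="") as f:
    wanted = {row["filename"] for row in csv.DictReader(f)}

# List every object in the bucket (paginated) and index the keys by
# base file name, so a file is found no matter which folder it is in.
keys_by_name = {}
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        name = obj["Key"].rsplit("/", 1)[-1]
        keys_by_name.setdefault(name, []).append(obj["Key"])

os.makedirs(LOCAL_DIR, exist_ok=True)
for name in wanted:
    for key in keys_by_name.get(name, []):
        # download_file(bucket, key, local_path)
        s3.download_file(BUCKET, key, os.path.join(LOCAL_DIR, name))
```

If the same file name exists in more than one folder, the later download overwrites the earlier one locally; keep the folder prefix in the local path if you need every copy.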

answered 2 years ago
  • Thank you for your answer, Sandeep. I tried specifying my bucket, key, and filename exactly as shown in the example - s3.Bucket('mybucket').download_file('hello.txt', '/tmp/hello.txt') - but I kept running into this: ClientError: An error occurred (404) when calling the HeadObject operation: Not Found. So the 'hello.txt' in their example is a single file stored in the folder 'tmp' inside the bucket called 'mybucket', correct? Just trying to understand the file structure here.
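For anyone hitting the same 404: in that example, the first argument to download_file is the object's full key inside the bucket (folder prefix included), and '/tmp/hello.txt' is the local path the file is written to on your machine, not a folder in the bucket. A minimal sketch using a hypothetical key that includes a folder prefix:

```python
import boto3

s3 = boto3.resource("s3")

# First argument: the full S3 object key, including any folder prefix
# ('reports/2021/hello.txt' here is a hypothetical key).
# Second argument: the local destination path on your machine.
# A 404 from HeadObject usually means the key, prefix included,
# does not exist in the bucket.
s3.Bucket("mybucket").download_file("reports/2021/hello.txt", "/tmp/hello.txt")
```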
