How can I use AWS Glue to split a file by number of lines?


I have a file of about 1000 lines currently stored in an S3 bucket, and I want to split it into smaller files (about 200 - 500 lines each). I have searched the internet and only found solutions for merging files into a larger file. Can I use Glue to split the output into files by line count, or should I use some other method? I would be very thankful if you could guide me through the procedure.

Thank you so much!

asked 2 years ago · 1,222 views
1 Answer
Accepted Answer

Hello,

As you mentioned, your file has only about 1000 lines, which is quite small. Instead of a Glue ETL Spark job (which is recommended for processing large amounts of data), I would suggest the shell command below, which you can execute on an EC2 instance. It streams the object from S3, splits it into 200-line chunks, and uploads each chunk back to S3:

aws s3 cp s3://sourcebucket/csv/nycflights13.csv - | split -d -l 200 --filter "aws s3 cp - \"s3://destbucket/csv/bigdata_\$FILE.csv\""

You can also use a Glue Python shell job and run the same shell command from Python [1].
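As a minimal sketch of that approach, the snippet below runs the command through Python's standard subprocess module. It assumes the AWS CLI and the split utility are available in the job's environment; the bucket names, key, and 200-line chunk size are taken from the command above and are placeholders you should replace with your own values.

import subprocess

# Placeholder locations copied from the command above; replace with your own.
SOURCE = "s3://sourcebucket/csv/nycflights13.csv"
DEST_PREFIX = "s3://destbucket/csv/bigdata_"
LINES_PER_FILE = 200

# Stream the source object from S3, split it into chunks of
# LINES_PER_FILE lines, and upload each chunk back to S3.
# `split --filter` runs the upload command once per chunk, with the
# generated numeric suffix available as $FILE.
command = (
    f'aws s3 cp "{SOURCE}" - | '
    f"split -d -l {LINES_PER_FILE} "
    f'--filter \'aws s3 cp - "{DEST_PREFIX}$FILE.csv"\''
)

# shell=True is required because the command uses a pipe;
# check=True raises CalledProcessError on a non-zero exit status.
subprocess.run(command, shell=True, check=True)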

Reference:

[1] Execute shell command using Python: https://www.codingninjas.com/blog/2021/06/25/how-to-execute-shell-commands-with-python/#:~:text=The%20naive%20approach%20to%20run,function%20that%20executes%20shell%20commands.

AWS
answered 2 years ago
