How can I use AWS Glue to split a file by number of lines?


I have a file of about 1000 lines currently stored in an S3 bucket, and I want to split it into smaller files (about 200 to 500 lines each). I have searched the internet and only found solutions for merging files into a larger one. Can I use Glue to control the output files by number of lines, or should I use another method? I would be very thankful if you could guide me through the procedure.

Thank you so much!

Asked 2 years ago, viewed 1,247 times

1 Answer

Accepted Answer

Hello,

As you mentioned, your file has only about 1000 lines, which is quite small. For this purpose, instead of a Glue ETL Spark job (which is recommended for processing large amounts of data), I would suggest the shell command below, which you can run on an EC2 instance.

aws s3 cp s3://sourcebucket/csv/nycflights13.csv - | split -d -l 200 --filter "aws s3 cp - \"s3://destbucket/csv/bigdata_\$FILE.csv\""

You can also use a Glue Python shell job to execute the above shell command. [1]
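If you take the Glue Python shell route, here is a minimal sketch of running that pipeline from Python. It assumes the AWS CLI and the coreutils split command are available on the job's runtime (otherwise they would need to be installed or bundled first); the bucket names, prefixes, and the 200-line chunk size are just the placeholder values from the command above.

# Minimal sketch: run the split-and-upload pipeline from a Glue Python shell
# job (or any environment that has the AWS CLI and coreutils available).
# Bucket names and prefixes are placeholders; adjust them to your environment.
import subprocess

command = (
    "aws s3 cp s3://sourcebucket/csv/nycflights13.csv - "
    "| split -d -l 200 --filter "
    "'aws s3 cp - \"s3://destbucket/csv/bigdata_$FILE.csv\"'"
)

# shell=True is needed because the command relies on a pipe and on the $FILE
# variable that split sets for each chunk it hands to the --filter command.
# check=True raises CalledProcessError if any part of the pipeline fails.
subprocess.run(command, shell=True, check=True)

The pipeline streams the source object to stdout, split cuts it into 200-line chunks, and for each chunk the --filter command uploads it back to S3 under a numbered name, so nothing needs to be written to local disk.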

Reference:

[1] How to Execute Shell Commands with Python: https://www.codingninjas.com/blog/2021/06/25/how-to-execute-shell-commands-with-python/

AWS
Answered 2 years ago
