1 Answer
Hi RahulD,
To add a file pattern to the output of your AWS Glue ETL job's Python script, you can modify the `connection_options` to include a custom file name. AWS Glue doesn't directly support wildcard patterns for output filenames, but you can achieve a similar effect by building the file name dynamically in the script.
Here's how you can modify your script to produce a file name matching a pattern like `dostrp*.csv.gz`:
```python
import datetime

# Build a timestamped file name, e.g. dostrp20240818123456.csv.gz
timestamp = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
filename = f"dostrp{timestamp}.csv.gz"

AmazonS3_node55566666 = glueContext.write_dynamic_frame.from_options(
    frame=AWSGlueDataCatalog_node188777777,
    connection_type="s3",
    format="csv",
    format_options={"separator": "|"},
    connection_options={
        "path": f"{outputbucketname}/{filename}",
        "compression": "gzip",
        "partitionKeys": [],
    },
    transformation_ctx="AmazonS3_node5566677777",
)
```
If this approach doesn’t fully meet your needs, please provide additional details about your requirements. I'm here to help.
Thanks Vitor. I tried the logic you mentioned, but unfortunately, although the Glue job run succeeded, it didn't generate the file in the S3 bucket. When I remove this logic, it does generate the file in the S3 bucket.
I tried all of the variations below for the `filename` variable, but none of them worked:
```python
filename = f"dostrp{timestamp}.csv.gz"
filename = f"dostrp{timestamp}.gz"
filename = f"dostrp{timestamp}"
```
I had already provided the logic below to generate a gzip file in CSV format. Does that have any impact, or what might be the issue?
```python
format="csv",
connection_options={"path": outputbucketname, "compression": "gzip"}
```
Currently, without your logic, it generates a file named like: `run-1723993970803-part-r-00005.gz`
Hi @Vitor, also, when I unzip the .gz file from the S3 bucket, the extracted file is created with type "file" rather than CSV format. Can you please tell me what the issue might be?
Try this approach for custom file naming:
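The original code for this answer was not preserved in the thread, so the following is only a sketch of one common technique: since the `path` in `connection_options` is treated as a prefix and Spark names the part files itself (hence the `run-...-part-r-00005.gz` name you saw), you can let Glue write the auto-named part file and then copy it to your custom name with boto3 afterward. The `bucket`, `prefix`, and `filename` values here are placeholders you would supply from your job.

```python
def pick_part_keys(keys):
    """Return only the auto-generated Spark part files from an S3 key listing."""
    return [k for k in keys if "part-r-" in k]

def rename_output(bucket, prefix, filename):
    """Copy the Glue-written part file under prefix to prefix/filename,
    then delete the auto-named original. Assumes a single part file;
    with multiple parts, the last copy wins."""
    import boto3  # imported here so the helper above works without boto3 installed

    s3 = boto3.client("s3")
    listing = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    keys = [obj["Key"] for obj in listing.get("Contents", [])]
    target_key = f"{prefix}{filename}"

    for key in pick_part_keys(keys):
        s3.copy_object(
            Bucket=bucket,
            CopySource={"Bucket": bucket, "Key": key},
            Key=target_key,
        )
        s3.delete_object(Bucket=bucket, Key=key)
```

You would call `rename_output(...)` at the end of the Glue script, after `write_dynamic_frame.from_options` has finished, and pass `outputbucketname` (split into bucket and prefix) plus the timestamped `filename`.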
If the issue persists, please provide more details.
Thanks, now it works as expected :)