Hello.
You can pass the S3 path and other event details from EventBridge to AWS Batch by following the steps in the document below.
https://docs.aws.amazon.com/batch/latest/userguide/batch-cwe-target.html#cwe-input-transformer
On the AWS Batch side, you can pass the S3 path to your Python script by referencing it in the command of the job definition.
https://docs.aws.amazon.com/batch/latest/userguide/create-job-definition-Fargate.html
https://docs.aws.amazon.com/batch/latest/userguide/job_definition_parameters.html
For example, suppose you configure an input transformer in EventBridge as described in the article below. The input path extracts the bucket name and key from the S3 event:
https://repost.aws/knowledge-center/batch-target-eventbridge-rules
{"S3BucketValue":"$.detail.requestParameters.bucketName","S3KeyValue":"$.detail.requestParameters.key"}
Then set the input template so that these values are passed to the batch job as parameters:
{"Parameters" : {"S3bucket": <S3BucketValue>, "S3key": <S3KeyValue>}}
The command in the AWS Batch job definition then passes the S3 information to your Python script as follows.
"command": ["python", "main.py", "Ref::S3bucket", "Ref::S3key"]
You should then be able to read the S3 information from the command-line arguments in your Python code, as shown below.
import sys

# Arguments arrive in the order given in the job definition command:
# sys.argv[1] = Ref::S3bucket, sys.argv[2] = Ref::S3key
s3_bucket_name = sys.argv[1]
s3_key = sys.argv[2]
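To confirm that the parameters actually arrive, you can print them so they appear in the job's CloudWatch Logs, and then fetch the object. This sketch assumes the job role has s3:GetObject permission on the bucket:

import sys
import boto3

s3_bucket_name = sys.argv[1]
s3_key = sys.argv[2]

# Printed output goes to the job's CloudWatch Logs stream,
# which is the easiest place to verify the values.
print(f"bucket={s3_bucket_name}, key={s3_key}")

# Download the object for processing (assumes s3:GetObject permission).
s3 = boto3.client("s3")
s3.download_file(s3_bucket_name, s3_key, "/tmp/input")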
I tried using:

import sys
s3_bucket_name = sys.argv[1]
s3_key = sys.argv[2]

Still nothing showed up. What could be the possible reason for it? Do I have to mention something in the Dockerfile as well?