I'm updating this post for the community. The solution I found was to follow the instructions in this post: https://repost.aws/questions/QU_WC7301mT8qR7ip_9cyjdQ/eventbridge-pipes-and-ecs-task
In summary: it is not possible to set the optional BatchParameters in the GUI when creating a Pipe in EventBridge, BUT you can create a pipe with those optional parameters via the CLI, or add them to an existing pipe by updating it (again with CLI commands).
So my solution was:
- First, I fetched a JSON of the most important parameters of my pipe:
aws pipes describe-pipe --name MY_PIPE_NAME --query '{Name: Name, RoleArn: RoleArn, TargetParameters: TargetParameters}'
- The above command prints the JSON to your terminal, so you can copy it and save it to a file — let's say the file name is update.json.
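Alternatively, the two steps can be combined by redirecting the output straight to the file (a sketch; MY_PIPE_NAME is a placeholder for your pipe's name):

```shell
# Save the pipe's current name, role, and target parameters to update.json
aws pipes describe-pipe --name MY_PIPE_NAME \
  --query '{Name: Name, RoleArn: RoleArn, TargetParameters: TargetParameters}' \
  > update.json
```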
- Next, open the JSON file and add the optional parameters you want to pass to your Batch job, like this:
{
  "Name": "video_3",
  "RoleArn": "arn:aws:iam::*:role/service-role/Amazon_EventBridge_Pipe_video_3_a6617541",
  "TargetParameters": {
    "BatchJobParameters": {
      "JobDefinition": "arn:aws:batch:eu-west-2:*:job-definition/video_transcription_job:6",
      "JobName": "video_job",
      "Parameters": {
        "recording_id": "$.body.recording_id",
        "recording_url": "$.body.recording_url"
      }
    }
  }
}
Notice that I put the dynamic JSON path parameters in here: I expect to find recording_id and recording_url in the body of the message from the SQS queue that triggers this pipe.
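For reference, a queue message whose body is the JSON below would satisfy those two JSONPath expressions (the values are made up for illustration; for an SQS source, Pipes exposes the parsed message body under `$.body`):

```json
{
  "recording_id": "rec-123",
  "recording_url": "https://example.com/recordings/rec-123.mp4"
}
```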
- Once you update the json like this, you can use the update command to update your pipe:
aws pipes update-pipe --cli-input-json file://update.json
Notice that you need the "file://" prefix before the name of your JSON file so the CLI loads it.
- DONE! Now if you go to the GUI and check your pipe's target definition, you will find the optional parameters you specified in the JSON file.
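You can also confirm the update from the CLI by re-running describe-pipe (a sketch; MY_PIPE_NAME is a placeholder):

```shell
# Print only the Batch parameters of the pipe's target
aws pipes describe-pipe --name MY_PIPE_NAME \
  --query 'TargetParameters.BatchJobParameters'
```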
Hello Jacek.
Try using an input transformer to ensure the parameters are delivered.
Amazon EventBridge Pipes input transformation
Example:
aws events put-targets --rule RuleName \
  --targets '[{
    "Id": "1",
    "Arn": "arn:aws:batch:region:account-id:job-definition/JobDefinitionName:1",
    "RoleArn": "arn:aws:iam::account-id:role/RoleName",
    "InputTransformer": {
      "InputPathsMap": {
        "recordingId": "$.payload.Parameters.recording_id",
        "recordingUrl": "$.payload.Parameters.recording_url"
      },
      "InputTemplate": "{\"jobName\": \"JobName\", \"jobQueue\": \"QueueName\", \"jobDefinition\": \"JobDefinitionName\", \"containerOverrides\": {\"command\": [\"<command or script to run>\", \"--recordingId\", \"<recordingId>\", \"--recordingUrl\", \"<recordingUrl>\"]}}"
    }
  }]'
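If you are using a Pipe rather than a plain EventBridge rule, the same idea is expressed through the pipe's own input transformer: TargetParameters accepts an InputTemplate field, where `<$.body.field>` placeholders are substituted from the source event. A sketch of an update.json for this approach (the pipe name and ARN are reused from the example above; treat the template contents as an assumption to adapt):

```json
{
  "Name": "video_3",
  "RoleArn": "arn:aws:iam::*:role/service-role/Amazon_EventBridge_Pipe_video_3_a6617541",
  "TargetParameters": {
    "InputTemplate": "{\"recording_id\": <$.body.recording_id>, \"recording_url\": <$.body.recording_url>}"
  }
}
```

It would be applied the same way, with `aws pipes update-pipe --cli-input-json file://update.json`.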
Thank you for answering. I added a screenshot of the pipe I created and the Batch job container details. I'm using the UI to create the pipe. I didn't create any rule for this task, since I want to read a message from an SQS queue, extract the two parameters from the message, and pass them to the Batch job. Any idea why it is not working?