How to pass parameters from EventBridge Pipe to Batch Job?


I set up an EventBridge pipe that reads from an SQS queue and should trigger a Batch job. The problem is that the pipe is not passing the parameters to the Batch job. Here are the two logs: the first shows that the pipe got the parameters successfully from the SQS queue; the second shows that the request to submit the Batch job does NOT have those parameters. I've tried all possible permutations of the target input transformer, but still had no success.

LOG 1 { "resourceArn": "arn:aws:pipes:eu-west-2:355280565032:pipe/video_2", "timestamp": 1708684686652, "executionId": "39f111cc-590d-4c14-be5f-4a4e916c072b", "messageType": "TargetTransformationSucceeded", "logLevel": "TRACE", "payload": "[{"Parameters":{"recording_id":"15","recording_url":"la luna nera 2"}}]" }

LOG 2
{
    "resourceArn": "arn:aws:pipes:eu-west-2:355280565032:pipe/video_2",
    "timestamp": 1708684686806,
    "executionId": "39f111cc-590d-4c14-be5f-4a4e916c072b",
    "messageType": "TargetInvocationFailed",
    "logLevel": "ERROR",
    "error": {
        "message": "Target invocation failed with error from Batch.",
        "httpStatusCode": 400,
        "awsService": "batch",
        "requestId": "6d2e015d-8675-4b0b-9ce6-05faf45a0665",
        "exceptionType": "BadRequest",
        "resourceArn": "arn:aws:batch:eu-west-2:355280565032:job-queue/video_transcription_job_queue"
    },
    "awsRequest": "{\"jobName\":\"video_job\",\"jobQueue\":\"arn:aws:batch:eu-west-2:355280565032:job-queue/video_transcription_job_queue\",\"shareIdentifier\":null,\"schedulingPriorityOverride\":null,\"arrayProperties\":null,\"dependsOn\":null,\"jobDefinition\":\"arn:aws:batch:eu-west-2:355280565032:job-definition/video_transcription_job:6\",\"parameters\":null,\"containerOverrides\":null,\"nodeOverrides\":null,\"retryStrategy\":null,\"propagateTags\":null,\"timeout\":null,\"tags\":null,\"eksPropertiesOverride\":null}",
    "awsResponse": "Unable to substitute value. No parameter found for reference recording_id (Service: Batch, Status Code: 400, Request ID: 6d2e015d-8675-4b0b-9ce6-05faf45a0665)"
}
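The awsResponse above suggests that the job definition's container command references Ref::recording_id while the SubmitJob call arrived with "parameters": null. For illustration only, a hand-run submission that does supply the two values from LOG 1 would look roughly like this (job, queue, and definition names taken from the logs):

# Illustration only: submitting the same job by hand with the two
# parameters filled in (values taken from LOG 1 above).
aws batch submit-job \
    --job-name video_job \
    --job-queue video_transcription_job_queue \
    --job-definition video_transcription_job:6 \
    --parameters '{"recording_id": "15", "recording_url": "la luna nera 2"}'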

EDIT: Added screenshots of the pipe I created.

2 Answers
Accepted Answer

I'm updating this post for the community. The solution that I found was to use the instructions provided in this post: https://repost.aws/questions/QU_WC7301mT8qR7ip_9cyjdQ/eventbridge-pipes-and-ecs-task

In summary: it is not possible to set the optional BatchParameters through the console (GUI) when creating a pipe in EventBridge. BUT you can create a pipe with those optional parameters using the CLI, or add them to an existing pipe with update-pipe (again via the CLI).
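For reference, here is a minimal create-pipe sketch showing where that BatchJobParameters block goes when creating a pipe from the CLI. Every name and ARN below is a placeholder, so treat it as a template rather than a ready-to-run command:

# Minimal sketch; all names and ARNs are placeholders.
aws pipes create-pipe \
    --name my_pipe \
    --role-arn arn:aws:iam::ACCOUNT_ID:role/my_pipe_role \
    --source arn:aws:sqs:eu-west-2:ACCOUNT_ID:my_queue \
    --target arn:aws:batch:eu-west-2:ACCOUNT_ID:job-queue/my_job_queue \
    --target-parameters '{"BatchJobParameters": {"JobDefinition": "my_job_definition:1", "JobName": "my_job", "Parameters": {"recording_id": "$.body.recording_id", "recording_url": "$.body.recording_url"}}}'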

So my solution was:

  1. First I got a JSON of the most important params of my pipe:

aws pipes describe-pipe --name MY_PIPE_NAME --query '{Name: Name, RoleArn: RoleArn, TargetParameters: TargetParameters}'
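If you prefer, the same command can write the output straight to a file in one step (this assumes your CLI output format is JSON):

aws pipes describe-pipe --name MY_PIPE_NAME \
    --query '{Name: Name, RoleArn: RoleArn, TargetParameters: TargetParameters}' \
    --output json > update.json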

  2. The above command prints the JSON to your terminal, so you can copy it and save it to a file; let's say the file name is update.json (or redirect the output to the file directly, as shown above).
  3. Next you can open the JSON file and add the optional parameters you want to pass to your Batch job, like this:
{
    "Name": "video_3",
    "RoleArn": "arn:aws:iam::*:role/service-role/Amazon_EventBridge_Pipe_video_3_a6617541",
    "TargetParameters": {
        "BatchJobParameters": {
            "JobDefinition": "arn:aws:batch:eu-west-2:*:job-definition/video_transcription_job:6",
            "JobName": "video_job",
            "Parameters": {"recording_id":"$.body.recording_id", "recording_url": "$.body.recording_url"}
        }
    }
}

Notice that I put the dynamic JSON path parameters in here. I expect to find recording_id and recording_url in the body of the message from my SQS queue that triggers this pipe.
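For example, a message whose body is the JSON below would satisfy those two paths (the queue URL is a placeholder):

aws sqs send-message \
    --queue-url https://sqs.eu-west-2.amazonaws.com/ACCOUNT_ID/my_queue \
    --message-body '{"recording_id": "15", "recording_url": "la luna nera 2"}'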

  4. Once you have updated the JSON like this, you can use the update command to update your pipe:

aws pipes update-pipe --cli-input-json file://update.json

Notice that you need the "file://" prefix before the name of your JSON file so the CLI loads it from disk.
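You can then confirm that the parameters were applied without opening the console, for example:

aws pipes describe-pipe --name video_3 --query 'TargetParameters.BatchJobParameters'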

  5. DONE! Now if you go to the console and check your pipe's target definition, you will find the optional parameters field you specified in the JSON file.
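One more thing worth checking: for these parameters to reach your container, the job definition's command has to reference them with Batch's Ref:: substitution syntax (the "No parameter found for reference recording_id" error in the question is what Batch returns when such a reference has no matching parameter). A sketch of the relevant containerProperties fragment, with a made-up script name and flags for illustration:

"command": ["python", "transcribe.py", "--recording-id", "Ref::recording_id", "--recording-url", "Ref::recording_url"]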


Jacek
answered 2 months ago
Expert
reviewed 2 months ago

Hello Jacek.

Try to use Input Transformation to ensure parameter delivery.

Amazon EventBridge Pipes input transformation

Example:

 aws events put-targets --rule RuleName \
    --targets '[
      {
        "Id": "1",
        "Arn": "arn:aws:batch:region:account-id:job-definition/JobDefinitionName:1",
        "RoleArn": "arn:aws:iam::account-id:role/RoleName",
        "InputTransformer": {
          "InputPathsMap": {
            "recordingId": "$.payload.Parameters.recording_id",
            "recordingUrl": "$.payload.Parameters.recording_url"
          },
          "InputTemplate": "{\"jobName\": \"JobName\", \"jobQueue\": \"JobQueueName\", \"jobDefinition\": \"JobDefinitionName\", \"containerOverrides\": {\"command\": [\"your-command-or-script\", \"--recordingId\", \"<recordingId>\", \"--recordingUrl\", \"<recordingUrl>\"]}}"
        }
      }
    ]'
answered 2 months ago
  • Thank you for answering. I added a screenshot of the pipe I created and of the Batch job container details. I'm using the UI to create the pipe. I didn't create any rule for this task, since I want to read a message from an SQS queue, extract the two parameters from the message, and pass them to the Batch job. Any idea why it is not working?
