Output of a DataBrew profile job in Step Functions


I have a state that runs a DataBrew job, and the next state is a Lambda function. How do I pass the results of the DataBrew job state to the Lambda state so it can get the JobName, etc.? Thanks.

Willi5
Asked 1 year ago · 260 views
2 Answers
Accepted Answer

I found the problem: the "Resource" must end with .sync so that all of the job-run parameters are returned. With that it works.
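For reference, a minimal sketch of what such a Task state could look like with the .sync integration pattern (the state names here are hypothetical; the job name and ResultPath are taken from this thread):

    "Run DataBrew profile job": {
      "Type": "Task",
      "Resource": "arn:aws:states:::databrew:startJobRun.sync",
      "Parameters": {
        "Name": "dataqualitytest2job"
      },
      "ResultPath": "$.JobRun",
      "Next": "Check results in Lambda"
    }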

Willi5
answered 1 year ago

The Glue DataBrew StartJobRun API action returns a RunId.

You can add this to the workflow state using the ResultPath in the Task state, like so: "ResultPath": "$.JobRun". See here for more information about using ResultPath.

You can then access that value in the event passed to your Lambda function and use it, for example, to call DescribeJobRun. This page describes how to manipulate workflow and task state. The Data Flow Simulator (linked from this page) is also very useful in visualizing the data flow of your state machine.
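A minimal sketch of that Lambda side, assuming the Task state wrote the StartJobRun result to "$.JobRun" and the job name is carried in the state input (the "profilejobname" field follows the comment below):

    import boto3

    databrew = boto3.client("databrew")

    def lambda_handler(event, context):
        # RunId was placed in the event by the previous state's ResultPath
        run_id = event["JobRun"]["RunId"]
        job_name = event["profilejobname"]

        # DescribeJobRun returns the run's state and, once finished, its outputs
        run = databrew.describe_job_run(Name=job_name, RunId=run_id)
        return {"State": run["State"]}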

AWS
MattK
answered 1 year ago
  • Hi MattK, I added "ResultPath": "$.JobRun" to the DataBrew job state, but its output is:

        {
          "profilejobname": "dataqualitytest2job",
          "StatePayload": "Starts the check of Data Quality",
          "AWS_STEP_FUNCTIONS_STARTED_BY_EXECUTION_ID": "arn:aws:states:eu-west-1:XXX:execution:dev-DQingestion_CheckRules:4de06712-b9c0-XXX-XXX-XXX",
          "JobRun": {
            "RunId": "db_c1f5f78adc0c02edd381b1370dadf54fb49154eaf5bbca9f430861961b59f729",
            "SdkHttpMetadata": {
              "AllHttpHeaders": {
                "X-Cache": ["Miss from cloudfront"],
                "x-amz-apigw-id": ["BeuHXFZWDoEFVQg="],
                "Access-Control-Allow-Origin": ["*"],
                ...

    The DataBrew job writes its result to an S3 bucket, so the Lambda state needs to know the bucket and the generated file name in order to read the file's content and get the profile JSON with the status of the ruleset. Here is a piece of the Lambda function code:

        def lambda_handler(event, context):
            # TODO implement
            jobname = event["jobname"]
            for o in event["Outputs"]:
                bucketname = o["Location"]["Bucket"]
                if "dq-validation" in o["Location"]["Key"]:
                    filename = o["Location"]["Key"]
            ...

    The event doesn't have the Outputs, so there is no way to know which file was generated and in which bucket.

    The question is: in the Lambda state, how can I know which bucket and which file were generated by the previous DataBrew job state?

    Thanks for your help.
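Putting both answers together: once the DataBrew Task state uses the .sync integration, the completed run's details, including its Outputs, should be available in the event under the chosen ResultPath. A sketch of a Lambda that reads the generated profile file from S3 under those assumptions (the "$.JobRun" path and the "dq-validation" key filter are taken from this thread):

    import json
    import boto3

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        # Assumes the previous Task state used startJobRun.sync with "ResultPath": "$.JobRun",
        # so the finished run's Outputs list is present in the event.
        bucketname = filename = None
        for o in event["JobRun"]["Outputs"]:
            if "dq-validation" in o["Location"]["Key"]:
                bucketname = o["Location"]["Bucket"]
                filename = o["Location"]["Key"]

        # Fetch the report the profile job wrote to S3 and parse the profile JSON
        obj = s3.get_object(Bucket=bucketname, Key=filename)
        return json.loads(obj["Body"].read())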
