Output of a DataBrew profile job in Step Functions


I have a state that runs a DataBrew job, and the next state is a Lambda function. How can I pass the results of the DataBrew job state to the Lambda state, to get the JobName, etc.? Thanks.

Willi5
asked a year ago · 261 views
2 Answers
Accepted Answer

I found the problem: the "Resource" must end in .sync so that the full set of parameters is returned; with that it works.
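
For reference, a minimal sketch of what the Task state might look like with the .sync (Run a Job) integration pattern. The job name is the one from this thread; the state names are illustrative. With .sync, Step Functions waits for the job to complete, and the task result should then contain the full job-run description (JobName, State, Outputs, and so on) rather than just the RunId:

```json
"StartProfileJob": {
  "Type": "Task",
  "Resource": "arn:aws:states:::databrew:startJobRun.sync",
  "Parameters": {
    "Name": "dataqualitytest2job"
  },
  "ResultPath": "$.JobRun",
  "Next": "CheckDataQualityLambda"
}
```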

Willi5
answered a year ago

The Glue DataBrew StartJobRun API action returns a RunId.

You can add this to the workflow state using the ResultPath in the Task state, like so: "ResultPath": "$.JobRun". See here for more information about using ResultPath.
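
For example, the Task state could look like the sketch below (the next-state name is illustrative; the job name is reused from this thread). With the plain request-response integration, the StartJobRun response (the RunId) lands under $.JobRun while the rest of the state input passes through unchanged:

```json
{
  "Type": "Task",
  "Resource": "arn:aws:states:::databrew:startJobRun",
  "Parameters": {
    "Name": "dataqualitytest2job"
  },
  "ResultPath": "$.JobRun",
  "Next": "CheckJobRunLambda"
}
```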

You can then access that value in the event passed to your Lambda function and use it, for example, to call DescribeJobRun. This page describes how to manipulate workflow and task state. The Data Flow Simulator (linked from this page) is also very useful in visualizing the data flow of your state machine.
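
As a rough sketch of that approach with boto3, assuming the job name is available somewhere in the state input (the "profilejobname" field from the comments below is used here for illustration):

```python
import boto3

databrew = boto3.client("databrew")

def lambda_handler(event, context):
    # The previous Task state put the StartJobRun response under $.JobRun
    run_id = event["JobRun"]["RunId"]
    # Assumed to be present in the state input; adjust to your input shape
    job_name = event["profilejobname"]

    # Fetch the full run details, including the S3 output locations
    run = databrew.describe_job_run(Name=job_name, RunId=run_id)
    return {
        "JobName": run["JobName"],
        "State": run["State"],
        "Outputs": run.get("Outputs", []),
    }
```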

AWS
MattK
answered a year ago
  • Hi MattK, I added "ResultPath": "$.JobRun" in the DataBrew job state, but its output is:

    ```json
    {
      "profilejobname": "dataqualitytest2job",
      "StatePayload": "Starts the check of Data Quality",
      "AWS_STEP_FUNCTIONS_STARTED_BY_EXECUTION_ID": "arn:aws:states:eu-west-1:XXX:execution:dev-DQingestion_CheckRules:4de06712-b9c0-XXX-XXX-XXX",
      "JobRun": {
        "RunId": "db_c1f5f78adc0c02edd381b1370dadf54fb49154eaf5bbca9f430861961b59f729",
        "SdkHttpMetadata": {
          "AllHttpHeaders": {
            "X-Cache": [ "Miss from cloudfront" ],
            "x-amz-apigw-id": [ "BeuHXFZWDoEFVQg=" ],
            "Access-Control-Allow-Origin": [ "*" ...
    ```

    The DataBrew job writes its result to an S3 bucket, so in the Lambda state I need to read the bucket and the generated file name, fetch the content of the file, and then parse the profile JSON to know the status of the ruleset. Here is a piece of code from the Lambda function:

    ```python
    def lambda_handler(event, context):
        # TODO implement
        ...
        jobname = event["jobname"]
        for o in event["Outputs"]:
            bucketname = o["Location"]["Bucket"]
            if "dq-validation" in o["Location"]["Key"]:
                filename = o["Location"]["Key"]
        ...
    ```

    The event doesn't have the Outputs, so there is no way to know which file was generated and in which bucket.

    So the question would be: in the Lambda state, how do I know which bucket and which file were generated in the previous DataBrew job state? (A sketch combining both answers follows this comment.)

    thanks for your help
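
Putting the two answers together: once the "Resource" ends in .sync and the result is kept under $.JobRun, the Lambda state's input should carry the job run's Outputs, so the handler can locate and read the generated file directly. A minimal sketch, reusing the "dq-validation" key filter from the snippet above (field names may differ slightly in your actual output):

```python
import json
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # Placed here by "ResultPath": "$.JobRun" on the DataBrew Task state
    job_run = event["JobRun"]

    # With the .sync pattern the task result includes the run's Outputs,
    # i.e. the S3 locations the profile job wrote to
    for o in job_run["Outputs"]:
        if "dq-validation" in o["Location"]["Key"]:
            bucket = o["Location"]["Bucket"]
            key = o["Location"]["Key"]

            # Read the generated file and parse the profile/validation JSON
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            return json.loads(body)

    return None
```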
