How to add logging in the Step Functions configuration for an EMR Serverless job


I am executing a PySpark script on EMR Serverless, and I am using AWS Step Functions to create the EMR application and submit the job to it. I want to know how to add a logging option to the AWS Step Functions configuration while creating the EMR application or submitting the job, so that I can see the log files from the Spark job run in CloudWatch or S3. This is my current StartJobRun state from my AWS Step Functions configuration:

"EMR Serverless StartJobRun": { "Type": "Task", "Resource": "arn:aws:states:::emr-serverless:startJobRun", "Parameters": { "ApplicationId.$": "$.ApplicationId", "ExecutionRoleArn": "arn:aws:iam:/demo-emrserverless-iam-role", "JobDriver": { "SparkSubmit": { "EntryPoint": "s3://demo/test.py", "EntryPointArguments.$":[], "SparkSubmitParameters": "--conf } },

I tried to add "JobConfiguration", but it is not accepted.

1 Answer

Hello,

To add logging to an EMR Serverless job through the StartJobRun API[1], you need to use the "ConfigurationOverrides" key[2] and, inside it, "MonitoringConfiguration"[3] to enable S3 logging, CloudWatch logging, and so on.
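As a rough sketch, the state from your question could be extended along these lines. The log bucket, log group name, and log stream prefix below are placeholder values, and the exact key names and casing should be verified against the StartJobRun API reference[1]:

"EMR Serverless StartJobRun": {
  "Type": "Task",
  "Resource": "arn:aws:states:::emr-serverless:startJobRun",
  "Parameters": {
    "ApplicationId.$": "$.ApplicationId",
    "ExecutionRoleArn": "arn:aws:iam:/demo-emrserverless-iam-role",
    "JobDriver": {
      "SparkSubmit": {
        "EntryPoint": "s3://demo/test.py"
      }
    },
    "ConfigurationOverrides": {
      "MonitoringConfiguration": {
        "S3MonitoringConfiguration": {
          "LogUri": "s3://demo-logs/emr-serverless/"
        },
        "CloudWatchLoggingConfiguration": {
          "Enabled": true,
          "LogGroupName": "/aws/emr-serverless/demo",
          "LogStreamNamePrefix": "spark-job"
        }
      }
    }
  }
}

With S3 logging enabled, the driver and executor logs are written under the configured LogUri; with CloudWatch logging enabled, they are delivered to the specified log group. Also make sure the execution role has permission to write to the chosen S3 bucket and CloudWatch log group.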

More information about logging in EMR Serverless can be found here[4].

AWS
answered 19 days ago
