How to publish Spark (EMR Serverless) job logs to CloudWatch


I created a Spark job in Scala, and I'm now trying to find a way to ship its logs to CloudWatch.

So far I have tried packaging the job as an uber JAR that includes a CloudWatch appender, and passing the log4j options like this:

--class Main 
--conf spark.files=s3://fen-x-data-migration-1234/emr-demo/etl-job/conf/log4j.properties#log4j.properties 
--conf spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties 
--conf spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties 
--conf spark.hadoop.hive.metastore.client.factory.class=com.amazonaws.glue.catalog.metastore.AWSGlueDataCatalogHiveClientFactory
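
For reference, the log4j.properties shipped via spark.files would have to register the bundled appender on the root logger. A minimal sketch, assuming the uber JAR bundles a hypothetical appender class com.example.logging.CloudWatchAppender (the class name and its logGroup/logStream properties are illustrative, not a published library):

# Keep Spark's console output and add the CloudWatch appender
log4j.rootLogger=INFO, console, cloudwatch

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Hypothetical appender class bundled in the uber JAR
log4j.appender.cloudwatch=com.example.logging.CloudWatchAppender
log4j.appender.cloudwatch.logGroup=/emr-serverless/etl-job
log4j.appender.cloudwatch.logStream=driver
log4j.appender.cloudwatch.layout=org.apache.log4j.PatternLayout
log4j.appender.cloudwatch.layout.ConversionPattern=%d %p %c: %m%n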

I also tried adding the appender programmatically.
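
A minimal sketch of that programmatic route, using a custom log4j 1.x appender that forwards each event to CloudWatch Logs with the AWS SDK v2 (the log group and stream names are placeholders, the log group is assumed to exist, and batching and retries are omitted):

import org.apache.log4j.{AppenderSkeleton, Logger, PatternLayout}
import org.apache.log4j.spi.LoggingEvent
import software.amazon.awssdk.services.cloudwatchlogs.CloudWatchLogsClient
import software.amazon.awssdk.services.cloudwatchlogs.model.{CreateLogStreamRequest, InputLogEvent, PutLogEventsRequest, ResourceAlreadyExistsException}

class CloudWatchAppender(logGroup: String, logStream: String) extends AppenderSkeleton {
  private val client = CloudWatchLogsClient.create()

  // Create the stream up front; the log group itself is assumed to exist.
  try client.createLogStream(CreateLogStreamRequest.builder()
    .logGroupName(logGroup).logStreamName(logStream).build())
  catch { case _: ResourceAlreadyExistsException => () }

  // Ships one event per call; a production appender would batch
  // PutLogEvents requests and handle throttling.
  override def append(event: LoggingEvent): Unit = {
    val logEvent = InputLogEvent.builder()
      .timestamp(event.getTimeStamp)
      .message(getLayout.format(event))
      .build()
    client.putLogEvents(PutLogEventsRequest.builder()
      .logGroupName(logGroup).logStreamName(logStream)
      .logEvents(logEvent).build())
  }

  override def requiresLayout(): Boolean = true
  override def close(): Unit = client.close()
}

// Attach it to the root logger early in the job's main():
object LoggingSetup {
  def install(): Unit = {
    val appender = new CloudWatchAppender("/emr-serverless/etl-job", "driver")
    appender.setLayout(new PatternLayout("%d %p %c: %m%n"))
    Logger.getRootLogger.addAppender(appender)
  }
}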

1 Answer

Given the volume of logs that Spark and Hive produce, CloudWatch Logs is not always cost-effective at that scale. For this reason we provide managed storage: EMR Serverless stores logs for customers for 30 days at no additional charge [2]. Customers can also choose to store their logs in S3.
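
For example, S3 log delivery can be enabled per job run through the monitoringConfiguration of the StartJobRun API (the application ID, role ARN, job driver, and bucket below are placeholders):

aws emr-serverless start-job-run \
    --application-id <application-id> \
    --execution-role-arn <execution-role-arn> \
    --job-driver '{...}' \
    --configuration-overrides '{
        "monitoringConfiguration": {
            "s3MonitoringConfiguration": {
                "logUri": "s3://my-log-bucket/emr-serverless-logs/"
            }
        }
    }'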

Thanks for reaching out, and stay safe!

[1] https://docs.aws.amazon.com/emr/latest/EMR-Serverless-UserGuide/emr-serverless.html

[2] https://docs.aws.amazon.com/emr/latest/EMR-Serverless-UserGuide/logging.html

