How to redirect the entire output of spark-submit to S3

When using spark-submit, how can we send the complete application logs that we see in the console after running the spark-submit command?

anudeep
asked a year ago
1 answer
First, make sure the worker running the spark-submit job has the proper AWS credentials. How you provide them depends on what runs the task (for example, a Glue job execution role, a Fargate execution role, or an EC2 instance profile). Once that is in place, you can set the Amazon S3 bucket you want to save to as the output path and use Spark's write API to send the results there. For example:

import org.apache.spark.sql.DataFrame

val outputDataFrame: DataFrame = ??? // your data
outputDataFrame.write.parquet("s3://yourbucket/output")
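
If what you want to capture is the console output of the spark-submit command itself, a simple approach is shell redirection followed by an upload with the AWS CLI. A minimal sketch, where the application jar, main class, and bucket path are placeholders:

# Capture both stdout and stderr from spark-submit in a local file
spark-submit --class com.example.Main your-app.jar > spark-app.log 2>&1

# Upload the captured log to S3 (uses the same AWS credentials as above)
aws s3 cp spark-app.log s3://yourbucket/logs/spark-app.log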

Depending on where you run your job, you can gather your application logs in CloudWatch. EC2, Glue, Fargate, EKS, and ECS all integrate with Amazon CloudWatch, so you can allow the execution role to write job logs to CloudWatch and find your application logs there. It's then up to you whether to forward those logs to other storage destinations such as S3, Splunk, or Datadog.
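
As a sketch of that last step, once the logs are in a CloudWatch Logs log group you can export a time window to S3 with an export task via the AWS CLI. The log group name, bucket, and timestamps below are placeholders, and the bucket policy must allow CloudWatch Logs to write to it:

# Export a time range of a log group to S3 (--from/--to are milliseconds since epoch)
aws logs create-export-task \
  --task-name spark-logs-export \
  --log-group-name /your/log-group \
  --from 1690000000000 \
  --to 1690086400000 \
  --destination yourbucket \
  --destination-prefix spark-logs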

AWS
EXPERT
pechung
answered a year ago
AWS
TECHNICAL SUPPORT ENGINEER
verified a month ago
