How to redirect entire output of spark-submit to s3


When submitting a job with the spark-submit command, how can we send to S3 the complete application logs that we see in the console after running the command?

anudeep
asked a year ago · 368 views
1 Answer

First, make sure the worker running the spark-submit job has the proper AWS credentials. How you provide them depends on what you're using to run the task (for example, a Glue job execution role, a Fargate task execution role, or an EC2 instance profile). Once that is in place, you can set the Amazon S3 bucket you want to save to as the output path and use Spark's DataFrameWriter (for example, save or parquet) to write the results there. For example:

import org.apache.spark.sql.DataFrame

val outputDataFrame: DataFrame = ??? // your data
outputDataFrame.write.parquet("s3://yourbucket/output")
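
To confirm the worker actually resolves valid credentials before the job runs, a quick check with the AWS CLI can save debugging time. A minimal sketch, assuming the AWS CLI is installed in the same environment; the bucket name is a placeholder:

# Shows which IAM identity (role, instance profile, etc.) the environment resolves to
aws sts get-caller-identity

# Confirms the credentials can at least list the target bucket
aws s3 ls s3://yourbucket/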

Depending on where you run your job, you can gather your application logs in CloudWatch. EC2, Glue, Fargate, EKS, and ECS all integrate with Amazon CloudWatch, so you can allow the execution role to write job logs to CloudWatch and find your application logs there. It's then up to you whether you want to forward those logs to other storage destinations such as S3, Splunk, or Datadog.
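
If the goal is literally to capture the entire console output of spark-submit in S3, one simple approach is shell redirection followed by an AWS CLI upload; and if your logs already land in CloudWatch, they can be exported to S3 in bulk. A minimal sketch; the class, jar, bucket, log group, and timestamps below are all placeholders:

# Option A: redirect stdout and stderr of spark-submit to a file, then upload it
spark-submit --class com.example.YourApp your-app.jar > spark-app.log 2>&1
aws s3 cp spark-app.log s3://yourbucket/logs/spark-app.log

# Option B: export an existing CloudWatch log group to S3
# (--from/--to are epoch milliseconds; the destination bucket must have a
# policy that allows CloudWatch Logs to write to it)
aws logs create-export-task \
  --log-group-name /your/log-group \
  --from 1700000000000 \
  --to 1700003600000 \
  --destination yourbucket \
  --destination-prefix spark-logs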

AWS
EXPERT
pechung
answered a year ago
AWS
SUPPORT ENGINEER
reviewed a month ago
