EMR Serverless JAR job issues


Good day. I'm trying to submit an EMR Serverless job using a custom Docker image for the serverless application and submitting a JAR file to run. All script and Spark arguments are passed correctly. The job starts but fails with the error:

Error: ETL config file 's3:/path/to/file/etl.conf' must exist and be readable

The file exists in the specified location, and the IAM role used for the job has full S3 access. What else could cause the problem?

Sviat
asked 3 months ago
1 Answer

Hello Sviat,

I understand that you are trying to run a Spark application on EMR Serverless. Can you confirm whether the EMR Serverless Spark application itself fails to read the file, or whether you are trying to read the file in the driver code (in either Python or Scala)?

If you are reading the file in the driver, the file API you are using may not support S3 paths. With that said, I would recommend passing the file using --files or --conf spark.files, and then accessing it via pyspark.SparkFiles.get.
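
For illustration, here is a minimal sketch of such a submission using boto3; the application ID, role ARN, bucket paths, and class name are hypothetical placeholders:

import boto3

# Submit the JAR job, shipping the config file alongside it with --files.
# All IDs, ARNs, bucket names, and the class name below are placeholders.
client = boto3.client("emr-serverless")

response = client.start_job_run(
    applicationId="00example123456789",
    executionRoleArn="arn:aws:iam::111122223333:role/emr-serverless-job-role",
    jobDriver={
        "sparkSubmit": {
            "entryPoint": "s3://my-bucket/jars/etl-job.jar",
            # Pass just the file name as the argument; once shipped with
            # --files, the job can resolve it locally via SparkFiles.get.
            "entryPointArguments": ["etl.conf"],
            "sparkSubmitParameters": "--class com.example.EtlMain "
                                     "--files s3://my-bucket/config/etl.conf",
        }
    },
)
print(response["jobRunId"])

With --files, Spark itself downloads the object from S3 and distributes it to the driver and executors, so your code no longer needs an S3-aware file API.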

An example code snippet showing how to read the file from the driver is below.

from pyspark.sql import SparkSession
from pyspark import SparkFiles

# Build (or reuse) the Spark session for the job.
spark = SparkSession.builder \
                    .appName("read-config-example") \
                    .enableHiveSupport() \
                    .getOrCreate()

# SparkFiles.get resolves a file distributed via --files or spark.files
# to its local path on the node.
path = SparkFiles.get("config.json")
with open(path, "r") as f:
    print(f.read())

spark.stop()
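
Since you are submitting a JAR, the same mechanism is available on the JVM side: a file shipped with --files can be resolved in Scala or Java via org.apache.spark.SparkFiles.get, so your ETL code can open the returned local path instead of the s3:// URI.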

If the above doesn't resolve your use case, may I request that you share the Spark properties you used for the EMR Serverless application, along with the code where you are trying to access the file (if applicable).

AWS SUPPORT ENGINEER
answered 3 months ago
