EMR Serverless JAR job issues


Good day. I'm trying to submit an EMR Serverless job using a custom Docker image for the serverless application, passing a JAR file to run. All script and Spark arguments are passed correctly. The job starts but fails with the error:

Error: ETL config file 's3:/path/to/file/etl.conf' must exist and be readable

The file exists at the specified location, and the IAM role used for the job has full S3 access. What else could cause the problem?

asked 4 months ago · 222 views
1 Answer

Hello Sviat,

I understand that you are trying to run a Spark application on EMR Serverless. Can you confirm whether the EMR Serverless Spark application itself fails to read the file, or whether you are trying to read the file in the driver (in either Python or Scala)?

If you are reading the file in the driver, the file API you are using may not support S3 paths. With that said, I would recommend passing the file using --files or --conf spark.files and then accessing it with pyspark.SparkFiles.get.
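
For reference, here is a minimal sketch of submitting the job with boto3 so that the config file is distributed via --files. The application ID, role ARN, class name, and S3 URIs below are placeholders for illustration, not values from your setup.

import boto3

# EMR Serverless client; the region is a placeholder
client = boto3.client("emr-serverless", region_name="us-east-1")

response = client.start_job_run(
    applicationId="<application-id>",
    executionRoleArn="arn:aws:iam::<account-id>:role/<job-role>",
    jobDriver={
        "sparkSubmit": {
            "entryPoint": "s3://<bucket>/jars/etl-job.jar",
            "entryPointArguments": ["etl.conf"],
            # --files downloads etl.conf into the Spark working directory
            # of the driver and executors, where SparkFiles.get can find it
            "sparkSubmitParameters": (
                "--class com.example.EtlMain "
                "--files s3://<bucket>/config/etl.conf"
            ),
        }
    },
)
print(response["jobRunId"])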

Once the file has been distributed, an example snippet showing how to read it from the job is below.

from pyspark.sql import SparkSession
from pyspark import SparkFiles

spark = SparkSession.builder \
                    .appName('config-reader') \
                    .enableHiveSupport() \
                    .getOrCreate()

# Resolve the local path of a file that was distributed with --files
path = SparkFiles.get("config.json")
with open(path, "r") as f:
    print(f.read())
spark.stop()
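
Note that --files copies the file into the Spark working directory on the driver and executors, so SparkFiles.get returns a local filesystem path rather than an S3 URI. Since you are submitting a JAR, the Scala equivalent is org.apache.spark.SparkFiles.get.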

If the above doesn't resolve your use case, may I request that you share the Spark properties you have used for the EMR Serverless application and also the code where you are trying to access the file (if used)?

AWS
answered 4 months ago
SUPPORT ENGINEER
reviewed 4 months ago
