EMR Serverless jar job issues


Good day. I'm trying to submit an EMR Serverless job using a custom Docker image for the serverless application, submitting a JAR file to run. All script and Spark arguments are passed correctly. The job starts but fails with this error:

Error: ETL config file 's3:/path/to/file/etl.conf' must exist and be readable

The file exists in the specified location, and the IAM role used for the job has full S3 access. What else could cause the problem?

asked 4 months ago · 227 views

1 Answer

Hello Sviat,

I understand that you are trying to run a Spark application on EMR Serverless. Can you confirm whether the EMR Serverless Spark application itself fails to read the file, or whether you are trying to read the file in the driver (in either Python or Scala)?

If you are reading the file in driver code, the file API you are using may not support S3 paths. With that said, I would recommend passing the file to the job using --files or --conf spark.files, and then accessing it via pyspark.SparkFiles.get.

Here is an example code snippet showing how to use it:

from pyspark import SparkFiles
from pyspark.sql import SparkSession

spark = SparkSession.builder \
                    .appName('config-reader') \
                    .enableHiveSupport() \
                    .getOrCreate()

# SparkFiles.get resolves the local path of a file that was
# distributed to the job with --files or spark.files
path = SparkFiles.get("config.json")
with open(path, "r") as f:
    print(f.read())

spark.stop()
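
For reference, the file also has to be attached when the job is submitted. Below is a minimal sketch of starting an EMR Serverless job run with --files via boto3; the application ID, role ARN, --class name, and S3 paths are placeholders for illustration, not values from your setup.

import boto3

client = boto3.client("emr-serverless", region_name="us-east-1")

response = client.start_job_run(
    applicationId="00example1234",  # placeholder application ID
    executionRoleArn="arn:aws:iam::123456789012:role/EMRServerlessJobRole",
    jobDriver={
        "sparkSubmit": {
            "entryPoint": "s3://example-bucket/jars/etl-job.jar",
            # --files downloads etl.conf into each container's working
            # directory, where SparkFiles.get (or a relative open) can read it
            "sparkSubmitParameters": "--class com.example.EtlMain "
                                     "--files s3://example-bucket/conf/etl.conf",
        }
    },
)
print(response["jobRunId"])

Once the file is distributed this way, the application can open it by its bare name (etl.conf) instead of the s3:// URI.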

If the above doesn't resolve your use case, may I ask you to share the Spark properties you used for the EMR Serverless application, as well as the code where you are trying to access the file (if applicable).

AWS
answered 4 months ago
AWS SUPPORT ENGINEER
reviewed 4 months ago
