EMR Serverless jar job issues


Good day. I'm trying to submit an EMR Serverless job using a custom Docker image for the serverless application and submitting a JAR file to run. All script and Spark arguments are passed correctly. The job starts but fails with the error:

Error: ETL config file 's3:/path/to/file/etl.conf' must exist and be readable

The file exists in the specified location, and the IAM role used for the job has full S3 access. What else could be causing the problem?

asked 4 months ago · 223 views
1 Answer

Hello Sviat,

I understand that you are trying to run a Spark application on EMR Serverless. Can you confirm whether the EMR Serverless Spark application itself fails to read the file, or whether you are trying to read the file in the driver (in either Python or Scala)?

If you are reading the file in the driver, the interface you are using may not support S3. With that said, I would recommend passing the file using --files or --conf spark.files, and then accessing it with pyspark.SparkFiles.get.

An example code snippet showing how to use it is below.

from pyspark.sql import SparkSession
from pyspark import SparkFiles

spark = SparkSession.builder \
                    .appName('read-distributed-config') \
                    .enableHiveSupport() \
                    .getOrCreate()

# Resolve the local path of a file that was distributed
# via --files or --conf spark.files at submission time.
path = SparkFiles.get("config.json")
with open(path, "r") as f:
    print(f.read())

spark.stop()
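
For completeness, here is a minimal sketch of how the config file could be passed at submission time so that SparkFiles.get can resolve it. It uses boto3's start_job_run API for EMR Serverless; the application ID, role ARN, class name, and S3 paths below are placeholders, not values from your setup.

import boto3

# Hypothetical identifiers -- replace with your own application ID,
# execution role ARN, and S3 locations.
client = boto3.client("emr-serverless", region_name="us-east-1")

response = client.start_job_run(
    applicationId="<application-id>",
    executionRoleArn="arn:aws:iam::123456789012:role/<job-role>",
    jobDriver={
        "sparkSubmit": {
            "entryPoint": "s3://<bucket>/jars/etl-job.jar",
            # --files distributes the config to the driver and executors,
            # after which it is resolvable via SparkFiles.get("etl.conf").
            "sparkSubmitParameters": "--class com.example.EtlMain "
                                     "--files s3://<bucket>/config/etl.conf",
        }
    },
)
print(response["jobRunId"])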

If the above doesn't resolve your use case, may I request that you share the Spark properties you used for the EMR Serverless application, as well as the code where you are trying to access the file (if applicable).

AWS
answered 4 months ago
SUPPORT ENGINEER
verified 4 months ago
