Questions tagged with Amazon EMR Serverless
Environment variables for PySpark executor in AWS EMR Serverless and env key limitations with EMR...
Hello, I have gone through the documentation and practically observed the limitation on env keys `spark.emr-serverless.driverEnv` and `spark.emr-serverless.executorEnv` with EMR Serverless, which is limited to 50...
0 answers · 0 votes · 46 views · asked 2 days ago
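For questions like this about per-worker environment variables, a minimal sketch of how such keys end up in `sparkSubmitParameters`, assuming the property names quoted in the question are correct (verify against the current EMR Serverless docs). The helper fails fast on the 50-key limit the question mentions:

```python
# Sketch: render --conf flags that set driver/executor environment variables
# on EMR Serverless. Property names are taken from the question above;
# the 50-key cap is enforced locally so oversized maps fail before submission.

def build_env_confs(env: dict[str, str], max_keys: int = 50) -> str:
    """Render --conf flags for driver and executor env vars."""
    if len(env) > max_keys:
        raise ValueError(f"{len(env)} env keys exceed the {max_keys}-key limit")
    parts = []
    for key, value in env.items():
        parts.append(f"--conf spark.emr-serverless.driverEnv.{key}={value}")
        parts.append(f"--conf spark.emr-serverless.executorEnv.{key}={value}")
    return " ".join(parts)

params = build_env_confs({"APP_ENV": "prod", "REGION": "us-east-1"})
```

Larger variable sets would have to be trimmed, or moved into a file the job reads at startup, to stay under the cap.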
file not found error
Hello,
I'm trying EMR Serverless, switching over from EMR, and getting the error below:
```
Traceback (most recent call last):
  File "/tmp/spark-07b8331c-566b-43e9-bf1a-f0a77cb86420/spark_main.py", line 7,...
```
Accepted Answer · Amazon EMR Serverless
1 answer · 0 votes · 72 views · asked 10 days ago
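A common cause of "file not found" when moving from EMR on EC2 to EMR Serverless is a script or data path that only existed on the cluster's local filesystem. A minimal sketch of the `jobDriver` shape used by `start_job_run` (boto3 `emr-serverless` client), where every path is an S3 URI; bucket and key names are illustrative:

```python
# Sketch of an EMR Serverless jobDriver. There is no cluster-local
# filesystem to stage files on, so the entry point and the files it
# opens should live in S3. Paths below are hypothetical.

job_driver = {
    "sparkSubmit": {
        "entryPoint": "s3://my-bucket/jobs/spark_main.py",      # hypothetical path
        "entryPointArguments": ["--input", "s3://my-bucket/input/"],
        "sparkSubmitParameters": "--conf spark.executor.memory=4g",
    }
}
```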
I am running a Scala Spark job on EMR Serverless and am trying to pass a PostgreSQL JDBC connector JAR through Spark's --driver-class-path config. This is how my spark-submit configs look:
```
--class...
```
2 answers · 0 votes · 67 views · asked 20 days ago
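On serverless workers there is no local path for `--driver-class-path` to point at, so one common approach is to ship the JAR from S3 with `--jars`, which distributes it to the driver and executors. A hedged sketch; the JAR location, version, and main class are all illustrative:

```python
# Sketch: passing a PostgreSQL JDBC JAR to an EMR Serverless Spark job
# via --jars from S3 instead of a local --driver-class-path.

jar = "s3://my-bucket/jars/postgresql-42.6.0.jar"   # hypothetical location/version
spark_submit_parameters = " ".join([
    "--class com.example.MyJob",                    # hypothetical main class
    f"--jars {jar}",
    "--conf spark.executor.cores=4",
])
```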
I'm running an EMR Serverless Spark job that uses Delta OSS to handle Delta tables. I previously resolved a configuration issue with EMR Serverless and AWS Glue Data Catalog...
1 answer · 0 votes · 242 views · asked 23 days ago
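For context on Delta OSS jobs like this one, a sketch of the Spark confs commonly paired with open-source Delta Lake. The package coordinate and version are assumptions and must match the job's Spark/Scala build:

```python
# Sketch: Spark confs for open-source Delta Lake (Delta OSS).
# The delta-core version below is an assumption, not a recommendation.

delta_confs = {
    "spark.jars.packages": "io.delta:delta-core_2.12:2.4.0",  # version is an assumption
    "spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension",
    "spark.sql.catalog.spark_catalog": "org.apache.spark.sql.delta.catalog.DeltaCatalog",
}
spark_submit_parameters = " ".join(f"--conf {k}={v}" for k, v in delta_confs.items())
```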
Hi all,
I have shared a Glue table (S3) with another account, and I can already query it there via Athena.
Now I have added Lake Formation permissions for the database and table to the role that I am using...
1 answer · 0 votes · 116 views · asked 25 days ago
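For reference on cross-account grants like the one described, a sketch of the boto3 Lake Formation `grant_permissions` request shape. Account IDs, role ARN, and names are illustrative, and whether this alone resolves a cross-account setup depends on how the share and resource links are configured:

```python
# Sketch: granting SELECT on a shared Glue table to a job role via
# Lake Formation. All identifiers below are hypothetical.

grant_kwargs = {
    "Principal": {"DataLakePrincipalIdentifier": "arn:aws:iam::111122223333:role/emr-job-role"},
    "Resource": {
        "Table": {
            "CatalogId": "444455556666",   # the sharing (producer) account
            "DatabaseName": "shared_db",
            "Name": "shared_table",
        }
    },
    "Permissions": ["SELECT"],
}
# boto3.client("lakeformation").grant_permissions(**grant_kwargs)  # requires AWS credentials
```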
I want to use AWS Glue Data Catalog as a metastore. I'm running an EMR Serverless job that inserts and updates data in a Delta Table. I've successfully populated Delta tables on my localhost...
2 answers · 0 votes · 119 views · asked 25 days ago
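For jobs that use Glue Data Catalog as the metastore together with Delta tables, a sketch of the confs typically involved: the Glue Hive client factory plus the Delta session extension and catalog. Treat the exact combination as an assumption to verify against the EMR Serverless docs for your release:

```python
# Sketch: point Spark's Hive metastore client at the AWS Glue Data Catalog
# and enable Delta Lake support in the same job.

glue_delta_confs = {
    "spark.hadoop.hive.metastore.client.factory.class":
        "com.amazonaws.glue.catalog.metastore.AWSGlueDataCatalogHiveClientFactory",
    "spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension",
    "spark.sql.catalog.spark_catalog": "org.apache.spark.sql.delta.catalog.DeltaCatalog",
}
spark_submit_parameters = " ".join(f"--conf {k}={v}" for k, v in glue_delta_confs.items())
```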
Hi Team,
All the examples I am seeing pass application configuration overrides as a dictionary. How do I pass this value from an S3 path? This configuration will be pushed from Airflow.
1 answer · 0 votes · 382 views · asked a month ago
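Since the operator expects a dict rather than an S3 URI, one workaround is to download the JSON in the DAG and pass the parsed result. A sketch with the parsing kept pure so it is testable; the S3 fetch and operator wiring are shown commented out, with illustrative bucket and key names:

```python
# Sketch: load configuration overrides from an S3-hosted JSON document
# before handing them to an EMR Serverless job submission.

import json

def parse_overrides(raw: bytes) -> dict:
    """Parse a configuration-overrides JSON document into a dict."""
    return json.loads(raw.decode("utf-8"))

# In the DAG (requires AWS credentials; bucket/key are illustrative):
# import boto3
# body = boto3.client("s3").get_object(
#     Bucket="my-bucket", Key="conf/overrides.json")["Body"].read()
# overrides = parse_overrides(body)
# EmrServerlessStartJobOperator(..., configuration_overrides=overrides)

sample = b'{"monitoringConfiguration": {"s3MonitoringConfiguration": {"logUri": "s3://my-bucket/logs/"}}}'
overrides = parse_overrides(sample)
```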
Hi Team,
We are invoking our EMR-S jobs using an Airflow EMR job submit operator. The EMR application is configured with a set of Spark default runtime configurations. While invoking the job from...
2 answers · 0 votes · 406 views · asked a month ago
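On EMR Serverless, confs supplied at job submission generally override the application-level defaults for the keys they name (worth verifying for your release). Merging explicitly in the DAG makes the effective set visible before submission; a minimal sketch with illustrative values:

```python
# Sketch: merge application-level Spark defaults with per-job overrides,
# with the per-job values winning, then render sparkSubmitParameters.

app_defaults = {"spark.executor.memory": "4g", "spark.executor.cores": "2"}
job_overrides = {"spark.executor.memory": "8g"}   # illustrative per-job value

effective = {**app_defaults, **job_overrides}     # job values win on conflict
spark_submit_parameters = " ".join(f"--conf {k}={v}" for k, v in effective.items())
```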
Let me know if this is something AWS EMR Studio does:
1. In Databricks Community Edition and in Google Colab, one can fire up a simple Jupyter notebook with an automatically started cluster (small...
1 answer · 0 votes · 390 views · asked a month ago
While running the serverless job run, I am getting the error below:
"Number of cores specified by 'spark.driver.cores '7' is invalid".
2 answers · 0 votes · 450 views · asked 2 months ago
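The error suggests EMR Serverless only accepts a fixed set of values for `spark.driver.cores`. A sketch of a pre-submission check; the allowed set below (1, 2, 4, 8, 16) is an assumption inferred from the question and should be confirmed against the current worker-configuration docs:

```python
# Sketch: validate spark.driver.cores before submitting, so a value like 7
# is caught locally. ALLOWED_DRIVER_CORES is an assumed set, not authoritative.

ALLOWED_DRIVER_CORES = {1, 2, 4, 8, 16}   # assumption; check the EMR Serverless docs

def valid_driver_cores(n: int) -> bool:
    """Return True if n is an accepted driver core count."""
    return n in ALLOWED_DRIVER_CORES
```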
I am executing a PySpark script on EMR Serverless, and I am using AWS Step Functions for creating the EMR application and submitting jobs to it. I want to know how to add a logging option in AWS Step Functions...
1 answer · 0 votes · 482 views · asked 2 months ago
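For logging from a Step Functions-driven submission, the `StartJobRun` parameters can carry a monitoring configuration that sends driver and executor logs to S3. A sketch of such a task state, written here as a Python dict for readability; the application ID, role ARN, and paths are illustrative:

```python
# Sketch: a Step Functions task state that starts an EMR Serverless job
# with S3 log delivery configured. All identifiers are hypothetical.

state = {
    "Type": "Task",
    "Resource": "arn:aws:states:::emrserverless:startJobRun.sync",
    "Parameters": {
        "ApplicationId": "00example",                                  # illustrative
        "ExecutionRoleArn": "arn:aws:iam::111122223333:role/emr-job-role",
        "JobDriver": {"SparkSubmit": {"EntryPoint": "s3://my-bucket/jobs/main.py"}},
        "ConfigurationOverrides": {
            "MonitoringConfiguration": {
                "S3MonitoringConfiguration": {"LogUri": "s3://my-bucket/emr-logs/"}
            }
        },
    },
    "End": True,
}
```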
I have a Serverless EMR application, and I am submitting a Spark job via a Python script. I have packaged all the dependencies and the script to an S3 bucket. When I execute the job, the Spark job is...
2 answers · 0 votes · 484 views · asked 2 months ago
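For shipping Python dependencies to EMR Serverless, one documented pattern is a packed virtualenv passed via `spark.archives`, with confs pointing the interpreter at the unpacked environment. A sketch; the archive path and names are illustrative, and the exact conf keys should be checked against the current docs:

```python
# Sketch: point PySpark at a packed virtualenv shipped from S3.
# "environment" is the unpack alias after the '#' in spark.archives.

spark_submit_parameters = " ".join([
    "--conf spark.archives=s3://my-bucket/pyspark_env.tar.gz#environment",
    "--conf spark.emr-serverless.driverEnv.PYSPARK_DRIVER_PYTHON=./environment/bin/python",
    "--conf spark.emr-serverless.driverEnv.PYSPARK_PYTHON=./environment/bin/python",
    "--conf spark.executorEnv.PYSPARK_PYTHON=./environment/bin/python",
])
```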