log4j2 with AWS EMR and Spark


I have a Spark application that logs output to the console with:

logger.info("Hello world")

My problem is that the log output is nowhere to be found after running the Spark app on an EMR cluster.
When I create a cluster and submit a Spark job, I specify the LogUri (an S3 bucket), but I get nothing.

I submit my Spark job with the RunJobFlow API: https://docs.aws.amazon.com/aws-sdk-php/v3/api/api-elasticmapreduce-2009-03-31.html#runjobflow
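For context, Spark only picks up a custom Log4j configuration if the file is shipped with the job and pointed at explicitly. A hedged sketch of the usual spark-submit flags (the file and application names here are assumptions, not from this thread):

```sh
# Sketch: ship a log4j config with the job and tell driver/executors to use it.
spark-submit \
  --files log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --class com.example.MyApp \
  my-app.jar
```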

stderr for the Spark job shows:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/spark/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
ERROR StatusLogger Log4j2 could not find a logging implementation. Please add log4j-core to the classpath. Using SimpleLogger to log to the console...
20/03/07 19:20:39 INFO ShutdownHookManager: Shutdown hook called
20/03/07 19:20:39 INFO ShutdownHookManager: Deleting directory /mnt/tmp/spark-xxxxxxxx
Command exiting with ret '0'

How do I solve this?
Attached is my log4j2.properties file (renamed for uploading to this discussion).

Edited by: ErickN on Mar 7, 2020 12:02 PM

ErickN
asked 4 years ago · 1512 views

1 Answer

You must use Log4j version 1; that is the version supported by Spark and EMR. The stderr above confirms it: the classpath carries the slf4j-log4j12 (Log4j 1.2) binding, while Log4j 2's StatusLogger reports that log4j-core is missing, so calls made through the Log4j 2 API fall back to SimpleLogger instead of reaching your configured appenders.
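As a sketch, a minimal Log4j 1.x `log4j.properties` (the appender choices and pattern are assumptions, modeled on Spark's default, not the poster's actual file) that writes application logs to the console, which EMR then captures into the stderr file shipped to the LogUri bucket:

```properties
# Sketch of a minimal log4j 1.x configuration (not the poster's actual file).
# Root logger at INFO, routed to a single console appender.
log4j.rootLogger=INFO, console

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

With a Log4j 1.x config in place, anything your application logs through SLF4J (or the Log4j 1.x API directly) lands in the step's stderr, which EMR copies to the LogUri bucket a few minutes after the step finishes.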

Edited by: ErickN on Mar 13, 2020 7:38 AM

ErickN
answered 4 years ago
