1 Answer
Hi there,
As per the documentation for the PySparkProcessor class's run method, the logs parameter is defined as follows: "Whether to show the logs produced by the job. Only meaningful when wait is True (default: True)" [1].
In your case, I see you have set wait to False. Could you please confirm whether you see the logs when you set the wait parameter to True? For example:
spark_processor.run(
    submit_app="src/preprocess.py",
    arguments=["--s3_output_path", save_location,
               ...],
    spark_event_logs_s3_uri=s3_log_location,
    logs=True,
    wait=True,
)
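For reference, the processor in the snippet above might be constructed along these lines. This is a minimal sketch only: the framework_version, base_job_name, instance settings, and the role variable are assumptions, not values taken from your setup.

from sagemaker.spark.processing import PySparkProcessor

# Minimal sketch; adjust the Spark version and instance settings
# to match your environment.
spark_processor = PySparkProcessor(
    base_job_name="sm-spark",      # assumed job name prefix
    framework_version="3.1",       # assumed Spark framework version
    role=role,                     # assumed: an IAM role ARN defined elsewhere
    instance_count=2,
    instance_type="ml.m5.xlarge",
)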
Reference
[1] SageMaker Python SDK API reference, PySparkProcessor: https://sagemaker.readthedocs.io/en/stable/api/training/processing.html
Thanks for the help. Yes, I tried setting 'logs' and 'wait' to 'True'. I am still unable to find my debug statements. Below is a snippet from the submitted job script. Any other suggestions? Thanks.
import logging

def main():
    print("*** in main", flush=True)
    logging.info("*** in main info")
    logging.debug("*** in main debug")
    ...

if __name__ == "__main__":
    main()
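One thing worth checking, although it is not confirmed anywhere in this thread: Python's root logger defaults to the WARNING level, so logging.info and logging.debug calls are silently dropped unless the level is lowered. A minimal sketch of configuring the level at the top of the job script, assuming nothing else in the script already configures logging:

import logging
import sys

# Assumed addition: without this, the root logger stays at WARNING and
# logging.info / logging.debug messages never reach stdout (and hence
# never reach the job's CloudWatch log stream).
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)

def main():
    print("*** in main", flush=True)    # always shows (plain stdout)
    logging.info("*** in main info")    # shows only at level <= INFO
    logging.debug("*** in main debug")  # shows only at level <= DEBUG

if __name__ == "__main__":
    main()

If the print statement appears in the logs but the logging calls do not, the logger level is the likely culprit.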
Below are the places where I am looking for the debug statements.