1 Answer
Hi,
The standard logging mechanisms (general, slow, and audit logs) do not capture query results. Since RDS and Aurora do not provide access to the underlying host, files written locally on the instance cannot be retrieved, so you will need to write the output externally, to a service such as Amazon S3. A few options:
- Call the stored procedure from a local machine or from Lambda and write the execution details to a log file. The article "Configure a Lambda function to query the database and write the output to a log file" has more on this. This might be the best option for your use case, since you want to log more than just the result of the execution; a sketch of this approach follows the list.
- If you only need the output of the procedure, you can write it into a table and set up a pipeline as described in "Export MySQL Data to Amazon S3 Using AWS Data Pipeline".
- For Aurora MySQL, you can add SELECT INTO OUTFILE S3 to the procedure and save the result directly into text files stored in an Amazon S3 bucket (an example statement also follows the list). Note that this option does not support Aurora Serverless or RDS for MySQL.
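
Here is a minimal sketch of the Lambda option, assuming the pymysql driver is packaged with the function (or provided in a layer) and the function's role can write to the bucket. The endpoint, credentials, bucket, and procedure name (`my_proc`) are all placeholders, not values from the question:

```python
# Minimal sketch: call a stored procedure from Lambda and write its
# result set to S3 as a log file. All identifiers below are placeholders.
import json

import boto3
import pymysql  # must be packaged with the function or in a layer

S3_BUCKET = "my-log-bucket"  # hypothetical bucket


def lambda_handler(event, context):
    conn = pymysql.connect(
        host="mydb.cluster-xxxx.us-east-1.rds.amazonaws.com",  # placeholder
        user="admin",
        password="...",  # in practice, read from Secrets Manager
        database="mydb",
        cursorclass=pymysql.cursors.DictCursor,
    )
    try:
        with conn.cursor() as cur:
            cur.callproc("my_proc")  # hypothetical procedure name
            rows = cur.fetchall()    # capture the procedure's result set
        conn.commit()
        # Persist the captured output externally, since local files on
        # RDS/Aurora are not accessible.
        boto3.client("s3").put_object(
            Bucket=S3_BUCKET,
            Key=f"proc-logs/{context.aws_request_id}.json",
            Body=json.dumps(rows, default=str),
        )
        return {"status": "ok", "rows": len(rows)}
    finally:
        conn.close()
```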
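
And a sketch of the SELECT INTO OUTFILE S3 option, assuming the Aurora MySQL cluster has an associated IAM role (set via the aurora_select_into_s3_role or aws_default_s3_role cluster parameter) that allows writing to the bucket. The proc_results table, endpoint, and bucket path are hypothetical; the same statement can be embedded in the stored procedure itself:

```python
# Minimal sketch: export a result table straight to S3 from Aurora MySQL.
# Table, endpoint, and bucket path are placeholders.
import pymysql

conn = pymysql.connect(
    host="mydb.cluster-xxxx.us-east-1.rds.amazonaws.com",  # placeholder
    user="admin",
    password="...",
    database="mydb",
)
try:
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT * FROM proc_results
            INTO OUTFILE S3 's3://my-log-bucket/proc-output/run1'
            FORMAT CSV HEADER
            """
        )
finally:
    conn.close()
```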
Answered 2 years ago
Hi, thank you for your answer. In fact, I call my stored procedure from my AWS Glue PySpark job, not from a local machine. Is there a way to capture the result of the stored procedure's execution from within the Glue job? The issue I'm facing now is that my Glue job loads the data and then calls the stored procedure; the stored procedure fails, but the Glue job is still marked as succeeded.
I'm looking for a way to know, from within my Glue job, whether or not the stored procedure ran successfully, so that I can pass or fail the job accordingly and avoid the scenario where the stored procedure fails but the job itself is marked as succeeded. Thank you
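
One way to surface the failure from within the Glue job is to call the procedure explicitly with a Python driver and let any error propagate: an uncaught exception marks the Glue job run as failed. A minimal sketch, assuming pymysql is made available via the --additional-python-modules job parameter and the job's Glue connection gives network access to the cluster; the endpoint, credentials, and my_proc are placeholders:

```python
# Minimal sketch for a Glue PySpark job: call the procedure directly
# and fail the job run if it errors. Connection details are placeholders.
import pymysql


def call_proc_or_fail():
    conn = pymysql.connect(
        host="mydb.cluster-xxxx.us-east-1.rds.amazonaws.com",  # placeholder
        user="admin",
        password="...",  # in practice, read from Secrets Manager
        database="mydb",
    )
    try:
        with conn.cursor() as cur:
            cur.callproc("my_proc")  # hypothetical procedure name
        conn.commit()
    except pymysql.MySQLError as exc:
        # Re-raise so the uncaught exception marks the job run FAILED
        # instead of letting it finish as SUCCEEDED.
        raise RuntimeError(f"stored procedure failed: {exc}") from exc
    finally:
        conn.close()


# ... after the Spark load step ...
call_proc_or_fail()
```

Note that if the procedure traps its own errors internally, nothing will raise on the client side; in that case, have the procedure SIGNAL SQLSTATE '45000' on failure, or write a status row that the job checks after the call.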