How to terminate aws_s3.table_import_from_s3 calls?

1

We've been using aws_s3.table_import_from_s3 for about 15 months with great success to load thousands of S3 files. About a month ago we started seeing a few calls to aws_s3.table_import_from_s3 that never finish. The Java service that was running the command has been stopped, but we are unable to terminate the calls using pg_cancel_backend() or pg_terminate_backend(). We've checked pg_stat_activity for these queries: they are listed as active and are not blocked.
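This is roughly what we run to find the sessions and attempt to cancel/terminate them (the filter on the query text and the exclusion of our own backend are illustrative, not exact):

```sql
-- Find backends still running the import
SELECT pid, state, wait_event_type, wait_event, query_start, query
FROM pg_stat_activity
WHERE query ILIKE '%table_import_from_s3%'
  AND state = 'active'
  AND pid <> pg_backend_pid();

-- Ask each backend to cancel its current query, then try a full terminate
SELECT pg_cancel_backend(pid)
FROM pg_stat_activity
WHERE query ILIKE '%table_import_from_s3%'
  AND state = 'active'
  AND pid <> pg_backend_pid();

SELECT pg_terminate_backend(pid)
FROM pg_stat_activity
WHERE query ILIKE '%table_import_from_s3%'
  AND state = 'active'
  AND pid <> pg_backend_pid();
```

Both calls return true, but the sessions remain in pg_stat_activity.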

Anyone know how to terminate/kill these aws_s3.table_import_from_s3 calls?

RDS Postgres 14.7

stevem
asked 2 months ago · 403 views
1 Answer
0

Hi,

1) Check for Locks: Even if the queries are not blocked, they might still be holding locks that prevent them from being canceled. You can check for locks using the pg_locks view. Look for any locks held by the backend processes associated with the problematic queries and try to release them if possible.
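For example, a query along these lines lists the locks those backends hold or are waiting on (the pid list is a placeholder for the pids you found in pg_stat_activity):

```sql
-- Locks held or awaited by the stuck backends (replace the pid list with yours)
SELECT l.pid, l.locktype, l.mode, l.granted,
       l.relation::regclass AS relation, a.query
FROM pg_locks l
JOIN pg_stat_activity a ON a.pid = l.pid
WHERE l.pid IN (12345, 12346)   -- hypothetical pids of the stuck imports
ORDER BY l.pid, l.granted;
```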

2) Monitor Resource Usage: The queries might be stuck due to resource contention or high resource usage. Monitor the system resources (CPU, memory, disk I/O) to see if there are any spikes or unusual patterns when these queries are running, and address any resource bottlenecks that you identify.
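As a quick first check from inside the database, the wait_event columns in pg_stat_activity show what each backend is currently waiting on (I/O, IPC, a client, etc.); the pid filter below is again a placeholder:

```sql
-- What are the stuck backends waiting on right now, and for how long?
SELECT pid, state, wait_event_type, wait_event,
       now() - query_start AS running_for
FROM pg_stat_activity
WHERE pid IN (12345, 12346);  -- hypothetical pids of the stuck imports
```
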
3) Review AWS S3 Access: Ensure that there are no issues with accessing the S3 files and that there have been no changes to the AWS credentials or permissions that could be causing the import process to hang.
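One way to confirm that the database can still reach S3 with its current role/credentials is to run a small test import against a known-good object; the bucket, key, region, and staging table below are placeholders. If this also hangs, the problem is likely credentials or networking rather than the individual files:

```sql
-- Hypothetical staging table and S3 object; point these at a small known-good file
CREATE TABLE IF NOT EXISTS s3_import_smoke_test (line text);

SELECT aws_s3.table_import_from_s3(
    's3_import_smoke_test',                                    -- target table
    '',                                                        -- column list ('' = all columns)
    '(format text)',                                           -- COPY options
    aws_commons.create_s3_uri('my-bucket', 'test/ping.txt', 'us-east-1')
);
```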

4) Update AWS SDK: If you're using an AWS SDK to interact with S3, make sure it's up to date. Older versions of SDKs can have bugs or compatibility issues that cause unexpected behavior.

5) Check Logs and Monitoring: Review the database logs and monitoring metrics to see if there are any error messages or warnings associated with the import process. This can help pinpoint the cause of the issue.

6) Database Restart: As a last resort, if you've exhausted all other options and the queries still can't be terminated, you may need to restart the PostgreSQL database. However, be cautious with this approach, as it will interrupt all active connections and may cause downtime for your application.

answered 2 months ago
EXPERT
reviewed a month ago
