60K updates to RDS Postgres - performance challenge

A customer wants to process approximately 60,000 updates to a table in their RDS Postgres instance and is seeing performance issues in testing, meaning the full update would take upwards of 30 hours to complete. The updates read usernames (email addresses) from a single table, change them from mixed case to lower case, and write them back to the database.

The update process is a bash script running in a container, and each update takes a few minutes to process. The RDS instance is PostgreSQL (not Aurora) running on a db.m5.large instance class.
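For illustration, a per-row approach typically issues a statement shaped like the one below once per row (the table and column names here are guesses, not the customer's actual schema). At 60,000 rows, with each update paying its own round trip and synchronous commit, that overhead can dominate the runtime:

-- Hypothetical per-row statement; users, username, and user_id are assumed names.
UPDATE users SET username = lower(username) WHERE user_id = 12345;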

Whilst I investigate with the customer (getting them to look at metrics for the RDS instance, etc.), I am looking for thoughts on likely bottlenecks. One obvious option is to increase the instance size for the duration of the update and scale it back down once completed.

Any other thoughts on Postgres tuning that might help?

1 Answer
Accepted Answer

Since the customer is updating a value in a JSON doc, just use the built-in PostgreSQL JSON functions:

postgres=> CREATE TABLE test_json (a int, b jsonb);
CREATE TABLE
postgres=> INSERT INTO test_json 
postgres-> VALUES (1, '{"key1": "abc"}'), (2, '{"key1": "xyz"}');
INSERT 0 2
postgres=> UPDATE test_json
postgres->    SET b = jsonb_set(b, '{key1}', (upper((b->'key1')::text))::jsonb);
UPDATE 2
postgres=> SELECT * FROM test_json;
 a |        b        
---+-----------------
 1 | {"key1": "ABC"}
 2 | {"key1": "XYZ"}
(2 rows)
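
The question itself asks for lower case in a plain username column, so if the values are not actually inside JSON, the set-based equivalent is a single statement. This is a sketch with assumed names (users and username are hypothetical; substitute the real table and column):

-- Hypothetical names: replace users/username with the real table and column.
UPDATE users
   SET username = lower(username)
 WHERE username <> lower(username);  -- skip rows that are already lower case

A single set-based UPDATE over 60,000 rows typically completes in seconds rather than hours, because it avoids 60,000 separate client round trips and per-row commits.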
AWS
answered 4 years ago
