Upsert operation on Postgres or Aurora from an S3 CSV file using aws_s3.table_import_from_s3


Importing tens of thousands of rows into a Postgres table from S3 is awesome and incredibly fast, but is there any way to perform upserts using the same process without creating staging tables?

Ideally it would be nice to have the equivalent of the ON CONFLICT clause and perform an update if the PK is already present, but is there anything simpler than a staging + merge operation? At the moment it looks like it needs:

  • Create temporary table
  • aws_s3.table_import_from_s3 into temporary table
  • MERGE operation into main table
  • Clear temporary table

and we need to do this for all 20 tables we're merging... Open to any suggestions
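For concreteness, the per-table cycle looks roughly like the sketch below. The table, column, bucket, and key names (orders, order_id, my-bucket, orders.csv) are placeholders, and it assumes the target table has a primary key for ON CONFLICT to latch onto:

```sql
-- Rough sketch of the staging + upsert cycle for one table.
-- All object names here are placeholders.

-- 1. Staging table matching the target's shape.
CREATE TEMP TABLE staging_orders (LIKE orders INCLUDING DEFAULTS);

-- 2. Bulk-load the CSV from S3 into the staging table.
SELECT aws_s3.table_import_from_s3(
    'staging_orders',
    '',                               -- empty column list = all columns
    '(format csv, header true)',
    aws_commons.create_s3_uri('my-bucket', 'orders.csv', 'us-east-1')
);

-- 3. Upsert into the real table: insert new rows, update on PK conflict.
INSERT INTO orders
SELECT * FROM staging_orders
ON CONFLICT (order_id) DO UPDATE
    SET amount     = EXCLUDED.amount,
        updated_at = EXCLUDED.updated_at;

-- 4. Clean up (temp tables also vanish at session end anyway).
DROP TABLE staging_orders;
```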

Notitia
Asked 2 years ago · Viewed 1,383 times
2 Answers

Your current architecture sounds standard (and good!).

Part of the reason the CSV import is so fast is that it only handles a limited set of operations. I'm pretty sure there's no way to express the ON CONFLICT idea there.

Answered 2 years ago

Thanks for the feedback.

A nice simplification would be a way to parse the JSON structure in the S3 bucket directly as part of the import. Currently I have to split the JSON into individual files, one per 'table', for the import process, and then write a custom stored procedure to do the merge from each bucket.

First world problems...
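For illustration, a generic version of such a per-table procedure might look something like the sketch below. This is not the poster's actual code: the bucket name, region, and the assumption that targets live in the public schema are all hypothetical, and it builds the ON CONFLICT update list dynamically from the catalog so the same procedure can serve all 20 tables:

```sql
-- Hypothetical sketch: one procedure driving the whole import + upsert
-- cycle, parameterised by target table, PK column, and S3 key.
CREATE OR REPLACE PROCEDURE upsert_from_s3(
    target_table text,
    pk_column    text,
    s3_key       text
)
LANGUAGE plpgsql
AS $$
DECLARE
    staging    text := 'staging_' || target_table;
    set_clause text;
BEGIN
    -- Build "col = EXCLUDED.col" for every non-key column of the target.
    -- Assumes the target table lives in the public schema.
    SELECT string_agg(format('%I = EXCLUDED.%I', column_name, column_name), ', ')
      INTO set_clause
      FROM information_schema.columns
     WHERE table_schema = 'public'
       AND table_name   = target_table
       AND column_name <> pk_column;

    EXECUTE format('CREATE TEMP TABLE %I (LIKE %I INCLUDING DEFAULTS)',
                   staging, target_table);

    -- Bucket and region are placeholders.
    PERFORM aws_s3.table_import_from_s3(
        staging, '', '(format csv, header true)',
        aws_commons.create_s3_uri('my-bucket', s3_key, 'us-east-1'));

    EXECUTE format(
        'INSERT INTO %I SELECT * FROM %I ON CONFLICT (%I) DO UPDATE SET %s',
        target_table, staging, pk_column, set_clause);

    EXECUTE format('DROP TABLE %I', staging);
END;
$$;
```

Called once per table, e.g. `CALL upsert_from_s3('orders', 'order_id', 'exports/orders.csv');`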

Notitia
Answered 2 years ago
