Upsert operation on PostgreSQL or Aurora from S3 CSV file using aws_s3.table_import_from_s3


Importing tens of thousands of rows into a PostgreSQL table from S3 is awesome and incredibly fast, but is there any way to perform upserts using the same process without creating staging tables?

Ideally it would be nice to have the equivalent of the ON CONFLICT clause, performing an update when the primary key is already present. But is there anything simpler than a staging-plus-merge operation? At the moment it looks like each import needs:

  • Create temporary table
  • aws_s3.table_import_from_s3 into temporary table
  • MERGE operation into main table
  • Clear temporary table

and it needs this for all 20 tables we're merging... Open to any suggestions.
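For reference, the staging-plus-merge flow above can be sketched in SQL. The table, columns, bucket, and key below are hypothetical, and on PostgreSQL 9.5+ an INSERT ... ON CONFLICT can stand in for a full MERGE (which requires PostgreSQL 15+):

```sql
-- 1. Create a temporary staging table matching the target (hypothetical "orders" table)
CREATE TEMP TABLE orders_staging (LIKE orders INCLUDING DEFAULTS);

-- 2. Bulk-load the CSV from S3 into the staging table
SELECT aws_s3.table_import_from_s3(
    'orders_staging', '', '(FORMAT csv, HEADER true)',
    aws_commons.create_s3_uri('my-bucket', 'orders.csv', 'us-east-1')
);

-- 3. Upsert into the main table; updates rows whose PK already exists
INSERT INTO orders (id, customer_id, amount)
SELECT id, customer_id, amount FROM orders_staging
ON CONFLICT (id) DO UPDATE
    SET customer_id = EXCLUDED.customer_id,
        amount      = EXCLUDED.amount;

-- 4. Temp table is dropped at session end; truncate it if the session is reused
TRUNCATE orders_staging;
```

Wrapping these four statements in a stored procedure parameterized by table name keeps the 20-table case manageable, though each table still needs its own column list.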

Notitia
asked 2 years ago · 1383 views
2 Answers

Your current architecture sounds standard (and good!).

Part of the reason that CSV import is so fast is that it handles only a limited set of operations. I'm pretty sure there's no way to express the ON CONFLICT idea there.

answered 2 years ago

Thanks for the feedback.

A nice simplification would be a way to parse the JSON structure in the S3 bucket directly as part of the import. Currently I have to split the JSON into individual files for each 'table' before the import, and then write a custom stored procedure to do the merge from each file.
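That pre-import splitting step can be done with a small helper. This is a sketch that assumes a hypothetical JSON shape where each top-level key is a table name mapping to a list of row objects; the resulting CSV strings would then be uploaded to S3 for `aws_s3.table_import_from_s3`:

```python
import csv
import io
import json

def split_json_to_csvs(doc: str) -> dict:
    """Split a JSON document whose top-level keys are table names
    (each mapping to a list of row objects) into one CSV string per
    table, ready to upload to S3 for the per-table import."""
    tables = json.loads(doc)
    csvs = {}
    for table, rows in tables.items():
        if not rows:
            continue  # skip tables with no rows to import
        buf = io.StringIO()
        # Use the first row's keys as the CSV header
        writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
        csvs[table] = buf.getvalue()
    return csvs

# Example: two 'tables' arriving in a single JSON payload
doc = json.dumps({
    "orders":    [{"id": 1, "amount": 9.5}],
    "customers": [{"id": 7, "name": "Ada"}],
})
out = split_json_to_csvs(doc)
```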

First world problems...

Notitia
answered 2 years ago
