Upsert operation on PostgreSQL or Aurora from S3 CSV file using aws_s3.table_import_from_s3

1

Importing tens of thousands of rows into a PostgreSQL table from S3 is awesome and incredibly fast, but is there any way to perform upserts using the same process without creating staging tables?

Ideally it would be nice to have the equivalent of an ON CONFLICT clause and perform an update if the PK is already present. Is there anything simpler than a staging+merge operation? At the moment it looks like it needs:

  • Create temporary table
  • aws_s3.table_import_from_s3 into temporary table
  • MERGE operation into main table
  • Clear temporary table

and it needs this for all 20 tables we're merging... Open to any suggestions.
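For reference, the four steps above can be sketched for a single table roughly like this. The table (`items`), key column (`item_pk`), payload column, bucket, and file path are all hypothetical; the `aws_s3.table_import_from_s3` and `aws_commons.create_s3_uri` calls follow the documented extension API:

```sql
-- 1. Create a temporary staging table matching the target's shape
CREATE TEMPORARY TABLE items_stage (LIKE items INCLUDING DEFAULTS);

-- 2. Bulk-load the CSV from S3 into the staging table
SELECT aws_s3.table_import_from_s3(
  'items_stage',
  '',                            -- empty column list = import all columns
  '(format csv, header true)',
  aws_commons.create_s3_uri('my-bucket', 'exports/items.csv', 'us-east-1')
);

-- 3. Upsert into the main table (INSERT ... ON CONFLICT in place of MERGE)
INSERT INTO items
SELECT * FROM items_stage
ON CONFLICT (item_pk) DO UPDATE
  SET payload = EXCLUDED.payload;

-- 4. Clear the staging table
DROP TABLE items_stage;
```

Using `INSERT ... ON CONFLICT` instead of `MERGE` keeps the same staging step but avoids MERGE's version requirements (MERGE only arrived in PostgreSQL 15).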

Notitia
Asked 2 years ago · 1,383 views
2 Answers
0

Your current architecture sounds standard (and good!).

Part of the reason that CSV import is so fast is that it only handles a limited set of operations. I'm pretty sure there's no way to express the ON CONFLICT behaviour there.

Answered 2 years ago
0

Thanks for the feedback.

A nice simplification would be a way to parse the JSON structure in the S3 bucket directly as part of the import. At the moment I have to split the JSON into an individual file for each 'table', import each one, and then write a custom stored procedure to do the merge from each bucket.
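One way to avoid writing 20 near-identical stored procedures is a single generic helper that loops over the tables and builds the upsert dynamically. This is only a sketch: it assumes every table has a single-column primary key passed in alongside the table name, and a matching CSV at `exports/<table>.csv` in the bucket. Procedure name and parameters are made up for illustration:

```sql
-- Hypothetical generic stage+upsert helper. tables is an array of
-- (table_name, pk_column) pairs encoded as 'name:pk' strings.
CREATE OR REPLACE PROCEDURE upsert_from_s3(tables text[], bucket text, region text)
LANGUAGE plpgsql AS $$
DECLARE
  entry    text;
  tbl      text;
  pk       text;
  set_list text;
BEGIN
  FOREACH entry IN ARRAY tables LOOP
    tbl := split_part(entry, ':', 1);
    pk  := split_part(entry, ':', 2);

    -- Staging table shaped like the target
    EXECUTE format('CREATE TEMPORARY TABLE %I (LIKE %I INCLUDING DEFAULTS)',
                   tbl || '_stage', tbl);

    -- Load the CSV for this table from S3
    PERFORM aws_s3.table_import_from_s3(
      tbl || '_stage', '', '(format csv, header true)',
      aws_commons.create_s3_uri(bucket, 'exports/' || tbl || '.csv', region));

    -- Build "col = EXCLUDED.col" for every non-key column
    SELECT string_agg(format('%I = EXCLUDED.%I', column_name, column_name), ', ')
      INTO set_list
      FROM information_schema.columns
     WHERE table_name = tbl AND column_name <> pk;

    EXECUTE format(
      'INSERT INTO %I SELECT * FROM %I ON CONFLICT (%I) DO UPDATE SET %s',
      tbl, tbl || '_stage', pk, set_list);

    EXECUTE format('DROP TABLE %I', tbl || '_stage');
  END LOOP;
END $$;
```

Splitting the JSON into per-table CSVs would still happen outside the database, but the merge side collapses to one `CALL upsert_from_s3(...)` per batch.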

First world problems...

Notitia
Answered 2 years ago
