How to import Postgres data that has JSON columns?


Hi everyone,

We're interested in copying data from our Postgres database into a Redshift cluster; however, we use a couple of Redshift-unsupported column types in a few of our tables.

One is a bigserial (how are the rest of you handling that one?), and the really tough one to swallow is a column of JSON type.

That JSON column holds a lot of important data for us. How are others here dealing with the challenge of getting that data into Redshift? Any recommended approaches? Thanks much.

larryq
Asked 4 years ago · Viewed 383 times
2 Answers

Hi larryq,

I'll refer you to the documentation page that covers most of what you need: https://docs.aws.amazon.com/redshift/latest/dg/r_CREATE_TABLE_NEW.html

If the PostgreSQL DB is the true source for this data, then I recommend the following:

  1. BIGSERIAL -> BIGINT, as they are both INT8. If you source the rows from PostgreSQL, you don't need BIGSERIAL's auto-generated values in Redshift. Otherwise, Redshift has the IDENTITY column attribute to generate values.
  2. JSON -> VARCHAR. Of course, you'll have to use Redshift's JSON functions to do anything with the JSON data (see the sketch after this list).
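
To make that concrete, here's a minimal sketch of the Redshift side of that mapping. The table name (events), the column names, and the JSON path elements (user, email) are hypothetical placeholders for your own schema:

    CREATE TABLE events (
        event_id BIGINT NOT NULL,          -- was BIGSERIAL in PostgreSQL; values are copied over as-is
        -- event_id BIGINT IDENTITY(1,1),  -- alternative if Redshift should generate the values itself
        payload  VARCHAR(65535)            -- was JSON; stored as plain text (65535 bytes is the VARCHAR max)
    );

    -- Reading a nested field back out with one of Redshift's JSON functions:
    SELECT JSON_EXTRACT_PATH_TEXT(payload, 'user', 'email') AS user_email
    FROM events;

In practice you'd typically unload the rows from PostgreSQL to S3 and COPY them in; the JSON text just loads like any other character column.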

Regards,
-Kurt

klarson
Answered 4 years ago

Thanks Kurt, much appreciated.

larryq
Answered 4 years ago
