I'll refer you to the documentation page that has most of what you need: https://docs.aws.amazon.com/redshift/latest/dg/r_CREATE_TABLE_NEW.html
If the PostgreSQL database is the authoritative source for this data, then I recommend the following:
- BIGSERIAL -> BIGINT: both are INT8. If you source the rows from PostgreSQL, you don't need BIGSERIAL's automatic value generation in Redshift. If Redshift does need to generate values itself, it has the IDENTITY column attribute for that.
- JSON -> VARCHAR: of course, you'll have to use Redshift's JSON functions to do anything with the JSON data.
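To make the two mappings concrete, here is a minimal sketch of what the Redshift DDL might look like. The table and column names are hypothetical, and the IDENTITY clause is only needed if Redshift itself must generate new key values:

```sql
-- Hypothetical table illustrating the mappings above.
CREATE TABLE events (
    -- BIGSERIAL in PostgreSQL arrives as plain BIGINT values when copied in;
    -- add IDENTITY only if Redshift must generate new values itself.
    event_id BIGINT IDENTITY(1, 1),
    -- JSON in PostgreSQL is stored as VARCHAR text in Redshift
    -- (65535 is the maximum VARCHAR length).
    payload  VARCHAR(65535)
);

-- The JSON text is then queried with Redshift's JSON functions, e.g.:
SELECT JSON_EXTRACT_PATH_TEXT(payload, 'user', 'id') AS user_id
FROM events;
```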
Thanks Kurt, much appreciated.