I'm new to Redshift. Below is my process flow:
- Create CSV copies of the database (Postgres RDS) tables in S3.
- Create staging tables (for ETL purposes) in Redshift by running "create table" statements from a SQL client connected to Redshift.
- Move the data from S3 to Redshift using the COPY command (a sketch of this step is shown after this list).
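For reference, a minimal sketch of the load step, assuming the CSV file was exported from Postgres (e.g., with psql's \copy) and uploaded to S3; the bucket path and IAM role ARN below are placeholders, not values from the original post:

```sql
-- Load an S3 CSV export into a Redshift staging table.
-- The S3 path and IAM role ARN are hypothetical; substitute your own.
COPY staging_table
FROM 's3://my-etl-bucket/exports/current_table.csv'
IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftCopyRole'
CSV
IGNOREHEADER 1;
```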
Problem:
My staging tables (which are dropped after the ETL process, along with the other staging tables) have the same schema as the source tables in RDS. Every time I build a new staging table I have to write out a long "create table" statement, which becomes frustrating when a table has hundreds of columns. Is there an easy way to copy the schema, or do I need to change my current process to make this easier?
We use Redshift's CREATE TABLE ... LIKE command. It looks like this:

create table staging_table (like current_table);

It has one shortcoming: it doesn't inherit the primary key and foreign key attributes of current_table, but we are okay living with that.
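Putting it together, a minimal sketch of the staging-table lifecycle under this approach (table names are placeholders; Redshift's LIKE clause also accepts INCLUDING DEFAULTS if you want column default expressions carried over):

```sql
-- Clone the source table's column definitions, encodings, and
-- distribution/sort key attributes into a fresh staging table.
CREATE TABLE staging_table (LIKE current_table);

-- ... run the ETL load/transform steps against staging_table ...

-- Staging tables are dropped once the ETL run finishes.
DROP TABLE staging_table;
```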
Look at the documentation for more details: http://docs.aws.amazon.com/redshift/latest/dg/performing-a-deep-copy.html