Is there a way to provide multiple credentials, or to specify another s3:// path, for a Redshift COPY to load a jsonpaths file?
I have read-only access to a bucket outside of my control, `someone_elses_bucket` in this example. The Redshift documentation states there are two options for JSON: 'auto' or an s3 path to a jsonpaths file:
    COPY example
    FROM 's3://someone_elses_bucket/data'
    CREDENTIALS 'aws_access_key_id=someone_elses_bucket_foo;aws_secret_access_key=someone_elses_bucket_bar'
    DATEFORMAT 'auto'
    TRUNCATECOLUMNS
    JSON 's3://my_bucket/redshift_json.json'
    GZIP;
Since I do not have access to `someone_elses_bucket`, I was forced to put the jsonpaths file in my own account, and I'm getting `S3ServiceException:Access Denied,Status 403,Error AccessDenied` errors. The file in my bucket is open to the public, so either I'm doing something wrong or COPY, using the IAM role of the other account, limits access. In either event I have no way to supply a jsonpaths file, which makes it impossible to ingest my formatted data; 'auto' does not work.
Here is my bucket policy, granting access to the account used for `someone_elses_bucket` (account number replaced for this example):
{ "version": "2012-10-17", "statement": [ { "sid": "example permissions", "effect": "allow", "principal": { "aws": "arn:aws:iam::123456789:user/jerdak" }, "action": [ "s3:getbucketlocation", "s3:listbucket" ], "resource": [ "arn:aws:s3:::my_bucket" ] } ] }
No, you can't supply more than one set of AWS credentials in a single COPY command. Whatever credentials you supply, Redshift will use them to access both the jsonpaths file and the S3 data, as annotated in the sketch below.
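
For reference, this is the question's own COPY command, annotated with the two S3 reads that the single set of keys has to satisfy (nothing here is new beyond the original command):

    -- The one CREDENTIALS clause is applied to both S3 reads below, so the IAM
    -- identity behind these keys must be able to read objects in both buckets.
    COPY example
    FROM 's3://someone_elses_bucket/data'        -- read 1: the data files
    CREDENTIALS 'aws_access_key_id=someone_elses_bucket_foo;aws_secret_access_key=someone_elses_bucket_bar'
    DATEFORMAT 'auto'
    TRUNCATECOLUMNS
    JSON 's3://my_bucket/redshift_json.json'     -- read 2: the jsonpaths file
    GZIP;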
The options I can think of:

- Do you access `someone_elses_bucket` as an IAM user? If so, grant that IAM user read access to your own bucket as well, and use its credentials in the COPY command, so the one set of keys can read both buckets.
- Run a pre-processing step that copies the data from `someone_elses_bucket` into your own bucket, then COPY using your own AWS credentials (see the sketch after this list).
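
A minimal sketch of the second option, assuming the source objects have first been staged into your own bucket; the staging prefix `s3://my_bucket/staging/data` and the placeholder keys are hypothetical:

    -- Pre-processing step (outside Redshift), e.g. with the AWS CLI:
    --   aws s3 cp --recursive s3://someone_elses_bucket/data s3://my_bucket/staging/data
    -- Once both the data and the jsonpaths file live in my_bucket, a single set
    -- of your own keys covers every S3 read the COPY performs.
    COPY example
    FROM 's3://my_bucket/staging/data'
    CREDENTIALS 'aws_access_key_id=<my_access_key_id>;aws_secret_access_key=<my_secret_access_key>'
    DATEFORMAT 'auto'
    TRUNCATECOLUMNS
    JSON 's3://my_bucket/redshift_json.json'
    GZIP;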