Unable to use table_import_from_s3 with any path containing a directory


I just upgraded from Postgres 11.8 to 12.4 and I cannot figure out how to call aws_s3.table_import_from_s3 on an object in a directory.
This works:

SELECT aws_s3.table_import_from_s3(
'my_table', '', '(format csv)',
aws_commons.create_s3_uri('ai-brace-dev-brace-athena-results',
'x.csv',
'us-east-2')
);

Whenever I have a directory in the path, like this:

SELECT aws_s3.table_import_from_s3(
'my_table', '', '(format csv)',
aws_commons.create_s3_uri('ai-brace-dev-brace-athena-results',
'6/x.csv',
'us-east-2')
);

I get this error:
ERROR: HTTP 403. Permission denied. Check bucket or provided credentials as they may no longer be valid.

The second command worked fine before 12.4. I have tried URL-encoding the filename, and nothing helps.
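For what it's worth, percent-encoding the key only changes the literal bytes of the key name; S3 matches object keys literally, so an encoded key names a different object. A minimal sketch (plain Python, no AWS access needed, using the key from the question above):

```python
from urllib.parse import quote

key = "6/x.csv"

# Percent-encode everything, including the slash.
encoded = quote(key, safe="")
print(encoded)  # 6%2Fx.csv

# "6%2Fx.csv" names a *different* S3 object than "6/x.csv",
# which is why URL-encoding the filename cannot work around a
# permissions/path-handling bug in the extension.
```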

Edited by: chrisgalardi on Oct 1, 2020 3:07 PM

10 Answers

https://forums.aws.amazon.com/thread.jspa?messageID=957689&#957689 - they had success with importing when the object is directly within the bucket; you might want to try that.

answered 4 years ago

We have the same problem using an AWS Aurora PostgreSQL-compatible DB with engine v. 11.8.

Some things we tried without success:

  1. URL-encoding characters other than letters and numbers in the file path.
  2. Using the Unicode escape U&'\002F' instead of the forward slash (/) character.
  3. Explicitly adding the "folder" in a VPC Endpoint policy as a "Resource".
    "arn:aws:s3:::<bucket>/<folder>/*"
  4. Using PostgreSQL dollar quoting ($$) instead of the single quote (') character around the file path string.
  5. Listing the bucket, file path, region as separate arguments to the table_import_from_s3() function.
  6. Using aws_commons.create_s3_uri(<bucket>,<file path>,<region>) as a single argument to table_import_from_s3().
  7. Placing a copy of the file in a new folder directly under the bucket "root" with a very simple 7-letter name.
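Regarding attempt 3 above, for anyone comparing policies: a minimal sketch (hypothetical bucket and folder names) of a policy statement whose Resource covers both the bucket itself (for listing) and every object under the folder:

```python
import json

# Hypothetical names -- substitute your own bucket and folder.
bucket = "my-bucket"
folder = "my-folder"

# Sketch of an S3 policy statement: the bare bucket ARN is needed for
# s3:ListBucket, and the "/<folder>/*" ARN for s3:GetObject on objects
# under that folder.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{bucket}",
                f"arn:aws:s3:::{bucket}/{folder}/*",
            ],
        }
    ],
}
print(json.dumps(policy, indent=2))
```

As the thread shows, even a policy of this shape did not help here, which points at the engine version rather than the policy.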

We can import data without any problem IF the file is at the "root" of the bucket.

answered 3 years ago

Wow. This is a really bad P1 bug. Any updates, AWS?

mjgp
answered 3 years ago

We're experiencing the same S3 permission-denied error after upgrading to 11.8.

answered 3 years ago

Has there been any update on this?
We are also encountering the same issue.

leahcim
answered 3 years ago

We started a new service that used an Aurora Postgres DB with engine_version 11.8.

We were unable to do an S3 import from subfolders regardless of how we set up the policies for our database IAM role. Importing from the root of the bucket worked.

We recreated the database with version 11.7 and now things work as expected. Not ideal if you have a production environment, though.

fredc
answered 3 years ago

My workaround was to copy the file to the root of the bucket and then perform the aws_s3.table_import_from_s3. Later, I deleted the file from the root.
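That copy-to-root workaround can be scripted. A minimal boto3 sketch (bucket and key names hypothetical; the copy/delete calls need real AWS credentials, so they are wrapped in a function and not executed here):

```python
def root_key(key: str) -> str:
    """Derive the object name at the bucket root from a nested key."""
    return key.rsplit("/", 1)[-1]

def copy_import_cleanup(bucket: str, key: str) -> None:
    """Sketch of the workaround: copy to root, import, then delete.
    Not run here -- requires AWS credentials and an S3 bucket."""
    import boto3  # imported lazily so the sketch loads without boto3

    s3 = boto3.client("s3")
    tmp = root_key(key)
    # 1. Copy the nested object to the bucket root.
    s3.copy_object(Bucket=bucket, Key=tmp,
                   CopySource={"Bucket": bucket, "Key": key})
    # 2. ... run aws_s3.table_import_from_s3 against `tmp` here ...
    # 3. Remove the temporary root-level copy.
    s3.delete_object(Bucket=bucket, Key=tmp)

print(root_key("6/x.csv"))  # x.csv
```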

answered 3 years ago

UPDATE (2021-01-11): Upgrading to the Aurora PostgreSQL-compatible engine v. 11.9 and ensuring that the path passed to aws_s3.table_import_from_s3() does not begin with a forward slash ("/") enabled us to successfully load data from .csv.gz files outside the bucket root (i.e., paths containing forward slashes) into Aurora DB tables.
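The leading-slash condition is easy to enforce before building the query. A trivial sketch (the helper name is my own):

```python
def normalize_s3_path(path: str) -> str:
    """Strip any leading forward slashes from an S3 key, per the fix
    described above: the import fails when the path starts with '/'."""
    return path.lstrip("/")

print(normalize_s3_path("/folder/sub/file.csv.gz"))  # folder/sub/file.csv.gz
print(normalize_s3_path("folder/sub/file.csv.gz"))   # already fine, unchanged
```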

answered 3 years ago

I updated my Postgres RDS instance to 11.9 and am still receiving the error: InternalError_: HTTP 403. Permission denied. Check bucket or provided credentials as they may no longer be valid.

I have double- and triple-checked that the IAM role and policy for this have been created properly.
The query I am using is this:

select aws_s3.table_import_from_s3('table1', '', '(format csv)', '<s3-bucket-name>', 'crunchbase/out/cb-20201008.csv', 'us-east-2');

I also have tried this version but get the same result:

select aws_s3.table_import_from_s3('table1', '', '(format csv)',
    aws_commons.create_s3_uri('<s3-bucket-name>', 'crunchbase/out/cb-20201008.csv', 'us-east-2')
);

I am not using Aurora, just a basic Postgres RDS database. If you could provide any insight into how to fix this, I would greatly appreciate it!

answered 3 years ago
