Error "o100.pyWriteDynamicFrame" when writing to S3 from a Glue job


Hello, I'm new to Glue and created a job to extract the contents of a table in Oracle (on premises) to S3. The Oracle connection is successful, but when trying to write to S3 it says: An error occurred while calling o100.pyWriteDynamicFrame. bucket name

The role has full access to S3 and the bucket is public. I tried writing the bucket as s3://bucket_name/ and as s3://bucket_name.

Neither works. Can anyone shed some light on this? Thanks

Erick
Asked 2 months ago · 112 views
4 Answers

The most likely cause is a misconfiguration in the S3 path, IAM role permissions, or S3 bucket policy. Double check these settings to make sure that your AWS Glue job has the necessary access to write to the S3 bucket.

Here are examples of how each of these should be configured for your AWS Glue job to write to an S3 bucket successfully:

S3 Path:

 ```python
 output_path = "s3://your-bucket-name/your-folder-name/"
 ```

IAM Role Permissions:

 ```json
 {
   "Version": "2012-10-17",
   "Statement": [
     {
       "Effect": "Allow",
       "Action": [
         "s3:PutObject",
         "s3:ListBucket"
       ],
       "Resource": [
         "arn:aws:s3:::your-bucket-name",
         "arn:aws:s3:::your-bucket-name/*"
       ]
     }
   ]
 }
 ```

S3 Bucket Policy:

 ```json
 {
   "Version": "2012-10-17",
   "Statement": [
     {
       "Effect": "Allow",
       "Principal": {
         "AWS": "arn:aws:iam::account-id:role/your-glue-role"
       },
       "Action": "s3:PutObject",
       "Resource": "arn:aws:s3:::your-bucket-name/*"
     }
   ]
 }
 ```
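For reference, here is a minimal sketch of how the S3 sink of a Glue job is typically wired up. The bucket and prefix names are placeholders, and the sink call itself only runs inside a Glue job (it needs a `GlueContext` and a DynamicFrame), so it is shown commented out; the option-building part is plain Python.

```python
# Illustrative sketch; bucket/prefix names are placeholders, not from the question.

def build_s3_sink_options(bucket: str, prefix: str) -> dict:
    """Build the connection_options dict for a Glue S3 sink.

    Strips stray slashes so the path always ends in exactly one '/'.
    """
    path = "s3://{}/{}/".format(bucket, prefix.strip("/"))
    return {"path": path, "partitionKeys": []}

options = build_s3_sink_options("your-bucket-name", "your-folder-name")
print(options["path"])  # s3://your-bucket-name/your-folder-name/

# Inside the actual Glue job, the write would look like:
# glueContext.write_dynamic_frame.from_options(
#     frame=dynamic_frame,
#     connection_type="s3",
#     connection_options=options,
#     format="parquet",
# )
```

Writing to a prefix (folder) rather than the bucket root also sidesteps the root-write issue mentioned in another answer below.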
Expert
Answered 2 months ago

Hi, thanks for your update. My Glue role has the AmazonS3FullAccess policy attached, which is:

 ```json
 {
   "Effect": "Allow",
   "Action": [
     "s3:*",
     "s3-object-lambda:*"
   ],
   "Resource": "*"
 }
 ```

So basically it should have all access according to the above. The bucket is public, with this bucket policy:

 ```json
 {
   "Version": "2012-10-17",
   "Statement": [
     {
       "Effect": "Allow",
       "Principal": {
         "AWS": "arn:aws:iam::007800227665:role/Glue-conexion-XXXX"
       },
       "Action": "*",
       "Resource": "arn:aws:s3:::demo-dwh/*"
     }
   ]
 }
 ```

so I think it fulfils what you suggested, right?

Erick
Answered 2 months ago
  • I changed the bucket and now get: Error Category: UNCLASSIFIED_ERROR; An error occurred while calling o108.pyWriteDynamicFrame. demo-data-raw-migracion-dwh.s3.amazonaws.com


Hi Erick,

Could you specify which function you're using to perform the write to S3? Are you using a Glue sink function, or are you calling a Spark function directly? Also, what transformation steps are you performing? It's possible the error you're seeing is not due to inadequate permissions; this error is also sometimes seen when converting Glue DynamicFrames into Spark DataFrames and vice versa.

AWS
Expert
Answered 2 months ago
  • My job has two boxes: one is the Oracle data source and the other is the S3 data target, that is all. Perhaps, as you mentioned, something is missing around Spark and DynamicFrames; I don't really know.


I suspect the issue is that you are trying to write at the root of the bucket, which, even if possible, is not good practice. Try something like: s3://bucket_name/my_data/
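Incidentally, the path shown in the question, s3://bucket_name/, would fail with a bucket-name error regardless of permissions: S3 bucket names may only contain lowercase letters, digits, hyphens, and dots, so an underscore is invalid. A minimal sketch of that check (an illustrative subset of the AWS naming rules, not the full set):

```python
import re

# Illustrative subset of S3 bucket-naming rules: 3-63 chars,
# lowercase letters, digits, hyphens, and dots; no underscores.
_BUCKET_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def looks_like_valid_bucket(name: str) -> bool:
    return bool(_BUCKET_RE.match(name))

print(looks_like_valid_bucket("bucket_name"))  # False: underscore not allowed
print(looks_like_valid_bucket("demo-dwh"))     # True
```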

AWS
Expert
Answered 2 months ago
