Error o100.pyWriteDynamicFrame using Glue job

0

Hello, I'm new to Glue and created a job to extract the contents of a table in Oracle (on premises) to S3. The Oracle connection is successful, but when trying to write to S3 it says: An error occurred while calling o100.pyWriteDynamicFrame. bucket name

The role has full access to S3 and the bucket is public. I tried writing the bucket as s3://bucket_name/ or s3://bucket_name

It doesn't work. Can anyone shed some light on this? Thanks

Erick
asked 2 months ago · 98 views
4 Answers
1

The most likely cause is a misconfiguration of the S3 path, the IAM role permissions, or the S3 bucket policy. Double-check these settings to make sure your AWS Glue job has the necessary access to write to the S3 bucket.

Here are examples of how each of these should be configured for your AWS Glue job to write to an S3 bucket successfully:

S3 Path:

 ```python
 output_path = "s3://your-bucket-name/your-folder-name/"
 ```

IAM Role Permissions:

 ```json
 {
   "Version": "2012-10-17",
   "Statement": [
     {
       "Effect": "Allow",
       "Action": [
         "s3:PutObject",
         "s3:ListBucket"
       ],
       "Resource": [
         "arn:aws:s3:::your-bucket-name",
         "arn:aws:s3:::your-bucket-name/*"
       ]
     }
   ]
 }
 ```

S3 Bucket Policy:

 ```json
 {
   "Version": "2012-10-17",
   "Statement": [
     {
       "Effect": "Allow",
       "Principal": {
         "AWS": "arn:aws:iam::account-id:role/your-glue-role"
       },
       "Action": "s3:PutObject",
       "Resource": "arn:aws:s3:::your-bucket-name/*"
     }
   ]
 }
 ```
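
For reference, here is a minimal sketch of what the write step itself typically looks like in a Glue PySpark script. This is an illustration, not your actual job code: `dyf` stands in for the DynamicFrame read from the Oracle source, and the bucket and folder names are placeholders:

```python
from awsglue.context import GlueContext
from pyspark.context import SparkContext

# Standard Glue setup: wrap the Spark context in a GlueContext.
sc = SparkContext.getOrCreate()
glueContext = GlueContext(sc)

# `dyf` is assumed to be the DynamicFrame read from the Oracle source.
# Note the path points at a folder (prefix) inside the bucket, not the
# bucket root, and uses the s3:// scheme with the plain bucket name.
glueContext.write_dynamic_frame.from_options(
    frame=dyf,
    connection_type="s3",
    connection_options={"path": "s3://your-bucket-name/your-folder-name/"},
    format="parquet",
)
```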
EXPERT
answered 2 months ago
0

Hi, thanks for your update. I have my glue-role and it has the AmazonS3FullAccess policy, which is:

```json
{
  "Effect": "Allow",
  "Action": [
    "s3:*",
    "s3-object-lambda:*"
  ],
  "Resource": "*"
}
```

so basically it should have all access according to the above. The bucket is public and its policy is:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::007800227665:role/Glue-conexion-XXXX"
      },
      "Action": "*",
      "Resource": "arn:aws:s3:::demo-dwh/*"
    }
  ]
}
```

so I think it fulfils what you suggested, right?

Erick
answered 2 months ago
  • I did change the bucket and now get: Error Category: UNCLASSIFIED_ERROR; An error occurred while calling o108.pyWriteDynamicFrame. demo-data-raw-migracion-dwh.s3.amazonaws.com

0

Hi Erick,

Could you specify which function you're using to perform the write to S3? Are you using a Glue sink function, or are you calling a Spark function directly? Also, what transformation steps are you performing? It's possible the error you're seeing is not due to inadequate permissions; this error is also sometimes seen when converting Glue DynamicFrames into Spark DataFrames and vice versa.
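
For context, those conversions look like this in a Glue script. This is a hedged sketch: `dyf` and `glueContext` are assumed to already exist in the job:

```python
from awsglue.dynamicframe import DynamicFrame

# DynamicFrame -> Spark DataFrame, e.g. to apply Spark transformations.
df = dyf.toDF()

# Spark DataFrame -> DynamicFrame, required before writing with a Glue sink.
dyf_back = DynamicFrame.fromDF(df, glueContext, "dyf_back")
```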

AWS
EXPERT
answered a month ago
  • My job has two boxes: one is the Oracle data source and the other is the S3 data target; that is all. Perhaps, as you mentioned, something is missing about Spark and DynamicFrames; I don't really know.

0

I suspect the issue is that you are trying to write to the root of the bucket, which, even if possible, is not good practice. Try something like: s3://bucket_name/my_data/
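
In the sink call, that means pointing the path at a prefix rather than the bucket root. A minimal illustration (bucket and prefix names are placeholders):

```python
# Target a prefix (folder) inside the bucket, not the bucket root.
connection_options = {"path": "s3://bucket_name/my_data/"}
```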

AWS
EXPERT
answered a month ago
