Setting ACLs on S3 objects written by an AWS Glue job


I came across the following issue:

I run a Glue job from account A to write into an S3 bucket owned by account B. Because the writer is account A, the objects end up owned by account A, and I can't do anything with them from account B.

Is there a way to tell the Glue job to apply an ACL that grants the bucket owner full control?

AWS
EXPERT
Tasio
asked 5 years ago · 922 views
1 Answer

Accepted Answer

You can use Hadoop Java configuration properties in a Spark job to set the S3 canned ACL. I haven't tested this with Glue DynamicFrames, but it works for native Spark DataFrames.

from pyspark.context import SparkContext

sc = SparkContext()

# Set the canned ACL on the Hadoop configuration before any data is
# written: every object the s3:// connector creates will then grant
# the bucket owner (account B) full control.
sc._jsc.hadoopConfiguration().set("fs.s3.canned.acl", "BucketOwnerFullControl")
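
As a rough sketch of how this fits into a full job script (the bucket path and sample data below are placeholders, not from the original post), the property has to be set before any write runs:

from pyspark.context import SparkContext
from awsglue.context import GlueContext

sc = SparkContext()
# Must be set before any data is written to S3.
sc._jsc.hadoopConfiguration().set("fs.s3.canned.acl", "BucketOwnerFullControl")

glueContext = GlueContext(sc)
spark = glueContext.spark_session

# Placeholder data; in a real job this would come from your source.
df = spark.createDataFrame([("a", 1), ("b", 2)], ["key", "value"])

# Objects written by account A should now carry bucket-owner-full-control,
# so account B (the bucket owner) can read and manage them.
df.write.mode("overwrite").parquet("s3://account-b-bucket/output/")

From account B you can then verify the grant on a written object with aws s3api get-object-acl.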
AWS
answered 5 years ago
