
S3 actions across regions


I have two accounts with AWS. I want to copy a folder of objects from one S3 bucket to another. The source bucket is in Canada (ca-central-1) and the destination is in us-east-1. I have the profiles configured and confirmed working on their respective accounts. I'm getting a 403 Forbidden error when calling the HeadObject operation.

This is the code:

```python
"""s3 copy"""
import boto3
from botocore.exceptions import ClientError


def create_s3_client(aws_profile, region_name):
    # Build an S3 client from a named profile in the AWS CLI config.
    session = boto3.Session(profile_name=aws_profile, region_name=region_name)
    return session.client('s3')


def copy_files_between_buckets(source_s3_client, dest_s3_client,
                               source_bucket, dest_bucket, prefix=''):
    try:
        # List objects under the prefix with the source client,
        # then copy each one with the destination client.
        objects = source_s3_client.list_objects_v2(Bucket=source_bucket, Prefix=prefix)
        for obj in objects.get('Contents', []):
            copy_source = {'Bucket': source_bucket, 'Key': obj['Key']}
            file_key = obj['Key']
            print(f"Copying {file_key} to {dest_bucket}")
            dest_s3_client.copy(copy_source, dest_bucket, file_key)
    except ClientError as e:
        print(f"An error occurred: {e}")


def main():
    source_profile = 'dair'
    source_region = 'ca-central-1'
    source_bucket = 'dource-bucket'

    dest_profile = 'default'
    dest_region = 'us-east-1'
    dest_bucket = 'destin-bucket'

    source_s3_client = create_s3_client(source_profile, source_region)
    dest_s3_client = create_s3_client(dest_profile, dest_region)

    copy_files_between_buckets(source_s3_client, dest_s3_client,
                               source_bucket, dest_bucket, 'corephotos/')


if __name__ == "__main__":
    main()
```

Any thoughts would be greatly appreciated.

1 Answer
Accepted Answer

It's a permissions issue, but I think you already know that.

You've got the account with the source bucket (in Canada), and the account with the target bucket (in us-east-1).

Which account is your Python script being run by? That is, whose credentials does this bit resolve to: `boto3.Session(profile_name=aws_profile, region_name=region_name)`?
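If you're not sure, a quick sketch like this (assuming the two profile names from your question) will print the account and ARN each profile actually authenticates as:

```python
import boto3

# Ask STS who each profile resolves to. The profile names here are the
# ones from the question; substitute your own.
for profile in ('dair', 'default'):
    session = boto3.Session(profile_name=profile)
    identity = session.client('sts').get_caller_identity()
    print(profile, identity['Account'], identity['Arn'])
```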

If these are the credentials of a user in the account which contains the source bucket, then the bucket policy of the target bucket in the target account needs to explicitly grant sufficient privileges to this user in the source account.

(Or the other way around: if the credentials used in the script are from the account with the target bucket, then the source bucket policy needs to grant sufficient privileges.)
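Taking the first case as an example (script running as a user in the source account), the target bucket's policy needs a statement along these lines. This is only a sketch: the account ID and user name are placeholders, and the bucket name is taken from your question:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCrossAccountCopy",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111122223333:user/copy-user"
      },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::destin-bucket/*"
    }
  ]
}
```

Note that cross-account access needs an allow on both sides: the user's own IAM policy in the source account also has to permit s3:PutObject on that bucket.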

If KMS keys are being used then it's the same story there: the key policy has to explicitly grant access to the user.
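For example, if the source objects are encrypted with SSE-KMS, the source key's policy would need a statement like this (placeholder principal again) so the user can decrypt them during the copy. In a key policy, "Resource": "*" refers to the key itself:

```json
{
  "Sid": "AllowCopyUserToDecrypt",
  "Effect": "Allow",
  "Principal": {
    "AWS": "arn:aws:iam::111122223333:user/copy-user"
  },
  "Action": [
    "kms:Decrypt",
    "kms:DescribeKey"
  ],
  "Resource": "*"
}
```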

This link will really help you: https://docs.aws.amazon.com/AmazonS3/latest/userguide/troubleshoot-403-errors.html

  • Steve, thanks. I figured it was some permission thing, yes. What I was doing was using both profiles in a source/destination fashion and then chasing IAM permissions. I might be making this too hard. The code is on my Mac and the AWS CLI config has both profiles with proper, tested keys. So I think I'm hearing that I can simplify this by picking one profile and giving that IAM user the right permissions. The destination S3 bucket is associated with the default profile, so I can just give that user on that account the permissions to read/write the two buckets (see the policy sketch after these comments). Do I have this right? Thanks for the answer and the link.

  • One other point: I did a quick cp from my terminal with the default account and it wrote a file to the destination bucket, no problem. Armed with that, I'll dig into the documentation and get the right permissions to read the objects from the source. Thanks again. Fun times!
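A minimal identity policy for that plan might look like the sketch below (bucket names taken from the question). Because the source bucket lives in the other account, its bucket policy must also grant this user read access:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadSourceBucket",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::dource-bucket",
        "arn:aws:s3:::dource-bucket/*"
      ]
    },
    {
      "Sid": "WriteDestBucket",
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::destin-bucket/*"
    }
  ]
}
```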


