Cross Account Data Copy via Inline Policy and Lambda Function

I want to copy S3 data from Account A to Account B via a Lambda function. I have looked at various articles on the internet, but they only cover bucket policies. Here are my constraints:

  1. I can't use a bucket policy on the Account A S3 bucket. I can use an inline policy and an assumed role. I just need to COPY the data from Account A to Account B.
  2. I need to set up an event-based Lambda function in Account B that acts on any change in the Account A S3 bucket and replicates (COPIES) the same object to the Account B S3 bucket.
  3. The source bucket (Account A), the destination bucket (Account B), and the Lambda function are all in the same AWS Region.

I can't do any other operation in Account A apart from creating the IAM role and its permissions. I would like to know which roles and IAM permissions to create for both the Lambda function and S3, and whether there are any limitations for this use case. Thanks!

1 Answer
Accepted Answer

Hello.

If you don't need to process the files, you don't need to use Lambda at all.
For example, if you configure cross-account S3 replication, files are copied automatically from account A to account B's S3 bucket.
Is there any reason to use Lambda?

The requirements for S3 replication are described in the following documentation:
https://docs.aws.amazon.com/AmazonS3/latest/userguide/replication.html
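
If replication fits, note that the replication configuration is applied to the source bucket (account A) and that versioning must be enabled on both buckets. Below is a minimal sketch with boto3; the bucket names, account ID placeholders and the replication role are assumptions you would replace with your own:

import boto3

# Run with credentials for account A, because replication is configured on the source bucket.
s3 = boto3.client('s3')

s3.put_bucket_replication(
    Bucket='source-bucket-in-account-a',  # placeholder source bucket name
    ReplicationConfiguration={
        'Role': 'arn:aws:iam::account A ID:role/s3-replication-role',  # placeholder replication role
        'Rules': [
            {
                'ID': 'replicate-to-account-b',
                'Priority': 1,
                'Filter': {},
                'Status': 'Enabled',
                'Destination': {
                    'Bucket': 'arn:aws:s3:::destination-bucket-in-account-b',  # placeholder destination bucket
                    'Account': 'account B ID',
                    # hand ownership of replicated objects to the destination account
                    'AccessControlTranslation': {'Owner': 'Destination'},
                },
                'DeleteMarkerReplication': {'Status': 'Disabled'},
            },
        ],
    },
)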

If you absolutely need to access the data from Lambda, you need to set up an assumed role (sts:AssumeRole):

  1. Create an IAM role in account B that can access account B's S3 bucket (a sketch of this role's permission policy follows after step 2).
  2. Create an IAM role in account A and attach the following inline policy so that Lambda can assume the IAM role in account B:
{
    "Version": "2012-10-17",
    "Statement": {
        "Effect": "Allow",
        "Action": "sts:AssumeRole",
        "Resource": "arn:aws:iam::account B ID:role/IAM role name created in No.1"
    }
}
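
For step 1, the role in account B needs an identity policy that allows access to account B's bucket. Here is a minimal sketch of attaching such a policy with boto3; the bucket name, role name and the exact S3 actions are assumptions, so adjust them to what your copy actually needs:

import json
import boto3

iam = boto3.client('iam')  # run with credentials for account B

bucket = 'destination-bucket-in-account-b'  # placeholder bucket name
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": f"arn:aws:s3:::{bucket}"
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": f"arn:aws:s3:::{bucket}/*"
        }
    ]
}

# Attach the policy as an inline policy on the role created in step 1
iam.put_role_policy(
    RoleName='role-created-in-step-1',  # placeholder role name
    PolicyName='s3-access-for-cross-account-copy',
    PolicyDocument=json.dumps(policy),
)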
  3. Set the following trust policy on the IAM role created in account B (step 1):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::account A ID:role/IAM role name created in No.2"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}
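
If you prefer to apply this trust policy from code instead of the console, it can be set with boto3 (the role names here are placeholders):

import json
import boto3

iam = boto3.client('iam')  # run with credentials for account B

trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::account A ID:role/role-created-in-step-2"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}

# Replace the existing trust (assume-role) policy of the role created in step 1
iam.update_assume_role_policy(
    RoleName='role-created-in-step-1',  # placeholder role name
    PolicyDocument=json.dumps(trust_policy),
)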
  4. In the Lambda code, you can access the S3 bucket in account B by obtaining temporary credentials for the assumed role, as shown below.
import boto3

def lambda_handler(event, context):

    # Assume the IAM role created in step 1 (replace the ARN with that role's ARN in account B)
    sts_connection = boto3.client('sts')
    acct_b = sts_connection.assume_role(
        RoleArn="arn:aws:iam::222222222222:role/role-on-source-account",
        RoleSessionName="cross_acct_lambda"
    )

    ACCESS_KEY = acct_b['Credentials']['AccessKeyId']
    SECRET_KEY = acct_b['Credentials']['SecretAccessKey']
    SESSION_TOKEN = acct_b['Credentials']['SessionToken']

    # Create an S3 service resource using the temporary credentials of the assumed role
    s3 = boto3.resource('s3',
        aws_access_key_id=ACCESS_KEY,
        aws_secret_access_key=SECRET_KEY,
        aws_session_token=SESSION_TOKEN,
    )
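
The snippet above only creates the S3 resource; the object that triggered the event still has to be copied. Below is a minimal sketch of that last step, assuming the S3 event notification is delivered to this Lambda; the destination bucket name is a placeholder, and the credentials used must be allowed to read the source object and write to the destination:

from urllib.parse import unquote_plus

DEST_BUCKET = 'destination-bucket-in-account-b'  # placeholder destination bucket name

def copy_from_event(s3, event):
    """Copy every object referenced in the S3 event records to the destination bucket."""
    for record in event['Records']:
        source_bucket = record['s3']['bucket']['name']
        # Object keys arrive URL-encoded in S3 event notifications
        key = unquote_plus(record['s3']['object']['key'])
        # Server-side copy: requires s3:GetObject on the source and s3:PutObject on the destination
        s3.meta.client.copy(
            {'Bucket': source_bucket, 'Key': key},
            DEST_BUCKET,
            key,
        )

Calling copy_from_event(s3, event) at the end of lambda_handler above would complete the copy.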
answered 7 months ago
  • Thanks Riku. I want to crawl the dataset with an AWS Glue crawler using the data present in Account A and use the crawler to set up some ETL. The data is AWS billing data (Cost and Usage Report), which can be updated up to 2-3 times a day. That is why I wanted to set up a Lambda to copy the data from Account A to Account B, so that we have full control over the data in our environment and avoid egress. Do you have any other suggestion? Thanks!
