How can I copy files from one Amazon S3 bucket to another using a Lambda function?


I want to use an AWS Lambda function to copy files from an Amazon Simple Storage Service (Amazon S3) bucket to another bucket.


Follow these steps to create a Lambda function that copies files uploaded to a source Amazon S3 bucket to a destination S3 bucket.

Step 1: Create the source and destination Amazon S3 buckets

Note: If you already created the source and destination S3 buckets, you can skip this step.

1.    Open the Amazon S3 console, and choose Create bucket.

2.    For Bucket name, enter a name for the source bucket.

3.    Choose the AWS Region dropdown list, and choose your AWS Region.

4.    Choose Create bucket.

5.    Repeat steps 1-4 for your destination bucket.

For more information, see Creating a bucket.
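The console steps above can also be scripted with boto3. The following is a minimal sketch (the bucket names and functions are illustrative, not part of this article's procedure) that builds the create_bucket arguments, accounting for the us-east-1 special case where no LocationConstraint may be passed:

```python
def bucket_params(name, region):
    """Build the keyword arguments for the S3 create_bucket call.

    Outside us-east-1, create_bucket requires a CreateBucketConfiguration
    with a LocationConstraint; for us-east-1 it must be omitted.
    """
    params = {'Bucket': name}
    if region != 'us-east-1':
        params['CreateBucketConfiguration'] = {'LocationConstraint': region}
    return params

def create_buckets(source_bucket, destination_bucket, region):
    # boto3 is imported here so bucket_params stays usable without AWS access
    import boto3
    s3 = boto3.client('s3', region_name=region)
    s3.create_bucket(**bucket_params(source_bucket, region))
    s3.create_bucket(**bucket_params(destination_bucket, region))
```

For example, create_buckets('source-s3-bucket', 'destination-s3-bucket', 'eu-west-1') creates both buckets in eu-west-1 using the credentials of the caller.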

Step 2: Create the Lambda function

1.    Open the Functions page in the Lambda console.

2.    Choose Create function, and then choose Author from scratch.

3.    For Function name, enter a name for your function.

4.    Choose the Runtime dropdown list, and then choose Python 3.9.

5.    Expand Change default execution role, and then choose Create a new role with basic permissions.

6.    Choose Create function.

7.    Choose the Code tab, and paste the following Python code:

Note: You can get the source_bucket name from the event object received by the Lambda function. The destination_bucket name can be stored as an environment variable.

import boto3
import botocore
import json
import os
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

s3 = boto3.resource('s3')

def lambda_handler(event, context):
    logger.info("New files uploaded to the source bucket.")
    key = event['Records'][0]['s3']['object']['key']
    source_bucket = event['Records'][0]['s3']['bucket']['name']
    destination_bucket = os.environ['destination_bucket']
    source = {'Bucket': source_bucket, 'Key': key}
    try:
        s3.meta.client.copy(source, destination_bucket, key)
        logger.info("File copied to the destination bucket successfully!")
    except botocore.exceptions.ClientError as error:
        logger.error("There was an error copying the file to the destination bucket")
        print('Error Message: {}'.format(error))
    except botocore.exceptions.ParamValidationError as error:
        logger.error("Missing required parameters while calling the API.")
        print('Error Message: {}'.format(error))

8.    Choose Deploy.

Lambda creates an execution role that grants the function permission to upload logs to Amazon CloudWatch. For more information, see Create a Lambda function with the console.

Step 3: Create an Amazon S3 trigger for the Lambda function

1.    Open the Functions page in the Lambda console.

2.    In Functions, choose the Lambda function that you previously created.

3.    In Function overview, choose Add trigger.

4.    Choose the Trigger configuration dropdown list, and then choose S3.

5.    In Bucket, enter the name of your source bucket.

6.    Choose the Event type dropdown list, and then choose All object create events.

7.    Select the I acknowledge that using the same S3 bucket for both input and output is not recommended agreement, and then choose Add.

Note: Amazon S3 uses different event types for object uploads (POST, PUT, Multipart Upload). You can adjust the event types selected in step 6 to match your use case.

For more information, see Tutorial: Using an Amazon S3 trigger to invoke a Lambda function.
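When the trigger fires, Lambda passes the function an S3 event notification. The abbreviated sample below (bucket and key values are illustrative) shows the fields the handler in Step 2 reads, and a small helper that extracts them the same way:

```python
# Abbreviated S3 ObjectCreated event, trimmed to the fields the handler uses.
sample_event = {
    "Records": [
        {
            "eventSource": "aws:s3",
            "eventName": "ObjectCreated:Put",
            "s3": {
                "bucket": {"name": "source-s3-bucket"},
                "object": {"key": "example-file.txt"}
            }
        }
    ]
}

def extract_copy_args(event):
    """Pull the source bucket name and object key, as the handler does."""
    record = event['Records'][0]['s3']
    return record['bucket']['name'], record['object']['key']

print(extract_copy_args(sample_event))  # ('source-s3-bucket', 'example-file.txt')
```

A payload like this is also useful as a test event in the Lambda console to invoke the function without uploading a file.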

Step 4: Provide AWS Identity and Access Management (IAM) permissions for the Lambda function's execution role

You must add IAM permissions to the Lambda function's execution role so that it can read from the source S3 bucket and copy files to the destination S3 bucket. Use an identity-based policy similar to the following:


  • Replace destination-s3-bucket with your S3 destination bucket and source-s3-bucket with your S3 source bucket.
  • Replace /* at the end of the resource ARN with the required prefix value for your environment to limit permissions.
  • It's a best practice to grant least privilege for only the permissions required to perform a task. For more information, see Grant least privilege.
  "Version": "2012-10-17",
  "Statement": [
      "Sid": "putObject",
      "Effect": "Allow",
      "Action": [
      "Resource": [
      "Sid": "getObject",
      "Effect": "Allow",
      "Action": [
      "Resource": [

For more information, see Granting function access to AWS services.

Related information

How do I troubleshoot 403 Access Denied errors from Amazon S3?

How do I allow my Lambda execution role to access my Amazon S3 bucket?

AWS OFFICIAL - Updated 21 days ago