Lambda: save to S3 successful but no files created


I'm running a Python Lambda function which, when run locally, successfully saves a file to an S3 bucket. However, when run in Lambda, there is no error output but no file is created in the target bucket.

Has anyone experienced this before?

Below is the output log:

2024-04-14 21:32:57 [botocore.httpsession] DEBUG: Certificate path: /var/lang/lib/python3.11/site-packages/certifi/cacert.pem
2024-04-14 21:32:57 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): my-lake.s3.eu-west-2.amazonaws.com:443
2024-04-14 21:32:57 [s3transfer.futures] DEBUG: Submitting task UploadPartTask(transfer_id=0, {'bucket': 'my-lake', 'key': 'my_bucket/my_json_file', 'part_number': 1, 'extra_args': {}}) to executor <s3transfer.futures.BoundedExecutor object at 0x7f4bac3f40d0> for transfer request: 0.
2024-04-14 21:32:57 [s3transfer.utils] DEBUG: Acquiring 0
2024-04-14 21:32:57 [s3transfer.futures] DEBUG: Submitting task UploadPartTask(transfer_id=0, {'bucket': 'my-lake', 'key': 'my_bucket/latest/my_json_file.json', 'part_number': 1, 'extra_args': {}}) to executor <s3transfer.futures.BoundedExecutor object at 0x7f4bac1e6ad0> for transfer request: 0.
2024-04-14 21:32:57 [s3transfer.utils] DEBUG: Acquiring 0
2024-04-14 21:32:57 [s3transfer.tasks] DEBUG: UploadPartTask(transfer_id=0, {'bucket': 'my-lake', 'key': 'my_bucket/my_json_file', 'part_number': 1, 'extra_args': {}}) about to wait for the following futures [<s3transfer.futures.ExecutorFuture object at 0x7f4bac1ca090>]
2024-04-14 21:32:57 [s3transfer.tasks] DEBUG: UploadPartTask(transfer_id=0, {'bucket': 'my-lake', 'key': 'my_bucket/my_json_file', 'part_number': 1, 'extra_args': {}}) about to wait for <s3transfer.futures.ExecutorFuture object at 0x7f4bac1ca090>
2024-04-14 21:32:57 [s3transfer.tasks] DEBUG: UploadPartTask(transfer_id=0, {'bucket': 'my-lake', 'key': 'my_bucket/latest/my_json_file.json', 'part_number': 1, 'extra_args': {}}) about to wait for the following futures [<s3transfer.futures.ExecutorFuture object at 0x7f4bac1c87d0>]
2024-04-14 21:32:57 [s3transfer.tasks] DEBUG: UploadPartTask(transfer_id=0, {'bucket': 'my-lake', 'key': 'my_bucket/latest/my_json_file.json', 'part_number': 1, 'extra_args': {}}) about to wait for <s3transfer.futures.ExecutorFuture object at 0x7f4bac1c87d0>
2024-04-14 21:32:58 [s3transfer.futures] DEBUG: Submitting task UploadPartTask(transfer_id=0, {'bucket': 'my-lake', 'key': 'my_bucket/my_json_file', 'part_number': 2, 'extra_args': {}}) to executor <s3transfer.futures.BoundedExecutor object at 0x7f4bac3f40d0> for transfer request: 0.
2024-04-14 21:32:58 [s3transfer.utils] DEBUG: Acquiring 0
2024-04-14 21:32:58 [s3transfer.tasks] DEBUG: UploadPartTask(transfer_id=0, {'bucket': 'my-lake', 'key': 'my_bucket/my_json_file', 'part_number': 2, 'extra_args': {}}) about to wait for the following futures [<s3transfer.futures.ExecutorFuture object at 0x7f4bac1ca090>]
2024-04-14 21:32:58 [s3transfer.tasks] DEBUG: UploadPartTask(transfer_id=0, {'bucket': 'my-lake', 'key': 'my_bucket/my_json_file', 'part_number': 2, 'extra_args': {}}) about to wait for <s3transfer.futures.ExecutorFuture object at 0x7f4bac1ca090>
2024-04-14 21:32:58 [s3transfer.futures] DEBUG: Submitting task UploadPartTask(transfer_id=0, {'bucket': 'my-lake', 'key': 'my_bucket/latest/my_json_file.json', 'part_number': 2, 'extra_args': {}}) to executor <s3transfer.futures.BoundedExecutor object at 0x7f4bac1e6ad0> for transfer request: 0.
2024-04-14 21:32:58 [s3transfer.utils] DEBUG: Acquiring 0
2024-04-14 21:32:59 [s3transfer.futures] DEBUG: Submitting task UploadPartTask(transfer_id=0, {'bucket': 'my-lake', 'key': 'my_bucket/my_json_file', 'part_number': 3, 'extra_args': {}}) to executor <s3transfer.futures.BoundedExecutor object at 0x7f4bac3f40d0> for transfer request: 0.
END RequestId: 0330e488-5847-43fa-b604-9a0634e7deca
REPORT RequestId: 0330e488-5847-43fa-b604-9a0634e7deca	Duration: 485178.65 ms	Billed Duration: 485298 ms	Memory Size: 128 MB	Max Memory Used: 129 MB	Init Duration: 118.59 ms
  • Hi Horus, to assist you better, it would be helpful to understand the permissions attached to the function as well as the code that you use to upload to S3.

Horus
asked 13 days ago · 241 views
1 Answer

There are a few reasons why your Python Lambda function might save a file to S3 successfully when run locally but not when deployed (a minimal sketch illustrating these points follows the list):

  1. IAM Permissions: Lambda functions run under a specific IAM role (the execution role). This role needs permission to write objects to the S3 bucket. Ensure your Lambda function's role has at least s3:PutObject on the target object keys within the bucket.

Optionally, you might also need s3:GetObject permission if your code reads the file before uploading.

  2. Environment Variables: If your code uses environment variables to store S3 bucket details (e.g., the bucket name), make sure these environment variables are set correctly in your Lambda function configuration when deployed. Locally, you might have them set differently.

  3. AWS Credentials: Locally, you might have AWS credentials configured (e.g., through the ~/.aws/credentials file). Lambda functions don't use these by default; they rely on the execution role for authorization. Double-check that your code isn't accidentally relying on local credentials.
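
For reference, here is a minimal sketch of an upload that relies only on the execution role and an environment variable. BUCKET_NAME is a hypothetical variable name and the object key is copied from your log; adapt both to your setup.

import json
import os

import boto3

# In Lambda, boto3 picks up credentials from the execution role automatically;
# nothing is read from ~/.aws/credentials.
s3 = boto3.client("s3")

# Hypothetical environment variable set in the function configuration.
BUCKET = os.environ["BUCKET_NAME"]


def lambda_handler(event, context):
    payload = json.dumps({"example": "data"}).encode("utf-8")

    # put_object is a single request, so a success response means the object
    # now exists in the bucket.
    s3.put_object(
        Bucket=BUCKET,
        Key="my_bucket/latest/my_json_file.json",
        Body=payload,
    )

    # Optional sanity check: head_object raises a ClientError if the key is
    # missing (this call needs read permission such as s3:GetObject).
    s3.head_object(Bucket=BUCKET, Key="my_bucket/latest/my_json_file.json")

    return {"statusCode": 200}

If your real code uses upload_file or upload_fileobj (which the s3transfer entries in your log suggest), the same execution-role and environment-variable considerations apply.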

akad (AWS)
answered 13 days ago
EXPERT reviewed 13 days ago
