Pass a /tmp/<file> from a step function task to next task


I have a Step Functions state machine whose definition looks like this:

{
  "StartAt": "DownloadAndValidate",
  "States": {
    "DownloadAndValidate": {
      "Type": "Task",
      "Resource": "${data.aws_lambda_function.download_and_validate_lambda.arn}",
      "Next": "ExtractAndUpload",
      "Catch": [
        {
          "ErrorEquals": ["States.ALL"],
          "Next": "HandleError"
        }
      ]
    },
    "ExtractAndUpload": {
      "Type": "Task",
      "Resource": "${data.aws_lambda_function.extract_and_upload_lambda.arn}",
      "InputPath": "$",
      "Next": "UpdateDatabase",
      "Catch": [
        {
          "ErrorEquals": ["States.ALL"],
          "Next": "HandleError"
        }
      ]
    }
  }
}

In DownloadAndValidate, I download a .tar file from an S3 bucket to the /tmp folder and validate it. If everything looks good, I want to pass the file name to the next task, ExtractAndUpload, which extracts the tar file and continues processing. Currently I am getting an error that the file downloaded by the first task is not available in the next task. Is it possible to make a file downloaded from an S3 bucket to /tmp by one Lambda function available to the next Lambda function so it can process it?
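Roughly, the first function does something like the hypothetical sketch below (the handler name, input shape, and /tmp path are simplified for illustration):

```python
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # Hypothetical input shape; the real event carries the bucket/key somehow.
    bucket, key = event["bucket"], event["key"]
    local_path = f"/tmp/{key.split('/')[-1]}"
    s3.download_file(bucket, key, local_path)
    # ... validation of the downloaded .tar file happens here ...
    # Return the local file name, expecting the next task to open it.
    return {"file_name": local_path}
```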

Suresh
asked 13 days ago · 91 views
2 Answers

Hello there - I don't believe /tmp can be shared across different Lambdas:

There is also a local file system available at /tmp for all Lambda functions. This is local to each function but shared across invocations within the same execution environment. If your function must access large libraries or files, these can be downloaded here first and then used by all subsequent invocations. This mechanism provides a way to amortize the cost and time of downloading this data across multiple invocations.

Source: https://docs.aws.amazon.com/lambda/latest/operatorguide/execution-environment.html
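To illustrate that scope, here is a minimal hypothetical sketch: a file cached in /tmp is only visible to warm re-invocations of the same function, never to a different function in the workflow (the path and logic below are made up for illustration):

```python
import os

CACHE_PATH = "/tmp/archive.tar"  # hypothetical path, used only by THIS function

def handler(event, context):
    # On a warm invocation of the same function, the earlier download is still here.
    if os.path.exists(CACHE_PATH):
        return {"source": "reused from warm /tmp"}

    # On a cold start -- or in any other Lambda function -- /tmp starts empty,
    # so the file has to be fetched (e.g. from S3) again.
    with open(CACHE_PATH, "wb") as f:
        f.write(b"placeholder for the downloaded bytes")
    return {"source": "downloaded fresh"}
```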

David
answered 12 days ago
EXPERT
reviewed 12 days ago

It is not possible to pass data from /tmp in one function to the other directly. You have a few options:

  1. If your file is smaller than 256 KB, you can return its content from the function and pass it in the state machine payload.
  2. If it is larger, pass the name of the object in S3 and let the second function read the object again (see the sketch after this list).
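A minimal sketch of the second option, assuming the state machine passes a payload like {"bucket": ..., "key": ...} between the two states (the handler names, input shape, and /tmp paths are hypothetical):

```python
import tarfile

import boto3

s3 = boto3.client("s3")

# Hypothetical handler for DownloadAndValidate: download, validate, then
# return only the bucket/key (small JSON), not the file itself.
def download_and_validate(event, context):
    bucket, key = event["bucket"], event["key"]   # assumed input shape
    local_path = "/tmp/archive.tar"               # hypothetical local name
    s3.download_file(bucket, key, local_path)
    if not tarfile.is_tarfile(local_path):
        raise ValueError(f"{key} is not a valid tar archive")
    # The state machine passes this return value to the next state.
    return {"bucket": bucket, "key": key}

# Hypothetical handler for ExtractAndUpload: re-download using the key
# received in the payload, since /tmp from the previous function is gone.
def extract_and_upload(event, context):
    bucket, key = event["bucket"], event["key"]
    local_path = "/tmp/archive.tar"
    s3.download_file(bucket, key, local_path)
    with tarfile.open(local_path) as tar:
        tar.extractall(path="/tmp/extracted")
    # ... continue processing / uploading the extracted files ...
    return {"bucket": bucket, "key": key}
```

Because the return value of DownloadAndValidate becomes the input of ExtractAndUpload (your InputPath is "$"), the second function receives the bucket and key without the file itself ever having to travel through the state machine.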
AWS
EXPERT
Uri
answered 11 days ago
