How can I troubleshoot Lambda function cold start issues?


My AWS Lambda function is experiencing high latency because of long cold start durations.

Resolution

The initial setup that Lambda performs to prepare the execution environment and load your function code is referred to as a cold start, and the time that it takes is the startup latency. To minimize the cold start time and latency of your Lambda function, follow these instructions for your use case.

Lambda function code and configuration best practices

  • Increase the memory allocated to the Lambda function. The cold start range depends on the size of your function, the amount of memory that you allocated, and the complexity of your code. Adding more memory proportionally increases the amount of CPU, increasing the overall computational power available. For more information, see Memory and computing power.
  • Minimize the size of your deployment package. The smaller your deployment package, the faster your function starts up. Minimize the number of dependencies and external libraries that your function imports, and keep your deployment package size under 50 MB. For more information, see Lambda deployment packages.
  • Optimize your Lambda function code to minimize the time that it takes to initialize. Reduce the number of dependencies and external libraries that your function imports. Reduce the amount of code run during initialization.
  • Avoid complex computation at startup. If your function requires complex computation at startup, such as loading large datasets, you can perform this in the background. Run the computation in a background thread during the Init phase. Then, cache the results for subsequent invocations. Caching the results helps reduce the time required for complex computation at startup.
  • Reuse Amazon Relational Database Service (Amazon RDS) database connections. If your function connects to an Amazon RDS database, you can create an Amazon RDS Proxy database proxy for your function. A database proxy manages a pool of database connections. Reusing database connections reduces the time needed to establish a connection each time your function is invoked.
  • Configure provisioned concurrency. Functions that use provisioned concurrency don't show cold start behavior because the execution environment is prepared before invocation. You can specify the number of function instances to keep initialized and ready to respond to traffic.
  • Minimize the complexity of your dependencies. Use simple frameworks that load quickly on execution environment startup.
  • To reduce the time that it takes Lambda to unpack deployment packages authored in Java, put your dependency .jar files in a separate /lib directory. Separating the .jar files is faster than putting all your function’s code in a single jar with a large number of .class files. For more information, see Deploy Java Lambda functions with .zip or JAR file archives.
  • Use monitoring to discover issues and observability to diagnose their cause. Monitor the cold start performance of your functions, and use Lambda Insights in Amazon CloudWatch to troubleshoot performance issues.

For more information, see Best practices for working with Lambda functions.

Long INIT duration in provisioned concurrency

In on-demand Lambda functions, the static initializer runs after a request is received but before the handler is invoked. This results in latency for the requester and contributes to the overall cold start duration. With provisioned concurrency, this initialization runs before invocation. If you still see cold starts with provisioned concurrency, check the following:

  • Determine whether your function concurrency exceeds the configured level of provisioned concurrency. The ProvisionedConcurrencySpilloverInvocations CloudWatch metric shows the number of invocations that ran on standard (on-demand) concurrency instead. A non-zero value indicates that all provisioned concurrency was in use and some invocations incurred a cold start.
  • Check your invocation frequency (requests per second). Functions with provisioned concurrency have a maximum rate of 10 requests per second per unit of provisioned concurrency (Lambda API requests). For example, a function configured with a provisioned concurrency of 100 can handle 1,000 requests per second. If the invoke rate exceeds 1,000 requests per second, some cold starts might occur. For more information, see Lambda: Cold starts with provisioned concurrency.
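The capacity arithmetic in the second bullet can be captured in a small helper, shown here as a sketch (the 10 requests per second per unit figure comes from the limit described above; the function names are illustrative, not an AWS API):

```python
def max_requests_per_second(provisioned_concurrency, per_unit_rps=10):
    # Lambda handles about 10 requests per second per unit of
    # provisioned concurrency, per the limit described above.
    return provisioned_concurrency * per_unit_rps

def expect_cold_starts(invoke_rps, provisioned_concurrency):
    # Cold starts become likely once the invoke rate exceeds what the
    # provisioned capacity can absorb (spillover to on-demand).
    return invoke_rps > max_requests_per_second(provisioned_concurrency)
```

For example, a provisioned concurrency of 100 supports up to 1,000 requests per second, so a sustained 1,200 requests per second would spill over and produce some cold starts.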

For more information, see How do I troubleshoot Lambda function provisioned concurrency issues?

Amazon API Gateway requests integrated with Lambda

If you're using Lambda with API Gateway and you see a high IntegrationLatency metric, review your Lambda function's CloudWatch Logs. High latency occurs when an API endpoint that's integrated with a Lambda function takes too long to send responses to a client. Cold start time in Lambda functions isn't recorded in the function's duration metric, so your API's integration latency might be longer than the function's duration. To see the duration of your function including a cold start, use AWS X-Ray.
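One way to spot the cold start time mentioned above is the REPORT line that Lambda writes to CloudWatch Logs: on cold starts it includes an Init Duration field that is absent on warm invocations. The sketch below parses that field (the request ID and timing values in the sample line are made up; the REPORT field names are Lambda's):

```python
import re

# Sample cold-start REPORT line with made-up values; field names match
# what Lambda emits to CloudWatch Logs.
SAMPLE_REPORT = (
    "REPORT RequestId: 11111111-2222-3333-4444-555555555555 "
    "Duration: 102.53 ms Billed Duration: 103 ms "
    "Memory Size: 512 MB Max Memory Used: 88 MB "
    "Init Duration: 913.71 ms"
)

def parse_init_duration(report_line):
    # Return the Init Duration in milliseconds, or None for a warm
    # invocation (no Init Duration field in the REPORT line).
    match = re.search(r"Init Duration: ([\d.]+) ms", report_line)
    return float(match.group(1)) if match else None
```

A warm invocation's REPORT line has no Init Duration field, so comparing the two makes cold starts easy to count when scanning logs.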

For more information, see How do I troubleshoot high latency in my API Gateway requests that are integrated with Lambda?

Related information

How do I reduce initialization and invocation duration latency for my Java Lambda function?

Improving startup performance with Lambda SnapStart for Java 11 Runtime

Operating Lambda: Performance optimization – Part 1
