Lambda Max Memory Used capped below available memory


A lambda of ours (which uses Puppeteer) has started crashing. The errors don't correlate with any code change, but they do correlate with a memory spike. A few of the errors indicate that there's no space left in the /tmp directory. The REPORT logs from the lambda show Max Memory Used far below Memory Size, but it appears capped at an arbitrary number: it always reports a Max Memory Used of 580 MB, even though it has 2048 MB available.

I increased the amount of Ephemeral Storage available to the lambda, which didn't help. Additionally, I print the size and contents of the /tmp directory using du whenever the lambda errors out, and the directory doesn't report being 100% full (often, it's only around 25% full). I've also redeployed the current version of the code to the lambda in an attempt to force a refresh, which didn't help either.

Has anyone seen this before, or have an idea of what could solve it? Short of destroying the lambda and recreating it, I'm not sure how to figure out what memory issues are happening.

Asked 1 month ago · 162 views
1 answer

First, memory and ephemeral storage are unrelated resources. The function probably doesn't use more than 580 MB simply because it doesn't need to.

With regard to storage, remember that the same execution environment may be reused between invocations. This means that if you store files in /tmp and do not delete them at the beginning or end of each invocation, /tmp may eventually fill up.

Uri (AWS Expert)
Answered 1 month ago
Reviewed by an AWS Expert 1 month ago
  • We do clean the /tmp directory, and as I mentioned, it is far from full.
