How to debug pod failure on EKS?


Hello there.

I'm currently using EKS v1.11.5 (control plane only) and just noticed that one of my pods failed a week ago.

Kubernetes automatically created a new pod right away, so there was no downtime, but I still want to debug what really happened.

I tried to get logs from the failed pod, but it only returned:

container "<deployment-name>" in pod "<pod-name>" is not available

Is there any way to dig into this problem more deeply, such as Kubernetes audit logging from the control plane?

Thanks.

  • Maybe something was left in the syslog of the worker nodes, but I have blocked SSH connections to the nodes for security reasons, so that is not possible for now.
kycfeel
Asked 6 years ago · 613 views
1 Answer

As long as you have not removed the pod, you could run:

kubectl describe pod ....

It should indicate why the pod failed; for example, it could be due to an OOM kill or some other resource issue.
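
If the pod object is still around, a few more commands along these lines may help (pod name, namespace, cluster name, and region below are placeholders). Note that since the failure was a week ago, the related events have most likely expired already, as they are only retained for about an hour by default:

# Events, container state, and the reason/exit code of the last termination
kubectl describe pod <pod-name> -n <namespace>

# Logs of the previous (crashed) container instance, if one is still recorded
kubectl logs <pod-name> -n <namespace> --previous

# Recent events in the namespace, sorted by time
kubectl get events -n <namespace> --sort-by=.metadata.creationTimestamp

As for the control-plane audit logging mentioned in the question: EKS can ship its control plane logs, including the audit log, to CloudWatch Logs. One way to enable that is with the AWS CLI, for example:

aws eks update-cluster-config --name <cluster-name> --region <region> \
    --logging '{"clusterLogging":[{"types":["audit"],"enabled":true}]}'

The entries then appear in the CloudWatch Logs group /aws/eks/<cluster-name>/cluster, where you can search for the pod in question.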

Answered 6 years ago
