Hello there.
I'm currently using EKS v1.11.5 (control plane version only) and just noticed that one of my pods failed a week ago.
Kubernetes automatically created a new pod right away, so there was no downtime, but I still want to debug what really happened.
I tried to get logs from the failed pod, but it only returned:
"<deployment-name>" in pod "<pod-name>" is not available
Is there any way to track this problem down in more depth, e.g. Kubernetes audit logging from the control plane?
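From what I've read, EKS can ship control plane logs (including the audit log) to CloudWatch Logs if the platform version supports it. A sketch of enabling just the audit log with the AWS CLI, where <cluster-name> is a placeholder:

aws eks update-cluster-config \
  --name <cluster-name> \
  --logging '{"clusterLogging":[{"types":["audit"],"enabled":true}]}'

My understanding is that this only captures events from the moment it's enabled, so it won't explain last week's failure, but it would at least help the next time this happens.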
Thanks.
- There might be something left in the syslog on the worker nodes, but I've blocked SSH access to the nodes for security reasons, so that's not an option for now (a possible SSH-free workaround is sketched below).
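One SSH-free idea I'm considering: a short-lived pod pinned to the node that mounts the host's /var/log via hostPath. A minimal sketch, assuming the nodes run Amazon Linux 2 (where syslog ends up in /var/log/messages) and that <node-name> is the node that hosted the failed pod:

kubectl apply -f - <<EOF
apiVersion: v1
kind: Pod
metadata:
  name: node-log-reader
spec:
  nodeName: <node-name>   # pin the pod to the node in question
  containers:
  - name: reader
    image: busybox
    command: ["sleep", "3600"]
    volumeMounts:
    - name: varlog
      mountPath: /host/var/log
      readOnly: true
  volumes:
  - name: varlog
    hostPath:
      path: /var/log
EOF

# read the node's syslog without SSH, then clean up
kubectl exec node-log-reader -- tail -n 200 /host/var/log/messages
kubectl delete pod node-log-reader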