Overnight, I lost access to my pods and containers via k9s. As far as I know, no changes were made.
I can still see resources (nodes, pods, containers) listed in my namespace, but I can no longer fetch logs or shell into the containers.
In the EKS console, the nodes in this node group show an Unknown status.
I tried updating the Kubernetes version in the console and got the following error:
When I try to inspect the cluster's IAM role, EKSClusterRoleLatest fails to resolve:
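One way to confirm this mismatch from the CLI (the cluster name `my-cluster` below is a placeholder) is to read the role ARN the cluster references and then look that role up in IAM:

```shell
# Fetch the IAM role ARN the cluster references.
ROLE_ARN=$(aws eks describe-cluster --name my-cluster \
  --query 'cluster.roleArn' --output text)

# Strip the "arn:aws:iam::<account>:role/" prefix to get the bare role name.
ROLE_NAME=${ROLE_ARN##*/}

# Check whether that role still exists in IAM.
aws iam get-role --role-name "$ROLE_NAME" \
  || echo "Role $ROLE_NAME no longer exists"
```

If `get-role` returns `NoSuchEntity`, the cluster is pointing at a role that has been deleted.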
My user's permissions for kubectl commands appear to be fine:
```
➜ ~ kubectl auth can-i get pods/exec
yes
➜ ~ kubectl auth can-i create pods/exec
yes
```
However, some of the service accounts seem to be having problems.
Creating my own EKSClusterRoleLatest IAM role with the standard EKS cluster role permissions restored access. I believe this role is likely standard (populated by AWS) rather than a user-created role. I'm still unsure how the role was lost, or why my cluster was still pointed at it.
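For anyone hitting the same thing, here is a sketch of recreating the role with the AWS CLI. The role name matches mine (yours may differ), and AmazonEKSClusterPolicy is the AWS-managed policy normally attached to an EKS cluster role:

```shell
# Trust policy allowing the EKS service to assume the role.
cat > eks-trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "Service": "eks.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }]
}
EOF

# Recreate the missing role under the name the cluster still references.
aws iam create-role \
  --role-name EKSClusterRoleLatest \
  --assume-role-policy-document file://eks-trust-policy.json

# Attach the AWS-managed policy that grants the cluster control-plane permissions.
aws iam attach-role-policy \
  --role-name EKSClusterRoleLatest \
  --policy-arn arn:aws:iam::aws:policy/AmazonEKSClusterPolicy
```

Since the cluster's role ARN encodes only the role name, recreating a role with the exact same name is enough for the cluster to pick it back up.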