EKS Upgrade Failed from 1.13 to 1.14

We had a cluster created on version 1.12 and managed to upgrade it to 1.13 successfully, worker nodes included.
It ran fine for two weeks, and today we decided to upgrade it to 1.14.
The upgrade from 1.13 to 1.14 was triggered from the AWS EKS console. It sat in the 'updating' state for more than an hour before being marked as failed. We checked the errors section; it showed none.
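For reference, the update status and any failure details can also be pulled with the AWS CLI (a quick sketch; my-cluster is a placeholder for the actual cluster name):

# List recent update IDs for the cluster
aws eks list-updates --name my-cluster

# Show the status, parameters, and any recorded errors for one update
aws eks describe-update --name my-cluster --update-id <update-id>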

When I check the actual cluster version with the kubectl version command, the server reports v1.14.9-eks-f459c0.
The AWS console, however, still shows 1.13, and when I retry the upgrade, it fails again. We have CoreDNS, kube-proxy, and the Amazon VPC CNI plugin all at the expected versions listed in https://docs.aws.amazon.com/eks/latest/userguide/update-cluster.html
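In case it helps, this is roughly how we verified the component versions (standard kubectl commands against the default kube-system workloads that EKS ships):

# Server version reported by the API (this is what shows v1.14.9-eks-f459c0)
kubectl version --short

# CoreDNS image tag
kubectl describe deployment coredns -n kube-system | grep Image

# kube-proxy image tag
kubectl describe daemonset kube-proxy -n kube-system | grep Image

# Amazon VPC CNI plugin image tag (the aws-node daemonset)
kubectl describe daemonset aws-node -n kube-system | grep Image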

Any pointers would be very much appreciated, as this is a production environment.
Thanks,
Abhishek

asked 4 years ago · 316 views
1 Answer

We contacted AWS support. They debugged it and got back to us saying the cause was that the security groups per ENI limit on our account was set to 1. They increased it to 5, and the upgrade then succeeded.
Neither party is sure why the limit had been set to 1.
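For anyone else hitting this: the limit can be inspected (and an increase requested) through Service Quotas instead of a support case. A sketch, assuming the quota lives under the VPC service; look up the actual quota code from the first command's output rather than hard-coding it:

# Find the 'security groups per network interface' quota under the VPC service
aws service-quotas list-service-quotas --service-code vpc \
    --query "Quotas[?contains(QuotaName, 'network interface')]"

# Request an increase using the QuotaCode from the output above
aws service-quotas request-service-quota-increase --service-code vpc \
    --quota-code <quota-code> --desired-value 5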

answered 4 years ago
