EKS Upgrade Failed from 1.13 to 1.14

We had a cluster created on version 1.12 and successfully upgraded it to 1.13, including the nodes.
It ran for two weeks, and today we decided to upgrade it to 1.14.
The cluster upgrade from 1.13 to 1.14 was triggered from the AWS EKS console. It sat in the 'updating' state for more than an hour before being marked as failed. We checked the errors section; it showed none.
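For anyone who wants to dig deeper than the console shows, the update status and any recorded failure reason can also be queried from the AWS CLI; a minimal sketch, with the cluster name and update ID as placeholders:

    # List the update operations recorded for the cluster
    aws eks list-updates --name my-cluster

    # Show the status and any error details for one update
    aws eks describe-update --name my-cluster --update-id <update-id>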

When I check the actual cluster version with the kubectl version command, it shows v1.14.9-eks-f459c0.
The AWS console still shows 1.13, and when I try to upgrade again, it fails. We have coredns, cni, and kube-proxy all at the expected versions listed in https://docs.aws.amazon.com/eks/latest/userguide/update-cluster.html
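For completeness, this is roughly how those versions can be verified; a minimal sketch, assuming the default kube-system objects that EKS ships (coredns, kube-proxy, aws-node):

    # Kubernetes version reported by the API server
    kubectl version --short

    # Image tags of the core add-ons
    kubectl describe deployment coredns -n kube-system | grep Image
    kubectl describe daemonset kube-proxy -n kube-system | grep Image
    kubectl describe daemonset aws-node -n kube-system | grep Image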

Any pointers would be very much appreciated, as this is a production environment.
Thanks,
Abhishek

Posted 4 years ago · 317 views
1 Answer

Well, we contacted AWS support. They debugged it and got back to us saying it was because the security-groups-per-ENI limit on our account was set to 1. They increased it to 5, and the upgrade then succeeded.
Neither party is sure why the limit was set to 1.
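In case it helps anyone hitting the same thing, the per-ENI security group limit can be checked without opening a support case; a minimal sketch using the EC2 account attributes API (the default for this attribute is normally 5):

    # Show the current security-groups-per-interface limit for the account
    aws ec2 describe-account-attributes \
        --attribute-names vpc-max-security-groups-per-interface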

answered 4 years ago
