Unable to connect to role-based EKS cluster w/ kubectl


I'm starting to set up brand new infrastructure, and I'm unable to use kubectl to connect to a new EKS cluster.

As my root user I created an EKS cluster and a node group, each with its own role, as described here and here.
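Roughly, the two create calls looked like this (cluster, node group, and role names and the subnet IDs below are placeholders for illustration):

    # Create the cluster with its own service role
    aws eks create-cluster \
      --name my-cluster \
      --role-arn arn:aws:iam::111122223333:role/eksClusterRole \
      --resources-vpc-config subnetIds=subnet-aaaa,subnet-bbbb

    # Create the node group with its own node role
    aws eks create-nodegroup \
      --cluster-name my-cluster \
      --nodegroup-name my-nodes \
      --node-role arn:aws:iam::111122223333:role/eksNodeRole \
      --subnets subnet-aaaa subnet-bbbb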

The EKS cluster's aws-auth ConfigMap specifies the role as expected:

[Screenshot: aws-auth ConfigMap showing the expected role ARN]
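For anyone who can't view the screenshot, this is how I'm inspecting the ConfigMap; the entry shape below is only a sketch with placeholder ARNs:

    # Dump the aws-auth ConfigMap to see which role ARNs are mapped
    kubectl -n kube-system get configmap aws-auth -o yaml

    # mapRoles entries generally look like this:
    # mapRoles: |
    #   - rolearn: arn:aws:iam::111122223333:role/eksNodeRole
    #     username: system:node:{{EC2PrivateDNSName}}
    #     groups:
    #       - system:bootstrappers
    #       - system:nodes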

I've updated the trusted entities for the role:

[Screenshot: the role's trust relationships, with the user added as a trusted entity]
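The trust policy itself is essentially this (account ID, user, and role names are placeholders):

    # trust.json - let the IAM user assume the role
    cat > trust.json <<'EOF'
    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Principal": { "AWS": "arn:aws:iam::111122223333:user/my-user" },
          "Action": "sts:AssumeRole"
        }
      ]
    }
    EOF

    # Attach it to the role
    aws iam update-assume-role-policy \
      --role-name eks-access-role \
      --policy-document file://trust.json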

And I've updated the user's policies to allow assuming the role:

[Screenshot: the user's IAM policy granting sts:AssumeRole on the role]
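The user-side grant is the usual inline AssumeRole policy, roughly (ARNs and names are again placeholders):

    # policy.json - allow the user to assume the EKS access role
    cat > policy.json <<'EOF'
    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": "sts:AssumeRole",
          "Resource": "arn:aws:iam::111122223333:role/eks-access-role"
        }
      ]
    }
    EOF

    aws iam put-user-policy \
      --user-name my-user \
      --policy-name assume-eks-access-role \
      --policy-document file://policy.json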

I have the AWS CLI installed and I've created an access key for the user; however, when I update the kubeconfig I get a nondescript Unauthorized error:

[Screenshot: aws eks update-kubeconfig followed by kubectl returning Unauthorized]
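Concretely, the failing sequence is along these lines (region and names are placeholders):

    # Write a kubeconfig entry that authenticates through the role
    aws eks update-kubeconfig \
      --region us-east-1 \
      --name my-cluster \
      --role-arn arn:aws:iam::111122223333:role/eks-access-role

    # Any request then fails
    kubectl get pods
    # error: You must be logged in to the server (Unauthorized)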

Based on the debugging instructions here, it looks like the EKS cluster above already has a rolearn that should match the role I've created. Is it required that I also add users explicitly to the ConfigMap if they weren't the user that created the cluster?

1 Answer

The error may be related to the path of the kubeconfig, so could you please check the kubeconfig path first? If the path is correct, then this could be because the cluster was created with credentials for one IAM principal while kubectl is using credentials for a different IAM principal. For more information, refer to: Unauthorized or access denied (kubectl).
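To narrow it down, you can check which IAM principal kubectl is actually presenting (the cluster name, role ARN, and region below are placeholders):

    # Which principal do your current CLI credentials resolve to?
    aws sts get-caller-identity

    # Can that principal mint a token as the role the kubeconfig names?
    aws eks get-token \
      --cluster-name my-cluster \
      --role-arn arn:aws:iam::111122223333:role/eks-access-role

    # Verbose output shows which kubeconfig and credentials kubectl uses
    kubectl get pods -v=6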

  • Hi, it's not the kubeconfig path - I've used kubectl with other providers for a while now w/ no issues.

    The troubleshooting page that you linked recommends creating a kubeconfig using a role. I attempted this (my last screenshot) and it still didn't authenticate. I've fixed it in the short term by adding my low-priv user to the cluster directly with eksctl and a userarn mapUsers entry, but my question is whether I can authenticate directly with the role, as the docs imply.
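    For anyone else who hits this, the short-term eksctl workaround was roughly the following (cluster, region, and user names are placeholders, and the group you map should match the access the user actually needs):

      # Add the low-priv IAM user to aws-auth as a mapUsers entry
      eksctl create iamidentitymapping \
        --cluster my-cluster \
        --region us-east-1 \
        --arn arn:aws:iam::111122223333:user/my-user \
        --username my-user \
        --group system:masters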
