Unable to connect to role-based EKS cluster w/ kubectl


I'm starting to set up brand new infrastructure, and I'm unable to use kubectl to connect to a new EKS cluster.

As my root user, I created an EKS cluster and a node group, each with its own role, as described here and here.
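The equivalent CLI calls would be something like the following; this is a minimal sketch, and the cluster name, subnets, and role ARNs are placeholders rather than my actual values:

```
# Create the cluster with its dedicated cluster role (placeholder values).
aws eks create-cluster \
  --name my-cluster \
  --role-arn arn:aws:iam::111122223333:role/eks-cluster-role \
  --resources-vpc-config subnetIds=subnet-0aaa,subnet-0bbb

# Create the node group with its own node role (placeholder values).
aws eks create-nodegroup \
  --cluster-name my-cluster \
  --nodegroup-name my-nodes \
  --node-role arn:aws:iam::111122223333:role/eks-node-role \
  --subnets subnet-0aaa subnet-0bbb
```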

The cluster's aws-auth ConfigMap specifies the role as expected: [screenshot: the mapped rolearn in the aws-auth ConfigMap]

I've updated the trusted entities for the role:

[screenshot: the role's updated trusted entities]
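For concreteness, the trust policy ends up looking roughly like this; the account ID, user name, and role name below are placeholders, not my real values:

```
# Hypothetical trust policy letting the IAM user assume the admin role.
cat > trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111122223333:user/low-priv-user" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

# Apply it to the role (role name is a placeholder).
aws iam update-assume-role-policy \
  --role-name eks-admin-role \
  --policy-document file://trust-policy.json
```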

And I've updated the user's policies to assume the role:

[screenshot: the user's policy permitting it to assume the role]
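Roughly, that policy is just an sts:AssumeRole grant on the role's ARN; again, the names and account ID below are placeholders:

```
# Hypothetical inline policy on the IAM user allowing it to assume the role.
cat > assume-role-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::111122223333:role/eks-admin-role"
    }
  ]
}
EOF

# Attach it to the user (user and policy names are placeholders).
aws iam put-user-policy \
  --user-name low-priv-user \
  --policy-name allow-assume-eks-admin-role \
  --policy-document file://assume-role-policy.json
```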

I have the AWS CLI installed and have created an access key for the user; however, when I update the kubeconfig I get a nondescript unauthorized error:

[screenshot: updating the kubeconfig with the role and the resulting unauthorized error]
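In terms of CLI commands, the flow is roughly the following (a minimal sketch; the region, cluster name, and role ARN are placeholders):

```
# Confirm which principal the CLI is using before generating the kubeconfig.
aws sts get-caller-identity

# Generate a kubeconfig whose exec credentials assume the mapped role.
aws eks update-kubeconfig \
  --region us-east-1 \
  --name my-cluster \
  --role-arn arn:aws:iam::111122223333:role/eks-admin-role

# First call against the cluster, which is where the error surfaces.
kubectl get nodes
```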

Based on the debugging instructions here, it looks like the EKS cluster above already has a rolearn entry that should match the role I've created. Is it also required to add users explicitly to the ConfigMap if they weren't the user that created the cluster?
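For reference, this is a minimal sketch of how the current mappings can be inspected, assuming access with the cluster creator's credentials (cluster name and region are placeholders):

```
# View the raw aws-auth ConfigMap (requires credentials that already have
# access, e.g. the IAM principal that created the cluster).
kubectl -n kube-system get configmap aws-auth -o yaml

# Or list the identity mappings with eksctl.
eksctl get iamidentitymapping --cluster my-cluster --region us-east-1
```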

1 Answer

The error may be related to the path of the kubeconfig, so please check that first. If the path is correct, this could be because the cluster was created with credentials for one IAM principal while kubectl is using credentials for a different IAM principal. For more information, see Unauthorized or access denied (kubectl).
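As a quick check, you can compare the identity the AWS CLI resolves to with the identity your kubeconfig will use; for example (a minimal sketch, assuming the default context):

```
# Which IAM principal the AWS CLI is currently authenticating as.
aws sts get-caller-identity

# Inspect the active context, including its exec section, to see which
# profile or role the embedded "aws eks get-token" call will use.
kubectl config view --minify
```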

Answered a year ago
  • Hi, it's not the kubeconfig path - I've used kubectl with other providers for a while now with no issues.

    The troubleshooting page that you linked recommends creating a kubeconfig using a role. I attempted this (my last screenshot) and it still didn't authenticate. I've fixed it in the short term by adding my low-priv user to the cluster directly with eksctl and a userarn mapUsers entry, but my question is: can I authenticate directly with the role, as the docs imply?
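For reference, the short-term workaround and the role-based mapping I'd prefer look roughly like this; the cluster name, region, ARNs, usernames, and the system:masters group are placeholders/examples, not my real values:

```
# Short-term workaround: map the IAM user directly into aws-auth (mapUsers).
eksctl create iamidentitymapping \
  --cluster my-cluster \
  --region us-east-1 \
  --arn arn:aws:iam::111122223333:user/low-priv-user \
  --username low-priv-user \
  --group system:masters

# Role-based alternative: map the role once (mapRoles), then have users
# assume it via aws eks update-kubeconfig --role-arn as in the question.
eksctl create iamidentitymapping \
  --cluster my-cluster \
  --region us-east-1 \
  --arn arn:aws:iam::111122223333:role/eks-admin-role \
  --username eks-admin \
  --group system:masters
```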
