I have an old EKS cluster. I don't know how it was built or what its configuration is, but there is one employee in the company who can access it, and he doesn't know how to give my IAM user access to it.
I tried to create a ClusterRoleBinding that binds my IAM user to the built-in cluster-admin ClusterRole.
I also tried to map my user in the aws-auth ConfigMap, but none of these attempts changed anything.
Can you offer any suggestions?
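For reference, granting an IAM user access is normally done by editing the aws-auth ConfigMap in the kube-system namespace, and the edit must be made by an identity that already has cluster access (such as the colleague mentioned above) — changes made by a user with no access are silently rejected. A minimal sketch, where the account ID and user name are placeholders:

```shell
# Sketch of the aws-auth entry an existing cluster admin would add.
# Placeholders: account 111122223333 and user my-user are not from the original post.
cat > aws-auth-map-users.yaml <<'EOF'
apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapUsers: |
    - userarn: arn:aws:iam::111122223333:user/my-user
      username: my-user
      groups:
        - system:masters
EOF
# An identity that already has access would then apply it:
#   kubectl apply -f aws-auth-map-users.yaml
# In practice, edit the existing ConfigMap rather than replacing it:
#   kubectl edit -n kube-system configmap/aws-auth
grep userarn aws-auth-map-users.yaml
```
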
Related
I have an EKS cluster whose aws-auth ConfigMap maps the AWS roles that can access the cluster. However, the roles mapped in it have all been deleted, so now I cannot connect to the cluster.
I tried creating new roles with the same names, but because they were created by AWS SSO, I cannot reuse the names; AWS doesn't allow it.
Can someone please help me out?
You can find the previous SSO role name in the CloudTrail event logs. Once you find the exact name, create an IAM role with that same name.
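To sketch that lookup: the live CloudTrail query below is shown commented out (it needs credentials, and the event name used is just one common choice); the event fragment and role name are fabricated for illustration only.

```shell
# Live lookup (requires AWS credentials; DescribeCluster is one event the
# old role likely generated, not the only possibility):
#   aws cloudtrail lookup-events \
#     --lookup-attributes AttributeKey=EventName,AttributeValue=DescribeCluster \
#     --max-results 50
# CloudTrail records the caller under userIdentity.sessionContext.sessionIssuer.
# Illustrative (fabricated) fragment of such an event:
cat > sample-event.json <<'EOF'
{
  "userIdentity": {
    "sessionContext": {
      "sessionIssuer": {
        "type": "Role",
        "userName": "AWSReservedSSO_AdministratorAccess_0123456789abcdef"
      }
    }
  }
}
EOF
# Pull the SSO role name out of the event:
grep -o 'AWSReservedSSO[^"]*' sample-event.json
```
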
I've created an EKS cluster with an IAM role and I'm trying to access the cluster from my CI build agent which is present in a different account.
My CI build EC2 instance is not able to identify the EKS cluster when I attach an instance profile and a trust relationship with the role used to create the EKS cluster.
But when I use IAM user authentication, I get the expected output.
I want it to be accessible with the IAM role instead of the IAM User.
Any thoughts or help would be appreciated.
Note: The EKS cluster is created using Terraform with an Assume role authentication.
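One way to make kubectl on the CI agent authenticate as the role (rather than as the instance profile directly) is to pass --role-arn when generating the kubeconfig. A sketch, where the cluster name, region, and role ARN are placeholders:

```shell
# Placeholders: the cluster name, region, and role ARN below are not from the original post.
CLUSTER_NAME="my-cluster"
ROLE_ARN="arn:aws:iam::111122223333:role/eks-creator-role"
# On the CI agent, regenerate the kubeconfig so kubectl assumes the role
# (requires AWS credentials, so shown commented out):
#   aws eks update-kubeconfig \
#     --name "$CLUSTER_NAME" \
#     --region us-east-1 \
#     --role-arn "$ROLE_ARN"
# The resulting kubeconfig runs `aws eks get-token --role-arn ...` under the hood,
# so kubectl authenticates as the assumed role rather than the instance profile.
echo "aws eks update-kubeconfig --name $CLUSTER_NAME --role-arn $ROLE_ARN" > update-kubeconfig.cmd
cat update-kubeconfig.cmd
```

For this to work, the role must also be mapped in the cluster's aws-auth ConfigMap, and the instance profile needs permission to assume it.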
This is a follow up question to my post AWS IAM user that belongs to an IAM group cannot assume IAM role that the IAM group was allowed to assume?, which has an answer.
So now my cijenkins user can issue kubectl commands on the EKS cluster. However, when I log into the AWS console and open the EKS cluster there, I see an access error. Why?
I have an EKS cluster created with the eksctl CLI tool.
The user or role that was used to create the EKS cluster has since been deleted from AWS IAM, and I haven't given any other user permission to access Kubernetes resources inside the cluster.
I have admin access to my AWS account. Is there a way to get access to the Kubernetes resources running inside EKS?
I tried the solution in the article below, but it didn't work out because the IAM user and role were deleted from AWS: https://aws.amazon.com/premiumsupport/knowledge-center/amazon-eks-cluster-access/
Help would be appreciated. Thanks
I believe there is a simple solution:
Simply recreate the IAM user or IAM role with the same name that was previously used for the cluster. Then generate and export user credentials, or assume the IAM role.
Then simply run
aws eks update-kubeconfig --name ${cluster_name}
and then you should be able to perform actions against API using kubectl.
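The whole recovery flow above can be sketched as follows; the account ID, role name, and cluster name are placeholders, the live AWS calls are commented out, and only the credential-parsing step is demonstrated locally with fake values:

```shell
# Placeholders: account 111122223333, role eks-creator, cluster my-cluster.
# 1. Recreate the IAM role (or user) with the exact original name.
# 2. Assume it and export the temporary credentials (needs AWS access, shown commented out):
#   CREDS=$(aws sts assume-role \
#     --role-arn arn:aws:iam::111122223333:role/eks-creator \
#     --role-session-name eks-recovery \
#     --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
#     --output text)
# 3. Regenerate the kubeconfig and verify access:
#   aws eks update-kubeconfig --name my-cluster
#   kubectl get nodes
# Local demonstration of parsing the tab-separated assume-role output (fake values):
CREDS="$(printf 'AKIAEXAMPLE\tsecretEXAMPLE\ttokenEXAMPLE')"
export AWS_ACCESS_KEY_ID="$(echo "$CREDS" | cut -f1)"
export AWS_SECRET_ACCESS_KEY="$(echo "$CREDS" | cut -f2)"
export AWS_SESSION_TOKEN="$(echo "$CREDS" | cut -f3)"
echo "$AWS_ACCESS_KEY_ID" > access_key.txt
cat access_key.txt
```

This works because EKS identifies the cluster creator by name, so an identity recreated with the identical name is recognized again (unless it was an SSO-managed role, as in the question above).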
I have RancherOS running on my bootstrap node, from which I want to launch AWS EC2 instances. I have to switch roles in my AWS account to launch instances, e.g. I am logged in as a user and have to change to an admin role. In the Rancher UI I can provide my access key and secret access key, but it points me to the VPC and subnets of the 'user' IAM role; I want to use the 'admin' IAM role's VPC and subnets instead. How can I switch roles in this case?
The following is where I got stuck: this is the point where I want to switch to another role in AWS.
You can switch roles the same way you do on other Linux systems:
http://docs.aws.amazon.com/cli/latest/userguide/cli-roles.html
Add the role name to your AWS CLI configuration and the CLI takes care of it from there.
Specify which profile you want to use in your CLI commands, and it will assume that role.
Hope it helps.
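Concretely, the configuration the answer refers to is a named profile with role_arn and source_profile, which normally lives in ~/.aws/config. A sketch, written to a local file here for illustration; the account ID, role name, and profile names are placeholders:

```shell
# Placeholders: account 111122223333, role 'admin', profiles 'user' and 'admin'.
# In real use this content goes in ~/.aws/config.
cat > aws-config-example <<'EOF'
[profile user]
region = us-east-1

[profile admin]
role_arn = arn:aws:iam::111122223333:role/admin
source_profile = user
region = us-east-1
EOF
# Any CLI call with --profile admin then assumes the admin role automatically, e.g.:
#   aws ec2 describe-subnets --profile admin
grep role_arn aws-config-example
```

With this in place, the admin profile's credentials come from assuming the role on top of the 'user' profile, so the VPCs and subnets visible to the admin role are the ones returned.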