AWS IAM Role in EC2 and access to S3 from JupyterHub - amazon-web-services

In JupyterHub, installed on an EC2 instance with an IAM role that allows access to a specific S3 bucket, when I try to access a file in that bucket with this code:
s3nRdd = spark.sparkContext.textFile("s3n://bucket/file")
I get this error:
IllegalArgumentException: u'AWS Access Key ID and Secret Access Key
must be specified as the username or password (respectively) of a s3n
URL, or by setting the fs.s3n.awsAccessKeyId or
fs.s3n.awsSecretAccessKey properties (respectively).'
However, when I export an AWS access key ID and secret access key (belonging to an identity with the same policy as that role) in the kernel configuration, the read for that file succeeds.
As the best practice is to use IAM roles, why doesn't the EC2 role work in this situation?
--update--
The EC2 IAM role has these 2 policies:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1488892557621",
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": [
        "arn:aws:s3:::<bucket_name>",
        "arn:aws:s3:::<bucket_name>/*"
      ]
    }
  ]
}
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "ec2:*",
      "Effect": "Allow",
      "Resource": "*"
    },
    {
      "Sid": "Stmt1480684159000",
      "Effect": "Allow",
      "Action": [
        "iam:PassRole"
      ],
      "Resource": [
        "*"
      ]
    }
  ]
}
Also, I am using Hadoop version 2.4.0, which doesn't support the s3a protocol, and updating is not an option.

S3n doesn't support IAM roles, and 2.4 is a very outdated version anyway. Not as buggy as 2.5 when it comes to s3n, but still less than perfect.
If you want to use IAM roles, you are going to have to switch to S3a, and yes, for you, that does mean upgrading Hadoop. Sorry.
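A minimal sketch of what that could look like once you are on a Hadoop build with s3a support (assumes the hadoop-aws and matching AWS SDK jars are on the classpath; bucket and file names are placeholders):

# With no access keys configured, s3a falls back to the EC2 instance profile
# (IAM role) credentials automatically.
hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()

# On Hadoop 2.8+ you can optionally pin the provider explicitly:
hadoop_conf.set(
    "fs.s3a.aws.credentials.provider",
    "com.amazonaws.auth.InstanceProfileCredentialsProvider",
)

s3aRdd = spark.sparkContext.textFile("s3a://bucket/file")
print(s3aRdd.count())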

You must create a bucket policy to allow access from particular IAM roles. Since S3 doesn't trust the roles, the API just falls back and asks for access keys.
Just add something like this to your bucket policy, replacing all the custom <> parameters with your own values.
{
  "Version": "2012-10-17",
  "Id": "EC2IAMaccesss",
  "Statement": [
    {
      "Sid": "MyAppIAMRolesAccess",
      "Effect": "Allow",
      "Principal": {
        "AWS": [
          "arn:aws:iam::<acc_id>:role/<yourIAMroleName>"
        ]
      },
      "Action": [
        "s3:ListBucket",
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::<yourbucket>/*",
        "arn:aws:s3:::<yourbucket>"
      ]
    }
  ]
}
(Update)
Make sure you attach a proper policy to the EC2 IAM role: IAM roles are very powerful, but no policy is attached to them out of the box. You must assign a policy; e.g. for minimal S3 access, attach the AmazonS3ReadOnlyAccess managed policy to the role.
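For example, one way to attach that managed policy with boto3 (the role name here is a placeholder; use the role actually attached to your instance profile):

import boto3

iam = boto3.client("iam")

# Attach the AWS-managed read-only S3 policy to the EC2 instance role.
iam.attach_role_policy(
    RoleName="my-ec2-jupyterhub-role",  # placeholder name
    PolicyArn="arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",
)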
You may also run into problematic interactions between Spark and IAM roles. Check the documentation on Spark access through the s3n:// scheme; otherwise, use s3a://.

Related

sam pipeline bootstrap created an omnipotent role

In the CI/CD section of the AWS SAM tutorial workshop, when I ran
sam pipeline init --bootstrap and went through the configurations, a role was created with this policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "*",
      "Resource": "*",
      "Effect": "Allow"
    }
  ]
}
Doesn't this grant the role complete permission over my AWS account, which is a big no-no? Or is it fine because the permission is granted to an AWS service and not a user?
This is the trust relationship:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "cloudformation.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
Having a role that exists with those permissions is fine.
When you create a vanilla AWS account (in other words, not including those created by enterprise landing zones like Control Tower), it comes with a policy called AdministratorAccess and a role called Administrator.
The best practice is in who or what you allow to use that policy and when.
Roles are preferred over users, since roles provide temporary security credentials. With a user you have durable credentials that you need to secure.
In this case you are allowing CloudFormation to assume this role. This makes sense since CloudFormation often needs to be able to create and modify any resources, including IAM roles. If you know you will not be creating or modifying IAM resources, you can use a more restrictive role (least privilege), for example one using the PowerUserAccess policy, which looks like this:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "NotAction": [
        "iam:*",
        "organizations:*",
        "account:*"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "iam:CreateServiceLinkedRole",
        "iam:DeleteServiceLinkedRole",
        "iam:ListRoles",
        "organizations:DescribeOrganization",
        "account:ListRegions"
      ],
      "Resource": "*"
    }
  ]
}
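If you do decide to tighten the bootstrapped role, a rough boto3 sketch of swapping the wildcard inline policy for the managed PowerUserAccess policy could look like this (the role name is an assumption; check what sam pipeline bootstrap actually created, and remember PowerUserAccess cannot create IAM roles, which SAM deployments often need):

import boto3

iam = boto3.client("iam")
role_name = "aws-sam-cli-managed-pipeline-CloudFormationExecutionRole"  # assumed name; verify in IAM

# Remove the bootstrapped wildcard inline policy (or policies)...
for policy_name in iam.list_role_policies(RoleName=role_name)["PolicyNames"]:
    iam.delete_role_policy(RoleName=role_name, PolicyName=policy_name)

# ...and attach the more restrictive AWS-managed PowerUserAccess policy instead.
iam.attach_role_policy(
    RoleName=role_name,
    PolicyArn="arn:aws:iam::aws:policy/PowerUserAccess",
)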

How to add sagemaker createApp to user profile executionrole?

I created an AWS SageMaker user profile using Terraform. I tried to launch SageMaker Studio from the user profile but was confronted with this error: SageMaker is unable to use your associated ExecutionRole [arn:aws:iam::xxxxxxxxxxxx:role/sagemaker-workshop-data-ml] to create app. Verify that your associated ExecutionRole has permission for 'sagemaker:CreateApp'. The role has the SageMaker full access policy attached to it, but that policy doesn't have the CreateApp permission, which is weird. Are there any policies I can attach to the role with the sagemaker:CreateApp permission, or do I need to attach a policy to the role through Terraform?
Make sure your execution role does not have a permissions boundary attached. By default, the AmazonSageMakerFullAccess policy allows CreateApp permissions - see this statement -
{
  "Effect": "Allow",
  "Action": [
    "sagemaker:CreatePresignedDomainUrl",
    "sagemaker:DescribeDomain",
    "sagemaker:ListDomains",
    "sagemaker:DescribeUserProfile",
    "sagemaker:ListUserProfiles",
    "sagemaker:*App",
    "sagemaker:ListApps"
  ],
  "Resource": "*"
},
You can add an inline policy such as the one below to make sure your role has permission to create apps -
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCreateApp",
      "Effect": "Allow",
      "Action": "sagemaker:CreateApp",
      "Resource": "*"
    }
  ]
}
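If you manage the role outside Terraform, one hedged way to attach that inline policy with boto3 (role name taken from the error message in the question; in Terraform the equivalent would be an aws_iam_role_policy resource):

import json
import boto3

iam = boto3.client("iam")

create_app_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowCreateApp",
            "Effect": "Allow",
            "Action": "sagemaker:CreateApp",
            "Resource": "*",
        }
    ],
}

# Attach the statement above as an inline policy on the Studio execution role.
iam.put_role_policy(
    RoleName="sagemaker-workshop-data-ml",
    PolicyName="AllowSageMakerCreateApp",  # arbitrary inline policy name
    PolicyDocument=json.dumps(create_app_policy),
)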
Are you talking about arn:aws:iam::aws:policy/AmazonSageMakerFullAccess? If you take a look at this policy, you'll find this as one of the statements:
{
  "Effect": "Allow",
  "Action": [
    "sagemaker:CreatePresignedDomainUrl",
    "sagemaker:DescribeDomain",
    "sagemaker:ListDomains",
    "sagemaker:DescribeUserProfile",
    "sagemaker:ListUserProfiles",
    "sagemaker:DescribeSpace",
    "sagemaker:ListSpaces",
    "sagemaker:*App",
    "sagemaker:ListApps"
  ],
  "Resource": "*"
},
The sagemaker:*App action on "Resource": "*" means that the policy actually does have the sagemaker:CreateApp permission.
It is a common guardrail (even listed in the AWS whitepaper "SageMaker Studio Administration Best Practices") to limit notebooks to specific instance types, and that guardrail denies the CreateApp action. The recommendation in the whitepaper is to control this at the service control policy level (in AWS Organizations, which you may not have visibility into), with this being an example policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "LimitInstanceTypesforNotebooks",
      "Effect": "Deny",
      "Action": [
        "sagemaker:CreateApp"
      ],
      "Resource": "*",
      "Condition": {
        "ForAnyValue:StringNotLike": {
          "sagemaker:InstanceTypes": [
            "ml.c5.large",
            "ml.m5.large",
            "ml.t3.medium",
            "system"
          ]
        }
      }
    }
  ]
}

Cross-Account IAM Access Denied with GUI Client, but permitted via CLI

I am stuck with provisioning end-user access to a cross-account shared bucket, and need help figuring out if there are specific policy requirements for using GUI clients to access the bucket, vs. the straight CLI.
IAM User Accounts are managed in our "Core" AWS Account.
S3 Bucket is provisioned in our "Dev" AWS Account.
S3 Bucket in Dev account is encrypted with KMS key in Dev Account.
We have configured our Bucket Policy to permit the user access.
We have configured user policies to permit access to the S3 bucket.
We have configured user policies to permit use of the KMS key.
When using the CLI, our user account can successfully access and use the S3 bucket. When attempting to connect with a GUI client (WinSCP, Cyberduck, Mac ForkLift) we receive permission denied errors.
BUCKET POLICY
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListObjectsInBucket",
      "Effect": "Allow",
      "Principal": {
        "AWS": [
          "arn:aws:iam::[DEVACCOUNT#]:role/EC2-ROLE-FOR-APP-ACCESS",
          "arn:aws:iam::[COREACCOUNT#]:user/end.user"
        ]
      },
      "Action": "s3:List*",
      "Resource": [
        "arn:aws:s3:::dev-mybucket",
        "arn:aws:s3:::dev-mybucket/*"
      ]
    },
    {
      "Sid": "AllObjectActions",
      "Effect": "Allow",
      "Principal": {
        "AWS": [
          "arn:aws:iam::[DEVACCOUNT#]:role/EC2-ROLE-FOR-APP-ACCESS",
          "arn:aws:iam::[COREACCOUNT#]:user/end.user"
        ]
      },
      "Action": [
        "s3:GetObject",
        "s3:Put*"
      ],
      "Resource": "arn:aws:s3:::dev-mybucket/*"
    }
  ]
}
User Policy - access KMS
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowUseOfDevAPPSKey",
      "Effect": "Allow",
      "Action": [
        "kms:Encrypt",
        "kms:Decrypt",
        "kms:ReEncrypt*",
        "kms:GenerateDataKey*",
        "kms:Describe*"
      ],
      "Resource": [
        "arn:aws:kms:ca-central-1:[DEVACCOUNT#]:key/[redacted-key-number]"
      ]
    },
    {
      "Sid": "AllowAttachmentOfPersistentResources",
      "Effect": "Allow",
      "Action": [
        "kms:CreateGrant",
        "kms:List*",
        "kms:RevokeGrant"
      ],
      "Resource": [
        "arn:aws:kms:ca-central-1:[DEVACCOUNT#]:key/[redacted-key-number]"
      ],
      "Condition": {
        "Bool": {
          "kms:GrantIsForAWSResource": true
        }
      }
    }
  ]
}
User policy - Access S3 Bucket
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowAccessToMyBucket",
      "Effect": "Allow",
      "Action": [
        "s3:*"
      ],
      "Resource": [
        "arn:aws:s3:::dev-mybucket/",
        "arn:aws:s3:::dev-mybucket/*"
      ]
    }
  ]
}
From aws s3 commands we can 'ls' content and 'cp' content from local to remote and from remote to local.
When configuring access with the GUI Clients we always receive somewhat generic 'permission denied' or 'access denied' type errors.
The GUI client is probably making a call that is not List*, Put* or GetObject.
For example, it might be calling GetObjectVersion, GetObjectAcl or GetBucketAcl.
Try adding Get* permissions in addition to List*.
You might also be able to look at the events in your AWS CloudTrail trail to see what specific API calls were denied.
For details, see: Specifying Permissions in a Policy - Amazon Simple Storage Service
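A hedged sketch of scanning CloudTrail for the denied calls with boto3 (assumes a trail is already recording in the bucket's region; object-level S3 calls only appear if data events are enabled, and the username filter is illustrative):

import json
import boto3

cloudtrail = boto3.client("cloudtrail", region_name="ca-central-1")

# Look at recent events made by the end user and print the ones that were denied.
events = cloudtrail.lookup_events(
    LookupAttributes=[{"AttributeKey": "Username", "AttributeValue": "end.user"}],
    MaxResults=50,
)["Events"]

for event in events:
    detail = json.loads(event["CloudTrailEvent"])
    if detail.get("errorCode") in ("AccessDenied", "AccessDeniedException"):
        print(detail["eventSource"], detail["eventName"], detail.get("errorMessage"))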
Access to an S3 bucket via a GUI such as the AWS web console or SFTP clients with S3 functionality (FileZilla, Cyberduck, ForkLift, etc.) requires the s3:ListAllMyBuckets action in a policy attached to that IAM user. This is very unfortunate, as the user will now be able to see ALL your bucket names in that account, even if they only have read, write, and/or list access to a single bucket in that account.
https://docs.aws.amazon.com/AmazonS3/latest/API/API_Operations.html
https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListBuckets.html
One other option is to go to the bucket URL directly. The user/role will require access via that bucket's Bucket policy.
https://s3.console.aws.amazon.com/s3/buckets/dev-mybucket
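The difference is easy to reproduce with two calls, which is roughly what those clients do on connect (bucket name from the question; boto3 used purely for illustration):

import boto3

s3 = boto3.client("s3")

# Most GUI clients start by enumerating buckets, which needs s3:ListAllMyBuckets...
print([b["Name"] for b in s3.list_buckets()["Buckets"]])

# ...whereas CLI-style access only needs permissions on the one bucket.
resp = s3.list_objects_v2(Bucket="dev-mybucket", MaxKeys=10)
print([obj["Key"] for obj in resp.get("Contents", [])])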

AWS. Get list of Permissions for IAM user

I have some questions about IAM permissions. I have an IAM user who has these minimal permissions:
1) For IAM:
{
  "Version": "2010-12-14",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "iam:ChangePassword"
      ],
      "Resource": [
        "arn:aws:iam::*:user/${aws:username}"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "iam:GetAccountPasswordPolicy"
      ],
      "Resource": "*"
    }
  ]
}
2) For S3
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1234567890123",
      "Effect": "Allow",
      "Action": [
        "s3:DeleteObject",
        "s3:GetObject",
        "s3:ListBucket",
        "s3:PutObject"
      ],
      "Resource": [
        "*"
      ]
    }
  ]
}
And I need to write an API, using the Java SDK, that is able to check whether a user has these minimal permissions, but at this level of access I cannot get my own permissions, policies, or roles. Is it possible to do so with this level of access?
Using the AWS Java SDK you can get your own IAM permissions and those of other IAM users, but you need to have the required IAM permissions to do so.
For example: http://docs.aws.amazon.com/cli/latest/reference/iam/list-user-policies.html
To list another user's policies you need the iam:ListUserPolicies permission.
Likewise, whatever AWS resource you try to access requires permission to query that resource. Permissions can be granted to you directly, or through a role that has been assigned to you.
I had an issue with identifying IAM user permissions and had to write an API responsible for that. So I used the AWS Java SDK's IAM module, where such an ability is already present: I used the simulatePrincipalPolicy request.
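For illustration, the same check in Python with boto3 (the Java SDK's simulatePrincipalPolicy takes equivalent parameters); the user ARN is a placeholder, and the caller needs the iam:SimulatePrincipalPolicy permission:

import boto3

iam = boto3.client("iam")

# Ask IAM to evaluate the user's attached policies against the actions we care about.
result = iam.simulate_principal_policy(
    PolicySourceArn="arn:aws:iam::123456789012:user/some.user",  # placeholder ARN
    ActionNames=[
        "iam:ChangePassword",
        "s3:GetObject",
        "s3:PutObject",
        "s3:ListBucket",
        "s3:DeleteObject",
    ],
)

for evaluation in result["EvaluationResults"]:
    # EvalDecision is "allowed", "implicitDeny", or "explicitDeny".
    print(evaluation["EvalActionName"], evaluation["EvalDecision"])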

Connecting Amazon S3 bucket to Other Server - IAM

I am trying to connect Amazon S3 to other services through a bucket policy.
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {"arn:aws:iam::ACCOUNT-ID:user/augmen",
}
"Action": [
"s3:ListBucket",
"s3:GetBucketLocation",
"s3:GetObject"
],
"Resource": ["arn:aws:s3:::rajatv.input",
"arn:aws:s3:::rajatv.input/*"]
}
]
}
Still getting errors like:
This policy contains invalid Json
Invalid Bucket syntax
No Resources
It appears that you want to give bucket access to a specific IAM user. If so, the best way is to put a policy on the IAM user themselves, so that the permissions apply only to them.
This policy would grant bucket access to whichever user has it as an IAM policy. To add it, go to the user and choose Add Inline Policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PermitBucketAccess",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation",
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::rajatv.input",
        "arn:aws:s3:::rajatv.input/*"
      ]
    }
  ]
}
Bucket Policies, which are applied to the bucket itself, are best used to grant access to everyone, whereas an IAM policy is best for granting permissions to specific IAM Users, Groups and Roles.
Principal needs to have this format:
"Principal": {"AWS": ["arn:aws:iam::ACCOUNT-ID-WITHOUT-HYPHENS:root"]},