Writing to S3 from Databricks with IAM role

I have an instance profile, already created by an admin, with which I am able to access some S3 buckets, but now I need write permission to a different S3 bucket. Can I request that this instance profile's IAM role be granted write access to that bucket?
I am not a Databricks admin.
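For reference, a minimal boto3 sketch of what the admin could attach to the instance profile's role; the role name, policy name, and bucket name below are placeholders, not values from your workspace:

```python
import json
import boto3

iam = boto3.client("iam")

# Placeholder names: swap in the real instance-profile role and target bucket.
ROLE_NAME = "databricks-instance-profile-role"
BUCKET = "my-target-bucket"

write_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Listing applies to the bucket ARN itself.
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": f"arn:aws:s3:::{BUCKET}",
        },
        {
            # Object reads/writes apply to the objects under the bucket.
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        },
    ],
}

iam.put_role_policy(
    RoleName=ROLE_NAME,
    PolicyName="write-to-target-bucket",
    PolicyDocument=json.dumps(write_policy),
)
```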

Related

Get access from account1's EC2 to account2's S3 object using the AWS SDK

I have an app running in an auto-scaled EC2 environment in account1, created via AWS CDK (it should also support running in multiple regions). During execution, the app needs to get an object from account2's S3.
One way to get the S3 data is to use temporary credentials (via STS AssumeRole; see the sketch after this list):
on the account1 side, create a policy that lets the EC2 instance role assume a role for temporary credentials to the S3 object
on the account2 side, create a policy granting GetObject access to the S3 object
on the account2 side, create a role, attach the policy from the previous point to it, and add a trust relationship to account1's EC2 role
Pros: no user credentials are required to get access to the data
Cons: each environment update requires manual permission reconfiguration
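A minimal boto3 sketch of this option, run on the account1 instance; the role ARN, bucket, and key are hypothetical:

```python
import boto3

# Hypothetical ARN of the role created in account2 (third point above).
ROLE_ARN = "arn:aws:iam::222222222222:role/account2-s3-reader"

# On EC2, the instance role's credentials are picked up automatically.
sts = boto3.client("sts")
creds = sts.assume_role(
    RoleArn=ROLE_ARN,
    RoleSessionName="cross-account-read",
)["Credentials"]

# Use the temporary credentials to read the object in account2.
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
obj = s3.get_object(Bucket="account2-data", Key="path/to/object")
data = obj["Body"].read()
```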
Another way is to create a user in account2 with permission to get the S3 object and store that user's credentials on the account1 side.
Pros: no manual permission configuration is required after an environment update
Cons: exposes the IAM user's credentials
Is there a better option that eliminates both the manual permission configuration and the explicit sharing of IAM user credentials?
You can add a bucket policy on the Amazon S3 bucket in Account 2 that permits access by the IAM role used by the Amazon EC2 instances in Account 1.
That way, the EC2 instance(s) can access the bucket as if it were in the same account, without having to assume any roles or users.
Simply set the Principal to the ARN of the IAM role used by the EC2 instances.
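A minimal sketch of such a bucket policy, applied with boto3 by an admin in Account 2; the account ID, role name, and bucket name are placeholders:

```python
import json
import boto3

s3 = boto3.client("s3")

# Placeholders: Account 1's ID/role and the Account 2 bucket.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            # The IAM role used by the EC2 instances in Account 1.
            "Principal": {"AWS": "arn:aws:iam::111111111111:role/account1-ec2-role"},
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::account2-data/*",
        }
    ],
}

s3.put_bucket_policy(Bucket="account2-data", Policy=json.dumps(bucket_policy))
```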

How to grant a user access only to the buckets the user created in AWS S3?

Is it possible for one IAM user to be able to create buckets and have full access only to the buckets that user creates, and NOT to other buckets?

Access denied when accessing Athena in SQLalchemy

Using PyAthena and SQLAlchemy, I connect to AWS Athena.
If I use the keys of an AWS admin, all is working fine and I can query data.
If I use the keys of an AWS user that has the AmazonAthenaFullAccess and AWSQuicksightAthenaAccess permissions, I get access denied.
I have permission to the output S3 bucket, and Athena accesses a public data set S3 bucket.
What permissions am I missing?
Thanks
The AmazonAthenaFullAccess policy only provides access to S3 buckets such as "arn:aws:s3:::aws-athena-query-results-*" and "arn:aws:s3:::athena-examples*".
You have 2 options:
Create a new policy with the content of the AmazonAthenaFullAccess policy, but with your own S3 resources (see the sketch below).
Add the AmazonS3FullAccess policy to your user, which grants permissions for all your S3 buckets.
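A sketch of the first option, assuming hypothetical bucket names; the action list mirrors the S3 actions the managed AmazonAthenaFullAccess policy grants on its own buckets:

```python
import json
import boto3

iam = boto3.client("iam")

# Placeholders: your query-output bucket and your data-set bucket.
RESULTS = "arn:aws:s3:::my-athena-results"
DATA = "arn:aws:s3:::my-public-dataset"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetBucketLocation",
                "s3:GetObject",
                "s3:ListBucket",
                "s3:ListBucketMultipartUploads",
                "s3:ListMultipartUploadParts",
                "s3:AbortMultipartUpload",
                "s3:PutObject",
            ],
            "Resource": [RESULTS, f"{RESULTS}/*", DATA, f"{DATA}/*"],
        }
    ],
}

iam.create_policy(
    PolicyName="athena-custom-s3-access",
    PolicyDocument=json.dumps(policy),
)
```

You would then attach the resulting policy ARN to the user (for example with iam.attach_user_policy) alongside the Athena-specific permissions.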

AWS User vs Resource Permissions

When a user has resource-based permissions for a resource but does not have user-based permissions for that service, can he use that service?
Example: user Jack has a resource-based permission to use the S3 bucket 'jamm', but Jack has no permission to use S3. Can Jack use the S3 bucket?
If you don't have permissions to access the S3 service, then you cannot use it at all.
In order to access any S3 bucket, you must have permission to execute S3 commands such as s3:GetObject. These permissions tell AWS which commands the user is allowed to execute. Anything not explicitly allowed is automatically denied.
The S3 bucket policy (your resource-level permissions) instructs the S3 service which users are allowed to access the bucket. But that only happens after the user has been given the permissions to execute the S3 commands with which to access the bucket.
So you need to:
give the user permission to execute the S3 commands needed to access the bucket (the default is none), and
give the bucket a policy that restricts which users can access the bucket (the default is anyone in the AWS account).
It is possible to restrict some S3 commands to your bucket, so that the user has permission to execute s3:GetObject (for example) only on your bucket (see the sketch below).
But some commands, such as s3:ListAllMyBuckets, cannot be restricted this way.
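A minimal sketch of such an identity policy for the 'Jack'/'jamm' example above, applied with boto3:

```python
import json
import boto3

iam = boto3.client("iam")

user_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # s3:GetObject can be restricted to a single bucket's objects.
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::jamm/*",
        },
        {
            # s3:ListAllMyBuckets only accepts Resource "*";
            # it cannot be limited to the 'jamm' bucket.
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "*",
        },
    ],
}

iam.put_user_policy(
    UserName="Jack",
    PolicyName="jamm-read-only",
    PolicyDocument=json.dumps(user_policy),
)
```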

Using IAM roles transitively

I have a question on using IAM roles with EC2 and EMR. Here's my current setup:
I have an EC2 machine launched with a particular IAM role (let's call this role 'admin'). My workflow is to upload a file to S3 from this machine and then create an EMR cluster with a different IAM role (a 'runner' role). The EMR cluster works on the file uploaded to S3 from the admin machine.
Admin is a role with privileges to all APIs in all AWS services. Runner has access to all APIs in EMR, EC2, and S3.
For some reason, the EMR cluster is unable to access the input file loaded into S3. It keeps getting an 'access denied' exception from S3.
I guess writing to S3 with one IAM role and reading it with a different IAM role is what is causing the issue.
Any ideas on what is going wrong here, or whether this is even a supported use case, are appreciated.
Thanks!
http://blogs.aws.amazon.com/security/post/TxPOJBY6FE360K/IAM-policies-and-Bucket-Policies-and-ACLs-Oh-My-Controlling-Access-to-S3-Resourc
S3 objects are protected in three ways, as described in the post linked above:
Your IAM role needs permission to read S3 objects.
The S3 bucket policy must allow your IAM role access to the object.
The S3 ACL for the specific object must also allow your IAM role access to the object.
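A boto3 sketch for checking the second and third layers, and for uploading so that the object ACL doesn't block a different reader; the bucket and key are placeholders:

```python
import boto3

s3 = boto3.client("s3")
BUCKET, KEY = "my-emr-input", "data/input.csv"

# Upload from the 'admin' machine with full control granted to the
# bucket owner, so the object ACL won't block the 'runner' role.
with open("input.csv", "rb") as f:
    s3.put_object(Bucket=BUCKET, Key=KEY, Body=f, ACL="bucket-owner-full-control")

# Inspect the object ACL and the bucket policy (get_bucket_policy
# raises a ClientError if no policy is attached to the bucket).
print(s3.get_object_acl(Bucket=BUCKET, Key=KEY)["Grants"])
print(s3.get_bucket_policy(Bucket=BUCKET)["Policy"])
```

The ACL grant matters mainly for cross-account writes; when both roles live in the same account, the first layer (the runner role's own IAM permissions) is the most likely culprit.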