I want to restrict users from executing INSERT queries against a master table (not a CTAS table) in Athena.
Is there a way I can achieve this?
The users will be executing queries from Lambda.
Athena only supports coarse actions such as StartQueryExecution and StopQueryExecution in IAM permission policies - so there is no way to differentiate which type of SQL command (DDL, DML) is being executed.
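For illustration, an Athena statement in an IAM policy can only be scoped to API actions and resources such as workgroups - there is no condition key that exposes the SQL statement type. (Region, account ID, and workgroup below are placeholders.)

{
  "Effect": "Allow",
  "Action": [
    "athena:StartQueryExecution",
    "athena:StopQueryExecution",
    "athena:GetQueryExecution",
    "athena:GetQueryResults"
  ],
  "Resource": "arn:aws:athena:us-east-1:111122223333:workgroup/primary"
}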
However, I think you can overcome this by denying permissions on Glue and S3, so that Athena queries that try to execute INSERTs will fail:
Glue permissions can be managed at the catalog, database, and table level; some examples can be found in AWS's Identity-Based Policies (IAM Policies) for Access Control for Glue.
Relevant Glue actions to deny: BatchCreatePartition, CreatePartition, UpdatePartition - see Actions, resources, and condition keys for AWS Glue.
On S3 you need to deny PutObject or Put* for the S3 location of the specific table - see Actions defined by Amazon S3. Again, this can be defined at the object level within a bucket.
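A minimal sketch of such a deny policy, attached to the Lambda function's role - the region, account ID, database, table, and bucket names are all hypothetical placeholders:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyPartitionWritesOnMasterTable",
      "Effect": "Deny",
      "Action": [
        "glue:BatchCreatePartition",
        "glue:CreatePartition",
        "glue:UpdatePartition"
      ],
      "Resource": [
        "arn:aws:glue:us-east-1:111122223333:catalog",
        "arn:aws:glue:us-east-1:111122223333:database/mydb",
        "arn:aws:glue:us-east-1:111122223333:table/mydb/master_table"
      ]
    },
    {
      "Sid": "DenyObjectWritesAtTableLocation",
      "Effect": "Deny",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-bucket/master_table/*"
    }
  ]
}

With this attached, SELECT queries still work, but an INSERT that tries to write new objects or partitions under the master table's location should fail.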
I'm trying to query a partitioned table that is based on an S3 bucket from Lambda
and I get the following error:
But when I run the same query via Athena, it works well.
My Lambda role includes full S3 permissions for all resources.
By the way, I was given access to another S3 bucket (in another account) - it is not my bucket, but I have read and list permissions - and using Lambda I am able to create the partitioned table on their bucket.
Using Lambda, this query works:
ALTER TABLE access_Partition ADD PARTITION
(year = '2022', month = '03', day = '15', hour = '01') LOCATION 's3://sddds/2022/03/15/01/';
But a SELECT query on the above table (after the creation) gets a permission error.
(When I open the executed query in Athena it is marked as failed, but I can run it there successfully.)
select * from access_Partition
Please advise!
Amazon Athena uses the permissions of the entity making the call to access Amazon S3. So, when you run an Athena query in the console, it is using permissions from your IAM User. When it is run from Lambda, it uses the permissions from the IAM Role associated with the Lambda function.
When this command is run:
ALTER TABLE access_Partition ADD PARTITION
(year = '2022', month = '03', day = '15', hour = '01') LOCATION 's3://sddds/2022/03/15/01/';
it is updating information (metadata) in the data catalog used in Athena in your own account. It is not actually accessing the bucket until a query is run.
The fact that the query fails when it is run suggests that the IAM Role does not have permission to access the bucket in the other AWS Account.
You should add a Bucket Policy on the S3 bucket in the other account that grants access permission for the IAM Role used by the Lambda function.
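A sketch of such a bucket policy, using the bucket from the question; the account ID and role name are placeholders for the Lambda function's role:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowLambdaRoleToReadBucket",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111122223333:role/my-lambda-role"
      },
      "Action": [
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::sddds",
        "arn:aws:s3:::sddds/*"
      ]
    }
  ]
}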
Can I create a database and table in Athena service within my account to access S3 data in another account?
I went over the link below, and I assume as per this documentation that both Amazon Athena and the S3 bucket have to be in the same account, with access provided to a user in another account.
https://console.aws.amazon.com/athena/home?force&region=us-east-1#query
From Access Control Policies - Amazon Athena:
To run queries in Athena, you must have the appropriate permissions for:
The Athena actions.
The Amazon S3 locations where the underlying data is stored that you are going to query in Athena.
...
So, it seems that the IAM User who is executing the Athena query requires access to the Amazon S3 location.
This could be done by adding a Bucket Policy to the S3 bucket in the other account that permits the IAM User access to the bucket.
To explain better:
Account-A with IAM-User-A and AWS Athena
Account-B with Bucket-B that has a Bucket Policy granting access to IAM-User-A
This answer deals with the additional information that:
A Lambda function in Account-A must be able to create a table in Amazon Athena in Account-B
I haven't tested it, but I think you will require:
Role-A in Account-A for the Lambda function that:
Permits AssumeRole on Role-B
Role-B in Account-B that:
Permits access to Amazon Athena and the source bucket in Amazon S3
Trusts Role-A
The Lambda function will run with Role-A. It will then use credentials from Role-A to call AssumeRole on Role-B. This will return a new set of credentials that can be used to call Amazon Athena in Account-B.
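A sketch of the two policy pieces involved, with placeholder account IDs - first the trust policy on Role-B in Account-B:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "TrustRoleAFromAccountA",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111111111111:role/Role-A"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

And the statement on Role-A in Account-A that permits calling AssumeRole on Role-B:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowAssumeRoleB",
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::222222222222:role/Role-B"
    }
  ]
}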
I have manually created a Glue table with an S3 bucket as the source.
The S3 bucket has a bucket policy defined to allow access only from
root
my user_id
or a role defined for Glue
Now, when a different user who has AWSGlueConsoleFullAccess tries to access the table from the Glue console, he gets access denied, although Glue has service access to the S3 bucket.
I would appreciate help in understanding this behavior.
Thanks
Can you please look into the details of the AWSGlueConsoleFullAccess policy? Most probably it expects the S3 bucket to have a certain prefix, e.g. "aws-glue-*". In that case, either update your policy or rename your bucket to have the aws-glue- prefix.
"Resource": [
"arn:aws:s3:::aws-glue-*"
Using PyAthena and SQLAlchemy, I connect to AWS Athena.
If I use the keys of an AWS admin, all is working fine and I can query data.
If I use the keys of an AWS user that has the AmazonAthenaFullAccess and AWSQuicksightAthenaAccess permissions, I get access denied.
I have permission to the S3 output location, and Athena accesses a public dataset S3 bucket.
What permissions am I missing?
Thanks
The AmazonAthenaFullAccess policy only provides access to S3 buckets such as "arn:aws:s3:::aws-athena-query-results-*" and "arn:aws:s3:::athena-examples*".
You have 2 options:
Create a new policy with the content from the AmazonAthenaFullAccess policy, but with different S3 resources (a sketch follows below).
Add the AmazonS3FullAccess policy to your user, which grants permissions on all your S3 buckets.
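For the first option, the S3 portion of the custom policy might look something like this - the bucket names are placeholders for your own query-results bucket and the public dataset bucket:

{
  "Effect": "Allow",
  "Action": [
    "s3:GetBucketLocation",
    "s3:GetObject",
    "s3:ListBucket",
    "s3:ListBucketMultipartUploads",
    "s3:ListMultipartUploadParts",
    "s3:AbortMultipartUpload",
    "s3:PutObject"
  ],
  "Resource": [
    "arn:aws:s3:::my-query-results-bucket",
    "arn:aws:s3:::my-query-results-bucket/*",
    "arn:aws:s3:::public-dataset-bucket",
    "arn:aws:s3:::public-dataset-bucket/*"
  ]
}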
For reasons beyond my control, I have the following:
A table CustomerPhoneNumber in DynamoDB under one AWS account.
A Redshift cluster under a different AWS account (same geographic region; EU)
Is there any way to run the COPY command to move data from Dynamo into Redshift across accounts?
Typically if they were under the same account, it would be done via IAM role pretty easily:
copy public.my_table (col1, col2, col3) from 'dynamodb://CustomerPhoneNumber' iam_role 'arn:aws:iam::XXXXXXXXXXX:role/RandomRoleName' readratio 40;
But obviously this doesn't work in my case.
Any ideas?
The answer by John is not applicable any more; this is how you can do it:
The AWS account that owns the required resource (DynamoDB in this case; the trustING account) needs to add the account requiring access (the trusTED AWS account) as trusted in its DynamoDB read-only role: arn:aws:iam:::role/
In the trusted account, create a policy which allows sts:AssumeRole (with the above trustING account's role ARN as the resource),
and attach that policy to redshift_access_role (which has all the privileges required to run the COPY command).
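A sketch of those two pieces, keeping the placeholder account IDs and role names from this answer - first the trust policy on the trustING account's DynamoDB read-only role:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<trusTEDawsAccountId>:role/redshift_access_role"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

And the policy attached to redshift_access_role in the trusted account:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::<trusTINGawsAccountId>:role/<dynamodbreadrole>"
    }
  ]
}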
Run the command as follows (note: no space after the comma between the two role ARNs):
copy redshift_tbl from 'dynamodb://dynamotbl'
iam_role 'arn:aws:iam::<trusTEDawsAccountId>:role/redshift_access_role,arn:aws:iam::<trusTINGawsAccountId>:role/<dynamodbreadrole>'
readratio 50
Details in:
https://docs.aws.amazon.com/redshift/latest/mgmt/authorizing-redshift-service.html
You can use CREDENTIALS and specify the access key and secret key for the other account. Add the following to your COPY statement:
credentials 'aws_access_key_id=AKIAXXXXX;aws_secret_access_key=yyyyyy'
You cannot use cross-account roles with Redshift. To quote the Amazon documentation:
An IAM role can be associated with an Amazon Redshift cluster only if
both the IAM role and the cluster are owned by the same AWS account.
Authorizing COPY and UNLOAD Operations Using IAM Roles