I've come across a very weird permission issue. I'm trying to upload a file to S3; here's my function:
import boto3

def UploadFile(FileName, S3FileName):
    session = boto3.session.Session()
    s3 = session.resource('s3')
    s3.meta.client.upload_file(FileName, "MyBucketName", S3FileName)
I did configure the AWS CLI on the server. This function works fine when I log into the server and launch a Python interpreter, but it fails when called from my Django REST API with:
An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
No idea why the same function works when called from the interpreter but fails when called from Django. Both are in the same virtual environment. Any suggestions?
According to the boto3 docs, boto3 is looking for credentials in the following places:
Passing credentials as parameters in the boto.client() method
Passing credentials as parameters when creating a Session object
Environment variables
Shared credential file (~/.aws/credentials)
AWS config file (~/.aws/config)
Assume Role provider
Boto2 config file (/etc/boto.cfg and ~/.boto)
Instance metadata service on an Amazon EC2 instance that has an IAM role configured.
Note that many of these places are paths with "~" in them. "~" refers to the current user's home directory. Most likely, your REST API is running under a different system user than you are using to test your code.
The proper solution is to use IAM roles, as this allows your server to have S3 access without you needing to give it IAM credentials. However, if that doesn't work for your setup, you should put the IAM credentials in the /etc/boto.cfg file, as that file is user-agnostic.
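For reference, a minimal sketch of the boto2-style format that boto3 reads from /etc/boto.cfg (the key values here are placeholders):

[Credentials]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY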
Related
I have an EC2 instance running with a full EC2 access role, and it runs a script that uses the boto3 module to call some AWS services.
Which extra IAM permissions are required to run boto3, other than configuring the credentials file under the .aws folder?
Thanks
boto3 is a Python library for making API calls to AWS. It is the AWS Python SDK.
Any API calls made to AWS must be made using AWS credentials. These credentials are associated with an IAM (Identity and Access Management) User. The User must be assigned the necessary permissions to allow them to make the call.
For example, if you wish to make an API call to create an Amazon SQS queue, the call must be made using credentials from an IAM User that has permission to call CreateQueue().
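A minimal sketch of that example (the queue name is just an illustration):

import boto3

# This call fails with an authorization error unless the resolved
# credentials belong to an IAM identity allowed to call sqs:CreateQueue.
sqs = boto3.client('sqs')
response = sqs.create_queue(QueueName='my-test-queue')  # hypothetical name
print(response['QueueUrl'])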
SETUP
I created a new AWS user via the AWS web console and selected both console and programmatic/CLI access
I have added the AdministratorAccess policy directly to it.
I have not enabled MFA for this user
I have verified that my credentials file within the .aws directory contains the proper values for aws_access_key_id and aws_secret_access_key
I have verified that my config file within the .aws directory does not contain any lines that would overwrite data for the profile
I am verifying that I am using the correct profile info by running aws configure list
THE ISSUE
Executing aws ec2 describe-regions returns:
An error occurred (UnauthorizedOperation) when calling the DescribeRegions operation: You are not authorized to perform this operation.
The error is pretty straightforward, but I'm not sure what else I can do to authorize this user. I had a coworker follow the same steps and the CLI worked as expected for him.
I researched the steps from this S.O. post but am still scratching my head.
Your AWS CLI is getting credentials from somewhere else. See Configuration Settings and Precedence:
http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html#config-settings-and-precedence
Make sure it is not getting the credentials from environment variables or from other locations. The AWS CLI looks for credentials and configuration settings in the following order:
1. Command line options – region, output format, and profile can be specified as command options to override default settings.
2. Environment variables – AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, etc.
3. The AWS credentials file – ~/.aws/credentials on Linux, OS X, or Unix, or C:\Users\USERNAME\.aws\credentials on Windows. Can contain multiple named profiles in addition to a default profile.
4. The CLI configuration file – ~/.aws/config on Linux, OS X, or Unix, or C:\Users\USERNAME\.aws\config on Windows. Can contain a default profile, named profiles, and CLI-specific configuration parameters for each.
5. Instance profile credentials – these credentials can be used on EC2 instances with an assigned instance role.
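If it isn't obvious which source is winning, a quick sketch like this asks botocore directly which provider supplied the credentials:

import boto3

# Ask botocore which credential provider the session actually used,
# e.g. 'env', 'shared-credentials-file', or 'iam-role'. Helpful for
# spotting stray environment variables overriding your profile.
session = boto3.session.Session()
credentials = session.get_credentials()
if credentials is None:
    print("No credentials found at all")
else:
    print(credentials.method)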
Using the boto3 library, I uploaded and downloaded files from AWS S3 successfully.
But after a few hours, it suddenly shows InvalidAccessKeyId for the same code.
What I have done:
set ~/.aws/credentials
Set environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
I tried the following solutions, but the error still happens:
adding quotes around config values
Am I missing anything? Thanks for your help.
You do not need to configure both .aws/credentials AND environment variables.
From Credentials — Boto 3 documentation:
The order in which Boto3 searches for credentials is:
Passing credentials as parameters in the boto.client() method
Passing credentials as parameters when creating a Session object
Environment variables
Shared credential file (~/.aws/credentials)
AWS config file (~/.aws/config)
Assume Role provider
Boto2 config file (/etc/boto.cfg and ~/.boto)
Instance metadata service on an Amazon EC2 instance that has an IAM role configured.
The fact that your credentials stopped working after a period of time suggests that they were temporary credentials created via the AWS Security Token Service, with an expiry time. (Temporary credentials have an access key ID that begins with ASIA rather than AKIA, and they also require an accompanying session token.)
If you have the credentials in ~/.aws/credentials, there is no need to set the AWS_ACCESS_KEY_ID & AWS_SECRET_ACCESS_KEY environment variables.
Environment variables are valid only for the current shell session.
If you are using boto3, you can specify the credentials while creating the client itself, as in the sketch below.
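A minimal sketch (ACCESS_KEY, SECRET_KEY, and SESSION_TOKEN are placeholders for values from your own secret store):

import boto3

# Credentials passed directly to the client take precedence over
# environment variables and the shared credentials file.
client = boto3.client(
    's3',
    aws_access_key_id=ACCESS_KEY,        # placeholder
    aws_secret_access_key=SECRET_KEY,    # placeholder
    aws_session_token=SESSION_TOKEN,     # only needed for temporary (STS) credentials
)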
The best way to configure AWS credentials is to install the AWS Command-Line Interface (CLI) and run aws configure from the bash console:
~/.aws/credentials format
[default]
aws_access_key_id = ***********
aws_secret_access_key = ************
I found this article about the same issue.
Amazon suggests generating a new key, and I did.
It works now, though we don't know the root cause.
I suggest doing the same to save a lot of time if you run into this problem.
My library needs to connect to arbitrary AWS-IoT accounts, so I'm passing the required credentials when I instantiate the AWS client:
client = boto3.client('iot',
                      aws_access_key_id=ACCESS_KEY,
                      aws_secret_access_key=SECRET_KEY, ...)
According to the boto3 docs (excerpt below), credentials passed in this way should override all other credentials. However, this call generates the following log message from boto3:
"Found credentials in shared credentials file: ~/.aws/credentials"
I really don't want boto3 picking up whatever credentials a user may happen to have configured on their system - I want it to use just the ones I'm passing to boto3.client(). Any ideas how to ensure this?
boto3 docs excerpt:
The order in which Boto3 searches for credentials is:
1. Passing credentials as parameters in the boto.client() method
2. Passing credentials as parameters when creating a Session object
3. Environment variables
4. Shared credential file (~/.aws/credentials)
5. AWS config file (~/.aws/config)
6. Assume Role provider
7. Boto2 config file (/etc/boto.cfg and ~/.boto)
8. Instance metadata service on an Amazon EC2 instance that has an IAM role configured.
From my testing, when passing aws_access_key_id & aws_secret_access_key parameters to boto3.client(..), it accepts them even if they are incorrect and does not load any other credentials from file.
When an API call is sent, it returns a NoCredentialsError exception.
However, when the above parameters are not passed, i.e. boto3.client('iot'), AND a default profile exists on the system, I can see "Found credentials in shared credentials file: ~/.aws/credentials".
Meaning, as long as you explicitly provide credential parameters, you should be protected from calls being executed against the wrong account.
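A sketch of that test, using deliberately invalid placeholder keys:

import boto3
from botocore import exceptions

# The client is created without complaint and no credentials file is
# consulted; the failure only surfaces when an actual API call is made.
client = boto3.client(
    'iot',
    aws_access_key_id='INVALID-EXAMPLE-KEY',    # deliberately invalid
    aws_secret_access_key='invalid-secret',     # deliberately invalid
)
try:
    client.list_things()
except (exceptions.ClientError, exceptions.NoCredentialsError) as e:
    print(type(e).__name__, e)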
I have this snippet to upload a file to S3:
import boto3

s3 = boto3.resource('s3')
s3.Object('bucketname', timestamped_filename).put(Body=open(FILE_SAVE_PATH, 'rb'))
My bucket has delete/upload permission for everyone, so it does work on my Windows machine.
However, when I try to run the same code on my Mac, it throws:
botocore.exceptions.NoCredentialsError: Unable to locate credentials
Is this behavior normal?
And what kind of credentials can I possibly provide if I'm accessing a public bucket?
Thank you.
When making an API call to AWS, valid credentials must be provided. These credentials are associated with an IAM User and grant access to AWS services.
When making API calls (or using the AWS Command-Line Interface (CLI)) from an Amazon EC2 instance, these credentials can be granted to the EC2 instance by assigning an IAM Role to the instance at launch time.
When making calls from a non-EC2 computer, credentials must be provided via a configuration file or environment variables.
It appears that your Windows machine is either an EC2 instance with a role, or it has a local configuration file with valid credentials; and it appears that your Mac has neither of these.
See: boto3 Credentials documentation
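As an aside (not part of the answer above): if the bucket genuinely allows anonymous access, botocore can send unsigned requests so that no credentials are needed at all. A minimal sketch, assuming the bucket policy permits anonymous writes to 'bucketname':

import boto3
from botocore import UNSIGNED
from botocore.config import Config

# Unsigned requests skip the credential lookup entirely; they only
# succeed if the bucket policy allows anonymous access to the operation.
s3 = boto3.resource('s3', config=Config(signature_version=UNSIGNED))
s3.Object('bucketname', 'example.txt').put(Body=open('example.txt', 'rb'))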