I am running something that uses AWS services on a production server. The most commonly suggested ways of providing credentials to a session are one of:
from boto3 import Session
session = Session(profile_name='my_aws_profile')
OR
from boto3 import Session
session = Session(
    aws_access_key_id="AWS_ACCESS_KEY",
    aws_secret_access_key="AWS_SECRET_ACCESS_KEY"
)
What are my options so that I can
share the code without sharing my credentials, and
specify the path of my AWS credentials file instead of assuming that it has to be ~/.aws/credentials?
The documentation lists all the ways Boto can find AWS credentials: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html#configuring-credentials
In particular, the best practice would be to put credentials in environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY. Boto will pick those up automatically.
Save your credentials as environment variables.
If you want another location for your credentials, save them in the config file and set the AWS_CONFIG_FILE environment variable to your desired path. If you have credentials stored in both the credentials file and the config file, the ones in the credentials file take precedence.
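If the goal is to keep the credentials file itself somewhere other than ~/.aws, botocore also honours the AWS_SHARED_CREDENTIALS_FILE environment variable. A minimal sketch, where the path and profile name are placeholders of mine:

import os
import boto3

# Point boto3 at a credentials file outside ~/.aws before any session is created.
# The path and profile name below are placeholders, not values from the question.
os.environ["AWS_SHARED_CREDENTIALS_FILE"] = "/opt/myapp/secrets/aws_credentials"
# AWS_CONFIG_FILE works the same way for the config file, if you use one:
# os.environ["AWS_CONFIG_FILE"] = "/opt/myapp/secrets/aws_config"

session = boto3.Session(profile_name="my_aws_profile")
s3 = session.client("s3")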
Related
I am trying to use the credentials file to load my AWS credentials on Windows.
I am using boto3, and it works properly when I set the credentials in Windows system environment variables, but not when I create the credentials file in the %UserProfile%/.aws folder.
The content of my credentials file is as follows (I replaced my real tokens with XXXX):
[default]
aws_access_key_id=XXXXXXXXX
aws_secret_access_key=XXXXXXX
aws_session_token=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Is there any way to trace the precedence boto3 uses to retrieve the credentials? Any thoughts on why it is not picking the values from the file?
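One way to trace what boto3 is doing is to turn on botocore's credential-provider logging and then ask the session what it found. A rough sketch:

import logging
import boto3

# Debug logging prints each credential provider botocore tries and which one wins.
boto3.set_stream_logger('botocore.credentials', logging.DEBUG)

session = boto3.Session()
creds = session.get_credentials()
print(creds.method if creds else 'no credentials found')  # e.g. 'env', 'shared-credentials-file'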
Please follow these steps:
pip install awscli
aws configure -> enter all your credentials
Then try again and the error should go away.
The shared credentials file is located at C:\Users\USERNAME\.aws\credentials on Windows. This file can contain the credential details for the default profile and any named profiles.
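A quick way to confirm that boto3 is now finding credentials is to ask STS who you are; a small sketch:

import boto3

# If credentials are resolved, this prints the Account and Arn they belong to;
# otherwise it raises a NoCredentialsError.
sts = boto3.client('sts')
print(sts.get_caller_identity())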
I am trying to get a service account to create blobs in Google Cloud Storage
from within a Python script, but I am having issues with the credentials.
1) I create the service account for my project and then download the key file as JSON:
"home/user/.config/gcloud/service_admin.json"
2) I give the service account the necessary roles (via gcloud in a subprocess):
roles/viewer, roles/storage.admin, roles/resourcemanager.projectCreator, roles/billing.user
Then I would like to access a bucket in GCS
from google.cloud import storage
import google.auth
credentials, project = google.auth.default()
client = storage.Client('myproject', credentials=credentials)
bucket = client.get_bucket('my_bucket')
Unfortunately, this results in:
google.api_core.exceptions.Forbidden: 403 GET
https://www.googleapis.com/storage/v1/b/my_bucket?projection=noAcl:
s_account#myproject.iam.gserviceaccount.com does not have
storage.buckets.get access to my_bucket
I have somewhat better luck if I set the environment variable
export GOOGLE_APPLICATION_CREDENTIALS="home/user/.config/gcloud/service_admin.json"
and rerun the script. However, I want it all to run in a single instance of the script that creates the accounts and then creates the necessary files in the buckets. How can I access my_bucket if I know where my JSON credential file is?
Try this example from the Documentation for Server to Server Authentication:
from google.cloud import storage
# Explicitly use service account credentials by specifying the private key file.
storage_client = storage.Client.from_service_account_json('service_account.json')
# Make an authenticated API request
buckets = list(storage_client.list_buckets())
print(buckets)
This way you point directly to the file containing the Service Account key in your code.
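If you prefer to build the credentials object yourself (for example, to hand it to several clients), google-auth can load the same key file; a sketch where the file name is a placeholder:

from google.cloud import storage
from google.oauth2 import service_account

# Load the service account key explicitly and pass it to the storage client.
credentials = service_account.Credentials.from_service_account_file('service_account.json')
client = storage.Client(project='myproject', credentials=credentials)
bucket = client.get_bucket('my_bucket')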
I've come across a very weird permission issue. I'm trying to upload a file to S3; here's my function:
import boto3

def UploadFile(FileName, S3FileName):
    session = boto3.session.Session()
    s3 = session.resource('s3')
    s3.meta.client.upload_file(FileName, "MyBucketName", S3FileName)
I did configure aws-cli on the server. This function works fine when I log into the server and launch the Python interpreter, but fails when called from my Django REST API with:
An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
No idea why the same function works when called from the interpreter but fails when called from Django. Both are in the same virtual environment. Any suggestions?
According to the boto3 docs, boto3 looks for credentials in the following places:
Passing credentials as parameters in the boto3.client() method
Passing credentials as parameters when creating a Session object
Environment variables
Shared credential file (~/.aws/credentials)
AWS config file (~/.aws/config)
Assume Role provider
Boto2 config file (/etc/boto.cfg and ~/.boto)
Instance metadata service on an Amazon EC2 instance that has an IAM role configured.
Note that many of these places are paths with "~" in them. "~" refers to the current user's home directory. Most likely, your REST API is running under a different system user than the one you use to test your code.
The proper solution is to use IAM roles, as this allows your server to have S3 access without you needing to give it IAM credentials. However, if that doesn't work for your setup, you should put the IAM credentials in the /etc/boto.cfg file, as that location is user-agnostic.
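To see which user and credentials the Django process actually ends up with, here is a small diagnostic sketch you could drop into the failing code path temporarily:

import os
import boto3

# Print the home directory the process resolves and which credential source
# (if any) boto3 picked up there.
print(os.path.expanduser('~'))
creds = boto3.session.Session().get_credentials()
print(creds.method if creds else 'no credentials found')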
Using the boto3 library, I uploaded and downloaded files from AWS S3 successfully.
But after a few hours, the same code suddenly fails with InvalidAccessKeyId.
What I have done:
Set ~/.aws/credentials
Set environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
I tried the following solutions, but the error still happens:
adding quotes on config values
Am I missing anything? Thanks for your help.
You do not need to configure both .aws/credentials AND environment variables.
From Credentials — Boto 3 documentation:
The order in which Boto3 searches for credentials is:
Passing credentials as parameters in the boto3.client() method
Passing credentials as parameters when creating a Session object
Environment variables
Shared credential file (~/.aws/credentials)
AWS config file (~/.aws/config)
Assume Role provider
Boto2 config file (/etc/boto.cfg and ~/.boto)
Instance metadata service on an Amazon EC2 instance that has an IAM role configured.
The fact that your credentials stopped working after a period of time suggests that they were temporary credentials created via the AWS Security Token Service, with an expiry time.
If you have the credentials in ~/.aws/credentials there is no need to set environment variables AWS_ACCESS_KEY_ID & AWS_SECRET_ACCESS_KEY.
Environment variables are valid only for the current session.
If you are using boto3, you can also specify the credentials when creating the client itself.
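For example, a minimal sketch, assuming the key pair comes from somewhere safe (environment variables, a secrets manager) rather than being hard-coded:

import boto3

# Credentials passed here take precedence over everything else in the lookup chain.
s3 = boto3.client(
    's3',
    aws_access_key_id='AWS_ACCESS_KEY',
    aws_secret_access_key='AWS_SECRET_ACCESS_KEY',
)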
The best way to configure AWS credentials is to install the AWS Command-Line Interface (CLI) and run aws configure from the console. This writes ~/.aws/credentials in the following format:
[default]
aws_access_key_id = ***********
aws_secret_access_key = ************
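Once that file exists, boto3 picks it up with no keys in your code; a quick sketch (the bucket name is a placeholder):

import boto3

# boto3 reads ~/.aws/credentials automatically; nothing credential-related in code.
s3 = boto3.client('s3')
s3.upload_file('local.txt', 'my-bucket-name', 'remote.txt')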
I found this article describing the same issue.
Amazon suggests generating a new key, and I did.
Then it worked, but we don't know the root cause.
I suggest doing the same to save a lot of time if you run into the same problem.
My library needs to connect to arbitrary AWS-IoT accounts, so I'm passing the required credentials when I instantiate the AWS client:
client = boto3.client('iot',
aws_access_key_id=ACCESS_KEY,
aws_secret_access_key=SECRET_KEY,...)
According to the boto3 docs (excerpt below), credentials passed in this way should override all other credentials. However, this call generates the following log message from Boto3:
"Found credentials in shared credentials file: ~/.aws/credentials"
I really don't want boto3 picking up whatever credentials a user may happen to have configured on their system - I want it to use just the ones I'm passing to boto3.client(). Any ideas how to ensure this?
boto3 docs excerpt:
The order in which Boto3 searches for credentials is:
1. Passing credentials as parameters in the boto3.client() method
2. Passing credentials as parameters when creating a Session object
3. Environment variables
4. Shared credential file (~/.aws/credentials)
5. AWS config file (~/.aws/config)
6. Assume Role provider
7. Boto2 config file (/etc/boto.cfg and ~/.boto)
8. Instance metadata service on an Amazon EC2 instance that has an IAM role configured.
From my testing, when passing the aws_access_key_id & aws_secret_access_key parameters to boto3.client(..), it accepts them even if they are incorrect and does not load any other credentials from a file.
When an API call is sent, it returns a NoCredentialsError exception.
However, when the above parameters are not passed, i.e. boto3.client('iot'), AND a default profile exists on the system, I can see "Found credentials in shared credentials file: ~/.aws/credentials".
Meaning, as long as you explicitly provide credential parameters, you should be protected from calls executed on the wrong account.
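If you want to be doubly sure which account the supplied keys map to, you can ask STS using the same explicit parameters; a sketch (ACCESS_KEY and SECRET_KEY are the question's placeholders, and the region is an assumption of mine):

import boto3

# get_caller_identity is signed with the supplied keys only, so the returned
# Arn tells you which account/identity those keys actually belong to.
sts = boto3.client(
    'sts',
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
    region_name='us-east-1',
)
print(sts.get_caller_identity()['Arn'])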