I want to secure my secret variables for my django project, which is deployed on AWS Elastic Beanstalk.
I've seen others use environment variables and am wondering if this is secure. Or, is it more secure to use KMS? Thanks!
Usually if you have secrets you would store them in AWS Systems Manager Parameter Store or AWS Secrets Manager.
Then, instead of hard-coding the secrets' values in your .ebextensions or environment variables you would pass references to the parameter or secret in either AWS Systems Manager Parameter Store or AWS Secrets Manager.
This means that in your application, e.g. in .ebextensions, you would need to add extra logic to obtain the actual values from the secrets, and you would have to modify the instance role used by Elastic Beanstalk so it has permission to do so.
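For example, here is a minimal sketch of how Django settings on Elastic Beanstalk could pull the secret key from Parameter Store at startup, assuming the instance role has ssm:GetParameter (and kms:Decrypt) permission; the parameter name and region below are hypothetical:

# settings.py (excerpt)
import boto3

ssm = boto3.client("ssm", region_name="us-east-1")  # region is an assumption

# SecureString parameter, decrypted using the instance role's permissions
SECRET_KEY = ssm.get_parameter(
    Name="/myapp/django/secret_key",  # hypothetical parameter name
    WithDecryption=True,
)["Parameter"]["Value"]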
Related
We are managing an elastic beanstalk application via terraform, and are unsure of the best way to handle sensitive environment variables for our application.
Currently, we are storing these sensitive values in an AWS Secrets Manager secret. During the terraform apply step, we use an aws_secretsmanager_secret_version data source to load the secret. We then iterate over the key/value pairs in the secret, and create setting blocks within our aws_elastic_beanstalk_environment resource.
There are a couple of concerns we have with this approach:
We have to mark our sensitive values as nonsensitive, because terraform does not allow the use of sensitive values as arguments to for_each. This means that the plaintext values are logged as part of our terraform plan and terraform apply steps. This is an issue in our CD pipeline, but our workaround for this is to redirect all logs to /dev/null.
Our sensitive values appear in plaintext in our tfstate file. We keep this file in an S3 bucket, whose access is restricted to administrators and the deployment user. This is probably not a huge issue. The values are accessible via the Secrets Manager console anyway, and access is restricted in a similar way.
Is there a better solution that others are using to manage environment variables for an elastic beanstalk app managed via terraform?
I have a react app which is deployed using AWS Amplify. I'm using Google maps in my application and I wanted to know the right place to put the Google Maps API key. I have read about AWS Amplify Environment variables where we can save the api key in key-value pairs. Also, I know that we have AWS Secrets, which is for saving private data.
What is the right approach to save the API key in my use case? Is saving the api key in Amplify Environment variables safe enough? Or should I go for AWS secrets?
The Google Maps API best practices include (depending on exactly what you are using):
Store API keys and signing secrets outside of your application’s source code.
Store API keys or signing secrets in files outside of your application's source tree.
Amplify Environment variables are suited to store:
Third-party API keys
since the Amplify documentation states:
As a best practice, you can use environment variables to expose these configurations. All environment variables that you add are encrypted to prevent rogue access, so you can use them to store secret information.
So you can use them, as they are native to Amplify. AWS Secrets Manager is not natively supported by Amplify, and you would have to add extra code to your backend to make use of it.
The important thing to note is that these Amplify environment variables are only to be used by your backend service, not by front-end code.
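As a minimal sketch of that backend-only pattern in Python (the GOOGLE_MAPS_API_KEY variable name and the geocoding call are illustrative assumptions): the backend reads the key from its environment and proxies the request, so the key never ships in the front-end bundle.

import json
import os
import urllib.parse
import urllib.request

def handler(event, context):
    # The key is injected by the platform as an environment variable;
    # it never appears in front-end code.
    api_key = os.environ["GOOGLE_MAPS_API_KEY"]  # hypothetical variable name

    address = event.get("address", "")
    url = ("https://maps.googleapis.com/maps/api/geocode/json?"
           + urllib.parse.urlencode({"address": address, "key": api_key}))
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())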
We have an API app that uses Firebase Admin to send messages to devices.
Earlier, we used to specify the service account key using an environment variable like GOOGLE_APPLICATION_CREDENTIALS="path_to_file.json".
But now, since we are shifting to AWS Elastic Container Service on Fargate, I am unable to figure out how to put this file in the container for AWS ECS.
Any advice highly appreciated.
Thanks
Solved it by storing the service key as a JSON-stringified environment variable and using admin.credential.cert() instead of the default application credentials.
Refer: https://firebase.google.com/docs/reference/admin/node/admin.credential#cert
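The linked reference is for the Node.js SDK. As a rough Python equivalent with the firebase_admin package (the FIREBASE_SERVICE_ACCOUNT variable name is an assumption), the JSON string from the environment is parsed and handed to credentials.Certificate(), which accepts a dict as well as a file path:

import json
import os

import firebase_admin
from firebase_admin import credentials

# The whole service-account JSON is stored in a single environment variable
service_account = json.loads(os.environ["FIREBASE_SERVICE_ACCOUNT"])  # hypothetical name

cred = credentials.Certificate(service_account)  # dict instead of a file path
firebase_admin.initialize_app(cred)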
I would suggest instead AWS Secrets Manager, which is purpose-built for storing secrets. Take a look at this blog post:
https://aws.amazon.com/blogs/compute/securing-credentials-using-aws-secrets-manager-with-aws-fargate/
Even better than using environment variables, which have their own downsides, you can leverage AWS Parameter Store, which is a secure way to manage secrets in the AWS environment (secrets are encrypted both in transit and at rest).
You'd need to create an IAM role for Amazon ECS for your code to have access to the Parameter Store.
You may want to check this article: https://aws.amazon.com/blogs/compute/managing-secrets-for-amazon-ecs-applications-using-parameter-store-and-iam-roles-for-tasks/
Use the specific method from_service_account_info as described here. You then pass the content of the credentials JSON file as a dictionary.
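A minimal sketch of that approach with the google-auth library (the GOOGLE_SERVICE_ACCOUNT_JSON variable name is an assumption):

import json
import os

from google.oauth2 import service_account

# Parse the service-account JSON stored in an environment variable
info = json.loads(os.environ["GOOGLE_SERVICE_ACCOUNT_JSON"])  # hypothetical name

credentials = service_account.Credentials.from_service_account_info(info)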
I have properties files specific to dev, test, and other environments. I have to store these files in a secure place in AWS. I am using AWS native tools for build and deployment. Please let me know how to store these files in AWS.
There are many ways to deal with secrets in AWS, but one thing is clear: it depends on the service that will use and consume these secrets.
You can explore these options:
Environment variables (the simplest way)
AWS Secrets Manager
S3 (for keeping files)
One common approach is to pass your secrets as environment variables, but in the case of AWS I would recommend going with AWS Secrets Manager.
AWS Secrets Manager is an AWS service that makes it easier for you to manage secrets. Secrets can be database credentials, passwords, third-party API keys, and even arbitrary text. You can store and control access to these secrets centrally by using the Secrets Manager console, the Secrets Manager command line interface (CLI), or the Secrets Manager API and SDKs.
Basic Secrets Manager Scenario
In the most basic scenario, you store credentials for a database in Secrets Manager, and then use those credentials in an application that needs to access the database.
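As a minimal sketch of that scenario with boto3 (the secret name prod/myapp/db, the region, and the JSON structure of the secret are assumptions):

import json

import boto3

client = boto3.client("secretsmanager", region_name="us-east-1")  # region is an assumption

# The role running this code needs secretsmanager:GetSecretValue on the secret
response = client.get_secret_value(SecretId="prod/myapp/db")  # hypothetical secret name
secret = json.loads(response["SecretString"])

db_user = secret["username"]
db_password = secret["password"]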
Compliance with Standards
AWS Secrets Manager has undergone auditing for a number of standards and can be part of your solution when you need to obtain compliance certification.
You can explore this article on how to read and write secrets.
If you need to maintain files rather than just key/value pairs, you can store them in S3 and pull the files during deployment, but it is better to enable server-side encryption.
Still, I would prefer Secrets Manager over S3 and environment variables.
You can look at examples for S3 here and here.
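A minimal boto3 sketch of the S3 route (the bucket, key, and file names are assumptions; server-side encryption is requested per object here, though default bucket encryption or a bucket policy works too):

import boto3

s3 = boto3.client("s3")

# Upload the environment-specific properties file with server-side encryption
s3.upload_file(
    "application-dev.properties",       # hypothetical local file
    "myapp-config",                     # hypothetical bucket
    "dev/application.properties",       # hypothetical key
    ExtraArgs={"ServerSideEncryption": "aws:kms"},
)

# During deployment, pull the file back down
s3.download_file("myapp-config", "dev/application.properties", "application.properties")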
Bajjuri,
As Adil said in his answer:
AWS Secrets Manager -- if you want to store data as key/value pairs.
AWS S3 -- if you want to store files securely.
Adding to his answer, you can use the AWS CodeDeploy environment variables to fetch the files according to your environment.
Let's say you have a CodeDeploy deployment group for the dev environment named "DEV" and a deployment group for the prod environment named "PROD". You can read the deployment group name (CodeDeploy exposes it to hook scripts as DEPLOYMENT_GROUP_NAME) in a bash script called from the lifecycle hooks of the appspec file, and fetch the files or secrets accordingly.
I've been using this technique in production for a long time and it works like a charm.
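A minimal sketch of such a hook script, written here in Python rather than bash (the parameter naming scheme and file paths below are assumptions); the appspec.yml would point one of its lifecycle hooks, e.g. AfterInstall, at this script:

#!/usr/bin/env python3
# Lifecycle hook script referenced from appspec.yml (e.g. the AfterInstall hook)
import os

import boto3

# CodeDeploy exposes the deployment group name to hook scripts
deployment_group = os.environ["DEPLOYMENT_GROUP_NAME"]  # e.g. "DEV" or "PROD"

ssm = boto3.client("ssm")

# Hypothetical convention: parameters are namespaced per deployment group
prefix = f"/myapp/{deployment_group}/"
response = ssm.get_parameters_by_path(Path=prefix, WithDecryption=True)

with open("/opt/myapp/app.properties", "w") as f:  # hypothetical target path
    for param in response["Parameters"]:
        key = param["Name"][len(prefix):]
        f.write(f"{key}={param['Value']}\n")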
I have deployed a Django application on AWS. I want the application to be deployable by my team as well. What is the procedure to do this? I have searched a lot and spent a couple of hours on it. Does anyone have an answer or a tutorial?
Can we share these keys?
aws_access_key_id
aws_secret_access_key
No, the AWS access keys should be kept secret and not even stored under version control.
For deployment (i.e. the credentials needed to actually release the code, which are used by EB), you should use an AWS profile. Add a ~/.aws/credentials file with
[myprofile]
aws_access_key_id=...
aws_secret_access_key=...
and then, on all eb commands, use --profile, e.g.
eb create --profile myprofile
If your application requires other AWS services (e.g. RDS, S3, SQS), then you can use the same local profile for development (although I would recommend not requiring any other AWS services for testing) by setting the environment variable: export AWS_PROFILE=myprofile. Then rely on AWS roles and policies for the production environment.
If you feel you need the secret keys as Django settings, then consider using https://django-environ.readthedocs.org, where you can keep all those secrets in a .env file that gets loaded by Django. But again, this file should not be under version control.
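A minimal django-environ sketch (the variable names below are assumptions):

# settings.py (excerpt)
import environ

env = environ.Env()
# Read key=value pairs from a .env file (keep this file out of version control)
environ.Env.read_env()

SECRET_KEY = env("DJANGO_SECRET_KEY")            # hypothetical variable name
DATABASES = {"default": env.db("DATABASE_URL")}  # parses a database URL into Django settings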
You should also create IAM users for every person on your team, so each person has their own credentials, and you can more easily monitor or, if needed, revoke access.