Django website failing when using os.environ.get for AWS keys - django

I am using Django, AWS and Heroku. I have a website that goes live locally when I have the following keys set:
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID","aaaa")
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY","bbbb")
However, this is bad form because I can't commit this file to GitLab. If I did, my keys would be public (my GitLab repo isn't private yet, still practicing before I make that commitment) and all hell would break loose. But if I use the following code and try to go live locally,
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID")
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY")
I get the following error:
botocore.exceptions.NoCredentialsError: Unable to locate credentials
When I use the latter code and go live via Heroku as an actual webpage, it works.
It's as if os.environ.get doesn't work in my local settings. Does anyone know what to do? I'd hate to have to change these lines of code every time I want to commit to git or push to Heroku. I feel like the second version should work locally, but I don't see how.
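A quick sanity check of what os.environ.get actually returns locally may help (a rough sketch, assuming nothing has exported those variables in the local shell; on Heroku the config vars are present in the dyno's environment, so the same lookup succeeds there):

import os

# Locally, if AWS_ACCESS_KEY_ID was never exported in the shell that runs
# `python manage.py runserver`, there is nothing for the lookup to find,
# and botocore is left with no credentials at all.
print(os.environ.get("AWS_ACCESS_KEY_ID"))          # None when unset
print(os.environ.get("AWS_ACCESS_KEY_ID", "aaaa"))  # falls back to "aaaa"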

Check out django-environ.
It lets you set all your secrets in an environment file (.env) and read your env vars into your Django application.
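A minimal sketch of what that looks like in settings.py, assuming a .env file next to manage.py that is listed in .gitignore (the path and the way the file is located are assumptions; adjust to your project layout):

import environ

env = environ.Env()
# Read key=value pairs from a .env file into the environment.
# By default this looks for a .env file near the settings module;
# you can also pass an explicit path, e.g. env.read_env("/path/to/.env").
environ.Env.read_env()

AWS_ACCESS_KEY_ID = env("AWS_ACCESS_KEY_ID")
AWS_SECRET_ACCESS_KEY = env("AWS_SECRET_ACCESS_KEY")

On Heroku you keep setting the same keys as config vars, so the identical settings code works in both places without the secrets ever being committed.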

Related

passing aws creds to kitchen ec2 command line

I am trying to do Chef cookbook development via a Jenkinsfile pipeline. I have my Jenkins server running as a container (using the jenkinsci/blueocean image). In one of the stages, I am trying to do aws configure and then run kitchen test. For some reason, with the code below, I am getting an unauthorized operation error; my AWS creds are not being passed properly to .kitchen.yml (no need to check the IAM creds, they have admin access).
stage('\u27A1 Verify Kitchen') {
    steps {
        sh '''mkdir -p ~/.aws/
            echo 'AWS_ACCESS_KEY_ID=...' >> ~/.aws/credentials
            echo 'AWS_SECRET_ACCESS_KEY=...' >> ~/.aws/credentials
            cat ~/.aws/credentials
            KITCHEN_LOCAL_YAML=.kitchen.yml /opt/chefdk/embedded/bin/kitchen list
            KITCHEN_LOCAL_YAML=.kitchen.yml /opt/chefdk/embedded/bin/kitchen test'''
    }
}
Is there any way I can pass AWS creds here? Also, .kitchen.yml no longer supports passing AWS creds inside the file. Is there some way I can pass the creds on the command line, i.e. .kitchen.yml access_key=... secret_access_key=... /opt/chefdk/embedded/bin/kitchen test?
Really appreciate your help.
You don't need to set KITCHEN_LOCAL_YAML=.kitchen.yml, that's already the primary config file.
You probably want to be using Jenkins credentials, not hardcoding things into the job. That said, the reason this isn't working is that the AWS credentials file is not a shell script, which is the syntax you're using there. It's an INI-style file, paired with a config file (~/.aws/config) that has a similar structure.
You should probably just use the environment variable support in kitchen-ec2, via the withEnv pipeline helper method or similar mechanisms for integrating with Jenkins-managed credentials.
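As a language-neutral sketch of that environment variable approach (the real Jenkins wiring would use withEnv or withCredentials rather than Python, and the placeholder values below are assumptions): the point is simply that kitchen inherits AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from its process environment instead of reading a hand-written file.

import os
import subprocess

# Hypothetical illustration: inject the creds into the child process
# environment and let the AWS credential chain pick them up from there.
child_env = dict(
    os.environ,
    AWS_ACCESS_KEY_ID="...",      # placeholder, supplied by your secret store
    AWS_SECRET_ACCESS_KEY="...",  # placeholder, supplied by your secret store
)
subprocess.run(["/opt/chefdk/embedded/bin/kitchen", "test"], env=child_env, check=True)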

Partial credentials found in env, missing: AWS_SECRET_ACCESS_KEY

I just configured the AWS CLI on my computer with my AWS access key and secret key. When I try to use the AWS CLI, though, it gives me this error:
Partial credentials found in env, missing: AWS_SECRET_ACCESS_KEY
I went to ~/.aws/config, and sure enough those credentials are there, including the AWS secret key, so I'm not sure why it's squawking at me.
You should have the file ~/.aws/credentials,
and its contents should be in the following format:
[default]
aws_access_key_id = XXXXXXXXXXXXXXXXXX
aws_secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
For anyone who is having the same problem, this is the solution that worked for me:
If you are on Windows, check whether you have AWS_ACCESS_KEY_ID set in your system variables. The AWS CLI uses something called the configuration provider chain, and environment variables take precedence over the configuration file. In my case I had somehow set only AWS_ACCESS_KEY_ID, hence the error message.
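To check quickly, a small Python sketch like the one below prints which AWS-related variables are currently set (set | findstr AWS on Windows or env | grep AWS works just as well); a lone AWS_ACCESS_KEY_ID is exactly what produces the "partial credentials" message.

import os

# Print the AWS-related variables currently set in this environment.
for name in ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_SESSION_TOKEN"):
    print(name, "=", os.environ.get(name, "<not set>"))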
If you are using macOS, this may be caused by other credentials set in your environment variables.
Setting the new credentials in the environment variables might solve your problem.
To do so, run this in the terminal:
export AWS_ACCESS_KEY_ID=X
export AWS_SECRET_ACCESS_KEY=Y
export AWS_DEFAULT_REGION=REGION
Substitute X, Y, and REGION with the values corresponding to your application.
Source documentation: https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-envvars.html
I ran into this problem too. I reran the same workflow YAML again and again, but the changes never actually took effect. Finally, I had to remove the existing workflow from GitHub and push the YAML configuration file again. That worked for me.

Deploy app from CircleCI with

I'm looking to automatically deploy my app once we release a new version. We use CircleCI, so firing these commands shouldn't be a big deal.
cf login -a https://api.lyra-836.appcloud.swisscom.com -u myuser -p secret
cf push myapp
However, I don't want to expose my personal credentials (Passeport account) in our git repository. Is it possible to generate an API key for that purpose?
How do you handle this? I might also need to SSH into the instance to run some migration scripts after the deployment; the same concern applies there.
Currently Swisscom's Application Cloud does not offer technical accounts, but you can easily create an additional account. Then add it to your org/space as a developer and it should be able to fulfill your needs.
CircleCI documentation has a section about handling secrets: Using CircleCI Environment Variables
Setting environment variables for all commands without adding them to git
Occasionally, you'll need to add an API key or some other secret as an environment variable. You might not want to add the value to your git history. Instead, you can add environment variables using the Project settings > Environment Variables page of your project.
This documentation describes how to store encrypted stuff within your VCS:
If you prefer to keep your sensitive environment variables checked into git, but encrypted, you can follow the process outlined at circleci/encrypted-files.

Lumen on AWS Elastic Beanstalk - .env

I'm trying to deploy a Lumen app in Elastic Beanstalk.
The problem is around the .env file: of course it's gitignored, so how can I get that file onto the server?
I tried to manually create the file after deploying, but on the next deploy the file disappears and I have to manually recreate it. I don't think that's a solution.
What's the correct way?
I tried this solution, but it looks like the env variables are not being created after the deploy, so is the only way to add them directly in the AWS console?
Update
I manually added env variables through the AWS console. Those variables are displayed if I echo them (e.g. echo $APP_ENV gives me the correct value, production), but they are still not being loaded in the Lumen app and are ignored. Any thoughts?
Updating vlucas/phpdotenv from 1.0 to 2.2 solved the issue.

Can't upload Django project to Heroku

I'm making an app with Django and I want to upload it to Heroku but when I do
git push heroku master
I get this error and I don't know how to fix it:
! Your account myemail#gmail.com does not have access to fathomless-depths-4588.
!
! SSH Key Fingerprint: xx:xx:xx:xx:xx:xx:xx:xx:xx:02:xx:xx:79:xx:0f:xx
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
And I'm not using that email, "myemail#gmail.com"; it's from an old project. I have already run heroku logout and heroku login, but it doesn't work.
I don't know if the SSH key is important; that's why I masked it with xx.
I'd appreciate any help.
You should have a look at heroku-accounts, which lets you manage multiple Heroku accounts on the same machine:
https://github.com/ddollar/heroku-accounts
Hope this helps.