How to get the .env file onto the remote host server - Django

Quick question: how do I get the .env config file onto the server?
I have three settings files in my project: base, development, and production. I think testing and staging would be overkill.
It is encouraged to put .env in the .gitignore file so it won't be included in version control. On Heroku it is easy to set environment variables in the Settings tab, but what about other hosting services without such a feature? Since I push to my repository and pull from the server, how am I supposed to get the API keys stored in .env, given that git ignores the file? Should I create it on the server, or is there a usual way this is done? Newbie question, I know.

You are correct. You should put your .env file in .gitignore and create the .env file on the server manually - either by connecting to the server over SSH and creating the file there, or by transferring it using one of the secure transfer protocols (SCP, SFTP).
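For reference, here is a minimal sketch of how settings.py might read those values once the .env file is on the server. It assumes the python-dotenv package and a made-up API_KEY variable; the same code also works with Heroku-style config vars, since those end up in the environment either way:

# settings.py (sketch) - assumes python-dotenv is installed; API_KEY is just an example name
import os
from dotenv import load_dotenv

load_dotenv()  # loads variables from a nearby .env file, if one exists

SECRET_KEY = os.environ["SECRET_KEY"]    # fail loudly if the variable is missing
API_KEY = os.environ.get("API_KEY", "")  # optional value with a default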

Related

Trying to hide django secret key, but getting error when deploying in PythonAnywhere

I'm trying to deploy a Django project. I hid the secret key by putting it in a file called .env and adding it to .gitignore, so I have Django's secret key only locally and not in the git repository. But when I try to deploy the project on PythonAnywhere, an error occurs because there is no secret key in the repository (there is no file from which the server could read the secret key).
So I understand that it's not good to hard-code the key, but what should I do to get a secret key for deployment?
I shouldn't have any secret key in the git repository, right?
I shouldn't have any secret key in the git repository, right?
Right.
So I understand that it's not good to hard-code the key, but what should I do to get a secret key for deployment?
The secret key can be stored as an environment variable or in a file that deploys alongside your project. Both options would have to be done through your host. Here's a PythonAnywhere suggestion for a file that deploys alongside your project: https://www.pythonanywhere.com/forums/topic/14207/
You can have a separate .env file on PythonAnywhere to store a separate secret key. The .env file is also a good place to store PythonAnywhere database passwords etc.
Having a different file with different values locally and on PythonAnywhere makes sense from a security perspective. Don't forget to update your PythonAnywhere .gitignore file to include it if you ever upload to your git repository from PA!
You can create a new .env file in your project directory via the PA Files section and, using your local .env file as a base, generate a new key value either by running:
$ python -c "from django.core.management.utils import get_random_secret_key; print(get_random_secret_key())"
or by using a web tool like https://djecrety.ir/
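If you go the environment-variable route mentioned above instead, here is a hedged sketch of what the settings might look like (the variable name and error message are just illustrative):

# settings.py (sketch) - read the key from the environment and fail with a clear message if it is missing
import os
from django.core.exceptions import ImproperlyConfigured

SECRET_KEY = os.environ.get("DJANGO_SECRET_KEY")
if not SECRET_KEY:
    raise ImproperlyConfigured("Set the DJANGO_SECRET_KEY environment variable on the server")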

Django collectstatic keeps waiting when run through Github Action

We are facing a very weird issue. We ship a Django application in a Docker container through GitHub Actions on each push. Everything is working fine except collectstatic.
We have the following lines at the end of our CD GitHub action:
docker exec container_name python manage.py migrate --noinput
docker exec container_name python manage.py collectstatic --noinput
migrate works perfectly fine, but collectstatic just keeps on waiting if run through the GitHub action. If I run the command directly on the server, it works just fine and completes within a few minutes.
Can someone please help me figure out what could be the issue?
Thanks in advance.
Now I am far from the most experienced, but I did this recently and I have some suggestions about where to look. I'm definitely not the greatest authority, though.
I wasn't using Docker, so I can't say anything about that. From the issues I had, here are some suggestions I can recommend trying.
Take note that all of this was for a self-hosted runner. Things would be very different otherwise.
Check to make sure STATIC_ROOT and MEDIA_ROOT variables are set correctly in the settings file.
If the STATIC and MEDIA root variables come from environment variables, make sure you are loading the correct environment variables file, such as the .env file I used.
I used django-environ to serve my environment variables. The docs say to keep the .env file in the same directory as the settings file. But if you are putting the project on a production server with GitHub Actions, you won't be able to keep the .env file anywhere inside the project, because it will get overwritten every time new code is pushed.
So to fix that you need to point to the correct .env file from somewhere else on the server. Do that by specifying ENV_PATH.
https://django-environ.readthedocs.io/en/latest/
Under the section Multiple env files
Another resource that was helpful:
https://github.com/joke2k/django-environ/issues/143
I set up my settings file like how they did there.
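Roughly, the pattern from that issue looks like the sketch below. ENV_PATH here is an environment variable read by your own settings code (as in the linked issue), and the fallback path is just a placeholder:

# settings.py (sketch) - load a .env file kept outside the project tree
import os
import environ

env = environ.Env()
# ENV_PATH points at the .env stored outside the repo; otherwise fall back to one next to settings.py
environ.Env.read_env(os.environ.get("ENV_PATH", os.path.join(os.path.dirname(__file__), ".env")))

SECRET_KEY = env("SECRET_KEY")
STATIC_ROOT = env("STATIC_ROOT")
MEDIA_ROOT = env("MEDIA_ROOT")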
I put my .env file in a proj directory I made inside the virtual environment folder for the project.
I don't know if it's a good place to put it but that's how I did it. I didn't find much great info online for this stuff. Had to figure out a lot on my own.
Make sure the user that is running the GitHub action has permission to read the .env file.
Also, like the .env file, if you have static files being collected into the base directory of your project, you might have an issue with GitHub Actions overwriting those files every time new code is pushed. If you have a media directory where users upload files, that will really be an issue, because those files won't get overwritten - they'll just disappear.
Now, if this were the issue, it shouldn't cause GitHub Actions to just get stuck on the collectstatic command. It would just cause files to get overwritten every time the workflow runs, and the media files to disappear.
If you do change the directory where the static and media files are located, as stated before, make sure all the path variables are correct in the settings file and the .env file.
You will also need to update the nginx config file for the static and media root directories if you use nginx. Not sure how Apache does this.
You can do that with this command:
sudo nano /etc/nginx/sites-available/myproject
Don't forget to restart the nginx server after doing that.
If you are writing static and media files to a different location than the base project directory on the server, also check permissions on those directories. Make sure the user running the GitHub action has permission to write to those directories. I suspect that might cause it to hang, but it very well might just cause an error.
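As a quick sanity check (the paths below are placeholders for wherever your .env, static, and media directories actually live), you can print what the runner's user can read and write:

# permission sanity check (sketch) - run as the same user the GitHub Actions runner uses
import os

for path in ("/srv/myproject/.env", "/srv/myproject/static", "/srv/myproject/media"):
    print(path,
          "readable:", os.access(path, os.R_OK),
          "writable:", os.access(path, os.W_OK))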
Check all the syntax in the GitHub Actions yml file. Make sure everything is correct and it's not hanging because of an incomplete command or something like that.
But yeah, those are some of the things I had to take a look at. Honestly, none of this might be relevant for you. For the most part, all of these issues should cause an error somewhere.
I couldn't really offer many external resources for you to look deeper into this because I'm just speaking from personal experience.
Hope I could help.
Here's my GitHub repo for the project I did: https://github.com/pkudlanov/personal-portfolio-django
I hosted it on DigitalOcean on a Linux server using nginx and Gunicorn.

Laravel and environment variables in command line production without .env file

When I upgraded to Laravel 5, I started using the env system, where I set all my configuration inside this file at the root:
.env
DB_HOST=x.x.x.x.x
DB_USERNAME
...
This file is not part of my repo, and when I deploy I set the same variables as system environment variables (using Amazon Beanstalk here), so they are accessible without the need for the .env file.
So that all works, but when I run my scripts from the command line in production, these system environment variables don't seem to be accessible. They are, however, accessible when the PHP is served through the web server.
I found that if I add this .env file back, then it works in the command line. That's why I'm confused: I thought this system was meant to prevent committing sensitive information to the repo, and now it seems I have to do exactly that so I can use php artisan migrate and other commands in production.
Am I missing something? Is there a way to make the system environment variables available in the command line, or do I have to somehow create the .env file dynamically in production?
I always thought that the .env file was just part of setting up your server. The .env file is ignored by git, so once it's set up, you can push/pull to your heart's content and it'll leave the .env file alone. That's at least how I've done it so far.

*.tar.gz file download using apache2 webserver

I would like to implement a feature where users can upload and download files.
But when a user uploads a *.tar.gz file and then downloads it, the file seems to have changed: its md5sum and its size are different.
The production environment uses the Apache webserver. The web framework is Django.
It seems that uploading/downloading a *.tar.gz file through Django's development webserver is okay (the file is not changed).
Is this related to the Apache server? Any suggestions?

Django and live server: on rewrite, the old source files are still used

I'm using Django 1.3 on an Apache server with mod_wsgi (daemon mode), and Nginx for serving static files. The database is on a separate server. The wsgi daemon runs on 2 threads with a maximum of 100 requests.
I get into trouble when I overwrite old .py files ... not .pyc ... I'm also overwriting the .wsgi config file (http://code.google.com/p/modwsgi/wiki/ReloadingSourceCode). Sometimes some server requests use the old code, and therefore an error is generated (HTTP ERROR 500). Is there a server-side cache that needs to be emptied?
Can this be generated by the .pyc files? Do I need to restart the Apache server or the wsgi daemon?
If you remove the .pyc files and touch your wsgi files it should reload the wsgi daemon when it gets a chance, and you should be good.
On occasion I have had to restart Apache in order to get my changes to take effect.
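For example, a small Python sketch of that cleanup (the project path and script name are placeholders; the same thing is often done from the shell with find and touch):

# cleanup sketch: delete stale .pyc files and touch the WSGI script so mod_wsgi reloads the daemon
from pathlib import Path

project = Path("/srv/myproject")        # placeholder path
for pyc in project.rglob("*.pyc"):
    pyc.unlink()
(project / "myproject.wsgi").touch()    # touching the script triggers a reload in daemon mode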
Set up ownership/permissions so that the user the code runs as under Apache cannot change code files or create .pyc files. The user the application runs as should only have the ability to write to the data or upload directories it really needs, which is safer anyway.
The most reliable deployment method would be to install the new version into a completely new directory hierarchy, with the WSGI script file outside that tree. Then replace the WSGI script file with a new one referring to the new directory. When doing this, the WSGI script file should not be edited in place; instead, a new file should be moved into place so the filesystem does an atomic replace of the whole file and there is no risk of an on-the-fly edit being picked up.
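A minimal sketch of that atomic swap (paths are placeholders; os.replace is atomic as long as source and destination are on the same filesystem):

# deploy sketch: write the new WSGI script elsewhere first, then move it into place atomically
import os

os.replace("/srv/releases/new/myproject.wsgi", "/srv/myproject.wsgi")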