I know there are a lot of questions floating around there relating to similar issues, but I think I have a specific flavor which hasn't been addressed yet. I'm attempting to create my local postgresql database so that I can do local development in addition to pushing to Heroku.
I have found basic answers on how to do this, for example (which I think is a wee bit outdated):
DATABASES = {'default': dj_database_url.config(default='postgres://fooname:barpass@localhost/dbname')}
This solves the "ENGINE" is not configured error. However, when I run 'python manage.py syncdb' I get the following error:
OperationalError: FATAL: password authentication failed for user "foo"
FATAL: password authentication failed for user "foo"
This happens for all conceivable combinations of username/pass. So my ubuntu username/pass, my heroku username/pass, etc. Also this happens if I just try to take out the Heroku component and build it locally as if I was using postgresql while following the tutorial. Since I don't have a database yet, what the heck do those username/pass values refer to? Is the problem exactly that, that I need to create a database first? If so how?
As a side note I know I could get the db from heroku using the process outlined here: Should I have my Postgres directory right next to my project folder? If so, how?
But assuming I were to do so, where would the new db live, how would django know how to access it, and would I have the same user/pass problems?
Thanks a bunch.
Assuming you have postgres installed, connect via pgadmin or psql and create a new user. Then create a new database with your new user as the owner. Make sure you can connect via psql to the database with the new user. You will then need to set up an env variable in your postactivate file in your virtualenv's bin folder and save it. Here is what I have for the database:
export DATABASE_URL='postgres://{{username}}:{{password}}@localhost:5432/{{database}}'
Just a note: adding this value to your postactivate doesn't do anything by itself. The file is not run upon saving. You will either need to run the export at the $ prompt, or simply deactivate and activate your virtualenv.
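If you haven't created the user and database yet, here is a minimal sketch of that first step in psql (all names here are placeholders):
-- run as the postgres superuser, e.g. via: sudo -u postgres psql
CREATE USER myuser WITH PASSWORD 'mypassword';
CREATE DATABASE mydb OWNER myuser;
-- then verify the new user can actually connect:
--   psql -U myuser -h localhost mydb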
Your settings.py should read from this env var:
DATABASES = {'default': dj_database_url.config()}
You will then configure Heroku with their CLI tool to use your production database when deployed. Something like:
heroku config:set DATABASE_URL={{production value here}}
(if you don't have Heroku's CLI tool installed, you'll need to install it)
If you need to figure out exactly what value you need for your production database, you can get it by logging into heroku's postgresql subdomain (at the time this is being written, it's https://postgres.heroku.com/), selecting the db from the list, and looking at the "Connection Settings : URL" value.
This way your same settings.py value will work for both local and production and you keep your usernames/passwords out of version control. They are just env config values.
Related
I am developing Django Wagtail application on my local machine connected to a local postgres server.
I have a test server and a production server.
However, when I develop locally and try to upload it, there is always some issue with makemigrations and migrate, e.g. KeyError etc.
What are the best practices of ensuring I do not get run into these issues? What files do I need to port across?
So I'll tell you what I do and what most of the companies I worked at as a Django developer did, and I can tell you from experience that it worked pretty well.
First, containerize your application. This will make your life much easier, remove external influences from your code, and give you an easy way to reproduce your environment.
Your Dockerfile should start from some Python image and do 3 basic things (see the sketch after this list):
Install your requirements dependencies
Run the python manage.py migrate --noinput command
Run a http server such as gunicorn with gunicorn -c /gunicorn.py wsgi:application
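A minimal sketch of such a Dockerfile; the base image tag, paths and module names are assumptions, not a drop-in file:
FROM python:3.11-slim
WORKDIR /app
# 1. Install your requirements dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# 2 and 3. Apply migrations, then run the http server
CMD python manage.py migrate --noinput && gunicorn -c /gunicorn.py wsgi:application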
You will run makemigrations on your local machine and make sure that everything is working before committing the migrations to the repo.
In your gunicorn.py you will put your settings to run the app, such as the number of workers per CPU, the binding port, and the folder your app is in, something like:
import os
import multiprocessing

# Chdir to specified directory before apps loading.
# https://docs.gunicorn.org/en/stable/settings.html#chdir
chdir = '/app/'

# Bind the application on port 8000 on all IPv4 interfaces.
# https://docs.gunicorn.org/en/stable/settings.html#bind
bind = '0.0.0.0:8000'

# Scale workers with the available CPUs; this heuristic is the common
# starting point suggested by the gunicorn docs, not a hard rule.
# https://docs.gunicorn.org/en/stable/settings.html#workers
workers = multiprocessing.cpu_count() * 2 + 1
Second, containerize your other stuff: for example the Postgres database, Redis (for cache), and a connection pooler for the database, depending on the size of your application.
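A sketch of what that could look like with docker compose (image versions and credentials are placeholders):
services:
  db:
    image: postgres:15
    environment:
      POSTGRES_DB: mydb
      POSTGRES_USER: myuser
      POSTGRES_PASSWORD: mypassword
    volumes:
      - pgdata:/var/lib/postgresql/data
  redis:
    image: redis:7
volumes:
  pgdata: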
It's highly recommended that you have a step in the pipeline to run tests; they need to run before everything else, maybe just after lint.
Ok, what now? Now you need a way to deploy that stuff. The best option for that scenario is to push your image to the GitHub registry, and you can add a tag to it, for example:
IMAGE_ID=ghcr.io/${{ github.repository_owner }}/$IMAGE_NAME
# Change all uppercase to lowercase
IMAGE_ID=$(echo $IMAGE_ID | tr '[A-Z]' '[a-z]')
docker tag $IMAGE_NAME $IMAGE_ID:staging
docker push $IMAGE_ID:staging
This can be added in a GitHub Action in the build step, for example.
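A rough sketch of that build step as a workflow file; the workflow name, trigger branch and image name are assumptions:
name: build-and-push
on:
  push:
    branches: [staging]
jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    env:
      IMAGE_NAME: myapp
    steps:
      - uses: actions/checkout@v4
      - name: Log in to the GitHub registry
        run: echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin
      - name: Build, tag and push
        run: |
          docker build -t $IMAGE_NAME .
          IMAGE_ID=ghcr.io/${{ github.repository_owner }}/$IMAGE_NAME
          IMAGE_ID=$(echo $IMAGE_ID | tr '[A-Z]' '[a-z]')
          docker tag $IMAGE_NAME $IMAGE_ID:staging
          docker push $IMAGE_ID:staging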
After having your new code in a new image inside GitHub, you just need to update the current one. This can be done by creating a script on the server and running that script from a GitHub Action, something like:
docker pull ghcr.io/${{ github.repository_owner }}/$IMAGE_NAME
echo 'Restarting Application...'
docker stop {YOUR_CONTAINER} && docker compose up -d
sudo systemctl restart nginx
echo 'Cleaning old images'
sudo docker system prune -af
You can see that I create the image with a staging tag. You can create a rule in GitHub Actions to trigger that action when you create a new release, for example, and create another action to be triggered on every new commit that builds/deploys a dev tag.
For the migration problem, the first thing is: when your application goes live, squash every migration into the first one (you can drop the database and all the migrations, then create the database and run the makemigrations command again to achieve this), so you have a clean migration history on the server. Never create unnecessary relations between tables, prefer cached properties over adding new columns where you can, use UUIDs for unique ids, and try not to make breaking changes in the database. It's hard, but if you plan the database beforehand it's not so difficult to do.
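One way to do that reset, roughly (destructive, so only before going live; run from your project root, and the database name is a placeholder):
# delete every migration file except the __init__.py in each migrations package
find . -path "*/migrations/*.py" -not -name "__init__.py" -delete
# recreate the database from scratch
dropdb mydb && createdb mydb
# regenerate a single clean initial migration and apply it
python manage.py makemigrations
python manage.py migrate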
Hit me up if you have any questions. A lot of the stuff I said can be done on a lot of other platforms such as GitLab, Travis, Circle CI, but I used GitHub Actions in the example because I think it's simpler to picture.
EDIT:
I forgot to tell you to have a cron on your server doing backups of your databases. The migrate command will apply the changes only after verification, but if something else breaks the database this can save your life.
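For example, a daily crontab entry along these lines (paths and database name are assumptions; note the escaped % that crontab requires):
0 3 * * * pg_dump -Fc mydb > /backups/mydb_$(date +\%F).dump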
After updating the Django settings.py default database to the correct values, I am able to run the makemigrations and migrate commands and also create a superuser. But when I log in to the admin it gives me the error "no such table - auth_user, OperationalError".
I noticed that even if I delete the db.sqlite3 file, it comes back when I try to log in. I think Django looks for the table in db.sqlite3 and not Postgres.
Why does the db.sqlite3 file reappear after deleting it?
How do I correctly configure my settings.py?
I am integrating DigitalOcean's managed database service with Django installed on one droplet. I have integrated both previously without error, but back then I installed Postgres myself; this is the first time using a managed database service.
Thanks
It seems that your Django settings are still pointing to the SQLite database.
Did you reload your WSGI process? If not, the old SQLite settings are still used in memory.
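How you reload depends on how the app is served; a couple of common possibilities (the service name and path are assumptions):
sudo systemctl restart gunicorn    # gunicorn (or uwsgi) managed by systemd
touch /path/to/project/wsgi.py     # mod_wsgi reloads when the wsgi file's mtime changes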
How does one migrate their Django .sql3 development database to heroku?
Per here, and here, I tried: heroku pg:psql --app sblic < database.sql3 but my Django admin shows no new uploads (even after syncdb/migrate or collectstatic).
Perhaps there may be a way to directly upload an sql3 file to Heroku, but I went with the path of clearest certainty (convert the local sql3 db to a Postgres db and upload a dump of the Postgres db to Heroku via the pgbackups tool); the key commands are sketched after this list:
Create a dump of your sql3 database as a json file
With PostgreSQL installed and with its bin directory in the Path environment variable, create a Postgres user and database (or just plan on using your initial superuser account created upon installing PostgreSQL)
Update your settings.py with a reference to your newly created Postgres database (note, 'HOST' may need to be set as 'localhost', 'USER' is your Postgres user login)
Run python manage.py syncdb to initiate your new Postgres db
Optional: if necessary, truncate your Postgres db of contenttypes
Load your dump from step 1 (if you have fixture issues, see here)
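Roughly, steps 1, 5 and 6 look like this (the dump filename is arbitrary):
# step 1: dump the data from the sqlite3-backed project to JSON
python manage.py dumpdata > dump.json
# step 5 (optional): truncate contenttypes inside a manage.py shell:
#   from django.contrib.contenttypes.models import ContentType
#   ContentType.objects.all().delete()
# step 6: load the dump into the new Postgres db
python manage.py loaddata dump.json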
Now you have a working local Postgres db. To migrate, first create a Postgres dump.
Post your Postgres dump somewhere accessible by URL (such as Dropbox or Amazon S3, as suggested in the previous link).
Perform the pgbackups restore command, referencing your dump URL.
Your Heroku app should now reference the contents of your local db.
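Concretely, the dump and restore might look something like this (the local db name, bucket URL and app name are placeholders):
pg_dump -Fc --no-acl --no-owner mylocaldb > mydb.dump
# upload mydb.dump somewhere publicly reachable, e.g. S3, then:
heroku pgbackups:restore DATABASE 'https://s3.amazonaws.com/yourbucket/mydb.dump' --app yourapp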
The Heroku command line tool uses the psql binary. You have to install PostgreSQL on your local development machine to have psql available. From the [documentation](https://devcenter.heroku.com/articles/heroku-postgresql#pg-psql):
You must have PostgreSQL installed on your system to use heroku pg:psql.
You can in fact keep using your SQLite database with Heroku, but this is not recommended as it will be rewritten with your local copy if you re-deploy it to another dyno. Migrating the data to psql is recommended as described at https://devcenter.heroku.com/articles/sqlite3
I have a django based herokuapp site. I set everything up a long time ago and am now unable to get things working with a local instance of postgresql. In my settings file, I updated:
DATABASES['default'] = dj_database_url.config()
to work with a local database:
DATABASES['default'] = dj_database_url.config(default='postgres://localhost/appDB')
When running foreman, I can view the site, but the database is not currently populated (although I did create the empty DB). Running:
heroku run python manage.py dumpdata
Returns the contents of the remote (herokuapp) database, while a syncdb command results in "Installed 0 object(s) from 0 fixture(s)". So it looks like I'm still contacting the remote database. I'm pretty sure the PostgreSQL DB is set up correctly locally; how can I force the app to use it?
I'm sure this is simple, but I haven't seen anything useful yet. I did try
export DATABASE_URL=postgres:///appDB
but that hasn't helped.
Cheers
I am creating a Django app on Heroku. While I am testing it locally, I finally check by uploading everything to Heroku and seeing if it works for real on the remote server as well.
To quickstart, I went to the admin panel and created a bunch of data to run the view functions on. When I push the source code to Heroku via git (git push heroku master), I also want to push the database from local to Heroku, so that I don't need to enter the data again on the server side.
How do I achieve this?
Thanks a lot
Grab the Postgres connection string for your Heroku database using heroku config:get (or list everything with heroku config).
Look for the Heroku Postgres url (example: HEROKU_POSTGRESQL_RED_URL: postgres://user3123:passkja83kd8@ec2-117-21-174-214.compute-1.amazonaws.com:6212/db982398).
Next, run this on your command line:
pg_dump --host=<host_name> --port=<port> --username=<username> --password --dbname=<dbname> > output.sql
The terminal will ask for your password, then run the dump and write it into output.sql.
Then import it:
psql -d <heroku postgres string> -f output.sql
Note: in each place where you see <.....> in the above code, be sure to substitute your specific information. Thus you would put your username where it says <username>, and your heroku database url where it says <heroku postgres string>.
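For example, plugging in the sample values from above (the local names are made up for illustration; the URL is the placeholder one from the example, not real credentials):
pg_dump --host=localhost --port=5432 --username=myuser --password --dbname=mylocaldb > output.sql
psql -d "postgres://user3123:passkja83kd8@ec2-117-21-174-214.compute-1.amazonaws.com:6212/db982398" -f output.sql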
Similar answer here.