I am using django-environ package for my Django project.
I provided the DB URL in the .env file; it looks like this:
DATABASE_URL=psql://dbuser:dbpassword@dbhost:dbport/dbname
My DB settings in settings.py:
DATABASES = {
    "default": env.db(),
}
So far, I have no issues.
Then I created a docker-compose.yml where I specified that my project uses a Postgres database, i.e.:
version: '3.8'
services:
  ...
  db:
    image: postgres:13
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - POSTGRES_USER=???
      - POSTGRES_PASSWORD=???
      - POSTGRES_DB=???
      - "POSTGRES_HOST_AUTH_METHOD=trust"
Now I am a little confused.
How do I provide these POSTGRES_* environment variables there? Do I need to define them as separate variables alongside DATABASE_URL in my .env file? If so, what's the best way to accomplish this? I want to avoid duplicating values in my settings.
You can use variable expansion in your .env file, something like:
DB_NAME=dbname
DB_USER=dbuser
DB_PASSWORD=dbpassword
DATABASE_URL=psql://$DB_USER:$DB_PASSWORD@dbhost:dbport/$DB_NAME
and then something like this in your compose file:
services:
  postgresdb:
    container_name: projectname_db
    image: postgres:15
    environment:
      POSTGRES_DB: ${DB_NAME}
      POSTGRES_USER: ${DB_USER}
      POSTGRES_PASSWORD: ${DB_PASSWORD}
    ports:
      - "127.0.0.1:5432:5432"
...
I am not exactly familiar with django-environ, but this should work.
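For completeness, a minimal sketch of the django-environ side, assuming the usual setup (the read_env() call and the .env file location are assumptions; adjust them to your project layout):

import environ

env = environ.Env()

# Read variables from a .env file; by default django-environ looks for it
# relative to the calling code, so you may need to pass an explicit path.
environ.Env.read_env()

DATABASES = {
    # env.db() parses DATABASE_URL (e.g. psql://user:password@host:port/name)
    # into a Django DATABASES entry.
    "default": env.db(),
}

Note that it is docker-compose that expands $DB_USER-style references when it reads the .env file; if django-environ reads the same file directly, check that your version performs the same interpolation.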
I have the following setup:
docker-compose.yml
# compose file format version
version: "3.9"
# services, i.e. the containers
services:
  # web service for the Django app
  web:
    # re-run with the --build flag whenever a new node package is added
    build: .
    # add additional commands here for webpack to watch for changes and bundle for production
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - type: bind
        source: .
        target: /code
    ports:
      - "8000:8000"
    depends_on:
      - db
    environment:
      - "DJANGO_SECRET_KEY=django-insecure-m#x2vcrd_2un!9b4la%^)ou&hcib&nc9fvqn0s23z%i1e5))6&"
      - "DJANGO_DEBUG=True"
    expose:
      - 8000
  db:
    image: postgres:13
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    # unsure of what this environment setting means
    environment:
      - "POSTGRES_HOST_AUTH_METHOD=trust"
      # - "POSTGRES_USER=postgres"
# named volume for the postgres data
volumes:
  postgres_data:
and a settings file as
ALLOWED_HOSTS = ['0.0.0.0', 'localhost', '127.0.0.1']
# 127.0.0.1 is my localhost address.
My host's IP is 192.168.0.214.
Can you please help me deploy the Django site on my host's local network?
Do I have to set something up on my router?
Or could you direct me towards resources (on understanding networking) that would help me understand this?
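For what it's worth, a common first step here (an assumption about the goal, not a verified fix for this exact setup) is to add the host's LAN address to ALLOWED_HOSTS; runserver is already bound to 0.0.0.0:8000 inside the container and published on port 8000, so other devices on the same network could then reach the site at http://192.168.0.214:8000/:

# settings.py — also accept requests addressed to the host's LAN IP
# ('192.168.0.214' is the address mentioned above).
ALLOWED_HOSTS = ['0.0.0.0', 'localhost', '127.0.0.1', '192.168.0.214']

No router changes should be needed for access from within the same LAN; router port forwarding only matters when exposing the site outside your network.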
I have a large monorepo Django app that I want to break into two separate repositories: one to handle external API requests, and the other to handle the front end that I plan on showing to users. I would still like both Django apps to have access to the same DB when running things locally. Is there a way for me to do this? I'm running Docker for both, and I'm having issues with my front-end-facing Django app being able to connect to the Postgres DB I have set up in a separate docker-compose file from the one I made for my front-end app.
External API docker-compose file (the Postgres DB container gets created here when running docker-compose up --build):
---
version: "3.9"
services:
  db:
    image: postgres:13.4
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
  api:
    restart: always
    build: .
    image: &img img-one
    command: bash start.sh
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    depends_on:
      - db
    env_file:
      - variables.env
Front-end-facing docker-compose file (this is the one I want to be able to connect to the DB above):
---
version: "3.9"
services:
  dashboard:
    restart: always
    build: .
    image: &img img-two
    volumes:
      - .:/code
    ports:
      - "8010:8010"
    depends_on:
      - react-app
    env_file:
      - variables.env
  react-app:
    restart: always
    build: .
    image: *img
    command: yarn start
    env_file:
      - variables.env
    volumes:
      - .:/app
      - /app/node_modules
    ports:
      - "3050:3050"
Below is the database configuration in the front-end Django app that I want to connect to the DB, but I keep getting connection-refused errors when I try to run python manage.py runserver:
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("DB_NAME", "postgres"),
        "USER": os.environ.get("DB_USERNAME", "postgres"),
        "PASSWORD": os.environ.get("DB_PASSWORD", "postgres"),
        "HOST": os.environ.get("DB_HOSTNAME", "db"),
        "PORT": os.environ.get("DB_PORT", 5432),
    }
}
Any ideas on how to fix this issue? (For reference, I've also tried changing HOST to localhost instead of db, but I still get the same connection-refused errors.)
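One technique that addresses exactly this situation (a sketch assuming both stacks run on the same Docker host; the network name shared-db is invented here) is to attach both compose projects to a shared external Docker network, so the front-end container can resolve the db service by name:

# one-time setup on the host:
#   docker network create shared-db

# external-API compose file: attach db to the shared network
services:
  db:
    image: postgres:13.4
    networks:
      - shared-db
    # ...other db settings as before...
networks:
  shared-db:
    external: true

# front-end compose file: attach dashboard the same way, keeping
# DB_HOSTNAME=db so Django resolves the Postgres container by name
services:
  dashboard:
    networks:
      - shared-db
    # ...other dashboard settings as before...
networks:
  shared-db:
    external: true

This also explains why the localhost attempt fails from inside a container: there, localhost is the container itself, not the machine running Postgres. Running manage.py directly on the host, by contrast, should use localhost with the published 5432 port.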
I have been working on this all day and I am completely confused.
I have created a Django project and am using Docker with a docker-compose.yml to hold my environment variables. I was struggling to get the DEBUG variable to be False, and I have since found out that my SECRET_KEY isn't working either.
I added a print statement after the SECRET_KEY and it prints out (test), as that is what I currently have in the docker-compose.yml file, but this should fail to build...
If I hard-code DEBUG I can get it to change, but I have completely removed the secret key and the project still starts. Any ideas where Django could be pulling this from, or how I can trace it back to see?
settings.py
SECRET_KEY = os.environ.get('SECRET_KEY')
DEBUG = os.environ.get('DEBUG')
docker-compose.yml
version: '3.8'
services:
  web:
    build: .
    container_name: django
    command: gunicorn config.wsgi -b 0.0.0.0:8000
    environment:
      - ENVIRONMENT=development
      - SECRET_KEY=(test)
      - DEBUG=0
      - DB_USERNAME=(test)
      - DB_PASSWORD=(test)
    volumes:
      - .:/code
    ports:
      - 8000:8000
    depends_on:
      - db
      - redis
  celery:
    build: .
    image: celery
    container_name: celery
    command: celery -A config worker -l INFO
    volumes:
      - .:/code
    environment:
      - SECRET_KEY=(test)
      - DEBUG=0
      - DJANGO_ALLOWED_HOSTS=['127.0.0.1','localhost']
      - CELERY_BROKER=redis://redis:6379/0
      - CELERY_BACKEND=redis://redis:6379/0
    depends_on:
      - db
      - redis
  celery-beat:
    build: .
    environment:
      - SECRET_KEY=(test)
      - CELERY_BROKER=redis://redis:6379/0
      - CELERY_BACKEND=redis://redis:6379/0
The reason is that the False/0 values from docker-compose.yml arrive in the environment as strings, and any non-empty string, including '0' and 'False', evaluates to True.
To solve this, use:
DEBUG = eval(os.environ.get('DEBUG', 'False'))
or
DEBUG = int(os.environ.get('DEBUG', 0))
(note that eval on environment input is risky, so the int() form is preferable).
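A slightly more defensive variant (my sketch, not part of the original answer) parses the usual truthy spellings explicitly and avoids eval entirely:

import os

# Treat '1', 'true', 'yes', 'on' (any case) as True; anything else,
# including an unset variable, as False.
DEBUG = os.environ.get('DEBUG', '').strip().lower() in ('1', 'true', 'yes', 'on')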
This is the section of my docker-compose file for the Postgres container. These settings are fine, but my Django app requires this user to have superuser privileges, granted through this command inside PostgreSQL:
ALTER ROLE project_admin SUPERUSER;
How can this be accommodated inside this docker-compose file?
db:
  image: postgres:latest
  container_name: project-db
  environment:
    - POSTGRES_USER=project_admin
    - POSTGRES_PASSWORD=projectpass
    - POSTGRES_DB=project
You need to save your command as a script, say ./scripts/01_users.sql:
ALTER ROLE project_admin SUPERUSER;
Then in your docker-compose:
...
db:
  image: postgres:latest
  container_name: project-db
  environment:
    - POSTGRES_USER=project_admin
    - POSTGRES_PASSWORD=projectpass
    - POSTGRES_DB=project
  volumes:
    - ./scripts/:/docker-entrypoint-initdb.d/
This will run the script at first startup and alter your user's privileges.
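Keep in mind that the postgres entrypoint only executes /docker-entrypoint-initdb.d/ scripts against an empty data directory. If a data volume already exists from an earlier run, one way to handle it (assuming you can afford to drop the local data, and using the service/user/db names from the compose file above):

# re-initialize from scratch, removing the old volume and its data
docker compose down -v
docker compose up --build

# or apply the statement once to the already-running container
docker compose exec db psql -U project_admin -d project -c "ALTER ROLE project_admin SUPERUSER;"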
I have a Django-Postgres app deployed with Docker: two Docker containers, web and db, and one Docker volume for persistent storage.
My docker-compose.yml file:
version: '2'
services:
  db:
    image: postgres
    ports:
      - '5432:5432'
    volumes:
      - 'postgres:/var/lib/postgresql/data'
    env_file: .env
  web:
    build: .
    command: ./start.sh
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    depends_on:
      - db
    env_file: .env
volumes:
  postgres:
I made a change in my Django models.py:
class Meeting(models.Model):
    [...]
    participants = models.CharField(max_length=200)
to
class Meeting(models.Model):
    [...]
    user_participants = models.CharField(max_length=200)
However, this change is not reflected in my Django app and I get the following error:
column call_meeting.user_participants does not exist
I ran:
python manage.py makemigrations call
python manage.py migrate
Then I tried deleting the Django migrations and re-running the above commands. I also tried re-running docker-compose build, but none of it worked.
Why does this happen? Should I change the column names with raw SQL?
From your question and comments, I get the impression that your Django is not using the correct database. Or, more probably (rather than two co-existing DB instances), the database is recreated on every Docker restart, even though the docker-compose file doesn't look like it should be.
We are using a similar setup that works for us, but the data volume is mounted differently, so that might be the issue here:
- ./pgdata:/var/lib/postgresql/data/pgdata
settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': os.environ.get('POSTGRES_DB', 'my_db'),
        'HOST': os.environ.get('POSTGRES_HOST', 'localhost'),
        'PORT': os.environ.get('POSTGRES_PORT', '5432'),
        'USER': os.environ.get('POSTGRES_USER', 'my_user'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD', 'my_password'),
        'CONN_MAX_AGE': 120,
    }
}
docker-compose.yml
version: '2'
services:
  postgres:
    image: postgres:9.5
    volumes:
      - ./pgdata:/var/lib/postgresql/data/pgdata
    environment:
      - PGDATA=/var/lib/postgresql/data/pgdata
      - POSTGRES_DB=<name>            # django.settings.DATABASES['default']['NAME']
      - POSTGRES_USER=<user>          # django.settings.DATABASES['default']['USER']
      - POSTGRES_PASSWORD=<password>  # django.settings.DATABASES['default']['PASSWORD']
    ports:
      - 5432:5432
  web:
    depends_on:
      - postgres
    volumes:
      - .:/app
    environment:
      - POSTGRES_HOST=postgres
      - POSTGRES_DB=<name>            # no need if settings.py is fine
      - POSTGRES_USER=<user>          # no need if settings.py is fine
      - POSTGRES_PASSWORD=<password>  # no need if settings.py is fine
    build: .
    ports:
      - 8000:8000
    command: ./start.sh
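To check whether the migrations actually reached the database the app is talking to, here is a quick introspection sketch (the table name call_meeting is taken from the error message above; run it inside python manage.py shell):

from django.db import connection

# List the columns Django actually sees for the meeting table; if
# 'user_participants' is missing, the migration never ran against
# this database instance.
with connection.cursor() as cursor:
    cursor.execute(
        "SELECT column_name FROM information_schema.columns "
        "WHERE table_name = 'call_meeting'"
    )
    print([row[0] for row in cursor.fetchall()])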