How to properly populate Docker Django image with data from Postgres? - django

I wanted to create a Docker image for my Django project called mysite, which has two apps, tracks and users. I used docker build -t mysite to build my Docker image. I have written a Dockerfile as described on Docker Hub. Then I created a docker-compose.yml file and a bash script entrypoint.sh that I use in the compose file.
These are my files:
Dockerfile:
FROM django:onbuild
COPY entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh
docker-compose.yml:
version: '3'
services:
  db:
    image: postgres
    restart: always
    container_name: postgres_container
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=postgres
      - DJANGO_SETTINGS_MODULE=mysite.settings_docker
    ports:
      - 5432:5432
    volumes:
      - /var/lib/postgresql/10/main
  web:
    image: mysite:latest
    build:
      context: .
      dockerfile: Dockerfile
    container_name: mysite_container
    ports:
      - "8000:8000"
    depends_on:
      - db
    environment:
      - DJANGO_SETTINGS_MODULE=mysite.settings_docker
entrypoint.sh:
#!/bin/sh
python manage.py makemigrations
python manage.py migrate
python manage.py runserver 0.0.0.0:8000
exec "$@"
So when I wanted to start the whole container, I used the following commands:
docker build .
docker-compose build
docker-compose up
My database in settings.py is:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}
My database in settings_docker.py, which I use in the container, is:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'db',
        'PORT': '5432',
    }
}
Then I added the new settings_docker.py file where I set my host to 'db' (in the original settings.py I have HOST='localhost', and when I start the Django app with python manage.py runserver I have all my tables and relations from the Postgres database) and added the environment variable DJANGO_SETTINGS_MODULE=mysite.settings_docker to my web service in docker-compose.yml. The container starts properly, but now I don't have my database relations and tables.
So, how can I migrate those relations into the container? What is the right way to populate my Docker container with the data from Postgres? I have tried volumes in docker-compose.yml where I specified my Postgres data directory (my data is in /var/lib/postgresql/10/main), but it's still not working.

In case you consider switching to another approach: a native and quite portable* solution for providing an initial state of a DB is to use Django fixtures. With that method, you first bring your DB into the expected state, meaning you provide the initial tables and records. When ready, perform:
python manage.py dumpdata > fixturename.json
in order to store the data in a file. Then, reuse the saved file with:
python manage.py loaddata fixturename.json
Another, but less flexible, way would be to provide a .sql file in the image's init location (/docker-entrypoint-initdb.d for the official postgres image) - check the manual for the details for your specific Postgres version.
*portable, meaning you can move the file around, send it, edit it, and easily apply it.
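For illustration, a dumped fixture is just a JSON list of serialized objects. A minimal entry for a hypothetical model in the asker's tracks app might look like this (the model and field names are made up for the example):

```json
[
  {
    "model": "tracks.track",
    "pk": 1,
    "fields": {
      "title": "First track",
      "created": "2020-01-01T00:00:00Z"
    }
  }
]
```

Saved as tracks/fixtures/initial_data.json, this can be restored on a fresh database with python manage.py loaddata initial_data, e.g. as an extra line in entrypoint.sh after the migrate step.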

Related

Cannot connect django to postgres running inside docker container

My Django app is not dockerized, but I run Postgres inside a Docker container using a docker-compose.yml script. After docker-compose up I can connect to the DB with DBeaver, but not with the Django app. Every time I get this error:
django.db.utils.OperationalError: could not translate host name "db" to address:
Temporary failure in name resolution
File docker-compose.yml:
version: "3.9"
services:
  db:
    image: postgres:13
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - "POSTGRES_HOST_AUTH_METHOD=trust"
      - POSTGRES_USER="postgres"
      - POSTGRES_PASSWORD="postgres"
      - POSTGRES_DB="postgres"
    ports:
      - 5432:5432
volumes:
  postgres_data:
Django project file config/settings.py:
...
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': env.str("DB_NAME"),
        'USER': env.str("DB_USER"),
        'PASSWORD': env.str("DB_PASS"),
        'HOST': env.str("DB_HOST"),
        'PORT': env.decimal("DB_PORT")
    }
}
It turned out to be some kind of pipenv error. When I restarted the terminal, executed pipenv shell again, and then ran python manage.py runserver, everything worked just fine.

could not translate host name "db" to address: Unknown host

I'm attempting to follow the guide provided by: https://docs.docker.com/compose/django/
Whenever I attempt to makemigrations, it gives me the Unknown host error shown in the title. I'm trying to use PostgreSQL with Django and Wagtail as its CMS.
My docker-compose.yml looks like:
version: "3.9"
services:
  db:
    image: postgres
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
and my settings in the settings.py file look like:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'db',
        'PORT': 5432,
    }
}
Am I missing anything?
Your code can run in two different environments, and hard-coding the connection information might not be correct.
You mention in a comment that you're running something like:
docker-compose up -d db
python manage.py makemigrations
In this environment python is running outside of Docker. If you add ports: [5432:5432] to the database configuration in the docker-compose.yml file, the database will be accessible via (probably) localhost. On the other hand, when you run docker-compose up, the application runs inside Docker and the database will be reachable at db.
You can use an environment variable to configure this. I find it useful to give these variables default values that would be useful for a developer, and set them to different values in my deployment setup (the docker-compose.yml).
DATABASES = {
    'default': {
        ...
        'HOST': os.getenv('DB_HOST', 'localhost'),
        ...
    }
}
version: "3.9"
services:
  db:
    ports:
      - '5432:5432'  # makes this accessible from your development environment
    ...
  web:
    environment:
      - DB_HOST=db
    ...
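The same fallback pattern can be sketched outside Django. The DB_HOST variable matches the compose file above; the db_host helper is a hypothetical name for illustration, not part of the answer's project:

```python
import os

# Hypothetical helper mirroring the settings line above: read the DB host
# from the environment, falling back to a developer-friendly default.
def db_host(default="localhost"):
    return os.getenv("DB_HOST", default)

# Outside Docker, DB_HOST is unset, so the default applies:
os.environ.pop("DB_HOST", None)
print(db_host())  # localhost

# Inside the web container, docker-compose sets DB_HOST=db:
os.environ["DB_HOST"] = "db"
print(db_host())  # db
```

This way the same settings.py works unchanged in both environments; only the environment differs.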

Docker Compose Django + PostgreSQL access postgres host without using service name

So, basically I have this docker-compose.yml config:
services:
  postgres:
    container_name: youtube_manager_postgres
    restart: always
    image: postgres:alpine
    environment:
      - POSTGRES_HOST_AUTH_METHOD=trust
      - POSTGRES_USER=admin
      - POSTGRES_PASSWORD=qwerty123
      - POSTGRES_DB=ytmanager
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    ports:
      - "5432:5432"
  django:
    container_name: youtube_manager_django
    restart: always
    build:
      context: ../
      dockerfile: deploy/django/Dockerfile
    command: sh -c "poetry run python3 manage.py migrate &&
      poetry run python3 manage.py collectstatic --no-input --clear &&
      poetry run uwsgi --ini /etc/uwsgi.ini"
    ports:
      - "8000:8000"
    volumes:
      - staticfiles:/code/static
      - mediafiles:/code/media
    depends_on:
      - postgres
My Django's database preferences are:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'ytmanager',
        'USER': 'admin',
        'HOST': '0.0.0.0',
        'PASSWORD': 'qwerty123',
        'PORT': '5432',
    }
}
I want to use it in two ways:
1. Running docker-compose up -d postgres and then python3 manage.py runserver (actually, poetry run python3 manage.py runserver, but for now it doesn't matter) during development.
2. Running docker-compose up during deployment.
For now, option 1 works fine, but when I execute docker-compose up I get an error:
youtube_manager_django | django.db.utils.OperationalError: could not connect to server: Connection refused
youtube_manager_django | Is the server running on host "0.0.0.0" and accepting
youtube_manager_django | TCP/IP connections on port 5432?
If I change the Django database settings this way:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'ytmanager',
        'USER': 'admin',
        'HOST': '0.0.0.0',
        'PASSWORD': 'qwerty123',
        'PORT': '5432',
    }
}
Executing docker-compose up -d postgres and then python manage.py runserver causes an error:
django.db.utils.OperationalError: could not translate host name "postgres" to address: Temporary failure in name resolution
How can I properly configure docker-compose.yml so that I can use the same HOST in the Django settings for both options (for example, 'HOST': '0.0.0.0' or 'HOST': 'postgres')?
I've tried using network_mode: host on my django and postgres services. It works fine, but is there any other way to solve the problem (for example, using networking settings)? I've read the docker-compose documentation on their website, but I can't work out what's going on there.
I think you are mixing development and production environments (by the way, the second time you pasted your Django database settings you probably meant 'HOST': 'postgres').
So, if I'm not wrong:
In development you want 'HOST': '0.0.0.0' in your Django settings .py file, since I think you are executing python manage.py runserver outside Docker, keeping Postgres in Docker.
In production you would like to keep that same setting, but to make it work you need 'HOST': 'postgres' (matching the name of the service in the compose file) and to run everything in Docker (executing the whole compose file as it is: docker-compose up). In this case Django can't reach a database at '0.0.0.0', since it is running "containerized" and that IP doesn't bind to any service, so it needs the name of the 'postgres' service.
Possible solution:
In my opinion the solution is having two .yml files to be called by Docker (e.g. docker-compose -f docker-compose-development.yml up):
docker-compose-development.yml
docker-compose-production.yml
In each .yml file you can use different env variables or settings to cover differences between development and production in a clean way.
You can have a look at:
https://github.com/pydanny/cookiecutter-django. It is a template Django project using Docker.
It follows "The Twelve Factors" app methodology:
https://12factor.net/
In short:
Environment variables are set in files under the .envs folder.
In the compose .yml files you point to them in order to load the environment variables:
env_file:
  - ./.envs/.production/.postgres
Django settings .py files get access to the env variables using the django-environ package.
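To illustrate the idea behind that package, here is a stdlib-only sketch of roughly what a single DATABASE_URL variable collapses into. parse_database_url is a hypothetical helper written for this example, not django-environ's actual API:

```python
from urllib.parse import urlparse

# Hypothetical helper: split one DATABASE_URL environment variable into the
# fields that Django's DATABASES dict expects.
def parse_database_url(url):
    parts = urlparse(url)
    return {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": parts.path.lstrip("/"),
        "USER": parts.username,
        "PASSWORD": parts.password,
        "HOST": parts.hostname,
        "PORT": parts.port,
    }

cfg = parse_database_url("postgres://admin:qwerty123@postgres:5432/ytmanager")
print(cfg["HOST"], cfg["PORT"], cfg["NAME"])  # postgres 5432 ytmanager
```

With this approach the development and production compose files only differ in the URL they export, while the settings module stays identical.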
Before editing the Django DB config, please ensure the following:
Both containers are running on the same network.
The Postgres service is up and running in its container.
The 'postgres' service is reachable from the webapp container. For this you can log in to the container and perform a ping test:
docker exec -it containerID /bin/bash (in some cases /bin/sh) # log in to the container
Solutions:
Similar:
Django connection to postgres by docker-compose
For Django to connect to the DB using the service name, set HOST to the compose service name ('postgres'). Note that if the HOST key is left out or empty, Django attempts a local UNIX-socket connection to Postgres, which won't work from another container.
Hope it helps

Dockerized Django REST Framework / Postgresql - Disallowed host

When I try to execute the following command:
sudo docker-compose up
I get the following error inside my terminal:
Here is what my Dockerfile looks like:
FROM python:3.6
ENV PYTHONUNBUFFERED 1
RUN mkdir /agent-technologies
WORKDIR /agent-technologies
COPY . /agent-technologies
RUN pip install -r requirements.txt
EXPOSE 8000
Here is what my docker-compose.yml looks like:
version: '3'
services:
  db:
    image: postgres
    environment:
      - POSTGRES_USER=stefan_radonjic
      - POSTGRES_PASSWORD=cepajecar995
      - POSTGRES_DB=agent_technologies_db
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/agent-technologies
    ports:
      - "8000:8000"
    links:
      - db
    depends_on:
      - db
And finally, here are the settings of the PostgreSQL DB I previously created:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'agent_technologies_db',
        'USER': 'stefan_radonjic',
        'PASSWORD': 'cepajecar995',
        'HOST': 'db',
        'PORT': '',
    }
}
The error is quite self-explanatory: you didn't add your IP to your ALLOWED_HOSTS setting. Try adding this to your settings file:
ALLOWED_HOSTS = ['*'] # wildcard, allows all
Or if you want to be explicit:
ALLOWED_HOSTS = ['0.0.0.0'] # explicit
Your Django app is trying to connect to a 'localhost' database, but as it runs inside its own container, localhost is not the right host to reach the database.
Docker Compose will resolve your name queries to the database if you refer to it by the service name of the db container, in your case "db".
As the text in the picture reads, you have to add "0.0.0.0" to your ALLOWED_HOSTS variable inside your settings module.
Be sure that you've read the Django documentation carefully. It is pretty thorough.
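As a simplified illustration (not Django's actual code; the real check lives in django.http.request.validate_host), ALLOWED_HOSTS matching behaves roughly like this: '*' matches any host, a leading dot matches a domain and its subdomains, and anything else must match exactly.

```python
# Simplified sketch of ALLOWED_HOSTS matching; host_allowed is a
# hypothetical helper written for this example.
def host_allowed(host, allowed):
    for pattern in allowed:
        if pattern == "*":
            return True  # wildcard, allows all
        if pattern.startswith("."):
            # ".example.com" matches example.com and any subdomain of it
            if host == pattern[1:] or host.endswith(pattern):
                return True
        elif host == pattern:
            return True  # exact match
    return False

print(host_allowed("0.0.0.0", ["*"]))            # True
print(host_allowed("0.0.0.0", ["0.0.0.0"]))      # True
print(host_allowed("example.org", ["0.0.0.0"]))  # False
```

This is why either ALLOWED_HOSTS = ['*'] or the explicit ['0.0.0.0'] from the answer above makes the error go away.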

Django rest Docker with MySQL

I am trying to dockerize my existing Django REST project. I am using a MySQL database instead of the default SQLite.
My Dockerfile looks like following:
FROM python:2.7
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
COPY . /code/
RUN pip install -r requirements.txt
and Docker-compose:
version: '3'
services:
  db:
    image: mysql
    environment:
      MYSQL_ROOT_PASSWORD: docker
      MYSQL_DATABASE: docker
      MYSQL_USER: docker
      MYSQL_PASSWORD: docker
    ports:
      - "3306:3306"
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
I did not run docker-compose run web python manage.py migrate.
docker-compose build is successful.
However, docker-compose up eventually fails, saying Can't connect to local MySQL server. I am guessing that I need to install MySQL in my container as well, but I do not know how. What am I missing in my Dockerfile or docker-compose?
UPDATE: My settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',  # django.db.backends.mysql
        'NAME': 'libraries',   # local: libraries  # server:
        'USER': 'root',        # root  # root
        'PASSWORD': 'root',    # local: root  # server:
        'HOST': 'localhost',   # local: localhost  # server:
        'PORT': '3306',
    }
}
Change HOST in the database settings from localhost to db. There is no MySQL server in the web container, so the Python code could not connect to the database.
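Beyond the host name, a common follow-up failure is a race: the db container accepts connections only after MySQL finishes initializing, so the web entrypoint may need to wait before migrating. A minimal sketch of such a wait, assuming a hypothetical wait_for_port helper and demonstrated here against a throwaway local listener rather than MySQL:

```python
import socket
import time

# Hypothetical helper: poll until a TCP port accepts connections, or give up
# after `timeout` seconds. An entrypoint could call this with ("db", 3306)
# before running migrations.
def wait_for_port(host, port, timeout=30.0):
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1):
                return True  # port is open
        except OSError:
            time.sleep(0.5)  # not ready yet, retry
    return False

# Demonstration against a throwaway listener on a random free port:
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
print(wait_for_port("127.0.0.1", port, timeout=5))  # True
srv.close()
```

Tools like depends_on only order container startup; they do not wait for the database inside the container to be ready, which is why a loop like this (or an external wait script) is often used.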