GeoDjango and PostGIS setup in Docker - Django

I'm trying to create a Docker setup so that I can easily build and deploy a GeoDjango app (with a PostGIS backend). I have the following folder structure:
|-- Dockerfile
|-- Pipfile
|-- Pipfile.lock
|-- README.md
|-- app
| |-- manage.py
| |-- app
| `-- app_web
In my Dockerfile to set up Django I have the following:
# Pull base image
FROM python:3.7
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# install dependencies
RUN pip install pipenv
COPY . /code
WORKDIR /code/
RUN pipenv install --system
# Setup GDAL
RUN apt-get update && \
    apt-get install -y binutils libproj-dev gdal-bin python-gdal python3-gdal
# set work directory
WORKDIR /code/app
CMD ["python", "manage.py", "migrate", "--no-input"]
In my docker-compose.yml file:
version: '3.7'
services:
  postgis:
    image: kartoza/postgis:12.1
    volumes:
      - postgres_data:/var/lib/postgresql/data/
  web:
    build: .
    command: python /code/app/manage.py runserver 0.0.0.0:8000
    ports:
      - 8000:8000
    volumes:
      - .:/code
    depends_on:
      - postgis
volumes:
  postgres_data:
And finally in settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'postgres',
        'USER': 'postgres',
        'HOST': 'postgis',
    },
}
Now when I run docker-compose up --build, it all seems to work: both the database and the Django app spin up their containers. But whenever I try to actually work with the database (the app is blank right now, so I still need to migrate), Django doesn't seem to recognize the database.
Can anybody tell me what is going wrong?

run docker-compose up --build It all seems to work (both the database as well as the django app spin up their containers)
Make sure to check the logs, because containers do not necessarily stop after hitting an error:
docker-compose logs
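To follow the output of a single service while reproducing the failure, for example (service names taken from the compose file above):
docker-compose logs -f postgis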
Also, you're already setting your working directory in the Dockerfile:
WORKDIR /code/app
so take a look at the command you execute in docker-compose.yml:
command: python /code/app/manage.py runserver 0.0.0.0:8000
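Since the image already sets WORKDIR /code/app, a relative invocation works too; a minimal sketch, functionally equivalent but less redundant:
command: python manage.py runserver 0.0.0.0:8000
Note also that this compose command overrides the Dockerfile's CMD, so the migrate step defined there never runs under docker-compose up; you can run it manually with something like:
docker-compose run --rm web python manage.py migrate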

Did you install the app?
INSTALLED_APPS = [
    # [...]
    'django.contrib.gis',
]

In settings.py, PASSWORD needs to be specified:
DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'postgis',
    },
}
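Note that these credentials have to match what the database container was initialized with. As an assumption (verify against the kartoza/postgis README for your tag), that image takes its superuser credentials from environment variables, so the compose service would look something like:
postgis:
  image: kartoza/postgis:12.1
  environment:
    # Variable names assumed from the kartoza/postgis README; verify for your tag
    - POSTGRES_USER=postgres
    - POSTGRES_PASS=postgres
    - POSTGRES_DBNAME=postgres
  volumes:
    - postgres_data:/var/lib/postgresql/data/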

Related

Docker: Use localhost instead of db to connect to the Postgres database

I'm running into problems when using Docker. To connect from the Django application to the Postgres database I have to use:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'HOST': 'db',
        'PORT': 5432,
    }
}
However, to run tests via pytest in my Pipenv shell I have to change 'HOST' from db to localhost. Is there a way that I can always use localhost?
Docker-Compose:
version: '3'
services:
  db:
    image: postgres
    ports:
      - "5432:5432"
  web:
    build: .
    env_file: .env
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
    container_name: test
Dockerfile:
# Pull base image
FROM python:3
# Set environment varibles
ENV PYTHONUNBUFFERED 1
# Set work directory
RUN mkdir /code
WORKDIR /code
# Install dependencies
RUN pip install --upgrade pip
RUN pip install pipenv
COPY ./Pipfile /code/Pipfile
RUN pipenv install --deploy --system --skip-lock --dev
# Define ENTRYPOINT
COPY ./docker-entrypoint.sh /docker-entrypoint.sh
RUN chmod +x /docker-entrypoint.sh
ENTRYPOINT ["/docker-entrypoint.sh"]
# Copy project
COPY . /code/
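One way to avoid editing settings back and forth (an assumption, not from this thread) is to read the host from an environment variable that defaults to localhost: inside Compose you set it to db via the .env file the web service already loads, while a local pytest run falls back to localhost, which works here because the db service publishes port 5432 to the host. A minimal sketch, where DATABASE_HOST is a hypothetical variable name:
# settings.py
import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        # set DATABASE_HOST=db in the .env file used by the web service;
        # leave it unset for local pytest runs
        'HOST': os.environ.get('DATABASE_HOST', 'localhost'),
        'PORT': 5432,
    }
}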

Django fails to connect to the Postgres database in GitLab CI

I have the following .gitlab-ci.yml:
image: python:3.6
stages:
  - lint
  - test
services:
  - postgres:10.1-alpine
cache:
  paths:
    - /root/.local/share/virtualenvs/
before_script:
  - python -V
  - pip install pipenv
  - pipenv install --dev
lint:
  stage: lint
  script:
    - pipenv run pylint --output-format=text --load-plugins pylint_django project/ | tee pylint.txt
    - score=$(sed -n 's/^Your code has been rated at \([-0-9.]*\)\/.*/\1/p' pylint.txt)
    - echo "Pylint score was $score"
    - pipenv run anybadge --value=$score --file=pylint.svg pylint
  artifacts:
    paths:
      - pylint.svg
test:
  stage: test
  script:
    - pipenv run python manage.py test
And I am connecting to the database like this:
# settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': '',
        'HOST': 'db',  # set in docker-compose.yml
        'PORT': 5432,  # default postgres port
    }
}
For now, I just have this test in users/tests.py:
from .models import CustomUser
from django.test import TestCase

class LogInTest(TestCase):
    def setUp(self):
        self.credentials = {
            'username': 'testuser',
            'password': 'secret'}
        CustomUser.objects.create_user(**self.credentials)

    def testLogin(self):
        # send login data
        response = self.client.post('/users/login/', self.credentials, follow=True)
        # should be logged in now
        self.assertTrue(response.context['user'].is_authenticated)
Which fails with the following error:
psycopg2.OperationalError: could not translate host name "db" to address: Name or service not known
In development, I use the following docker-compose.yml file :
version: '3.3'
services:
  db:
    image: postgres:10.1-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
  web:
    build: .
    command: python /code/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - 8000:8000
    environment:
      - SECRET_KEY=changemeinprod
    depends_on:
      - db
volumes:
  postgres_data:
With this Dockerfile:
FROM python:3.6
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# Set work directory
WORKDIR /code
# Install dependencies
RUN pip install --upgrade pip
RUN pip install pipenv
COPY ./Pipfile /code/Pipfile
RUN pipenv install --deploy --system --skip-lock --dev
# Copy project
COPY . /code/
I don't understand why the app can't connect to the database in CI but connects just fine in the development environment with Docker.
In your docker-compose.yml you set the service name to db, and Django's settings.py uses db, but GitLab CI uses the image name as the service's hostname, postgres in this case.
You have two options:
1 - Use environment variables in settings.py (a sketch follows after option 2).
2 - Set an alias in .gitlab-ci.yml, like:
services:
  - name: postgres:10.1-alpine
    alias: db
Link: GitLab docs on CI services.
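For option 1, a minimal sketch (DATABASE_HOST is a hypothetical variable name, not something the docs mandate):
# .gitlab-ci.yml -- CI variables are exported to the job as environment variables
variables:
  DATABASE_HOST: postgres

# settings.py -- requires `import os`; only the changed line is shown
'HOST': os.environ.get('DATABASE_HOST', 'db'),
With this, the same settings file works locally (falling back to db for docker-compose) and in CI.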

How to properly populate Docker Django image with data from Postgres?

I wanted to create a Docker image for my Django project called mysite, which has two apps, tracks and users. I used docker build -t mysite to build my Docker image. I have written a Dockerfile like it says on Docker Hub. Then I created a docker-compose.yml file and a bash script, entrypoint.sh, that I use in the docker-compose file.
These are my files:
Dockerfile:
FROM django:onbuild
COPY entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh
docker-compose.yml:
version: '3'
services:
  db:
    image: postgres
    restart: always
    container_name: postgres_container
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=postgres
      - DJANGO_SETTINGS_MODULE=mysite.settings_docker
    ports:
      - 5432:5432
    volumes:
      - /var/lib/postgresql/10/main
  web:
    image: mysite:latest
    build:
      context: .
      dockerfile: Dockerfile
    container_name: mysite_container
    ports:
      - "8000:8000"
    depends_on:
      - db
    environment:
      - DJANGO_SETTINGS_MODULE=mysite.settings_docker
entrypoint.sh
#!/bin/sh
python manage.py makemigrations
python manage.py migrate
python manage.py runserver 0.0.0.0:8000
exec "$@"  # pass through any command given to the container
So when I wanted to start the whole setup I used the following commands:
docker build .
docker-compose build
docker-compose up
My database in settings.py is
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}
My database in settings_docker.py, which I use in the container, is:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'db',
        'PORT': '5432',
    }
}
Then I added the new settings_docker.py file, where I set my host to 'db' (in the original settings.py I have HOST='localhost', and when I start the Django app with python manage.py runserver I have all my tables and relations from the Postgres database), and added the environment variable DJANGO_SETTINGS_MODULE=mysite.settings_docker to my web service in docker-compose.yml. The container starts properly, but now I don't have my database relations and tables.
So, how can I migrate those relations to Docker? What is the right way to populate my Docker container with relations from Postgres? I have tried volumes in docker-compose.yml, where I specified my Postgres data directory (my data is in /var/lib/postgresql/10/main), but it's still not working.
In case you consider switching to another approach: a native and quite portable* solution for providing an initial DB state is to use Django fixtures. With that method, you first bring your DB into the expected state, i.e. provide the initial tables and records. When ready, run:
python manage.py dumpdata > fixturename.json
in order to store the data in a file. Then, reuse the saved file with:
python manage.py loaddata fixturename.json
Another, but less flexible, way would be to provide a .sql file to an init-db location; check the manual for where that is, as it depends on your Postgres version (see the sketch below).
*Portable, meaning you can move the file around, send it, edit it, and easily apply it.
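For the second approach with the official postgres image, the init location is /docker-entrypoint-initdb.d; a minimal compose sketch (the file name initial_data.sql is an assumption):
db:
  image: postgres
  volumes:
    # .sql files mounted here run once, on the first start with an empty data directory
    - ./initial_data.sql:/docker-entrypoint-initdb.d/initial_data.sql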

Dockerized Django REST Framework / Postgresql - Disallowed host

When I try to execute the following command:
sudo docker-compose up
I get a DisallowedHost error in my terminal (the error itself was posted as a screenshot and is not reproduced here).
Here is what my Dockerfile looks like:
FROM python:3.6
ENV PYTHONUNBUFFERED 1
RUN mkdir /agent-technologies
WORKDIR /agent-technologies
COPY . /agent-technologies
RUN pip install -r requirements.txt
EXPOSE 8000
Here is what my docker-compose.yml looks like:
version: '3'
services:
  db:
    image: postgres
    environment:
      - POSTGRES_USER=stefan_radonjic
      - POSTGRES_PASSWORD=cepajecar995
      - POSTGRES_DB=agent_technologies_db
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/agent-technologies
    ports:
      - "8000:8000"
    links:
      - db
    depends_on:
      - db
And finally, here are the settings of the PostgreSQL DB I previously created:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'agent_technologies_db',
        'USER': 'stefan_radonjic',
        'PASSWORD': 'cepajecar995',
        'HOST': 'db',
        'PORT': '',
    }
}
The error is quite self-explanatory: you didn't set your IP in the ALLOWED_HOSTS setting. Try adding this to your settings file:
ALLOWED_HOSTS = ['*'] # wildcard, allows all
Or if you want to be explicit:
ALLOWED_HOSTS = ['0.0.0.0'] # explicit
Your Django app is trying to connect to a database on 'localhost', but since the app runs inside its own container, localhost is not the right URI to reach the database. Docker Compose resolves name queries to the database container if you refer to it by its service name, which in your case is "db".
As the text in the (omitted) error screenshot reads, you have to add "0.0.0.0" to the ALLOWED_HOSTS variable inside your settings module.
Be sure that you've read the Django documentation carefully; it is pretty thorough.

Django rest Docker with MySQL

I am trying to dockerize my existing Django REST project. I am using a MySQL database instead of the default SQLite.
My Dockerfile looks like following:
FROM python:2.7
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
COPY . /code/
RUN pip install -r requirements.txt
and docker-compose.yml:
version: '3'
services:
  db:
    image: mysql
    environment:
      MYSQL_ROOT_PASSWORD: docker
      MYSQL_DATABASE: docker
      MYSQL_USER: docker
      MYSQL_PASSWORD: docker
    ports:
      - "3306:3306"
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
I did not run docker-compose run web python manage.py migrate.
docker-compose build is successful.
However, docker-compose up eventually fails, saying Can't connect to local MySQL server. I am guessing that I need to install MySQL in my container as well, but I do not know how. What am I missing in my Dockerfile or docker-compose?
UPDATE: My settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',  # django.db.backends.mysql
        'NAME': 'libraries',  # local: libraries  # server:
        'USER': 'root',  # root  # root
        'PASSWORD': 'root',  # local: root  # server:
        'HOST': 'localhost',  # local: localhost  # server:
        'PORT': '3306',
    }
}
Change the HOST in the database settings from localhost to db. There's no MySQL server inside the web container, so the Python code can't reach the database via localhost.
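A sketch of the corrected settings, aligned with the compose file above (aligning NAME/USER/PASSWORD with the MYSQL_* variables is an assumption beyond the HOST fix the answer names):
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'docker',      # MYSQL_DATABASE in docker-compose.yml
        'USER': 'docker',      # MYSQL_USER
        'PASSWORD': 'docker',  # MYSQL_PASSWORD
        'HOST': 'db',          # the compose service name, not localhost
        'PORT': '3306',
    }
}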