I am getting an error when I run PostGIS and Django in Docker. I am trying to set up a Django project to use a PostGIS database.
docker-compose.yml
version: '3'
services:
  db:
    image: postgis/postgis
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    ports:
      - 5432:5432
  web:
    build: .
    command: bash -c "
      python manage.py makemigrations
      && python manage.py migrate
      && python manage.py runserver 0.0.0.0:8000
      "
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
Dockerfile
FROM python:3
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
COPY requirements.txt /code/
RUN pip install -r requirements.txt
COPY . /code/
error message:
web_1 | File "/usr/local/lib/python3.8/site-packages/psycopg2/__init__.py", line 127, in connect
web_1 | conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
web_1 | django.db.utils.OperationalError: could not connect to server: Connection refused
web_1 | Is the server running on host "db" (192.168.192.2) and accepting
web_1 | TCP/IP connections on port 5432?
web_1 |
ht_djangoapp_web_1 exited with code 1
It's most likely a timing issue: your application is trying to connect to the database before the database is ready. The easiest solution here is probably to just set a restart policy on your application container, so that Docker will restart it when it fails. You might as well get rid of the depends_on directive, because it is functionally useless here: while Docker knows when the database container is up, it knows nothing about the state of the database server running inside it.
web:
  build: .
  command: bash -c "
    python manage.py makemigrations
    && python manage.py migrate
    && python manage.py runserver 0.0.0.0:8000
    "
  volumes:
    - .:/code
  ports:
    - "8000:8000"
  restart: on-failure
An alternative solution is to put a script into your application container that explicitly waits for the database to respond before starting up the web application.
The problem with this solution is that you may want to be able to restart the database container (for example, to upgrade to a newer version of postgres), in which case your application will probably fail again, unless it has reconnect logic built in. Using the restart policy is probably the better choice.
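If you do want the waiting behaviour, one way to get it without writing the script yourself is a healthcheck on the db service combined with a depends_on condition. This is only a sketch, not part of the original setup: it assumes a Docker Compose version that supports the condition form of depends_on (the Compose Specification used by docker compose v2; the older docker-compose with the plain version: '3' file format did not support it), and it reuses the postgres user from the compose file above.

services:
  db:
    image: postgis/postgis
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    healthcheck:
      # pg_isready ships with the postgres/postgis images
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 2s
      timeout: 2s
      retries: 30
  web:
    build: .
    command: bash -c "
      python manage.py migrate
      && python manage.py runserver 0.0.0.0:8000
      "
    ports:
      - "8000:8000"
    depends_on:
      db:
        condition: service_healthy
    restart: on-failure

Either way, this only delays the first connection attempt; it adds no reconnect logic, so keeping the restart policy is still a good idea.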
Related
I'm trying to dockerize my Django app with a Postgres DB and I'm having trouble. When I run docker-compose, the error is:
failed to solve: rpc error: code = Unknown desc = failed to solve with frontend dockerfile.v0: failed to read dockerfile: open /var/lib/docker/tmp/buildkit-mount4260694681/Dockerfile: no such file or directory
Project structure in screenshot:
Dockerfile:
FROM python:3
ENV PYTHONUNBUFFERED=1
WORKDIR /app
COPY requirements.txt ./
RUN pip install -r requirements.txt
COPY . .
docker-compose:
version: "3.9"
services:
gunter:
restart: always
build: .
container_name: gunter
ports:
- "8000:8000"
command: python manage.py runserver 0.0.0.0:8080
depends_on:
- db
db:
image: postgres
Then: docker-compose run gunter
What am I doing wrong?
I tried changing directories, but everything seems to be correct; I also tried changing the location of the Dockerfile.
In your docker-compose.yml you are building your app from ., but the Dockerfile is not there, so docker-compose has nothing to build from.
Point build at the directory that contains your Dockerfile:
version: "3.9"
services:
gunter:
restart: always
build: ./gunter_site/
container_name: gunter
ports:
- "8000:8000"
command: python manage.py runserver 0.0.0.0:8080
depends_on:
- db
db:
image: postgres
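If you would rather keep the project root as the build context, Compose also lets you point at the Dockerfile explicitly; the gunter_site/Dockerfile path below is an assumption based on the directory name used above:

services:
  gunter:
    build:
      context: .
      dockerfile: gunter_site/Dockerfile

Unrelated to the build error, note that the command starts the dev server on 0.0.0.0:8080 while ports publishes 8000:8000; those two ports need to match (for example, run the server on 0.0.0.0:8000) before the app is reachable from the host.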
I'm trying to deploy a Django project to Heroku using a Docker image. My Procfile contains the command:
web: gunicoron myProject.wsgi
But when I push and release to Heroku, the dyno process command shown in the dashboard is
web: python3
The command heroku ps reports
web.1: crashed
And I cannot change it; editing the Procfile makes no difference.
When I deploy the same project with git, everything works fine, so why does the Heroku container deploy not work? Everything was done following the Heroku instructions.
My Dockerfile:
FROM python:3
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
WORKDIR /app
COPY requirements.txt /app/
RUN pip install -r requirements.txt
COPY . /app/
My docker-compose.yml:
version: "3.9"
services:
db:
image: postgres
volumes:
- ./data/db:/var/lib/postgresql/data
# ports:
# - "5432:5432"
environment:
- POSTGRES_DB=${SQL_NAME}
- POSRGRES_USER=${SQL_USER}
- POSTGRES_PASSWORD=${SQL_PASSWORD}
web:
build: .
# command: python manage.py runserver 0.0.0.0:8000
command: gunicorn kereell.wsgi --bind 0.0.0.0:8000
volumes:
- .:/code
ports:
- "8000:8000"
# env_file: .env
environment:
- DEBUG=${DEBUG}
- SECRET_KEY=${SECRET_KEY}
- DB_ENGINE=${SQL_ENGINE}
- DB_NAME=${SQL_NAME}
- DB_USER=${SQL_USER}
- DB_PASSWD=${SQL_PASSWORD}
- DB_HOST=${SQL_HOST}
- DB_PORT=${SQL_PORT}
depends_on:
- db
My requirements.txt:
Django
psycopg2
gunicorn
Please help me resolve this. Thanks in advance.
Here is the case. I have a simple Django app with Cucumber tests. I dockerized the Django app and it works perfectly, but I want to dockerize the Cucumber tests too and run them. Here is my project structure:
-cucumber_drf_tests
    -feature
    -step_definitions
    axiosinst.js
    config.js
    package.json
    cucumber.js
    Dockerfile
    package-lock.json
-project_apps
    -common
docker-compose.yaml
Dockerfile
manage.py
requirements.txt
Here is my cucumber_drf_tests/Dockerfile
FROM node:12
WORKDIR /app/src
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 8000
CMD ["yarn", "cucumber-drf"] (this is how I run my test locally)
My second Dockerfile
FROM python:3.8
ENV PYTHONUNBUFFERED=1
RUN mkdir -p /app/src
WORKDIR /app/src
COPY requirements.txt /app/src
RUN pip install -r requirements.txt
COPY . /app/src
And my docker-compose file
version: "3.8"
services:
test:
build: ./cucumber_drf_tests
image: cucumber_test
container_name: cucumber_container
ports:
- 8000:8000
depends_on:
- app
app:
build: .
image: app:django
container_name: django_rest_container
ports:
- 8000:8000
volumes:
- .:/django #describes a folder that resides on our OS within the container
command: >
bash -c "python manage.py migrate
&& python manage.py loaddata ./project_apps/fixtures/dummy_data.json
&& python manage.py runserver 0.0.0.0:8000"
depends_on:
- db
db:
image: postgres
container_name: postgres_db
volumes:
- ./data/db:/var/lib/postgresql/data
environment:
- POSTGRES_DB=postgres
- POSTGRES_USER=bla
- POSTGRES_PASSWORD=blaa
If I remove the test service and run the tests locally, everything is fine, but otherwise I get various errors; the latest one is:
Bind for 0.0.0.0:8000 failed: port is already allocated
That makes sense, I know, but how do I tell the test container to make its API calls to the address of the running django_rest_container? Maybe this is a dumb question, but I am new to the container world, so any good practices you can share are welcome.
The issue is in how you expose the ports. You are publishing both app and test on the same host port (8000). The container port can stay the same, but the host port has to be different.
<host port>:<container port>
This is how ports are mapped in Docker, so change the host port of either app or test. For example, for app use:
7500:8000
Now your app will be accessible on the host at port 7500 and test at port 8000.
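As a sketch, the relevant part of the compose file would then look something like this (7500 is just an example host port):

services:
  app:
    ports:
      - "7500:8000"   # host port 7500 -> container port 8000
  test:
    ports:
      - "8000:8000"   # host port 8000 -> container port 8000

Also, inside the Compose network the test container should call the API by service name, e.g. http://app:8000, rather than localhost; the host-side mappings only matter for reaching the containers from your own machine. If the tests only ever call the app from inside the network, the test service does not need to publish a port at all.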
I am new to Docker, so I'm probably missing something obvious, but here goes. I am trying to test a simple Django app with a dockerized Postgres database. All I want right now is to verify that the home page is working on localhost. The debug output window gives me the following:
web_1 | System check identified no issues (0 silenced).
web_1 | April 28, 2020 - 17:06:23
web_1 | Django version 3.0.5, using settings 'app.settings'
web_1 | Starting development server at http://0.0.0.0:8000/
web_1 | Quit the server with CONTROL-C.
However, when I go to 0.0.0.0:8000 I get an error that says the site can't be reached:
"The webpage at http://0.0.0.0:8000/ might be temporarily down or it may have moved permanently to a new web address.
ERR_ADDRESS_INVALID
Here is my docker-compose.yml:
version: '3.7'
services:
  web:
    build: .
    command: python /app/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/app
    ports:
      - 8000:8000
    depends_on:
      - db
  db:
    image: "postgres:latest"
    ports:
      - "5432:5432"
    environment:
      - "POSTGRES_HOST_AUTH_METHOD=trust"
Here is my dockerfile:
# Pull base image
FROM python:3.8
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# Set work directory
WORKDIR /app
# Install dependencies
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt
# Copy project
COPY . /app/
EXPOSE 8000
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
Would greatly appreciate any insight or help with troubleshooting. Because there are no errors in the debug window I'm not really sure where to start.
Thanks!
Had the same issue. Solved it by using Iain Shelvington's comment:
Try http://localhost:8000/ or http://127.0.0.1:8000/
0.0.0.0 is not an address you browse to; it tells the dev server to listen on all interfaces inside the container. On the host, the published port is reached via localhost (or 127.0.0.1).
Hi, I am having issues running migrations on the Postgres DB container from Django.
Here is my docker-compose file
web:
  restart: always
  build: ./web
  expose:
    - "8000"
  links:
    - postgres:postgres
  volumes:
    - /usr/src/app
    - /usr/src/app/static
  env_file:
    - ./.env
  environment:
    - DEBUG=1
  command: /usr/local/bin/gunicorn mysite.wsgi:application -w 2 -b :8000
nginx:
  restart: always
  build: ./nginx/
  ports:
    - "80:80"
  volumes:
    - /www/static
  volumes_from:
    - web
  links:
    - web:web
postgres:
  restart: always
  build: ./postgres
  env_file: .env
  ports:
    - "5432:5432"
  volumes:
    - pgdata:/var/lib/postgresql/data/
The directory structure is below.
The .env file defines the Postgres DB, user name and password:
DB_NAME=test
DB_USER=test
DB_PASS=test!
DB_SERVICE=postgres
DB_PORT=5432
POSTGRES_USER=test
POSTGRES_DB=test
POSTGRES_PASSWORD=test!
When I run docker-compose build and docker-compose up -d, the nginx, postgres and web containers start. The postgres startup (default) creates the DB, user and password. The Django container installs requirements.txt and starts the Django server (everything looks good).
On running makemigrations
docker-compose run web /usr/local/bin/python manage.py makemigrations polls
I get the following output
Migrations for 'polls':
  0001_initial.py:
    - Create model Choice
    - Create model Question
    - Add field question to choice
But when I run
docker-compose run web /usr/local/bin/python manage.py showmigrations polls
the output is:
polls
 (no migrations)
On running
docker-compose run web /usr/local/bin/python manage.py migrate --fake polls
the output I see is:
Operations to perform:
Apply all migrations: (none)
Running migrations:
No migrations to apply.
Your models have changes that are not yet reflected in a migration, and so won't be applied.
Run 'manage.py makemigrations' to make new migrations, and then re-run 'manage.py migrate' to apply them.
The tables are not created in Postgres. What am I doing wrong? Sorry for the long post, but I wanted to put all the details here.