docker-compose build does not pick up updated requirements.txt - Django

I want to use Docker to publish my Django project.
I have created a docker-compose.yml file, a .dockerignore, and a Dockerfile like this one:
FROM python:3.6-alpine
RUN apk add --no-cache gcc musl-dev linux-headers
RUN apk update && apk add postgresql-dev gcc python3-dev musl-dev
RUN mkdir /code
COPY requirements.txt /code
WORKDIR /code
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "manage.py", "runserver", "127.0.0.1:8000"]
When I first run docker-compose I get an error installing a package listed in my requirements.txt file. At that point, I remove the package from the file and run:
docker-compose down
docker-compose build --no-cache
Here is my docker-compose.yml:
version: '3'
networks:
  mynetwork:
    driver: bridge
services:
  db:
    image: postgres
    restart: always
    ports:
      - "5432:5432"
    networks:
      - mynetwork
    environment:
      POSTGRES_USER: myuser
      POSTGRES_PASSWORD: mypass
      POSTGRES_DB: mydb
    volumes:
      - ./data:/var/lib/postgresql/data
  web:
    build: .
    networks:
      - mynetwork
    volumes:
      - .:/DEV
    ports:
      - "8000:8000"
    depends_on:
      - db
When pip install -r requirements.txt runs again, the removed package is still being installed and causes the same issue... how can I clear the cache so the build uses my newly saved requirements.txt file?
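One approach worth trying (a sketch, not a confirmed fix; the service name web is taken from the compose file above) is to rebuild without any cache, prune Docker's build cache, and force-recreate the container so it picks up the new image:

```shell
docker-compose build --no-cache --pull web   # rebuild from scratch, refresh the base image
docker builder prune -f                      # drop cached build layers
docker-compose up --force-recreate web       # recreate the container from the new image
```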

Related

Docker socket is not accessible in Dockerfile.prod

I have the following docker-compose file, which builds and starts 4 containers. One of them is a Django container, for which I mount /var/run/docker.sock as a volume so that the Django container can access the host Docker engine.
version: '3.8'
services:
  web:
    build:
      context: ./app
      dockerfile: Dockerfile.prod
    command: gunicorn hello_django.wsgi:application --bind 0.0.0.0:8000
    volumes:
      - static_volume:/home/app/web/staticfiles
      - media_volume:/home/app/web/mediafiles
      - /var/run/docker.sock:/var/run/docker.sock
    expose:
      - 8000
    env_file:
      - ./.env.prod
    depends_on:
      - postgresdb
    restart: always
  postgresdb:
    container_name: postgresdb
    image: timescale/timescaledb:latest-pg11
    volumes:
      - ./:/imports
      - postgres_data:/var/lib/postgresql/data/
    command: 'postgres -cshared_preload_libraries=timescaledb'
    ports:
      - "5432:5432"
    env_file:
      - ./.env.prod.db
    restart: always
  nginx:
    build: ./nginx
    volumes:
      - static_volume:/home/app/web/staticfiles
      - media_volume:/home/app/web/mediafiles
    ports:
      - 80:80
    depends_on:
      - web
    restart: always
  volttron1:
    container_name: volttron1
    hostname: volttron1
    build:
      context: ./volttron
      dockerfile: Dockerfile
    image: volttron/volttron:develop
    volumes:
      - ./volttron/platform_config.yml:/platform_config.yml
      - ./volttron/configs:/home/volttron/configs
      - ./volttron/volttronThingCerts:/home/volttron/volttronThingCerts
    environment:
      - CONFIG=/home/volttron/configs
      - LOCAL_USER_ID=1000
    network_mode: host
    restart: always
    mem_limit: 700m
    cpus: 1.5
volumes:
  postgres_data:
  static_volume:
  media_volume:
The content of the Dockerfile.prod for the Django web container is the following:
###########
# BUILDER #
###########
# pull official base image
FROM python:3.9.6-alpine as builder
# set work directory
WORKDIR /usr/src/app
# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# install psycopg2 dependencies
RUN apk update && apk add postgresql-dev gcc python3-dev musl-dev
RUN apk add libc-dev
RUN apk add --update-cache
RUN apk add --update alpine-sdk && apk add libffi-dev openssl-dev && apk --no-cache --update add build-base
# lint
RUN pip install -U pip
RUN pip install flake8==3.9.2
COPY . .
RUN flake8 --ignore=E501,F401 ./hello_django
# install dependencies
COPY ./requirements.txt .
RUN pip wheel --no-cache-dir --no-deps --wheel-dir /usr/src/app/wheels -r requirements.txt
#########
# FINAL #
#########
# pull official base image
FROM python:3.9.6-alpine
# create directory for the app user
RUN mkdir -p /home/app
# create the app user
RUN addgroup -S app && adduser -S app -G app
# create the appropriate directories
ENV HOME=/home/app
ENV APP_HOME=/home/app/web
RUN mkdir $APP_HOME
RUN mkdir $APP_HOME/staticfiles
RUN mkdir $APP_HOME/mediafiles
WORKDIR $APP_HOME
# install dependencies
RUN apk update && apk add libpq
COPY --from=builder /usr/src/app/wheels /wheels
COPY --from=builder /usr/src/app/requirements.txt .
RUN pip install --no-cache /wheels/*
# copy entrypoint.prod.sh
COPY ./entrypoint.prod.sh .
RUN sed -i 's/\r$//g' $APP_HOME/entrypoint.prod.sh
RUN chmod +x $APP_HOME/entrypoint.prod.sh
# copy project
COPY . $APP_HOME
# chown all the files to the app user
RUN chown -R app:app $APP_HOME
RUN chmod 666 /var/run/docker.sock
# change to the app user
USER app
# run entrypoint.prod.sh
ENTRYPOINT ["/home/app/web/entrypoint.prod.sh"]
The problem is the statement RUN chmod 666 /var/run/docker.sock, which raises the following error:
chmod: cannot access "/var/run/docker.sock": No such file or directory
But why am I getting this error when I have mounted /var/run/docker.sock in the docker-compose.yml file?
You're trying to chmod the docker.sock file while building the image, but volumes are only mounted when the container runs, so the socket does not exist at build time. Remove that RUN line from the Dockerfile and, if needed, change the permissions of the socket file on the host instead.
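The answer above can be sketched as follows (a hedged sketch, not from the thread; the socket path comes from the compose file, and world-writable permissions on the Docker socket are convenient but insecure):

```shell
# On the host, before starting the stack: open up the socket the web
# container will mount, guarding against the socket being absent
SOCK=/var/run/docker.sock
if [ -S "$SOCK" ]; then
    sudo chmod 666 "$SOCK"   # note: this grants Docker control to all local users
else
    echo "docker socket not found at $SOCK" >&2
fi
```

A less permissive alternative is adding the container user to the host's docker group rather than making the socket world-writable.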

Django/Docker: web container code not up-to-date

EDIT 22/05/2022
Docker version 20.10.14
docker-compose version 1.25.0
I deleted all containers/images again and rebuilt using docker-compose -f docker-compose.preprod.yml build --force-rm --no-cache, but I still observe the same issue: the code in the web container is not up-to-date.
I use a dockerized Django app and cannot manage to apply code updates to my web container.
I've tried deleting all containers (docker rm -f ID; docker system prune) and images (docker rmi -f ID; docker image prune) related to my app, then rebuilding with docker-compose -f docker-compose.preprod.yml build.
Then I run docker-compose -f docker-compose.preprod.yml up, but when I connect to my running web container (docker exec -it web sh) and read my updated files, I observe that the updates are not applied...
What should I do to make my updates apply?
# Pull the official base image
FROM python:3.8.3-alpine
# Set a work directory
WORKDIR /usr/src/app
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# install psycopg2 dependencies
RUN apk update && apk add postgresql-dev gcc g++ python3-dev musl-dev
RUN apk --update add libxml2-dev libxslt-dev libffi-dev musl-dev libgcc openssl-dev curl postgresql-client
RUN apk add jpeg-dev zlib-dev freetype-dev lcms2-dev openjpeg-dev tiff-dev tk-dev tcl-dev nano
RUN pip3 install psycopg2 psycopg2-binary
# install xgettext for i18n
RUN apk add gettext
# Install dependencies
COPY requirements/ requirements/
RUN pip install --upgrade pip && pip install -r requirements/preprod.txt
# Copy the entrypoint.sh file
COPY entrypoint.preprod.sh .
# Copy the initdata sql file
COPY initdata.preprod.sql .
# Copy the project's files
COPY . .
RUN chmod +x entrypoint.preprod.sh
version: '3.7'
services:
  web:
    restart: always
    container_name: ecrf_web
    build:
      context: ./app
      dockerfile: Dockerfile.preprod
    command: gunicorn core.wsgi:application --bind 0.0.0.0:8000
    volumes:
      - app_volume:/usr/src/app
      - static_volume:/usr/src/app/static
      - media_volume:/usr/src/app/media
    expose:
      - 8000
    env_file:
      - ./.env.preprod
    entrypoint: [ "/usr/src/app/entrypoint.preprod.sh" ]
    depends_on:
      - redis
    healthcheck:
      test: [ "CMD", "curl", "-f", "http://localhost:8000/" ]
      interval: 30s
      timeout: 10s
      retries: 50
  redis:
    container_name: ecrf_redis
    image: "redis:alpine"
  celery:
    container_name: ecrf_celery
    build:
      context: ./app
      dockerfile: Dockerfile.preprod
    command: celery -A core worker -l info
    volumes:
      - app_volume:/usr/src/app
    env_file:
      - ./.env.preprod
    depends_on:
      - web
      - redis
  celery-beat:
    container_name: ecrf_celery-beat
    build:
      context: ./app
      dockerfile: Dockerfile.preprod
    command: celery -A core beat -l info
    volumes:
      - app_volume:/usr/src/app
    env_file:
      - ./.env.preprod
    depends_on:
      - web
      - redis
  nginx:
    container_name: ecrf_nginx
    build: ./nginx
    restart: always
    volumes:
      - static_volume:/usr/src/app/static
      - media_volume:/usr/src/app/media
    ports:
      - 1370:80
    depends_on:
      - web
volumes:
  static_volume:
  media_volume:
  app_volume:
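No answer is recorded in the thread, but one thing worth checking with a compose file like this (an assumption, not a confirmed diagnosis): the named volume app_volume is mounted over /usr/src/app, and named volumes keep the contents they were first populated with, so rebuilding the image never changes the code the containers actually see. A sketch of a clean restart that discards those volumes:

```shell
# Stop the stack and remove its named volumes; rebuilding then repopulates
# /usr/src/app from the fresh image (static/media files are lost too)
docker-compose -f docker-compose.preprod.yml down --volumes
docker-compose -f docker-compose.preprod.yml build --no-cache
docker-compose -f docker-compose.preprod.yml up -d
```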

Docker with django issue "no module psycopg2" - but it's installed and in requirements.txt

I have the issue that django_api fails with the error "django.core.exceptions.ImproperlyConfigured: Error loading psycopg2 module: No module named 'psycopg2'".
psycopg2 is installed ->
Requirement already satisfied: psycopg2 in /Users/swestphal/Documents/privateProjects/vue_and_django/ddv/env/lib/python3.9/site-packages (2.8.6)
My requirements.txt
asgiref==3.3.4
Django==3.2.3
django-cors-headers==3.5.0
djangorestframework==3.12.4
psycopg2==2.8.6
psycopg2-binary==2.8.6
PyJWT==1.7.1
pytz==2021.1
And here is my Dockerfile:
FROM python:3
ENV PYTHONUNBUFFERED 1
WORKDIR /app
COPY requirements.txt /app/requirements.txt
RUN pip install -r requirements.txt
COPY . /app
CMD python manage.py runserver 0.0.0.0:8000
And here is my docker-compose.yaml:
version: '3.8'
services:
  admin_db:
    container_name: django_admin_db
    image: postgres
    restart: always
    environment:
      - POSTGRES_DB=django_admin
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    volumes:
      - ./data/admin_db:/var/lib/postgresql/data
    ports:
      - 33066:3306
  admin_api:
    container_name: django_api
    build:
      context: .
      dockerfile: Dockerfile
    volumes:
      - .:/app
    ports:
      - '8000:8000'
    depends_on:
      - admin_db
Thanks for helping!
It is a common problem with PostgreSQL. What you need are the Linux dev packages.
Adding this to your Dockerfile should help:
RUN apt-get update \
    # dependencies for building Python packages
    && apt-get install -y build-essential \
    # psycopg2 dependencies
    && apt-get install -y libpq-dev
But did you install it inside the container?
Command to reinstall from requirements.txt:
docker exec admin_api pip install -r requirements.txt
Adding a line to requirements.txt doesn't install the package (unless you rebuild the image).
You can also speed it up by installing just the new package:
docker exec admin_api pip install <package name>
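Worth noting as a follow-up to the answer above: anything installed with docker exec lives only in the current container and disappears when it is recreated. Baking the dependency into the image is the durable fix (service name taken from the compose file above):

```shell
# Rebuild the image so requirements.txt is installed at build time,
# then recreate the service container from the new image
docker-compose build admin_api
docker-compose up -d admin_api
```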

How can I resolve this issue: ERROR: unsatisfiable constraints: postgresql-client (missing): required by: world[postgresql-client]?

Please, can someone help me?
I just started learning to build a Django REST API using Docker. I want to install postgresql-11-alpine, but I keep getting these error messages when I run docker-compose build:
WARNING: Ignoring http://dl-cdn.alpinelinux.org/alpine/v3.12/main/x86_64/APKINDEX.tar.gz: temporary error (try again later)
WARNING: Ignoring http://dl-cdn.alpinelinux.org/alpine/v3.12/community/x86_64/APKINDEX.tar.gz: temporary error (try again later)
ERROR: unsatisfiable constraints:
postgresql-client (missing):
required by: world[postgresql-client]
ERROR: Service 'app' failed to build: The command '/bin/sh -c apk add --update --no-cache postgresql-client' returned a non-zero code: 1
Below is my Dockerfile:
FROM python:3.7-alpine
MAINTAINER Tboost Technology
#set environment variable
ENV PYTHONUNBUFFERED 1
#install dependencies
COPY ./requirements.txt /requirements.txt
RUN apk add --update --no-cache postgresql-client
RUN apk add --update --no-cache --virtual .tmp-build-deps \
    gcc libc-dev python3-dev linux-headers postgresql-dev
RUN pip3 install -r /requirements.txt
RUN apk del .tmp-build-deps
#mkdir to store your apps source code
RUN mkdir /app
WORKDIR /app
COPY ./app /app
#create user to run apps in the docker
RUN adduser -D user
#switch to the user
USER user
Below is my docker-compose.yml file
version: "3"
services:
  app:
    build:
      context: .
    ports:
      - "8000:8000"
    extra_hosts:
      - "host.docker.internal:172.17.0.1"
    volumes:
      - ./app:/app
    command: >
      sh -c "python manage.py runserver 0.0.0.0:8000"
    environment:
      - DB_HOST=db
      - DB_NAME=app
      - DB_USER=postgres
      - DB_PASS=supersecretpassword
    depends_on:
      - db
  db:
    image: postgres:11-alpine
    restart: always
    environment:
      - POSTGRES_DB=app
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=supersecretpassword
My requirements.txt file:
Django>=3.0.7,<3.0.8
djangorestframework>=3.11.0,<3.12.0
psycopg2>=2.7.5,<2.8.0
flake8>=3.6.0,<3.7.0
Thanks.
I encountered the same issue and was able to solve it by:
Installing the postgresql-client; I used the link below:
Create a Postgresql Docker Container
Restarting the Docker service: sudo systemctl restart docker
Running the build command again: docker-compose build
In my case it was a network issue: the container did not have access to the internet.
It was solved by running the command:
docker build --network=host . -t recipe_app_api
and then using the recipe_app_api image inside docker-compose.yml.
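The workaround above can be sketched end-to-end (assuming docker-compose.yml is edited so the app service uses image: recipe_app_api instead of a build section):

```shell
# Build once with host networking so apk can reach the Alpine package mirrors,
# tagging the image with the name the compose file references
docker build --network=host -t recipe_app_api .
docker-compose up -d
```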

'docker-compose up' hangs when attaching files for running Django server in virtualization

I'm trying to run Django inside a Docker container and connect to it using VS Code Remote, but it keeps hanging at the attach step.
Here is my Dockerfile
FROM registry.gitlab.com/datadrivendiscovery/images/primitives:ubuntu-bionic-python36-v2020.1.9
ENV PYTHONPATH=$PYTHONPATH:/app
WORKDIR /app/
EXPOSE 8000
COPY requirements.txt /app/requirements.txt
WORKDIR /app
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
COPY . .
CMD ["sh", "-c", "python3 /app/manage.py runserver"]
My docker-compose.yml
version: '3'
services:
  web:
    build: .\
    volumes:
      - .:/app
    ports:
      - "8000:8000"
I don't have a database set up, so I don't have that in the yml.
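Not confirmed by an answer in the thread, but one thing worth checking with a setup like this: manage.py runserver binds to 127.0.0.1 by default, which is unreachable through the published port 8000. Binding the dev server to all interfaces in the CMD is the usual fix:

```shell
# In the Dockerfile CMD (or run manually in the container): bind to all
# interfaces so the host can reach the server through the published port
python3 /app/manage.py runserver 0.0.0.0:8000
```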