docker-compose not downloading additions to requirements.txt file - django

I have a Django project running in docker.
When I add some packages to my requirements.txt file, they don't get downloaded when I run docker-compose up.
Here are the relevant commands from my Dockerfile:
ADD ./evdc/requirements.txt /opt/evdc-venv/
ADD ./env-requirements.txt /opt/evdc-venv/
# Activate venv
RUN . /opt/evdc-venv/bin/activate && pip install -r /opt/evdc-venv/requirements.txt
RUN . /opt/evdc-venv/bin/activate && pip install -r /opt/evdc-venv/env-requirements.txt
It seems docker is using a cached version of my requirements.txt file, as when I shell into the container, the requirements.txt file in /opt/evdc-venv/requirements.txt does not include the new packages.
Is there some way I can delete this cached version of requirements.txt?
Dev OS: Windows 10
Docker: 17.03.0-ce
docker-compose: 1.11.2

docker-compose up doesn't build a new image unless you have a build section defined for the service in your docker-compose.yml and you pass it the --build flag. Without that, it reuses the existing image, including the cached layer with your old requirements.txt.
If your docker-compose.yml does not include a build section and you build your images with docker build ..., then after you recreate your image, a docker-compose up will recreate the impacted containers.
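For example, a minimal build section might look like this (a sketch; the service name web and the image tag are placeholders, and it assumes the Dockerfile sits next to docker-compose.yml):

```yaml
version: '3'
services:
  web:
    build: .          # directory containing the Dockerfile
    image: evdc-web   # optional tag for the built image
```

Then run docker-compose up --build (or docker-compose build --no-cache first if you suspect a stale cache layer). Because the ADD of requirements.txt invalidates the build cache whenever the file's content changes, the new packages will be installed on rebuild.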

Related

docker-compose doesn't mount volumes correctly for django container

I'm running Docker on Windows 10, with WSL 2 Ubuntu on top. I have the following Dockerfile:
FROM ubuntu
#base directory
ENV HOME /root
#subdirectory name for the REST project
ENV PROJECT_NAME django_project
#subdirectory name of the users app
ENV APP_NAME users
#set the working directory
WORKDIR $HOME
#install Python 3, the Django REST framework and the Cassandra Python driver
RUN apt-get update
RUN apt -y install python3-pip 2> /dev/null
RUN pip3 install djangorestframework
RUN pip3 install cassandra-driver
#initialize a blank project and create a folder called $PROJECT_NAME
#with manage.py in its root directory
RUN django-admin startproject $PROJECT_NAME .
#install an app in the project and create a folder named after it
RUN python3 manage.py startapp $APP_NAME
ENV CASSANDRA_SEEDS cas1
ENTRYPOINT ["python3","manage.py", "runserver", "0.0.0.0:8000"]
I build an image with docker build -t django-img . and then I have the following .yml:
version: '3'
services:
  django_c:
    container_name: django_c
    image: django-img
    environment:
      - CASSANDRA_SEEDS='cas1'
    ports:
      - '8000:8000'
    volumes:
      - /mnt/c/Users/claud/docker-env/django/django_project:/django_project
When I run docker-compose up -d inside the django-project folder (the .yml and Dockerfile are there), the container starts, but I can't see any of the container's files on the host. If I run ls in the container, however, all the files are there.
How am I supposed to edit the container files using an editor in my host?
p.s.: I've already tested the volume slashes ("/") with another container and they work fine since I'm using WSL.
ADDITION
Here is the content of my container folders using relative paths. I tried
volumes:
  - /mnt/c/Users/claud/docker-env/django/django_project:/root/django_project
but it still did not show the files on the host.
I think the issue is that your volume mount refers to the absolute path /django_project, but your Dockerfile specifies WORKDIR $HOME, which is /root. An additional clue is that you see your files when you run ls -la ./django_project in the container using a relative path.
I'll bet you can fix the problem by updating the django_c service definition in your docker-compose.yml to mount the volume at /root/django_project instead:
volumes:
  - /mnt/c/Users/claud/docker-env/django/django_project:/root/django_project

ModuleNotFoundError: No module named 'flask_sqlalchemy' error when running the Docker image

I am getting ModuleNotFoundError: No module named 'flask_sqlalchemy' when I try to docker run the image I built. The same Flask app runs fine from the terminal, but not in Docker. Why?
FROM python:3.6
ADD . /app
WORKDIR /app
RUN pip install flask gunicorn
EXPOSE 8000
CMD ["gunicorn", "-b", "0.0.0.0:8000", "app"]
Good practice is to save the installed pip packages to a file, e.g. requirements.txt (preferably with only the packages you actually need, e.g. when you are using pyenv).
This is done with pip freeze > requirements.txt (run on your local machine, in a terminal).
Then, all you should do is replace RUN pip install flask gunicorn with RUN pip install -r requirements.txt, and all the packages installed on your local machine will be installed in the Docker image too.
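Applied to the Dockerfile above, that looks like this (a sketch; it assumes requirements.txt lists flask, flask_sqlalchemy and gunicorn, and that the app module is app as in the original CMD):

```dockerfile
FROM python:3.6
WORKDIR /app
# copy requirements first so this layer stays cached
# until requirements.txt itself changes
ADD requirements.txt /app/
RUN pip install -r requirements.txt
ADD . /app
EXPOSE 8000
CMD ["gunicorn", "-b", "0.0.0.0:8000", "app"]
```

Copying requirements.txt before the rest of the source also means editing application code no longer invalidates the pip install layer.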

Multistage build with docker from multiple sources

I have a Django project with different applications, and I am trying to build a Docker image for every single app. In development, however, I want one Docker image to contain the whole project. I was using multi-stage builds before to handle dev dependencies. Is there any way to achieve the following in a different way?
FROM python:3.7-stretch AS base
RUN pip install -r /projectdir/requirements.txt
FROM base AS app1
RUN pip install -r /projectdir/app1/requirements.txt
FROM base AS app2
RUN pip install -r /projectdir/app2/requirements.txt
FROM app1, app2 AS dev
RUN pip install -r /projectdir/requirements_dev.txt
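For what it's worth, a Dockerfile stage can have only one base image, so FROM app1, app2 AS dev is not valid syntax. One possible workaround (a sketch, assuming every per-app dependency is a Python package that lands in site-packages) is to base the dev stage on one app stage and copy the other stage's installed packages over with COPY --from:

```dockerfile
FROM python:3.7-stretch AS base
RUN pip install -r /projectdir/requirements.txt

FROM base AS app1
RUN pip install -r /projectdir/app1/requirements.txt

FROM base AS app2
RUN pip install -r /projectdir/app2/requirements.txt

# a stage has exactly one base, so build dev on app1
# and pull in app2's installed packages
FROM app1 AS dev
COPY --from=app2 /usr/local/lib/python3.7/site-packages /usr/local/lib/python3.7/site-packages
RUN pip install -r /projectdir/requirements_dev.txt
```

Note that COPY --from merges files bluntly; if the two apps pin conflicting versions of the same package, whichever copy lands last wins, so this only works cleanly when the dependency sets are compatible.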

Docker ASPNET Core container with Python installed

I have an application that runs some processes and exposes them through WebAPI. Part of these processes needs to execute Python scripts through the IronPython library. For this to happen, though, Python 2.7 must also be installed on the system.
Has anyone solved this problem by figuring out how to install Python in the ASPNET Core Docker image (or by any other means)? The only other hack I can think of is putting the Python executable into a dependency directory for the API.
Our current Docker File contents:
FROM microsoft/aspnetcore:2.0
ARG source
WORKDIR /app
EXPOSE 80
COPY ${source:-obj/Docker/publish} .
ENTRYPOINT ["dotnet", "Api.dll"]
You can use the RUN instruction to install it in the image. Simply add the following to your Dockerfile.
The image I pulled from Docker Hub appears to use Debian Linux as its base OS, so the following should work. If your image uses another Linux distro as its base, try yum instead; for a Windows-based image, try Chocolatey.
FROM microsoft/aspnetcore:2.0
RUN apt-get update -y && apt-get install python2.7 -y
ARG source
WORKDIR /app
EXPOSE 80
COPY ${source:-obj/Docker/publish} .
ENTRYPOINT ["dotnet", "Api.dll"]
Now the python executable should be available in /usr/bin/python2.7

Django rest project dockerfile

I am absolutely new to docker. I have an existing Django Rest project whose structure looks like following:
My requirements.txt:
django==1.8.8
djangorestframework
markdown
django-filter
django-rest-auth
django-cors-headers
django-secure
django-sslserver
django-rest-auth[extras]
Normally I create a virtual env, activate it, and run pip install -r requirements.txt; additionally I need easy_install mysql-python to get started.
I want to dockerize this project. Can someone help me build a simple Dockerfile for this project?
As @DriesDeRydt suggests in his comment, the provided link contains a very simple example of a Dockerfile which installs requirements:
Add the following content to the Dockerfile.
FROM python:2.7
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
ADD requirements.txt /code/
RUN pip install -r requirements.txt
ADD . /code/
This Dockerfile starts with a Python 2.7 base image. The base image is
modified by adding a new code directory. The base image is further
modified by installing the Python requirements defined in the
requirements.txt file.
You can change the image to fit your needed Python version by pulling the corresponding Python image. For example:
change FROM python:2.7 to FROM python:3.5 or FROM python:latest
But as the above Dockerfile stands, and assuming that you place it inside the server folder, it will work for a test case.
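Since you also mentioned needing easy_install mysql-python, here is a sketch extending that Dockerfile (assuming the Debian-based python:2.7 image; libmysqlclient-dev provides the headers that the MySQL-python package compiles against):

```dockerfile
FROM python:2.7
ENV PYTHONUNBUFFERED 1
# system library that MySQL-python builds against
RUN apt-get update && apt-get install -y libmysqlclient-dev \
    && rm -rf /var/lib/apt/lists/*
RUN mkdir /code
WORKDIR /code
ADD requirements.txt /code/
RUN pip install -r requirements.txt MySQL-python
ADD . /code/
```

Alternatively, add MySQL-python as a line in requirements.txt itself so all Python dependencies live in one place.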
Here is the Dockerfile documentation for further reading.