I'm using Docker locally for Django development and trying to deploy to Heroku with Docker. But I'm getting complaints about "no web processes running", i.e. no dynos are spun up. So I'm missing some config, but I can find no mention of it in the Heroku docs or in the few tutorials out there.
Dockerfile:
FROM python:3.7-slim
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# Set work directory
WORKDIR /code
# Install dependencies
COPY Pipfile Pipfile.lock /code/
RUN pip install pipenv && pipenv install --system
# Copy project
COPY . /code/
heroku.yml
setup:
  addons:
    - plan: heroku-postgresql
build:
  docker:
    web: Dockerfile
run:
  web: python /code/manage.py runserver 0.0.0.0:$PORT
I suspect the issue is in the run section of heroku.yml, but I'm pretty stuck.
So I figured it out. Somehow, instead of the container stack set with heroku stack:set container, the heroku-18 stack was being used, which also automatically runs collectstatic. That was the issue.
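For anyone hitting the same thing, a rough sketch of checking and switching the stack (the app name is a placeholder):
# list the stack the app is currently on
heroku stack -a <your-app-name>
# switch to the container stack so heroku.yml is used for the build
heroku stack:set container -a <your-app-name>
# redeploy after changing the stack (use whatever branch you deploy from)
git push heroku main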
Related
I am trying to develop a web application using Django in Docker.
I made a Dockerfile and a docker-compose.yml file as below to check the code in the development environment.
Even though I'm using the runserver command in the file, when I start the container and access localhost, the CSS for the admin page isn't loading.
How should I change the code to make it work?
Here is the code:
Dockerfile
FROM python:3.8.3-alpine
WORKDIR /usr/src/app
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
RUN pip install --upgrade pip
COPY ./requirements.txt .
RUN pip install -r requirements.txt
COPY . .
docker-compose.yml
version: '3.7'
services:
  web:
    build: ./django_project
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./django_project/:/usr/src/app/
    ports:
      - 8000:8000
docker desktop: 4.1.1
Python: 3.8
Django: 3.0
I am trying to upload a Django app to Docker Hub. On the local machine (Ubuntu 18.04) everything works fine, but the Docker Hub build fails because the requirements.txt file cannot be found.
Local machine:
sudo docker-compose build --no-cache
Result (it's okay):
Step 5/7 : COPY . .
---> 5542d55caeae
Step 6/7 : RUN file="$(ls -1 )" && echo $file
---> Running in b85a55aa2640
Dockerfile db.sqlite3 hello_django manage.py requirements.txt venv
Removing intermediate container b85a55aa2640
---> 532e91546d41
Step 7/7 : RUN pip install -r requirements.txt
---> Running in e940ebf96023
Collecting Django==3.2.2....
But, Docker Hub:
Step 5/7 : COPY . .
---> 852fa937cb0a
Step 6/7 : RUN file="$(ls -1 )" && echo $file
---> Running in 281d9580d608
README.md app config docker-compose.yml
Removing intermediate container 281d9580d608
---> 99eaafb1a55d
Step 7/7 : RUN pip install -r requirements.txt
---> Running in d0e180d83772
ERROR: Could not open requirements file: [Errno 2] No such file or directory: 'requirements.txt'
Removing intermediate container d0e180d83772
The command '/bin/sh -c pip install -r requirements.txt' returned a non-zero code: 1
app/Dockerfile
FROM python:3.8.3-alpine
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
WORKDIR /code
COPY . .
RUN file="$(ls -1 )" && echo $file
RUN pip install -r requirements.txt
docker-compose.yml
version: '3'
services:
  web:
    build:
      context: app
      dockerfile: Dockerfile
    volumes:
      - ./app/:/code/
    ports:
      - "8000:8000"
    env_file:
      - ./config/.env.dev
    command: python manage.py runserver 0.0.0.0:8000
Project Structure:
UPDATE:
Docker Hub is building from GitHub.
The requirements.txt file is in the GitHub repository (in the app folder), but for some reason during the build Docker Hub copies files from the project root folder and not the contents of the app folder.
GitHub:
https://github.com/sigalglebru/django-on-docker
The problem is that you need to tell Docker Hub where to find your build context.
When you run docker-compose build locally, docker-compose reads your docker-compose.yml file and knows to build inside the app directory, because you've explicitly set the build context:
build:
  context: app
  dockerfile: Dockerfile
When you build on Docker Hub, by default it will assume the build context is the top level of your repository. If you set the path to your Dockerfile to, e.g., app/Dockerfile, this is equivalent to running:
docker build -f app/Dockerfile .
If you try that, you'll see it fail the same way. Rather than setting the path to the Dockerfile, you need to set the path to the build context to the app directory. For example:
(Look at the "Build Context" column).
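If it helps to see the CLI equivalent, building with the context set to the app directory looks roughly like this (a sketch, run from the repository root):
# context = app/, so Docker finds app/Dockerfile and app/requirements.txt
docker build app
# or, with the Dockerfile path spelled out explicitly:
docker build -f app/Dockerfile app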
When configured correctly, your repository builds on Docker Hub without errors.
Thank you, I found a solution:
I just copied the files from ./app in the Dockerfile and changed the context slightly, but I still don't understand why it worked fine on the local machine.
Dockerfile:
FROM python:3.8.3-alpine
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
WORKDIR /code
COPY ./app .
RUN pip install -r requirements.txt
docker-compose.yml
version: "3.6"
services:
python:
restart: always
build:
context: .
dockerfile: docker/Dockerfile
expose:
- 8000
ports:
- 8000:8000
command: "python manage.py runserver 0.0.0.0:8000"
I am a newbie to Docker. I have created one Django project and can run it in Docker. However, I have started a second project and have encountered a problem.
I created a virtual env and entered it
pipenv install django~=3.1.0 && pipenv shell
I created a Django project
django-admin startproject config .
I ran it within the virtualenv
python manage.py runserver
and could see the Django spaceship
I then exited the virtualenv and created a Dockerfile
Dockerfile
# Pull base image
FROM python:3.8
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# Set work directory
WORKDIR /code
# Install dependencies
COPY Pipfile Pipfile.lock /code/
RUN pip install pipenv && pipenv install --system
# Copy project
COPY . /code/
I ran
docker build .
and it reported a successful build
I created a docker-compose.yml file
docker-compose.yml
version: '3.8'
services:
  web:
    build: .
    command: python /code/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - 8000:8000
When I run
docker-compose up
it complains
ImportError: Couldn't import Django. Are you sure it's installed and available on your PYTHONPATH environment variable? Did you forget to activate a virtual environment?
I have read in the comments to this question that virtual envs should not be used in Dockerfiles, so I replaced
RUN pip install pipenv && pipenv install --system
with
RUN pip install django~=3.1.0
but I still get the same error.
What is wrong?
Have you tried installing your list of requirements from a separate file, something like this?
COPY requirements.txt /code/requirements.txt
WORKDIR /code
RUN pip install -r requirements.txt
Once you have it installed, you can run docker-compose run web /bin/sh to start a shell and then use django-admin startproject to create a Django project (a rough sketch of those two commands appears after the Dockerfile below). You may need to change the path in the docker-compose file so that it reflects where your manage.py file ended up (I moved mine to the root). I was able to get it working with the following:
requirements.txt
django==3.1.0
docker-compose.yml
version: '3.8'
services:
  web:
    build:
      context: .
      dockerfile: Dockerfile
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - 8000:8000
Dockerfile
# Pull base image
FROM python:3.8
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# Set work directory
WORKDIR /code
# Install dependencies
COPY requirements.txt /code/requirements.txt
RUN pip install -r requirements.txt
# Copy project
COPY . /code/
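As mentioned above, once the image builds, a rough sketch of creating the project from inside the container (assuming the service is named web, as in the compose file):
# start a one-off shell in the web service
docker-compose run web /bin/sh
# then, inside that shell, create the Django project in the current directory
django-admin startproject config .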
File tree looks like this:
I am new to Django and Docker and I have a problem accessing the site at localhost:8000.
I built a Django app and it is working on my local server, but I'd like to dockerize it. So I created two files:
Dockerfile :
FROM python:3.6.7-alpine
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
ADD requirements.txt /code/
RUN pip install -r requirements.txt
ADD ./ /code/
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
and docker-compose.yml
version: '3'
services:
  web:
    build: .
    command: python mysite/manage.py runserver 8000
    ports:
      - "8000:8000"
My next steps:
docker build --tag django_docker:latest .
and:
docker run django_docker
The server starts, but when I try to open localhost:8000 in my browser I can't, because of "Unable to connect".
Where is my mistake?
More about the Django app: it's the Learning_log project from the book Python Crash Course. I'd like to build an image and push it to Docker Hub, but I am stuck. Thanks for the help!
You are using a docker-compose.yml file, therefore you need to use the docker-compose command to run it:
docker-compose up
That's all you need, and you can read more about it in the official docs.
To run it without using docker-compose, your docker command needs to be:
docker run --publish 8000:8000 django_docker
If you want to restrict the site to be available only on your localhost, then bind to 127.0.0.1:
docker run --publish 127.0.0.1:8000:8000 django_docker
Try this.
Update the Dockerfile:
# Pull base image
FROM python:3.7
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# Set work directory
WORKDIR /code
# Install dependencies
COPY Pipfile Pipfile.lock /code/
RUN pip install pipenv && pipenv install --system
# Copy project
COPY . /code/
Update docker-compose.yml:
version: '3.7'
services:
  web:
    build: .
    command: python /code/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - 8000:8000
After updating, just run one command in the terminal:
docker-compose up -d --build
To stop it, use:
docker-compose down
When I was trying to dockerize my Django app, I followed a tutorial telling me to structure my Dockerfile like this:
FROM python:3.6
ENV PYTHONUNBUFFERED 1
COPY . /code/
WORKDIR /code/
RUN pip install pipenv
RUN pipenv install --system
EXPOSE 8000
After I saved that and ran docker build .
the system threw this error:
Warning: --system is intended to be used for pre-existing Pipfile
installation,not installation of specific packages. Aborting.
I think it is complaining about the --system flag above, but the tutorial says it's crucial to have it so that my packages are applied to the entire Docker container. I'm new to Docker and even pipenv, because I took over a previous person's code and am not sure where their Pipfile is, or even if they have one. If you have any insights on how to fix this error, thank you in advance.
pipenv --rm
This helped me! I was starting the "Django for Beginners" book and, at the very beginning, got this error (I had accidentally deleted Pipfile & Pipfile.lock).
Your warning is telling you that there is no Pipfile in your project dir.
--system is intended to be used with a pre-existing Pipfile.
So before running
docker build .
run
pipenv install
in your project folder
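In other words, roughly this sequence on the host (assuming pipenv is installed locally):
# in the project folder, generate Pipfile and Pipfile.lock first
pipenv install
# now the image build can find them, and pipenv install --system succeeds
docker build .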
The above solution didn't work for me.
After installing in the virtual env, I also had to explicitly copy Pipfile and Pipfile.lock in my Dockerfile:
COPY Pipfile* .
# Install dependencies
RUN pip install pipenv && pipenv install --system
Then rebuild with docker compose:
docker-compose build
You can find more info in this thread
I hit this error in pipenv:
ERROR: --system is intended to be used for pre-existing Pipfile installation, not installation of specific packages. Aborting.
Try:
pipenv check or python3 -m pipenv check
Be careful when using Docker bind mounts!
Summary: In my case, I was using bind mounts in my dev environment, and mounting a Docker bind mount on a non-empty directory hides the contents of the container's directory. That hid the Pipfile and Pipfile.lock, which produced the error mentioned above when running the container.
Explanation
Directory structure on the host
> ls project/
docker-compose.yml Dockerfile Pipfile Pipfile.lock app/
Dockerfile
My Dockerfile would copy the contents of the project and then install the dependencies with pipenv, like this:
FROM python:3.8
# ...
COPY Pipfile Pipfile.lock /app/
RUN pipenv install --deploy --ignore-pipfile
COPY ./app /app/
CMD ["pipenv", "run", "uvicorn", "etc..", "--reload"]
Pipfile, Pipfile.lock and the code of ./app would all be in the same /app directory inside the container.
docker-compose.yml
I wanted uvicorn to hot-reload, so I mounted the code in /app inside the container's /app directory.
services:
  app:
    #...
    volumes:
      - type: bind
        source: ./app
        target: /app
This meant that when I changed the code in /app, the code in the container's /app directory would also change.
Effects
The side effect of this bind mount is that the content mounted on /app "obscured" the content previously copied in there.
Container's content with the bind mount:
> ls app/
code1.py code2.py
Container's content without the bind mount:
> ls app/
Pipfile Pipfile.lock code1.py code2.py
Solution
Either make sure that you include the Pipfile and Pipfile.lock as well when mounting the bind mount, or make sure that you COPY these 2 files to a directory that won't get overwritten by a bind mount.
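For example, a rough sketch of the first option, mounting the two files alongside the code so the bind mount does not hide them (paths assume the layout shown above):
services:
  app:
    #...
    volumes:
      - type: bind
        source: ./app
        target: /app
      # also mount the dependency files so they stay visible inside /app
      - ./Pipfile:/app/Pipfile
      - ./Pipfile.lock:/app/Pipfile.lock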