Dockerfile with copy command and relative path - unit-testing

Is there a way to use the COPY command with a relative path in a Dockerfile? I'm trying to use:
COPY ./../folder/*.csproj ./
Note: my folder structure (I'm running the Dockerfile in project-test and the other files are in the project-console folder) is:
|- project-console
|- project-test
And I receive the following error:
ERROR: service 'app' failed to build: COPY failed: no source files were specified.
My goal is to have two projects in the same Docker image: a .NET Core console project and another with unit tests (NUnit). I'm trying to run the unit tests in Docker.
UPDATE
It is possible to use a multi-stage build: https://docs.docker.com/develop/develop-images/multistage-build/
Or to use docker-compose with a build section: https://docs.docker.com/engine/reference/commandline/build/
Like:
services:
  worker:
    build:
      context: ./workers
      dockerfile: Dockerfile
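For the multi-stage option, a minimal sketch of a Dockerfile could look like the following, assuming the build context is the parent folder and using illustrative .NET Core image tags and project file names (not taken from the question):

FROM mcr.microsoft.com/dotnet/core/sdk:3.1 AS build
WORKDIR /src
# Both projects are visible because the build context is the parent folder
COPY project-console/ ./project-console/
COPY project-test/ ./project-test/
RUN dotnet restore project-test/project-test.csproj
# Run the NUnit tests in the build stage (project file names are illustrative)
RUN dotnet test project-test/project-test.csproj
RUN dotnet publish project-console/project-console.csproj -c Release -o /app

FROM mcr.microsoft.com/dotnet/core/runtime:3.1 AS final
WORKDIR /app
COPY --from=build /app .
ENTRYPOINT ["dotnet", "project-console.dll"]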

Reference: Allow Dockerfile from outside build-context
You can try this way
$ cd project-console
$ docker build -f ../project-test/Dockerfile .
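With that command the build context is project-console itself, so COPY paths in project-test/Dockerfile resolve relative to project-console and no ../ is needed. A hedged sketch of the relevant line:

# Context is project-console, so the csproj files sit at the root of the context
COPY *.csproj ./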
Update:
By using docker-compose
build:
  context: ../
  dockerfile: project-test/Dockerfile
../ will be set as the context; it should include both project-console and project-test in your case, so you can COPY project-console/*.csproj in the Dockerfile.
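With the parent folder as context, a hedged sketch of the corresponding COPY lines in the Dockerfile (folder names follow the structure above) would be:

# Build context is the parent folder, so both projects are reachable
COPY project-console/*.csproj ./project-console/
COPY project-test/*.csproj ./project-test/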

Related

Docker: Cannot COPY from parent directory while building the image [duplicate]

I am trying to use docker-compose up -d to deploy my Django application. The problem is that my Dockerfile and docker-compose.yml are in one directory, but they need access to a requirements.txt in the parent directory.
Minimal Example:
File structure:
requirements.txt (file)
docker (directory)
docker/Dockerfile (file)
docker/docker-compose.yml (file)
Dockerfile:
FROM python:3.10-slim
COPY ./../requirements.txt /requirements.txt
docker-compose.yml:
version: '3'
services:
  django:
    container_name: django_123
    build:
      context: ./..
      dockerfile: ./docker/Dockerfile
    expose:
      - "8000"
The setup works on Docker Desktop 4 on Windows 10, but not on Ubuntu 22.
I get the error:
Step 1/2 : COPY ./../requirements.txt /requirements.txt
COPY failed: forbidden path outside the build context: ../requirements.txt ()
ERROR: Service 'django' failed to build : Build failed
I already read that I should build the image from the parent directory, but I get the same error message if I use docker build -f ../Dockerfile ..
What could be the problem? And why does it work on Windows and not on Ubuntu?
For security reasons, you can only copy from the directory set as the "build context" and below. So doing COPY ./../requirements.txt is not allowed, since that would copy from a directory above the build context.
Since you've set your build context to the parent directory, you should be able to get at the requirements file. You just need to specify your host file paths as originating from the build context, like this:
COPY ./requirements.txt /requirements.txt
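Putting it together with the compose file from the question (context: ./.., dockerfile: ./docker/Dockerfile), a minimal sketch of the corrected Dockerfile might be (the pip install line is illustrative, not from the question):

FROM python:3.10-slim
# Paths are resolved against the build context (the parent directory),
# not against the directory containing the Dockerfile
COPY ./requirements.txt /requirements.txt
RUN pip install -r /requirements.txt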

Getting "Error processing tar file(exit status 1): open /myenv/include/python3.6m/Python-ast.h: no such file or directory" while docker-compose build

So I am pretty new to Docker and Django. Unfortunately, while running the command below on my Linux machine (which I connect to from my physical Windows machine using PuTTY):
docker-compose build
I am getting an error:
Error processing tar file(exit status 1): open /myenv/include/python3.6m/Python-ast.h: no such file or directory
'myenv' is the environment I have created inside my folder.
I am getting a container started on port 9000. The app doesn't have anything yet, just a simple project, so I only expect to see the 'congratulations' screen. I don't know where I am going wrong. My final goal is to open the Docker URL in my Windows browser and see the container's output.
This is my docker-compose.yml file:
version: '3'
services:
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:9000
    ports:
      - 202.179.92.106:8000:9000
The IP 202.179.92.106 is my public IP. I did the above binding so I can access the Docker container from my Windows machine. I would also appreciate feedback on whether that port binding is correct.
Below is my Dockerfile:
FROM python:3.6.9
RUN mkdir djangotest
WORKDIR djangotest
ADD . /djangotest
RUN pip install -r requirements.txt
Please help me out peeps!
If you have a virtual environment in your normal development tree, you can't copy it into a Docker image. You can exclude this from the build sequence by mentioning it in a .dockerignore file:
# .dockerignore
myenv
Within the Dockerfile, the RUN pip install line will install your application's dependencies into the Docker image, so you should have a complete self-contained image.
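As a sketch, the .dockerignore can also exclude other local artifacts that shouldn't end up in the image (the entries beyond myenv are just common examples, not taken from the question):

# .dockerignore
myenv
__pycache__
*.pyc
.git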

Building on Docker Hub ignores context path when copying files

I am trying to upload a Django app to Docker Hub. On the local machine (Ubuntu 18.04) everything works fine, but on Docker Hub the requirements.txt file cannot be found.
Local machine:
sudo docker-compose build --no-cache
Result (it's okay):
Step 5/7 : COPY . .
---> 5542d55caeae
Step 6/7 : RUN file="$(ls -1 )" && echo $file
---> Running in b85a55aa2640
Dockerfile db.sqlite3 hello_django manage.py requirements.txt venv
Removing intermediate container b85a55aa2640
---> 532e91546d41
Step 7/7 : RUN pip install -r requirements.txt
---> Running in e940ebf96023
Collecting Django==3.2.2....
But, Docker Hub:
Step 5/7 : COPY . .
---> 852fa937cb0a
Step 6/7 : RUN file="$(ls -1 )" && echo $file
---> Running in 281d9580d608
README.md app config docker-compose.yml
Removing intermediate container 281d9580d608
---> 99eaafb1a55d
Step 7/7 : RUN pip install -r requirements.txt
---> Running in d0e180d83772
[91mERROR: Could not open requirements file: [Errno 2] No such file or directory: 'requirements.txt'
Removing intermediate container d0e180d83772
The command '/bin/sh -c pip install -r requirements.txt' returned a non-zero code: 1
app/Dockerfile
FROM python:3.8.3-alpine
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
WORKDIR /code
COPY . .
RUN file="$(ls -1 )" && echo $file
RUN pip install -r requirements.txt
docker-compose.yml
version: '3'
services:
  web:
    build:
      context: app
      dockerfile: Dockerfile
    volumes:
      - ./app/:/code/
    ports:
      - "8000:8000"
    env_file:
      - ./config/.env.dev
    command: python manage.py runserver 0.0.0.0:8000
Project Structure:
UPDATE:
Docker Hub is building from GitHub.
File requirements.txt is in the GitHub repository (app folder), but for some reason during build Docker Hub copies files from the project root folder and not the contents of the app folder.
Github:
https://github.com/sigalglebru/django-on-docker
The problem is that you need to tell Docker Hub where to find your build context.
When you run docker-compose build locally, docker-compose reads your docker-compose.yml file and knows to build inside the app directory, because you've explicitly set the build context:
build:
  context: app
  dockerfile: Dockerfile
When you build on Docker Hub, by default it will assume the build context is the top level of your repository. If you set the path to your Dockerfile to, e.g., app/Dockerfile, this is equivalent to running:
docker build -f app/Dockerfile .
If you try that, you'll see it fail the same way. Rather than setting the path to the Dockerfile, you need to set the build context to the app directory in the Docker Hub build settings (look at the "Build Context" column). When configured correctly, your repository builds on Docker Hub without errors.
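For comparison, the command-line equivalent of pointing the build context at the app directory would be roughly:

docker build app
# or, being explicit about the Dockerfile inside that context:
docker build -f app/Dockerfile app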
Thank you, I found a solution:
I just copied files from ./app to the mounted volume and changed the context a little, but I still don't understand why it worked fine on the local machine.
Dockerfile:
FROM python:3.8.3-alpine
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
WORKDIR /code
COPY ./app .
RUN pip install -r requirements.txt
docker-compose.yml
version: "3.6"
services:
  python:
    restart: always
    build:
      context: .
      dockerfile: docker/Dockerfile
    expose:
      - 8000
    ports:
      - 8000:8000
    command: "python manage.py runserver 0.0.0.0:8000"

Docker: ERROR: for web Cannot start service web: OCI runtime create failed

I'm getting an error when building with sudo docker-compose up. The image builds well, but when running the container it throws the following two errors:
ERROR: for b21bd1503fed_django-docker-boilerplate_web_1 Cannot start service web: OCI runtime create failed: container_linux.go:370: starting container process caused: process_linux.go:459: container init caused: write sysctl key net.ipv4.ip_unprivileged_port_start: open /proc/sys/net/ipv4/ip_unprivileged_port_start: no such file or directory: unknown
ERROR: for web Cannot start service web: OCI runtime create failed: container_linux.go:370: starting container process caused: process_linux.go:459: container init caused: write sysctl key net.ipv4.ip_unprivileged_port_start: open /proc/sys/net/ipv4/ip_unprivileged_port_start: no such file or directory: unknown
I'm running Docker on macOS Catalina; the versions of Docker and Docker Compose are 20.10.5 (build 55c4c88) and 1.28.5 (build c4eb3a1f), respectively.
My configuration files are the following:
directories
.
|-- Dockerfile
|-- README.md
|-- docker-compose.yml
`-- requirements.txt
Dockerfile
FROM python:3.10.0a6-slim-buster
WORKDIR .
ENV PYTHONUNBUFFERED=1
# Copy file into the working directory
COPY requirements.txt .
RUN pip install -r requirements.txt
# Copy source code into the image
COPY . .
docker-compose.yml
version: "3.9"
services:
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .
    ports:
      - "8000:8000"
    env_file:
      - ./.env
Any idea what's going on wrong here?

Docker compose with multiple Dockerfiles and images (Django+Angular)

I'm building an app that uses Django and Angular and is split into two different repositories and Docker images.
My file structure is:
.
|-- docker-compose.yml
|-- djangoapp/
|   |-- Dockerfile
|   |-- manage.py
|   `-- ...
`-- angularapp/
    |-- Dockerfile
    `-- ...
I've been unable to get it to work properly; all the documentation I found on this seems to expect the docker-compose.yml file to be alongside the Dockerfile.
I've tried multiple different variations but my current (not working) docker-compose.yml file looks like this:
version: '3'
services:
  web:
    build: ./djangoapp
    command: python manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"
This gives me error can't open file 'manage.py': [Errno 2] No such file or directory.
If I go into the djangoapp/ directory and create a docker-compose.yml file there according to the official Docker docs, it works fine. So there's nothing wrong with the actual build; the issue is accessing it from outside, like I'm trying to do.
Update: I also decided to add my Dockerfile located at ./djangoapp/Dockerfile.
FROM python:3
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
ADD requirements.txt /code/
RUN pip install -r requirements.txt
ADD . /code/
From what I can see, it seems just to be a simple typo:
version: '3'
services:
  web:
    build: ./djangoapp
    command: python manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"
That ./ is all you're missing, I think. I'm pretty sure that without it the Compose file doesn't go into the directory specified.
Update
Now that I've seen your Dockerfile, I've noticed that you haven't added manage.py to your container. You need to do with manage.py the same thing you did with requirements.txt. When a Docker image is built, it only contains what you give it plus whatever file structure comes from the base image.
You can either add it in the Dockerfile or you can have a shared volume between a local directory and a container directory. I think adding it in the Dockerfile would be easiest though.
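If you go the shared-volume route instead, a hedged sketch of what it could look like in the compose file (paths follow the structure from the question) is:

version: '3'
services:
  web:
    build: ./djangoapp
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      # Mount the local project into the image's WORKDIR (/code in the Dockerfile above)
      - ./djangoapp:/code
    ports:
      - "8000:8000"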
This issue was solved by rebuilding my Docker image with the docker-compose build web command. It seems that at the stage when I was getting the error, the build didn't include the files needed. So nothing is wrong in the code I show above in the question; it just needed to be rebuilt.