apt-get not found in Docker - dockerfile

I've got this Dockerfile:
FROM python:3.6-alpine
FROM ubuntu
FROM alpine
RUN apk update && \
apk add --virtual build-deps gcc python-dev musl-dev
RUN apt-get update && apt-get install -y python-pip
WORKDIR /app
ADD . /app
RUN pip install -r requirements.txt
EXPOSE 5000
CMD ["python", "main.py"]
and it's throwing error saying /bin/sh: apt-get: not found.
I thought apt-get is part of the Ubuntu image that I'm pulling in on the second line, but it's still giving me this error.
How can I fix this?

As tkausl said: you can only use one base image (one FROM).
Alpine's package manager is apk, not apt-get, so you have to use apk to install packages. However, pip is already available in the python:3.6-alpine image.
This Dockerfile should work:
FROM python:3.6-alpine
RUN apk update && \
apk add --virtual build-deps gcc python-dev musl-dev
WORKDIR /app
ADD . /app
RUN pip install -r requirements.txt
EXPOSE 5000
CMD ["python", "main.py"]

apt-get does not work because the active Linux distribution is Alpine, which does not have the apt-get command.
You can fix it by using the apk command instead.
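For example, the apt-get line from the question would translate to something like this on Alpine (a sketch; on python:3.6-alpine pip is already installed, so the pip package itself is usually unnecessary, and the exact package name varies by Alpine release):
# Debian/Ubuntu: RUN apt-get update && apt-get install -y python-pip
# Alpine equivalent (py-pip / py2-pip / py3-pip depending on the release):
RUN apk add --no-cache py-pip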

Most probably the image you're using is Alpine, so you can't use apt-get, which is Ubuntu's package manager.
You can use
apk update and apk add
instead.

Multiple FROM lines can be used in a single Dockerfile.
See discussion and Multi stage tutorial
Using Python Alpine, plus Ubuntu, plus Alpine is probably redundant. The Python Alpine image alone should be sufficient, as it uses Alpine internally.
I had a similar issue not with apk but with apt-get.
FROM node:14
FROM jekyll/jekyll
RUN apt-get update
RUN apt-get install -y \
sqlite
Error:
/bin/sh: apt-get: not found
If I change the order, then it works.
FROM node:14
RUN apt-get update
RUN apt-get install -y \
sqlite
FROM jekyll/jekyll
Note that, as in the first link I added above, multiple FROMs might be removed from Docker as a feature.
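For completeness, multiple FROM lines are meant for multi-stage builds, where a later stage copies artifacts out of an earlier one rather than inheriting its packages. A minimal sketch (image names and paths are only illustrative):
FROM node:14 AS builder
WORKDIR /src
COPY . .
RUN npm ci && npm run build

FROM nginx:alpine
# only the built artifacts are copied into the final image;
# nothing installed in the builder stage carries over
COPY --from=builder /src/dist /usr/share/nginx/html
This is also why a package manager from an earlier stage (apt-get in Ubuntu) is not available in a later Alpine stage.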

Related

Add different programmes, python code, other dockerfiles to one single dockerfile

I want to create a Dockerfile out of several different programs, including other Dockerfiles and Python code.
It should contain the following:
connectome-workbench
FSL
FreeSurfer
Multimodal Surface Matching (MSM)
Ciftify
fmriprep-docker (also a dockerfile)
mcflirt
Different Python Scripts
Can I add every single dependency as follows?
FROM ubuntu
USER root
RUN apt-get update && \
apt-get install -y curl gnupg gnupg1 gnupg2 python3-pip
RUN apt-get update && \
apt-get install -y git-annex && \
pip install ciftify
# Get the others like that as well
CMD python /app/app.py
Cheers

Docker, WSL2 & vs code - bad git/ssh path

I set up my WSL2, Docker and vs code environment this weekend.
I am finding an issue when attempting to use git:
root@bb7f765df0d6:/var/www/html# git clone git@github.com:hsimah/my-repo.git
Cloning into 'my-repo'...
fatal: cannot run C:/Windows/System32/OpenSSH/ssh.exe: No such file or directory
fatal: unable to fork
Dockerfile:
FROM wordpress:latest
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update \
&& apt-get -y install --no-install-recommends apt-utils dialog 2>&1 \
&& apt-get -y install git \
&& apt-get autoremove -y \
&& apt-get clean -y \
&& rm -rf /var/lib/apt/lists/*
ENV DEBIAN_FRONTEND=dialog
If I remove the git install from the Dockerfile and instead run apt-get update && apt-get install git inside the container, there is no issue. In that case git uses my host SSH keys (loaded via the ssh-agent service on Windows) and can pull and push from both the terminal and VS Code itself.
There are no errors or warnings in the log.
Okay, I posted a few minutes too soon.
I checked the git config and found that VS Code was pulling my Windows config into the workspace; it's a known issue.
The unblocking answer is to change this setting to your ssh location inside the container (/usr/bin/ssh):
core.sshcommand=C:/Windows/System32/OpenSSH/ssh.exe
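A minimal way to apply that inside the container, assuming git 2.10+ (adjust the scope if you prefer a per-repository setting):
git config --global core.sshCommand /usr/bin/ssh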

Error of "Command 'pip' not found" when trying to install requirements.txt

I'm trying to do: pip install -r requirements.txt on an AWS server. I recently pulled a git commit.
I get this error:
Command 'pip' not found, but can be installed with:
sudo apt install python-pip
So I tried entering:
sudo apt install python-pip install -r requirements.txt
and then
sudo apt install python-pip -r requirements.txt
But both attempts gave me this error:
E: Command line option 'r' [from -r] is not understood in combination with the other options.
What is the correct command to install this? Thank you.
You are mixing multiple commands.
apt: Debian's package manager. It has nothing to do with Python packages, but it is one way to install pip itself (there are other ways as well).
pip: Python's package manager. You install the dependencies for your project by listing them in requirements.txt.
The correct way would be:
sudo apt install python-pip
# install from the file requirements.txt:
sudo pip install -r requirements.txt
# or install/upgrade them as a (non-root) user:
pip install -U -r requirements.txt

gcc error while building docker image for django on windows

I am trying to build a Docker image using Visual Studio Code, following this tutorial: https://code.visualstudio.com/docs/python/tutorial-deploy-containers.
I created a Django app with a connection to an MS SQL Server on Azure using the package pyodbc.
During the build of the Docker image I receive the following error messages:
unable to execute 'gcc': No such file or directory
error: command 'gcc' failed with exit status 1
----------------------------------------
Failed building wheel for pyodbc
and
unable to execute 'gcc': No such file or directory
error: command 'gcc' failed with exit status 1
----------------------------------------
Failed building wheel for typed-ast
I read solutions for Linux systems where one should install python-dev, but since I am working on a Windows machine this is not a solution for me.
Then I read that on Windows all the needed files are in the 'include' directory of the Python installation. But in a venv installation this directory is empty... so I created a directory junction to the original 'include'. The error still occurs.
My Dockerfile is included below.
# Python support can be specified down to the minor or micro version
# (e.g. 3.6 or 3.6.3).
# OS Support also exists for jessie & stretch (slim and full).
# See https://hub.docker.com/r/library/python/ for all supported Python
# tags from Docker Hub.
FROM tiangolo/uwsgi-nginx:python3.6-alpine3.7
# Indicate where uwsgi.ini lives
ENV UWSGI_INI uwsgi.ini
# Tell nginx where static files live (as typically collected using Django's
# collectstatic command).
ENV STATIC_URL /app/static_collected
# Copy the app files to a folder and run it from there
WORKDIR /app
ADD . /app
# Make app folder writable for the sake of db.sqlite3, and make that file also writable.
# RUN chmod g+w /app
# RUN chmod g+w /app/db.sqlite3
# If you prefer miniconda:
#FROM continuumio/miniconda3
LABEL Name=hello_django Version=0.0.1
EXPOSE 8000
# Using pip:
RUN python3 -m pip install -r requirements.txt
CMD ["python3", "-m", "hello_django"]
# Using pipenv:
#RUN python3 -m pip install pipenv
#RUN pipenv install --ignore-pipfile
#CMD ["pipenv", "run", "python3", "-m", "hello_django"]
# Using miniconda (make sure to replace 'myenv' w/ your environment name):
#RUN conda env create -f environment.yml
#CMD /bin/bash -c "source activate myenv && python3 -m hello_django"
I could use some help in building the image without the errors.
Based on the answer of 2ps, I added these lines almost at the top of the Dockerfile:
FROM tiangolo/uwsgi-nginx:python3.6-alpine3.7
RUN apk update \
&& apk add apk add gcc libc-dev g++ \
&& apk add libffi-dev libxml2 libffi-dev \
&& apk add unixodbc-dev mariadb-dev python3-dev
and received a new error...
fetch http://dl-cdn.alpinelinux.org/alpine/v3.7/main/x86_64/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/v3.7/community/x86_64/APKINDEX.tar.gz
v3.7.1-98-g2f2e944c59 [http://dl-cdn.alpinelinux.org/alpine/v3.7/main]
v3.7.1-105-g7db92f4321 [http://dl-cdn.alpinelinux.org/alpine/v3.7/community]
OK: 9053 distinct packages available
ERROR: unsatisfiable constraints:
add (missing):
required by: world[add]
apk (missing):
required by: world[apk]
The command '/bin/sh -c apk update && apk add apk add gcc libc-dev g++ && apk add libffi-dev libxml2 libffi-dev && apk add unixodbc-dev mariadb-dev python3-dev' returned a non-zero code: 2
I found out that adding
RUN echo "ipv6" >> /etc/modules
helped with the errors above. Taken from: https://github.com/gliderlabs/docker-alpine/issues/55
The app now works, except that the intended connection to the MS SQL database still does not work.
Error at /
('01000', "[01000] [unixODBC][Driver Manager]Can't open lib 'ODBC Driver 13 for SQL Server' : file not found (0) (SQLDriverConnect)")
I think I should get my hands dirty with some Docker documentation.
I gave up on the solution with Alpine and switched to Debian:
FROM python:3.7
# needed files for pyodbc
RUN apt-get update
RUN apt-get install gcc libc-dev g++ libffi-dev libxml2 libffi-dev unixodbc-dev -y
# MS SQL driver 17 for debian
RUN apt-get install apt-transport-https \
&& curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add - \
&& curl https://packages.microsoft.com/config/debian/9/prod.list > /etc/apt/sources.list.d/mssql-release.list \
&& apt-get update \
&& ACCEPT_EULA=Y apt-get install msodbcsql17 -y
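One likely cause of the remaining "ODBC Driver 13 for SQL Server" error above is a driver-name mismatch: msodbcsql17 registers itself as 'ODBC Driver 17 for SQL Server', so the connection settings have to reference that exact name. You can list the driver names unixODBC has registered inside the container with odbcinst (part of unixODBC):
# query the drivers registered in odbcinst.ini
odbcinst -q -d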
You'll need to use apk to install gcc and the other native dependencies needed to build your pip dependencies. For the ones that you listed (typed-ast and pyodbc), I think they would be:
RUN apk update \
&& apk add gcc libc-dev g++ \
&& apk add libffi-dev libxml2 \
&& apk add unixodbc-dev mariadb-dev python3-dev
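If you want to keep the image small, a common variation is to install the compilers as a virtual group and remove them once pip has built the wheels, keeping only the runtime libraries. A rough sketch of that pattern (package list taken from above; adjust it to what your requirements actually need):
# keep the runtime ODBC library, but treat the compilers/headers as build-only
RUN apk add --no-cache unixodbc \
    && apk add --no-cache --virtual .build-deps \
        gcc g++ libc-dev libffi-dev python3-dev unixodbc-dev \
    && pip install -r requirements.txt \
    && apk del .build-deps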

how to install scrapy on ubuntu?

I know that to install Scrapy one should install w3lib first, so I installed w3lib first, but when I import scrapy in the Python IDE, the program crashes.
The error:
creating Twisted.egg-info
writing requirements to Twisted.egg-info\requires.txt
writing Twisted.egg-info\PKG-INFO
writing top-level names to Twisted.egg-info\top_level.txt
writing dependency_links to Twisted.egg-info\dependency_links.txt
writing manifest file 'Twisted.egg-info\SOURCES.txt'
warning: manifest_maker: standard file '-c' not found
reading manifest file 'Twisted.egg-info\SOURCES.txt'
writing manifest file 'Twisted.egg-info\SOURCES.txt'
copying twisted\internet\_sigchld.c -> build\lib.win-amd64-2.7\twisted\internet
creating build\lib.win-amd64-2.7\twisted\internet\iocpreactor\iocpsupport
copying twisted\internet/iocpreactor/iocpsupport\iocpsupport.c -> build\lib.win-amd64-2.7\twisted\internet/iocpreactor/iocpsupport
copying twisted\internet/iocpreactor/iocpsupport\winsock_pointers.c -> build\lib.win-amd64-2.7\twisted\internet/iocpreactor/iocpsupport
copying twisted\python\_epoll.c -> build\lib.win-amd64-2.7\twisted\python
copying twisted\python\_initgroups.c -> build\lib.win-amd64-2.7\twisted\python
copying twisted\python\sendmsg.c -> build\lib.win-amd64-2.7\twisted\python
copying twisted\runner\portmap.c -> build\lib.win-amd64-2.7\twisted\runner
copying twisted\test\raiser.c -> build\lib.win-amd64-2.7\twisted\test
running build_ext
What's wrong?
This is how I installed Scrapy on Ubuntu:
sudo apt-get update
sudo apt-get install python-pip build-essential python-dev libxslt-dev libxml2-dev
sudo -H pip install Scrapy
scrapy version
The important thing that solved my issues was sudo -H pip install Scrapy, specifically the -H flag.
I also exited the terminal and started a new one to ensure that all the environment variables were set correctly.
Make sure you have installed Twisted, pyOpenSSL and pycrypto.
These are my steps to install Scrapy on Ubuntu.
1. Install gcc and lxml:
sudo apt-get install python-dev
sudo apt-get install libevent-dev
sudo apt-get install libxml2 libxml2-dev
apt-get install libxml2-dev libxslt-dev
apt-get install python-lxml
2. Install Twisted:
sudo apt-get install python-twisted python-libxml2 python-simplejson
sudo apt-get install build-essential libssl-dev libffi-dev python-dev
3. Install pyOpenSSL:
wget http://pypi.python.org/packages/source/p/pyOpenSSL/pyOpenSSL-0.13.tar.gz
tar -zxvf pyOpenSSL-0.13.tar.gz
cd pyOpenSSL-0.13
sudo python setup.py install
4. Install pycrypto:
wget http://pypi.python.org/packages/source/p/pycrypto/pycrypto-2.5.tar.gz
tar -zxvf pycrypto-2.5.tar.gz
cd pycrypto-2.5
sudo python setup.py install
5. Install easy_install (if you don't have easy_install):
wget http://peak.telecommunity.com/dist/ez_setup.py
python ez_setup.py
6. Install w3lib:
sudo easy_install -U w3lib
7. Install Scrapy:
sudo easy_install Scrapy
If you want to know more, please go to my blog.
First, install the system dependencies:
sudo apt-get install -y \
python-dev python-pip python-setuptools \
libffi-dev libxml2-dev libxslt1-dev \
libtiff5-dev libjpeg62-turbo-dev zlib1g-dev libfreetype6-dev \
liblcms2-dev libwebp-dev tcl8.5-dev tk8.5-dev python-tk
Then add the following to your requirements.txt:
lxml
pyOpenSSL
Scrapy
Pillow
And finally run pip install -r requirements.txt.
You can look around gist.github.com as well to resolve the latest dependency issues. I'm using Docker to set up the Scrapy dependencies in a separate container.
I've created one for my needs here.
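For reference, a rough sketch of what such a container could look like, based on the packages listed above (Python 2 era names; treat the base image and package set as assumptions to adapt):
FROM python:2.7
# build dependencies for lxml, pyOpenSSL and Pillow
RUN apt-get update \
    && apt-get install -y --no-install-recommends \
        libffi-dev libxml2-dev libxslt1-dev libssl-dev \
    && rm -rf /var/lib/apt/lists/*
RUN pip install lxml pyOpenSSL Scrapy Pillow
CMD ["scrapy", "version"]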