Add different programs, Python code, and other Dockerfiles to one single Dockerfile - dockerfile

I want to create a Dockerfile out of several different programs, including other Dockerfiles and Python code.
It should contain the following:
connectome-workbench
FSL
FreeSurfer
Multimodal Surface Matching (MSM)
Ciftify
fmriprep-docker (also a dockerfile)
mcflirt
Different Python Scripts
Can I add every single dependency as follows?
FROM ubuntu
USER root
RUN apt-get update && \
apt-get install -y curl gnupg gnupg1 gnupg2 python3-pip
RUN apt-get update && \
apt-get install -y git-annex && \
pip install ciftify
# Get the others like that as well
CMD ["python3", "/app/app.py"]
Cheers
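For example, I imagine continuing the pattern like this. The pip package names (ciftify, fmriprep-docker) are real; the FSL/FreeSurfer lines are only placeholders for each tool's own documented install steps, since those don't ship as apt or pip packages:

```
FROM ubuntu
USER root
RUN apt-get update && \
    apt-get install -y curl gnupg python3-pip git-annex
# Python-based tools install via pip
RUN pip install ciftify fmriprep-docker
# Tools without apt/pip packages each need their own documented
# install steps here, e.g. (placeholders, not real commands):
# RUN <fsl install steps>
# COPY freesurfer/ /opt/freesurfer/
# My own Python scripts
COPY scripts/ /app/
CMD ["python3", "/app/app.py"]
```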


Docker, WSL2 & vs code - bad git/ssh path

I set up my WSL2, Docker and vs code environment this weekend.
I am finding an issue when attempting to use git:
root@bb7f765df0d6:/var/www/html# git clone git@github.com:hsimah/my-repo.git
Cloning into 'my-repo'...
fatal: cannot run C:/Windows/System32/OpenSSH/ssh.exe: No such file or directory
fatal: unable to fork
Dockerfile:
FROM wordpress:latest
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update \
&& apt-get -y install --no-install-recommends apt-utils dialog 2>&1 \
&& apt-get -y install git \
&& apt-get autoremove -y \
&& apt-get clean -y \
&& rm -rf /var/lib/apt/lists/*
ENV DEBIAN_FRONTEND=dialog
If I remove the Dockerfile install of git and run apt-get update && apt-get install git from my container there is no issue. In this case git uses my host ssh keys (loaded via ssh-agent service on Windows) and can pull and push both via terminal or vs code itself.
There are no errors or warnings in the log.
Okay, I posted a few minutes too soon.
I checked the git config and VS Code was pulling my Windows config into the workspace; it's a known issue.
The unblocking fix is to change this setting to your container's ssh location (/usr/bin/ssh):
core.sshcommand=C:/Windows/System32/OpenSSH/ssh.exe
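Inside the container terminal the override is a one-liner (or use `--unset` to drop the inherited Windows value entirely):

```shell
# Point git at the Linux ssh binary instead of the Windows one.
git config --global core.sshCommand /usr/bin/ssh

# Alternatively, remove the inherited setting:
# git config --global --unset core.sshCommand
```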

How do you set up a Python 3.8 kernel on GCP's AI Platform JupyterLab instances?

My goal is to be able to start a Jupyter Notebook in JupyterLab with Python 3.8.
Update Python version to 3.8 in GCP AI Platform Jupyter Notebooks
AI Platform Notebooks environments are provided by container images that you select when creating the instance. This page lists the available container image types.
To specify the container image to run on the notebook, you can either choose one from the list provided by Google Cloud mentioned above or, if none of them comes with Python 3.8, create a derivative container based on one of the standard AI Platform images and edit the Dockerfile to add the Python 3.8 installation commands.
To test it out, I made a small modification to a provided container image to incorporate a Python 3.8 kernel in JupyterLab. To do this, I created a Dockerfile that does the following:
Creates a layer from the latest tf-gpu Docker image
Installs Python 3.8 and dependencies
Activates a Python 3.8 environment
Installs the Python 3.8 kernel to Jupyter Notebooks
Once the image has been built and pushed to Google Container Registry, you will be able to create an AI Platform Jupyter Notebook with the new kernel.
The code is the following:
FROM gcr.io/deeplearning-platform-release/tf-gpu:latest
RUN apt-get update -y \
&& apt-get upgrade -y \
&& apt-get install -y apt-transport-https \
&& apt-get install -y build-essential zlib1g-dev libncurses5-dev libgdbm-dev libnss3-dev libssl-dev libreadline-dev libffi-dev wget libbz2-dev \
&& wget https://www.python.org/ftp/python/3.8.0/Python-3.8.0.tgz
RUN tar xzf Python-3.8.0.tgz \
&& echo Getting inside folder \
&& cd Python-3.8.0 \
&& ./configure --enable-optimizations \
&& make -j 8 \
&& make altinstall \
&& apt-get install -y python3-venv \
&& echo Creating environment... \
&& python3.8 -m venv testenv \
&& echo Activating environment... \
&& . testenv/bin/activate \
&& echo Installing jupyter... \
&& pip install jupyter \
&& pip install ipython \
&& apt-get update -y \
&& apt-get upgrade -y \
&& ipython kernel install --name "Python3.8" --user
In case you need it, you can also specify a custom image that will allow you to customize the environment for your specific needs. Take into account that the product is in Beta and might change or have limited support.
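Assuming the Dockerfile above sits in the current directory, the build-and-push step looks roughly like this (PROJECT_ID is a placeholder for your GCP project):

```shell
PROJECT_ID=my-gcp-project   # placeholder
IMAGE=gcr.io/$PROJECT_ID/tf-gpu-py38:latest

docker build -t "$IMAGE" .
docker push "$IMAGE"
```

After the push, select the image from Container Registry when creating the AI Platform Notebook instance.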

FreeTds inside Docker throwing unknown symbol exception

I'm trying to run a CGI website in a Docker container.
The software is written in C++ and uses the freetds-dev package to connect to an MSSQL database.
So far it's working as it should; the only problem is that if I try to compile or run it inside the container, I receive the following exception:
undefined symbol: dbprcollen
Now I know exactly which line of code triggers this, and I also know that this particular function should be in the freetds-dev package.
So I've included that package in the Dockerfile, but it still won't work. Does anyone know what I am missing?
Here is my Dockerfile:
FROM php:apache
COPY ./html/ /var/www/html/
COPY ./work.cgi /var/www/html/work.cgi
RUN chmod +x /var/www/html/work.cgi
RUN a2enmod rewrite
RUN echo "<Directory /var/www/html/>\n\
AllowOverride all\n\
Options +ExecCGI\n\
AddHandler cgi-script .cgi\n\
</Directory>" >> /etc/apache2/apache2.conf
RUN ln -s /etc/apache2/mods-available/cgi.load /etc/apache2/mods-enabled/cgi.load
RUN apt-get update -y && apt-get upgrade -y && \
    apt-get install -y build-essential binutils libcgicc-dev freetds-bin freetds-dev
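For what it's worth, `dbprcollen` is a DB-Library function, and FreeTDS ships DB-Library as `libsybdb`; an "undefined symbol" at compile or load time usually means that library wasn't on the link line. A sketch of the link step (the source and output file names here are guesses):

```shell
# Link against FreeTDS's DB-Library (libsybdb) and cgicc
g++ work.cpp -o work.cgi -lsybdb -lcgicc
```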

apt-get not found in Docker

I've got this Dockerfile:
FROM python:3.6-alpine
FROM ubuntu
FROM alpine
RUN apk update && \
apk add --virtual build-deps gcc python-dev musl-dev
RUN apt-get update && apt-get install -y python-pip
WORKDIR /app
ADD . /app
RUN pip install -r requirements.txt
EXPOSE 5000
CMD ["python", "main.py"]
and it's throwing an error saying /bin/sh: apt-get: not found.
I thought apt-get is part of the Ubuntu image that I'm pulling in on the
second line, yet it's giving me this error.
How can I fix this?
As tkausl said: you can only use one base image (one FROM).
Alpine's package manager is apk, not apt-get. You have to use apk to install packages. However, pip is already available.
This Dockerfile should work:
FROM python:3.6-alpine
RUN apk update && \
apk add --virtual build-deps gcc python-dev musl-dev
WORKDIR /app
ADD . /app
RUN pip install -r requirements.txt
EXPOSE 5000
CMD ["python", "main.py"]
apt-get does not work because the active Linux distribution is Alpine, which does not have the apt-get command.
You can fix it by using the apk command instead.
Most probably the image you're using is Alpine, so you can't use apt-get, Ubuntu's package manager.
Instead, use apk update and apk add.
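A minimal Alpine-flavoured equivalent of the usual apt-get lines looks like this (`--no-cache` fetches a fresh index, so no separate `apk update` is needed):

```shell
# Alpine equivalent of `apt-get update && apt-get install -y <pkgs>`
apk add --no-cache gcc musl-dev python3-dev
```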
Multiple FROM lines can be used in a single Dockerfile.
See the discussion and the multi-stage build tutorial.
The use of Python Alpine plus Ubuntu plus Alpine is redundant; the python:3.6-alpine image alone should be sufficient, since it uses Alpine internally.
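For contrast, a sketch of what a legitimate multi-stage Dockerfile looks like: the second FROM starts a fresh stage, and COPY --from pulls artifacts out of the first, so only the last stage defines the runtime distribution.

```
# Build stage: compile dependencies in a throwaway image
FROM python:3.6-alpine AS build
RUN apk add --no-cache gcc musl-dev
COPY requirements.txt .
RUN pip install --prefix=/install -r requirements.txt

# Final stage: only the installed packages are copied over
FROM python:3.6-alpine
COPY --from=build /install /usr/local
COPY . /app
WORKDIR /app
CMD ["python", "main.py"]
```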
I had a similar issue not with apk but with apt-get.
FROM node:14
FROM jekyll/jekyll
RUN apt-get update
RUN apt-get install -y \
sqlite
Error:
/bin/sh: apt-get: not found
If I change the order, then it works.
FROM node:14
RUN apt-get update
RUN apt-get install -y \
sqlite
FROM jekyll/jekyll
Note: as in the first link I added above, multiple FROMs might be removed from Docker as a feature.

how to install scrapy on ubuntu?

I know that installing Scrapy requires w3lib, so I installed w3lib first, but when I import scrapy in my Python IDE, the program crashes.
The error:
creating Twisted.egg-info
writing requirements to Twisted.egg-info\requires.txt
writing Twisted.egg-info\PKG-INFO
writing top-level names to Twisted.egg-info\top_level.txt
writing dependency_links to Twisted.egg-info\dependency_links.txt
writing manifest file 'Twisted.egg-info\SOURCES.txt'
warning: manifest_maker: standard file '-c' not found
reading manifest file 'Twisted.egg-info\SOURCES.txt'
writing manifest file 'Twisted.egg-info\SOURCES.txt'
copying twisted\internet\_sigchld.c -> build\lib.win-amd64-2.7\twisted\internet
creating build\lib.win-amd64-2.7\twisted\internet\iocpreactor\iocpsupport
copying twisted\internet/iocpreactor/iocpsupport\iocpsupport.c -> build\lib.win-amd64-2.7\twisted\internet/iocpreactor/iocpsupport
copying twisted\internet/iocpreactor/iocpsupport\winsock_pointers.c -> build\lib.win-amd64-2.7\twisted\internet/iocpreactor/iocpsupport
copying twisted\python\_epoll.c -> build\lib.win-amd64-2.7\twisted\python
copying twisted\python\_initgroups.c -> build\lib.win-amd64-2.7\twisted\python
copying twisted\python\sendmsg.c -> build\lib.win-amd64-2.7\twisted\python
copying twisted\runner\portmap.c -> build\lib.win-amd64-2.7\twisted\runner
copying twisted\test\raiser.c -> build\lib.win-amd64-2.7\twisted\test
running build_ext
What's wrong?
This is how I installed scrapy on ubuntu:
sudo apt-get update
sudo apt-get install python-pip build-essential python-dev libxslt-dev libxml2-dev
sudo -H pip install Scrapy
scrapy version
The important thing that solved my issues was sudo -H pip install Scrapy specifically the -H flag.
I also exited out of the terminal and started a new terminal to ensure all the environment variables were set correctly.
Make sure you have installed Twisted, pyOpenSSL, and pycrypto.
These are my steps to install Scrapy on Ubuntu.
1. Install gcc and lxml:
sudo apt-get install python-dev
sudo apt-get install libevent-dev
sudo apt-get install libxml2 libxml2-dev
apt-get install libxml2-dev libxslt-dev
apt-get install python-lxml
2. Install Twisted:
sudo apt-get install python-twisted python-libxml2 python-simplejson
sudo apt-get install build-essential libssl-dev libffi-dev python-dev
3. Install pyOpenSSL:
wget http://pypi.python.org/packages/source/p/pyOpenSSL/pyOpenSSL-0.13.tar.gz
tar -zxvf pyOpenSSL-0.13.tar.gz
cd pyOpenSSL-0.13
sudo python setup.py install
4. Install pycrypto:
wget http://pypi.python.org/packages/source/p/pycrypto/pycrypto-2.5.tar.gz
tar -zxvf pycrypto-2.5.tar.gz
cd pycrypto-2.5
sudo python setup.py install
5. Install easy_install (if you don't have easy_install):
wget http://peak.telecommunity.com/dist/ez_setup.py
python ez_setup.py
6. Install w3lib:
sudo easy_install -U w3lib
7. Install Scrapy:
sudo easy_install Scrapy
If you want to know more, please visit my blog.
First, install the system dependencies:
sudo apt-get install -y \
python-dev python-pip python-setuptools \
libffi-dev libxml2-dev libxslt1-dev \
libtiff5-dev libjpeg62-turbo-dev zlib1g-dev libfreetype6-dev \
liblcms2-dev libwebp-dev tcl8.5-dev tk8.5-dev python-tk
Then add the following to your requirements.txt:
lxml
pyOpenSSL
Scrapy
Pillow
And finally run pip install -r requirements.txt.
You can look around gist.github.com as well to resolve the latest dependency issues. I'm using Docker to set up the Scrapy deps in a separate container.
I've created one for my needs here.