How do I use command substitution in a Dockerfile?

In my Dockerfile I need to use command substitution to add some environment variables. I want to set
ENV PYTHONPATH /usr/local/$(python3 -c 'from distutils import sysconfig; print(sysconfig.get_python_lib())')
but it doesn't work. The result is
foo#bar:~$ echo $PYTHONPATH
/usr/local/$(python3 -c from distutils import sysconfig; print(sysconfig.get_python_lib()))
What's wrong?

What went wrong
The $( ... ) command substitution you attempted is a Bash feature, but a Dockerfile is not processed by Bash. Docker doesn't know what to do with it; to Docker it is just plain text, so it writes out what you wrote as-is.
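For contrast, command substitution does work inside a RUN instruction, because shell-form RUN is executed by /bin/sh, whereas ENV is never passed through a shell. A minimal illustration (date is just a stand-in command, not part of the original question):
# Works: RUN is executed by a shell, so $(...) is substituted at build time
RUN echo "today is $(date)"
# Stored literally: ENV is not processed by a shell, so the $(...) below stays as text
ENV BUILD_DATE $(date)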
Recommendation
To avoid hard-coding values into a Dockerfile and instead set a variable such as PYTHONPATH dynamically during the build, the ARG and --build-arg Docker features are most helpful, in combination with ENV to ensure the value persists in the image.
Within your Dockerfile:
ARG PYTHON_PATH_ARG
ENV PYTHONPATH ${PYTHON_PATH_ARG}
In the Bash shell where you build your image:
python_path="/usr/local$(python3 -c 'from distutils import sysconfig; print(sysconfig.get_python_lib())')"
docker build --build-arg PYTHON_PATH_ARG="$python_path" .
Explanation
According to the documentation for ARG:
The ARG instruction defines a variable that users can pass at build-time to the builder with the docker build command using the --build-arg <varname>=<value> flag.
So, in Bash we first:
python_path="/usr/local$(python3 -c 'from distutils import sysconfig; print(sysconfig.get_python_lib())')"
$(...) Bash command substitution is used to dynamically put together a Python path value
this value is stored temporarily in a Bash variable $python_path for clarity
docker build --build-arg PYTHON_PATH_ARG="$python_path" .
Bash variable $python_path value is passed to docker's --build-arg PYTHON_PATH_ARG
Within the Dockerfile:
ARG PYTHON_PATH_ARG
so PYTHON_PATH_ARG stores the value from --build-arg PYTHON_PATH_ARG...
ARG variables are not equivalent to ENV variables, so we couldn't merely do ARG PYTHONPATH and be done with it. According to the documentation on using ARG variables:
ARG variables are not persisted into the built image as ENV variables are.
So finally:
ENV PYTHONPATH ${PYTHON_PATH_ARG}
We use the Dockerfile ${...} convention to get the value of PYTHON_PATH_ARG and save it into your originally named PYTHONPATH environment variable.
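To check that the value really ends up in the built image, you can run something like the following after the build (the image tag pythonpath-demo is just a hypothetical name for this example):
$ docker build --build-arg PYTHON_PATH_ARG="$python_path" -t pythonpath-demo .
$ docker run --rm pythonpath-demo sh -c 'echo $PYTHONPATH'
The second command should print the value computed in Bash, e.g. /usr/local/usr/lib/python3/dist-packages.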
Differences from original code
You originally wrote:
ENV PYTHONPATH /usr/local/$(python3 -c 'from distutils import sysconfig; print(sysconfig.get_python_lib())')
I rewrote the Python-path-finding portion as a Bash command and tested it on my machine:
$ python_path="/usr/local/$(python3 -c 'from distutils import sysconfig; print(sysconfig.get_python_lib())')"
$ echo $python_path
/usr/local//usr/lib/python3/dist-packages
Notice there is a double forward slash (...local//usr...); I'm not sure whether that will break anything for you, as it depends on how you use the path in your code.
Instead, I changed it to:
$ python_path="/usr/local$(python3 -c 'from distutils import sysconfig; print(sysconfig.get_python_lib())')"
Result:
$ echo $python_path
/usr/local/usr/lib/python3/dist-packages
So this new code will have no double forward slashes.

You should use ARG if possible, but sometimes you really need command substitution for a dynamic value. As long as you put all the commands in the same RUN instruction, you can still access the value:
RUN foo=$(date) && \
    echo $foo
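For contrast, splitting the commands across two RUN instructions would not work, because each RUN starts a fresh shell (and a new layer), so the variable from the first RUN is gone:
RUN foo=$(date)
RUN echo $foo
The second RUN prints an empty line, since foo only existed inside the first shell.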

Related

How can I correct this Dockerfile to take arguments properly?

I have this dockerfile content:
FROM python:latest
ARG b=8
ARG r=False
ARG p=1
ENV b=${b}
ENV r=${r}
ENV p=${p}
# many lines of successful setup
ENTRYPOINT python analyze_all.py /analysispath -b $b -p $p -r $r
My intention was to take three arguments at the command line like so:
docker run -it -v c:\analyze containername -e b=16 -e p=22 -e r=False
But unfortunately, I'm misunderstanding something fundamental and simple here instead of something complicated, so I'm helpless :).
If I understood the question correctly, this Dockerfile should do what is required:
FROM python:latest
# test script that prints argv
COPY analyze_all.py /
ENTRYPOINT ["python", "analyze_all.py", "/analysispath"]
Launch:
$ docker run -it test:latest -b 16 -p 22 -r False
sys.argv=['analyze_all.py', '/analysispath', '-b', '16', '-p', '22', '-r', 'False']
It looks like your Dockerfile is designed to build and run a container on Windows. I tested my Dockerfile on Linux; it probably won't be much different to use this approach on Windows.
I think the ARG instructions aren't needed in this case, because ARG defines a variable that users can pass at build time using the docker build command. I would also suggest that you take a look at the Dockerfile reference for the ENTRYPOINT instruction:
Command line arguments to docker run will be appended after all elements in an exec form ENTRYPOINT, and will override all elements specified using CMD. This allows arguments to be passed to the entry point, i.e., docker run -d will pass the -d argument to the entry point.
Also, this question will probably be useful for you: How do I use Docker environment variable in ENTRYPOINT array?
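If you do want to pass the values as environment variables with -e at run time (instead of as positional arguments), a common pattern along the lines of that linked question is to wrap the command in a shell inside the exec-form ENTRYPOINT, so the variables are expanded when the container starts. A minimal sketch, not tested against the asker's script:
FROM python:latest
COPY analyze_all.py /
# defaults, overridable at run time with -e
ENV b=8 p=1 r=False
# sh -c expands $b, $p and $r when the container starts
ENTRYPOINT ["/bin/sh", "-c", "python /analyze_all.py /analysispath -b $b -p $p -r $r"]
Note that the -e flags must come before the image name in the docker run command for the overrides to reach the container.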

Does Dockerfile support inline comments? [duplicate]

I am writing a Dockerfile. Is there a way to make comments in this file?
Does Docker have a comment option that takes the rest of a line and ignores it?
You can use # at the beginning of a line to start a comment (whitespace before the # is allowed):
# do some stuff
RUN apt-get update \
    # install some packages
    && apt-get install -y cron
A # in the middle of a string is passed to the command itself, e.g.:
RUN echo 'we are running some # of cool things'
As others have mentioned, comments are introduced with a # and are documented here. However, unlike in some languages, the # must be at the beginning of the line. If it occurs partway through the line, it is interpreted as an argument and may result in unexpected behavior.
# This is a comment
COPY test_dir target_dir # This is not a comment, it is an argument to COPY
RUN echo hello world # This is an argument to RUN but the shell may ignore it
It should also be noted that parser directives have recently been added to the Dockerfile which have the same syntax as a comment. They need to appear at the top of the file, before any other comments or commands. Originally, this directive was added for changing the escape character to support Windows:
# escape=`
FROM microsoft/nanoserver
COPY testfile.txt c:\
RUN dir c:\
The first line, while it appears to be a comment, is a parser directive to change the escape character to a backtick so that the COPY and RUN commands can use the backslash in the path. A parser directive is also used with BuildKit to change the frontend parser with a syntax line. See the experimental syntax for more details on how this is being used in practice.
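For illustration, a BuildKit syntax directive looks like this; the frontend reference on the first line is the commonly documented one, shown here only as an example:
# syntax=docker/dockerfile:1
FROM busybox:latest
RUN echo "built with the frontend selected by the syntax directive above"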
With a multi-line command, the commented lines are ignored, but you need to comment out every line individually:
$ cat Dockerfile
FROM busybox:latest
RUN echo first command \
# && echo second command disabled \
&& echo third command
$ docker build .
Sending build context to Docker daemon 23.04kB
Step 1/2 : FROM busybox:latest
---> 59788edf1f3e
Step 2/2 : RUN echo first command && echo third command
---> Running in b1177e7b563d
first command
third command
Removing intermediate container b1177e7b563d
---> 5442cfe321ac
Successfully built 5442cfe321ac
Use the # syntax for comments
From: https://docs.docker.com/engine/reference/builder/#format
# My comment here
RUN echo 'we are running some cool things'
Dockerfile comments start with #, just like Python.
kstaken has good examples:
# Install a more-up-to date version of MongoDB than what is included in the default Ubuntu repositories.
FROM ubuntu
MAINTAINER Kimbro Staken
RUN apt-key adv --keyserver keyserver.ubuntu.com --recv 7F0CEB10
RUN echo "deb http://downloads-distro.mongodb.org/repo/ubuntu-upstart dist 10gen" | tee -a /etc/apt/sources.list.d/10gen.list
RUN apt-get update
RUN apt-get -y install apt-utils
RUN apt-get -y install mongodb-10gen
#RUN echo "" >> /etc/mongodb.conf
CMD ["/usr/bin/mongod", "--config", "/etc/mongodb.conf"]
Docker treats lines that begin with # as a comment, unless the line is a valid parser directive. A # marker anywhere else in a line is treated as an argument.
example code:
# this line is a comment
RUN echo 'we are running some # of cool things'
Output:
we are running some # of cool things
Format
Here is the format of the Dockerfile: we can use # for commenting purposes, for example #COMMENT.
#FROM microsoft/aspnetcore
FROM microsoft/dotnet
COPY /publish /app
WORKDIR /app
ENTRYPOINT ["dotnet", "WebApp.dll"]
When we build the Docker image from the file above, the first line is skipped and the build moves on to the next line, because we have commented it out using #.
# this is a comment
this isn't a comment
is the way to do it. Note, though, that the # must start the line (leading whitespace is allowed); a # later in a line is passed to the command as an argument rather than being ignored.

Can't run utop after installation

I've just installed utop on openSUSE, but when I type utop in the terminal I get:
If 'utop' is not a typo you can use command-not-found to lookup the package that contains it, like this:
cnf utop
Typing eval 'opam config env' gives me this:
OPAM_SWITCH_PREFIX='/home/jadw1/.opam/default'; export OPAM_SWITCH_PREFIX;
CAML_LD_LIBRARY_PATH='/home/jadw1/.opam/default/lib/stublibs:/usr/lib64/ocaml/stublibs:/usr/lib64/ocaml'; export CAML_LD_LIBRARY_PATH;
OCAML_TOPLEVEL_PATH='/home/jadw1/.opam/default/lib/toplevel'; export OCAML_TOPLEVEL_PATH;
MANPATH='/usr/local/man:/usr/share/man:/home/jadw1/.opam/default/man'; export MANPATH;
PATH='/home/jadw1/.opam/default/bin:/home/jadw1/bin:/usr/local/bin:/usr/bin:/bin:/usr/lib/mit/sbin'; export PATH;
You're supposed to type this:
$ eval `opam config env`
Not this:
$ opam config env
What happens is that opam config env writes out some shell commands that you want to execute; that's what the eval does for you. If you see output like the above, it means the eval didn't take effect.
eval is a command that constructs commands from the arguments it's given and executes them.
eval 'opam config env', with apostrophes, is therefore equivalent to just running opam config env directly, which just writes out a sequence of shell commands.
If you replace the apostrophes with backticks, however, it will first execute the quoted command, then pass its output to eval and execute that.
eval `opam config env`
is therefore more or less equivalent to running opam config env, then copying and pasting the output back into the console, which you could also do.
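To see the difference in isolation, here is a tiny illustration using echo as a stand-in for opam config env (the FOO variable is purely hypothetical):
$ eval 'echo export FOO=bar'
export FOO=bar
$ eval `echo export FOO=bar`
$ echo $FOO
bar
With apostrophes, eval just runs the inner command and you see its text; with backticks, the text is produced first and then executed, so FOO actually gets set.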

Calling Python package from command line / PyCharm

I am creating a Python package; I anticipate that it will be called both from the command line and from other scripts. Here is a simplified version of my file structure:
GREProject/
__init__.py
__main__.py
Parsing.py
Parsing.py contains a function, parse(); it takes two arguments, an input file and an output file. I am trying to figure out the proper code for "__main__.py" so that when the following is called from the command line or a terminal, the arguments will be passed to parse():
Python GREProject -i input.file -o output.file
I have tried this numerous ways, but they have all met with failure. I believe I need the "-m" flag for the interpreter, but beyond that I don't know. Example with the flag:
Python -m GREProject -i input.file -o output.file
When running the latter command I receive the following error:
Import by filename is not supported.
Presumably from this line:
from . import Parsing
OK, it turns out this was a problem with my IDE, PyCharm. I have no idea why I received this error, but here are the settings that fixed it:
Import by filename is not supported.
For the record, here are the options I set in my PyCharm project:
Script:
GREProject
Script parameters:
-i .\GREProject\pr2.nyc1 -o .\GREProject\Test.pkl
Environment variables:
PYTHONUNBUFFERED=1
Python interpreter:
Python 2.7.11 (c:\Python27\python.exe)
Interpreter options:
-m
Working directory:
C:\Users\probert.dan\PycharmProjects
Here is an explanation of the options:
Script: This is the script to run. By default PyCharm will only insert absolute references to .py files, but nothing prevents you from manually typing in a relative reference; in this case it is the GREProject folder.
Script Parameters: These are passed on to the script itself. In this case I am telling my script that the input file is ".\GREProject\pr2.nyc1", which means: look for the file "pr2.nyc1" in the "GREProject" directory below the current working directory.
Environment variables: This was set by PyCharm and left unchanged.
Python interpreter: My active interpreter.
Interpreter options: The -m option here tells Python that we are calling a module; Python then knows to run the package's "__main__.py" file.
Working directory: The directory the script is run from; I chose the directory above "GREProject".
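Put together, those settings make PyCharm run roughly the equivalent of the following command from the working directory:
C:\Users\probert.dan\PycharmProjects> python -m GREProject -i .\GREProject\pr2.nyc1 -o .\GREProject\Test.pkl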
For reference, here are the contents of my "__main__.py" file:
from . import Parsing
import argparse
parser = argparse.ArgumentParser(description='Parse flags.')
parser.add_argument('-i', help='Import file.')
parser.add_argument('-o', help='(Optional) Output file.')
arguments = parser.parse_args()
Parsing.parse(arguments.i, arguments.o)
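For the other use case in the question, calling the parser from another script instead of the command line, the same function can simply be imported. A minimal sketch, assuming GREProject is importable from where the caller runs and that the file names are placeholders:
# some_other_script.py (hypothetical caller)
from GREProject import Parsing
# Pass the input and output paths directly instead of going through argparse
Parsing.parse('input.file', 'output.file')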
It is also important to note that debugging in PyCharm is not possible like this. Here is the solution to debugging: Intellij/Pycharm can't debug Python modules

How can I create command autocompletion for Fabric?

I created a fabfile that contains several commands, from short to long and complex. I need an autocomplete feature so that when the user types fab[tab][tab], all available fab commands are shown, just like we have in Bash.
i.e.
user#someone-ubuntu:~/path/to/fabfile$ fab[tab][tab]
command1 command2 command3 ..and so on
How can I do this?
There are instructions you can follow here: http://evans.io/legacy/posts/bash-tab-completion-fabric-ubuntu/
Basically you run a script that calls fab --shortlist; the output gets fed into complete, a Bash builtin that you can read more about here: https://www.gnu.org/software/bash/manual/html_node/Programmable-Completion-Builtins.html
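A minimal sketch of that approach, assuming a Fabric 1.x fab that supports --shortlist (the function name _fab_shortlist is arbitrary); source it from your ~/.bashrc:
_fab_shortlist()
{
    # `fab --shortlist` prints one task name per line
    local cur tasks
    cur="${COMP_WORDS[COMP_CWORD]}"
    tasks=$(fab --shortlist 2>/dev/null)
    COMPREPLY=($(compgen -W "${tasks}" -- "${cur}"))
}
complete -o default -F _fab_shortlist fab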
I did this for my new fabfile using fabric 2.5 & Python 3:
~/.config/fabfile
#!/usr/bin/env zsh
_fab()
{
    local cur
    COMPREPLY=()
    # Variable to hold the current word
    cur="${COMP_WORDS[COMP_CWORD]}"
    # Build a list of the available tasks from: `python3 -m fabric --complete`
    local cmds=$(python3 -m fabric --complete 2>/dev/null)
    # Generate possible matches and store them in the
    # array variable COMPREPLY
    COMPREPLY=($(compgen -W "${cmds}" "${cur}"))
}
# Assign the auto-completion function for our command.
complete -F _fab fab
And in my ~/.zshrc:
source ~/.config/fabfile
I updated the Fabric 1.x version here: gregorynichola's gist