dns could not translate host name in docker - django

Here is my Dockerfile:
FROM ubuntu:18.04
MAINTAINER bussiere "bussiere#toto.fr"
EXPOSE 8000
RUN echo "nameserver 8.8.8.8" >> /etc/resolv.conf
RUN echo "nameserver 80.67.169.12" >> /etc/resolv.conf
RUN echo "nameserver 208.67.222.222" >> /etc/resolv.conf
#RUN echo "dns-nameservers 8.8.8.8 8.8.4.4 80.67.169.12 208.67.222.222" >> /etc/network/interfaces
ENV LANG C.UTF-8
ENV TZ=Europe/Paris
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
RUN apt-get update -y
RUN apt-get install -y --no-install-recommends apt-utils
RUN apt-get install -y python3 python3-pip python3-dev build-essential libpq-dev
RUN python3 -m pip install pip --upgrade
RUN python3 -m pip install pipenv
RUN export LC_ALL=C.UTF-8
RUN export LANG=C.UTF-8
COPY app /app
WORKDIR /app
RUN pipenv --python 3.6
RUN pipenv install -r requirements.txt
ENTRYPOINT ["pipenv"]
CMD ["run","python", "manage.py", "collectstatic", "--noinput"]
CMD ["run","python", "manage.py", "runserver", "0.0.0.0:8000"]
Here is the error:
psycopg2.OperationalError: could not translate host name "toto.postgres.database.azure.com" to address: Temporary failure in name resolution
with the command:
docker run --rm -it -p 8000:8000 admin_marque
The error occurs when I try to open localhost:8000 in a browser.
The main goal is to deploy it on Azure.
It's a Django app; I know I shouldn't use Django's runserver in production, but I'm in the process of debugging.
And I'm sure that the URL of the database is correct.
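Two likely causes worth checking here (assumptions based on the Dockerfile, not confirmed by the question): Docker manages /etc/resolv.conf itself and mounts a fresh copy into the container at run time, so nameserver lines appended with RUN during the build are discarded; likewise, RUN export does not persist, and when a Dockerfile has two CMD instructions only the last one takes effect, so collectstatic never runs. DNS servers can instead be supplied when the container is started — a sketch, reusing the image name from the question:

```shell
# Pass DNS servers at run time; build-time edits to /etc/resolv.conf
# are overwritten when the container starts.
docker run --rm -it -p 8000:8000 \
  --dns 8.8.8.8 --dns 80.67.169.12 \
  admin_marque
```

If the host itself cannot resolve the name, the daemon-level "dns" option in /etc/docker/daemon.json is the equivalent persistent setting.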

Related

Import boto3 module error in aws batch job

I was trying to run a batch job on an image in AWS and am getting the error below:
ModuleNotFoundError: No module named 'boto3'
But boto3 is installed in the Dockerfile.
Dockerfile
FROM ubuntu:20.04
ENV SPARK_VERSION 2.4.8
ENV HADOOP_VERSION 3.0.0
RUN apt update
RUN apt install openjdk-8-jdk -y
RUN apt install scala -y
RUN apt install wget tar -y
#RUN wget https://apache.mirror.digitalpacific.com.au/spark/spark-$SPARK_VERSION/spark-$SPARK_VERSION-bin-hadoop$HADOOP_VERSION.tgz
RUN wget http://archive.apache.org/dist/hadoop/common/hadoop-$HADOOP_VERSION/hadoop-$HADOOP_VERSION.tar.gz
RUN wget https://downloads.apache.org/spark/spark-$SPARK_VERSION/spark-$SPARK_VERSION-bin-without-hadoop.tgz
RUN tar xfz hadoop-$HADOOP_VERSION.tar.gz
RUN mv hadoop-$HADOOP_VERSION /opt/hadoop
RUN tar xvf spark-$SPARK_VERSION-bin-without-hadoop.tgz
RUN mv spark-$SPARK_VERSION-bin-without-hadoop /opt/spark
RUN apt install software-properties-common -y
RUN add-apt-repository ppa:deadsnakes/ppa
RUN apt update && \
apt install python3.7 -y
ENV SPARK_HOME /opt/spark
ENV HADOOP_HOME /opt/hadoop
ENV HADOOP_CONF_DIR $HADOOP_HOME/etc/hadoop
ENV PATH $PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin:${HADOOP_HOME}/bin
ENV PYSPARK_PYTHON /usr/bin/python3.7
RUN export SPARK_HOME=/opt/spark
RUN export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin:${HADOOP_HOME}/bin
RUN export PYSPARK_PYTHON=/usr/bin/python3.7
RUN export SPARK_DIST_CLASSPATH=$(hadoop classpath)
RUN update-alternatives --install /usr/bin/python python /usr/bin/python3.7 1
RUN update-alternatives --set python /usr/bin/python3.7
RUN apt-get install python3-distutils -y
RUN apt-get install python3-apt -y
RUN apt install python3-pip -y
RUN pip3 install --upgrade pip
COPY ./pipeline_union/requirements.txt requirements.txt
#RUN python -m pip install -r requirements.txt
RUN pip3 install -r requirements.txt
#RUN wget https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk/1.10.6/aws-java-sdk-1.10.6.jar -P $SPARK_HOME/jars/
RUN wget https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk-bundle/1.11.874/aws-java-sdk-bundle-1.11.874.jar -P $SPARK_HOME/jars/
RUN wget https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.0.0/hadoop-aws-3.0.0.jar -P $SPARK_HOME/jars/
RUN wget https://repo1.maven.org/maven2/net/java/dev/jets3t/jets3t/0.9.4/jets3t-0.9.4.jar -P $SPARK_HOME/jars/
#RUN wget https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk-s3/1.10.6/aws-java-sdk-s3-1.10.6.jar -P ${HADOOP_HOME}/share/hadoop/tools/lib/
#RUN wget https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk-s3/1.10.6/aws-java-sdk-s3-1.10.6.jar -P ${SPARK_HOME}/jars/
# COPY datalake/spark-on-spot/src/jars $SPARK_HOME/jars
# COPY datalake/spark-on-spot/src/pipeline_union ./
# COPY datalake/spark-on-spot/src/pipeline_union/spark.conf spark.conf
COPY ./jars $SPARK_HOME/jars
COPY ./pipeline_union ./
COPY ./pipeline_union/spark.conf spark.conf
#RUN ls /usr/lib/jvm
ENV JAVA_HOME /usr/lib/jvm/java-8-openjdk-amd64
ENV PATH $PATH:$HOME/bin:$JAVA_HOME/bin
RUN export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
RUN hadoop classpath
ENV SPARK_DIST_CLASSPATH=/opt/hadoop/etc/hadoop:/opt/hadoop/share/hadoop/common/lib/*:/opt/hadoop/share/hadoop/common/*:/opt/hadoop/share/hadoop/hdfs:/opt/hadoop/share/hadoop/hdfs/lib/*:/opt/hadoop/share/hadoop/hdfs/*:/opt/hadoop/share/hadoop/yarn/lib/*:/opt/hadoop/share/hadoop/yarn/*:/opt/hadoop/share/hadoop/mapreduce/lib/*:/opt/hadoop/share/hadoop/mapreduce/*:/opt/hadoop/contrib/capacity-scheduler/*.jar:/opt/hadoop/share/hadoop/tools/lib/*
ENTRYPOINT ["spark-submit", "--properties-file", "spark.conf"]
#ENTRYPOINT ["spark-submit", "--packages", "org.apache.hadoop:hadoop-aws:2.8.5"]
#ENTRYPOINT ["spark-submit", "--properties-file", "spark.conf", "--packages", "org.apache.hadoop:hadoop-aws:2.8.5"]
requirements.txt
boto3==1.13.9
botocore
colorama==0.3.9
progressbar2==3.39.3
pyarrow==1.0.1
requests
psycopg2-binary
pytz
I ran another image successfully, with two differences.
Code line in the Dockerfile:
RUN pip install -r requirements.txt
requirements.txt:
requests
boto3
psycopg2-binary
pytz
pandas
pynt
Are there any known issues with:
Using pip3 in the Dockerfile instead of pip
Specifying the boto3 version
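One thing worth checking (an assumption drawn from the Dockerfile, not something the question confirms): python3.7 comes from the deadsnakes PPA, but the python3-pip package belongs to Ubuntu 20.04's system Python, so the pip3 on PATH may be installing boto3 into a different interpreter than the one Spark runs (PYSPARK_PYTHON=/usr/bin/python3.7). Invoking pip through the target interpreter removes the ambiguity; the snippet below just demonstrates the unambiguous form:

```python
import subprocess
import sys

# pip installs into the site-packages of the interpreter that runs it,
# so "<interpreter> -m pip" always matches that interpreter, while a
# bare "pip3" on PATH may belong to a different Python entirely.
# In the Dockerfile this would be:
#   RUN python3.7 -m pip install -r requirements.txt
out = subprocess.run(
    [sys.executable, "-m", "pip", "--version"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(out)  # reports which interpreter this pip is bound to
```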

xmlrpc.py Connection refused error while using supervisor in docker

Hello guys, I'm writing a Dockerfile and a Compose file with Ubuntu 20.04 and trying to install supervisor inside it.
Dockerfile:
...
FROM ubuntu:20.04
WORKDIR /src/app
ENV BACKENDENVIRONMENT 0
COPY gsigner .
COPY docker_requirements.txt ./
ARG DEBIAN_FRONTEND=noninteractive
RUN apt-get update
RUN apt install -y python-is-python3
RUN apt-get install -y python3.9
RUN apt-get install python3-pip -y
RUN apt-get install gcc musl-dev python3-dev libffi-dev openssl libssl-dev cargo -y
RUN apt install -y postgresql postgresql-contrib
RUN apt-get update && apt-get install -y postgresql-server-dev-all gcc python3-dev musl-dev
RUN pip install --upgrade pip setuptools wheel \
    && pip install -r docker_requirements.txt
RUN mkdir /run/gsigner
RUN apt-get install -y supervisor
COPY backend_supervisord.conf /etc/supervisor/conf.d/
Dockerfile updated
docker-compose.yml:
version: "3.9"
services:
  gsigner:
    build: .
    command: bash -c "python manage.py migrate && supervisorctl reread && supervisorctl reload && supervisorctl start daphne"
    ports:
      - 8000:8000
    volumes:
      - static:/var/static/gsigner/
      - media:/var/media/gsigner/
      - gsigner:/src/app/
      - log:/var/log/gsigner/
volumes:
  static:
  media:
  log:
  gsigner:
docker-compose updated
daphne is the program name in my supervisor conf file.
My supervisor conf file:
[supervisord]
[supervisorctl]
[program:daphne]
command=daphne gsigner.asgi:application
directory=/src/app/gsigner/
user=root
autostart=true
autorestart=true
I really can't figure out what is happening here.
This is the error message:
error: <class 'ConnectionRefusedError'>, [Errno 111] Connection refused: file: /usr/lib/python3/dist-packages/supervisor/xmlrpc.py line: 560
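That traceback is supervisorctl failing to reach supervisord over its XML-RPC socket: the Compose command runs supervisorctl, but nothing ever starts supervisord, and the conf file defines no [unix_http_server] or [rpcinterface:supervisor] sections, so there is no socket to connect to. A minimal sketch of a conf that exposes the socket (the socket path is an illustrative assumption):

```ini
[unix_http_server]
file=/var/run/supervisor.sock

[supervisord]
nodaemon=true

[rpcinterface:supervisor]
supervisor.rpcinterface_factory = supervisor.rpcinterface:make_main_rpcinterface

[supervisorctl]
serverurl=unix:///var/run/supervisor.sock

[program:daphne]
command=daphne gsigner.asgi:application
directory=/src/app/gsigner/
user=root
autostart=true
autorestart=true
```

With that in place, the container's command would first need to launch supervisord (e.g. supervisord -c /etc/supervisor/supervisord.conf) before any supervisorctl calls are made.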

I am having problems installing wkhtmltopdf in my Docker container for a Django application

I am using pdfkit in my Django application, and it seems to be working fine after I installed wkhtmltopdf on my machine.
But when I build a Docker image of my application for production and run it locally, it gives me an OSError. I have tried everything I found on the web but can't seem to install wkhtmltopdf in my Docker container.
Here's my Dockerfile for building the image; it gives an error while installing the package.
FROM python:3.6.9
RUN wget https://github.com/wkhtmltopdf/wkhtmltopdf/releases/download/0.12.1/wkhtmltox-0.12.1_linux-wheezy-amd64.deb
RUN dpkg -i ~/Downloads/wkhtmltox-0.12.1_linux-wheezy-amd64.deb
WORKDIR /usr/src/app
COPY requirements.txt ./
RUN pip install -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
(Screenshot: the error shown in the terminal while building the image.)
(Screenshot: the error without wkhtmltopdf in Docker.)
I figured it out.
My Dockerfile was missing some code.
FROM python:3.6.9
RUN wget https://s3.amazonaws.com/shopify-managemant-app/wkhtmltopdf-0.9.9-static-amd64.tar.bz2
RUN tar xvjf wkhtmltopdf-0.9.9-static-amd64.tar.bz2
RUN mv wkhtmltopdf-amd64 /usr/local/bin/wkhtmltopdf
RUN chmod +x /usr/local/bin/wkhtmltopdf
WORKDIR /usr/src/app
COPY requirements.txt ./
RUN pip install -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
Now the image is running just fine.
This Dockerfile works with Django and the newest version of wkhtmltopdf (0.12.6-1):
# pull official base image
FROM python:3.9-buster
RUN apt-get update \
&& apt-get install -y \
curl \
libxrender1 \
libjpeg62-turbo \
fontconfig \
libxtst6 \
xfonts-75dpi \
xfonts-base \
xz-utils
RUN curl "https://github.com/wkhtmltopdf/packaging/releases/download/0.12.6-1/wkhtmltox_0.12.6-1.buster_amd64.deb" -L -o "wkhtmltopdf.deb"
RUN dpkg -i wkhtmltopdf.deb
# set work directory
WORKDIR /usr/src/app
# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# install dependencies
RUN pip install --upgrade pip
COPY ./requirements.txt .
RUN pip install -r requirements.txt
# copy project
COPY . .
EXPOSE 8000
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
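A quick way to confirm inside the image that the binary actually landed somewhere usable (a small sanity-check sketch, not part of either Dockerfile above):

```python
import shutil

# pdfkit locates the wkhtmltopdf binary via a PATH lookup by default,
# so check whether the binary is discoverable on PATH
# (e.g. /usr/local/bin/wkhtmltopdf after the chmod step above).
path = shutil.which("wkhtmltopdf")
print(path if path else "wkhtmltopdf not found on PATH")
```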

Django: Dockerfile error with collectstatic

I am trying to deploy Django application with Docker and Jenkins.
I get the error:
"msg": "Error building registry.docker.si/... - code: 1 message: The command '/bin/sh -c if [ ! -d ./static ]; then mkdir static; fi && ./manage.py collectstatic --no-input' returned a non-zero code: 1"
}
The Dockerfile is:
FROM python:3.6
RUN apt-get update && apt-get install -y python-dev && apt-get install -y libldap2-dev && apt-get install -y libsasl2-dev && apt-get install -y libssl-dev && apt-get install -y sasl2-bin
ENV PYTHONUNBUFFERED 1
WORKDIR /usr/src/app
COPY requirements.txt ./
RUN pip install --no-cache-dir --upgrade pip
RUN pip install --no-cache-dir --upgrade setuptools
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
RUN chmod u+rwx manage.py
RUN if [ ! -d ./static ]; then mkdir static; fi && ./manage.py collectstatic --no-input
RUN chown -R 10000:10000 ./
EXPOSE 8080
CMD ["sh", "./run-django.sh"]
My problem is that, with the same Dockerfile, other Django projects deploy without any problem...
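collectstatic imports the full Django settings at build time, so when the same Dockerfile works for one project and not another, the usual culprit is a setting that only exists at run time in the failing project (an env-supplied SECRET_KEY, a database-backed storage backend, and so on). A debugging sketch, assuming the settings read SECRET_KEY from the environment; the variable names are illustrative:

```dockerfile
# Provide a throwaway value at build time so settings can be imported,
# and raise verbosity so the real traceback shows in the build log.
ARG DJANGO_SECRET_KEY=dummy-build-key
ENV SECRET_KEY=$DJANGO_SECRET_KEY
RUN if [ ! -d ./static ]; then mkdir static; fi && ./manage.py collectstatic --no-input --verbosity 2
```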

How to run Django Daphne service on Google Kubernetes Engine and Google Container Registry

Dockerfile
FROM ubuntu:18.04
RUN apt-get update
RUN apt-get install build-essential -y
WORKDIR /app
COPY . /app/
# Python
RUN apt-get install python3-pip -y
RUN python3 -m pip install virtualenv
RUN python3 -m virtualenv /env36
ENV VIRTUAL_ENV /env36
ENV PATH /env36/bin:$PATH
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
# Start Daphne [8443]
ENV DJANGO_SETTINGS_MODULE=settings
CMD daphne -e ssl:8443:privateKey=/ssl-cert/privkey.pem:certKey=/ssl-cert/fullchain.pem asgi:application
# Open port 8443
EXPOSE 8443
Enable Google IP Alias so that we can connect to Google Memorystore/Redis.
Build & Push
$ docker build -t [GCR_NAME] -f path/to/Dockerfile .
$ docker tag [GCR_NAME] gcr.io/[GOOGLE_PROJECT_ID]/[GCR_NAME]:[TAG]
$ docker push gcr.io/[GOOGLE_PROJECT_ID]/[GCR_NAME]:[TAG]
Deploy to GKE
$ envsubst < k8s.yml > patched_k8s.yml
$ kubectl apply -f patched_k8s.yml
$ kubectl rollout status deployment/[GKE_WORKLOAD_NAME]
I configured Daphne on GKE/GCR this way. If you have other solutions, please share your advice.
systemd is not included in the ubuntu:18.04 Docker image.
Add an ENTRYPOINT to your Dockerfile with the commands from the ExecStart property of project-daphne.service.
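A sketch of what that could look like, assuming project-daphne.service's ExecStart launches daphne with the same arguments as the CMD shown in the Dockerfile above:

```dockerfile
# Run daphne directly as the container process; systemd is not available
# inside the ubuntu:18.04 image to read the unit file.
ENTRYPOINT ["daphne", "-e", "ssl:8443:privateKey=/ssl-cert/privkey.pem:certKey=/ssl-cert/fullchain.pem", "asgi:application"]
```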