My GitLab CI pipeline keeps failing and it seems I'm stuck here. I'm actually still new to CI, so I don't know what I'm doing wrong. Any help would be appreciated. Below is my .gitlab-ci.yml file:
image: python:latest

services:
  - postgres:latest

variables:
  POSTGRES_DB: thorprojectdb
  POSTGRES_PASSWORD: $''
  POSTGRES_USER: $majesty
  POSTGRES_HOST_AUTH_METHOD: trust

# This folder is cached between builds
# http://docs.gitlab.com/ee/ci/yaml/README.html#cache
cache:
  paths:
    - ~/.cache/pip/

before_script:
  - python -V

connect:
  image: postgres
  script:
    # official way to provide password to psql: http://www.postgresql.org/docs/9.3/static/libpq-envars.html
    - export PGPASSWORD=$POSTGRES_PASSWORD
    - psql -h "postgres" -U "$POSTGRES_USER" -d "$POSTGRES_DB" -c "SELECT 'OK' AS status;"

build:
  stage: build
  script:
    - pip install -r requirements.txt
    - python manage.py migrate
  only:
    - EC-30
In my settings.py file, I have the following settings:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'projectdb',
        'HOST': 'postgres',
        'PASSWORD': ''
    }
}
But when I push to GitLab, the build process keeps failing. The pip install -r requirements.txt step runs perfectly, but once it gets to python manage.py migrate, it fails. Below is the error I get:
django.db.utils.OperationalError: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
Cleaning up project directory and file based variables
ERROR: Job failed: exit code 1
Analyzing your .gitlab-ci.yml file, it is clear that you declared the database in the file (POSTGRES_DB), but you are missing the information related to the credentials, DB_USER and DB_PASS, as described at the link below:
gitlab-config-postgres
Remember that it is good practice to declare these variables in the CI/CD section of your repository. For more information:
ci-cd-gitlab-variables
Repository with a config example:
repository-example-gitlab-postgres
services:
  - postgres

variables:
  # Configure postgres service (https://hub.docker.com/_/postgres/)
  POSTGRES_DB: custom_db
  POSTGRES_USER: custom_user
  POSTGRES_PASSWORD: custom_pass

connect:
  image: postgres
  script:
    # official way to provide password to psql: http://www.postgresql.org/docs/9.3/static/libpq-envars.html
    - export PGPASSWORD=$POSTGRES_PASSWORD
    - psql -h "postgres" -U "$POSTGRES_USER" -d "$POSTGRES_DB" -c "SELECT 'OK' AS status;"
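As a sketch of how the job's variables can flow into Django instead of being hard-coded, settings.py could read them from the environment. The fallback values and the POSTGRES_HOST / POSTGRES_PORT variable names below are assumptions for illustration, not part of the original config:

```python
import os

def database_config(env=None):
    # Read DB settings from the same variables the CI job declares.
    env = os.environ if env is None else env
    return {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': env.get('POSTGRES_DB', 'thorprojectdb'),
        'USER': env.get('POSTGRES_USER', 'postgres'),
        'PASSWORD': env.get('POSTGRES_PASSWORD', ''),
        # In GitLab CI the service container is reachable under its image name:
        'HOST': env.get('POSTGRES_HOST', 'postgres'),
        'PORT': env.get('POSTGRES_PORT', '5432'),
    }

DATABASES = {'default': database_config()}
```

This way the same settings file works in CI and locally, as long as the variables are set.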
Related
I'm very new to Docker. I am trying to run Django with MariaDB in Docker through docker-compose, but I always get this error:
I use Docker version 17.09.1-ce, build 19e2cf6, docker-compose version 1.18.0, build 8dd22a9
django.db.utils.OperationalError: (2003, 'Can\'t connect to MySQL
server on \'mariadb55\' (111 "Connection refused")')
I can connect to the db correctly after running docker-compose up db locally or remotely, and I can even run python manage.py runserver 0.0.0.0:6001 correctly in an Anaconda virtual environment, connecting to the db service in Docker by setting the parameters in settings.py like below:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'test',
        'USER': 'belter',
        # 'HOST': 'mariadb55',
        'HOST': '127.0.0.1',
        'PORT': '3302',
        'PASSWORD': 'belter_2017',
        'default-character-set': 'utf8',
        'OPTIONS': {
            'sql_mode': 'traditional',
        }
    }
}
This is my docker-compose.yml file
version: '3'
services:
  db:
    image: mariadb:5.5
    restart: always
    environment:
      - MYSQL_HOST=localhost
      - MYSQL_PORT=3306
      - MYSQL_ROOT_HOST=%
      - MYSQL_DATABASE=test
      - MYSQL_USER=belter
      - MYSQL_PASSWORD=belter_2017
      - MYSQL_ROOT_PASSWORD=123456_abc
    volumes:
      - /home/belter/mdbdata/mdb55:/var/lib/mysql
    ports:
      - "3302:3306"
  web:
    image: onlybelter/django_py35
    command: python3 manage.py runserver 0.0.0.0:6001
    volumes:
      - /mnt/data/www/mysite:/djcode
    ports:
      - "6001:6001"
    depends_on:
      - db
    links:
      - db:mariadb55
I have tried almost everything I can find, but still cannot figure it out; any help would be nice!
What I have tried:
Docker compose mysql connection failing
Linking django and mysql containers using docker-compose
Django connection to postgres by docker-compose
Finally, I figured it out!
The key point is, just as @SangminKim said, I need to use 3306 not 3302 in settings.py, and use db as HOST, not 127.0.0.1.
So this is my docker-compose.yml file now:
version: '3'
services:
  db:
    image: mariadb:5.5
    restart: always
    environment:
      - MYSQL_HOST=localhost
      - MYSQL_PORT=3306  # cannot change this port to other number
      - MYSQL_ROOT_HOST=%
      - MYSQL_DATABASE=test
      - MYSQL_USER=belter
      - MYSQL_PASSWORD=belter_2017
      - MYSQL_ROOT_PASSWORD=123456_abc
    volumes:
      - /home/belter/mdbdata/mdb55:/var/lib/mysql
    ports:
      - "3302:3306"
  web:
    image: onlybelter/django_py35
    command: python3 manage.py runserver 0.0.0.0:6001
    volumes:
      - .:/djcode
    ports:
      - "6001:6001"
    depends_on:
      - db
So now we can connect to this docker-mysql with mysql -h 127.0.0.1 -P 3302 -u root -p in a shell directly, but we have to use db and 3306 in Django's settings.py file:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'test',
        'USER': 'belter',
        # 'HOST': 'mariadb55',
        'HOST': 'db',    # <---
        'PORT': '3306',  # <---
        'PASSWORD': 'belter_2017',
        'default-character-set': 'utf8',
        'OPTIONS': {
            'sql_mode': 'traditional',
        }
    }
}
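The host/port rule above can be summed up in a small helper: from another container on the compose network you use the service name and the container port, while from the host machine you use localhost and the published port (3302 → 3306 mapping above). This is only an illustration of the rule; the inside_docker flag is a hypothetical parameter:

```python
def db_endpoint(inside_docker: bool):
    # Inside the compose network: service name + container port.
    # From the host machine: loopback + the published host port.
    return ('db', 3306) if inside_docker else ('127.0.0.1', 3302)
```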
And we can still check whether this port is open by running an extra command in the docker-compose.yml file:
...
  web:
    image: onlybelter/django_py35
    command: /bin/sh -c "python check_db.py --service-name mysql --ip db --port 3306"
    volumes:
      - .:/djcode
...
Here is the check_db.py file:
# check_db.py
import socket
import time
import argparse

"""Check if port is open, avoid docker-compose race condition"""

parser = argparse.ArgumentParser(description='Check if port is open, avoid '
                                             'docker-compose race condition')
parser.add_argument('--service-name', required=True)
parser.add_argument('--ip', required=True)
parser.add_argument('--port', required=True)
args = parser.parse_args()

# Get arguments
service_name = str(args.service_name)
port = int(args.port)
ip = str(args.ip)

# Loop until the port is open
while True:
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    result = sock.connect_ex((ip, port))
    if result == 0:
        print("{0} port is open! Bye!".format(service_name))
        break
    else:
        print("{0} port is not open! I'll check it soon!".format(service_name))
        time.sleep(3)
By the way, this is my Dockerfile for building django-py35:
FROM python:3.5-alpine
MAINTAINER Xin Xiong "xiongxin20008@126.com"
ENV PYTHONUNBUFFERED 1
RUN set -e; \
    apk add --no-cache --virtual .build-deps \
        gcc \
        libc-dev \
        linux-headers \
        mariadb-dev \
        python3-dev \
        postgresql-dev \
        freetype-dev \
        libpng-dev \
        g++ \
    ;
RUN mkdir /djcode
WORKDIR /djcode
ENV REFRESHED_AT 2017-12-25
ADD requirements.txt /djcode/
RUN pip install --no-cache-dir -r /djcode/requirements.txt
RUN pip install uwsgi
# copy . to /djcode/
ADD . /djcode/
EXPOSE 6001
See more details from here: https://github.com/OnlyBelter/django-compose
You should use the container name instead of localhost (or 127.0.0.1) in your settings.py file. Try giving the db service a container name in the docker-compose.yml file using the container_name attribute, and replace the host name in settings.py with the value of container_name. (Make sure that they are in the same network that Docker Compose creates for you.)
Run the container with this:
docker run --name mysql-latest \
-p 3306:3306 -p 33060:33060 \
-e MYSQL_ROOT_HOST='%' -e MYSQL_ROOT_PASSWORD='strongpassword' \
-d mysql/mysql-server:latest
Make sure MYSQL_ROOT_HOST='%', which means root can connect from any IP.
I'm trying to make a Django microservice with a Postgres database, and I have a problem which I haven't been able to solve for a few days. This question contains several errors, so you may ignore them and answer only the last error, which is about accessing the psql shell (the port 5432 failed: FATAL: password authentication failed for user "postgres" error section), or ignore all of them and answer the main question of how to set up a docker-compose with Django and Postgres containers.
The docker-compose.yml looks like:
version: "3.9"
services:
  # Redis
  redis:
    image: redis:7.0.4-alpine
    container_name: redis

  # rabbit
  rabbit:
    hostname: rabbit
    image: "rabbitmq:3.10.7-alpine"
    environment:
      - RABBITMQ_DEFAULT_USER=admin
      - RABBITMQ_DEFAULT_PASS=mypass
    ports:
      - "15672:15672"
      - "5672:5672"

  mongodb_container:
    image: mongo:5.0.10
    ports:
      - "27017:27017"
    depends_on:
      - redis

  # Main Database Postgres
  main_postgres_ser:
    image: postgres:14.4-alpine
    volumes:
      - ./data/db:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB= postgres # NAME
      - POSTGRES_USER= postgres # USER
      - POSTGRES_PASSWORD= postgrespass # PASSWORD
    container_name: postgres_container
    restart: always
    ports:
      # - 8000:8000 # HTTP port
      - 5432:5432 # DB port
    networks:
      - djangonetwork
    depends_on:
      - rabbit

  # Main Django Application
  main_django_ser:
    build:
      context: . #/main_ms
      dockerfile: Dockerfile_main_ms
    container_name: main_django
    command: "python manage.py runserver 0.0.0.0:8000"
    environment:
      PYTHONUNBUFFERED: 1
    ports:
      - 8000:8000
    volumes:
      - .:/main_ms
    networks:
      - djangonetwork
    depends_on:
      - main_postgres_ser
      - rabbit
    links:
      - main_postgres_ser:main_postgres_ser

networks:
  djangonetwork:
    driver: bridge

volumes:
  main_postgres_ser:
    driver: local
the Dockerfile for django service looks like:
FROM python:3.10.6-buster
ENV PYTHONUNBUFFERED=1
RUN apt-get update -y
RUN apt-get update && \
    apt-get -y install sudo
WORKDIR /main_ms
COPY requirements.txt ./main_ms/requirements.txt
RUN pip3 install -r ./main_ms/requirements.txt
and in settings.py in DATABASES I have
DATABASES = {
    'default': {
        # 'default': 'psql://postgres:postgrespass@postgres:5432/postgres',
        # 'default': 'postgres://postgres:postgrespass@postgres:5432/postgres',
        # 'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgrespass',
        # HOST should be the postgres service name
        'HOST': 'main_postgres_ser',
        'PORT': '5432',
    }
}
As far as I know, HOST should be the Postgres service name (here main_postgres_ser), because that is the DNS name on the Docker network.
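A quick way to verify that DNS claim from inside the Django container is a small resolution check. This is just a diagnostic sketch; the service name would be the one from the compose file above:

```python
import socket

def resolvable(hostname):
    # True if the current network (e.g. the compose network) can
    # resolve the given name, False otherwise.
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        return False

# Inside the Django container this should print True for the service name:
# print(resolvable('main_postgres_ser'))
```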
password authentication failed for user Error
So I built the docker-compose setup, initialized the Django project, ran docker-compose up, and inside the Django container made migrations, but when I do python manage.py migrate I get django.db.utils.OperationalError: FATAL: password authentication failed for user "postgres".
This was with just the default kept in DATABASES in settings.py ('default': 'postgres://postgres:postgrespass@main_postgres_ser:5432/postgres', following the pattern 'default': 'postgres://user:pass@host:port/dbname').
Thus I commented out everything in DATABASES (except default and ENGINE) and got rid of the last error. Note that I had to keep 'ENGINE': 'django.db.backends.postgresql' in DATABASES (to avoid another error).
Please supply the NAME or OPTIONS['service'] Error
But then again I received django.core.exceptions.ImproperlyConfigured: settings.DATABASES is improperly configured. Please supply the NAME or OPTIONS['service'] value. Note that I could also access http://localhost:8000/ in the browser, where I got Django's yellow debug page.
connections on Unix domain socket PGSQL.5432 Error
But when I uncomment 'NAME': 'postgres', in DATABASES, I get the connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"? error and no longer have access to http://localhost:8000/ in the browser.
no password supplied Error
So I uncommented HOST in DATABASES and got django.db.utils.OperationalError: fe_sendauth: no password supplied.
again password authentication failed for user Error
So I uncommented PASSWORD and got django.db.utils.OperationalError: FATAL: password authentication failed for user "root", which is the same as django.db.utils.OperationalError: FATAL: password authentication failed for user "postgres" (in the case of uncommenting 'USER': 'postgres'), so I'm back at the first step!
Inside the Django container I tried to create the user with CREATE USER postgres WITH PASSWORD 'postgrespass'; or postgres=# CREATE USER postgres WITH PASSWORD 'postgrespass'; but I got CREATE: not found. I tried it again in the postgres container and got the same result.
I also tried a last solution by adding local all postgres peer to pg_hba.conf; it still didn't work.
unrecognized service Error
sudo -u root postgresql psql resulted in sudo: postgresql: command not found, and sudo service postgresql start resulted in postgresql: unrecognized service.
psql: error: connection PGSQL.5432 Error
Then I tried to access the psql shell with docker exec -it -u postgres postgres_container psql, which gave psql: error: connection to server on socket "/var/run/postgresql/.s.PGSQL.5432" failed: FATAL: role "postgres" does not exist.
port 5432 failed: FATAL: password authentication failed for user "postgres" Error
In order to try CREATE USER user WITH PASSWORD 'pass';, I want to access the psql shell with docker-compose run --rm main_postgres_ser psql -h main_postgres_ser -U postgres -d postgres. Things seem to be fine and working because it pops up an enter-password prompt, but when I enter postgrespass, which is my password for the postgres user, I get port 5432 failed: FATAL: password authentication failed for user "postgres" ERROR: 2 in the postgres container, and in the terminal where docker-compose is up I get DETAIL: Role "postgres" does not exist..
I also specified user: postgres for the postgres service in docker-compose, as in this suggestion, and it didn't change anything.
questions
So why do I get an error even though I put in my password for the postgres user?
Or how can I make a docker-compose setup for Django and Postgres?
Thanks to @hedayat I realized that the spaces after = for environment variables in docker-compose were not the cause. The main cause was that, during the in-between step of creating the Django folders (the project and the apps), data/db gets created, and it doesn't let the authentication happen. So by deleting data/db I was able to migrate the database.
So, basically I have this docker-compose.yml config:
services:
  postgres:
    container_name: youtube_manager_postgres
    restart: always
    image: postgres:alpine
    environment:
      - POSTGRES_HOST_AUTH_METHOD=trust
      - POSTGRES_USER=admin
      - POSTGRES_PASSWORD=qwerty123
      - POSTGRES_DB=ytmanager
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    ports:
      - "5432:5432"
  django:
    container_name: youtube_manager_django
    restart: always
    build:
      context: ../
      dockerfile: deploy/django/Dockerfile
    command: sh -c "poetry run python3 manage.py migrate &&
                    poetry run python3 manage.py collectstatic --no-input --clear &&
                    poetry run uwsgi --ini /etc/uwsgi.ini"
    ports:
      - "8000:8000"
    volumes:
      - staticfiles:/code/static
      - mediafiles:/code/media
    depends_on:
      - postgres
My Django's database preferences are:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'ytmanager',
        'USER': 'admin',
        'HOST': '0.0.0.0',
        'PASSWORD': 'qwerty123',
        'PORT': '5432',
    }
}
I want to use it in two ways:
1. Running docker-compose up -d postgres and then python3 manage.py runserver (actually, poetry run python3 manage.py runserver, but for now it doesn't matter) during development.
2. Running docker-compose up during deployment.
For now it works fine with option 1, but when I execute docker-compose up I get an error:
youtube_manager_django | django.db.utils.OperationalError: could not connect to server: Connection refused
youtube_manager_django | Is the server running on host "0.0.0.0" and accepting
youtube_manager_django | TCP/IP connections on port 5432?
If I change the Django database settings this way:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'ytmanager',
        'USER': 'admin',
        'HOST': '0.0.0.0',
        'PASSWORD': 'qwerty123',
        'PORT': '5432',
    }
}
Executing docker-compose up -d postgres and then python manage.py runserver causes an error:
django.db.utils.OperationalError: could not translate host name "postgres" to address: Temporary failure in name resolution
How can I properly configure docker-compose.yml to use the same HOST in the Django settings (for example, 'HOST': '0.0.0.0' or 'HOST': 'postgres' for both options)?
I've tried using network_mode: host on my django and postgres services. It works fine, but is there any other way to solve the problem (for example, using networking settings)? I've read the docker-compose documentation on their website, but can't figure out what's going on there.
I think you are mixing development and production environments (by the way, the second time you pasted your Django database settings you probably meant 'HOST': 'postgres').
So, if I'm not wrong:
In development you want 'HOST': '0.0.0.0' in your Django settings .py file, since I think you are executing python manage.py runserver outside Docker, keeping Postgres in Docker.
In production you would want the same in your Django settings .py file, but to make it work you need 'HOST': 'postgres' (matching the name of the service in the compose file) and to run everything on Docker (executing the whole compose file as it is: docker-compose up). In this case Django can't reach the '0.0.0.0' database host, since it is running containerized and that IP doesn't bind to any service, so it needs the IP or the name of the service, 'postgres'.
Possible solution:
In my opinion the solution is having two .yml files to be called by Docker (e.g. docker-compose -f docker-compose-development.yml up):
docker-compose-development.yml
docker-compose-production.yml
In each .yml file you can use different env variables or settings to cover differences between development and production in a clean way.
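One way to cover that development/production difference without duplicating settings files is to read the host from an environment variable that each .yml file sets differently. A minimal sketch, where DJANGO_DB_HOST is an assumed variable name (not from the original question):

```python
import os

def db_host(env=None):
    # Production compose file would set DJANGO_DB_HOST=postgres;
    # development leaves it unset and falls back to the local host.
    env = os.environ if env is None else env
    return env.get('DJANGO_DB_HOST', '0.0.0.0')

# In settings.py: DATABASES['default']['HOST'] = db_host()
```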
You can have a look at:
https://github.com/pydanny/cookiecutter-django. It is a template Django project using Docker.
It follows "The Twelve Factors" app methodology:
https://12factor.net/
In short:
Environment variables are set in files under the .envs folder.
in the compose .yml files you point to them in order to load the environment variables:
env_file:
  - ./.envs/.production/.postgres
Django settings .py files get access to the env variables using the django-environ package.
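To give a sense of what loading DATABASES from a single DATABASE_URL variable looks like: django-environ's env.db() does this properly; the function below is a simplified stand-in for illustration, not the package's implementation:

```python
from urllib.parse import urlparse

def parse_database_url(url):
    # Minimal sketch of DATABASE_URL parsing, e.g.
    # postgres://user:pass@host:port/dbname
    p = urlparse(url)
    return {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': p.path.lstrip('/'),
        'USER': p.username,
        'PASSWORD': p.password,
        'HOST': p.hostname,
        'PORT': p.port or 5432,
    }
```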
Before editing Django DB config, please ensure the following:
Both containers are running in the same network.
Postgres service is up and running in the container.
Service 'postgres' is accessible from the webapp container. For this you can login to the container and perform a ping test.
docker exec -it containerID /bin/bash (in some cases /bin/sh) # to login to container
Solutions:
Similar:
Django connection to postgres by docker-compose
For you to connect to the DB using the service name in Django: as per the documentation, if the HOST and PORT keys are left out of the dictionary, then Django will try connecting using the complete "NAME" as a SID.
Hope it helps
Dockerfile
FROM python:3.8
ENV PYTHONUNBUFFERED 1
ENV WORKDIR /usr/src/app
WORKDIR ${WORKDIR}
RUN pip install --upgrade pip && pip install pipenv
COPY ./Pipfile* ${WORKDIR}/
RUN pipenv lock --requirements > requirements.txt
RUN pip install -r requirements.txt
ADD . ${WORKDIR}/
docker-compose.yml
version: '3'
services:
  database:
    image: postgres:latest
    environment:
      POSTGRES_PASSWORD: development
      POSTGRES_USER: development
      POSTGRES_DB: dev_db
    ports:
      - 5432:5432
  backend:
    build: .
    command: python /usr/src/app/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/usr/src/app
    ports:
      - 8000:8000
    depends_on:
      - database
Django database configurations
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'dev_db',
        'USER': 'development',
        'PASSWORD': 'development',
        'HOST': 'database',
        'PORT': 5432,
    }
}
However, when I start docker compose, it gives the following message:
> Is the server running on host "database" (172.29.0.2) and accepting
TCP/IP connections on port 5432?
Can anyone provide some guidance on why I'm getting this error? As far as I know, both services are inside the same network.
This is probably because when the Django instance is ready, Postgres is still starting up.
A common solution is to use a script like wait-for to delay the execution of a command until a service is replying on a specific host/port.
Take a look at the Docker documentation about controlling the startup order of multiple containers.
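A minimal version of such a wait loop in Python might look like this. It is a sketch of the wait-for idea under the assumption that a plain TCP connect is enough to detect readiness, not the wait-for script itself:

```python
import socket
import time

def wait_for_port(host, port, timeout=60.0, interval=1.0):
    # Poll until a TCP connection succeeds or the timeout expires.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:
            time.sleep(interval)
    return False

# e.g. in an entrypoint, before running migrations:
# if not wait_for_port('database', 5432):
#     raise SystemExit('database never came up')
```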
I think you are missing the hostname in the yml file. See below
services:
  database:
    image: postgres:latest
    hostname: database
    environment:
      POSTGRES_PASSWORD: development
      POSTGRES_USER: development
      POSTGRES_DB: dev_db
    ports:
      - 5432:5432
I have the following .gitlab-ci.yml:
image: python:3.6

stages:
  - lint
  - test

services:
  - postgres:10.1-alpine

cache:
  paths:
    - /root/.local/share/virtualenvs/

before_script:
  - python -V
  - pip install pipenv
  - pipenv install --dev

lint:
  stage: lint
  script:
    - pipenv run pylint --output-format=text --load-plugins pylint_django project/ | tee pylint.txt
    - score=$(sed -n 's/^Your code has been rated at \([-0-9.]*\)\/.*/\1/p' pylint.txt)
    - echo "Pylint score was $score"
    - pipenv run anybadge --value=$score --file=pylint.svg pylint
  artifacts:
    paths:
      - pylint.svg

test:
  stage: test
  script:
    - pipenv run python manage.py test
And I am connecting to the database like this:
# settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': '',
        'HOST': 'db',  # set in docker-compose.yml
        'PORT': 5432   # default postgres port
    }
}
For now, I just have these tests in users/tests.py:
from .models import CustomUser
from django.test import TestCase

class LogInTest(TestCase):
    def setUp(self):
        self.credentials = {
            'username': 'testuser',
            'password': 'secret'}
        CustomUser.objects.create_user(**self.credentials)

    def testLogin(self):
        # send login data
        response = self.client.post('/users/login/', self.credentials, follow=True)
        # should be logged in now
        self.assertTrue(response.context['user'].is_authenticated)
Which fails with the following error:
psycopg2.OperationalError: could not translate host name "db" to address: Name or service not known
In development, I use the following docker-compose.yml file :
version: '3.3'

services:
  db:
    image: postgres:10.1-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
  web:
    build: .
    command: python /code/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - 8000:8000
    environment:
      - SECRET_KEY=changemeinprod
    depends_on:
      - db

volumes:
  postgres_data:
With this Dockerfile :
FROM python:3.6
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# Set work directory
WORKDIR /code
# Install dependencies
RUN pip install --upgrade pip
RUN pip install pipenv
COPY ./Pipfile /code/Pipfile
RUN pipenv install --deploy --system --skip-lock --dev
# Copy project
COPY . /code/
I don't understand why the app can't connect to the database in CI, but connects just fine in the development environment with Docker.
In your docker-compose.yml you set the service name as db, and Django's settings.py uses db, but GitLab will use the image name as the name of the service, which is postgres in this case.
You have two options:
1 - Use environment variables in settings.py; some example here
2 - Set a alias in gitlab-ci.yml, like:
services:
  - name: postgres:10.1-alpine
    alias: db
link: gitlab docs
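For option 1, a minimal sketch of settings.py reading the host from the environment, so docker-compose and GitLab CI can each supply the right value. DATABASE_HOST is an assumed variable name, not something either config above defines:

```python
import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': '',
        # 'db' works for docker-compose; a GitLab CI job variable
        # can override it (or the service alias makes 'db' work there too).
        'HOST': os.environ.get('DATABASE_HOST', 'db'),
        'PORT': 5432,
    }
}
```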