I have built my Postgres and Django application using the following docker-compose.yml:
version: "3.8"
services:
  django:
    build: .
    container_name: django
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/usr/src/app
    ports:
      - "8000:8000"
    depends_on:
      - db
  db:
    image: postgres
    container_name: pgdb
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
When I check Docker Desktop, I see two containers, "django" and "pgdb".
When I check the django container's logs, it says:
django.core.exceptions.ImproperlyConfigured: The SECRET_KEY setting must not be empty.
Originally, on my Windows 10 machine, I saved the secret key in a Windows environment variable. How do I set up docker-compose so the container has the secret key?
SECRET_KEY = os.environ.get('DJANGO_SECRET_KEY')
You would need to create a .env file with the SECRET_KEY.
In django_secrets.env you can store it like this:
SECRET_KEY=my_secret_key
Then in the docker-compose.yml file you can specify the django_secrets.env file:
version: "3.8"
services:
  django:
    build: .
    container_name: django
    command: python manage.py runserver 0.0.0.0:8000
    env_file:
      - ./django_secrets.env
    volumes:
      - .:/usr/src/app
    ports:
      - "8000:8000"
    depends_on:
      - db
  db:
    image: postgres
    container_name: pgdb
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
And then you can get the value in the settings.py file like this:
SECRET_KEY = os.environ.get("SECRET_KEY", 'my_default_secret_key')
You can keep the django_secrets.env file at any path; you just need to specify that path in the docker-compose.yml file. You can also name it whatever you like.
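Since the key originally lived in a Windows environment variable, another option (a sketch, not part of the answer above) is to let Compose forward the host variable directly: Compose substitutes `${DJANGO_SECRET_KEY}` from the shell environment where you run `docker-compose up` (or from a `.env` file next to docker-compose.yml). The variable name `DJANGO_SECRET_KEY` here is assumed to match what was set on the host:

```yaml
services:
  django:
    environment:
      # ${DJANGO_SECRET_KEY} is substituted from the host environment
      # at compose time, so no env file needs to hold the secret
      - SECRET_KEY=${DJANGO_SECRET_KEY}
```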
I have a problem with the initial launch of docker-compose up: when the DB is not initialized yet, Django throws an error.
I tried to use restart_policy, but it didn't help; the web service restarted almost immediately, before the DB service was ready, regardless of what delay I set:
version: "3.9"
services:
  web:
    build: .
    command: bash -c "python manage.py makemigrations && python manage.py migrate && python manage.py runserver 0.0.0.0:8000"
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    deploy:
      restart_policy:
        condition: on-failure
        delay: 15s
    environment:
      - POSTGRES_NAME=${POSTGRES_DB}
      - POSTGRES_USER=${POSTGRES_USER}
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
      - POSTGRES_HOST=db
    depends_on:
      - db
  db:
    container_name: db_pg
    image: postgres
    hostname: postgres
    environment:
      POSTGRES_DB: ${POSTGRES_DB}
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_HOST: db
    volumes:
      - ./data/db:/var/lib/postgresql/data
    restart: unless-stopped
  pgadmin:
    image: dpage/pgadmin4
    depends_on:
      - db
    ports:
      - "5555:80"
    environment:
      PGADMIN_DEFAULT_EMAIL: pgadmin4@pgadmin.org
      PGADMIN_DEFAULT_PASSWORD: admin
    volumes:
      - ./data/pgadmin:/var/lib/pgadmin/data
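One likely reason the delay had no effect: `restart_policy` under `deploy:` only applies in swarm mode and is ignored by a plain `docker-compose up`. A common alternative (a sketch, assuming a recent Docker Compose that supports the long `depends_on` syntax) is to give the db a healthcheck and make web wait for it to pass:

```yaml
services:
  db:
    image: postgres
    healthcheck:
      # pg_isready exits 0 once Postgres accepts connections
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}"]
      interval: 5s
      timeout: 5s
      retries: 10
  web:
    depends_on:
      db:
        # start web only after the healthcheck succeeds
        condition: service_healthy
```

With this, migrations only run once Postgres is actually ready, instead of relying on restart loops.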
I'm trying to import a PostgreSQL dump into a docker container, but it doesn't work.
Dockerfile:
FROM postgres
COPY postgres.sql /docker-entrypoint-initdb.d/
docker-compose.yml:
version: "3.9"
services:
  db:
    build: ./DB
    volumes:
      - ./data/db:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=gamenews
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=321678
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    environment:
      - POSTGRES_NAME=gamenews
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=321678
    depends_on:
      - db
Project structure: [screenshot]
docker compose up logs: [screenshot]
I would suggest that you use a proper Postgres image:
postgres:
  image: postgres:13
  volumes:
    - '.:/app:rw'
    - 'postgres:/var/lib/postgresql/data'
Here's a list of all the tags you can use: https://hub.docker.com/_/postgres
Just spin that up; the volume maps the data to your host disk, so it's not ephemeral in the container, and then you can run pg_restore on your dump file.
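Loading the dump could then look like this (a sketch, assuming the compose service is named `postgres`, the database and user are both `postgres`, and the file names match the question; adjust to your setup):

```shell
# plain-SQL dump: pipe it through psql inside the container
docker compose exec -T postgres psql -U postgres -d postgres < postgres.sql

# custom-format dump (made with pg_dump -Fc): use pg_restore instead
docker compose cp postgres.dump postgres:/tmp/postgres.dump
docker compose exec postgres pg_restore -U postgres -d postgres /tmp/postgres.dump
```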
I'm trying to initialize my PostgreSQL database using an SQL script in the docker-entrypoint-initdb.d folder.
I get no error message, but the database is not initialized, even if I remove the container and rebuild.
What is wrong with my docker-compose.yml file?
docker-compose.dev.yml
version: '3.7'
services:
  web:
    restart: always
    container_name: coverage-africa-web-dev
    build:
      context: ./app
      dockerfile: Dockerfile.dev
    restart: always
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./app:/usr/src/app
      - ./docker-entrypoint-initdb.d/initdata.dev.sql:/docker-entrypoint-initdb.d/initdata.dev.sql
    ports:
      - 8000:8000
    env_file:
      - ./.env.dev
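One thing that stands out (a guess at the cause, since no error text is shown): the SQL script is mounted into the `web` (Django) container, but `/docker-entrypoint-initdb.d` is only read by the official postgres image's entrypoint, and only on first start, when the data directory is still empty. A sketch of moving the mount onto a postgres service:

```yaml
services:
  db:
    image: postgres
    env_file:
      - ./.env.dev
    volumes:
      # init scripts run only on the first start, when the
      # data directory is empty; remove any old data volume first
      - ./docker-entrypoint-initdb.d/initdata.dev.sql:/docker-entrypoint-initdb.d/initdata.dev.sql
```

If a named or host volume already holds initialized data, the script is skipped entirely; deleting that volume forces the init scripts to run again.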
I don't know what part I am missing, but Celery is not connecting to Redis when I run docker-compose up --build:
error: Cannot connect to redis://127.0.0.1:6379/0: Error 111 connecting to 127.0.0.1:6379. Connection refused.
Here is my docker-compose.yml file:
version: '3'
services:
  web:
    build: .
    image: resolution
    depends_on:
      - db
      - redis
      - celery
    command: bash -c "python3 /code/manage.py migrate && python3 /code/manage.py initialsetup && python3 /code/manage.py runserver 0.0.0.0:8000"
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    links:
      - db:db
      - redis:redis
      - celery:celery
    restart: always
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - PGHOST=trust
      - PGPORT=5432
  db:
    image: postgres:latest
    environment:
      POSTGRES_DB: 'postgres'
      POSTGRES_PASSWORD: 'postgres'
      POSTGRES_USER: 'postgres'
      POSTGRES_HOST: 'trust'
  redis:
    image: "redis:alpine"
    ports:
      - "6379:6379"
    restart: on-failure
  celery:
    image: resolution
    command: celery -A mayan worker -l info
    environment:
      - DJANGO_SETTINGS_MODULE=mayan.settings.production
    volumes:
      - .:/code
    depends_on:
      - db
      - redis
    links:
      - redis:redis
    restart: on-failure
Celery and Redis are running in different containers.
According to the error message you shared, most likely your Celery is trying to connect to localhost to reach Redis, but Redis is not on localhost from inside the Celery container.
Search for the Celery configuration file that contains the CELERY_BROKER_URL and CELERY_RESULT_BACKEND values. Most likely they look like this:
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
They should look like this, pointing to the redis service name that you defined in your compose file:
CELERY_BROKER_URL = 'redis://redis:6379'
CELERY_RESULT_BACKEND = 'redis://redis:6379'
If you don't have such a config, search directly for the place where the Celery instance is initialized and make sure it looks like this:
app = Celery('server', broker='redis://redis:6379/0')
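A related option (a sketch, not part of the answer above): read the broker URL from an environment variable so the same code works inside and outside Compose. `CELERY_BROKER_URL` here is an assumed variable name you would set in the compose file's `environment:` section:

```python
import os

def broker_url(env=os.environ):
    # Prefer an explicit CELERY_BROKER_URL from the environment;
    # otherwise fall back to the compose service name "redis",
    # which resolves on the compose network (never localhost).
    return env.get("CELERY_BROKER_URL", "redis://redis:6379/0")

# e.g. app = Celery('server', broker=broker_url())
```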
I'm building an online book store project in Django, and when I try to set up an environment variable I run into a problem.
My docker-compose.yml file looks like:
version: '3.7'
services:
  web:
    build: .
    command: python /code/manage.py runserver 0.0.0.0:8000
    environment:
      - SECRET_KEY=secret_key
    volumes:
      - .:/code
    ports:
      - 8000:8000
    depends_on:
      - db
  db:
    hostname: db
    image: postgres:11
    volumes:
      - postgres_data:/var/lib/postgresql/data
    environment:
      - "POSTGRES_HOST_AUTH_METHOD=trust"
    ports:
      - "5432:5432"
volumes:
  postgres_data:
and my settings.py:
SECRET_KEY = os.environ.get('SECRET_KEY')
The secret key is:
SECRET_KEY=-)hy!+7txe6))yhcy4o3ruxj(gy(vwx)6^h&+-i*=0f$4q(&bh
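One thing worth checking (an assumption about the failure, since no error text is shown): the compose file sets SECRET_KEY to the literal string `secret_key`, and the real key contains `$`, which Compose treats as the start of a variable substitution. A sketch of inlining the real key safely, quoting the value and doubling the `$`:

```yaml
services:
  web:
    environment:
      # quote the value and write $$ so Compose does not try to
      # substitute a variable named "4q" from the host environment
      - "SECRET_KEY=-)hy!+7txe6))yhcy4o3ruxj(gy(vwx)6^h&+-i*=0f$$4q(&bh"
```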