I want to integrate MySQL with Django. MySQL is running in Docker, and I use the following config to connect Django to the database container:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'restaurant',
        'HOST': 'db',
        'PORT': '3306',
        'USER': 'root',
        'PASSWORD': 'root',
        'default-character-set': 'utf8',
        'OPTIONS': {
            'init_command': 'SET default_storage_engine=INNODB',
        },
    }
}
But when Django tries to connect to the MySQL database, it throws this error:
I tried to install mysqlclient with pip, but I get this error:
These are the docker-compose.dev.yml and Dockerfile configs.
If someone needs the complete code, you can find it here and test it with docker-compose -f docker-compose.dev.yml up --build.
Thanks :).
mysqlclient has native dependencies that must be installed before you can pip install it. When running in Docker, and especially on Alpine, you probably want to switch to mysql-connector-python, which is a pure-Python library without native dependencies like mysqlclient. Update your requirements file, and change the ENGINE in your settings to mysql.connector.django if you use mysql-connector-python.
Alternatively, keep mysqlclient and install its native build dependencies in the Dockerfile:
RUN apk add --no-cache bash mariadb-dev mariadb-client mariadb-libs python3-dev build-base
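If you do switch to mysql-connector-python, the settings change is essentially just the ENGINE value. A sketch reusing the credentials from the question (mysql.connector.django is the backend shipped by mysql-connector-python, not a Django built-in):

```python
# settings.py sketch for mysql-connector-python instead of mysqlclient.
# Only ENGINE and the charset option differ from the question's config.
DATABASES = {
    'default': {
        'ENGINE': 'mysql.connector.django',  # backend bundled with mysql-connector-python
        'NAME': 'restaurant',
        'HOST': 'db',
        'PORT': '3306',
        'USER': 'root',
        'PASSWORD': 'root',
        'OPTIONS': {
            'charset': 'utf8',  # character set belongs inside OPTIONS, not at the top level
        },
    }
}
```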
My Django project runs normally on localhost and on Heroku, but when I deployed it to Google Cloud Platform I get this error:
could not connect to server: Connection refused
Is the server running locally and accepting connections on Unix domain socket "/cloudsql/<connection_name>/.s.PGSQL.5432"?
The connection to the database in settings.py looks like this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'database_name',
        'USER': 'postgres',
        'PASSWORD': 'password',
        # production
        'HOST': '/cloudsql/connection_name',
        'PORT': '5432',
    }
}
Additionally, my app.yaml looks like this:
runtime: python37

handlers:
- url: /static
  static_dir: static/
- url: /.*
  script: auto

env_variables:
  DJANGO_SETTINGS_MODULE: fes_app.settings
The tail of my requirements.txt looks like this:
sqlparse==0.4.2
toml==0.10.2
uritemplate==4.1.1
urllib3==1.25.11
whitenoise==5.2.0
twilio==6.9.0
I have tried using the binary version of psycopg2, and I also granted the Cloud SQL Client role to the service account.
**Note:** I am using an App Engine standard environment.
We'd likely need more information to help you out, but did you follow the tutorial below from the official Google docs? That's usually how I get started, and then I make modifications from there.
I would compare how Google is deploying a Django app to your own settings and see what's missing. For example, your requirements.txt file does not look complete (unless you only pasted part of it) so I would start there.
https://cloud.google.com/python/django/appengine
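One common pitfall the tutorial covers: the unix-socket HOST (/cloudsql/...) only works on App Engine itself; when running locally you need to connect over TCP (for example through the Cloud SQL proxy). A hedged sketch of how settings.py can switch between the two; GAE_APPLICATION is an environment variable the App Engine runtime sets, and the connection name here is a placeholder, not taken from the question:

```python
import os

# Pick the database HOST depending on where the code is running.
# On App Engine standard, Cloud SQL is reached via a unix domain socket;
# locally, fall back to TCP (e.g. the Cloud SQL proxy on 127.0.0.1).
def database_host(connection_name, local_host='127.0.0.1'):
    if os.getenv('GAE_APPLICATION'):
        # App Engine runtime: use the unix domain socket path.
        return '/cloudsql/{}'.format(connection_name)
    return local_host
```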
I need to install and configure PostgreSQL and Django in one container; I can't use docker-compose because my hosting doesn't support it.
I tried the following Dockerfile:
FROM ubuntu:16.04
RUN apt-key adv --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys B97B0AFCAA1A47F044F244A07FCC7D46ACCC4CF8
RUN echo "deb http://apt.postgresql.org/pub/repos/apt/ xenial-pgdg main" > /etc/apt/sources.list.d/pgdg.list
RUN apt-get update && apt-get install -y python-software-properties software-properties-common postgresql-9.3 postgresql-client-9.3 postgresql-contrib-9.3
USER postgres
RUN /etc/init.d/postgresql start &&\
    psql --command "CREATE USER docker WITH SUPERUSER PASSWORD 'postgres';" &&\
    createdb -O docker docker
RUN echo "host all all 0.0.0.0/0 md5" >> /etc/postgresql/9.3/main/pg_hba.conf
RUN echo "listen_addresses='*'" >> /etc/postgresql/9.3/main/postgresql.conf
# Expose the PostgreSQL port
EXPOSE 5432
# Add VOLUMEs to allow backup of config, logs and databases
VOLUME ["/etc/postgresql", "/var/log/postgresql", "/var/lib/postgresql"]
CMD ["/usr/lib/postgresql/9.3/bin/postgres", "-D", "/var/lib/postgresql/9.3/main", "-c", "config_file=/etc/postgresql/9.3/main/postgresql.conf"]
FROM python
ENV PYTHONUNBUFFERED 1
RUN mkdir /app
WORKDIR /app/roche
COPY requirements.txt /app/roche
RUN pip install -r requirements.txt
COPY . /app/roche
CMD ["python3", "manage.py", "runserver", "0.0.0.0:8000"]
but I get this error
This is my Postgres configuration in settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'postgres',
        'USER': 'docker',
        'PASSWORD': 'postgres',
        'HOST': 'localhost',
        'PORT': 5432,
    }
}
This is the link to the answer explaining why I can't use more than one container on Plesk:
https://talk.plesk.com/threads/networking-between-docker-containers.353312/
Thanks to all who read and respond.
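One thing worth noting about the Dockerfile above: two FROM lines make it a multi-stage build, so everything installed in the PostgreSQL stage is discarded and only the final Python stage ends up in the image. A single-container setup would instead use one base image plus an entrypoint that starts PostgreSQL before Django. A rough sketch only; package names and the entrypoint path are assumptions, not from the question:

```dockerfile
# One image for both PostgreSQL and Django (sketch, not production-ready).
FROM python:3.7
RUN apt-get update && apt-get install -y postgresql postgresql-contrib
WORKDIR /app/roche
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
# entrypoint.sh (assumed to exist in the build context) would contain:
#   service postgresql start
#   python3 manage.py runserver 0.0.0.0:8000
COPY entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh
CMD ["/entrypoint.sh"]
```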
I'm trying to create a docker setup such that I can easily build and deploy a geodjango app (with postgis backend). I have the following folder structure:
|-- Dockerfile
|-- Pipfile
|-- Pipfile.lock
|-- README.md
|-- app
| |-- manage.py
| |-- app
| `-- app_web
In my Dockerfile to setup Django I have the following:
# Pull base image
FROM python:3.7
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# install dependencies
RUN pip install pipenv
COPY . /code
WORKDIR /code/
RUN pipenv install --system
# Setup GDAL
RUN apt-get update &&\
    apt-get install -y binutils libproj-dev gdal-bin python-gdal python3-gdal
# set work directory
WORKDIR /code/app
CMD ["python", "manage.py", "migrate", "--no-input"]
In my docker-compose.yml file:
version: '3.7'

services:
  postgis:
    image: kartoza/postgis:12.1
    volumes:
      - postgres_data:/var/lib/postgresql/data/
  web:
    build: .
    command: python /code/app/manage.py runserver 0.0.0.0:8000
    ports:
      - 8000:8000
    volumes:
      - .:/code
    depends_on:
      - postgis

volumes:
  postgres_data:
And finally in settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'postgres',
        'USER': 'postgres',
        'HOST': 'postgis',
    },
}
Now when I run docker-compose up --build, it all seems to work (both the database and the Django app spin up their containers). But whenever I try to actually work with the database (the app is blank right now, so I still need to migrate), Django doesn't seem to recognize the database.
Can anybody tell me what is going wrong?
run docker-compose up --build It all seems to work (both the database as well as the django app spin up their containers)
Make sure to check the logs, because containers don't necessarily stop after hitting an error:
docker-compose logs
Also, you're already setting your working directory in the Dockerfile:
WORKDIR /code/app
so take a look at the command you execute in docker-compose.yml:
command: python /code/app/manage.py runserver 0.0.0.0:8000
Did you install the app?
INSTALLED_APPS = [
    # [...]
    'django.contrib.gis',
]
In settings.py, PASSWORD needs to be specified:
DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'postgis',
    },
}
I'm starting to develop an application with a Django back-end, and I wish to do it inside Docker. I almost managed, but I'm still having an issue. Currently I have two containers running:
The first one contains my Django app, and the complete command is
python3 manage.py runserver 0.0.0.0:8000
The second one hosts my database.
My docker-compose.yml file is this one :
version: '3'

services:
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: root
      MYSQL_DATABASE: ml_gui
  back:
    build: ./back/
    command: python3 manage.py runserver
    ports:
      - "8000:8000"
    depends_on:
      - db
and my django settings concerning the database is :
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'ml_gui',
        'USER': 'root',
        'PASSWORD': 'root',
        'HOST': 'db',
        'PORT': '3306',
        'OPTIONS': {
            'init_command': "SET sql_mode='STRICT_TRANS_TABLES'",
        },
        'TEST': {
            'CHARSET': 'utf8',
            'COLLATION': 'utf8_general_ci',
        },
    },
}
The problem is, when I make requests from outside the container (I've tried my browser, curl, and Postman) to localhost:8000, I get no answer. But when I make the same request inside the container with curl, it works.
How can I make those requests work from outside the containers?
You should never run ./manage.py runserver in production; it is only for testing. Also, if you don't specify the IP address the server listens on, it is available only on localhost (see here).
So in your docker-compose.yml the command should be:
./manage.py runserver 0:8000
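Binding to 0.0.0.0 (or the shorthand 0) matters because the published port forwards to the container's network interface, not to the container's localhost. Applied to the compose file from the question, the service would become:

```yaml
back:
  build: ./back/
  command: python3 manage.py runserver 0.0.0.0:8000
  ports:
    - "8000:8000"
  depends_on:
    - db
```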
I have a working Django project that deploys fine with Heroku, but I am having trouble getting the app to deploy on GAE. When I run it locally, I get an error referring to an improperly configured database backend.
Any help would be appreciated.
Error:
...
raise ImproperlyConfigured(error_msg)
ImproperlyConfigured: 'postgresql' isn't an available database backend.
Try using django.db.backends.XXX, where XXX is one of:
'dummy', 'mysql', 'oracle', 'postgresql_psycopg2', 'sqlite3'
Error was: No module named postgresql.base
...
app.yaml
runtime: python27
api_version: 1
threadsafe: true

handlers:
- url: /.*
  script: main.app

libraries:
- name: django
  version: "latest"

beta_settings:
  cloud_sql_instances: <cloudsql-connection-string>
settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'xxx',
        'USER': '*****',
        'PASSWORD': '******',
        'HOST': 'xx.xx.xx.xx',
        'PORT': '5432',
    }
}
If I change the Engine to
'ENGINE': 'django.db.backends.postgresql_psycopg2'
I get the error:
ImportError: No module named psycopg2.extensions
pip freeze returns:
Django==1.11.4
psycopg2==2.7.3.1
pytz==2017.2
The GAE standard environment does not allow psycopg2, and my original app.yaml (above) instructed GAE to allocate a standard-environment instance. This disconnect was highlighted by Dan's comments (above). The fix is either to 1) change the app so it can deploy on GAE standard, or 2) change app.yaml to deploy on GAE flexible (according to my current understanding). The following pages helped with the second option: https://cloud.google.com/appengine/docs/flexible/python/upgrading and https://cloud.google.com/appengine/docs/flexible/python/testing-and-deploying-your-app.
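For the second option, a minimal app.yaml for the flexible environment might look like the sketch below; the entrypoint module and Python version are assumptions, and the Cloud SQL connection string stays a placeholder as in the original:

```yaml
# app.yaml sketch for the App Engine flexible environment,
# where native packages like psycopg2 can be installed.
runtime: python
env: flex
entrypoint: gunicorn -b :$PORT mysite.wsgi
runtime_config:
  python_version: 3
beta_settings:
  cloud_sql_instances: <cloudsql-connection-string>
```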