Django can't drop the test DB via pgbouncer

I'm using pgbouncer with Django. I've added a test_foo database to its config so that I can run tests, because apparently Django can't use a different port for the test DB. Now the tests run, but at the end, when Django tries to drop the test DB, I get:
django.db.utils.DatabaseError: database "test_foo" is being accessed by other users
DETAIL: There are 1 other session(s) using the database.
I suppose that is caused by the open connection stored by pgbouncer. What can I do?

This is not a perfect solution, but it does the trick. You can force Django to use different database settings when running unit tests by adding this to your settings.py:
import sys

if 'test' in sys.argv or 'test_coverage' in sys.argv:
    # Use 5432 as the DB port (avoid going through pgbouncer, which can't drop the test DB).
    DATABASES = {
        'default': {
            'ENGINE': 'django.contrib.gis.db.backends.postgis',
            'NAME': 'xxx',
            'USER': 'xxx',
            'PASSWORD': 'xxx',
            'HOST': '',
            'PORT': '5432',
        },
    }

Related

Django - Postgres connection

I am a super beginner, but I want to learn super fast by building a web application.
I am currently developing an Income-Expense web app with the Django framework (Python, JS and Ajax).
I am now stuck with the server and get different errors. Can anyone help me?
ERROR
"django.db.utils.OperationalError: connection to server on socket "/tmp/.s.PGSQL.5432" failed: fe_sendauth: no password supplied"
I think I didn't shut everything down properly, and when I came back my virtual environment was not working.
Thank you. I don't know what else to try.
To resolve the issue, make sure that you have the correct credentials to access your PostgreSQL database. You may need to update the database settings in your Django settings.py file to include the correct database name, username, password, host, and port information.
For example:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'mydatabase',
        'USER': 'mydatabaseuser',
        'PASSWORD': 'mypassword',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}

Connecting to a remote db through a jump server in Django

I'm trying to tunnel the DB connection of a Django application through a jump server, but I can't seem to get it working because Django's manage.py handles and processes the connections.
Here's what I have in settings.py:
import io
import os

import paramiko
import sshtunnel

# Process the SSH key first.
ssh_key = os.getenv('SSH_KEY', '').encode('utf8').decode('unicode_escape')

server = {}
with sshtunnel.open_tunnel(
    (os.environ.get('SSH_HOST'), int(os.getenv('SSH_PORT'))),
    ssh_pkey=paramiko.RSAKey.from_private_key(io.StringIO(ssh_key)),
    ssh_username=os.environ.get('SSH_USERNAME'),
    remote_bind_address=(os.environ.get('DB_HOST'), int(os.getenv('DB_PORT'))),
) as ssh_proxy_host:
    server = {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'HOST': 'localhost',
        'PORT': ssh_proxy_host.local_bind_port,
        'NAME': os.environ.get('DB_NAME'),
        'USER': os.environ.get('DB_USER'),
        'PASSWORD': os.environ.get('DB_PASS'),
    }
    # Here is where I should have the connection function to the DB,
    # but I don't know if Django has that option available.
The true remote host is specified upon tunnel creation. The HOST used for the db connection should be localhost because you need it to find your end of the tunnel, not the other end of it.
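Since the settings module is imported once per process, one approach that keeps the tunnel alive is to start it without the with block, so it stays open for the life of the Django process. A minimal sketch, assuming the same environment variables as in the question and sshtunnel's SSHTunnelForwarder:

import io
import os

import paramiko
import sshtunnel

ssh_key = os.getenv('SSH_KEY', '').encode('utf8').decode('unicode_escape')

# Start the tunnel once at import time and leave it running; the `with`
# block in the question closes the tunnel as soon as settings.py finishes.
tunnel = sshtunnel.SSHTunnelForwarder(
    (os.environ['SSH_HOST'], int(os.environ['SSH_PORT'])),
    ssh_pkey=paramiko.RSAKey.from_private_key(io.StringIO(ssh_key)),
    ssh_username=os.environ['SSH_USERNAME'],
    remote_bind_address=(os.environ['DB_HOST'], int(os.environ['DB_PORT'])),
)
tunnel.start()

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'HOST': '127.0.0.1',             # your end of the tunnel
        'PORT': tunnel.local_bind_port,  # local port chosen by sshtunnel
        'NAME': os.environ.get('DB_NAME'),
        'USER': os.environ.get('DB_USER'),
        'PASSWORD': os.environ.get('DB_PASS'),
    }
}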

How to connect two Django apps to one PostgreSQL DB

I have a little question.
I have an app in Django REST framework with PostgreSQL (the project is called djangoFall), and I built another Django project called djangoRuim, but I don't know how to connect to and read the tables in djangoRuim. For example:
djangoFall connects to PostgreSQL and it works:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'colonybitdb0',
        'USER': 'postgres',
        'PASSWORD': 'root2017',
        'HOST': '127.0.0.1',
        'PORT': '5432',
    }
}
Here I can read tables like this:
from djangoFall.profile_clbt.models import HelperNotificationMsg
djangoRuim connects to the same PostgreSQL DB and the connection works, but I don't know how to read the tables:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'colonybitdb0',
        'USER': 'postgres',
        'PASSWORD': 'root2017',
        'HOST': '127.0.0.1',
        'PORT': '5432',
    }
}
Here I can't read the same table:
from .models import HelperNotificationMsg  # wrong, right?
because here I don't have the models.
Please help me: how do I read these tables?
You can use the connection object to execute a direct SQL query against the table name that Django generated for the model in the other app, or against whatever value you assigned to db_table in the Meta options of that app's models.py. This works here because you are using the same database settings for both apps.
With different database settings you would instead have to connect to the external database explicitly with a driver and execute the SQL query there.
That being said, since you are using the same database, I think you should consider merging those two apps, or packaging one of them as a reusable app. But it depends on what you're trying to achieve and on the overall architecture.
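A minimal sketch of that raw-SQL approach, assuming the first project's default table name is profile_clbt_helpernotificationmsg (app label plus lowercased model name; check the real name with \dt in psql or the model's Meta.db_table):

from django.db import connection

def fetch_helper_notifications():
    # Read the table created by the other project's migrations directly;
    # rows come back as plain tuples, not model instances.
    with connection.cursor() as cursor:
        cursor.execute('SELECT * FROM profile_clbt_helpernotificationmsg')
        return cursor.fetchall()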
If it is really mandatory to access the very same database from two different Django applications, here is what needs to be done:
Create the same object model in both applications, and make sure every field is identical (see the sketch below).
Use the same credentials and URL to access the database.
Now both applications can access the database with implicit race-condition protection, presuming the database has native support for that (all modern databases do).
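A minimal sketch of what that shared model could look like in the second project. The table name and the message field are assumptions; copy the real fields from the first project's model. Marking the copy managed = False is an extra precaution beyond what the answer describes, so this project's migrations never try to create or alter a table it does not own:

from django.db import models

class HelperNotificationMsg(models.Model):
    # Mirror the fields of the model in the first project exactly.
    message = models.TextField()  # assumed field; replace with the real ones

    class Meta:
        managed = False  # read the table, but never migrate it from here
        db_table = 'profile_clbt_helpernotificationmsg'  # assumed table name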

Django (nose) test speeding up with reuse_db not working

I am using django-nose to run my unit tests in Django 1.4.
TEST_RUNNER = 'django_nose.NoseTestSuiteRunner'
Creating the database takes a long time.
So I found out that putting this in settings.py:
os.environ['REUSE_DB'] = "1"
should do the trick.
Actually, Django itself gives this suggestion:
To reuse old database "<path not very interesting>/var/sqlite/unittest.db" for speed, set env var REUSE_DB=1.
Of course you need to run it once (or after every database change) with this flag set to 0.
However, when I set the flag to 0, my tests end with the remark:
Destroying test database for alias 'default'...
So when I then want to run with REUSE_DB=1, there is nothing to reuse, and I get errors saying the table does not exist:
DatabaseError: no such table: <and then a table name>
The tests run perfectly when I set REUSE_DB to 0.
I am using a separate test database name (TEST_NAME) in my development settings:
DATABASES = {
    'default': {
        'NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'development.db'),
        'TEST_NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'unittest.db'),
        'ENGINE': 'django.db.backends.sqlite3',
        'USER': '',
        'PASSWORD': '',
        'HOST': '',
        'PORT': '',
    }
}
I am not using the in-memory SQLite database for testing because I read somewhere that it doesn't work well with django-nose.
So how can I reuse the DB when it is destroyed at the end?
According to https://docs.djangoproject.com/en/1.4/topics/testing/#the-test-database this is what Django does, but the docs do not show how to prevent it (if I can), or how to use the REUSE_DB option.
Should I use other settings?
If I have understood correctly, you don't know how to create the test database the first time (in order to reuse it later).
NoseTestSuiteRunner should create it automatically if the DB does not exist, even if you set REUSE_DB = 0.
If you want to create the test DB manually, you can create the following file:
test_db_settings.py
in which you specify:
from settings import *
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        # The test runner uses a database whose name is the prefix test_
        # plus your database name: if your database is named db, the test
        # database has to be named test_db.
        'NAME': 'test_db',
        'USER': 'postgres_user',
        'PASSWORD': 'postgres_user',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}
After that, create test_db:
createdb -U postgres_user -h localhost test_db # if you use postgres
python manage.py syncdb --settings test_db_settings.py
python manage.py migrate --settings test_db_settings.py (only if you use South)
Now we have the DB that is used by the test runner, and we can run the tests:
REUSE_DB=1 python manage.py test
Updated
Are you sure that you are using NoseTestSuiteRunner? Here is some code from django_nose.NoseTestSuiteRunner. As we can see, if the REUSE_DB option is set, teardown_databases is skipped. If you want, you can debug it, for example by setting a breakpoint here to check that you really use this runner.
def teardown_databases(self, *args, **kwargs):
    """Leave those poor, reusable databases alone if REUSE_DB is true."""
    if not _reusing_db():
        return super(NoseTestSuiteRunner, self).teardown_databases(
            *args, **kwargs)
    # else skip tearing down the DB so we can reuse it next time
I accepted the answer of Andrey Kaygorodov, because he led me to the solution... reading his answer I thought: how easy, and how stupid of me...
Anyway, put this settings_test_db.py next to your settings, with this content (filling in the name of your project):
from <myproject>.settings import *
DATABASES = {
    'default': {
        'NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'unittest.db'),
        'ENGINE': 'django.db.backends.sqlite3',
        'USER': '',
        'PASSWORD': '',
        'HOST': '',
        'PORT': '',
    }
}
And for SQLite you do not have to create a DB manually.
Just run:
python manage.py syncdb --settings <myproject>.settings_test_db
(note: use the projectname, and without .py)
and to run the migrations
python manage.py migrate --settings <myproject>.settings_test_db
(only if you use South)
And set this in your settings.py:
os.environ['REUSE_DB'] = "1"
so you can simply run:
python manage.py test
By the way, instead of using a separate test settings file,
you can define a test name in the same settings:
DATABASES = {
    'default': {
        'NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'test.db'),
        'TEST_NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'unittest.db'),
        'ENGINE': 'django.db.backends.sqlite3',
        'USER': '',
        'PASSWORD': '',
        'HOST': '',  # empty string for localhost
        'PORT': '',  # empty string for default
    }
}

django unit testing on multiple databases

I'm working on a Django project where all my unit test cases were working perfectly.
As soon as I introduced a second database, all my test cases that inherit from TestCase broke. At this stage I haven't built any test cases for that second database, but my router is working fine.
When I run the tests I get the error,
"KeyError: 'SUPPORTS_TRANSACTIONS'"
It appears to me that it is trying to check that all the databases I've set up support transactions, but the second database is never created.
Any ideas on how to have the test runner build the second database?
I realise this is quite an old thread, but I ran into the same issue, and my fix was adding the multi_db = True flag to my test case, e.g.:
class TestThingWithMultipleDatabases(TestCase):
    multi_db = True

    def test_thing(self):
        pass
Source: https://github.com/django/django/blob/master/django/test/testcases.py#L861
This causes Django to call flush on all databases (or roll back, if they support transactions).
I too am using a DB router.
I'm afraid I can't find this in Django's documentation, so no link for that.
Yes, I had a similar problem... my fix was to set 'SUPPORTS_TRANSACTIONS': True for each of the database connections in the settings file. Not sure if this is the correct way to fix it, but it worked for me.
'SUPPORTS_TRANSACTIONS': True worked for me too.
However, I have a kind of weird multiple-DB setup using database routers.
@user298404: what does your multiple-DB setup look like?
PS: sorry, not enough points for a comment...
Here is a multiple db setup that I currently have in production:
DATABASES = {
    # 'default' is used as the WRITE (master) connection
    DB_PRIMARY_MASTER: {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'main',
        'USER': 'main_write',
        'PASSWORD': 'XXXX',
        'HOST': 'db-master',
        'PORT': '3306',
        'SUPPORTS_TRANSACTIONS': True,
    },
    # Slave connections are READONLY
    DB_PRIMARY_SLAVE: {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'main',
        'USER': 'main_read',
        'PASSWORD': 'XXXX',
        'HOST': 'db-slave',
        'PORT': '3306',
        'TEST_MIRROR': DB_PRIMARY_MASTER,
        'SUPPORTS_TRANSACTIONS': True,
    },
    # 'mail_default' is used as the WRITE (master) connection for the mail database
    DB_MAIL_MASTER: {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'dbmail',
        'USER': 'dbmail_write',
        'PASSWORD': 'XXXX',
        'HOST': 'db-mail-master',
        'PORT': '3306',
        'SUPPORTS_TRANSACTIONS': True,
    },
    # Slave connections are READONLY
    DB_MAIL_SLAVE: {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'dbmail',
        'USER': 'dbmail_read',
        'PASSWORD': 'XXXX',
        'HOST': 'db-mail-slave',
        'PORT': '3306',
        'TEST_MIRROR': DB_MAIL_MASTER,
        'SUPPORTS_TRANSACTIONS': True,
    },
}
DB_PRIMARY_MASTER, DB_PRIMARY_SLAVE, DB_MAIL_MASTER, and DB_MAIL_SLAVE are all string constants so that they can be used in my database router.
Hint: DB_PRIMARY_MASTER='default'
I hope this helps!
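For illustration, here is a minimal sketch of the kind of router those constants are meant for, using the current router API. This is not the poster's actual router; the module path and the slave alias name are assumptions:

# myproject/routers.py (hypothetical)
DB_PRIMARY_MASTER = 'default'
DB_PRIMARY_SLAVE = 'primary_slave'

class PrimaryRouter:
    """Send reads to the read-only slave and writes to the master."""

    def db_for_read(self, model, **hints):
        return DB_PRIMARY_SLAVE

    def db_for_write(self, model, **hints):
        return DB_PRIMARY_MASTER

    def allow_relation(self, obj1, obj2, **hints):
        # The master/slave pair expose the same data, so relations are fine.
        return True

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # Only apply schema changes through the master connection.
        return db == DB_PRIMARY_MASTER

It would be wired up in settings with something like DATABASE_ROUTERS = ['myproject.routers.PrimaryRouter']; the mail pair would be handled analogously.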
Referring to this link:
Django doc Multi-Db
you can do the following:
from django.test import TransactionTestCase

class TestMyViews(TransactionTestCase):
    databases = {'default', 'other'}  # {'__all__'} should work too

    def test_index_page_view(self):
        call_some_test_code()
Thanks to @sih4sing5hog5.