Why is response time with Postgres 15x higher than with SQLite? - django

A simple GET request (Django + django-rest-framework) with the sqlite3 database takes 2 ms.
Using Postgres, the response time grows to over 30 ms. Why is that, and how can I make Postgres faster?
Settings with sqlite:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'db.sqlite3',
    }
}
Settings with postgres:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'drf-test',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}
class Product(models.Model):
    name = models.CharField(max_length=50)
    description = models.CharField(max_length=150, blank=True)
    price = models.DecimalField(max_digits=5, decimal_places=2, default=0)

    class Meta:
        verbose_name = "Product"
        verbose_name_plural = "Products"

    def __str__(self):
        return self.name

class ProductListAPIView(ListAPIView):
    queryset = Product.objects.all()
    serializer_class = ProductModelSerializer

class ProductModelSerializer(serializers.ModelSerializer):
    class Meta:
        model = Product
        fields = '__all__'

That difference is practically negligible in absolute terms.
Postgres runs as a separate server process, so every request pays connection and network I/O that SQLite, which is read in-process from a file, does not. You also probably have a very small SQLite file with few rows in your tables, so SQLite's work per query is tiny. But don't use SQLite in production: once the file starts growing, it will not only consume more memory but also take much longer to scan, and your server response times will degrade.
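Part of the per-request cost is opening a fresh connection to Postgres on every request. One common mitigation, sketched below, is Django's CONN_MAX_AGE setting, which keeps connections open between requests; the credentials mirror the question's settings, and the 60-second value is an arbitrary assumption:

```python
# Sketch: persistent connections cut the per-request connection overhead.
# Credentials mirror the question's settings; the 60s lifetime is an assumption.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'drf-test',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'localhost',
        'PORT': '5432',
        'CONN_MAX_AGE': 60,  # reuse each connection for up to 60 seconds
    }
}
```

Set `CONN_MAX_AGE` to `None` for unlimited persistent connections, or leave it at the default `0` to close the connection at the end of each request.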

Related

django.db.utils.ProgrammingError: type "raster" does not exist

My models: I have created 3 models here, and when I migrate I get the error.
from django.contrib.gis.db import models
from django.contrib.gis.db.models.fields import RasterField

class WorldBorder(models.Model):
    # Regular Django fields corresponding to the attributes in the
    # world borders shapefile.
    name = models.CharField(max_length=50)
    area = models.IntegerField()
    pop2005 = models.IntegerField('Population 2005')
    fips = models.CharField('FIPS Code', max_length=2)
    iso2 = models.CharField('2 Digit ISO', max_length=2)
    iso3 = models.CharField('3 Digit ISO', max_length=3)
    un = models.IntegerField('United Nations Code')
    region = models.IntegerField('Region Code')
    subregion = models.IntegerField('Sub-Region Code')
    lon = models.FloatField()
    lat = models.FloatField()
    # GeoDjango-specific: a geometry field (MultiPolygonField)
    mpoly = models.MultiPolygonField()

    # Returns the string representation of the model.
    def __str__(self):
        return self.name

class Zipcode(models.Model):
    code = models.CharField(max_length=5)
    poly = models.PolygonField()

class Elevation(models.Model):
    name = models.CharField(max_length=100, blank=True, null=True)
    rast = RasterField(srid=2346)
My settings: the database I use is Postgres, so that I can use PostGIS for GeoDjango.
DATABASES = {
    "default": {
        "ENGINE": "django.contrib.gis.db.backends.postgis",
        "NAME": "django_course",
        "USER": "postgres",
        "PASSWORD": "**************",
        "HOST": "localhost",
        "PORT": "5432",
    }
}
You are missing the PostGIS Raster extension. Create the extension and try again:
CREATE EXTENSION postgis_raster;
Quote from the documentation:
Note that a major change in 3.0 is that the raster functionality has
been broken out as a separate extension.
Use this in your database settings (note that the engine must stay the PostGIS backend, or the GIS fields will not work):
DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'database_name',
        'HOST': 'localhost',
        'USER': 'user_name',
        'PASSWORD': 'password',
        'PORT': '5432',
    }
}
Add these two lines (the import and the CreateExtension operation) to your migrations file elevation_zipcode.py:
from django.contrib.postgres.operations import CreateExtension

operations = [
    CreateExtension('postgis_raster'),
    # ... existing operations ...
]
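For context, the two lines slot into the migration's structure roughly like this; the app label and dependency below are placeholder assumptions, not taken from the question:

```python
from django.contrib.postgres.operations import CreateExtension
from django.db import migrations

class Migration(migrations.Migration):
    dependencies = [
        ('myapp', '0001_initial'),  # hypothetical previous migration
    ]
    operations = [
        # Runs CREATE EXTENSION postgis_raster before the table operations.
        CreateExtension('postgis_raster'),
        # ... the original CreateModel operations for Zipcode/Elevation follow
    ]
```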

How to connect to a database programmatically at runtime in the #django framework

In my User Profile I have one field that defines the name of the database the user should connect to.
DATABASES = {
    'app_data': {
        'NAME': 'app_data',
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'USER': 'postgres_user',
        'PASSWORD': 's3krit'
    },
    'DBNAME_1': {
        'NAME': 'user_data',
        'ENGINE': 'django.db.backends.mysql',
        'USER': 'mysql_user',
        'PASSWORD': 'priv4te'
    }
}
class UserProfile(models.Model):
    user = models.OneToOneField(User, on_delete=models.CASCADE)
    dbname = models.CharField(u'Database name', max_length=100, null=True, blank=True)
It is not connecting to the two different databases; everything is created in a single database.
You can select the database of your choice at runtime with the using() method. Note that using() takes the connection alias from DATABASES (here 'DBNAME_1'), not the database NAME ('user_data'):
UserProfile.objects.using('DBNAME_1').all()
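Since using() expects a connection alias, the per-user lookup can be sketched as plain Python that validates the profile's dbname field against the configured aliases before querying. The helper name alias_for and the fallback policy are assumptions, not Django API:

```python
# Connection aliases defined under DATABASES in settings.py (from the
# question, plus the usual 'default').
KNOWN_ALIASES = {'default', 'app_data', 'DBNAME_1'}

def alias_for(dbname):
    """Map a UserProfile.dbname value to a safe connection alias.

    Falls back to 'default' when the field is empty or names an
    alias that is not configured.
    """
    if dbname in KNOWN_ALIASES:
        return dbname
    return 'default'

# In a view this would then be used as (hypothetical model name):
#   Products.objects.using(alias_for(request.user.userprofile.dbname))
print(alias_for('DBNAME_1'))  # DBNAME_1
print(alias_for(None))        # default
```

Validating against a fixed set matters because using() raises ConnectionDoesNotExist for unknown aliases, and the field is user-controlled data.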

Django and postgresql schemas

I've been trying to solve this one all week, help very much appreciated.
I have various schemas in a postgres db and I would like to be able to map to them from within the same or across different django apps.
Some of the schemas are :
samples
excavation
geophysics
...
I have tried the recommended way, but I'm not getting any data to display from the schemas; I can only connect to the public schema with managed tables. Here are the database connections from the settings.py file.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'OPTIONS': {
            'options': '-c search_path=django,public'
        },
        'NAME': 'gygaia',
        'USER': 'appuser',
        'PASSWORD': 'secret',
    },
    'samples': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'OPTIONS': {
            'options': '-c search_path=samples,public'
        },
        'NAME': 'gygaia',
        'USER': 'appuser',
        'PASSWORD': 'secret',
    },
}
source: https://www.amvtek.com/blog/posts/2014/Jun/13/accessing-multiple-postgres-schemas-from-django/
In models.py I add:
from django.db import models

class Storage(models.Model):
    # id = models.IntegerField(default=0)
    storage_id = models.AutoField(primary_key=True)
    store_name = models.CharField(max_length=200, default='')
    address_1 = models.CharField(max_length=200, default='')
    address_2 = models.CharField(max_length=200, default='')
    region = models.CharField(max_length=200, default='')
    city = models.CharField(max_length=200, default='')
    zip = models.CharField(max_length=200, default='')
    country = models.CharField(max_length=200, default="Turkey")
    user = models.CharField(max_length=200, default="Gygaia")
    datestamp = models.DateTimeField(auto_now=True)

    class Meta:
        managed = False
        db_table = 'samples\".\"store'
I don't want to restrict schemas to users, and the database was created a few years ago, so I'm not allowed to bring it all under one schema. I know there are various solutions posted on Stack Overflow and other corners of the internet; I have tried them, but I'm unable to get this to work. Any ideas how to solve this one?
Because Django does not support Postgres database schemas out of the box, you need a database router to get this to work.
I created a test database to try this out with, here's how to reproduce it:
Create a test database with psql:
CREATE USER tester WITH PASSWORD 'lol so easy';
CREATE DATABASE multi_schema_db WITH OWNER tester;
CREATE SCHEMA samples AUTHORIZATION tester;
CREATE TABLE samples.my_samples (
    id INTEGER NOT NULL PRIMARY KEY,
    description CHAR(255) NOT NULL
);
Add the schemas to the settings as different database connections; remember to add HOST to avoid the “Peer authentication failed” error.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'OPTIONS': {
            'options': '-c search_path=django,public'
        },
        'NAME': 'multi_schema_db',
        'USER': 'tester',
        'PASSWORD': 'lol so easy',
        'HOST': 'localhost'
    },
    'samples': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'OPTIONS': {
            'options': '-c search_path=samples,public'
        },
        'NAME': 'multi_schema_db',
        'USER': 'tester',
        'PASSWORD': 'lol so easy',
        'HOST': 'localhost'
    },
}
Next create the MySample model:
from django.db import models

class MySample(models.Model):
    description = models.CharField(max_length=255, null=False)

    class Meta:
        managed = False
        db_table = 'my_samples'
Create a database router to direct all sample-related queries to the sample database:
from database_test.models import MySample

ROUTED_MODELS = [MySample]

class MyDBRouter(object):

    def db_for_read(self, model, **hints):
        if model in ROUTED_MODELS:
            return 'samples'
        return None

    def db_for_write(self, model, **hints):
        if model in ROUTED_MODELS:
            return 'samples'
        return None
Basically, the router routes all the models listed in ROUTED_MODELS to the samples database connection and returns None for every other model, which sends them to the default connection.
Finally add the router to your settings.py
DATABASE_ROUTERS = ('database_test.db_router.MyDBRouter',)
And now when doing a query for the MySample model, it will fetch data from the samples schema.
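The routing decision itself is plain Python, so it can be sanity-checked in isolation; in this sketch the model classes are stand-in placeholders rather than real Django models:

```python
class MySample:       # stand-in for the routed Django model
    pass

class OtherModel:     # any model not listed in ROUTED_MODELS
    pass

ROUTED_MODELS = [MySample]

class MyDBRouter(object):
    """Route reads/writes for ROUTED_MODELS to the 'samples' connection."""

    def db_for_read(self, model, **hints):
        if model in ROUTED_MODELS:
            return 'samples'
        return None

    def db_for_write(self, model, **hints):
        if model in ROUTED_MODELS:
            return 'samples'
        return None

router = MyDBRouter()
print(router.db_for_read(MySample))    # samples
print(router.db_for_read(OtherModel))  # None
```

Returning None (rather than 'default') tells Django to fall through to the next router in DATABASE_ROUTERS, or to its default behaviour.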
I also consulted that source, but I could not solve it that way; by experimenting I arrived at the following.
If we have, for example, the schemas foo and bar, we write in the Meta:
class MySample1(models.Model):
    description = models.CharField(max_length=255, null=False)

    class Meta:
        managed = True
        db_table = 'foo\".\"my_samples1'

class MySample2(models.Model):
    description = models.CharField(max_length=255, null=False)

    class Meta:
        managed = True
        db_table = 'bar\".\"my_samples2'
Then we can direct each model to the schema we want, provided managed is set to True. The limitation is that we have to name the table ourselves.
Alternatively, first create the tables in Postgres using schemas, then generate models from them in Django
with the command python manage.py inspectdb > models.py, and migrate back.

Django ValueError: Can't do subqueries with queries on different DBs

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'default_db',
        'USER': 'user',
        'PASSWORD': '123123123',
        'HOST': 'localhost',
        'PORT': '',
    },
    'omskgkh': {
        'NAME': 'general',
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'USER': 'user',
        'PASSWORD': '123123123',
        'HOST': '123.123.123.123',
        'PORT': '',
    }
}
in my view:
def districtreport(request):
    info = models.InfoAddress.objects.using('general')
    kao = info.filter(okrug='КАО').values('home')
    kao_accounts = models.Operation.objects.using('general').filter(account_id__home_id=kao)
On the dev server (where the default database is sqlite3) that view works fine, but on the production server Django raises
Exception Value: Can't do subqueries with queries on different DBs.
Please help.
ADD:
models.py
class Home(models.Model):
    id = models.IntegerField(primary_key=True)
    ...

    class Meta:
        db_table = "home"
        managed = False

class InfoAddress(models.Model):
    id = models.IntegerField(primary_key=True)
    home = models.ForeignKey(Home)
    okrug = models.CharField(max_length=255)
    ...

    class Meta:
        db_table = "infoaddress"
        managed = False

class Account(models.Model):
    id = models.IntegerField(primary_key=True)
    home = models.ForeignKey(Home)
    ...

    class Meta:
        db_table = "account"
        managed = False

class Operation(models.Model):
    id = models.IntegerField(primary_key=True)
    account = models.ForeignKey(Account)
    ...

    class Meta:
        db_table = "account_op"
        managed = False
I found one way to solve the problem:
evaluate the QuerySet to a list and filter with that.
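Why evaluating to a list helps, sketched without the ORM: once the first query's values are materialized as plain Python objects, the second filter becomes a literal IN (...) clause instead of a subquery that would have to span two databases. The data below is made up for illustration:

```python
# Stand-in for: kao = list(info.filter(okrug='КАО').values_list('home', flat=True))
kao_home_ids = [1, 2, 3]

# Stand-in rows for the Operation/Account join on the 'general' connection.
operations = [
    {'id': 10, 'home_id': 1},
    {'id': 11, 'home_id': 5},
    {'id': 12, 'home_id': 3},
]

# Equivalent of .filter(account_id__home_id__in=kao_home_ids): the ids are
# now literal values, so no subquery needs to cross databases.
matched = [op for op in operations if op['home_id'] in kao_home_ids]
print([op['id'] for op in matched])  # [10, 12]
```

The trade-off is that all the ids from the first query are pulled into Python memory and shipped back in the second query's SQL.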
Updating to cacheops 2.4.1 or later will fix it; the corresponding issue is now fixed.
The problem was caused by the 'cacheops' app and has been fixed by its author.

Django using legacy database - OperationalError 'Unknown column'

I am using a legacy database (MySQL) in my Django project.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'mydb',
        'USER': 'root',
        'PASSWORD': 'mydbpassword',
        'HOST': '127.0.0.1',
        'PORT': '3306',
    }
}
The syncdb runs fine; it doesn't produce any errors. But whenever I run these commands,
>>> from webservice.models import Users
>>> u = Users.objects.all()
>>> print u
I get this error,
OperationalError: (1054, "Unknown column 'users.id' in 'field list'")
Here's my Users model from inspectdb,
class Users(models.Model):
    emp_id = models.IntegerField()
    username = models.CharField(max_length=128)
    email = models.CharField(max_length=128)
    status = models.CharField(max_length=5)

    class Meta:
        managed = False
        db_table = 'users'

    def __unicode__(self):
        return self.username
What am I missing here?
Update
I also tried this solution, but it doesn't work for me.
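For context on the error itself (a hedged guess, not a confirmed fix): Django adds an implicit auto-incrementing id primary key to any model that declares none, so the generated SQL selects users.id, a column the legacy table apparently lacks. If emp_id is the table's real primary key, declaring it as such stops Django from assuming id:

```python
from django.db import models

class Users(models.Model):
    # Assumption: emp_id is the legacy table's primary key. If the table
    # truly has no primary key, some unique column must be chosen instead,
    # since Django requires every model to have exactly one.
    emp_id = models.IntegerField(primary_key=True)
    username = models.CharField(max_length=128)
    email = models.CharField(max_length=128)
    status = models.CharField(max_length=5)

    class Meta:
        managed = False
        db_table = 'users'
```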