Django get queryset fails on unmanaged database - django

I'm trying to use Django's ORM on a read-only database.
Here is the model used:
class Languages(models.Model):
    idlanguage = models.IntegerField(primary_key=True)
    language = models.CharField(max_length=50)
    code = models.CharField(max_length=50)

    class Meta:
        managed = False
        db_table = 'languages'
The database configuration in the settings.py file:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'XXXXXXX',
        'USER': 'XXX',
        'PASSWORD': 'XXXXXXXXXXXXXXXXXXXXX',
        'HOST': '',
        'PORT': '',
    },
    'ro_bdd': {
        'ENGINE': 'django.db.backends.oracle',
        'NAME': 'XXXXX',
        'USER': 'XXXXXXXXXX',
        'PASSWORD': 'XXXXX',
        'HOST': 'XXXXXXXXX',
        'PORT': 'XXXX',
    }
}
And the querysets:
In [1]: Languages.objects.all().using("ro_bdd")
Out[1]: <QuerySet [<Languages: Languages object (1)>]>
In [2]: Languages.objects.filter(idlanguage=1).using("ro_bdd")
Out[2]: <QuerySet [<Languages: Languages object (1)>]>
In [3]: Languages.objects.get(idlanguage=1).using("ro_bdd")
/....../python3.9/site-packages/django/db/backends/utils.py in _execute(self, sql, params, *ignored_wrapper_args)
82 return self.cursor.execute(sql)
83 else:
---> 84 return self.cursor.execute(sql, params)
85
86 def _executemany(self, sql, param_list, *ignored_wrapper_args):
ProgrammingError: ERROR: relation « languages » does not exist
LINE 1: ..., "languages"."language", "languages"."code" FROM "languages...
^
Values:
'sql' => SELECT "languages"."idlanguage", "languages"."language", "languages"."code" FROM "languages" WHERE "languages"."idlanguage" = %s LIMIT 21
'params' => (1,)
Why does the get queryset not work?

You should put .using() first: .get() resolves eagerly, so by the time .using() would run it is "too late" and the query has already been executed against the default database (here PostgreSQL, which has no languages relation). So:
Languages.objects.using('ro_bdd').get(idlanguage=1)
A model instance also has no .using() method, so even if the .get() had succeeded, chaining .using() onto its result would raise an AttributeError.
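The ordering matters because a queryset is lazy while .get() executes immediately. A pure-Python mimic (not Django's real classes; the names are illustrative) of that lazy-vs-eager behaviour:

```python
# A pure-Python mimic (not Django's real classes) of why .using() must come
# before .get(): filter()/using() only record state on a lazy queryset, while
# get() runs the query at once against whatever database alias is set then.
class LazyQuerySet:
    def __init__(self, db='default', filters=None):
        self.db = db
        self.filters = filters or {}

    def using(self, alias):
        # Lazy: returns a new queryset, nothing has hit a database yet.
        return LazyQuerySet(alias, self.filters)

    def filter(self, **kwargs):
        # Also lazy: just accumulates filter conditions.
        return LazyQuerySet(self.db, {**self.filters, **kwargs})

    def get(self, **kwargs):
        # Eager: "runs" immediately against self.db.
        return f"row from {self.db} where {kwargs}"

qs = LazyQuerySet()
print(qs.filter(idlanguage=1).using('ro_bdd').db)  # 'ro_bdd': still lazy
print(qs.using('ro_bdd').get(idlanguage=1))        # runs against ro_bdd
print(qs.get(idlanguage=1))                        # runs against default
```

This is why `.filter(...).using("ro_bdd")` in the question worked while `.get(...).using("ro_bdd")` failed: the `.get()` already ran on `default` before `.using()` was reached.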

Related

Not saving the record into the MongoDB database in Django (trying to use multiple different databases, PostgreSQL and MongoDB, in a single app)

I am trying to use multiple databases in a single app (todo). I am using the Djongo package for dealing with MongoDB.
settings.py
DATABASES = {
    'default': {},
    'sql_db': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'my_db',
        'USER': '******',
        'PASSWORD': '***',
        'HOST': 'localhost',
        'PORT': '5432',
    },
    'mongodb': {
        'ENGINE': 'djongo',
        'NAME': 'mongo_db'
    }
}
todo/models.py
class Task(models.Model):
    todo = models.CharField(max_length=200)
    status = models.BooleanField(default=False)

    def __str__(self):
        return self.todo
todo/serializers.py
class TodoSerializer(serializers.ModelSerializer):
    class Meta:
        model = Task
        fields = '__all__'
todo/views.py
@api_view(['POST'])
def todoCreate(request):
    serializer = TodoSerializer(data=request.data)
    if serializer.is_valid():
        serializer.save()
        serializer.save(using='mongodb')
    return Response(serializer.data)
It successfully saves the record into 'sql_db' but not into 'mongodb'.
My suggestion is to put the dual database saving into a save function in models.py:
class Task(models.Model):
    ...
    def save(self, *args, **kwargs):
        super(Task, self).save(using='sql_db')
        super(Task, self).save(using='mongodb')
and drop the 2nd save in the view.
This is so any save of a Task object will trigger the dual save (e.g. from the Django admin or another script), rather than only when it comes through this one serializer view.
I'm not sure the serializer.save method passes 'using' through to the object's save function, which may be the problem for you. If you want to do it in the serializer, then you may have to override create and save in the TodoSerializer: see Serializers
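Indeed, DRF's Serializer.save merges extra keyword arguments into validated_data rather than forwarding them to Model.save. A minimal mimic (not the real DRF classes, just a sketch of that behaviour) makes the problem visible:

```python
# A minimal mimic (not the real DRF classes) of BaseSerializer.save, showing
# why serializer.save(using='mongodb') never reaches Model.save: the extra
# keyword argument is merged into validated_data, not forwarded to the ORM.
class MiniSerializer:
    def __init__(self, validated_data):
        self.validated_data = validated_data

    def save(self, **kwargs):
        # Mirrors what DRF does: kwargs become extra field data.
        validated_data = {**self.validated_data, **kwargs}
        return self.create(validated_data)

    def create(self, validated_data):
        # In real DRF this would call Model._default_manager.create(**validated_data),
        # where an unexpected 'using' key would not select a database.
        return validated_data

record = MiniSerializer({'todo': 'write docs', 'status': False}).save(using='mongodb')
print(record)  # 'using' shows up as field data, not as a database alias
```

So to route the write, the database choice has to happen in an overridden create (or in the model's save, as suggested above), not as a save kwarg.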
When using the Djongo module to connect Django with MongoDB, be aware of the exact module versions in your environment:
asgiref==3.5.0
Django==4.0.3
djongo==1.3.6
dnspython==2.2.1
pykerberos==1.2.4
pymongo==3.12.1
python-snappy==0.6.1
pytz==2022.1
sqlparse==0.2.4
Configure the database settings in settings.py. It's not necessary to specify a host when connecting to a local MongoDB instance.
DATABASES = {
    'default': {
        'ENGINE': 'djongo',
        'NAME': 'your-db-name',
    }
}

Processing Django Migrations

We have a multiple database Django project that uses database routers.
For some reason, when we run a migration, the reference to the migration ends up in the django_migrations table but no migrations actually run - that is, there is no change in the target database.
Following is the database router for the elegant database.
class ElegantRouter:
    """
    A router to control all database operations on models in the
    elegant application.
    """
    def db_for_read(self, model, **hints):
        """
        Attempts to read elegant models go to elegant.
        """
        if model._meta.app_label == 'elegant':
            return 'elegant'
        return None

    def db_for_write(self, model, **hints):
        """
        Attempts to write elegant models go to elegant.
        """
        if model._meta.app_label == 'elegant':
            return 'elegant'
        return None

    def allow_relation(self, obj1, obj2, **hints):
        """
        Allow relations if a model in the elegant app is involved.
        """
        if obj1._meta.app_label == 'elegant' or \
           obj2._meta.app_label == 'elegant':
            return True
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """
        Make sure the elegant app only appears in the 'elegant'
        database.
        """
        print('allow_migrate', app_label, db)
        if app_label == 'elegant':
            return db == 'elegant'
        return None
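Since allow_migrate only receives plain strings, its routing decision can be sanity-checked in isolation, without Django at all. A sketch (the method body is copied from the router above so it runs standalone):

```python
# Self-contained check of the routing decision in ElegantRouter.allow_migrate
# (copied here so it runs without Django; allow_migrate only sees strings).
class ElegantRouter:
    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label == 'elegant':
            return db == 'elegant'
        return None

router = ElegantRouter()
print(router.allow_migrate('elegant', 'elegant'))  # True: allowed on elegant
print(router.allow_migrate('default', 'elegant'))  # False: skipped on default
print(router.allow_migrate('default', 'other'))    # None: no opinion
```

This confirms the router itself answers correctly for the elegant app, which is consistent with the migration files being produced; the problem described below lies elsewhere.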
Following is the DATABASES settings for the project.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    },
    'depot_maestro': {
        'ENGINE': 'django_filemaker_pyodbc',
        'HOST': os.getenv('FILEMAKER_HOST'),
        'PORT': os.getenv('FILEMAKER_PORT'),
        'USER': os.getenv('FILEMAKER_USER'),
        'PASSWORD': os.getenv('FILEMAKER_PASSWORD'),
        'NAME': os.getenv('FILEMAKER_FILENAME'),
        'OPTIONS': {
            'driver': os.getenv('FILEMAKER_DRIVER'),
            'driver_supports_utf8': True,
            'autocommit': True,
        },
        'TEST': {
            'NAME': os.getenv('FILEMAKER_FILENAME'),
            'SERIALIZE': False,
        }
    },
    'postgres': {
        'NAME': os.getenv('POSTGRES_DATABASE'),
        'ENGINE': 'django.db.backends.postgresql',
        'USER': os.getenv('POSTGRES_USER'),
        'PASSWORD': os.getenv('POSTGRES_PASSWORD')
    },
    'elegant': {
        'NAME': os.getenv('ELEGANT_DATABASE'),
        'ENGINE': 'django.db.backends.postgresql',
        'USER': os.getenv('ELEGANT_USER'),
        'PASSWORD': os.getenv('ELEGANT_PASSWORD')
    },
}
Update
Using sqlmigrate on the project with database routers returns no SQL.
./manage.py sqlmigrate elegant 0057 --database elegant
BEGIN;
--
-- Remove field internal_id from organisationelegant
--
--
-- Add field uuid to organisationelegant
--
--
-- Alter field id on organisationelegant
--
COMMIT;
whereas the project without database routers returns SQL:
./manage.py sqlmigrate elegant 0057
BEGIN;
--
-- Remove field internal_id from organisationelegant
--
ALTER TABLE "organisation" DROP COLUMN "id" CASCADE;
...
--
-- Alter field id on organisationelegant
--
DROP INDEX IF EXISTS "organisation_uuid_19796862_like";
COMMIT;
How can you make a migration take effect with database routers, when allow_migrate already returns True and the migration files are being produced?
I ended up creating a new project with the same app referencing the same database and migrated from that project.
Update
I added symbolic links to the main project.

Updating to non-standard database in DRF

Using Django Rest Framework, how to execute update/PUT calls to a non-standard database?
In my Django project I'm using a separate database for the application data:
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
},
'bookstore': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': os.path.join(BASE_DIR, 'bookstore.sqlite3'),
}
}
I use ModelSerializer and ModelViewSet to automatically create the API. I loop over the models to generate all the serializers and viewsets (quite a large number of tables); the generated classes end up looking like this:
ModelSerializer:
class BookSerializer(serializers.ModelSerializer):
    class Meta:
        model = Book
        fields = '__all__'
ModelViewSet:
class BookViewSet(viewsets.ModelViewSet):
    queryset = Book.objects.using('bookstore').all()
    serializer_class = BookSerializer
I needed to add the using('bookstore') here to make this work.
Reading (GET) works fine, but when I try to do a PUT update, I get a no such table: book SQLite error.
It seems like the update isn't routed to the bookstore database.
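That is consistent with the .using('bookstore') call living only on the viewset's queryset: writes issued elsewhere still go to default. One hedged alternative is a database router that pins every model of the api app to the bookstore database, so reads and writes are both routed without any .using() calls. A sketch (BookstoreRouter is an illustrative name, not from the question):

```python
# Sketch of a database router (illustrative, not from the question) that
# sends every model in the 'api' app to the 'bookstore' database, so both
# reads and writes are routed without per-queryset .using() calls.
class BookstoreRouter:
    app_label = 'api'

    def db_for_read(self, model, **hints):
        # Route reads of api-app models to bookstore.
        if model._meta.app_label == self.app_label:
            return 'bookstore'
        return None

    def db_for_write(self, model, **hints):
        # Route writes (including PUT updates) the same way.
        if model._meta.app_label == self.app_label:
            return 'bookstore'
        return None
```

It would be registered via DATABASE_ROUTERS = ['myproject.routers.BookstoreRouter'] in settings.py (the dotted path is illustrative).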
For completeness, I use these loops to generate the ModelSerializer and ModelViewSets from the models:
ModelSerializer, generator:
models = dict(apps.all_models['api'])
for name, model in models.items():
    class_name = name[:1].upper() + name[1:] + 'Serializer'
    Meta = type('Meta', (object,), {'model': model, 'fields': '__all__'})
    print(class_name)
    globals()[class_name] = type(class_name, (serializers.ModelSerializer,), {'Meta': Meta})
ModelViewSet, generator:
models = dict(apps.all_models['api'])
for name, model in models.items():
    capitalized_class_name = name[:1].upper() + name[1:]
    viewset_class_name = capitalized_class_name + 'ViewSet'
    serializer_class_name = capitalized_class_name + 'Serializer'
    globals()[viewset_class_name] = type(viewset_class_name, (viewsets.ModelViewSet,), {
        'queryset': model.objects.using('bookstore').all(),
        'serializer_class': globals()[serializer_class_name],
        'filter_backends': [filters.OrderingFilter],
    })
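The type()-based generation pattern itself can be exercised without Django or DRF installed; a minimal sketch of the same mechanism with a plain base class and stand-in models:

```python
# The same dynamic-class pattern as the generators above, demonstrated with
# plain classes so it runs without Django/DRF. BaseSerializer and the models
# dict are stand-ins for serializers.ModelSerializer and apps.all_models['api'].
class BaseSerializer:
    pass

models = {'book': object, 'author': object}
for name, model in models.items():
    class_name = name[:1].upper() + name[1:] + 'Serializer'
    # Build the inner Meta class, then the serializer class, at runtime.
    Meta = type('Meta', (object,), {'model': model, 'fields': '__all__'})
    globals()[class_name] = type(class_name, (BaseSerializer,), {'Meta': Meta})

print(BookSerializer.Meta.fields)  # '__all__'
```

Each generated class is a real class object, so subclass checks and attribute access behave exactly as with hand-written classes.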

django admin specify database (not default)

I am using the Django admin to facilitate editing a database. Recently we added another database, so keeping everything under default in DATABASES in the settings file no longer makes sense.
I now have this in my settings.py file:
DATABASES = {
    'default': {},
    'animal_tracking': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'AnimalTracking',
        'USER': 'foo',
        'PASSWORD': 'bar',
        'HOST': '127.0.0.1',
        'PORT': '3306',
    },
    'animal_information': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'AnimalInformation',
        'USER': 'foo',
        'PASSWORD': 'bar',
        'HOST': '127.0.0.1',
        'PORT': '3306',
    },
}
My admin.py file contains:
from django.contrib import admin
from animal_tracking.models import at_animal_types, at_animals

# Register your models here.
class AnimalTypesAdmin(admin.ModelAdmin):
    # Sets how the fields are displayed in the add / change section.
    fieldsets = [
        (None, {'fields': ['type', ]}),
    ]
    # Sets what fields to be displayed in the change (view) section.
    list_display = ('type', )

# Registers the current model administration to the admin page.
admin.site.register(at_animal_types, AnimalTypesAdmin)

class AnimalsAdmin(admin.ModelAdmin):
    # Sets how the fields are displayed in the add / change section.
    fieldsets = [
        (None, {'fields': ['tracker_id', 'type', ]}),
        ('Log information (should not change)', {'fields': ['last_log', 'last_bat', 'last_lat', 'last_lon', ], 'classes': ['collapse']}),
    ]
    # Sets what fields to be displayed in the change (view) section.
    list_display = ('tracker_id', 'type', 'last_log', 'last_bat', 'last_lat', 'last_lon', )
    # Sets what fields to allow filtering by.
    list_filter = ['type', ]
    # Sets what fields to allow searching by. Use table__field if foreign key.
    search_fields = ['tracker_id', 'type__type', ]

# Registers the current model administration to the admin page.
admin.site.register(at_animals, AnimalsAdmin)
Initially, the admin section connected to the default database. But now that I removed default and added the other two databases, I'm getting the following error:
If I copy the settings of animal_tracking to default, it works. My question therefore is:
How can I specify which database django admin should use?
Thank you!
You'd better keep a default database configured; if you leave default empty, you have to write database routers.
In your admin, you can use any database you like:
class AnimalTypesAdmin(admin.ModelAdmin):
    using = 'animal_tracking'

    def get_queryset(self, request):
        # Tell Django to look for objects on the 'other' database.
        return super(AnimalTypesAdmin, self).get_queryset(request).using(self.using)
save_model, delete_model, formfield_for_foreignkey and formfield_for_manytomany should also be overridden in the same way, as in the multi-database admin example in the Django documentation.
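Those overrides follow the multi-database admin example in the Django documentation; a sketch of them collected into a reusable mixin (MultiDBAdminMixin is an illustrative name; 'animal_tracking' is the alias from the settings above):

```python
# A sketch of the multi-database ModelAdmin overrides as a reusable mixin.
# MultiDBAdminMixin is an illustrative name; 'animal_tracking' is the alias
# from the question's DATABASES setting.
class MultiDBAdminMixin:
    using = 'animal_tracking'

    def save_model(self, request, obj, form, change):
        # Route saves to the non-default database.
        obj.save(using=self.using)

    def delete_model(self, request, obj):
        # Route deletes to the same database.
        obj.delete(using=self.using)

    def get_queryset(self, request):
        # Route reads (changelist, object lookups) to the same database.
        return super().get_queryset(request).using(self.using)
```

A concrete admin would then inherit from both the mixin and admin.ModelAdmin, e.g. `class AnimalTypesAdmin(MultiDBAdminMixin, admin.ModelAdmin): ...`; formfield_for_foreignkey and formfield_for_manytomany need the same treatment when foreign keys are involved.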

Django "no such table" when using ModelForm.save

I have a simple model and I want to save it using the ModelForm.
Here's the code:
#models.py
class MyArchive(models.Model):
    archive_id = models.CharField(max_length=20, primary_key=True)
    description = models.CharField(max_length=50, blank=True)
    archive_file = models.FileField(upload_to="my_archives/")
#views.py
class MyArchiveForm(ModelForm):
    class Meta:
        model = MyArchive

def upload(request):
    if request.method == 'POST':
        form = MyArchiveForm(request.POST, request.FILES)
        if form.is_valid():
            form.save()
            return HttpResponse('uploaded success!')
        else:
            logger.debug("invalid form")
    return HttpResponse('upload fail!')
I've synced the DB and saw the tables created. But every time it gets to form.save, it says:
DatabaseError, no such table.
Is my way of saving via ModelForm wrong?
UPDATE:
What's even weirder: when I removed the form.save() line, it failed at if form.is_valid() with the same no such table error, but when I ran Django in debug mode, if form.is_valid() worked fine.
It was a subtle issue that turned out to be caused by a bad database configuration: I was using a relative path in settings.py, so Django couldn't find the db file at runtime, but when I ran django syncdb it could find where the db is located.
'default': {
    'ENGINE': 'django.db.backends.sqlite3',  # Add 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
    'NAME': 'sqlite.db',  # Or path to database file if using sqlite3.
}
UPDATE:
To use a relative path for the db file, I should have had the db setting like this:
from os.path import dirname, join

PROJECT_DIR = dirname(__file__)

DATABASES = {
    # ...
    'NAME': join(PROJECT_DIR, 'sqlite.db'),
    # ...
}
see Can I make the Django database path (for sqlite3) "cross-platform"?
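The difference is easy to demonstrate outside Django; a standalone sketch, using os.getcwd() purely as a stand-in for the settings module's directory:

```python
import os

# Why the relative NAME broke: a bare 'sqlite.db' is resolved against the
# process's current working directory, which differs between running
# `manage.py syncdb` from the project directory and running the server.
relative_name = 'sqlite.db'
print(os.path.isabs(relative_name))  # False: resolution depends on cwd

# Anchoring the path to a fixed directory (dirname(__file__) in settings;
# os.getcwd() here only as a stand-in) yields an absolute, cwd-independent path.
PROJECT_DIR = os.getcwd()
stable_name = os.path.join(PROJECT_DIR, 'sqlite.db')
print(os.path.isabs(stable_name))    # True
```

Any process can then open the database file regardless of where it was started from, which is exactly what the runtime "no such table" error was missing.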