How to update a Foreign Key value in Flask Sqlalchemy? - flask

class VehicleModel(db.Model):
    __tablename__ = 'vehicle'
    id = db.Column(db.Integer, primary_key=True)


class DriverModel(db.Model):
    __tablename__ = 'driver'
    id = db.Column(db.Integer, primary_key=True)
    v_id = db.Column(db.Integer, db.ForeignKey('vehicle.id'))
    v_rel = db.relationship('VehicleModel', backref=db.backref('vehicle', uselist=False))

    def update(self, id, v_id):
        self.id = id
        self.v_id = v_id
        db.session.commit()
With the code above I am not able to update the value of v_id, and SQLAlchemy is not throwing any SQL error either. The update method seems to run fine, but it does not change the v_id value; v_id keeps the value it was given when the row was first added.

This is because Alembic can't do particularly detailed checks on your db.Model classes, so you have to change the migration script yourself. If you're using Flask-Migrate it will be in a migrations/versions folder. Edit it using the Alembic syntax found elsewhere in the file to change a foreign key.
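For reference, a rough sketch of what such a hand-edited migration might contain, using the table and column names from the models above; the revision identifiers and the constraint name are placeholders, and op.create_foreign_key is only one of several operations you might need:
# Hypothetical hand-edited script under migrations/versions/ (Flask-Migrate/Alembic).
from alembic import op
import sqlalchemy as sa

revision = 'xxxx'        # placeholder
down_revision = 'yyyy'   # placeholder


def upgrade():
    # Add the column (if it is new) and attach the foreign key to vehicle.id.
    op.add_column('driver', sa.Column('v_id', sa.Integer(), nullable=True))
    op.create_foreign_key('fk_driver_vehicle', 'driver', 'vehicle', ['v_id'], ['id'])


def downgrade():
    op.drop_constraint('fk_driver_vehicle', 'driver', type_='foreignkey')
    op.drop_column('driver', 'v_id')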

Well it worked automatically after making certain changes in the relation, so never mind!
And thank you all for your responses!
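For reference, a minimal sketch of how a foreign key value is normally updated with Flask-SQLAlchemy, using the models above (the ids 1 and 2 are just example values): load the row inside the session, assign either the FK column or the relationship, and commit.
driver = DriverModel.query.get(1)        # row to change, loaded into the session
new_vehicle = VehicleModel.query.get(2)  # target vehicle

driver.v_id = new_vehicle.id             # set the FK column directly...
# driver.v_rel = new_vehicle             # ...or assign through the relationship instead

db.session.commit()                      # emits the UPDATE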

Related

Update last inserted record in Django

In Django, I want to retrieve the last inserted record from the database and update its values.
I use this model:
class User(models.Model):
    name = models.CharField(max_length=15)
I run the following code to retrieve the last inserted record and update the name:
User.objects.last().update(name='NEW NAME')
However, the error is that update is not a known method.
Does .last() indeed return the entire record, or only the primary key?
Thank you very much.
Does .last() indeed return the entire record, or only the primary key?
.last() [Django-doc] returns the last User object, or None, if there is no such record.
Now a single model object indeed has no .update(..) method; only a QuerySet has an update(..) [Django-doc] method. You can thus, for example, retrieve the object, alter the field, and then save it:
last_user = User.objects.last()
if last_user is not None:
    last_user.name = 'NEW NAME'
    last_user.save()
You must first get the User objects and then take the last one, like this:
user = User.objects.all().last()
and now you can update with this code
user.name = 'Daniel'
user.save()
Suppose your model/database is this:
class User(models.Model):
    name = models.CharField(max_length=15)
Then you go to the terminal and start the shell to do some testing. For that you do
python manage.py shell
In the shell, first you need to import your database/model
from nameofyourapp.models import User
To get the last element you could write
lastUser = User.objects.all().last()
Now change the last User
lastUser.name = "Josh"
Save the changes
lastUser.save()
Exit from the terminal

Refreshing a model's unmanaged related model in Django

I have a model (let's call it Entity) that has an attribute (Attribute) that changes over time, but I want to keep a history of how that attribute changes in the database. I need to be able to filter my Entities by the current value of Attribute in its manager. But because Django (as far as I can tell) won't let me do this in one query natively, I have created a database view that produces the latest value of Attribute for every Entity. So my model structure looks something like this:
class Entity(models.Model):
    def set_attribute(self, value):
        self.attribute_history.create(value=value)

    def is_attribute_positive(self, value):
        return self.attribute.value > 0


class AttributeEntry(models.Model):
    entity = models.ForeignKey(Entity, related_name='attribute_history')
    value = models.IntegerField()
    time = models.DateTimeField(auto_now_add=True)


class AttributeView(models.Model):
    id = models.IntegerField(primary_key=True, db_column='id')
    entity = models.OneToOneField(Entity, related_name='attribute')
    value = models.IntegerField()
    time = models.DateTimeField()

    class Meta:
        managed = False
My database has the view that produces the current attribute, created with SQL like this:
CREATE VIEW myapp_attributeview AS
SELECT h1.*
FROM myapp_attributehistory h1
LEFT OUTER JOIN myapp_attributehistory h2
ON h1.entity_id = h2.entity_id
AND (h1.time < h2.time
OR h1.time = h2.time
AND h1.id < h2.id)
WHERE h2.id IS NULL;
My problem is that after I set the attribute on a model object using set_attribute(), checking it with is_attribute_positive() doesn't always work, because Django may be caching the related AttributeView object. How can I make Django refresh its model, at the very least by re-querying the view? Can I mark the attribute property as dirty somehow?
PS: the whole reason I'm doing this is so I can do things like Entity.objects.filter(attribute__value__exact=...).filter(...), so if someone knows an easier way to get that functionality, such an answer will be accepted, too!
I understand that the attribute value is modified by another process (maybe not even Django) accessing the same database. If this is not the case, you should take a look at django-reversion.
On the other hand, if that is the case, you should take a look at the second answer of this. It says that committing a transaction invalidates the query cache, and it offers this snippet.
>>> from django.db import transaction
>>> transaction.enter_transaction_management()
>>> transaction.commit() # Whenever you want to see new data
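If committing a transaction feels heavy-handed, a simpler option (a sketch, reusing the models from the question) is to avoid the cached relation and issue a fresh query:
# Fetch the current row from the view explicitly instead of going through
# the possibly cached entity.attribute relation:
current = AttributeView.objects.get(entity=entity)
is_positive = current.value > 0

# Re-fetching the Entity itself also discards any cached related objects:
entity = Entity.objects.get(pk=entity.pk)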
I never directly solved the problem, but I was able to sidestep it by changing is_attribute_positive() to query the database table directly, instead of the view.
def is_attribute_positive(self, value):
    return self.attribute_history.latest().value > 0
So while the view gives me the flexibility of being able to filter queries on Entity, it seems the best thing to do once the object is received is to operate directly on the table-backed model.

Django: making a custom PK auto-increment?

I've been using custom primary keys for a model in Django. (This was because I was importing values into the database and they already had ID's attached, and it made sense to preserve the existing values.)
class Transaction(models.Model):
    id = models.IntegerField(primary_key=True)
    transaction_type = models.IntegerField(choices=TRANSACTION_TYPES)
    date_added = models.DateTimeField(auto_now_add=True)
However, now I want to add new instances of the model to the database, and I'd like to autogenerate a unique primary key. But if I don't specify the ID at the time of creating the instance, I get an error:
t = Transaction(transaction_type=0)
t.save()
gives:
IntegrityError at /page
(1048, "Column 'id' cannot be null")
How can I autogenerate a unique ID to specify for new values, without having to alter the way I import the existing values?
UPDATE
I've written this custom method, which seems to work...
class Transaction(models.Model):
    def save(self, *args, **kwargs):
        if not self.id:
            i = Transaction.objects.all().order_by('-id')[0]
            self.id = i.id + 1
        super(Transaction, self).save(*args, **kwargs)
You can use AutoField for the column id instead of IntegerField. The following should work for you:
id = models.AutoField(primary_key=True)
id will now increase automatically and won't run into the concurrency problems that the custom save method may encounter.
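A short sketch of the resulting model and usage (TRANSACTION_TYPES is assumed to be defined as in the question, and the underlying database column must actually be auto-incrementing, e.g. AUTO_INCREMENT in MySQL or a serial sequence in PostgreSQL):
class Transaction(models.Model):
    id = models.AutoField(primary_key=True)
    transaction_type = models.IntegerField(choices=TRANSACTION_TYPES)
    date_added = models.DateTimeField(auto_now_add=True)

t = Transaction(transaction_type=0)  # no id supplied
t.save()
print(t.id)                          # populated by the database after save()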
I've ended up using a very similar piece of code, but made it slightly more generic:
def save(self, *args, **kwargs):
    if self.id is None:
        self.id = self.__class__.objects.all().order_by("-id")[0].id + 1
    super(self.__class__, self).save(*args, **kwargs)
It uses self.__class__, so you can just copy and paste this code into any model class without changing anything.
How are you importing the existing values? It would be trivial to write something into your Transaction's __init__ to generate a new ID for you, but without knowing how you're importing the other values I can't say for sure whether it will alter the way you work with them.
If you remove your declared id field, django will automatically assume this:
id = models.AutoField(primary_key=True)
In Django 1.8, inspectdb will automatically detect auto_increment and use an AutoField when generating models.
Django migrations will do most of the hard work for you here.
Firstly, stop any access to your app so users can't change the database whilst you are working on it.
It would then be very wise to backup your database, before performing any work, as a precaution.
Remove your manually declared id field from your models.py (i.e. delete it).
Run makemigrations and then migrate. Django will modify the id field to the correct implementation for your database version.
Run this (example) command in psql adapting, if need be, to your table names:
select setval(pg_get_serial_sequence('transactions_transaction', 'id'), max(id)) from transactions_transaction;
This will set your id field to the correct serial sequence value in postgres for your table (i.e. the largest value of the id field of your existing records). This is crucial, as otherwise the value will be 1!
And that's it: from now on everything will be automatic again.
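Putting those steps together, a sketch of the end state (field names follow the question):
# models.py after the change: no explicit id field at all; Django adds an
# implicit `id = models.AutoField(primary_key=True)` for you.
class Transaction(models.Model):
    transaction_type = models.IntegerField(choices=TRANSACTION_TYPES)
    date_added = models.DateTimeField(auto_now_add=True)

# Then run `python manage.py makemigrations` and `python manage.py migrate`,
# and reset the sequence with the setval query shown above (PostgreSQL only).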

Django: pkey is none after saving

I have a strange behaviour on my Django/PostgreSQL system.
After saving a model object, the primary key is None although it's an AutoField and the id is correctly saved in the database.
The following script passage returns None for the id:
a = SomeModelClass()
a.someattribute = 'xyz'
a.save()
a.someattribute
>>> 'xyz'
a.id
>>> None
The model class looks somehow like this:
class SomeModelClass(models.Model):
    id = models.AutoField(db_column='id', primary_key=True)
    someattribute = models.CharField(db_column='someattribute', max_length=200)
This behaviour occurs only on this model; all other models work fine.
The problem appeared one day without changing the model structure.
Perhaps there is some problem with the data integrity of the database? Using another database server it works fine.
Best regards!
I solved the problem now. The relationship between the sequence and the serial column was somehow destroyed. A simple
ALTER SEQUENCE <<sequence_name>> OWNED BY <<table_name>>.<<pk_column_name>>;
solved the problem.
Best regards!
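With concrete names filled in (these are hypothetical; check the real sequence name with pg_get_serial_sequence or \d in psql), the repair can also be run from Django through a cursor, for example:
from django.db import connection

# Re-attach the sequence to its serial column; the names below are made up
# ('myapp_somemodelclass' / 'myapp_somemodelclass_id_seq').
with connection.cursor() as cursor:
    cursor.execute(
        "ALTER SEQUENCE myapp_somemodelclass_id_seq "
        "OWNED BY myapp_somemodelclass.id;"
    )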
Can you check your PostgreSQL log and find out what query is being fired? This might give some clues. Also, write a quick unit test that exercises the same code and see if it works.

Django: Querying read-only view with no primary key

class dbview(models.Model):
    # field definitions omitted for brevity

    class Meta:
        db_table = 'read_only_view'


def main(request):
    result = dbview.objects.all()
Caught an exception while rendering: (1054, "Unknown column 'read_only_view.id' in 'field list'")
There is no primary key I can see in the view. Is there a workaround?
Comment:
I have no control over the view I am accessing with Django. MySQL browser shows columns there but no primary key.
When you say 'I have no control over the view I am accessing with Django. MySQL browser shows columns there but no primary key.'
I assume you mean that this is a legacy table and you are not allowed to add or change columns?
If so, and there really isn't a primary key (even a string or non-int column), then the table hasn't been set up very well and performance might well stink.
It doesn't matter to you though. All you need is a column that is guaranteed to be unique for every row. Set that to primary_key=True in your model and Django will be happy.
There is one other possibility that would be problematic. If there is no column that is guaranteed to be unique, then the table might be using composite primary keys. That is, it is specifying that two columns taken together will provide a unique primary key. This is perfectly valid relational modelling, but unfortunately unsupported by Django. In that case you can't do much besides raw SQL unless you can get another column added.
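A minimal sketch of that suggestion, mapping the view to an unmanaged model and marking whichever column is guaranteed unique as the primary key (the column name here is made up):
class ReadOnlyView(models.Model):
    # 'account_number' is a hypothetical column that is unique per row of the view.
    account_number = models.CharField(max_length=32, primary_key=True)
    # ... the remaining view columns ...

    class Meta:
        managed = False
        db_table = 'read_only_view'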
I have this issue all the time. I have a view that I can't or don't want to change, but I want to have a page to display composite information (maybe in the admin section). I just override the save and raise a NotImplementedError:
def save(self, **kwargs):
    raise NotImplementedError()
(although this is probably not needed in most cases, but it makes me feel a bit better)
I also set managed to False in the Meta class.
class Meta:
    managed = False
Then I just pick any field and tag it as the primary key. It doesn't matter if it's really unique if you are just doing filters for displaying information on a page, etc.
Seems to work fine for me. Please comment if there are any problems with this technique that I'm overlooking.
If there really is no primary key in the view, then there is no workaround.
Django requires each model to have exactly one field primary_key=True.
There should have been an auto-generated id field when you ran syncdb (if there is no primary key defined in your model, then Django will insert an AutoField for you).
This error means that Django is asking your database for the id field, but none exists. Can you run python manage.py dbshell and then DESCRIBE read_only_view; and post the result? This will show all of the columns that are in the database.
Alternatively, can you include the model definition you excluded? (and confirm that you haven't altered the model definition since you ran syncdb?)
I know this post is over a decade old, but I ran into this recently and came to SO looking for a good answer. I had to come up with a solution that addresses the OP's original question, and, additionally, allows for us to add new objects to the model for unit testing purposes, which is a problem I still had with all of the provided solutions.
main.py
from django.db import models


def in_unit_test_mode():
    """some code to detect if you're running unit tests with a temp SQLite DB, like..."""
    import sys
    return "test" in sys.argv

"""You wouldn't want to actually implement it with the import inside here. We have a setting in our django.conf.settings that tests to see if we're running unit tests when the project starts."""


class AbstractReadOnlyModel(models.Model):
    class Meta(object):
        abstract = True
        managed = in_unit_test_mode()

    """This is just to help you fail fast in case a new developer, or future you, doesn't realize this is a database view and not an actual table and tries to update it."""
    def save(self, *args, **kwargs):
        if not in_unit_test_mode():
            raise NotImplementedError(
                "This is a read only model. We shouldn't be writing "
                "to the {0} table.".format(self.__class__.__name__)
            )
        else:
            super(AbstractReadOnlyModel, self).save(*args, **kwargs)


class DbViewBaseModel(AbstractReadOnlyModel):
    not_actually_unique_field = models.IntegerField(primary_key=True)
    # the rest of your field definitions

    class Meta:
        db_table = 'read_only_view'


if in_unit_test_mode():
    class DbView(DbViewBaseModel):
        not_actually_unique_field = models.IntegerField()
        """This line removes the primary key property from the 'not_actually_unique_field' when running unit tests, so Django will create an AutoField named 'id' on the table it creates in the temp DB that it creates for running unit tests."""
else:
    class DbView(DbViewBaseModel):
        pass


class MainClass(object):
    @staticmethod
    def main_method(request):
        return DbView.objects.all()
test.py
from django.test import TestCase
from main import DbView
from main import MainClass


class TestMain(TestCase):
    @classmethod
    def setUpTestData(cls):
        cls.object_in_view = DbView.objects.create(
            # Enter fields here to create test data you expect to be returned from your method.
        )

    def testMain(self):
        objects_from_view = MainClass.main_method(None)  # the request argument isn't used here
        returned_ids = [object.id for object in objects_from_view]
        self.assertIn(self.object_in_view.id, returned_ids)