I have some strange behaviour on my Django/PostgreSQL system.
After saving a model object, the primary key is None, although it is an AutoField and the id is correctly saved in the database.
The following snippet returns None for the id:
>>> a = SomeModelClass()
>>> a.someattribute = 'xyz'
>>> a.save()
>>> a.someattribute
'xyz'
>>> print(a.id)
None
The model class looks roughly like this:
class SomeModelClass(models.Model):
    id = models.AutoField(db_column='id', primary_key=True)
    someattribute = models.CharField(db_column='someattribute', max_length=200)
This behaviour occurs only on this model; all other models work fine.
The problem appeared one day without changing the model structure.
Perhaps there is some problem with the data integrity of the database? Using another database server it works fine.
Best regards!
I solved the problem now. The relationship between the sequence and the serial column was somehow destroyed. A simple
ALTER SEQUENCE <<sequence_name>> OWNED BY <<table_name>>.<<pk_column_name>>;
solved the problem.
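For anyone hitting the same thing, here is a rough sketch of checking and repairing this from a Django shell; the table, column, and sequence names below are placeholders for your own:

from django.db import connection

with connection.cursor() as cursor:
    # If the ownership link is broken, pg_get_serial_sequence() returns NULL,
    # and Django cannot read back the new id after an INSERT.
    cursor.execute("SELECT pg_get_serial_sequence('myapp_somemodelclass', 'id')")
    print(cursor.fetchone())

    # Re-attach the sequence to the primary key column.
    cursor.execute(
        "ALTER SEQUENCE myapp_somemodelclass_id_seq "
        "OWNED BY myapp_somemodelclass.id;"
    )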
Best regards!
Can you check your PostgreSQL log and find out what query is being fired? This might give some clues. Also, write a quick unit test to exercise the same code and see if it works.
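A minimal sketch of such a test (the app path and model name are assumptions based on the question):

from django.test import TestCase
from myapp.models import SomeModelClass  # adjust the import to your app

class SomeModelClassPkTest(TestCase):
    def test_pk_is_set_after_save(self):
        a = SomeModelClass(someattribute='xyz')
        a.save()
        self.assertIsNotNone(a.id)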
Can anyone help me out here? I am inserting data into my PostgreSQL DB.
admin_created is a BooleanField set to False by default. I've provided a True value for the first workout, but left the BooleanField blank for the second workout. Based on my understanding it should automatically be set to False, but I'm getting the error message below. Any ideas on why this is happening?
#.sql
INSERT INTO main_app_workout(name, description, admin_created)
VALUES
('Yoga', 'Roll up your yoga mat and discover the combination of physical and mental exercises that have hooked yoga practitioners around the globe.', 'True');
INSERT INTO main_app_workout(name, description)
VALUES
('Boxing', 'Ready to get your sweat on? Learn the six basic punches to build the foundation of an experienced boxer.');
#models.py
class Workout(Model):
    name = models.CharField(max_length=40)
    description = models.TextField()
    exercises = ManyToManyField(Exercise, blank=True)
    admin_created = models.BooleanField(default=False)
#Error code
psql:db/create_main_exercises.sql:49: ERROR: 23502: null value in column "admin_created" of relation "main_app_workout" violates not-null constraint
EDIT:
Thank you all for the comments. My solution to this problem was to provide True values for admin_created in the seeded data. In addition, I changed the admin_created field to
admin_created = models.BooleanField(null=True, default=False)
When I create new instances of the model in Django it automatically sets it to False.
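For example, creating a row through the ORM (a quick sketch using the model above) fills in the default:

w = Workout.objects.create(name='Running', description='Lace up and go.')
print(w.admin_created)  # False, supplied by Django's default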
Django is built to use the ORM; if you are doing this insertion manually, Django can't set the default value for you, so NULL will be passed.
Field-based defaults and constraints are only applied at the code level in Django, not at the DB level.
So if you create an object programmatically in Django you won't face this issue, but when it is created using raw SQL you will.
Only a few constraints can be applied at the DB level, and only since Django 2.2.
Check the constraints documentation for Django for details.
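If you want the default to hold for raw SQL inserts as well, one option is to push it down to the database in a migration. A sketch, assuming the app is main_app and the previous migration is 0001_initial:

from django.db import migrations

class Migration(migrations.Migration):
    dependencies = [('main_app', '0001_initial')]  # assumed previous migration

    operations = [
        migrations.RunSQL(
            "ALTER TABLE main_app_workout ALTER COLUMN admin_created SET DEFAULT FALSE;",
            reverse_sql="ALTER TABLE main_app_workout ALTER COLUMN admin_created DROP DEFAULT;",
        ),
    ]

After that, an INSERT that omits admin_created will get FALSE from PostgreSQL itself.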
class VehicleModel(db.Model):
    __tablename__ = 'vehicle'
    id = db.Column(db.Integer, primary_key=True)

class DriverModel(db.Model):
    __tablename__ = 'driver'
    id = db.Column(db.Integer, primary_key=True)
    v_id = db.Column(db.Integer, db.ForeignKey('vehicle.id'))
    v_rel = db.relationship('VehicleModel', backref=db.backref('vehicle', uselist=False))

    def update(self, id, v_id):
        self.id = id
        self.v_id = v_id
        db.session.commit()
In the above code, I am not able to update the value of v_id. SQLAlchemy is not throwing any SQL error either. The update method runs fine, but it does not update the v_id value; v_id remains the same as it was set when the row was first added.
This is because Alembic can't do particularly detailed checks on your db.Models, so you have to change the migration script yourself. If you're using Flask-Migrate it'll be in a migrations/versions folder. Change it, using the Alembic syntax found elsewhere in the file, to alter the foreign key.
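A sketch of what that hand edit might look like in the generated revision file (the constraint name and the nullable setting here are assumptions):

from alembic import op
import sqlalchemy as sa

def upgrade():
    # add the column and foreign key that autogenerate missed
    op.add_column('driver', sa.Column('v_id', sa.Integer(), nullable=True))
    op.create_foreign_key('fk_driver_vehicle', 'driver', 'vehicle', ['v_id'], ['id'])

def downgrade():
    op.drop_constraint('fk_driver_vehicle', 'driver', type_='foreignkey')
    op.drop_column('driver', 'v_id')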
Well it worked automatically after making certain changes in the relation, so never mind!
And thank you all for your responses!
I'm running Django 1.5 with SQLite, and I have a model called Assignment. Whenever I create one, it gets created with the correct pk value. But, whenever I try to retrieve any Assignment from my database, it is always returned with a pk of 90. I've been fighting this for an hour, and I have to admit I'm completely confused.
Here's my code, if it's any use.
class Assignment(models.Model):
    class Meta:
        app_label = 'holiday'
        unique_together = ('year', 'employee')

    year = models.PositiveIntegerField(db_index=True)
    employee = models.ForeignKey('bsc.Employee', db_index=True)
    days = models.PositiveIntegerField(null=True)
This, and a bunch of methods that calculate some values based on models related to this one. Nothing fancy.
I've got to add that this model has had a somewhat rough past - with all my absent-mindedness, I had originally set year as the primary key, which quickly failed as soon as I added Assignments for the same year to two different employees. Maybe I should look at the DB schema and see if anything's wrong. Thankfully, the app hasn't made it to production yet, but hopefully this can be fixed without a full DB reset.
If you had created 90 previous records and then deleted the rows from your database, the database's key index will still be set to what would have been the next primary key number.
The way to resolve this is described in this other Stack Overflow post:
SQLite Reset Primary Key Field
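For reference, resetting SQLite's internal AUTOINCREMENT counter looks roughly like this from a Django shell (the table name is assumed from the model above; only do this on a development database):

from django.db import connection

cursor = connection.cursor()
# clear the stored counter so the next insert starts from the current max rowid
cursor.execute("DELETE FROM sqlite_sequence WHERE name = 'holiday_assignment'")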
I have a model (let's call it Entity) that has an attribute (Attribute) that changes over time, but I want to keep a history of how that attribute changes in the database. I need to be able to filter my Entities by the current value of Attribute in its manager. But because Django (as far as I can tell) won't let me do this in one query natively, I have created a database view that produces the latest value of Attribute for every Entity. So my model structure looks something like this:
class Entity(models.Model):
    def set_attribute(self, value):
        self.attribute_history.create(value=value)

    def is_attribute_positive(self, value):
        return self.attribute.value > 0

class AttributeEntry(models.Model):
    entity = models.ForeignKey(Entity, related_name='attribute_history')
    value = models.IntegerField()
    time = models.DateTimeField(auto_now_add=True)

class AttributeView(models.Model):
    id = models.IntegerField(primary_key=True, db_column='id')
    entity = models.OneToOneField(Entity, related_name='attribute')
    value = models.IntegerField()
    time = models.DateTimeField()

    class Meta:
        managed = False
My database has the view that produces the current attribute, created with SQL like this:
CREATE VIEW myapp_attributeview AS
SELECT h1.*
FROM myapp_attributehistory h1
LEFT OUTER JOIN myapp_attributehistory h2
ON h1.entity_id = h2.entity_id
AND (h1.time < h2.time
OR h1.time = h2.time
AND h1.id < h2.id)
WHERE h2.id IS NULL;
My problem is that if I set the attribute on a model object using set_attribute(), checking it with is_attribute_positive() doesn't always work, because Django may be caching the related AttributeView object. How can I make Django update its model, at the very least by re-querying the view? Can I mark the attribute property as dirty somehow?
PS: the whole reason I'm doing this is so I can do things like Entity.objects.filter(attribute__value__exact=...).filter(...), so if someone knows an easier way to get that functionality, such an answer will be accepted, too!
I understand that the attribute value is modified by another process (maybe not even Django) accessing the same database. If this is not the case, you should take a look at django-reversion.
On the other hand, if that is the case, you should take a look at the second answer to this question. It says that committing the transaction invalidates the query cache, and it offers this snippet:
>>> from django.db import transaction
>>> transaction.enter_transaction_management()
>>> transaction.commit() # Whenever you want to see new data
I never directly solved the problem, but I was able to sidestep it by changing is_attribute_positive() to query the database table directly, instead of the view.
def is_attribute_positive(self, value):
    return self.attribute_history.latest('time').value > 0
So while the view gives me the flexibility of being able to filter queries on Entity, it seems the best thing to do once the object is retrieved is to operate directly on the table-backed model.
Suppose I have the following models:
class Thing(models.Model):
    name = models.CharField(max_length=100)
    ratings = models.ManyToManyField('auth.User', through='Rating')

class Rating(models.Model):
    user = models.ForeignKey('auth.User')
    thing = models.ForeignKey('Thing')
    rating = models.IntegerField()
So I have a lot of things, and every user can rate each of them. I also have a view showing a list of all things (and they are huge in number) with the rating that the current user assigned to each of them. I need a way to retrieve all the data from the database: Thing objects with an additional field user_rating taken from at most one (because we have a fixed User) related Rating object.
The trivial solution looks like this:
things = Thing.objects.all()
for thing in things:
    try:
        thing.user_rating = thing.rating_set.get(user=request.user).rating
    except Rating.DoesNotExist:
        thing.user_rating = None
But the flaw of this approach is obvious: if we have 500 things, we'll make 501 requests to the database. Per page. Per user. And this is the most viewed page of the site. The task is easily solvable with SQL JOINs, but in practice I have a more complicated schema and I will certainly benefit from the Django model framework. So the question is: is it possible to do this the Django way? It would be really strange if it weren't, considering that such tasks are very common.
As far as I understand, neither annotate() nor select_related() will help me here.
I guess you should try this:
https://docs.djangoproject.com/en/1.3/ref/models/querysets/#extra
Example
result = Thing.objects.all().extra(
    select={'rating': 'SELECT rating FROM myapp_rating WHERE myapp_rating.thing_id = myapp_thing.id AND myapp_rating.user_id = %s'},
    select_params=(request.user.id,))
Your result set gets a new field 'rating' for each 'thing' object (adjust myapp_rating and myapp_thing to your real table names).
I use this approach in one of my recent projects. It produces one complex query instead of n+1 queries.
Hope this helps :)
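For reference, on newer Django versions (1.7+) the same thing can be done without raw SQL by prefetching only the current user's ratings. A sketch, using the models from the question:

from django.db.models import Prefetch

things = Thing.objects.prefetch_related(
    Prefetch('rating_set',
             queryset=Rating.objects.filter(user=request.user),
             to_attr='user_ratings'))
for thing in things:
    # at most one Rating per (user, thing), so take it if present
    thing.user_rating = thing.user_ratings[0].rating if thing.user_ratings else None

This issues two queries no matter how many things there are.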
Since you are planning to display everything on one page, I can think of this approach; you can give it a try:
Get all the ratings given by the current user and get all the Things.
Now try to create a dictionary like this:
thing_dict = {}
for thing in Thing.objects.all():
    thing_dict[thing] = None
for rating in Rating.objects.filter(user=request.user):
    thing_dict[rating.thing] = rating
Now thing_dict contains every Thing as a key, with the current user's Rating (or None) as its value.
May not be the best way; I am keen to see what others answer.