Django modified_at field with database-level auto update

I need to have modified_at fields in my Django project: a field that updates every time a row is updated in the database, regardless of where the update comes from, whether through calling .save(), through queryset.update(), or even when the update happens directly in the database and not through the Django app.
There is an auto_now option, but according to this SO question (based on the Django documentation) it does not solve my problem.
Other SO questions (like this and this) ask for the same thing: updating the field on every change, not only on .save().
This problem can be solved using triggers, as said here, but that way we need to write the same trigger for every modified_at field in our models.
As discussed in this Django ticket, this problem will not be addressed in Django itself; even the suggested patch only updates the instance if it changes via Django.
The only way that comes to my mind is to do something like this in a mixin: a mixin that, when inherited, creates a trigger for fields with auto_now=True, perhaps by changing the SQL Django generates when creating the model. But I don't know how to implement this.
So I have two questions:
What is the best way to achieve database-level updates for modified_at fields?
If my suggested way is the best option, how do I implement it?
I would like a database-agnostic solution, but FYI, I'm currently using PostgreSQL.

As you said, if you use triggers you'd have to create a trigger for every table.
To make this easier, however, you could create a migration file that creates/destroys the trigger. Here's an example.
Then it would just be a matter of copy-pasting that migration file whenever you create a new model.
Edit:
You could even override the makemigrations command to automatically add the trigger-creation step to the operations of the initial migration file. Here's an answer that shows how to override management commands.
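As a sketch of what that copy-pasted migration could contain (the helper function and the table name myapp_mymodel are illustrative assumptions, not from the answer): for PostgreSQL, the forward/reverse SQL for such a trigger could be generated like this and passed to migrations.RunSQL.

```python
def modified_at_trigger_sql(table, column="modified_at"):
    """Return (forward, reverse) SQL for a PostgreSQL trigger that
    refreshes `column` on every UPDATE, regardless of its origin."""
    fn = f"set_{table}_{column}"
    forward = f"""
CREATE OR REPLACE FUNCTION {fn}() RETURNS trigger AS $$
BEGIN
    NEW.{column} := now();
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER {table}_{column}_trg
BEFORE UPDATE ON {table}
FOR EACH ROW EXECUTE PROCEDURE {fn}();
"""
    reverse = f"""
DROP TRIGGER IF EXISTS {table}_{column}_trg ON {table};
DROP FUNCTION IF EXISTS {fn}();
"""
    return forward, reverse

# In the migration's operations list, something like:
#   migrations.RunSQL(*modified_at_trigger_sql("myapp_mymodel"))
```

Because the trigger runs in the database, it fires on queryset.update() and on raw SQL as well, which is exactly what auto_now cannot do.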

You can use Django's auto_now field option for that.
If you want to auto-populate the DateTimeField when creating a new object, you can do this: created_at = models.DateTimeField(auto_now_add=True, verbose_name=_("created at"))
If you want the field updated any time the object is saved, you can do this instead: updated_at = models.DateTimeField(auto_now=True, verbose_name=_("updated at"))
Note that auto_now only takes effect when .save() is called; it does not fire on queryset.update() or on changes made directly in the database.
For more info you can refer here.

Related

How to create a log database in django to record the user that created, updated or deleted an object?

I want to record in a database the user who created, updated or deleted an object using Django. I've found a simple solution with thread-locals and a logging abstract class from here: Why is using thread locals in Django bad? (though that approach is discouraged).
But the problem with this solution is that it makes unit tests extremely difficult to write. So what would be a better solution for logging event-based information about the user who created, updated, or deleted an object in Django?
You can try django-simple-history (https://django-simple-history.readthedocs.io/en/latest/querying_history.html).
It provides history in the Django admin and lets you query history from Python code.

What is the recommended way to remove fields from django-models without downtime?

My team and I work on a shared Django repository for a product that requires 99.99+% uptime.
I want to remove a few fields from a frequently used model. Say my model is defined as follows, and I want to remove field2.
class MyModel(models.Model):
field1 = ...
field2 = ...
Unfortunately this change is accompanied by a migration which, if applied, will cause downtime during deployments, because the old servers still reference the now-removed field in queries like the ones below.
new_model = MyModel()
new_model.save()
or
MyModel.objects.all()
I can't get away without creating a migration because I work in a shared repository, and Django will force the new migration on the next developer who makes changes to the repo after me.
Is there a recommended way of removing fields from models in Django?

Django 2.1/PostgreSQL - can I prevent deleting model objects?

There were some questions in the past about this but as Django grows, there are many new database functions.
I'm looking for a way to prevent model objects from being deleted, from anywhere, by anyone.
I have a model Product and I don't want products ever to be deleted from the database.
I understand that overriding delete() is sometimes a good way, but I would like to do it at the database level so there is no chance of deleting from shell_plus or any other source.
In Postgres, I think, there is a way:
CREATE RULE product_del_protect AS ON DELETE TO product DO INSTEAD NOTHING;
But I would like to do it through Django so that every migrated database is affected.
There may be a way to do that in the model or in a custom migration.
Better still would be to raise an error.
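A minimal sketch of how the rule from the question could be shipped with Django migrations (the table name myapp_product is an assumption based on Django's default app_model naming):

```python
# Forward/reverse SQL for the question's CREATE RULE, to be wired into a
# Django migration so every migrated database picks it up.
PROTECT_SQL = (
    "CREATE RULE product_del_protect AS "
    "ON DELETE TO myapp_product DO INSTEAD NOTHING;"
)
UNPROTECT_SQL = "DROP RULE product_del_protect ON myapp_product;"

# In the migration's operations list, something like:
#   migrations.RunSQL(PROTECT_SQL, reverse_sql=UNPROTECT_SQL)
```

Note that a rule silently swallows the DELETE; to raise an error instead, as the question prefers, a BEFORE DELETE trigger whose plpgsql function runs RAISE EXCEPTION would be closer to the goal (and could be installed the same way via RunSQL).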

How to revert changes, specifically deletions, with django-simple-history

We have django-simple-history set up for our models. Recently a whole bunch of model instances were mysteriously deleted. This was noticed a few days later, so it would be nice to avoid a full DB restore from backup, since that would wipe manual changes made in the meantime.
I cannot find any way to easily restore a model instance, specifically a deleted one. I can query the Historical version of the model and find everything that was deleted, and I can observe that all of them have a deletion as their last change. I can use the instance property of the record just before the deletion to get the state beforehand, but if I try to save that it errors, since the model instance was deleted and doesn't exist anymore.
So basically, what is the cleanest way to restore a deleted model instance when I have its Historical record in django-simple-history? I would like to retain the history if possible, so I am looking into any solution before totally recreating the objects.
As I understand the question, it is about restoring a deleted model instance, not the model class itself, so Kal's answer does not help here.
To restore a deleted instance, simple history can NOT be used: according to the documentation, simple history can only revert model instances that still exist.
Since you have a full backup of the database, you can load that backup as a second database in the Django environment, fetch the old model instance from the backup database, and save it to the production database (choose the database in the Django shell with .using()). See this post.
The best way to avoid such situations is to use the app django-reversion. With that Django app, you actually can restore deleted instances. See the documentation.
Do you mean that your model, not just the instances, has been completely deleted? If that’s the case, it probably means some migration removed it.
You could try reverting the migration then restore from the Historical record.
The previous versions of your model are stored in the historical table (by default named after the model, e.g. app_historicalmymodel). You could do a simple SQL insert/select to copy the data from that table back into the original table. Something like the following would work, but check the correct SQL syntax for your database.
insert into app_mymodel (col_a, col_b)
select col_a, col_b from app_HISTORICALmymodel
where id in (1,2,3...)
If the model uses django-simple-history, data can be restored from the historical model.
We need to filter on the historical model, e.g. by some keyword or by the date range in which the data was accidentally deleted.
For example, if the model name is Car, django-simple-history by default creates a historical model named HistoricalCar.
historical_data = HistoricalCar.objects.filter(history_type='-')
for each in historical_data:
    instance = each.instance  # This returns an instance of Car.
    instance.pk = None        # Clear the pk so save() inserts a new row.
    instance.save()

SqlAlchemy changes in Model description not applied in DB

I am currently developing a server using Flask/SQLAlchemy. When an ORM model is not present as a table in the database, SQLAlchemy creates it by default.
However, when an ORM class is changed, for instance when an extra column is added, the change does not get applied to the database, so the extra column is missing every time I query. I have to adjust my DB manually every time there is a change in the models that I use.
Is there a better way to apply model changes during development? I hardly think manual MySQL manipulation is the best solution.
You can proceed as follows with sqlalchemy-migrate's changeset extension (importing migrate monkey-patches Column with create()/drop() methods):
from sqlalchemy import Column, String
from migrate import *
new_column = Column('new_column', String, default='some_default_value')
new_column.create(my_table, populate_default=True)
You can find more details about sqlalchemy-migrate in: https://sqlalchemy-migrate.readthedocs.org/en/latest/changeset.html