Update new field for all users upon migration - django

I learned Django recently. Now, on one of the projects I'm working on, I need to add a new field, is_core_administrator, to the UserProfile model, which should mark the oldest user as a core administrator. I need to update this field for all users as part of a migration.
Is there a way to do this? I mean, when I make migrations, is it possible to set this field to True for the oldest user and False for the rest? Finding the oldest user will not be difficult since we already have another field, DateJoined. I just want to update the corresponding field when the migration runs.

Sure, just use migrations.RunSQL or migrations.RunPython depending on your requirements.
The latter could be easier to use in this case, but you should be able to do this with a single UPDATE SQL statement too.
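As a sketch of the RunSQL route, a standalone data migration might look roughly like the following. The table name accounts_userprofile, the column name date_joined, and the dependency entry are assumptions you would need to adapt to your schema (PostgreSQL syntax shown):
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('accounts', '0002_userprofile_is_core_administrator'),  # your previous migration
    ]

    operations = [
        migrations.RunSQL(
            sql="""
                UPDATE accounts_userprofile
                SET is_core_administrator = (
                    id = (SELECT id FROM accounts_userprofile
                          ORDER BY date_joined ASC LIMIT 1)
                );
            """,
            reverse_sql="UPDATE accounts_userprofile SET is_core_administrator = FALSE;",
        ),
    ]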

You could do that using the RunPython operation in Django migrations.
Create a function at the top of the migration file:
def set_default_values(apps, schema_editor):
    # Set the default values
Then add the line below before the AlterField operation in the migration file:
migrations.RunPython(set_default_values)
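A fuller, standalone sketch of such a data migration is below. The app label 'accounts' and the dependency name are assumptions to adapt to your project; the DateJoined field name comes from the question.
from django.db import migrations


def set_core_administrator(apps, schema_editor):
    UserProfile = apps.get_model('accounts', 'UserProfile')
    # Reset everyone first, then flag the oldest user.
    UserProfile.objects.update(is_core_administrator=False)
    oldest = UserProfile.objects.order_by('DateJoined').first()
    if oldest is not None:
        oldest.is_core_administrator = True
        oldest.save(update_fields=['is_core_administrator'])


def unset_core_administrator(apps, schema_editor):
    # Reverse operation so the migration stays reversible.
    UserProfile = apps.get_model('accounts', 'UserProfile')
    UserProfile.objects.update(is_core_administrator=False)


class Migration(migrations.Migration):

    dependencies = [
        ('accounts', '0002_userprofile_is_core_administrator'),  # the migration that added the field
    ]

    operations = [
        migrations.RunPython(set_core_administrator, unset_core_administrator),
    ]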

Please clarify your use case. As I see it, you have three options:
1. You might need the migrate command to perform this modification every time you migrate your application. If so, you can add a small loop to the end of the script generated by makemigrations; if you ship your application to the production environment with the modified migration script, it will perform the update there as well.
2. If you need this as a one-time-only modification for your development environment, I suggest not modifying the migration script but changing is_core_administrator manually from the Django shell.
3. If you need the program to set its first created UserProfile's is_core_administrator to True wherever and whenever it is installed, I suggest the easiest way is to override save() and check whether the object you are about to save is the first instance, probably by using UserProfile.objects.count() or UserProfile.objects.exists(), and set is_core_administrator accordingly. Maybe something like the following:
class UserProfile(models.Model):
    ...
    ...
    is_core_administrator = models.BooleanField(default=False)

    def save(self, force_insert=False, force_update=False, using=None,
             update_fields=None):
        # if no UserProfile exists yet, this is the first one: flag it
        if not UserProfile.objects.exists():
            self.is_core_administrator = True
        return models.Model.save(self, force_insert=force_insert,
                                 force_update=force_update, using=using,
                                 update_fields=update_fields)

Renaming Django model without breaking existing migrations

I want to rename a model in Django 3.2, keep my existing migrations and be able to both migrate a db with the old table name and create a db from scratch.
I've started by renaming the model class and all references to it in the code. As "./manage.py makemigrations" did not automatically create a migration, I manually created a migration that renames the model:
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('market_integrations', '0003_migration'),
    ]

    operations = [
        migrations.RenameModel('OldModelName', 'NewModelname')
    ]
My initial idea is that I should not update existing migrations, as I never do when creating other migrations. However, the references to the old model in the old migrations cause a LookupError when I run "./manage.py migrate". I've tried using both the model name string and apps.get_model(). Migration code samples that break:
operations = [
    migrations.CreateModel(
        name="OldModelName",
        ...
    )
]

operations = [
    migrations.CreateModel(
        name=apps.get_model("myapp", "OldModelName"),
        ...
    )
]
As keeping the old model name in old migrations didn't work, I replaced the old model name in old migrations with the new name. After doing that, "./manage.py migrate" ran successfully, including the model renaming migration. However, when I try to create a new database, the model renaming migration fails because that new database never had a table with the old name.
What should I do to be able to preserve my migrations and have them working in both existing and new databases?
I have already checked Django migration strategy for renaming a model and relationship fields, but I didn't find the answer to my question there.
Please attach the full model (you can delete irrelevant fields) and the migration you created in your initial question (I cannot comment on your initial post yet, so I have to mention it in an answer).
This is important because I think there are foreign keys pointing to the model you want to rename.
Or are you trying to rename Foo to Bar and create a new Foo?
Does the error also occur if you try to start the server after your initial idea?
Your initial idea of keeping existing migrations untouched seems good, because you will be able to run it on all your environments (local, dev, prod, etc), and approach 2 would require manual renaming of database tables and fields.
I might be able to extend my answer after you provide the models and migrations. In the meantime, since foreign keys seem to be involved, a migration along these lines might work:
migrations.RenameModel(
    old_name="ModelA",
    new_name="ModelB",
),
migrations.AlterField(
    model_name="myrelatedmodel",
    name="somefieldname",
    field=models.ForeignKey(
        # change the properties accordingly (best to copy them over);
        # at the database level only the "to" reference should change
        null=True,
        on_delete=django.db.models.deletion.SET_NULL,
        to="myapp.modelb",
    ),
),
If this did not work, where does the error come from? You can try to isolate your migrations and models from the rest of the code (for example in a new empty project attached to the same database). Maybe this will help you find out whether the error comes from a leftover reference to OldModelName or from somewhere else entirely. Just an idea so you don't spend time looking in the wrong place.
Try to do one change and one migration at a time. You can later squash them. If Django does not auto-detect the renaming of your model, all fields related to this model will fail too.
1. Revert everything to the working state before renaming.
2. Make sure all migrations are applied.
3. Rename the model and create a migration that only renames the model. Are you able to apply it?
4. If not, try to manually modify all related fields (OneToOne, ForeignKey, etc.).
5. If that still fails, create a copy of your model in models.py, rename the copied model, and try to use migrations.RunPython() in a new migration file to copy all the data from the old table to the new one; a rough sketch follows below.
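A rough sketch of that data-copy migration is below; the app label 'myapp', the ModelA/ModelB names, and the dependency are made up for illustration and assume the copied model has the same fields as the original.
from django.db import migrations


def copy_old_to_new(apps, schema_editor):
    ModelA = apps.get_model('myapp', 'ModelA')
    ModelB = apps.get_model('myapp', 'ModelB')
    for old in ModelA.objects.all().iterator():
        # copy every concrete column value, including the primary key
        values = {f.attname: getattr(old, f.attname) for f in ModelA._meta.concrete_fields}
        ModelB.objects.create(**values)


class Migration(migrations.Migration):

    dependencies = [
        ('myapp', '0004_modelb'),  # the migration that created the copied model
    ]

    operations = [
        migrations.RunPython(copy_old_to_new, migrations.RunPython.noop),
    ]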
If you only have one populated database, you might get away with replacing all ModelA occurrences with ModelB and change the structure of your database.
Best of luck!

Django 1.8 Migrations. Adding DateTimeField after db creation. Best practices?

So, some time (a couple of migrations) after my first one, I decided I wanted to include these fields:
created = models.DateTimeField(auto_now_add=True)
modified = models.DateTimeField(auto_now=True)
into one of my models. When I ran makemigrations, it gave me:
You are trying to add a non-nullable field 'created' to episode without a default; we can't do that (the database needs
something to populate existing rows).
So I then changed it to
created = models.DateTimeField(auto_now_add=True, default=datetime.now)
After trying to makemigrations again, it said that at_api.Episode.modified: (fields.E160) The options auto_now, auto_now_add, and default are mutually exclusive. Only one
of these options may be present.
All right, so I just went ahead and removed the auto_now_add
created = models.DateTimeField(default=datetime.now)
I could now makemigrations without any problems. And then I later removed default=datetime.now and replaced it with auto_now_add=True, and migrated again without any problems. However, I can't help feeling that this might not be the best way of doing things. I feel like something might go wrong later in the project.
I think the best practice here would have been to make the fields nullable. What your created field means at the moment is: "The time when the instance was created, or the arbitrary time when I ran the migration." The standard way to represent the lack of a value is NULL, rather than an arbitrary value.
That said, if you do want to use some arbitrary value you just need to tell Django what it is. Usually makemigrations gives you the option to indicate a one-off value to use for existing rows - did that not happen?
A more laborious method would be to declare the field nullable, then create a data migration to fill in your desired value, and then make it non-nullable. What you did is basically a simplified version of that. I don't see it creating any problems moving forward other than the issue of created not really being the time the instance was created.
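For completeness, a rough sketch of that nullable-then-backfill route is below. The app label 'at_api' and the Episode model come from the error message above; the dependency name is an assumption.
# Step 1: add the field as nullable, e.g. created = models.DateTimeField(null=True),
# and run makemigrations/migrate.
# Step 2: backfill existing rows with a data migration like this one.
from django.db import migrations
from django.utils import timezone


def backfill_created(apps, schema_editor):
    Episode = apps.get_model('at_api', 'Episode')
    Episode.objects.filter(created__isnull=True).update(created=timezone.now())


class Migration(migrations.Migration):

    dependencies = [
        ('at_api', '0002_episode_created'),  # the migration that added the nullable field
    ]

    operations = [
        migrations.RunPython(backfill_created, migrations.RunPython.noop),
    ]

# Step 3: change the field to auto_now_add=True (non-nullable) and run
# makemigrations/migrate once more.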
I've just had the exact same problem. I use Django 1.10. I read Kevin's answer and tried to enter datetime.now as a string when Django asked me to provide a default.
And I was surprised, because for those fields Django automatically asks you if you want to use the current time as the default:
$ ./manage.py makemigrations
You are trying to add the field 'date_created' with 'auto_now_add=True' to projectasset without a default; the database needs something to populate existing rows.
1) Provide a one-off default now (will be set on all existing rows)
2) Quit, and let me add a default in models.py
Select an option: 1
Please enter the default value now, as valid Python
You can accept the default 'timezone.now' by pressing 'Enter' or you can provide another value.
The datetime and django.utils.timezone modules are available, so you can do e.g. timezone.now
Type 'exit' to exit this prompt
[default: timezone.now] >>>
So, I just confirm that and everything seems to be working fine!

Use validation to prevent duplicate file _name_ being uploaded

How can I detect that the name of a file that a user has provided for upload (via a django.forms.ModelForm using a FileField field) is a duplicate of one that exists, and thus decide to fail validation on the form?
I'm finding this particularly challenging, because from within the form, I don't see how I can find out what the value of upload_to is for this FileField, so I can't go looking myself in the file system to see if that file is there already.
As I see it, you have two options:
Set a value in your settings.py to hold your 'upload_to' path and then use it for the check when you are validating.
Something like this would work for the verification (you need to adjust it to your upload_to, of course):
import os
from django.conf import settings

# filename here stands for the uploaded file's name from your form's cleaned data
if os.path.exists(os.path.join(settings.MEDIA_ROOT, settings.UPLOAD_TO, filename)):
    ...  # duplicate file name -- fail validation here
The issue with that is that you can't have subfolders or anything more complex there.
A second option, as mentioned in your comments, would be to add a new column to your model that holds a hash of the file. This approach should work better. To avoid uploading a big file, checking, failing, uploading another big file, and so on, you can hash it on the client and verify it via AJAX first (you will verify it again on the server, but this can make things faster for your users).
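A rough sketch of that hash-column approach, server-side only (the Upload model, its file_hash field, and the form are made up for illustration):
import hashlib

from django import forms

from .models import Upload  # assumed model with a unique file_hash CharField


class UploadForm(forms.ModelForm):
    class Meta:
        model = Upload
        fields = ['file']

    def clean_file(self):
        uploaded = self.cleaned_data['file']
        digest = hashlib.sha256()
        for chunk in uploaded.chunks():
            digest.update(chunk)
        file_hash = digest.hexdigest()
        if Upload.objects.filter(file_hash=file_hash).exists():
            raise forms.ValidationError("This file has already been uploaded.")
        # keep the hash on the instance so form.save() stores it
        self.instance.file_hash = file_hash
        return uploaded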
Older question, but Django 1.11 now supports the unique option on FileField. Set unique=True on your field declaration on your model.
It shouldn't matter what you are setting upload_to to. The file name will still be stored in the database.
Changed in Django 1.11:
In older versions, unique=True can’t be used on FileField.
https://docs.djangoproject.com/en/1.11/ref/models/fields/#unique
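For example (model and field names here are made up):
from django.db import models


class Document(models.Model):
    # Django 1.11+ allows unique=True on FileField; the stored file name
    # must then be unique across rows.
    file = models.FileField(upload_to='documents/', unique=True)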

Django - Prevent automatic related table fetch

How can I prevent Django, for testing purposes, from automatically fetching related tables not specified in the select_related() call during the initial query?
I have a large application where I make significant use of
select_related() to bring in related model data during each original
query. All select_related() calls are used to specify the specific related models, rather than relying on the default, e.g. select_related('foo', 'bar', 'foo__bar')
As the application has grown, the select_related calls haven't
completely kept up, leaving a number of scenarios where Django happily
and kindly goes running off to the database to fetch related model
rows. This significantly increases the number of database hits, which
I obviously don't want.
I've had some success in tracking these down by checking the queries
generated using the django.db.connection.queries collection, but some
remain unsolved.
I've tried to find a suitable patch location in the django code to raise an
exception in this scenario, making the tracking much easier, but tend
to get lost in the code.
Thanks.
After some more digging, I've found the place in the code to do this.
The file in question is django/db/models/fields/related.py
You need to insert two lines into this file.
Locate class "SingleRelatedObjectDescriptor". You need to change the function __get__() as follows:
def __get__(self, instance, instance_type=None):
    if instance is None:
        return self
    try:
        return getattr(instance, self.cache_name)
    except AttributeError:
        raise Exception("Automated Database Fetch on %s.%s"
                        % (instance._meta.object_name, self.related.get_accessor_name()))
        # leave the old code here for when you revert!
Similarly, in class "ReverseSingleRelatedObjectDescriptor" further down the code, you again need to change __get__() to:
def __get__(self, instance, instance_type=None):
    if instance is None:
        return self
    cache_name = self.field.get_cache_name()
    try:
        return getattr(instance, cache_name)
    except AttributeError:
        raise Exception("Automated Database Fetch on %s.%s"
                        % (instance._meta.object_name, self.field.name))
        # BEWARE: the % parameters are different to the previous class
        # leave the old code here for when you revert
Once you've done this, you'll find that Django raises an exception every time it performs an automatic database lookup. This is pretty annoying when you first start, but it will help you track down those pesky database lookups. Obviously, when you've found them all, it's probably best to revert the database code back to normal. I would only suggest using this during a debugging/performance investigation phase and not in the live production code!
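For example, with the patch applied (model names here are made up for illustration):
# A query without select_related() now fails loudly on the first lazy access:
episode = Episode.objects.get(pk=1)
episode.show  # raises Exception("Automated Database Fetch on Episode.show")

# The same access is fine when the relation was fetched up front, because the
# cached attribute is found and no automatic query is attempted:
episode = Episode.objects.select_related('show').get(pk=1)
episode.show  # cached by select_related, no extra query, no exception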
So, you're asking how to stop a method from doing what it's specifically designed to do? I don't understand why you would want to do that.
However, one thing to know about select_related is that it doesn't automatically follow relationships which are defined as null=True. So if you can set your FKs to that for now, the relationship won't be followed.

What would be a consistent method to preserve FileField behaviour in Django 1.2.5

As of Django 1.2.5, a model containing a FileField will no longer automatically delete the associated file when the model instance is deleted. See the release notes here: http://docs.djangoproject.com/en/1.2/releases/1.2.5/
I am quite new to Django, so I wonder what would be a good way to preserve the old behaviour, as I have a need for it. Is it enough to just override the model.save method?
If you take a look at changeset 15321 in Django's code repository, you'll see that this functionality was removed by deleting a signal handler that the FileField had, which intercepted its parent model's delete event and subsequently tried to delete the associated file.
The functionality is quite easy to restore; you just need to "undo" these changes. One warning though: the problem with deleting files within transactions is real, and if you delete files "the old-fashioned way" you could end up deleting them even when a rollback occurs. Now, if that doesn't pose a problem for you, read on!
We can easily subclass FileField and restore that functionality without touching the original class. This code should do that (note that I'm just restoring the old functionality deleted from the code):
from django.db.models import signals
from django.db.models.fields.files import FileField


class RetroFileField(FileField):
    # restore the old "delete the file when the model is deleted" behaviour

    def __init__(self, verbose_name=None, name=None, upload_to='', storage=None, **kwargs):
        # init FileField normally
        super(RetroFileField, self).__init__(verbose_name, name, upload_to, storage, **kwargs)

    def contribute_to_class(self, cls, name):
        # reconnect the signal handler that runs when a model instance is deleted
        super(RetroFileField, self).contribute_to_class(cls, name)
        signals.post_delete.connect(self.delete_file, sender=cls)

    def delete_file(self, instance, sender, **kwargs):
        file = getattr(instance, self.attname)
        # If no other object of this type references the file,
        # and it's not the default value for future objects,
        # delete it from the backend.
        if file and file.name != self.default and \
                not sender._default_manager.filter(**{self.name: file.name}):
            file.delete(save=False)
        elif file:
            # Otherwise, just close the file, so it doesn't tie up resources.
            file.close()
(I haven't tested the code... but it should be more or less ok)
You should put this code in a fields.py module in your project, or wherever it makes sense to you. Just remember, that from now on, instead of using django.db.models.FileField you'll be using yourproject.fields.RetroFileField for your file fields. And if you're using image fields and you depend on this functionality too... well... I think you'll need to subclass the image fields too and make them use your RetroFileField instead of the original FileField.
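For example, a model using it might look like this (the app path and field names are illustrative):
# models.py
from django.db import models

from yourproject.fields import RetroFileField


class Document(models.Model):
    # behaves like a normal FileField, but the file is removed from storage
    # when the Document instance is deleted
    attachment = RetroFileField(upload_to='documents/')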
Oh, and if you don't like the name of the field, just rename it to something more appropriate, just remember to update the super() calls inside.
Hope this helps!
Another note: You should see if you can just use a cron job to delete orphaned files like the changelog suggests.