I am working on a data migration for a Django app to populate
the main table in the db with data that will form the mainstay of
the app - this is persistent/permanent data that may be added to but
never deleted.
My reference is the Django 1.7 documentation, in particular the example at
https://docs.djangoproject.com/en/1.7/ref/migration-operations/#django.db.migrations.operations.RunPython
with a custom function called forwards_func:
def forwards_func(apps, schema_editor):
    # We get the model from the versioned app registry;
    # if we directly import it, it'll be the wrong version
    Country = apps.get_model("myapp", "Country")
    db_alias = schema_editor.connection.alias
    Country.objects.using(db_alias).bulk_create([
        Country(name="USA", code="us"),
        Country(name="France", code="fr"),
    ])
I am assuming the argument to bulk_create is a list of Country model objects not namedtuple objects, although the format looks exactly the same. Is this the case, and could someone please explain what db_alias is?
Also, if I wish to change or remove existing entries in a table using a data migration what are the methods corresponding to bulk_create to do this?
Thanks in advance for any help.
Country here is the same model you would get with from app.models import Country. The only difference is that a direct import always gives you the latest version of the model, while apps.get_model in a migration gives you the model as it existed at the time of that migration (Django rebuilds that historical version by starting from the initial migration and applying the changes up to this point).
About bulk_create: its argument is indeed a list of unsaved Country objects, which it uses to perform one big INSERT into your db. More information about bulk_create can be found here: https://docs.djangoproject.com/en/1.7/ref/models/querysets/#bulk-create.
About db_alias: it is the name of a database you configured in your settings. Most of the time it is "default", so you could leave it out of your code if you only use one database. The function may be called more than once if you have several databases configured in your settings. More info about using(): https://docs.djangoproject.com/en/1.7/ref/models/querysets/#using.
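For illustration, db_alias is simply one of the keys of the DATABASES setting; a minimal sketch (the second database and all the names are invented):

# settings.py (sketch): each key of DATABASES is a database alias.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": "main",
    },
    "reporting": {  # hypothetical second database
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": "reporting.sqlite3",
    },
}

In a migration, schema_editor.connection.alias tells you which of these databases the migration is currently running against, so .using(db_alias) keeps the queries on that same database.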
A bulk delete is actually quite simple: you just filter your Countries and call delete() on the queryset. So something like:
Country.objects.filter(continent="Europe").delete()
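For changing existing rows, the corresponding queryset method is update(). A minimal sketch inside a data migration, reusing the versioned-model pattern from the question (the filter and the new value are invented for illustration):

def update_func(apps, schema_editor):
    Country = apps.get_model("myapp", "Country")
    db_alias = schema_editor.connection.alias
    # Issues a single SQL UPDATE for every matching row.
    Country.objects.using(db_alias).filter(code="us").update(name="United States")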
About the persistent/permanent data question, I don't really have a complete solution for that one. One thing you can do, I think, is override the delete() method on the model and on its manager/queryset.
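A minimal sketch of that idea, using the Country model from the question; the exception type and messages are just illustrative:

from django.db import models

class ProtectedQuerySet(models.QuerySet):
    def delete(self):
        # Blocks Country.objects.filter(...).delete() and .all().delete()
        raise NotImplementedError("Countries are permanent and cannot be bulk-deleted")

class Country(models.Model):
    name = models.CharField(max_length=100)
    code = models.CharField(max_length=2)

    objects = ProtectedQuerySet.as_manager()

    def delete(self, *args, **kwargs):
        # Blocks country_instance.delete()
        raise NotImplementedError("Countries are permanent and cannot be deleted")

Note this only discourages deletion at the ORM level; raw SQL or admin actions that bypass these methods can still remove rows.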
Related
I would like to follow this guideline and avoid a nullable ForeignKey.
I have a ForeignKey to the django User model.
If I try to store request.user in an instance of this model, I get this error:
ValueError: Cannot assign "<SimpleLazyObject: <django.contrib.auth.models.AnonymousUser>>":
"MyModel.user" must be a "User" instance.
I think it should be feasible to avoid a data schema that requires conditional handling (i.e. a nullable foreign key).
How could I solve this?
Those guidelines suggest you create an instance of User that represents an anonymous user, otherwise known as a sentinel value. You'd have to keep track of it via some unique key, most likely the username, and then make sure it exists and that nobody else can create a real user with that key, otherwise you run into other problems.
However, because of those outlined issues above I disagree with those guidelines. If your data model allows for optional relationships, then you absolutely should use NULL values.
Regarding the comment:
If there is no NULL in your data, then there will be no NullPointerException in your source code while processing the data :-)
Simply because there are no NULL fields doesn't mean those conditions don't exist. You are still handling the same edge cases, just with different names and slightly different syntax. You are still vulnerable to bugs because you still have just as many conditions (and potentially more, given that you now also have to make sure your sentinel value stays unique).
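For what it's worth, the nullable version this answer recommends is only a couple of lines; a sketch reusing the asker's MyModel name (on_delete is shown explicitly, as required from Django 2.0 on):

from django.conf import settings
from django.db import models

class MyModel(models.Model):
    # NULL simply means "no user attached"; a view can then assign
    # request.user only when request.user.is_authenticated.
    user = models.ForeignKey(settings.AUTH_USER_MODEL, null=True, blank=True,
                             on_delete=models.SET_NULL)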
Hey, it's my first attempt at answering a question! I'm a newbie, but I had a similar error recently. I suppose this is a naive version of Nigel222's answer, which calls for doing this with a migration, but maybe there is something of value here nonetheless for another newbie who needs a simpler solution. I was influenced by an answer to this post by Ayman Al-Absi that suggests you may need to reference this user by its auto-generated primary key.
By default, request.user is an AnonymousUser when not authenticated, and as the error message shows, AnonymousUser can't be used as the value of a foreign key to the User table.
Proposed solution:
from django.contrib.auth.models import User

# Start by creating a user in your User table called something like "anon",
# in some kind of one-time initialization step (create_user saves it for you):
tempUser = User.objects.create_user(username="anon", email="none",
                                    first_name="none", last_name="none")

# When the user is unauthenticated, before calling a method that takes
# request as a parameter, swap in that placeholder user:
if request.user.is_anonymous:
    request.user = User.objects.get(username="anon")
Another comment for the newbie: I made my own model called User in models.py, which became confusing. I had to import it with an alias:
from .models import User as my_user_table
It would have been better just to call it my_user_table to begin with.
Create a special instance of User for this purpose. The best place to do so is in a data migration for the app whose model relies on being able to create a ForeignKey to this special User object. When you deploy your app and run migrate, the special user object will be created before there are any actual users in the DB.
There's a lot of detail on creating data migrations here
Here's an example of making sure that some Group objects will exist as of this migration for any future deployment.
# Generated by Django 2.2.8 on 2020-03-05 09:53
from django.db import migrations


def apply_migration(apps, schema_editor):
    Group = apps.get_model("auth", "Group")
    Group.objects.bulk_create([
        Group(name="orderadmin"),
        Group(name="production"),
        Group(name="shipping"),
    ])


def revert_migration(apps, schema_editor):
    Group = apps.get_model("auth", "Group")
    Group.objects.filter(
        name__in=["orderadmin", "production", "shipping"]
    ).delete()


class Migration(migrations.Migration):

    dependencies = [
        ('jobs', '0034_auto_20200303_1810'),
    ]

    operations = [
        migrations.RunPython(apply_migration, revert_migration),
    ]
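Applied to the special/sentinel User itself, the data migration might look something like this sketch (the username, app label, and dependency are invented, and it assumes the default django.contrib.auth User model):

from django.db import migrations


def create_special_user(apps, schema_editor):
    # Use the historical model from the app registry, not a direct import.
    User = apps.get_model("auth", "User")
    User.objects.get_or_create(username="anonymous_sentinel",
                               defaults={"is_active": False})


def remove_special_user(apps, schema_editor):
    User = apps.get_model("auth", "User")
    User.objects.filter(username="anonymous_sentinel").delete()


class Migration(migrations.Migration):

    dependencies = [
        ("myapp", "0001_initial"),  # hypothetical dependency
    ]

    operations = [
        migrations.RunPython(create_special_user, remove_special_user),
    ]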
We have 2 identical Django instances; one is a trial and the other is for production. Recently one client on the trial purchased the full product and I need to move ONLY their data from trial to production. I don't know if there's a convenient way to do that, since:
If I use Django fixtures, they might overwrite existing data in the production system because of the default ids that Django assigned to each entry (I might be wrong, but I think fixtures are only good for initialization).
Using SQL to dump the DB might not help either because of the same problem as the first approach, and it's also complex because there are other customers in the trial but I only need to move this one client's data.
Please give me some advice if you have similar experience.
The issue is that you cannot transfer the primary keys (IDs) from the trial to the production DB, isn't it? So, two solutions:
1) A tank to kill an ant
You do a SQL export of your trial database and you increase every primary key and foreign key by some number (for example 10000). This number needs to be high enough to avoid uniqueness constraint violations when you import the data into the production DB.
2) The smart solution
If, and only if, your model is well designed: for every model you can find a set of its columns that could serve as a substitute primary key for the ID. If you have attributes with unique=True, or models with unique_together = (...), it's perfect: you can use natural keys!
In every model concerned, you add a natural_key() method, and you give its manager a get_by_natural_key() method (get_by_natural_key belongs on the Manager, not on the model itself):
class PersonManager(models.Manager):
    def get_by_natural_key(self, first_name, last_name):
        return self.get(first_name=first_name, last_name=last_name)

class Person(models.Model):
    first_name = models.CharField(max_length=50)  # max_length is illustrative
    last_name = models.CharField(max_length=50)
    objects = PersonManager()

    class Meta:
        unique_together = ("first_name", "last_name")

    def natural_key(self):
        return (self.first_name, self.last_name)
Then you can use Django's dumpdata command (with the --natural-foreign and --natural-primary options on Django 1.7+) to export the trial database with the IDs replaced by natural keys! Then, with the same code deployed, you use the loaddata command to import these data files.
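For illustration, assuming the client's data lives in an app called myapp, the export/import could look like this (you would still need to trim the dump down to the one client, e.g. by dumping specific models or post-processing the JSON):

python manage.py dumpdata myapp --natural-foreign --natural-primary --indent 2 > client_data.json
python manage.py loaddata client_data.json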
I have a bit of a problem using django-registration and signals.
The basic setup is that I have a django 1.4.3 setup, with django-south and django-registration (and the db is SQLite for what it's worth).
EDIT: I changed the question a bit because the effect is the same in a shell, so django-registration is not the cause (edits are in italics).
One of my models is related to the User model in the following way:
class MyUserProfile(models.Model):
    user = models.OneToOneField(User)
    # additional fields
I initialized the base using south.
When I do a little sqlall to check the sql that should be in it, I can clearly see:
CREATE TABLE "myApp_myuserprofile" (
"id" integer NOT NULL PRIMARY KEY
"user_id" integer NOT NULL UNIQUE REFERENCES "auth_user" ("id"),
#other fields
)
After that, I wanted to initialize the profile data when a user activated their account.
So in models.py I put:
from django.dispatch import receiver
from registration.signals import user_activated

# Models....

@receiver(user_activated)
def createMyProfile(sender, **kwargs):
    currentUser = kwargs['user']
    # the other fields get their default values
    profile = MyUserProfile(user=currentUser)
    profile.save()
    # And now the reverse relation:
    currentUser.myuserprofile = profile
    currentUser.save()
While I am in there, everything seems alright: if I print the ids (both for the user and the profile), and if I travel back and forth between the two, everything I see looks correct.
If I disable this part of the code and do the same kind of initialization using the shell, I get the same result.
But after that, everything is wrong.
If I open a shell and import the relevant models, I get the following for every value of X:
MyUserProfile.objects.get(pk=X)
#DoesNotExist Exception
User.objects.get(pk=X).myuserprofile.pk
1
MyUserProfile.objects.all()[X].pk
1
Seems a bit weird, no?
And now if I go to the SQL shell:
select id from myApp_myuserprofile;
1
1
1
1
...
So I have a primary key column that is filled with the same value all over the place, which is, well... embarrassing to say the least (and does lead to problems, because everyone ends up with a profile with the same id).
Any idea what could be the cause of the problem and how I could solve it?
P.S.: Note that the foreign keys of the relation are correct and their uniqueness is preserved.
Well, it looks like the problem was indeed coming from the combination of SQLite and South.
The doc states that:
SQLite doesn’t natively support much schema altering at all, but South has workarounds to allow deletion/altering of columns. Unique indexes are still unsupported, however; South will silently ignore any such commands.
Which was (I think) the case here, as I hadn't created this relation from the start but in a later migration. I just reset the database and the migrations and voilà.
See What's the recommended approach to resetting migration history using Django South? for the migrations and a simple ./manage.py reset myApp for the base reset.
I am limiting the ContentType choices for a Generic Relation using limit_choices_to but it shows models that no longer exist. For example with this code:
employer_content_type = models.ForeignKey(ContentType,
    limit_choices_to={"model__in": ('venue', 'festival')},
    related_name="employer")
I get a list of choices that has duplicates, i.e. festival, festival, venue, venue
However, when I limit the choices by app rather than just by model, like this:
employer_content_type = models.ForeignKey(ContentType,
    limit_choices_to={'app_label': 'contacts'},
    related_name="employer")
I get a list of all of the models with no duplicates, i.e. address, email, festival, venue
At one point in my development I created a new app ("contacts") that was a duplicate of an older app. All of the models had the same names etc. At first I thought that this was causing the duplicates, but the problem didn't go away after I removed the old app from settings.py and deleted the old models from the database.
I think it is a cache issue but I never set up caching!
So how do I either clear the cache, or limit the choices by model and app at the same time?
Thanks for your help!
Note: Unfortunately I couldn't add pictures, so it's a little hard to describe!
Look at the django_content_type db table and delete the rows for the obsolete models there. syncdb should also prompt you to delete stale entries from the content type table.
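For the second half of the question, you can also constrain by app and model at the same time by combining both lookups in limit_choices_to; a sketch assuming the venue and festival models live in the contacts app:

employer_content_type = models.ForeignKey(
    ContentType,
    limit_choices_to={
        "app_label": "contacts",
        "model__in": ("venue", "festival"),
    },
    related_name="employer")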
I am a bit confused with forms in Django. I have the information for the form (a poll, i.e. the poll question and options) coming from some db table - table1, or say class1 in models. Now the vote for this poll is to be captured, which is another model, say class2. So I am just getting confused with the whole flow of forms here, I think. How will the data be captured into the class2 table?
I was trying something like this.
def blah1(request):
    get_data_from_db_table_1()
    x = blah2Form()
    return render_to_response("blah.html", {...})
Forms have nothing to do with models in Django. They are just classes meant to get information from a dictionary (often request.POST) and check that the data under each key matches a type and a format (e.g. is this a string of the form "bla#foo.tld").
You can ask Django to create a form from a model, and in that case it will do its checking job; then, if the data match, it will create a model instance, fill it, and save it.
If a form is not created from a model, it will do nothing but checking. It will save nothing.
If it is created from a model, it will create a new instance of that particular model and save it.
If you want something more complicated, like pre-filling a form from various models or according to some conditions, or, say, saving several models according to the result of one form, you must do it manually.
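As a concrete sketch of the "created from a model" case for the poll/vote flow described in the question (Poll, Vote, the choice field, the template name, and the URL name are all invented for illustration):

from django import forms
from django.shortcuts import get_object_or_404, redirect, render

from .models import Poll, Vote  # hypothetical class1 / class2 models


class VoteForm(forms.ModelForm):
    class Meta:
        model = Vote
        fields = ["choice"]  # hypothetical field on Vote


def vote_view(request, poll_id):
    poll = get_object_or_404(Poll, pk=poll_id)
    if request.method == "POST":
        form = VoteForm(request.POST)
        if form.is_valid():
            vote = form.save(commit=False)  # build the Vote without saving yet
            vote.poll = poll                # attach the poll the question came from
            vote.save()                     # this writes the row into the class2 table
            return redirect("poll_results", poll_id=poll.pk)
    else:
        form = VoteForm()
    return render(request, "blah.html", {"poll": poll, "form": form})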