I am writing some free software based on Django.
I have a class Item which describes a pricing plan (such as "subscription, $10 per week without a trial period").
My code often creates new items based on existing ones. For example, a new item created from the above item would be "subscription, $10 per week with a trial period of 10 days" (for the case where a customer has already paid for 10 days).
Now there are two kinds of items:
predefined items (as in the first example);
modified items (based on another item, as in the second example).
Now the trouble:
I create predefined items using ./manage.py loaddata ... command which loads the items from a JSON file.
I create modified items in my Python code.
If I add a new item to the JSON file and run ./manage.py loaddata ... again, then (according to my understanding) the loaddata command may overwrite one of the modified items (created later by my Python code).
What can I do to avoid overwriting modified items with new predefined items? More generally, how do I keep predefined and modified items distinct, so the code can differentiate which items are predefined and which are not?
dumpdata and loaddata should not be used to create "modified items"; treat these commands more like "backup" and "restore". If you want to load newly created items from a JSON file, write a custom management command:
Custom Management Commands
First declare an abstract model:
class ItemBase(models.Model):
    class Meta:
        abstract = True

    name = models.CharField(max_length=100)
    # ...
Then declare:
from django.db.transaction import atomic
from django.forms.models import model_to_dict


class PredefinedItem(ItemBase):
    pass


class ModifiedItem(ItemBase):
    base = models.OneToOneField(PredefinedItem, null=True)

    @staticmethod
    @atomic
    def obtain_predefined(id):
        try:
            return ModifiedItem.objects.get(base_id=id)
        except ModifiedItem.DoesNotExist:
            predefined = PredefinedItem.objects.get(pk=id)
            return ModifiedItem.objects.create(
                base=predefined,
                **model_to_dict(
                    predefined,
                    fields=[f.name for f in ItemBase._meta.fields]))
obtain_predefined() makes a copy of a predefined object, and this copy is used instead of the predefined object itself. Thus we do not need to worry about loading predefined objects overwriting modified objects.
Note: this follows the approach from the answer at https://stackoverflow.com/a/52787554/856090.
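If you also want a replacement for loaddata for the predefined items, a minimal sketch of such a management command could look like the following. The file path, command name, JSON structure (a list of objects with "pk" and "fields"), and the app name myapp are assumptions for illustration:

# myapp/management/commands/load_predefined_items.py  (hypothetical path)
import json

from django.core.management.base import BaseCommand

from myapp.models import PredefinedItem  # assumed app name


class Command(BaseCommand):
    help = "Load or update predefined items without touching modified items."

    def add_arguments(self, parser):
        parser.add_argument("path", help="Path to a JSON file of predefined items")

    def handle(self, *args, **options):
        with open(options["path"]) as f:
            data = json.load(f)
        for entry in data:
            # Only PredefinedItem rows are written; ModifiedItem rows created
            # by application code are never overwritten.
            PredefinedItem.objects.update_or_create(
                pk=entry["pk"], defaults=entry["fields"])
        self.stdout.write("Loaded %d predefined items" % len(data))

You would then run ./manage.py load_predefined_items items.json instead of ./manage.py loaddata to refresh the predefined items.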
Related
My application creates several rows of data per customer per day. Each row is modified as necessary using a form. Several modifications to a row may take place daily. At the end of the day the customer will "commit" the changes, at which point no further changes will be allowed. In each row I have a 'stage' field; stage=1 allows edits, stage=2 is committed, no further changes allowed.
How can I update the stage value to 2 on commit?
In my model I have:
@property
def commit_stage(self):
    self.stage = 2
    self.save()
Is this the correct way to do this? And if so, how do I attach this function to a "commit" button?
I suspect you are confused about what properties do. You should absolutely not attach this sort of functionality to a property. It would be extremely surprising behaviour for something which is supposed to retrieve a value from an instance to instead modify it and save it to the db.
You can of course put this in a standard model method. But it's so trivial there is no point in doing so.
In terms of "attaching it to a button", nothing in Django can be called from the front-end without a URL and a view. You need a view that accepts the ID of the model instance from a POST request, gets the instance, modifies its stage value, and saves it. It is no different from the form views you already use.
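For instance, a minimal sketch of such a view; the model name Entry, the field values, and the URL name entry-list are assumptions for illustration:

from django.shortcuts import get_object_or_404, redirect
from django.views.decorators.http import require_POST

from .models import Entry  # hypothetical model holding the 'stage' field


@require_POST
def commit_entry(request, pk):
    entry = get_object_or_404(Entry, pk=pk)
    entry.stage = 2                        # 2 = committed, no further edits
    entry.save(update_fields=["stage"])
    return redirect("entry-list")          # hypothetical URL name

The "commit" button is then simply a small form with method="post" whose action points at the URL routed to this view.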
I have created a form that allows a user to modify a Product's ProductClass. However, after I save the product, its ProductAttributeValues are automatically duplicated and stored. Here is a simple example of the process:
product = Product.objects.first()
product.product_class = ProductClass.objects.get(pk=2)
product.save()
Imagine that the product initially belonged to a ProductClass with pk=1 and that it had a single ProductAttribute named started, with its ProductAttributeValue set to True. After executing the three lines above, a new ProductAttributeValue for started, also set to True, was automatically generated and stored. Is there a simple way to avoid this behavior? Is this how a Product's ProductClass should be modified?
So I have a list of unique pupils (the pupil is the primary key in an LDAP database), each with an associated teacher, which can be the same for several pupils.
There is a box in an edit form for each teacher's pupils, where a user can add/remove a pupil, and then the database is updated accordingly using the function below. My current function is as follows (teacher is the teacher associated with the edit page form, and updated_list is a list of the pupils' names that has been submitted and passed to this function):
def update_pupils(teacher, updated_list):
    old_pupils = Pupil.objects.filter(teacher=teacher)
    for pupils in old_pupils:
        if pupil.name not in updated_list:
            pupil.delete()
        else:
            updated_list.remove(pupil.name)
    for pupil in updated_list:
        if not Pupil.objects.filter(name=name):
            new_pupil = pupil(name=name, teacher=teacher)
            new_pupil.save()
As you can see, the function basically finds the old pupil list for the teacher, looks at those, and if an instance is not in our new updated_list, deletes it from the database. We then remove the names of the pupils we kept from updated_list, meaning the ones left are the newly created ones, which we then iterate over and save.
Now ideally, I would like to access the database as infrequently as possible if that makes sense. So can I do any of the following?
In the initial iteration, can I simply mark those pupils up for deletion and potentially do the deleting and saving together at a later point? I know I can bulk-delete items, but can I somehow mark those which I want to delete without having to access the database (which I know can be expensive if the number of deletions is going to be high), and then delete a lot at once?
In the second iteration, is it possible to create the various instances and then save them all in one go? Again, I see in Django 1.4 that you can use bulk_create but then how do you save these? Plus, I'm actually using Django 1.3 :(...
I am kinda assuming that the above steps would actually help with the performance of the function?...But please let me know if that's not the case.
I have of course been reading this: https://docs.djangoproject.com/en/1.3/ref/models/querysets/
First, in this line
if not Pupil.objects.filter(name=name):
It looks like the name variable is undefined, no?
Then here is a shortcut for your code, I think:
def update_pupils(teacher, updated_list):
    # Step 1: delete all of this teacher's pupils that are not in the updated list
    Pupil.objects.filter(teacher=teacher).exclude(name__in=updated_list).delete()

    # Step 2: update
    # either: for each name, if a pupil with this name exists update its teacher,
    # else create a new pupil with this name and the given teacher
    for name in updated_list:
        Pupil.objects.update_or_create(name=name, defaults={'teacher': teacher})

    # or (but I'm not sure this one will work)
    Pupil.objects.update_or_create(name__in=updated_list, defaults={'teacher': teacher})
Another solution, if your Pupil model only has those two attributes and isn't referenced by a foreign key in another relation, is to delete all the Pupil instances of this teacher and then use a bulk_create. It requires only two accesses to the DB, but it's ugly.
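A rough sketch of that alternative (note that bulk_create needs Django 1.4 or later, while the question mentions 1.3):

def update_pupils(teacher, updated_list):
    # One query to wipe the teacher's pupils, one to recreate them.
    Pupil.objects.filter(teacher=teacher).delete()
    Pupil.objects.bulk_create(
        [Pupil(name=name, teacher=teacher) for name in updated_list])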
EDIT: in the first loop, pupil is also undefined.
I want to have a Facebook-style news feed, in which I need to fetch data from two different models ordered by time.
Models are something like :
class User_image(models.Model):
    user = models.ForeignKey(User_info)
    profile_pic = models.ImageField(upload_to='user_images')
    created = models.DateTimeField(auto_now_add=True)


class User_status(models.Model):
    user = models.ForeignKey(User_info)
    status = models.CharField(max_length=1)
    created = models.DateTimeField(auto_now_add=True)
As per my requirements, I cannot make a single model out of these two models.
Now I need to know the simplest code in the views and template to display the profile pic and status in the news feed ordered by time.
Thanks.
The simplest way of achieving this is to have a base model, call it Base_event:
class Base_event(models.Model):
    user = models.ForeignKey(User_info)
    created = models.DateTimeField(auto_now_add=True)
and derive both your models from this base. This way you write less code and you achieve your objective. Notice that you have to make an implementation choice: how they will inherit from the base. I advise reading the Django documentation on model inheritance to help you choose wisely according to what you want to do.
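As a rough sketch, assuming multi-table inheritance and the field names from the question:

class User_image(Base_event):
    profile_pic = models.ImageField(upload_to='user_images')


class User_status(Base_event):
    status = models.CharField(max_length=1)


# The ten most recent feed entries, sorted by the database rather than in Python:
feed = Base_event.objects.order_by('-created')[:10]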
EDIT:
I would note that the accepted answer has a caveat: it sorts the data in Python and not in MySQL, which has an impact on performance. The whole point of letting MySQL do the sorting is to avoid pulling every row out of the database and then sorting afterwards. For instance, if you want to retrieve just the first 10 elements sorted, with the accepted solution you have to extract all the entries, and only then decide which ones are the first 10.
Something like Base_event.objects.filter().order_by(...)[:10] would extract only 10 elements from the database, instead of the whole filtered table.
The easy solution now becomes the problem later.
Try something like chaining the two querysets into a list:
from itertools import chain
import operator

feed = list(chain(User_image.objects.all(), User_status.objects.all()))
feed = sorted(feed, key=operator.attrgetter('created'))
For those who consider this approach incorrect, see:
https://stackoverflow.com/a/434755/2301434
I've been looking for a way to define database tables and alter them via a Django API.
For example, I'd like to be able to write some code which directly manipulates table DDL and allows me to define tables or add columns to a table on demand programmatically (without running a syncdb). I realize that django-south and django-evolution may come to mind, but I don't really think of these as tools meant to be integrated into an application and used by an end user... rather, they are utilities for upgrading your database tables. I'm looking for something where I can do something like:
class MyModel(models.Model):  # wouldn't run syncdb; instead do something like below
    a = models.CharField()
    b = models.CharField()

model = MyModel()
model.create()                          # this runs the create table (instead of a syncdb)
model.add_column(c=models.CharField())  # this would set a column to be added
model.alter()                           # and this would apply the alter statement
model.del_column('a')                   # this would set column 'a' for removal
model.alter()                           # and this would apply the removal
This is just a toy example of how such an API would work, but the point is that I'd be very interested in finding out if there is a way to programmatically create and change tables like this. This might be useful for things such as content management systems, where one might want to dynamically create a new table. Another example would be a site that stores datasets of arbitrary width, for which tables need to be generated dynamically by the interface or data imports. Does anyone know any good ways to dynamically create and alter tables like this?
(Granted, I know one can do direct SQL statements against the database, but that solution lacks the ability to treat the databases as objects)
Just curious as to if people have any suggestions or approaches to this...
You can try to interface with Django's code that manages changes in the database. It is a bit limited (no ALTER, for example, as far as I can see), but you may be able to extend it. Here's a snippet from django.core.management.commands.syncdb:
for app in models.get_apps():
    app_name = app.__name__.split('.')[-2]
    model_list = models.get_models(app)
    for model in model_list:
        # Create the model's database table, if it doesn't already exist.
        if verbosity >= 2:
            print "Processing %s.%s model" % (app_name, model._meta.object_name)
        if connection.introspection.table_name_converter(model._meta.db_table) in tables:
            continue
        sql, references = connection.creation.sql_create_model(model, self.style, seen_models)
        seen_models.add(model)
        created_models.add(model)
        for refto, refs in references.items():
            pending_references.setdefault(refto, []).extend(refs)
            if refto in seen_models:
                sql.extend(connection.creation.sql_for_pending_references(refto, self.style, pending_references))
        sql.extend(connection.creation.sql_for_pending_references(model, self.style, pending_references))
        if verbosity >= 1 and sql:
            print "Creating table %s" % model._meta.db_table
        for statement in sql:
            cursor.execute(statement)
        tables.append(connection.introspection.table_name_converter(model._meta.db_table))
Take a look at connection.creation.sql_create_model. The creation object is created in the database backend relevant to the database you are using in your settings.py. All of them are under django.db.backends.
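As a rough illustration of calling that machinery directly for a single dynamically defined model class (this is the Django 1.x-era creation API, so treat the exact signatures as approximate; the function name create_table_for is invented for illustration):

from django.core.management.color import no_style
from django.db import connection

def create_table_for(model):
    # Generate and run the CREATE TABLE statements for one model class.
    sql, references = connection.creation.sql_create_model(model, no_style(), set())
    cursor = connection.cursor()
    for statement in sql:
        cursor.execute(statement)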
If you must have ALTER TABLE, I think you can create your own custom backend that extends an existing one and adds this functionality. Then you can interface with it directly through an ExtendedModelManager you create.
Quickly off the top of my head..
Create a Custom Manager with the Create/Alter methods.