I have a very simple Image model
class Image(Model):
    owner = ForeignKey(User)
    img = ImageField(upload_to=image_file_path)
the image_file_path resolves to <username>/images/ where username is the Image model instance's owner.username
Now, when using Django admin to change the owner of the image, I want the physical image file to be moved to the appropriate path, i.e. <new_username>/images/.
What is the simplest / "correct" way of doing this?
Edit:
A few thoughts after experimenting a bit
post_save handlers: the idea is to make sure that the model is sane by moving the file into the correct place after the model changes. The problem is that if something bad happens (the file is missing, the transfer errors out, e.g. if storage is on S3 and the connection is bad, etc.) you can end up in a loop trying to revert the change, or end up with an inconsistent model / filesystem state.
Django admin action: this seems to be a safer way, because you can just copy the file, check that it's OK, then change the model, check that it's sane, then delete the old file (see the sketch after these notes). If anything breaks you can abort, and since this is manually initiated from the admin interface there is no chance of users experiencing inconsistent behaviour.
pre_save: The same process as in 'Django admin action' could probably be used with pre_save, although if you abort due to problems then user experience suffers. On the other hand if copy / moving files is borked, users are not going to be able to upload anything anyway.
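For what it's worth, a rough sketch of the admin action route (the second option above). Everything here is illustrative: it assumes the default storage API and that image_file_path is the upload_to callable from the model above.

# Hedged sketch of the admin-action idea: copy, verify, update, then delete.
from django.contrib import admin
from django.core.files.base import ContentFile

from .models import Image, image_file_path


def move_to_owner_dir(modeladmin, request, queryset):
    for image in queryset:
        storage = image.img.storage
        old_name = image.img.name
        new_name = image_file_path(image, old_name.rsplit('/', 1)[-1])
        if new_name == old_name:
            continue
        # 1. copy the file to the new location
        image.img.open('rb')
        new_name = storage.save(new_name, ContentFile(image.img.read()))
        image.img.close()
        # 2. check the copy before touching the model
        if not storage.exists(new_name):
            continue  # abort for this object; the old file is still intact
        # 3. point the model at the new file and save
        image.img.name = new_name
        image.save()
        # 4. only now remove the old file
        storage.delete(old_name)
move_to_owner_dir.short_description = "Move image files to the owner's directory"


class ImageAdmin(admin.ModelAdmin):
    actions = [move_to_owner_dir]

admin.site.register(Image, ImageAdmin)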
Is there a fourth way? Are there sanitizers for Models in Django?
Use the post_save signal of your Image model. In the same way you can use post_delete to actually delete the image file when the record is deleted.
You can review the Signals documentation here.
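A rough sketch of that (handler names are mine; I'm assuming image_file_path is a plain module-level function, and the storage/network failure handling the question worries about is left out):

from django.core.files.base import ContentFile
from django.db.models.signals import post_save, post_delete
from django.dispatch import receiver

from .models import Image, image_file_path


@receiver(post_save, sender=Image)
def move_image_to_owner_dir(sender, instance, **kwargs):
    storage = instance.img.storage
    current_name = instance.img.name
    expected_name = image_file_path(instance, current_name.rsplit('/', 1)[-1])
    if current_name == expected_name:
        return  # already in the right place
    instance.img.open('rb')
    saved_name = storage.save(expected_name, ContentFile(instance.img.read()))
    instance.img.close()
    instance.img.name = saved_name
    # update the row directly so this handler doesn't fire again
    # (avoids the loop the question mentions)
    sender.objects.filter(pk=instance.pk).update(img=saved_name)
    storage.delete(current_name)


@receiver(post_delete, sender=Image)
def delete_image_file(sender, instance, **kwargs):
    # remove the file from storage when the record is deleted
    if instance.img:
        instance.img.delete(save=False)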
I am starting an app that has a complex permission structure that will inevitably be managed by the users themselves. I have the following permissions in the model:
class Meta:
    permissions = (
        ('can_view', 'View project'),
        ('manage_view', 'Can assign View project'),
        ('can_edit', 'Edit project'),
        ('manage_edit', 'Can assign Edit project'),
        ('can_delete', 'Delete project'),
        ('manage_delete', 'Can assign Delete project'),
        ('creator', 'Full access to model features.'),
    )
These will be managed at the object level with Django-guardian and I am using custom mixins to deal with all of the various combinations and features.
The 'creator' permission is assigned to the initial creator of an object, and they can assign the other permissions to Users via an e-mail based invite system.
My question is about what the options are for assigning the creator permission when the object is created.
Some approaches I have thought of so far:
Assign in view after save
newObject.save()
assign('creator', request.user, newObject)
Generally, I prefer to get these types of events out of my views though.
Override save()
The problem with this method is I have to give the save access to the User object, which means also overriding the init to set self.request = request.
post_save signal
I know I could take advantage of this signal, but compared to the previous two I haven't attempted an implementation of this yet.
Hopefully this provides enough insight into where I am at with things.
I would love to know what the best of these methods would be, or, if they are all bad ideas what an alternative implementation might be.
Thanks,
JD
Re: Override save(): you can add a user parameter to the save method (so you don't have to override __init__ too). This way the code breaks at runtime if you try to call save without passing a user instance (and as long as you test your code you should be fine with that approach).
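A sketch of what I mean (the Project model and its field are made up; depending on your django-guardian version the shortcut is assign or assign_perm):

from django.db import models
from guardian.shortcuts import assign  # assign_perm in newer django-guardian


class Project(models.Model):
    name = models.CharField(max_length=100)

    def save(self, user=None, *args, **kwargs):
        is_new = self.pk is None
        if is_new and user is None:
            # fail fast, as described above, instead of silently skipping the permission
            raise ValueError("save() needs a user for new Project objects")
        super(Project, self).save(*args, **kwargs)
        if is_new:
            assign('creator', user, self)


# in the view: project.save(user=request.user)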
Re: post_save signal: if you didn't try it... try it! The documentation on signals is pretty good and they are rather easy to learn. What might be tricky is where you should connect the signals (I prefer to do that at the end of the models module).
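A sketch of the signal version; here I'm assuming the view sticks the request user on the instance before saving, since post_save itself has no access to the request:

from django.db.models.signals import post_save
from guardian.shortcuts import assign  # or assign_perm

from .models import Project


def assign_creator_perm(sender, instance, created, **kwargs):
    # the view is expected to set instance._creator = request.user before save()
    user = getattr(instance, '_creator', None)
    if created and user is not None:
        assign('creator', user, instance)


# connected at the end of models.py, as mentioned above
post_save.connect(assign_creator_perm, sender=Project,
                  dispatch_uid='projects.assign_creator_perm')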
Unfortunately, there is no single best approach here. Also, remember that neither the save method nor the post_save signal is fired if you e.g. insert instances with raw SQL or use bulk_create (https://docs.djangoproject.com/en/stable/ref/models/querysets/#bulk-create). So identify the places where you want this automatic permission assignment to happen (there should probably be only one such place anyway).
Alternatively, you can add an FK field pointing at the creator to your model. You would then be able to use that information instead of checking the permission with guardian (and since you are already using mixins, that should actually fit your problem well too). I used that approach for a project management application some time ago.
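For completeness, a sketch of that alternative (the field name is my choice):

from django.contrib.auth.models import User
from django.db import models


class Project(models.Model):
    name = models.CharField(max_length=100)
    # whoever created the object; your mixins can compare this against request.user
    created_by = models.ForeignKey(User, on_delete=models.CASCADE,
                                   related_name='created_projects')

    class Meta:
        permissions = (
            ('can_view', 'View project'),
            # ... the other permissions from the question ...
        )

A mixin check then becomes obj.created_by == request.user, with guardian reserved for the permissions that are actually delegated.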
Hope that helps!
I want to be able to "post process" an image after it is uploaded: crop it down to size and apply some compression. As it stands, I am doing this using the post_save signal: when the model is saved, I access the file, apply the post-processing and save over the original.
I am only doing this when the created argument of the post_save signal is True, to avoid unnecessary image processing every time the model is updated.
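Roughly, the current setup looks like this (model and handler names are placeholders, and crop_and_compress stands in for the real processing):

from django.db.models.signals import post_save

from .models import Photo


def process_image(sender, instance, created, **kwargs):
    if not created:
        return  # skip updates -- which is exactly the problem described below
    crop_and_compress(instance.image.path)  # placeholder for the real post-processing


post_save.connect(process_image, sender=Photo, dispatch_uid='gallery.process_image')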
The problem
When the image field of an existing instance is updated, the post processing of the image is being skipped because the created flag is false.
How can I set up my model to only apply post-processing to the image when the ImageField has changed, even if the model instance already exists? This app may not always be used with the Django admin, so overriding the imagefield_save method isn't going to work.
Hope someone can help!
This question is from a long time ago, so it's probably not relevant anymore?
Did you have a look at:
pre_save.connect(before_mymodel_save, sender=MyModel)
Have a look at the signals documentation of Django.
You create a function before_mymodel_save and you can do anything in there. If you call save() inside a post_save or pre_save handler, just make sure you disconnect the signal before saving the MyModel object within that function (and connect it again afterwards), to avoid endless loops.
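Concretely, a pre_save handler can compare the incoming value with what is currently in the database, and a post_save handler can then process only when the ImageField actually changed. Model and field names below are placeholders, and post_process stands in for the crop/compress step:

from django.db.models.signals import pre_save, post_save

from .models import MyModel


def before_mymodel_save(sender, instance, **kwargs):
    # decide *before* the row is written whether the image is new or changed
    if instance.pk is None:
        instance._image_changed = True
    else:
        old_name = (sender.objects.filter(pk=instance.pk)
                    .values_list('image', flat=True).first())
        instance._image_changed = old_name != instance.image.name


def after_mymodel_save(sender, instance, created, **kwargs):
    if getattr(instance, '_image_changed', False) and instance.image:
        post_process(instance.image)  # placeholder; does not re-save the model


pre_save.connect(before_mymodel_save, sender=MyModel)
post_save.connect(after_mymodel_save, sender=MyModel)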
Scenario: large project with many third party apps. Want to add tagging to those apps without having to modify the apps' source.
My first thought was to specify a list of models in settings.py (like ['appname.modelname', ...]) and call django-tagging's register function on each of them. The register function adds a TagField and a custom manager to the specified model. The problem with that approach is that the function needs to run BEFORE the DB schema is generated.
I tried running the register function directly in settings.py, but I need django.db.models.get_model to get the actual model reference from only a string, and I can't seem to import that from settings.py - no matter what I try I get an ImportError. The tagging.register function imports OK however.
So I changed tactics and wrote a custom management command in an otherwise empty app. The problem there is that the only signal which hooks into syncdb is post_syncdb which is useless to me since it fires after the DB schema has been generated.
The only other approach I can think of at the moment is to generate and run a South-style database schema migration. This seems more like a hack than a solution.
This seems like it should be a pretty common need, but I haven't been able to find a clean solution.
So my question is: is it possible to dynamically add fields to a model BEFORE the schema is generated, and more specifically, is it possible to add tagging to a third-party model without editing its source?
To clarify, I know it is possible to create and store Tags without having a TagField on the model, but there is a major flaw in that approach in that it is difficult to simultaneously create and tag a new model.
From the docs:
You don't have to register your models in order to use them with the tagging application - many of the features added by registration are just convenience wrappers around the tagging API provided by the Tag and TaggedItem models and their managers, as documented further below.
Take a look at the API documentation and the examples that follow for how you can add tags to any arbitrary object in the system.
http://api.rst2a.com/1.0/rst2/html?uri=http://django-tagging.googlecode.com/svn/trunk/docs/overview.txt#tags
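Roughly (from memory of the django-tagging API), that means you can do something like this against any model, registered or not:

from tagging.models import Tag, TaggedItem

from thirdparty.models import Widget  # any third-party model, untouched

obj = Widget.objects.get(pk=1)
Tag.objects.update_tags(obj, 'django tagging example')  # set/replace this object's tags
Tag.objects.get_for_object(obj)                         # read them back
TaggedItem.objects.get_by_model(Widget, 'django')       # all Widgets tagged 'django'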
Updated
#views.py
def tag_model_view(request, model_id):
    instance_to_tag = SomeModel.objects.get(pk=model_id)
    setattr(instance_to_tag, 'tags_for_instance', request.POST['tags'])
    # ...
    instance_to_tag.save()
    # ... returns a response

#models.py
from tagging.models import Tag

# this is the post_save signal receiver
def tagging_post_save_handler(sender, instance, created, **kwargs):
    if hasattr(instance, 'tags_for_instance'):
        Tag.objects.update_tags(instance, instance.tags_for_instance)
I am implementing an application with django, which has a model with a FileField:
class Slideshow(models.Model):
    name = models.CharField(max_length=30, unique=True)
    thumbnail = models.FileField(max_length=1000, upload_to="images/app/slideshows/thumbnails")
and I have an admin backend where Django manages the models. I just added the file admin.py and Django manages everything for me:
from django.contrib import admin
from apps.gallery.models import Slideshow
admin.site.register(Slideshow)
In the backend it is possible to add, delete and update the slideshows. However, when I try to update a slideshow and change its thumbnail attribute [a FileField], Django does not delete the old file. Consequently, after several updates the server is filled with many files which are useless.
My question is: how can I make Django delete those files automatically after an update?
I would really appreciate your help
I thought a lot about this problem, and eventually I found a solution that works well for me. You can find all the models in the project and connect pre_save and post_delete signals to them.
In the end I made an app which solves this problem: django-cleanup.
I'm sure Django does this by design. It can't know, for example, whether any other models might be using that file. You would also be really surprised if you expected the file to remain and discovered that Django deleted it!
However, there's also the issue that as soon as you change the file field, you lose the old file name.
There's an open ticket about that problem: http://code.djangoproject.com/ticket/11663
There's a patch in http://code.djangoproject.com/ticket/2983 which shows how to override __set__ to store the previous file name. Then your model's save() method can get at the previous file name and delete it.
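Until that lands, a sketch of the usual workaround: re-read the old row in a pre_save handler and delete the replaced file (names follow the Slideshow model above; note that if the save fails afterwards, the old file is already gone):

from django.db.models.signals import pre_save, post_delete

from apps.gallery.models import Slideshow


def delete_replaced_thumbnail(sender, instance, **kwargs):
    if instance.pk is None:
        return  # new object, nothing to clean up
    try:
        old = sender.objects.get(pk=instance.pk)
    except sender.DoesNotExist:
        return
    if old.thumbnail and old.thumbnail.name != instance.thumbnail.name:
        old.thumbnail.delete(save=False)


def delete_thumbnail_on_delete(sender, instance, **kwargs):
    if instance.thumbnail:
        instance.thumbnail.delete(save=False)


pre_save.connect(delete_replaced_thumbnail, sender=Slideshow)
post_delete.connect(delete_thumbnail_on_delete, sender=Slideshow)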
I'm going to be honest: this is a question I asked on the Django-Users mailing list last week. Since I didn't get any replies there, I'm reposting it on Stack Overflow in the hope that it gets more attention here.
I want to create an app that makes it easy to do user-friendly multiple / mass file upload in your own apps. By user-friendly I mean upload like Gmail, Flickr, ... where the user can select multiple files at once in the browse file dialog. The files are then uploaded sequentially or in parallel and a nice overview of the selected files is shown on the page with a progress bar next to them. A 'Cancel upload' button is also a possible option.
All that niceness is usually solved by using a Flash object. Complete solutions are out there for the client side, like SWFUpload (http://swfupload.org/), FancyUpload (http://digitarald.de/project/fancyupload/), YUI 2 Uploader (http://developer.yahoo.com/yui/uploader/) and probably many more.
Of course the trick is getting those solutions integrated in your project, especially in a framework like Django, doubly so if you want it to be reusable.
So, I have a few ideas, but I'm neither an expert on Django nor on Flash-based upload solutions. I'll share my ideas here in the hope of getting some feedback from more knowledgeable and experienced people. (Or even just some 'I want this too!' replies :) )
You will notice that I make a few assumptions; this is to keep the (initial) scope of the application under control. These assumptions are of course debatable.
All right, my ideas so far:
If you want to mass upload multiple files, you are going to have a model to contain each file in, i.e. the model will contain one FileField or one ImageField.
Models with multiple (but of course finite) FileFields/ImageFields are not in need of easy mass uploading imho: if you have a model with 100 FileFields you are doing something wrong :)
Examples where you would want my envisioned kind of mass upload:
An app that has just one model 'Brochure' with a file field, a title field (dynamically created from the filename) and a date_added field.
A photo gallery app with models 'Gallery' and 'Photo'. You pick a Gallery to add pictures to, upload the pictures, and new Photo objects are created with foreign keys set to the chosen Gallery.
It would be nice to be able to configure or extend the app for your favorite Flash upload solution. We can pick one of the three above as a default, but implement the app so that people can easily add additional implementations (kinda like Django can use multiple databases). Let it be agnostic to any particular client-side solution. If we need to pick one to start with, maybe pick the one with the smallest footprint (smallest download of client-side stuff)?
The Flash-based solutions asynchronously (and either sequentially or in parallel) POST the files to a URL. I suggest that URL be local to our generic app (so it's the same for every app where you use our app). That URL will go to a view provided by our generic app.
The view will do the following: create a new model instance, add the file, OPTIONALLY DO EXTRA STUFF and save the instance.
DO EXTRA STUFF is code that the app that uses our app wants to run. It doesn't have to provide any extra code; if the model has just a FileField/ImageField the standard view code will do the job. But most apps will want to do extra stuff, I think, like filling in the other fields: title, date_added, foreign keys, many-to-many relations, ...
I have not yet thought about a mechanism for DO EXTRA STUFF. Just wrapping the generic app view came to mind, but that is not developer friendly, since you would have to write your own URL pattern and your own view. Then you have to tell the Flash solutions to use a new URL, etc.
I think something like signals could be used here?
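For example (all names invented, just to make the idea concrete), the generic app could define one custom signal and send it from its upload view, so the hosting app never has to touch the URL or the view:

import django.dispatch

# sent once per received file, with the freshly built (not yet saved) instance
file_uploaded = django.dispatch.Signal()


def handle_incoming_file(model_cls, uploaded_file, request):
    """Roughly what the generic upload view would do for each incoming file."""
    instance = model_cls(file=uploaded_file)
    # DO EXTRA STUFF: the hosting app's receivers fill in titles, foreign keys, ...
    file_uploaded.send(sender=model_cls, instance=instance, request=request)
    instance.save()

A Gallery/Photo app would then connect a receiver for its Photo model that sets instance.gallery from something in the request, without writing its own URL pattern or view.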
Forms/Admin: I'm still very sketchy on how all this could best be integrated in the admin or in generic Django forms/widgets/... (and this is where my lack of Django experience shows).
In the case of the Gallery/Photo app: you could provide a mass Photo upload widget on the Gallery detail form. But what if the Gallery instance is not saved yet? The file upload view won't be able to set the foreign keys on the Photo instances. I see that the auth app, when you create a user, first asks for a username and password and only then presents you with a bigger form to fill in the email address, pick roles, etc. We could do something like that.
In the case of an app with just one model: how do you provide a form in the Django admin to do your mass upload? You can't do it with the detail form of your model; that's just for one model instance.
There are probably dozens more questions that need to be answered before I can even start on this app. So please tell me what you think! Give me input! What do you like? What not? What would you do differently? Is this idea solid? Where is it not?
Thank you!
I just released a simple app for this about a month ago: django-uploadify.
It's basically a Django template tag that acts as a wrapper for the very nifty Uploadify (requires jQuery). Using it is as simple as adding this to your template...
{% load uploadify_tags %}{% multi_file_upload '/upload/complete/url/' %}
The tag will fire events (1 per file) on both the client-side and server-side (Django signal) to indicate when an incoming file has been received.
For example, assuming you have a model 'Media' that handles all user-uploaded files...
def upload_received_handler(sender, data, **kwargs):
    if data:
        new_media = Media.objects.create(
            file=data,
            new_upload=True,
        )
        # objects.create() already saves, so no extra save() call is needed

# upload_recieved is the signal provided by django-uploadify
upload_recieved.connect(upload_received_handler, dispatch_uid='whatever.upload_received')
Check out the wiki for info on how to set it up and create the signal handlers (client/server).
About your conceptual implementation from above, here's a few points of consideration:
Having the app automatically create the "File Model" instance probably isn't as robust, since people may already have their own models they're working with.
If you want to implement any type of security or authentication, you need a more open system rather than an 'auto-create' type.
I really think signals/events are the way to handle this, and also handle the 'DO OTHER STUFF' part of what you mentioned.
My conclusion was that multi-upload can never really be a form widget in the sense that Django implements form widgets. 1 file will most likely be represented by 1 model instance (with some exceptions), which means that we end up with a situation where 1 widget can represent N model instances. However, Django is set up so that a widget represents 1 value for 1 field in 1 instance. It just doesn't fit for the majority of use-cases to have it as a widget (hence why I went the template tag route).