This is the first time I've tried to upload a file with Django. I did something and it worked, though I realized later that it's not the correct way to do it. When I called save on the object, did it call a built-in handler for the FileField? I realize that I should create my own handler but I was just curious why this worked.
def upload_test(request):
    user = User.objects.get(pk=request.user.id)
    photoform = PhotoForm()
    if request.method == 'POST':
        photoform = PhotoForm(request.POST, request.FILES)
        if photoform.is_valid():
            photo = photoform.save(commit=False)
            photo.user = user
            photo.save()
            return HttpResponse('success')
        else:
            return HttpResponse('%s' % photoform.errors)
    return render_to_response("site/upload_test.html", {'photoform': photoform}, context_instance=RequestContext(request))
This is saving the object and uploading the file to the directory specified in the FileField.
If I create a handler which writes the file in chunks, how can I also save the photoform instance? Will it create duplicates?
Thanks for the insight.
I imagine PhotoForm is a ModelForm? Manual handling of uploaded files, as described in the docs, is only required for plain Forms. For a ModelForm, the chunked writing is performed in the background by the model's FileField and its storage backend.
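For illustration, a minimal sketch of what such a ModelForm might look like; the Photo model and its image field are assumptions, not taken from your code:

from django import forms
from .models import Photo  # assumed model with a FileField/ImageField

class PhotoForm(forms.ModelForm):
    class Meta:
        model = Photo
        fields = ['image']

When photoform.save() runs (or photo.save() after commit=False), the model field hands the uploaded file to its storage, which writes it out in chunks for you.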
I need to process an uploaded file with GeoDjango. According to the documentation, I should use the DataSource() constructor from GDAL.
The problem is that uploaded shapefiles can be smaller than 2.5 MB, so MemoryFileUploadHandler is used by default, which means there is no file path on disk to pass to DataSource().
I decided to override request.upload_handlers for my specific view, using only "django.core.files.uploadhandler.TemporaryFileUploadHandler" because I don't need to create a subclass (for now), but I get the following error: "You cannot set the upload handlers after the upload has been processed."
Here is my piece of code:
def home(request):
    request.upload_handlers = ["django.core.files.uploadhandler.TemporaryFileUploadHandler"]
    if request.method == 'POST':
        form = UploadFileForm(request.POST, request.FILES)
        if form.is_valid():
            handle_uploaded_file(request.FILES['file'])
    else:
        form = UploadFileForm()
    return render(
        request,
        'app/index.html',
        {
            'title': 'HOME',
            'form': form,
        }
    )
What am I doing wrong? Also, should I create a subclass for a custom handler anyway?
The problem is that by the time your view function is called, Django has already used the upload handlers, so it's too late to change them.
I remember you might still be able to change them if you disable CSRF protection for that view. (Also see: Where/how to replace default upload handler in Django CBV?)
Another possibility would be writing your own upload handler or maybe a middleware that changes the upload handler depending on the path.
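The pattern the Django docs describe for swapping handlers on the fly looks roughly like this (a sketch reusing the home view from the question):

from django.core.files.uploadhandler import TemporaryFileUploadHandler
from django.views.decorators.csrf import csrf_exempt, csrf_protect

@csrf_exempt
def home(request):
    # Swap handlers before anything touches request.POST or request.FILES;
    # note that upload_handlers takes handler instances, not dotted-path strings.
    request.upload_handlers = [TemporaryFileUploadHandler(request)]
    return _home(request)

@csrf_protect
def _home(request):
    # the original body of home() from the question goes here
    ...

The csrf_exempt/csrf_protect split is needed because CSRF checking reads the POST data, which would otherwise consume the upload before you get a chance to change the handlers.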
I have a CreateView for a ModelForm with a form_valid() method, which calls form.save() quite early on (so that I can get the object's ID), then goes on to do some other stuff (create some related objects, and send some emails).
def form_valid(self, form):
    context = self.get_context_data()
    preferences_formset = context['preferences_formset']
    if preferences_formset.is_valid():
        student = form.save()
        ...
        send_email_one()
        send_email_two()
        send_email_three()
        return HttpResponseRedirect(self.get_success_url())
I recently discovered that some of the later processing had some errors resulting in an unhandled exception in some cases when send_email_three is called. I can see from my logs that send_email_one and send_email_two are being called, and the exception occurs in send_email_three. However, when this has occurred, I can't find the objects in the DB. I was under the impression that form.save() should create and save the object in the DB - is this the case or does it roll back the save if the form_valid function errors out at a later point?
I'm using django 1.8.17
PS: Yes, I know I should have the emails in a deferred task; that will be implemented later.
That depends on the ATOMIC_REQUESTS setting. Setting it to True will trigger the behaviour described in the docs:
Before calling a view function, Django starts a transaction. If the response is produced without problems, Django commits the transaction. If the view produces an exception, Django rolls back the transaction.
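For reference, a sketch of where that setting lives (it goes inside the per-database configuration, not at the top level of settings.py; the engine shown is just an example):

# settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        # ... NAME, USER, PASSWORD, etc. ...
        'ATOMIC_REQUESTS': True,  # wrap every request to this DB in a transaction
    },
}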
You can also use the @transaction.atomic decorator to control the transaction explicitly:
https://docs.djangoproject.com/en/1.10/topics/db/transactions/#controlling-transactions-explicitly
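For example, a minimal sketch (the view class name here is made up) that wraps form_valid() in a single transaction, so the student row is rolled back if a later email call raises:

from django.db import transaction
from django.views.generic import CreateView

class StudentCreateView(CreateView):  # hypothetical name
    @transaction.atomic
    def form_valid(self, form):
        # form.save(), the related-object creation, and the email calls from
        # the question would go here; if anything raises, every database write
        # made inside this method is rolled back.
        return super(StudentCreateView, self).form_valid(form)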
I'm looking for a way to upload a file to a Django server. The thing is, I'm not trying to save it; I just want to open it and get its data for processing. I looked through some of the examples here, but I couldn't find anything that answers this. I'm probably just not searching for the right thing; please help.
I don't want to use models, just a simple website with an upload button.
Thanks!
Use a Form:
class UploadForm(forms.Form):
    file = forms.FileField()

    def process(self):
        file = self.cleaned_data.get('file')
        # do whatever you need here to process the file
        # e.g. data = file.read()
In your view, call process() on your form after the user uploads the file and the form is successfully validated.
def my_view(request):
    if request.method == 'POST':
        form = UploadForm(files=request.FILES)
        if form.is_valid():
            form.process()
            return ...
Depending on the file's size and your FILE_UPLOAD_HANDLERS setting, the upload either lives only in memory (MemoryFileUploadHandler) and is discarded as soon as the view finishes, or is written to a temporary file (TemporaryFileUploadHandler) that is cleaned up afterwards by Django or, eventually, by the operating system.
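For reference, these are Django's default handlers; small uploads stay in memory and larger ones go to a temporary file, and the setting can be overridden in settings.py:

# settings.py (these are the defaults, shown for reference)
FILE_UPLOAD_HANDLERS = [
    'django.core.files.uploadhandler.MemoryFileUploadHandler',
    'django.core.files.uploadhandler.TemporaryFileUploadHandler',
]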
There are a few tiny related questions buried in here, but they really point to one big, hairy best practice question. This is kind of a tough feature to implement because it's supposed to do a couple tricky things at once...
drag-and-drop multi-file uploader (via Javascript)
multi-page form (page one: upload and associate files with an existing document model;
page two: update and save file/document objects and meta-data to database)
...and I haven't found a pre-existing code sample or implementation anywhere. (Depending on the approach, it could sweep away or automatically answer all of the related/embedded/follow-on questions.) Bottom line, the purpose of this post is to answer this question: what's the most elegant approach that minimizes the intervening questions/problems?
I'm using this implementation of a drag-and-drop JQuery File Uploader in Django to upload files...
https://github.com/miki725/Django-jQuery-File-Uploader-Integration-demo
The solution I link to above saves files on the filesystem, of course, but in batches per upload session, via creating a directory for each batch of files, and then assigning a UUID to each of those directories. Each uniquely named directory on the filesystem contains files uploaded during that particular upload session. That means any sort of database storage method first has to tease apart and iterate over all the files in the filesystem directory created for each upload session by this solution.
Note: the JQuery solution linked to above doesn't use a form (in forms.py) inside the app directory. The form is hardcoded into the template, which is already a bit of a bummer...'cause now I also have to find a nice way to bind each of the above files in each batch to a form.
I think the simplest--albeit perhaps least performant solution--is to create two views, for two forms...to save each file to the database in the view on the first page, and then update the database on the second page. Here's the direction I'm presently rolling in:
IN THE TEMPLATE...
...uploader javascripts in header...
<form action="{% url my_upload_handler %}" method="POST" enctype="multipart/form-data">
<input type="file" name="files[]" multiple>
</form>
IN VIEWS.PY...
def my_upload_handler_or_form_part_one(request):
    # POST (in the upload handler; request triggered by an upload action)
    if request.method == 'POST':
        if not ("f" in request.GET.keys()):
            # ...validators and exception handling...
            # ...build response_data, which is a dict...
            uid = request.POST[u"uid"]
            file = request.FILES[u'files[]']
            filename = os.path.join(temp_path, str(uuid.uuid4()) + file.name)
            destination = open(filename, "wb+")
            for chunk in file.chunks():
                destination.write(chunk)
            destination.close()
            response_data = simplejson.dumps([response_data])
            response_type = "application/json"
            # return the data to the JQuery uploader plugin...
            return HttpResponse(response_data, mimetype=response_type)
    # GET (in the same upload handler)
    else:
        return render_to_response('my_first_page_template.html',
            {   # <-- NO 'form': form here
                'uid': uuid.uuid4(),
            },
            context_instance=RequestContext(request))

def form_part_two(request):
    # here I need to retrieve and update stuff uploaded on the first page
    return render_to_response('my_second_page_template.html',
        {},
        context_instance=RequestContext(request))
This view for the first page leverages the JQuery uploader, which works great for multi-file uploads per session and does what it's supposed to do. However, as hinted above, the view, as an upload handler, is only the first page in what needs to be a two page form. On page two, the end user would subsequently need to retrieve each uploaded file, attach additional data to the files they just uploaded on page one, and re-save to the database.
I've tried to make this work as a two-part form via various solutions, including form wizards and/or generic class based views...following examples mainly enabling data persistence via the session. These solutions get rather thorny very quickly.
In summary, I need to...
upload multiple files in a uniquely identified batch (via drag and drop)
tease apart and iterate over each batch of uploaded files
bind each file in the batch to a form and associate it with an existing document model
submit / save all of these files at once to the database
retrieve each of those files on the following page/template of a potentially new form
update metadata for each file
resubmit / save all of those files at once to the database
So...you can see how all of the above compounds the complexity of a simple file upload, and increases the complexity of providing the feature, by involving related questions like:
forms.py: how best to bind each file to a form
models.py: how to associate each file with a pre-existing document model
views.py: how to save each file in accordance with the pre-existing document model in Postgres on the first page; update and save each document on the second page
...and, again, I'd like to do all of that without a form wizard, and without class-based views. (CBVs, especially, for this use case elude me a bit.) In other words: I'm looking for advice leading toward the most bulletproof and easy-to-read/understand solution possible. If it causes multiple hits to the database, that's fine by me. (If saving a file to the database seems anti best practice, please see this other post: Storing file content in DB.)
Might I be able to just create a separate view for two forms, and subclass a standard upload form, like so...
In forms.py...
class FileUploadForm(forms.Form):
    files = forms.FileField(widget=forms.ClearableFileInput(attrs={'name': 'files[]', 'multiple': 'multiple'}))
    # how to iterate over files in the list or batch of files here...?
    file = forms.FileField()
    file = forms.FileField()

    def clean_file(self):
        data = self.cleaned_data["file"]
        # read, parse, and create `data_dict` from file...
        # subclass pre-existing UploadModelForm
        form = UploadModelForm(data_dict)
        if form.is_valid():
            self.instance = form.save(commit=False)
        else:
            raise forms.ValidationError(form.errors)
        return data
...and then refactor the earlier upload handler above with something like...
In views.py, substituting the following for present upload handler...
def view_for_form_one(request):
    # ...
    # the aforementioned upload handler logic, plus...
    # ...
    form = FileUploadForm(request.POST, request.FILES)
    if form.is_valid():
        form.save()
    else:
        # display errors
        pass
    # ...

def view_for_form_two(request):
    # update and commit all data here
    # ...?
In general, with this type of problem, I like to create a single page with one <form> on it, but multiple sections that the user progresses through with javascript.
Breaking a form into a multi-part, wizard-style form series is much easier with javascript, especially if the data it produces is dynamic in nature.
If you absolutely must break it out into multiple pages, I would advise you to set up your app to be able to save the data into the database at the end of each step.
You can do that by making the metadata which the user adds at step 2 a nullable field, or even moving the metadata to a separate model.
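For instance, a rough sketch of that second idea (every name here is hypothetical, not from your code): keep the file row minimal at step one and let page two create or update the metadata separately:

from django.db import models

class UploadedFile(models.Model):
    document = models.ForeignKey('Document', related_name='files',
                                 on_delete=models.CASCADE)
    file = models.FileField(upload_to='uploads/')
    uploaded_at = models.DateTimeField(auto_now_add=True)

class FileMetadata(models.Model):
    # created/updated on page two, so page one never has to supply it
    uploaded_file = models.OneToOneField(UploadedFile, on_delete=models.CASCADE)
    title = models.CharField(max_length=255, blank=True)
    description = models.TextField(blank=True)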
So I'm working with django and file uploads and I need a javascript function to execute after the file has been uploaded.
I have a file upload handler in my views.py which looks like this:
def upload_file(request):
    form = UploadFileForm(request.POST, request.FILES)
    if form.is_valid():
        for f in request.FILES.getlist('fileAttachments'):
            handle_uploaded_file(f)
        return HttpJavascriptResponse('parent.Response_OK();')
    else:
        return HttpResponse("Failed to upload attachment.")
And I found a django snippet from http://djangosnippets.org/snippets/341/ and I put the HttpJavascriptResponse class in my views.py code. It looks as follows:
class HttpJavascriptResponse(HttpResponse):
    def __init__(self, content):
        HttpResponse.__init__(self, content, mimetype="text/javascript")
However, when I upload a file the browser simply renders "parent.Response_OK();" on the screen instead of actually executing the javascript. And Chrome gives me the warning: "Resource interpreted as Document but transferred with MIME type text/javascript".
Is there anyway to get views.py to execute the script?
A better way is to pass the MIME type to the HttpResponse object.
See documentation: https://docs.djangoproject.com/en/3.2/ref/request-response/#django.http.HttpRequest.content_type.
return HttpResponse("parent.Response_OK()", content_type="application/x-javascript")
Note - Some previous versions of Django used mimetype instead of content_type:
return HttpResponse("parent.Response_OK()", mimetype="application/x-javascript")
I believe this will work.
return HttpResponse("<script>parent.Response_OK();</script>")
However, you might think about returning a success (200) status code in this case, and then having some javascript in parent attach to the load event of this child, and branch based on return status code. That way you have a separation of view rendering code and view behavior code.
Chase's solution worked for me, though I need to execute more javascript than I care to put in a python string:
from django.http import HttpResponse
from django.contrib.staticfiles.templatetags import staticfiles
...
return HttpResponse("<script src='{src}'></script>".format(
src = staticfiles.static('/path/to/something.js')))
I ended up here looking for a way to serve a dynamic js file using django.
Here is my solution :
return render(request, 'myscript.js', {'foo': 'bar'},
              content_type="application/x-javascript")