Currently my users are able to upload files. Since I am deployed via Heroku, I am using django-storages to upload to AWS S3 buckets. I am using a CreateView/UpdateView as below, which works well; however, I now want to run an operation on the file before it is uploaded to AWS. My research so far suggests that I can use temporary_file_path() to do this in form_valid, but I am getting an error.
UpdateView/CreateView
class project_update(LoginRequiredMixin, UpdateView):
    model = Project
    form_class = ProjectForm
    template_name = "home/update_project.html"
    context_object_name = 'project'
    success_url = reverse_lazy("project-list")

    def form_valid(self, form):
        handle_uploaded_boq(form['boq_file'].temporary_file_path(), form.cleaned_data['project_title'])
        return super(project_update, self).form_valid(form)
However I am getting the following error:
'BoundField' object has no attribute 'temporary_file_path'
So what is the best way to run the operation handle_uploaded_boq() before the file is uploaded to AWS?
To access the file in the form_valid method you can use
form.files['boq_file']
and to access the path of an uploaded file of the TemporaryUploadedFile class, use
form.files['boq_file'].temporary_file_path()
or
form.files['boq_file'].file.name
Note: to get the path of the uploaded file, the uploaded file must be an instance of TemporaryUploadedFile, not InMemoryUploadedFile. You can ensure this by updating FILE_UPLOAD_HANDLERS in settings.py to the following:
FILE_UPLOAD_HANDLERS = [
    'django.core.files.uploadhandler.TemporaryFileUploadHandler',
]
This assumes that the name of the file field you used in the form is boq_file.
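As a side note, the handler's behavior can be sketched in plain Python (no Django; write_upload_to_temp is an illustrative name only): TemporaryFileUploadHandler streams the upload into a named temporary file on disk, which is why temporary_file_path() exists for TemporaryUploadedFile but not for InMemoryUploadedFile.

```python
import tempfile

# Plain-Python sketch of what TemporaryFileUploadHandler does: stream the
# uploaded chunks into a named temporary file on disk. The file's name is
# the analogue of TemporaryUploadedFile.temporary_file_path(); an in-memory
# upload has no such path, hence the handler setting above.
def write_upload_to_temp(chunks):
    tmp = tempfile.NamedTemporaryFile(suffix=".upload", delete=False)
    for chunk in chunks:
        tmp.write(chunk)
    tmp.close()
    return tmp.name
```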
I have an app built in Django and currently deployed on Google App Engine. Everything works fine except when I want to upload files larger than 32MB; I get an error which says, "413. That's an error."
I have been doing some research and I've come to realize that I have to use Google App Engine's Blobstore API. I have no idea on how to implement that on my Django App.
Currently my code looks something like this:
Model:
class FileUploads(models.Model):
    title = models.CharField(max_length=200)
    file_upload = models.FileField(upload_to="uploaded_files/", blank=True, null=True)
Form:
class UploadFileForm(forms.ModelForm):
    class Meta:
        model = FileUploads
        fields = ["title", "file_upload"]
View:
def upload_file(request):
    if request.method == "POST":
        form = UploadFileForm(request.POST, request.FILES)
        if form.is_valid():
            form.save()
    form = UploadFileForm()
    return render(request, "my_app/templates/template.html", {"form": form})
Everything works fine. I would just like to know how to implement Google App Engine's Blobstore API on my current code structure to enable large file uploads.
From Google Cloud Official Documentation:
There exist limits that apply specifically to the use of the Blobstore API.
The maximum size of Blobstore data that can be read by the application with one API call is 32 megabytes.
The maximum number of files that can be uploaded in a single form POST is 500.
Your code looks fine, and this is an expected error due to Blobstore API quotas and limits. One way would be to split your files into pieces that do not exceed 32 MB and make multiple API calls when uploading larger files. Another solution would be to upload directly to Google Cloud Storage.
Hope this clarifies your question.
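The splitting option can be sketched in plain Python (split_into_chunks is an illustrative helper, not part of any Google API); each piece stays under the 32 MB read limit, and reassembly on the receiving side is assumed, not shown:

```python
# Illustrative sketch of the "split the file" option: cut a payload into
# pieces no larger than the 32 MB Blobstore read limit so each piece fits
# in a single API call. Reassembling the pieces is left to the receiver.
BLOBSTORE_READ_LIMIT = 32 * 1024 * 1024  # 32 MB

def split_into_chunks(data, limit=BLOBSTORE_READ_LIMIT):
    return [data[i:i + limit] for i in range(0, len(data), limit)]
```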
ERROR: "Upload a valid image. The file you uploaded was either not an image or a corrupted image."
I have been looking around and I haven't been able to find a solution to my problem. I use the same images locally in a venv and they work. I use the same images in a docker container that has the same Pillow library and dependencies, and it works.
I have a Django ImageField with a simple admin form.
I can get the images to upload to S3 for storage. I have pulled the docker container we use on the servers and run it locally, but I cannot reproduce the error. I have not run into this error before with image uploading, so I am unsure why this is happening.
# models.py
@deconstructible
class RandomFileName(object):
    def __init__(self, path):
        self.path = path

    def __call__(self, instance, filename):
        ext = filename.split('.')[-1]
        filename = '{}.{}'.format(uuid.uuid4().hex, ext)
        return os.path.join(self.path, filename)

class MyModel(models.Model):
    name = models.CharField(max_length=50)
    avatar = models.ImageField(
        upload_to=RandomFileName('avatars')
    )
    ...
# admin.py
@admin.register(MyModel)
class MyModelAdmin(admin.ModelAdmin):
    list_display = (
        'name',
        ...
    )
    fieldsets = (
        (None, {'fields': (
            'name',
            'avatar',
        )}),
    )
    ...
Dependencies:
Django==2.0.3
Pillow==5.3.0
Edit:
This is behind AWS API Gateway and my answer/solution is below if anyone should encounter this issue themselves.
I solved this yesterday! There is nothing wrong with the code itself or the packages.
One thing I forgot to mention is that this is behind API Gateway, so I needed to modify it to accept "multipart/form-data".
Had the same/similar issue; this resolved it:
Add multipart/form-data to API Gateway Settings under Binary Media Types
In the proxy "Method Request", under "HTTP Request Headers", add "Accept" and "Content-Type"
Re-deploy the API
I have a serializer for my Post class which has an image and a link attribute.
media is a FileField and link is a URLField, which is a URL to somewhere else I share my post (on another website).
I want to:
Submit my post data (text and the image)
Access the URL of the submitted file to use when sharing it in another place.
Update the link value after I find it.
This is what I tried:
post = PostCreateSerializer(data=request.data, context={'request': request})
post.is_valid(raise_exception=True)
post.save()
media_url = post.data.get('media')
link = find_link_value(media_url)
post.link = link
post.save()
This raises an exception that says:
You cannot call `.save()` after accessing `serializer.data`. If you need to access data before committing to the database then inspect 'serializer.validated_data' instead.
The problem is that when I use post.validated_data.get('media') instead of .data, it doesn't give me the URL. It gives me an InMemoryUploadedFile object which, of course, doesn't have any path or URL.
I thought I could use the name attribute of the InMemoryUploadedFile object to find the URL (the one that will be created after .save()), but when the name is duplicated, the real name of the file on disk (and its URL) differs from the original name (for example, name.jpg becomes name_aQySbJu.jpg), so I can't use it for my purpose.
Question
How can I access the URL of the uploaded file, and also call save() after updating my post?
The serializer's save() method returns the corresponding instance, so you can use that to get the URL:
post = PostCreateSerializer(data=request.data, context={'request': request})
post.is_valid(raise_exception=True)
post_instance = post.save()
media_url = post_instance.media.url
link = find_link_value(media_url)
post_instance.link = link
post_instance.save()
Alternatively:
post = PostCreateSerializer(data=request.data, context={'request': request})
post.is_valid(raise_exception=True)
media_url = post.validated_data.get('media')
link = find_link_value(media_url)
post.save(link=link)
Actually, you can use this approach to save your post. When you pass link through serializer_obj.save(**kwargs), it automatically passes your extra data into create(self, validated_data) or update(self, instance, validated_data), adding the extra data to validated_data as dictionary entries. You can then handle the extra data in the create or update method of the serializer.
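The merge that save(**kwargs) performs can be mimicked in a few lines of plain Python (a simplified sketch, not DRF's actual class; SketchSerializer is an illustrative name):

```python
# Simplified sketch of how DRF's BaseSerializer.save(**kwargs) folds extra
# keyword arguments into validated_data before handing it to create().
class SketchSerializer:
    def __init__(self, validated_data):
        self.validated_data = validated_data

    def save(self, **kwargs):
        # Extra kwargs (e.g. link=...) are merged into validated_data.
        merged = {**self.validated_data, **kwargs}
        return self.create(merged)

    def create(self, validated_data):
        # In a real serializer this would build the model instance;
        # here we just return the merged dict to show what arrives.
        return validated_data
```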
I am trying to upload documents in appengine-django.
Documents get uploaded successfully with pure Django code [using python manage.py runserver].
But when I try to run Django within the App Engine project it gives me this error:
[Errno 30] Read-only file system: u'/home/nishant/workspace1/problemdemo/uploaded_files/1372313114_43_nishant26062013.zip'
This error is caused because developers have read-only access to the filesystem on App Engine.
Is there another way to upload docs to Google Cloud SQL?
Here is my code:
models.py
from django.db import models
import time

# Create your models here.
def get_upload_file_name(instance, filename):
    return "uploaded_files/%s_%s" % (str(time.time()).replace('.', '_'), filename)

class Candidate(models.Model):
    title = models.CharField(max_length=20)
    resume = models.FileField(upload_to=get_upload_file_name)

    def __unicode__(self):
        return self.title
forms.py
from django import forms
from actualproblem.models import Candidate

class TestForm(forms.ModelForm):
    class Meta:
        model = Candidate
        fields = ('title', 'resume',)
views.py
# Create your views here.
from django.shortcuts import render
from actualproblem.forms import TestForm

def sampletest(request):
    if request.method == 'POST':
        form = TestForm(request.POST, request.FILES)
        if form.is_valid():
            form.save()
    else:
        form = TestForm()
    return render(request, 'profile.html', {'form': form})
How can I upload documents to Google Cloud SQL?
You may solve the conundrum by using Uploadcare; it can help in situations where you have little or no control over the host filesystem (GAE, Heroku, shared hosting, or whatnot). Uploaded files will be stored in Uploadcare, and you will store file UUIDs or URLs in your DB.
There is a Django package: pyuploadcare
Disclosure: I am one of the Uploadcare developers, and I am posting this not to shamelessly promote the service but because it was built to solve cases like this one.
This is a follow up question for Django on Google App Engine: cannot upload images
I got part of the upload of images to GAE Blobstore working. Here's what I did:
In models.py I created a model PhotoFeature:
class PhotoFeature(models.Model):
    property = models.ForeignKey(
        Property,
        related_name="photo_features"
    )
    caption = models.CharField(
        max_length=100
    )
    blob_key = models.CharField(
        max_length=100
    )
In admin.py I created an admin entry that overrides the rendering of the change_form so the correct action for the Blobstore upload URL can be inserted:
class PhotoFeatureAdmin(admin.ModelAdmin):
    list_display = ("property", "caption")
    form = PhotoFeatureForm

    def render_change_form(self, request, context, *args, **kwargs):
        from google.appengine.ext import blobstore
        if kwargs.has_key("add"):
            context['blobstore_url'] = blobstore.create_upload_url('/admin/add-photo-feature')
        else:
            context['blobstore_url'] = blobstore.create_upload_url('/admin/update-photo-feature')
        return super(PhotoFeatureAdmin, self).render_change_form(request, context, *args, **kwargs)
As I use standard Django, I want to use Django views to process the result once GAE has updated the Blobstore, instead of BlobstoreUploadHandler. I created the following views (as per the render_change_form method) and updated urls.py:
def add_photo_feature(request):
def update_photo_feature(request):
This all works nicely but once I get into the view method I'm a bit lost. How do I get the Blob key from the request object so I can store it with PhotoFeature? I use standard Django, not Django non-rel. I found this related question but it appears not to contain a solution. I also inspected the request object which gets passed into the view but could not find anything relating to the blob key.
EDIT:
The Django request object contains a FILES dictionary which will give me an instance of InMemoryUploadedFile. I presume that somehow I should be able to retrieve the blob key from that...
EDIT 2:
Just to be clear: the uploaded photo appears in the Blobstore; that part works. It's just getting the key back from the Blobstore that's missing here.
EDIT 3:
As per Daniel's suggestion I added storage.py from the djangoappengine project, which contains the suggested upload handler, and added it to my settings.py. This results in the following exception when trying to upload:
'BlobstoreFileUploadHandler' object has no attribute 'content_type_extra'
This is really tricky to fix. The best solution I have found is to use the file upload handler from the djangoappengine project (which is associated with django-nonrel, but does not depend on it). That should handle the required logic to put the blob key into request.FILES, as you'd expect in Django.
Edit
I'd forgotten that django-nonrel uses a patched version of Django, and one of the patches adds the content_type_extra field. You can replicate the functionality by subclassing the upload handler as follows:
import cgi

from djangoappengine import storage

class BlobstoreFileUploadHandler(storage.BlobstoreFileUploadHandler):
    """Handler that adds blob key info to the file object."""

    def new_file(self, field_name, *args, **kwargs):
        # We need to re-process the POST data to get the blobkey info.
        meta = self.request.META
        meta['wsgi.input'].seek(0)
        fields = cgi.FieldStorage(meta['wsgi.input'], environ=meta)
        if field_name in fields:
            current_field = fields[field_name]
            self.content_type_extra = current_field.type_options
        super(BlobstoreFileUploadHandler, self).new_file(field_name,
                                                         *args, **kwargs)
and reference this subclass in your settings.py rather than the original.
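For example, assuming the subclass above lives in a module such as myapp.upload_handlers (a hypothetical path; adjust it to wherever you put the subclass), settings.py would point at it like this:

```python
# settings.py -- 'myapp.upload_handlers' is a hypothetical module path;
# replace it with the actual location of the subclass above.
FILE_UPLOAD_HANDLERS = [
    'myapp.upload_handlers.BlobstoreFileUploadHandler',
]
```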