Upload zip files to Google App Engine using Django - django

In a nutshell, we are trying to implement the following example from Google using Django:
import webapp2
from google.appengine.ext import blobstore
from google.appengine.ext.webapp import blobstore_handlers

class MainHandler(webapp2.RequestHandler):
    def get(self):
        upload_url = blobstore.create_upload_url('/upload')
        self.response.out.write('<html><body>')
        self.response.out.write('<form action="%s" method="POST" enctype="multipart/form-data">' % upload_url)
        self.response.out.write("""Upload File: <input type="file" name="file"><br> <input type="submit"
            name="submit" value="Submit"> </form></body></html>""")

class UploadHandler(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        upload_files = self.get_uploads('file')  # 'file' is the file upload field in the form
        blob_info = upload_files[0]
        self.redirect('/serve/%s' % blob_info.key())
Our app is written in Django 1.x. We use urls.py, views.py and models.py to define our url scheme, views and data models respectively.
In our database, we need to keep track of the data file (or at least, a reference to where the file exists on disk). Currently, our data model is as follows:
class FileData(models.Model):
    token = models.CharField(max_length=10)
    file = models.FileField(upload_to=get_upload_file_name, null=True, default=None, blank=True)
The field 'token' contains a token that client apps will use to request the file for download. The field 'file' is intended to contain the data. However, if we need to make it a reference to a blob location, it won't be a problem. We just don't know what to do at this point. What is the right solution?
Our urls pattern is as follows:
urlpatterns = patterns('',
    url(r'^upload/', 'views.upload', name='upload'),
    url(r'^/serve/([^/]+)?', 'views.servehandler', name='servehandler'),
)
Our form is as follows.
class DataUploadForm(forms.Form):
    """
    FileData form
    """
    token = forms.CharField(max_length=10)
    file = forms.FileField()

    class Meta:
        model = FileData
        fields = ['token', 'file', ]

    def __init__(self, *args, **kwargs):
        super(DataUploadForm, self).__init__(*args, **kwargs)
        self.helper = FormHelper()
        self.helper.form_tag = False
        self.helper.layout = Layout(
            'token',
            'file',
            ButtonHolder(
                Submit('submit', 'Submit', css_class='btn btn-success btn-block'),
            ),
        )
Our view is as follows.
def upload(request):
    args = {}
    args.update(csrf(request))
    if request.method == 'POST':
        print "In /upload"
        form = DataUploadForm(request.POST, request.FILES)
        if form.is_valid():
            file_data = FileData()
            file_data.token = request.POST.get('token')
            file_data.file = request.FILES.get('file')
            print "===========> Uploading:" + file_data.token
            file_data.save()
            return render(request, 'upload_success.html', args)
    form = DataUploadForm()
    args['form'] = form
    return render(request, 'upload.html', args)
When we execute, we get:
Exception Value:
[Errno 30] Read-only file system: u
What are we doing wrong? How can we improve our code? How do we get Google App Engine to save our file, perhaps in the blobstore, and give us the reference to where the file exists for use in downloading it later?
Please be complete. Describe the changes needed in urls.py, models.py, views.py and the yaml files (if necessary). Address how to upload large files (i.e., files greater than 40 MB each).
Please don't send us any more links. We have searched and seen a lot of postings, none of which answers this question. If you can, please post code snippets that answer the question, not links.

The reason you get this read-only file system error is that the default models.FileField saves uploaded files to the MEDIA_ROOT folder, which is read-only for Google App Engine apps.
You have to use a third-party storage backend to save the uploaded files to a service like Google Cloud Storage or Amazon S3.
I did a quick search and found:
https://github.com/viadanna/django-gcsengine
https://django-storages.readthedocs.org
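For example, with django-storages and its Google Cloud Storage backend, the settings change is roughly the following (the bucket name is a placeholder):
# settings.py - a minimal sketch assuming django-storages[google] is installed
DEFAULT_FILE_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'
GS_BUCKET_NAME = 'your-bucket-name'  # placeholder: use your own bucket
With this in place, your existing FileField keeps working; the uploaded bytes just end up in the bucket instead of on the read-only local disk.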
As for the upload size limit: Google App Engine caps incoming requests at 32 MB. To support larger files, you have to let users upload their files directly to the Blobstore or Google Cloud Storage and build a callback handler that links the uploaded file to your model.
Check these links:
https://cloud.google.com/appengine/docs/python/blobstore/
Upload images/video to google cloud storage using Google App Engine
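As a rough illustration, generating the Blobstore upload URL from a Django view could look like the sketch below; the callback path '/upload_callback/' and the template name are assumptions, and the callback view still has to read the blob key out of the rewritten request that App Engine forwards to it and store that key (for example in a CharField) on your FileData model.
# views.py - a minimal sketch, assuming the app runs on App Engine with the blobstore API available
from google.appengine.ext import blobstore
from django.shortcuts import render

def upload(request):
    # POSTs to this URL are intercepted by App Engine, which stores the file in the
    # Blobstore and then forwards the rewritten request to /upload_callback/.
    upload_url = blobstore.create_upload_url('/upload_callback/')
    return render(request, 'upload.html', {'upload_url': upload_url})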

Related

Django CSV file not downloading? A large CSV file download takes some time, then gives a 502 bad gateway nginx error (Django)

How can I download a large CSV file when the download fails with a 502 bad gateway error?
I found a solution, which I've added below.
The idea is to use streaming responses. One way to resolve the error is to increase the nginx timeout, but that affects cost, so the better way is Django streaming.
Like a movie download, a streamed response is written to the browser progressively and shows its status until it is complete; the same then happens for the CSV file.
Write the view for this in Django.
views.py
import datetime

from django.contrib import messages
from django.http import StreamingHttpResponse
from django.shortcuts import redirect
from django.views import generic

ERROR_503 = 'Something went wrong.'
DASHBOARD_URL = 'path'

def get_headers():
    return ['field1', 'field2', 'field3']

def get_data(item):
    return {
        'field1': item.field1,
        'field2': item.field2,
        'field3': item.field3,
    }

class CSVBuffer(object):
    # A pseudo-buffer: write() just hands the value back so it can be streamed.
    def write(self, value):
        return value

class Streaming_CSV(generic.View):
    model = Model_name  # replace with the model you want to export

    def get(self, request, *args, **kwargs):
        try:
            queryset = self.model.objects.filter(is_draft=False)
            # iter_items yields one CSV row at a time (see the sketch below)
            response = StreamingHttpResponse(streaming_content=iter_items(queryset, CSVBuffer()), content_type='text/csv')
            file_name = 'Experience_data_%s' % (str(datetime.datetime.now()))
            response['Content-Disposition'] = 'attachment;filename=%s.csv' % (file_name)
        except Exception as e:
            print(e)
            messages.error(request, ERROR_503)
            return redirect(DASHBOARD_URL)
        return response
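Note that iter_items is not defined in the snippet above; it comes from the streaming-CSV pattern in the Django docs and the gist linked below. A minimal sketch, reusing get_headers and get_data from this view, could be:
import csv

def iter_items(items, pseudo_buffer):
    # DictWriter writes into CSVBuffer, whose write() returns the row text,
    # so each writerow() call yields one CSV line to stream.
    writer = csv.DictWriter(pseudo_buffer, fieldnames=get_headers())
    yield ','.join(get_headers()) + '\r\n'  # header row
    for item in items:
        yield writer.writerow(get_data(item))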
urls.py
path('streaming-csv/', views.Streaming_CSV.as_view(), name='streaming-csv')
For reference, use the links below.
https://docs.djangoproject.com/en/4.0/howto/outputting-csv/#streaming-large-csv-files
Gist: https://gist.github.com/niuware/ba19bbc0169039e89326e1599dba3a87
Adding rows manually to StreamingHttpResponse (Django)

Limit the Number of Files a User Can Upload

I have a multi-file upload and want to limit users to 3 uploads each. My problem is that I need to know how many files a user has already created in the DB and how many they are currently uploading (they can upload multiple files at once, and can upload multiple times).
I have attempted many things, including:
Creating a validator (the validator was passed the actual file being added, not a model instance, so I couldn't access the model to get its id for a check like StudentUploadedFile.objects.filter(student_lesson_data=data.id).count() >= 4).
Doing the validation in clean(self) (clean was only passed one instance at a time and the DB isn't updated until all files are cleaned, so I could count the files already in the DB but couldn't count how many were currently being uploaded).
Using a pre-save method (if the DB were updated between each file being passed to my pre-save method it would work, but the DB is only updated after all the files being uploaded have passed through my pre-save method).
My pre-save attempt:
@receiver(pre_save, sender=StudentUploadedFile)
def upload_file_pre_save(sender, instance, **kwargs):
    if StudentUploadedFile.objects.filter(student_lesson_data=instance.data.id).count() >= 4:
        raise ValidationError('Sorry, you cannot upload more than three files')
edit:
models.py
class StudentUploadedFile(models.Model):
    student_lesson_data = models.ForeignKey(StudentLessonData, related_name='student_uploaded_file', on_delete=models.CASCADE)
    student_file = models.FileField(upload_to='module_student_files/', default=None)
views.py
class StudentUploadView(View):
    def get(self, request):
        files_list = StudentUploadedFile.objects.all()
        return render(self.request, 'users/modules.html', {'student_files': files_list})

    def post(self, request, *args, **kwargs):
        form = StudentUploadedFileForm(self.request.POST, self.request.FILES)
        form.instance.student_lesson_data_id = self.request.POST['student_lesson_data_id']
        if form.is_valid():
            uploaded_file = form.save()
            # pass uploaded_file data and username so the new file can be added to the student's file list using ajax
            # lesson_id is used to output the newly added file to the corresponding newly_added_files div
            data = {'is_valid': True, 'username': request.user.username, 'file_id': uploaded_file.id, 'file_name': uploaded_file.filename(),
                    'lesson_id': uploaded_file.student_lesson_data_id, 'file_path': str(uploaded_file.student_file)}
        else:
            data = {'is_valid': False}
        return JsonResponse(data)
template.py
<form id='student_uploaded_file{{ item.instance.id }}'>
{% csrf_token %}
<a href="{% url 'download_student_uploaded_file' username=request.user.username file_path=item.instance.student_file %}" target='_blank'>{{ item.instance.filename }}</a>
<a href="{% url 'delete_student_uploaded_file' username=request.user.username file_id=item.instance.id %}" class='delete' id='{{ item.instance.id }}'>Delete</a>
</form>
js
$(function () {
    // open file explorer window
    $(".js-upload-photos").on('click', function () {
        // concatenate the id from the button pressed onto the fileupload class to target the correct input element
        $("#fileupload" + this.id).click();
    });
    $('.fileupload_input').each(function () {
        $(this).fileupload({
            dataType: 'json',
            done: function (e, data) {  // process response from server
                // add the newly added file to the student's uploaded files list
                if (data.result.is_valid) {
                    $("#newly_added_files" + data.result.lesson_id).prepend("<form id='student_uploaded_file" + data.result.file_id +
                        "'><a href='/student_hub/" + data.result.username + "/download_student_uploaded_file/" +
                        data.result.file_path + "' target='_blank'>" + data.result.file_name + "</a><a href='/student_hub/" + data.result.username +
                        "/delete_student_uploaded_file/" + data.result.file_id + "/' class='delete' id=" + data.result.file_id + ">Delete</a></form>");
                }
            }
        });
    });
});
UPDATE:
forms.py
class StudentUploadedFileForm(forms.ModelForm):
    student_file = forms.FileField(widget=forms.ClearableFileInput(attrs={'multiple': True}))
view.py
class StudentUploadView(View):
    model = StudentUploadedFile
    max_files_per_lesson = 3

    def post(self, request, *args, **kwargs):
        lesson_data_id = request.POST['student_lesson_data_id']
        current_files_count = self.model.objects.filter(
            student_lesson_data_id=lesson_data_id
        ).count()
        avail = self.max_files_per_lesson - current_files_count
        file_list = request.FILES.getlist('student_file')
        print(len(file_list))
        if avail - len(file_list) < 0:
            return JsonResponse(data={
                'is_valid': False,
                'reason': f'Too many files: you can only upload {avail}.'
            })
        else:
            for f in file_list:
                print(f)
            data = {'test': True}
            return JsonResponse(data)
Thank you.
I've tried using a PyPI package and it works flawlessly. I'm going to go out on a limb here and assume that you are open to editing the package code to fix any errors you encounter due to compatibility issues, since most packages that haven't been updated in quite a while might have them.
To solve the problem of limiting the number of files a user can upload, the django-multiuploader package would be of enormous help and would honestly do more than you ask for. And yes, it uses jQuery forms for uploading multiple files.
How to use it?
Installation and pre-usage steps
Installation
pip install django-multiuploader
python3 manage.py syncdb
python3 manage.py migrate multiuploader
In your settings.py file:
MULTIUPLOADER_FILES_FOLDER = 'multiuploader'  # media location where files are stored
MULTIUPLOADER_FILE_EXPIRATION_TIME = 3600  # time after which a file expires (it can then be cleaned with the clean_files command)
MULTIUPLOADER_FORMS_SETTINGS = {
    'default': {
        'FILE_TYPES': ["txt", "zip", "jpg", "jpeg", "flv", "png"],
        'CONTENT_TYPES': [
            'image/jpeg',
            'image/png',
            'application/msword',
            'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
            'application/vnd.ms-excel',
            'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
            'application/vnd.ms-powerpoint',
            'application/vnd.openxmlformats-officedocument.presentationml.presentation',
            'application/vnd.oasis.opendocument.text',
            'application/vnd.oasis.opendocument.spreadsheet',
            'application/vnd.oasis.opendocument.presentation',
            'text/plain',
            'text/rtf',
        ],
        'MAX_FILE_SIZE': 10485760,
        'MAX_FILE_NUMBER': 5,
        'AUTO_UPLOAD': True,
    },
    'images': {
        'FILE_TYPES': ['jpg', 'jpeg', 'png', 'gif', 'svg', 'bmp', 'tiff', 'ico'],
        'CONTENT_TYPES': [
            'image/gif',
            'image/jpeg',
            'image/pjpeg',
            'image/png',
            'image/svg+xml',
            'image/tiff',
            'image/vnd.microsoft.icon',
            'image/vnd.wap.wbmp',
        ],
        'MAX_FILE_SIZE': 10485760,
        'MAX_FILE_NUMBER': 5,
        'AUTO_UPLOAD': True,
    },
    'video': {
        'FILE_TYPES': ['flv', 'mpg', 'mpeg', 'mp4', 'avi', 'mkv', 'ogg', 'wmv', 'mov', 'webm'],
        'CONTENT_TYPES': [
            'video/mpeg',
            'video/mp4',
            'video/ogg',
            'video/quicktime',
            'video/webm',
            'video/x-ms-wmv',
            'video/x-flv',
        ],
        'MAX_FILE_SIZE': 10485760,
        'MAX_FILE_NUMBER': 5,
        'AUTO_UPLOAD': True,
    },
    'audio': {
        'FILE_TYPES': ['mp3', 'mp4', 'ogg', 'wma', 'wax', 'wav', 'webm'],
        'CONTENT_TYPES': [
            'audio/basic',
            'audio/L24',
            'audio/mp4',
            'audio/mpeg',
            'audio/ogg',
            'audio/vorbis',
            'audio/x-ms-wma',
            'audio/x-ms-wax',
            'audio/vnd.rn-realaudio',
            'audio/vnd.wave',
            'audio/webm',
        ],
        'MAX_FILE_SIZE': 10485760,
        'MAX_FILE_NUMBER': 5,
        'AUTO_UPLOAD': True,
    },
}
Take note of MAX_FILE_NUMBER; therein lies the answer to your question. Have a look at the source once you install this and try implementing it on your own if you want. It might be fun.
For further instructions, refer to:
django-multiuploader package on pypi
I guess the fact that you can use multi-file upload in Django hasn't trickled down to the community yet. An excerpt from the Django documentation:
If you want to upload multiple files using one form field, set the multiple HTML attribute of field’s widget:
# forms.py
from django import forms
class FileFieldForm(forms.Form):
    file_field = forms.FileField(widget=forms.ClearableFileInput(attrs={'multiple': True}))
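The same docs section pairs that form with a view that loops over request.FILES.getlist('file_field'); a rough sketch of that pattern (the template name and success URL are placeholders) could look like this:
# views.py - sketch of the multi-file handling view from the docs pattern
from django.views.generic.edit import FormView

class FileFieldFormView(FormView):
    form_class = FileFieldForm
    template_name = 'upload.html'  # placeholder template
    success_url = '/done/'         # placeholder redirect target

    def post(self, request, *args, **kwargs):
        form = self.get_form()
        files = request.FILES.getlist('file_field')
        if form.is_valid():
            for f in files:
                pass  # handle each uploaded file here
            return self.form_valid(form)
        return self.form_invalid(form)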
Your form and view structure is also very contrived, excluding fields from a form and then setting values injected via HTML form on the model instance. However, with the code shown, the model instance would never exist as the form has no pk field.
Anyway - to focus on the problem that needs fixing...
In the view, request.FILES now holds a list of files for that field:
class StudentUploadView(View):
    model = StudentUploadedFile
    max_files_per_lesson = 3

    def post(self, request, *args, **kwargs):
        lesson_data_id = request.POST['student_lesson_data_id']
        current_files_count = self.model.objects.filter(
            student_lesson_data_id=lesson_data_id
        ).count()
        avail = self.max_files_per_lesson - current_files_count
        file_list = request.FILES.getlist('student_file')
        if avail - len(file_list) < 0:
            return JsonResponse(data={
                'is_valid': False,
                'reason': f'Too many files: you can only upload {avail}.'
            })
        else:
            # create one new instance of self.model for each file
            ...
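A sketch of what that elided creation step could look like, reusing the field names from the question's model (one reasonable option, not the only one):
created_ids = []
for f in file_list:
    # one StudentUploadedFile per uploaded file, linked to the same lesson data
    instance = self.model.objects.create(
        student_lesson_data_id=lesson_data_id,
        student_file=f,
    )
    created_ids.append(instance.id)
return JsonResponse(data={'is_valid': True, 'file_ids': created_ids})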
Addressing comments:
From an aesthetic perspective, you can do a lot with styling...
However, uploading asynchronously (separate POST requests) complicates validation and user experience a lot:
The first file can finish after the second, so which one do you deny if the count goes above 3?
Frontend validation is hackable, so you can't rely on it, but backend validation is partitioned into several requests, which from the user's point of view is one action.
But with files arriving out of order, some succeeding and some failing, how are you going to provide feedback to the user?
If 1, 3 and 4 arrive, but the user cares more about 1, 2 and 3, the user has to take several actions to correct the situation.
With one POST request:
There are no out-of-order arrivals.
You can use an "everything fails or everything succeeds" approach, which is transparent to the end user and easy to correct.
It's likely that the file array order is the user's preferred order, so even if you allow partial success, you're likely to do the right thing.
So the gist of it is that you are using jQuery .each to upload images via AJAX. Each POST request to your Django view is a single file upload, but there might be multiple requests at the same time.
Try this:
forms.py:
class StudentUploadedFileForm(forms.ModelForm):
    class Meta:
        model = StudentUploadedFile
        fields = ('student_file', )

    def __init__(self, *args, **kwargs):
        """Accept a 'student_lesson_data' parameter."""
        self._student_lesson_data = kwargs.pop('student_lesson_data', None)
        super(StudentUploadedFileForm, self).__init__(*args, **kwargs)

    def clean(self):
        """
        Ensure that the total number of student_uploaded_file instances that
        are linked to the student_lesson_data parameter are within limits.
        """
        cleaned_data = super().clean()
        filecount = self._student_lesson_data.student_uploaded_file.count()
        if filecount >= 3:
            raise forms.ValidationError("Sorry, you cannot upload more than three files")
        return cleaned_data
views.py:
class StudentUploadView(View):
    def get(self, request):
        # stuff ...

    def post(self, request, *args, **kwargs):
        sld_id = request.POST.get('student_lesson_data_id', None)
        student_lesson_data = StudentLessonData.objects.get(id=sld_id)
        form = StudentUploadedFileForm(
            request.POST,
            request.FILES,
            student_lesson_data=student_lesson_data
        )
        if form.is_valid():
            uploaded_file = form.save()
            # other stuff ...

How to zip multiple uploaded file in Django before saving it to database?

I am trying to compress a folder before saving it to database/file storage system using Django. For this task I am using ZipFile library. Here is the code of view.py:
class BasicUploadView(View):
    def get(self, request):
        file_list = file_information.objects.all()
        return render(self.request, 'fileupload_app/basic_upload/index.html', {'files': file_list})

    def post(self, request):
        zipfile = ZipFile('test.zip', 'w')
        if request.method == "POST":
            for upload_file in request.FILES.getlist('file'):  # index.html name
                zipfile.write(io.BytesIO(upload_file))
                fs = FileSystemStorage()
                content = fs.save(upload_file.name, upload_file)
                data = {'name': fs.get_available_name(content), 'url': fs.url(content)}
            zipfile.close()
            return JsonResponse(data)
But I am getting the following error:
TypeError: a bytes-like object is required, not 'InMemoryUploadedFile'
Is there any solution for this problem? Since I may have to upload folders with large files, do I have to write a custom TemporaryFileUploadHandler for this purpose? I have recently started working with Django and it is quite new to me. Please help me with some advice.
An InMemoryUploadedFile is an object that contains more than just the file; you should open the file and read its content (InMemoryUploadedFile.file is the underlying file):
InMemoryUploadedFile.open()
You should open the file with open() and then read() its content. You should also check that the files have been uploaded correctly, and you can use the with syntax for both the zip archive and the file:
https://www.pythonforbeginners.com/files/with-statement-in-python
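To make the zipping itself work, one option is to read each uploaded file's bytes and write them into an in-memory archive with ZipFile.writestr; a minimal sketch (the helper name and the 'test.zip' default are just for illustration):
import io
from zipfile import ZipFile, ZIP_DEFLATED

from django.core.files.base import ContentFile
from django.core.files.storage import FileSystemStorage

def zip_uploaded_files(files, archive_name='test.zip'):
    # Pack the uploaded files into one zip archive and save it via FileSystemStorage.
    buffer = io.BytesIO()
    with ZipFile(buffer, 'w', ZIP_DEFLATED) as archive:
        for upload_file in files:
            # writestr() accepts bytes, so read the InMemoryUploadedFile content
            archive.writestr(upload_file.name, upload_file.read())
    fs = FileSystemStorage()
    saved_name = fs.save(archive_name, ContentFile(buffer.getvalue()))
    return saved_name, fs.url(saved_name)
In the view above it could be called as zip_uploaded_files(request.FILES.getlist('file')).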

How to access file after upload in Django?

I'm working on a web app. A user can upload a file in docx format. After they upload the file and choose which languages they want it translated into, I want to redirect them to another page where they can see prices for the translations. The prices depend on the particular language and the number of characters in the docx file.
I can't figure out how to handle the uploaded file. I have a function which takes the path to a file and returns the number of characters. After the file is uploaded and the user clicks submit, I want to call this function so I can render a new page with the estimated prices.
I've read that I can call temporary_file_path on request.FILES['file'] but it raises
'InMemoryUploadedFile' object has no attribute 'temporary_file_path'
I want to find out how many characters the uploaded file contains and send that in a request to another view, /order-estimation.
VIEW:
def create_order(request):
    LanguageLevelFormSet = formset_factory(LanguageLevelForm, extra=5, max_num=5)
    language_level_formset = LanguageLevelFormSet(request.POST or None)
    job_creation_form = JobCreationForm(request.POST or None, request.FILES or None)
    context = {'job_creation_form': job_creation_form,
               'formset': language_level_formset}
    if request.method == 'POST':
        if job_creation_form.is_valid() and language_level_formset.is_valid():
            cleaned_data_job_creation_form = job_creation_form.cleaned_data
            cleaned_data_language_level_formset = language_level_formset.cleaned_data
            for language_level_form in [d for d in cleaned_data_language_level_formset if d]:
                language = language_level_form['language']
                level = language_level_form['level']
                Job.objects.create(
                    customer=request.user,
                    text_to_translate=cleaned_data_job_creation_form['text_to_translate'],
                    file=cleaned_data_job_creation_form['file'],
                    short_description=cleaned_data_job_creation_form['short_description'],
                    notes=cleaned_data_job_creation_form['notes'],
                    language_from=cleaned_data_job_creation_form['language_from'],
                    language_to=language,
                    level=level,
                )
            path = request.FILES['file'].temporary_file_path
            utilities.docx_get_characters_number(path)  # THIS NOT WORKS
            return HttpResponseRedirect('/order-estimation')
        else:
            return render(request, 'auth/jobs/create-job.html', context=context)
    return render(request, 'auth/jobs/create-job.html', context=context)
The InMemoryUploadedFile does not provide temporary_file_path. The content lives 'in memory', as the class name implies.
By default, Django keeps files of up to 2.5 MB in memory as InMemoryUploadedFile; larger files go through the TemporaryFileUploadHandler, and only the latter provides the temporary_file_path method in question (see the Django documentation).
So an easy way would be to change your settings for FILE_UPLOAD_HANDLERS to always use TemporaryFileUploadHandler:
FILE_UPLOAD_HANDLERS = [
    'django.core.files.uploadhandler.TemporaryFileUploadHandler',
]
Just keep in mind that this is not the most efficient way when you have a site with a lot of concurrent small upload requests.
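With that handler forced, the view from the question can resolve the on-disk path roughly like this (a sketch; docx_get_characters_number is the asker's own utility, and note that temporary_file_path is a method, not an attribute):
uploaded = request.FILES['file']
path = uploaded.temporary_file_path()  # available once TemporaryFileUploadHandler handled the upload
character_count = utilities.docx_get_characters_number(path)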

Populate model with metadata of file uploaded through django admin

I have two models, Foto and FotoMetadata. Foto has just one property, called upload, which is a file upload field. FotoMetadata has a few properties and should receive metadata from the photo uploaded to Foto. This can be done manually in the admin interface, but I want to do it automatically, i.e. when a photo is uploaded through the admin interface, FotoMetadata is automatically filled in.
In my model.py I have a few classes, including Foto and FotoMetadata:
class Foto(models.Model):
    upload = models.FileField(upload_to="fotos")

    def __str__(self):
        return '%s' % (self.upload)

class FotoMetadata(models.Model):
    image_formats = (
        ('RAW', 'RAW'),
        ('JPG', 'JPG'),
    )
    date = models.DateTimeField()
    camera = models.ForeignKey(Camera, on_delete=models.PROTECT)
    format = models.CharField(max_length=8, choices=image_formats)
    exposure = models.CharField(max_length=8)
    fnumber = models.CharField(max_length=8)
    iso = models.IntegerField()
    foto = models.OneToOneField(
        Foto,
        on_delete=models.CASCADE,
        primary_key=True,
    )
When I log in to the admin site, I have an upload form related to Foto, and this is working fine. My problem is that I can't insert metadata into FotoMetadata on the go. I made a function that parses the photo and gives me a dictionary with the info I need. This function, called GetExif, is in a file called getexif.py. Here is a simplified version of it:
def GetExif(foto):
    # Open image file for reading (binary mode)
    f = open(foto, 'rb')
    # Parse file
    ...
    <parsing code>
    ...
    f.close()
    # create dictionary to receive data
    meta = {}
    meta['date'] = str(tags['EXIF DateTimeOriginal'].values)
    meta['fnumber'] = str(tags['EXIF FNumber'])
    meta['exposure'] = str(tags['EXIF ExposureTime'])
    meta['iso'] = str(tags['EXIF ISOSpeedRatings'])
    meta['camera'] = str(tags['Image Model'].values)
    return meta
So, basically, what I'm trying to do is use this function in admin.py to automatically populate FotoMetadata when uploading a photo to Foto, but I really couldn't figure out how to do it. Does anyone have a clue?
Edit 24/03/2016
Ok, after a lot more failures, I'm trying to use save_model in admin.py:
from django.contrib import admin
from .models import Autor, Camera, Lente, Foto, FotoMetadata
from fotomanager.local.getexif import GetExif

admin.site.register(Autor)
admin.site.register(Camera)
admin.site.register(Lente)
admin.site.register(FotoMetadata)

class FotoAdmin(admin.ModelAdmin):
    def save_model(self, request, obj, form, change):
        # populate the model
        obj.save()
        # get metadata
        metadados = GetExif(obj.upload.url)
        # Create instance of FotoMetadata
        fotometa = FotoMetadata()
        # FotoMetadata.id = Foto.id
        fotometa.foto = obj.pk
        # save exposure
        fotometa.exposure = metadados['exposure']

admin.site.register(Foto, FotoAdmin)
I thought it would work, or that I would at least run into problems saving data to the model, but actually I got stuck before that. I got this error:
Exception Type: FileNotFoundError
Exception Value:
[Errno 2] No such file or directory: 'http://127.0.0.1:8000/media/fotos/IMG_8628.CR2'
Exception Location: /home/ricardo/Desenvolvimento/fotosite/fotomanager/local/getexif.py in GetExif, line 24
My GetExif function can't read the file; however, the file path is right! If I copy and paste it into my browser, it downloads the file. I'm trying to figure out a way to correct the address, or to pass the internal path, or to pass the actual file to the function instead of its path. I'm also thinking about a different way to access the file inside the GetExif() function. Any idea how to solve it?
Solution
I solved the problem above! By reading the FileField source, I found a property called path, which solves the problem. I also made a few other modifications and the code is working. The FotoAdmin class in admin.py now looks like this:
class FotoAdmin(admin.ModelAdmin):
    def save_model(self, request, obj, form, change):
        # populate the model
        obj.save()
        # get metadata
        metadados = GetExif(obj.upload.path)
        # Create instance of FotoMetadata
        fotometa = FotoMetadata()
        # FotoMetadata.id = Foto.id
        fotometa.foto = obj
        # set and save exposure
        fotometa.exposure = metadados['exposure']
        fotometa.save()
I also had to set null=True on some fields in models.py, and everything is working as it should.
I guess you want to enable a post_save signal.
Read: Django signals
Activate the post_save signal so that after you save a Foto you have a hook to do other stuff; in your case, parse the photo metadata and create a FotoMetadata instance.
Moreover, if you want to save the Foto only if the FotoMetadata succeeds, or under any other condition, you can use the pre_save signal and save the Foto only after its metadata has been saved.
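A minimal sketch of the post_save approach, reusing GetExif and the models from the question (only exposure is filled in, mirroring the asker's own solution; the other fields would follow the same pattern):
# signals.py (or models.py) - a sketch, not a drop-in implementation
from django.db.models.signals import post_save
from django.dispatch import receiver

from .models import Foto, FotoMetadata
from fotomanager.local.getexif import GetExif

@receiver(post_save, sender=Foto)
def create_foto_metadata(sender, instance, created, **kwargs):
    if created:
        metadados = GetExif(instance.upload.path)
        fotometa = FotoMetadata(foto=instance)
        fotometa.exposure = metadados['exposure']
        # date, camera, format, fnumber and iso would be set the same way
        fotometa.save()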