I am developing two web sites using the Django framework.
The catch is that one site shares part of its content with the other.
They both use different Amazon S3 buckets to store images, etc.
So for the site that shares content with the other one, I need to specify a different MEDIA_URL, but that seems impossible, because the 'upload_to' and 'storage' parameters of the FileField only influence the file being uploaded.
Is there any way to use another storage when displaying an image after it has been uploaded?
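For what it's worth, a FileField's storage backend also determines the URL the field renders with (file_field.url), not just where uploads land. A minimal sketch of that idea, assuming a recent django-storages S3 backend; the bucket and domain names are placeholders for the other site's bucket:

    from django.db import models
    from storages.backends.s3boto3 import S3Boto3Storage

    # Hedged sketch: point this field at the *other* site's bucket so both
    # uploads and rendered URLs use that bucket. Names are placeholders.
    shared_storage = S3Boto3Storage(
        bucket_name='other-site-bucket',
        custom_domain='other-site-bucket.s3.amazonaws.com',
    )

    class SharedImage(models.Model):
        # .url on this field is built by shared_storage, not the default storage
        image = models.ImageField(upload_to='shared/', storage=shared_storage)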
I have a Django app running on a server. Currently user uploads are stored on the server filesystem.
I already know how to set up S3 storage. What is the best way to migrate existing uploads to S3 without breaking the API, so that existing uploads remain available?
These files are served to the front end in two ways:
Directly, through a /media/path/to/upload endpoint:

    /media/<path>  django.views.static.serve

Through a viewset:

    /api/v1/users/<user_pk>/items/<item_pk>/attachments/<pk>/  project.app.views.ItemAttachmentsViewSet
Does the following make sense:
1. Change the storage backend.
2. Go through all model objects.
3. Save each model object so that the files get uploaded.
4. Have /media/path go to a new view that serves the files similarly to how ItemAttachmentsViewSet does it.
Or is there a better way?
The procedure outlined in the question was what I ended up doing, with the exception of step 4, which turned out to be unnecessary.
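A sketch of what steps 2-3 can look like in a one-off migration script, assuming the default storage has already been switched to S3 (step 1); the model and field names (ItemAttachment, file) are placeholders:

    from django.core.files.base import ContentFile
    from django.core.files.storage import FileSystemStorage
    from myapp.models import ItemAttachment  # hypothetical model

    # Read each file from the old filesystem location, then re-save it so
    # the new default backend (S3) receives the bytes.
    old_storage = FileSystemStorage()  # still points at MEDIA_ROOT

    for attachment in ItemAttachment.objects.all():
        name = attachment.file.name
        if not name or not old_storage.exists(name):
            continue
        with old_storage.open(name, 'rb') as f:
            data = f.read()
        attachment.file.save(name, ContentFile(data), save=True)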
I have a Django application running on GAE. The application uses a content folder which contains images and HTML snippets; that folder was uploaded to Google Cloud Storage. I would like to render an image in a static file using an img tag, and for that I need the image's URL. I have seen that when we set the permission to share a file publicly, it gives us a URL, but I don't want to share those files publicly: if I do, another application can use my files, and I don't want that. Is there any way to do this without logging in a user?
Sharing it publicly is the best way to go.
You could also base64-encode the image data when you render the template, which means the URL of the image will not be shown to the public on your page. Then you can obfuscate the image names in GCS. This way it's still public but hard to reach.
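A minimal sketch of the base64 idea; how you fetch the object's bytes server-side is left out, and the helper name is illustrative:

    import base64

    # Hedged sketch: inline the image as a data: URI so the GCS URL never
    # appears in the rendered page. image_bytes is assumed to hold the
    # object's contents, fetched server-side.
    def image_data_uri(image_bytes, mime='image/png'):
        encoded = base64.b64encode(image_bytes).decode('ascii')
        return 'data:{};base64,{}'.format(mime, encoded)

    # In the template: <img src="{{ data_uri }}">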
I am working on a Django app to store user pics and photos.
What is the optimal approach to storing individual user media?
File sizes are no more than 5 MB.
The data is persistent.
The approach I have in mind is:
On form data submission, upload the file to an FTP server using django-storages.
Store the URL and fetch the file via HTTP later for the user.
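For reference, a sketch of what that could look like with django-storages' FTP backend; the host, credentials, and URLs are placeholders:

    # settings.py: hedged sketch; host, credentials, and URLs are placeholders.
    DEFAULT_FILE_STORAGE = 'storages.backends.ftp.FTPStorage'
    FTP_STORAGE_LOCATION = 'ftp://user:password@ftp.example.com:21/media/'
    # Files are then fetched back over HTTP from wherever that directory
    # is exposed:
    MEDIA_URL = 'http://media.example.com/'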
How to save uploaded files to another server
I have seen the answers and I don't know what type of queue needs to be used.
you'd usually save the file locally and then later upload it to some cloud service asynchronously, preferably using something like django-celery
see this answer
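A minimal sketch of that pattern, assuming Celery is already configured; the FTP transfer, host, and credentials are stand-ins for whatever actually pushes the file to the other server:

    import ftplib
    from celery import shared_task

    @shared_task
    def push_upload(path, remote_name):
        # Hedged sketch: the view saves the file locally and queues this
        # task; host and credentials are placeholders.
        with ftplib.FTP('media.example.com', 'user', 'password') as ftp, \
                open(path, 'rb') as f:
            ftp.storbinary('STOR ' + remote_name, f)

    # In the view, after the local save:
    # push_upload.delay(instance.file.path, instance.file.name)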
I'm developing a Django 1.4 application, and I'm using django-filepicker to upload images. It provides a special model field for images called FPFileField.
I'm interested in storing those images in a MongoDB database instead of uploading them to a "media" folder or something similar.
Any idea how this can be achieved?
I've tried mongoengine with no luck.
I would recommend storing the URL in your mongo Document directly, making use of the filepicker.io URL and its built-in S3 support, rather than storing the file on your servers.
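A sketch of that with mongoengine; the class and field names are illustrative:

    import mongoengine

    # Hedged sketch: store only the filepicker.io URL, not the file bytes.
    class Photo(mongoengine.Document):
        owner = mongoengine.StringField(required=True)
        filepicker_url = mongoengine.URLField(required=True)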
You should look into GridFS
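For completeness, a minimal GridFS round trip with pymongo; the connection and database names are placeholders:

    import gridfs
    from pymongo import MongoClient

    db = MongoClient()['mydb']   # placeholder connection/database
    fs = gridfs.GridFS(db)

    # Store the raw bytes, then read them back by id.
    file_id = fs.put(b'...image bytes...', filename='photo.jpg')
    data = fs.get(file_id).read()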
I'd also consider uploading it to Amazon's S3 service, depending on what your requirements are.
My setup is: Django 1.3/Python 2.7.2/Win Server 2008 R2/IIS 7.5/MS SQL Server 2008 R2. I am developing an application whose main function is to analyze uploaded files and produce a report.
Reading over the documentation for django-filetransfers, I believe it is a solution to a problem I've been trying to solve for a while (i.e. form-based file uploads completely block all Django responses until the file transfer finishes... a horror for even moderate-sized files).
The documentation talks about piping uploads to S3 or Blobstore, and that might be what I end up doing eventually, but during development I thought maybe I could just set up my own "poor-man's S3" on a server that I control. This would basically just be another Django instance (or possibly a simple ASP.NET app) whose sole purpose is to receive uploaded files. This sounds like it should be possible with django-filetransfers and would solve the problem of Django responsiveness (???).
But I am missing some bits of understanding how this works in general, as well as some specifics. Maybe an example will help: let's say I have MyMainDjangoServer and MyFileUploadServer. MyMainDjangoServer will serve the views, including the upload form. MyFileUploadServer will "catch" the uploaded files. My questions/confusion are as follows:
My upload form will contain additional fields beyond just the file(s). Do I understand correctly that MyMainDjangoServer will somehow still get that form data, minus the file data (basically: request.POST), while the file data gets shunted over to MyFileUploadServer? How does this work? Will MyMainDjangoServer still block during the upload to MyFileUploadServer?
I assume that what I would need to do on MyFileUploadServer is have a view/URL that handles the form request and sucks out the request.FILES data. What else needs to happen? What happens to the rest of the form data?
How would I set up my settings.py for this scenario? The django-filetransfers examples seem to assume either S3 or GAE/Blobstore but maybe I am missing some basics.
Any advice/answers appreciated...this is a confusing and frustrating area of Django for me.
"MyMainDjangoServer will somehow still get that form data, minus the file data (basically: request.POST), and the file data gets shunted over to MyFileUploadServer? How does this work? Will MyMainDjangoServer still block during the upload to MyFileUploadServer?"
I know the GAE Blobstore, and presumably S3 as well, handles this by requiring you to give it a success_url. In your case that would be the URL on MyMainDjangoServer to which your file-receiving view on MyFileUploadServer would re-post the non-file form data once the upload is complete.
Have a look at the create_upload_url method here: https://developers.google.com/appengine/docs/python/blobstore/functions
You need to recreate this functionality in some form (see below).
"How would I set up my settings.py for this scenario?"
You'd need to create your own filetransfers backend which would be a file with a prepare_upload function in it.
You can see the App Engine one here:
https://github.com/django-nonrel/djangoappengine/blob/develop/storage.py
The prepare_upload method just wraps the GAE create_upload_url method mentioned above.
So in your settings.py you'd have something like:
    PREPARE_UPLOAD_BACKEND = 'myapp.filetransfers_backend.prepare_upload'
(i.e. the import path to your prepare_upload function)
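A bare-bones sketch of what such a backend file could contain; the MyFileUploadServer endpoint is a placeholder, and the (upload URL, extra POST data) return shape follows the default filetransfers backends:

    # myapp/filetransfers_backend.py: hedged sketch, endpoint is a placeholder.
    def prepare_upload(request, url, private=False, **kwargs):
        # Point the form's action at MyFileUploadServer and tell it where
        # to send the browser (and the non-file form data) afterwards.
        upload_url = ('https://myfileuploadserver.example.com/receive/'
                      '?success_url=' + url)
        return upload_url, {}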
For the rest you can start with the ones provided by filetransfers already:
    SERVE_FILE_BACKEND = 'filetransfers.backends.url.serve_file'
    # if you need it:
    PUBLIC_DOWNLOAD_URL_BACKEND = 'filetransfers.backends.url.public_download_url'
These rely on file_field.url being set (see the Django docs), and since your files will be on a separate server, you probably need to look into writing a custom storage backend for Django too (the S3 and GAE cases assume you're using the custom Django storage backends from here).
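A skeletal custom storage, just to show the shape; every method body here is a placeholder for calls against MyFileUploadServer:

    from django.core.files.storage import Storage

    # Hedged skeleton: files live on MyFileUploadServer; each method would
    # wrap that server's API. URLs and behavior are placeholders.
    class UploadServerStorage(Storage):
        def _open(self, name, mode='rb'):
            raise NotImplementedError('fetch %s from MyFileUploadServer' % name)

        def _save(self, name, content):
            # push content to MyFileUploadServer, then return the stored name
            return name

        def exists(self, name):
            return False  # placeholder: ask MyFileUploadServer instead

        def url(self, name):
            # this is what file_field.url resolves to
            return 'https://myfileuploadserver.example.com/media/' + name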