"NotImplementedError at /myfile/download/8" while downloading zip file in Django - django

I am trying to download a zip file that contains multiple files in the same or different formats.
The zip file is downloaded after clicking a "Download" button.
This functionality works perfectly on the local development server, but after deploying the web app on Google Cloud it throws:
"NotImplementedError at /myfile/download/8"
This backend doesn't support absolute paths.
...
Cloud Storage has the corresponding path with the file, so why is it not working?
Everything works fine on my local machine but fails in production. Why?
Please help! Thanks in advance.

I think what's flagging this error is calling the path() method of the FileSystemStorage class, or the url() method through which the contents of the file referenced by name can be accessed, and then using absolute paths to download files. From the Django documentation:
"For storage systems that aren't accessible from the local filesystem,
this will raise NotImplementedError instead."
You should avoid saving to absolute paths; Django has a File Storage API that abstracts these operations for you. Looking at the documentation, the save() method accepts a file-like object instead of a path.
Also, if you need to serve a zip file download in Django, you just need to serve it as an attachment using Django's HttpResponse:

from django.http import HttpResponse

def zip_file_view(request):
    with open('/path/to/your/zipfile.zip', 'rb') as f:
        response = HttpResponse(f.read(), content_type='application/zip')
    response['Content-Disposition'] = 'attachment; filename="any_name_you_like.zip"'
    return response
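If the individual files themselves live in remote storage, you can also build the zip in memory with the standard library and avoid filesystem paths altogether (a sketch; the file names and contents here are placeholders):

```python
import io
import zipfile

def build_zip(files):
    """Pack a {name: bytes} mapping into an in-memory zip, return raw bytes."""
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, 'w', zipfile.ZIP_DEFLATED) as archive:
        for name, content in files.items():
            archive.writestr(name, content)
    return buffer.getvalue()

zip_bytes = build_zip({'a.txt': b'first file', 'b.csv': b'x,y\n1,2\n'})
# In the view: HttpResponse(zip_bytes, content_type='application/zip')
```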

Related

Django google app engine No such file or directory

I have a Django 2.x / Python 3.6 site in Google Cloud; the app runs on App Engine flex (my first app :)
My app has an upload page where I ask the user to upload a JSON file (which is never kept on the site); I open it and generate another file from it.
I know that, depending on the file's size, Django keeps the upload in memory, but I was never able to use that functionality. So in my local environment I created a folder called temp_reports, generated the files there, uploaded them into a bucket, and then deleted them from temp_reports.
Since the site is already in gcloud, I was wondering: can I create these files directly in the bucket, or do I still need to generate them on the site and then upload them?
Now if it is from my site I keep getting the following error:
Exception Value:
[Errno 2] No such file or directory: '/home/vmagent/app/temp_reports/file_516A3E1B80334372ADB440681BB5F030.xlsx'
I have this in my app.yaml:
handlers:
- url: /temp_reports
static_dir: temp_reports
Is there something I am missing in order to use temp_reports?
Or how can I create a file directly in my bucket?
You can certainly use the Storage Bucket without having to upload the file manually. This can be done with the Google Cloud Storage client library (the preferred method), which allows you to store and retrieve data directly from the bucket. Alternatively, you can use the Cloud Storage API for the same functionality, but it requires more effort to set up.
You want to use the upload_from_string method from google.cloud.storage.blob.Blob.
upload_from_string(data, content_type='text/plain', client=None, predefined_acl=None)
So, to create a text file directly in the bucket, you could do this:

from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.get_bucket('mybucket')
blob = bucket.blob('mytextfile.txt')
blob.upload_from_string('Text file contents', content_type='text/plain')
For more information you can refer to the following page:
https://googleapis.github.io/google-cloud-python/latest/storage/blobs.html#google.cloud.storage.blob.Blob.upload_from_string
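For the report use case specifically, you can skip temp_reports entirely by generating the file into an in-memory buffer and handing that to the blob. A sketch, with plain bytes standing in for the real spreadsheet output, and the upload call left as a comment because it needs bucket credentials to run:

```python
import io

# Generate the report into memory instead of temp_reports/ on disk.
buffer = io.BytesIO()
buffer.write(b'report contents here')  # e.g. workbook.save(buffer) with openpyxl
buffer.seek(0)

# Then, with the client library from above (needs credentials to run):
# blob = bucket.blob('reports/myreport.xlsx')
# blob.upload_from_file(buffer, content_type='application/vnd.openxmlformats-officedocument.spreadsheetml.sheet')
```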

How can I upload and download files with graphene-django?

I'm currently using graphene-django v2.0 and I have absolutely no clue how to upload and download files like images. Does anyone have an example of a query where you can download an image, and a mutation where you can upload one?
UPLOADS
You don't need to invent your own frontend code to add a file upload to a mutation -- there are existing packages that do this already. For example, apollo-upload-client if you are using Apollo.
To receive an uploaded file on the backend, the files are going to be available in the request.FILES dictionary. So any mutation handling a file upload needs to examine info.context.FILES to get and save the file data. The specifics of this code are going to depend on the ultimate destination of the saved file.
(UPDATE) However, if possible I would recommend not using graphene-django to upload files because it adds a large amount of complexity on both the backend and the frontend. My team ultimately scrapped our working graphene-django file upload code and replaced it with a standard Django file upload.
DOWNLOADS
For downloads, I would recommend not using GraphQL for the actual download. Instead, create a Django function view that returns an HttpResponse or FileResponse and sets the Content-Disposition header. Something like:

from django.http import HttpResponse

def download(request):
    # ... do stuff to get file_contents, file_name and mime_type
    response = HttpResponse(file_contents, content_type=mime_type)
    response['Content-Disposition'] = 'attachment; filename="{}"'.format(file_name)
    return response
Then add this download path to your urls.py and to a GraphQL query response. So GraphQL would be used to get the download path, but actually downloading the file would be a regular Django page.
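One detail worth handling in that view: filenames containing quotes or non-ASCII characters can break the Content-Disposition header. A small stdlib-only sketch of RFC 6266-style escaping (the helper name is mine, not part of Django):

```python
from urllib.parse import quote

def content_disposition(file_name):
    """Build an attachment Content-Disposition value that survives
    quotes and non-ASCII characters (RFC 6266 filename* fallback)."""
    ascii_name = file_name.encode('ascii', 'ignore').decode() or 'download'
    ascii_name = ascii_name.replace('"', '')
    return "attachment; filename=\"{}\"; filename*=UTF-8''{}".format(
        ascii_name, quote(file_name))

print(content_disposition('résumé.pdf'))
# attachment; filename="rsum.pdf"; filename*=UTF-8''r%C3%A9sum%C3%A9.pdf
```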

Serve static HTML in Django

I’m pretty new to Django so forgive me if this is something I shouldn’t even consider, but I’d like my app to be able to link to a large number of static HTML pages (enough that creating URL paths to each one would be unreasonable) and there would constantly be more being uploaded (by me via FTP).
I’ve got this working on the development server by adding the path to those HTML files to my STATICFILES_DIRS, but that doesn’t seem to work when I push my code to production.
I tried setting up a STATIC_ROOT and running collectstatic but that didn’t help, and I’m also worried that even if I got that working, I would have to run collectstatic each time I uploaded new files.
So my question is, is this even reasonable? Should I just avoid hosting these static HTML files alongside my project and leave them where they are now, on a separate server under a separate domain?
The only reason I wanted to host them together initially is that along with the static HTML files there is an SQLite database that my Django app displays data from (this is the whole purpose of my app). However, I could implement another method of getting that SQLite file, like using ftplib. Note that I don’t need that database to connect to Django at all - I just need to read data from it.
You don't need to write urls for every page. You can "capture" the requested page name from the url and render the page according to its value.
# urls.py
from django.conf.urls import url

url(r'^page/(?P<page_name>\w+)/$', my_view)

# views.py
import os

from django.http import HttpResponse, Http404

FTP_UPLOAD_DIR = '/path/to/directory/where/you/upload/files/'

def my_view(request, page_name):
    # check if the requested page exists
    if os.path.exists(FTP_UPLOAD_DIR + page_name):
        # if yes, serve the page
        with open(FTP_UPLOAD_DIR + page_name) as f:
            response = HttpResponse(f.read())
        return response
    else:
        raise Http404
Above, we are reading the file directly from the upload folder, so there's no need for you to run collectstatic.
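The \w+ pattern in the URL already blocks path separators, but if you ever loosen that pattern, it is worth guarding against path traversal explicitly. A stdlib-only sketch (the directory is a placeholder):

```python
import os

def is_safe_page(upload_dir, page_name):
    """Reject page names like '../../etc/passwd' that escape upload_dir."""
    full_path = os.path.realpath(os.path.join(upload_dir, page_name))
    return full_path.startswith(os.path.realpath(upload_dir) + os.sep)

print(is_safe_page('/srv/uploads', 'page1.html'))     # True
print(is_safe_page('/srv/uploads', '../etc/passwd'))  # False
```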

How to upload file in django google app engine?

I am new to Google App Engine. I have deployed a pure Django application on Google App Engine and it is working fine. But I wrote the file upload functionality shown below:
def forhandler(list, project_id, task_id, request):
    for i in list:
        filename = i
        extension = filename.content_type
        print i
        newdoc = FileUpload(project_id=project_id, task_id=task_id,
                            file=filename, filetype=extension)
        newdoc.save()
When I run ./manage.py runserver, the functionality above works correctly and uploads files. When I run it under Google App Engine (dev_appserver.py my_project), everything is fine until I upload a file using the functionality above, which gives the error shown below:
Exception Value: [Errno 30] Read-only file system: u'/home/nyros/Desktop/projectstat/projectstat/media/documents/2013/05/24/1354676051_chasm_fishing_w1.jpeg'
How do I upload files using Django with Google App Engine? Please help me solve this. Thanks.
The problem here is that App Engine uses a read-only filesystem, and the default Django file upload mechanism wants to store files on disk. (Regardless of whether you're using App Engine or not, storing images in an ordinary database is a bad idea.)
Instead, you should use AppEngine's Blobstore API to save the image. This is special storage App Engine provides for storing large data uploaded by users.
The good news is there's a plugin that takes care of all of this for you: http://www.allbuttonspressed.com/projects/django-filetransfers
Just follow the instructions there and you should be in business.
(Note: I've only tried this with django-nonrel. I've never tried it with vanilla Django.)
The best way to upload files in Google App Engine with Python is to use a BlobstoreUploadHandler:

class Upload(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        for upload in self.get_uploads():
            try:
                img = Image()
                img.primary_image = upload.key()
                img.put()
            except Exception:
                pass

Django private file upload

I would like admin users to be able to attach private arbitrary files related to my models, à la SharePoint. As Django is primarily used as a publication platform, all of the examples I've found so far upload files to the static directory, where they are publicly accessible. How can I allow an admin to upload files that have the same auth permissions as the model they are related to?
What you can do is configure a location in your web server that is internal-only, and have your Django application apply its business logic and then return a response with a special header that tells the web server to serve the file statically itself.
I have previously done this with Nginx using X-Accel-Redirect so I am going to use that for my example, but I believe there is equivalent functionality in Apache2 and other web servers (X-Sendfile, I think?).
In your Nginx config, set up a location that serves the directory where you are uploading your access-protected files:
location /protected/ {
    internal;
    alias /var/www-priv/;
}
Files in this directory will not be accessible externally at the URL /protected/{filepath}, but they will be if you return a response from your Django application with the header X-Accel-Redirect = /protected/{filepath}.
Create a view with a URL like /media/{filepath} in which you perform the necessary business logic to control access to the file. (You may want to make the path params slightly more detailed so you can capture the app label, model and object ID the file is attached to, for access-control purposes, e.g. /media/{app_label}/{model}/{object_id}/{filename}.)
You then just do:

response = HttpResponse()
response['X-Accel-Redirect'] = "/protected" + filepath
return response
and Bob's your uncle - the user will be served the protected file.
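For Apache, the equivalent (assuming mod_xsendfile is installed; I have only done this with Nginx myself) would be to enable X-Sendfile for the protected directory and have Django set an X-Sendfile header instead:

```apache
# httpd.conf / virtual host config (mod_xsendfile)
XSendFile On
XSendFilePath /var/www-priv
```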
I think the most powerful way to do this is writing a custom file storage.
It's not so difficult (but may be overkill for your needs). Then you can bind your custom storage to your model like this:
from django.db import models
from django.db.models.fields.files import FileField
from myapp.storage import MyCustomStorage

class MyModel(models.Model):
    path = FileField( ... , storage=MyCustomStorage(), blank=False)
Then you can implement the business logic in your custom storage class. This way, you can store your private files on a local filesystem, in a database, or in a remote system like Google App Engine.
Cheers!