I have a website where I allow people to download files, such as PowerPoint templates that they can reuse.
I added a column called 'Downloads' to the model that stores the information and location of these files, and I'd like to use it as a counter to track how many times each file has been downloaded.
How would I go about this? Would it be a GET request on the page? Basically I just provide a link to the file and the download starts automatically, so I'm not sure how to relay the fact that the link was clicked back to the model.
Many thanks :-)
If you create a view that handles the GET request, you could put the updating code in that view. If your Django app is not serving the file itself, you can create a view that simply redirects to the download link after updating the counter in the database.
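For example, a minimal counting view might look like this (Document, file_url and the myapp path are assumptions for illustration, not names from your project):

    from django.db.models import F
    from django.shortcuts import get_object_or_404, redirect

    from myapp.models import Document  # hypothetical model

    def download(request, pk):
        doc = get_object_or_404(Document, pk=pk)
        # F() performs the increment inside the database, so concurrent
        # downloads can't overwrite each other's counts.
        Document.objects.filter(pk=pk).update(downloads=F('downloads') + 1)
        return redirect(doc.file_url)

Your template then links to this view instead of the file itself, and the browser still starts the download automatically after the redirect.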
You have several ways to do this:
create a custom view that "proxies" all files while keeping track of downloads
create a middleware that does pretty much the same as above, filtering which requests to record
...but neither of the above will work if you want to count downloads of collected static files that are served directly by your HTTP server without ever reaching Django. In that case, I'd retrieve the download count from the web server logs: check whether your web server can log directly to a database; otherwise, create a cron-run script that parses the log files and stores the counts in a database accessible from your Django application (a rough sketch follows).
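A rough sketch of such a log-parsing script, assuming the common "combined" log format and that downloads live under /downloads/ (both assumptions to adjust):

    import re
    from collections import Counter

    LOG_PATH = '/var/log/apache2/access.log'  # assumed location
    # Count only successful GETs for files under /downloads/.
    PATTERN = re.compile(r'"GET (/downloads/\S+) HTTP/[\d.]+" 200 ')

    counts = Counter()
    with open(LOG_PATH) as log:
        for line in log:
            match = PATTERN.search(line)
            if match:
                counts[match.group(1)] += 1

    # Persist the counts however suits you: a standalone Django script
    # (after django.setup()) or a plain SQL UPDATE per path.
    for path, total in counts.items():
        print(path, total)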
Like redShadow said, you can create a proxy view. This view could serve the files via mod_xsendfile (if you are using Apache as your web server) and increment a counter for the downloads.
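A rough sketch of such a proxy view, again with hypothetical model and field names, assuming mod_xsendfile is enabled for the directory holding the files:

    from django.db.models import F
    from django.http import HttpResponse
    from django.shortcuts import get_object_or_404

    from myapp.models import Document  # hypothetical model

    def download(request, pk):
        doc = get_object_or_404(Document, pk=pk)
        Document.objects.filter(pk=pk).update(downloads=F('downloads') + 1)
        response = HttpResponse()
        # Apache intercepts this header and streams the file itself,
        # so the Django worker is released immediately.
        response['X-Sendfile'] = doc.file_path  # absolute path on disk
        response['Content-Disposition'] = 'attachment; filename="%s"' % doc.filename
        return response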
I am creating a forum using Gatsby
I have developed a form that users can use to create threads to add to the forum, in a page called create.js, which sends the data to an external DB.
Once the user has submitted the thread, I want to create a new page using a template. Normally I would use createPage in gatsby-node.js, but according to the Gatsby docs, gatsby-node.js is only run once, on deployment.
Is there another way I can access createPage() outside of gatsby-node.js, or is there another function I am missing?
Ultimately I want the new page to be available in the Gatsby application, without redeploying, after the user has created the necessary content.
The way Gatsby works is that all pages need to be generated at build time. You cannot add new pages without triggering a new build.
Gatsby is not a suitable platform for a forum since content changes hundreds or thousands of times a day. Gatsby is intended for content that changes infrequently such as blogs (which might update a few times a day).
In order to generate pages without using createPage in gatsby-node.js, Gatsby advises using @reach/router and matchPath to extend the client application's routes; Gatsby calls this functionality "client-only routes".
more info here
I am working on a Django app to store user pics and photos.
What is the optimal approach to storing individual user media?
File sizes are no more than 5 MB.
The data is persistent.
The approach I have in mind is:
On form data submission, upload it to an FTP server using django-storages.
Store the URL and fetch it via HTTP later for the user.
I have seen the answers to "How to save upload files to another server", but I don't know what type of queue needs to be used.
You'd usually save the file locally and then later upload it to some cloud service asynchronously, preferably using something like django-celery.
see this answer
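A rough sketch of that hand-off, assuming Celery is already configured and django-storages' FTP backend is in use (names are illustrative):

    from celery import shared_task
    from django.core.files.base import File
    from storages.backends.ftp import FTPStorage

    remote_storage = FTPStorage()  # reads FTP_STORAGE_LOCATION from settings

    @shared_task
    def push_to_remote(local_path, remote_name):
        # Stream the locally saved upload to the FTP server; afterwards
        # store remote_storage.url(remote_name) on the model so the file
        # can be fetched over HTTP later.
        with open(local_path, 'rb') as f:
            remote_storage.save(remote_name, File(f))

The view saves the upload to local disk, calls push_to_remote.delay(...), and returns immediately; the slow transfer happens in the worker.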
My setup is: Django 1.3/Python 2.7.2/Win Server 2008 R2/IIS 7.5/MS SQL Server 2008 R2. I am developing an application whose main function is to analyze uploaded files and produce a report.
Reading over the documentation for django-filetransfers, I believe this is a solution to a problem I've been trying to solve for a while (i.e. form-based file uploads completely block all Django responses until the file transfer finishes... a horror for even moderate-sized files).
The documentation talks about piping uploads to S3 or Blobstore, and that might be what I end up doing eventually, but during development I thought maybe I could just set up my own "poor-man's S3" on a server that I control. This would basically just be another Django instance (or possibly a simple ASP.NET app) whose sole purpose is to receive uploaded files. This sounds like it should be possible with django-filetransfers and would solve the problem of Django responsiveness (???).
But I am missing some bits of understanding how this works in general, as well as some specifics. Maybe an example will help: let's say I have MyMainDjangoServer and MyFileUploadServer. MyMainDjangoServer will serve the views, including the upload form. MyFileUploadServer will "catch" the uploaded files. My questions/confusion are as follows:
My upload form will contain additional fields beyond just the file(s)...do I understand correctly that MyMainDjangoServer will somehow still get that form data, minus the file data (basically: request.POST), and the file data gets shunted over to MyFileUploadServer? How does this work? Will MyMainDjangoServer still block during the upload to MyFileUploadServer?
I assume that what I would need to do on MyFileUploadServer is have a view/URL that handles the form request and sucks out the request.FILES data. What else needs to happen? What happens to the rest of the form data?
How would I set up my settings.py for this scenario? The django-filetransfers examples seem to assume either S3 or GAE/Blobstore but maybe I am missing some basics.
Any advice/answers appreciated...this is a confusing and frustrating area of Django for me.
"MyMainDjangoServer will somehow still get that form data, minus the file data (basically: request.POST), and the file data gets shunted over to MyFileUploadServer? How does this work? Will MyMainDjangoServer still block during the upload to MyFileUploadServer?"
I know the GAE Blobstore (and presumably S3 as well) handles this by requiring you to give it a success_url. In your case that would be the URL on MyMainDjangoServer to which your file-receiving view on MyFileUploadServer would re-post the non-file form data once the upload is complete.
Have a look at the create_upload_url method here: https://developers.google.com/appengine/docs/python/blobstore/functions
You need to recreate this functionality in some form (see below).
"How would I set up my settings.py for this scenario?"
You'd need to create your own filetransfers backend which would be a file with a prepare_upload function in it.
You can see the App Engine one here:
https://github.com/django-nonrel/djangoappengine/blob/develop/storage.py
The prepare_upload method just wraps the GAE create_upload_url method mentioned above.
So in your settings.py you'd have something like:
PREPARE_UPLOAD_BACKEND = 'myapp.filetransfers_backend.prepare_upload'
(i.e. the import path to your prepare_upload function)
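As a sketch (Python 2, matching your stack; the (url, data) return shape mirrors the bundled backends, but verify it against your filetransfers version), myapp/filetransfers_backend.py could look roughly like:

    import urllib

    # Placeholder address for MyFileUploadServer.
    UPLOAD_SERVER = 'https://files.example.com/upload/'

    def prepare_upload(request, url, private=False):
        # Point the upload form at the file server and tell it where to
        # re-post the non-file form data once the transfer completes.
        upload_url = UPLOAD_SERVER + '?' + urllib.urlencode({'success_url': url})
        return upload_url, {}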
For the rest you can start with the ones provided by filetransfers already:
SERVE_FILE_BACKEND = 'filetransfers.backends.url.serve_file'
# if you need it:
PUBLIC_DOWNLOAD_URL_BACKEND = 'filetransfers.backends.url.public_download_url'
These rely on file_field.url being set (see the Django docs), and since your files will be on a separate server, you probably need to look into writing a custom storage backend for Django too (the S3 and GAE cases assume you're using the custom Django storage backends from here).
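A bare-bones sketch of such a storage class (everything here is a placeholder, not filetransfers API):

    from django.core.files.storage import Storage

    class RemoteUploadStorage(Storage):
        base_url = 'https://files.example.com/media/'  # MyFileUploadServer

        def url(self, name):
            return self.base_url + name

        def exists(self, name):
            # Let the remote server own namespacing and overwrites.
            return False

        def _save(self, name, content):
            # The browser posts the bytes directly to MyFileUploadServer,
            # so locally there is nothing to write in this sketch.
            return name

        def _open(self, name, mode='rb'):
            raise NotImplementedError('Reads happen over HTTP via url()')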
OK, I know that serving media files through Django is not recommended. However, I'm in a situation where I'd like to serve "static" files with fine-grained access control through Django models.
Example: I want to serve my movie library to myself over the web. I'm often travelling and I'd like to be able to view any of my movies wherever I am, provided I have internet access. So I rip my DVDs, upload them to my server and build this simple Django application coupled with some embeddable video player.
To avoid any legal repercussions, I'd like to ensure that only logged-in users with the proper permissions (i.e. myself and people living in the same household, who can, like me, access the real DVDs at their convenience) can view the videos, while any other user (i.e. people who posted comments on my blog) is denied access and gets an HTTP 404.
Now, serving these files directly using Apache and mod_wsgi is rather troublesome because when an HTTP request for the media files (i.e. http://video.mywebsite.com/my-favorite-movie/) comes in, I need to validate against my user database that the person at the other end has the proper permissions.
Question: can I achieve this effect without serving the media files directly through a Django view? What are my options?
One thing I did think of is to write a simple script that takes a session ID and a video's slug and returns some boolean indicating if the user may (or may not) access the video file. Then, somehow request mod_wsgi to execute this script before accessing the requested URL and return an HTTP 404 if the script failed. However, I don't have a clue if this is even possible.
Edit: Posting this question clarified some of my ideas for search and I've come across mod_python's file wrapper extension. Does anyone have enough experience with that to validate that it is a viable solution?
Yes, you can hook into Django's authentication from Apache. See this how-to:
Authenticating against Django’s user database from Apache
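That how-to boils down to pointing Apache's WSGIAuthUserScript directive at a small script like this one ("mysite.settings" is a placeholder; on Django versions before 1.7, drop the django.setup() call):

    import os

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')

    import django
    django.setup()

    # Django ships this handler for exactly this purpose; mod_wsgi calls
    # check_password(environ, user, password) for each protected request.
    from django.contrib.auth.handlers.modwsgi import check_password  # noqa

Apache then serves the media files only to users who authenticate against your Django user table, without the bytes ever passing through Django.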
I have 50 different websites that use the same layout and code base, but mostly non-overlapping data (regional support sites, not a link farm). Is there a way to have a single installation of the code and run all 50 at the same time?
When I have a bug to fix (or a new feature to deploy), I want to deploy once, restart once, and be done with it.
Also:
Code needs to know what domain the request is coming to so the appropriate data is displayed.
The Sites framework comes to mind.
Apart from that we have Django running for multiple sites by symlinking Django to various docroots. Works like a charm, too.
I can see two quite distinct ways to do this:
Use one database and the sites framework. Every post/picture/whatever model is connected to a Site, and you always filter on the current site (see the sketch after this list). This requires a separate settings file for every site, each with its own SITE_ID.
Use one database for each and every site. This allows different users for every site, but requires duplication of everything that is stored in the database. It also requires a separate settings file pointing to the correct database.
Either way, you do not duplicate any code, only data.
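For the first option, the per-site wiring might look like this (Post is an illustrative model):

    from django.contrib.sites.models import Site
    from django.db import models

    class Post(models.Model):
        title = models.CharField(max_length=200)
        site = models.ForeignKey(Site, on_delete=models.CASCADE)

    # In a view: SITE_ID from the active settings file decides which
    # site's rows you see.
    posts = Post.objects.filter(site=Site.objects.get_current())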
--
If you need to make site-specific or post-specific changes to e.g. a template, you should read up on how Django loads templates. It allows you to specify a list, e.g. ["story_%d.html", "story_site_%d.html", "story.html"], and Django will look for the templates in that order (see the example below).
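For example, select_template picks the first template in the list that actually exists ("story" and "site" are assumed objects):

    from django.template.loader import select_template

    template = select_template([
        'story_%d.html' % story.id,
        'story_site_%d.html' % site.id,
        'story.html',  # generic fallback
    ])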
I just ran into this and ended up using a custom middleware class that:
Fetches the HTTP_HOST.
Cleans the HTTP_HOST (removes www, ports, etc.).
Looks up the domain in a Website table that's tied to each account.
Sets the account instance on the HttpRequest object.
Then throughout my view code I do lookups based on the account stored on the HttpRequest object. A sketch of that middleware follows.
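A sketch of that middleware (Website and its account field are hypothetical stand-ins for the real schema; older Django versions would use a process_request method instead):

    from django.http import Http404

    from myapp.models import Website  # hypothetical model

    class AccountMiddleware:
        def __init__(self, get_response):
            self.get_response = get_response

        def __call__(self, request):
            host = request.get_host().lower()   # e.g. "www.example.com:8000"
            host = host.split(':')[0]           # strip port
            if host.startswith('www.'):         # strip leading www
                host = host[4:]
            try:
                website = Website.objects.get(domain=host)
            except Website.DoesNotExist:
                raise Http404('Unknown domain')
            request.account = website.account   # available in every view
            return self.get_response(request)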
Hope that helps someone in the future.