One of my view functions is a very long processing job and clearly needs to be handled differently.
Instead of making the user wait a long time, it would be best if I could launch the processing job (which would email the results on completion) and, without waiting for it to finish, notify the user that their request is being processed and let them browse on.
I know I can use os.fork, but I was wondering if there is a 'right way' in terms of Django. Perhaps I can return the HTTP response and then carry on with the job somehow?
There are a couple of solutions to this problem, and the best one depends a bit on how heavy your workload will be.
If you have a light workload you can use the approach used by django-mailer, which is to define a "jobs" model, save new jobs into the database, then have cron run a stand-alone script every so often to process the jobs stored in the database (deleting them when done). You can use something like django-chronograph to make managing the job schedule easier.
If you need help understanding how to write a script to process the jobs, see James Bennett's article Standalone Django Scripts.
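As a rough sketch of the shape of that approach (the model and field names here are illustrative, not django-mailer's actual API):

    from django.db import models

    class Job(models.Model):
        payload = models.TextField()                       # whatever the task needs
        created = models.DateTimeField(auto_now_add=True)

    # process_jobs.py, run from cron (see the Standalone Django Scripts
    # article above for the settings boilerplate a stand-alone script needs):
    def process_jobs():
        for job in Job.objects.all():
            # ... do the heavy processing and email the results here ...
            job.delete()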
If you have a very high workload, meaning you'll need more than a single server to process the jobs, then you want a real distributed task queue. There is a lot of competition here so I can't really detail all the options, but a good one to use with Django apps is celery.
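For illustration, a minimal sketch of what the celery version might look like, using the modern shared_task API (the task name and email details are made up):

    # tasks.py -- the view calls .delay() and returns immediately,
    # while a celery worker does the heavy lifting.
    from celery import shared_task
    from django.core.mail import send_mail

    @shared_task
    def long_processing_job(email, data_id):
        # ... the heavy processing happens here, on a worker ...
        send_mail('Your results are ready', 'Results for job %s.' % data_id,
                  'noreply@example.com', [email])

    # in the view:
    # long_processing_job.delay(request.user.email, data_id)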
Why not simply start a thread to do the processing and then go on to send the response?
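A minimal sketch of that idea (the view and job names are illustrative); keep in mind the thread lives inside the web worker process, so it dies if the server restarts or the worker is recycled:

    import threading
    from django.http import HttpResponse

    def my_view(request):
        def long_job(email):
            pass  # ... heavy processing + emailing the results ...

        t = threading.Thread(target=long_job, args=(request.user.email,))
        t.daemon = True  # don't block interpreter shutdown
        t.start()        # returns immediately; the thread keeps running
        return HttpResponse("Your request is being processed.")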
Before you select a solution, you need to determine how the process will be run. That is, is it the same process for every single user, with the same data, that can be scheduled regularly? Or does each user request something different, with slightly different results?
As an example, if the data will be the same for every single user and can be run on a schedule you could use cron.
See: http://www.b-list.org/weblog/2007/sep/22/standalone-django-scripts/
or
http://docs.djangoproject.com/en/dev/howto/custom-management-commands/
However, if the requests will be ad hoc and you need something scalable that can handle high load and is asynchronous, what you are actually looking for is a message queuing system. Your view will add a request to the queue, which will then get acted upon.
There are a few options to implement this in Django:
Django Queue Service is purely Django & Python and simple, though the last commit was in April and the project seems to have been abandoned.
http://code.google.com/p/django-queue-service/
The second option, if you need something that scales, is distributed, and makes use of open-source message queuing servers: celery is what you need.
http://ask.github.com/celery/introduction.html
http://github.com/ask/celery/tree
Related
I am building an app where I need to fetch some data from an API and update all the models with that data every few minutes.
What would be a clean way to accomplish something like this?
Well, it's quite an open question.
You'll need to create a task that runs every few minutes; you can do this with Celery. Celery has a task scheduler (http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html) which will launch a given function at a configured time, similar to a crontab.
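As a sketch, assuming Celery 3.x-style settings (newer versions use app.conf.beat_schedule instead) and a hypothetical myapp.tasks.fetch_api_data task:

    # settings.py -- run the task every five minutes via celery beat
    from datetime import timedelta

    CELERYBEAT_SCHEDULE = {
        'fetch-api-data-every-5-minutes': {
            'task': 'myapp.tasks.fetch_api_data',   # hypothetical task path
            'schedule': timedelta(minutes=5),
        },
    }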
The task would then fetch the data; requests (http://docs.python-requests.org/en/master/) is a very good library for making HTTP requests.
Last but not least, you would need to serialize the fetched data and save it to your model. Django REST Framework's serialization capabilities are a great starting point, but if the data structure is simple enough you can just use Python's built-in json library (json.loads(data)) and write a function that translates the fields of the API to the fields of the model.
By the way, I'm supposing a REST API.
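Putting the pieces together, a rough sketch; the endpoint, model and field mapping are all assumptions:

    # tasks.py -- fetch JSON from the (assumed REST) API and update the model
    import requests
    from myapp.models import Item   # hypothetical app and model

    def fetch_api_data():
        response = requests.get('https://api.example.com/items/')  # assumed endpoint
        for entry in response.json():   # equivalent to json.loads(response.text)
            # translate API fields to model fields (this mapping is made up)
            Item.objects.update_or_create(
                external_id=entry['id'],
                defaults={'name': entry['name']},
            )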
You can use a task management tool that has the feature of running periodic tasks in the intervals you specify, like Periodic Tasks in Celery.
Also, if you run your code on a Unix-like system, you can stick with core Django functionality. Just write your functionality as a Django management command and set a cronjob to run it at your preferred interval.
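A minimal sketch of such a management command (the command and path names are illustrative):

    # myapp/management/commands/update_data.py
    from django.core.management.base import BaseCommand

    class Command(BaseCommand):
        help = 'Fetch data from the API and update the models'

        def handle(self, *args, **options):
            # ... fetch and save logic goes here ...
            self.stdout.write('Done.')

    # crontab entry to run it every five minutes:
    # */5 * * * * /path/to/env/bin/python /path/to/manage.py update_data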
I'm making a web app using Django.
I'd like events to trigger 'background' tasks that run parallel to the Django application (by parallel I just mean that they don't impact the speed of the user's experience).
Types of tasks I'm talking about:
a user logs in and an event is triggered to start populating that user's cache in anticipation of future requests based on their usage habits.
a user posts some data to the database, but that post triggers an API call to another website, where the returned data will be parsed, aggregated and used to supplement that user's post.
rolling updates of data used in the app through API calls to other websites
aggregating data and running general maintenance tasks.
After a few days of research I'm thinking that I should use Twisted to accomplish this, which has led me to my question:
Is Twisted overkill for what I'm trying to accomplish?
Many of these tasks are far more I/O-bound than CPU-bound, so I'm thinking asynchronous is best.
Any advice would be appreciated.
Thank you
Yes, I think it's overkill.
Rather than fold in a full async framework such as Twisted, with all the technical overhead that brings, you might be better off using a task queue to be able to do what you want as a background process.
When your app needs to do a background task (anything that would otherwise block the request/response cycle), put the task in the queue and let a separate worker process pick things off the queue and deal with them as fast as it can. (You can always add more workers).
Two of the most popular queue libraries for Python/Django are celery and rq. They're especially good with Redis as a backend, but there are other backend options, too.
Personally, I much prefer rq over celery, in terms of its API and its clean setup, but both are used by a lot of people.
And both are definitely easier to get your head around than something like Twisted, IMO.
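For illustration, a hedged sketch of the rq version (it assumes a local Redis server and a worker process started with `rq worker`; the job module is hypothetical):

    import redis
    from django.http import HttpResponse
    from rq import Queue

    from myapp.jobs import long_job   # must be importable by the worker, too

    q = Queue(connection=redis.Redis())   # assumes Redis on localhost

    def my_view(request):
        q.enqueue(long_job, request.user.id)   # a separate worker runs it
        return HttpResponse("Working on it.")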
I have these tables
exam(id, start_date, deadline, duration)
exam_answer(id, exam_id, answer, time_started, status)
exam_answer.status possible values: 0 = not yet started, 1 = started, 2 = submitted
Is there a way to update exam_answer.status once now - exam_answer.time_started is greater than exam.duration? Or once it is already past the deadline?
I'll also mention, in case it helps, that I'm building this for a Django project.
Django applications, like any other WSGI/web application, are only meant to handle request-response flows. If there aren't any requests, there is no activity and such changes will not happen.
You can write a custom management command that's executed periodically by a cron job, but between runs you risk displaying incorrect data. Alternatively, you have elegant means at your disposal to compute the statuses on the fly before any related views start their processing, but this might be a wasteful use of resources.
Your best bet might be to integrate a task scheduler with your application, such as Celery. Don't be discouraged because Celery seemingly runs in a concurrent multiprocess environment across several machines; the service can be configured to run in a single thread, and it provides a clean interface for scheduling tasks that have to run at some exact point in the future.
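For the "compute before the view" option, a sketch in the ORM, assuming the tables are Django models and duration is a DurationField; here expired answers are marked as submitted (2), though you might prefer a dedicated "expired" status:

    from django.db.models import F
    from django.utils import timezone

    from myapp.models import ExamAnswer   # hypothetical module path

    def close_expired_answers():
        now = timezone.now()
        # started answers whose duration has run out
        ExamAnswer.objects.filter(
            status=1,
            time_started__lt=now - F('exam__duration'),
        ).update(status=2)
        # started answers whose exam deadline has passed
        ExamAnswer.objects.filter(status=1, exam__deadline__lt=now).update(status=2)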
I'm building a django app which lists the hot (according to a specific algorithm) twitter trending topics.
I'd like to run some processes indefinitely, making Twitter API calls and updating the database (Postgres) with the new information. This way the hot trending topic list gets updated asynchronously.
At first it seemed to me that celery + RabbitMQ were the solution to my problem, but from what I understand they are used within Django to launch scheduled or user-triggered tasks, not indefinitely running tasks.
The solution that comes to mind is to write a .py file that continually puts trending topics into a queue, and separate, continually running .py files that pull from the queue and save the data to the database Django uses, via raw SQL or SQLAlchemy. I think this could work, but I'm pretty sure there is a much better way to do it.
If you just need to keep some processes running continually, supervisor is a nice solution.
You can combine it with any queuing technology you like to push things into your queues.
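For example, a minimal sketch of the kind of long-running worker supervisor would manage, paired with a [program:trends_worker] stanza in supervisord.conf pointing at the script (all names are illustrative):

    # trends_worker.py -- loops forever; supervisor restarts it if it dies
    import time

    def main():
        while True:
            # ... call the Twitter API, compute hotness, write to Postgres ...
            time.sleep(60)   # poll interval is illustrative

    if __name__ == '__main__':
        main()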
I'm writing some auction functionality for a website powered by Django, and there is an option to have the auction be extended by a few seconds every time somebody bids (so that there is no last-minute sniping). When the auction is finished, certain things need to be done like notifying the winner of their victory and notifying the seller to whom they need to send the item.
How, in Django, do I go about running an auction-completion script whenever an auction ends, especially when the auction end time may be a moving target?
Thanks.
It sounds like extending an auction_end DateTime field could easily be done in the bid() view.
As far as emailing outcomes goes, you should write a management script (http://docs.djangoproject.com/en/dev/howto/custom-management-commands/) that checks for auctions that have ended and haven't had notifications sent, and schedule it to run from cron at regular intervals (every minute if necessary). An example is Pinax's django-notification emit_notices script.
http://github.com/jtauber/django-notification/blob/master/notification/management/commands/emit_notices.py
Better still, you might be able to leverage django-notification itself.
Django command extensions provides a jobs system (job scheduling). After you add it to the crontab it is capable of running jobs hourly, daily, weekly or monthly.
I think it should be easy to extend the job management command to accomplish your tasks.
It sounds like you probably want to have whatever code is responsible for closing the auction fire a custom signal to indicate that it's happening.
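A minimal sketch of such a signal (all names are illustrative):

    import django.dispatch

    # custom signal fired by whatever code closes the auction
    auction_finished = django.dispatch.Signal()

    def notify_participants(sender, auction, **kwargs):
        pass  # ... email the winner and the seller here ...

    auction_finished.connect(notify_participants)

    # wherever the auction gets closed:
    # auction_finished.send(sender=None, auction=the_auction)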