Options for running asynchronous / background tasks on user demand in Django?

My Django app generates a complex report that can take up to 5 minutes to create. Therefore it runs once a night using a scheduled management command.
That's been OK, except that I now want the user to be able to select the date range for the report, which means the report needs to be created while the user waits.
What are my options for running the task in the background? So far I've found these:
Celery - might work but is complex
django-background-tasks looks like the right tool for the job, but it hasn't been updated for years; the last supported Django version is 2.2.
The report/background task could be generated by AWS Lambda, basically as a microservice: Django calls the microservice, which executes the background task and then calls the Django app back once finished. This is what I did last time, but I'm not sure it would work now, as I'd need to send the microservice 10 MB of data to process.
Use subprocess.Popen, which someone here said worked for them, but other reports say it doesn't work from Django.
EDIT: It looks like Django 3.1 onwards supports async views, which may be the simple solution for this.

Related

What is the best way to transfer large files through a Django app hosted on Heroku?

Heroku gives me an H12 error when transferring the file to an API from my Django application (I understand it's a long-running process and there is some memory/worker trade-off involved). I am on a single hobby dyno right now.
The function runs smoothly for files up to around 50 MB. The file itself comes from a different source (fetched with the Python requests package).
The idea is to build a file-transfer utility as a Django app on Heroku. The file does not get stored on my app's side; it is just fetched from point A and sent to point B.
I went through multiple discussions along with the standard Heroku documentation, but I am struggling with a few concepts:
Will this problem really be solved by background tasks? (If yes, I'm looking for an explanation of the process rather than just the direct way to do it, so that I can optimize my flow.)
As mentioned in the standard docs, Heroku recommends background tasks using the RQ package for Python. I am using PostgreSQL at the moment. Will I need to install and manage a Redis database as well for this? Is this even related to the database?
Some recommend using an extra worker other than the web worker we have by default. How does this relate to my problem?
Some say to add multiple workers, but I'm not sure how that solves it. Say it starts working for large files using background tasks today; what if the number of simultaneous users increases? How will that impact my solution, and how should I plan mitigation around the risks?
If someone here has a strong understanding of the architecture, I'd like to hear your experiences and thoughts. Also, let me know if there is an option other than Heroku that, from a solution standpoint, would make this easier for me.
Have you looked at using Celery to run this as a background task?
This is a very standard way of dealing with requests that take a long time to complete.
Will this problem really be solved by background tasks? (If yes, I'm looking for an explanation of the process rather than just the direct way to do it, so that I can optimize my flow.)
Yes, it can be solved by background tasks. If you use something like Celery, which has direct support for Django, you will be running another instance of your Django application, but with a different startup command for Celery. It then keeps polling for new tasks, reads the task name from the Redis queue (or RabbitMQ, whichever you use as the broker), executes the task, and writes the status back to the broker.
You can also use Flower along with Celery so that you have a dashboard showing how many tasks are being executed, what their statuses are, and so on.
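For illustration, here is a minimal sketch of that setup (the project name myproject, the app name reports, and the view are assumptions, not something from the question): a Celery app wired to the Django settings, a task holding the slow work, and a view that only enqueues the task and returns immediately.

# myproject/celery.py -- Celery app bound to the Django settings
import os
from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

app = Celery("myproject")
# Read CELERY_* settings (broker URL, result backend, ...) from Django settings
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()

# reports/tasks.py -- the long-running work lives in a task
from celery import shared_task

@shared_task
def build_report(start_date, end_date):
    # generate the report here; this runs in the worker process, not the web dyno
    return f"report for {start_date}..{end_date}"

# reports/views.py -- the view enqueues the task and responds at once
from django.http import JsonResponse
from reports.tasks import build_report

def start_report(request):
    result = build_report.delay(request.GET["start"], request.GET["end"])
    return JsonResponse({"task_id": result.id})

The worker is then started with its own command (for example celery -A myproject worker); on Heroku that command becomes the Procfile entry for a separate worker dyno.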
As mentioned in the standard docs, Heroku recommends background tasks using the RQ package for Python. I am using PostgreSQL at the moment. Will I need to install and manage a Redis database as well for this? Is this even related to the database?
To use background tasks with Celery you will need to set up a message broker such as Redis or RabbitMQ.
Some recommend using an extra worker other than the web worker we have by default. How does this relate to my problem?
I don't think that would help for your use case.
Some say to add multiple workers, but I'm not sure how that solves it. Say it starts working for large files using background tasks today; what if the number of simultaneous users increases? How will that impact my solution, and how should I plan mitigation around the risks?
When you use Celery, you will have to start a few workers for that Celery instance; these workers are the ones that execute your background tasks. The Celery documentation will help you calculate the exact worker count based on your instance's CPU, memory, etc.
If someone here has a strong understanding of the architecture, I'd like to hear your experiences and thoughts. Also, let me know if there is an option other than Heroku that, from a solution standpoint, would make this easier for me.
I have worked on a few projects where we used Celery background tasks to upload large files. It has worked well for our use cases.
Here is my final take on this after full evaluation, trials, and the earlier recommendations made here. Thanks @arun.
Heroku needs a web dyno to deliver the website runtime, which holds 512 MB of memory; operations you perform that stay below this limit should be fine.
Beyond that, say you have a scenario like the one mentioned above, where a large file is coming from one source API and going to another target API through the Django app. Then you will have to:
First, run the file upload function as a background process, since it will take longer than the 30 seconds within which Heroku expects a response; if not, an H12 error is waiting for you. The solution is to implement background tasks; Celery worked in my case. Here Celery is your same Django app's functionality running as a background handler, which needs its own dyno (the worker). This can be scaled as needed in the future.
To make your Django WSGI process (the frontend app) talk to Celery (the background app), you need a message broker in between, which can be Heroku Redis, RabbitMQ, etc.
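As a sketch of that wiring (the setting names follow the CELERY_ namespace convention from the earlier sketch; REDIS_URL is the config var the Heroku Redis add-on provides), the Django settings only need to point Celery at the broker:

# myproject/settings.py -- point Celery at the Heroku Redis add-on
import os

# Heroku Redis exposes its connection string as the REDIS_URL config var;
# fall back to a local Redis for development.
CELERY_BROKER_URL = os.environ.get("REDIS_URL", "redis://localhost:6379/0")
CELERY_RESULT_BACKEND = os.environ.get("REDIS_URL", "redis://localhost:6379/0")

# Procfile (illustrative):
#   web: gunicorn myproject.wsgi
#   worker: celery -A myproject worker --loglevel=info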
Second, the problem doesn't get solved just by that: even though you have a new worker dyno dedicated to the Celery app, the memory limits still apply, since it is also a dyno with its own memory.
To overcome this, your Python requests call should stream the file instead of downloading the complete file into a single memory buffer. Iterate over the stream and send the file to the target endpoint in chunks (see the sketch after the list below).
Chunk size plays an important role here, too. I won't put an exact number, since it depends on various factors:
It should not be too small, or the transfer will take more time.
It should not be too big for either the source or target endpoint servers to handle.
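Here is a minimal sketch of that streaming pattern with the requests package (the URLs and the 1 MB chunk size are illustrative assumptions; your target endpoint must accept a chunked/streamed upload):

# transfer.py -- stream a large file from a source API to a target API
# without holding the whole file in memory
import requests

CHUNK_SIZE = 1024 * 1024  # 1 MB per chunk; tune this for your endpoints

def transfer(source_url, target_url):
    with requests.get(source_url, stream=True) as src:
        src.raise_for_status()
        # iter_content yields the body piece by piece instead of loading it all
        chunks = src.iter_content(chunk_size=CHUNK_SIZE)
        # requests accepts a generator as the body and sends it with
        # chunked transfer encoding
        resp = requests.post(target_url, data=chunks)
        resp.raise_for_status()
    return resp.status_code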

Send Weekly Email Notifications to Users regarding any changes in data

What are some ways to send automatic weekly email notifications to users upon any update or change to the data in a Django project?
I'll describe the simplest solution I'm aware of. There are also much, much more sophisticated approaches to this problem.
Fundamentally, you need three things:
A task runner (I recommend configuring cron jobs with django-kronos)
An SMTP provider (I recommend Mailgun, which is super simple to set up with Django and gives you some test credits out of the box).
Email templates (write yourself & render to string w/ Django -- Mailgun has some good open source templates as well on their blog)
Example: django-kronos provides decorators for registering functions as cron jobs (this assumes your web server is Linux-based). These jobs can be installed as part of your deploy process from the command line:
./manage.py installtasks
For kronos to find tasks, they must be located in cron.py files inside of your apps.
# myapp/cron.py
import kronos
from django.contrib.auth.models import User
from myapp.services import check_for_changes, notify_user_of_change

# Register cron job to run once a week (every Sunday) at midnight
@kronos.register('0 0 * * 0')
def notify_about_changes():
    """Check every user for changes and notify them of each one.

    Registered as a weekly cron job. Installed with:
    ``./manage.py installtasks``
    """
    all_my_users = User.objects.all()
    for user in all_my_users:
        changes = check_for_changes(user)
        for change in changes:
            notify_user_of_change(user, change)
Note that it's a good practice to keep your task function as "thin" as possible. In your case, you're (a) checking for changes, then (b) notifying about each change. These could each be a function that accepts a user instance as a param, or they could be model methods. In any case, the hard work should be handled elsewhere.
This is a very broad question, so unfortunately the answer is going to be broad as well. The general workflow you're looking for probably starts with a signal, which you can set up to fire when the data is updated.
That signal would then store data about the changes and use one of many different email options to send the email. You could use either Django's built-in email support or a third-party API such as django-mailchimp.
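As a rough, hypothetical sketch (the Report model and the addresses are made up), a post_save signal with Django's built-in email support could look like the following; in practice you would probably store the change and batch it into the weekly digest instead of mailing immediately.

# myapp/signals.py -- hypothetical example of reacting to data changes
from django.core.mail import send_mail
from django.db.models.signals import post_save
from django.dispatch import receiver

from myapp.models import Report  # made-up model name

@receiver(post_save, sender=Report)
def notify_on_change(sender, instance, created, **kwargs):
    # Remember to import this module (e.g. in AppConfig.ready()) so the
    # receiver actually gets registered.
    send_mail(
        subject="Report created" if created else "Report updated",
        message=f"{instance} changed.",
        from_email="noreply@example.com",
        recipient_list=["user@example.com"],  # the affected users' addresses
    )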
Hopefully this gets you headed in the right direction.

Django setting up a scheduled task without Cron

I know there are many questions asking about this, especially this one: Django - Set Up A Scheduled Job?.
But what I want to understand is: how does a scheduled task inside Django actually work?
My simplistic way of thinking about it is that there's an infinite loop somewhere, something like this (runs every 60 seconds):
import time

interval = 60  # 60 seconds

while True:
    some_method()
    time.sleep(interval)
Question: where do you put this infinite loop? Is there some part of the Django app that just runs in the background alongside the rest of the app?
Thanks!
Django doesn't do scheduled tasks. If you want scheduled tasks, you need a daemon that runs all the time and can launch your task at the appropriate time.
Django only runs when an HTTP request is made. If no one makes an HTTP request for a month, Django doesn't run for a month. If there are 45 HTTP requests this second, Django will run 45 times this second (in the absence of caching).
You can write scripts within the Django framework (called management commands) that get called from some outside service (like cron). That's as close as you'll get to what you want. If that's the case, then the question/answer you reference is the place to get the how-tos.
On a Unix-like system, cron is probably the simplest outside service to work with. On recent Linux systems, cron has a directory /etc/cron.d into which you can drop your app's cron config file, and it will not interfere with any other cron jobs on the system. No editing of existing files is necessary.
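A minimal sketch of that arrangement (the command and file names are assumptions): the work lives in a management command, and a file dropped into /etc/cron.d calls it on a schedule.

# myapp/management/commands/build_report.py -- hypothetical command name
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = "Build the nightly report (intended to be run from cron)"

    def handle(self, *args, **options):
        # the actual long-running work goes here
        self.stdout.write("report built")

# /etc/cron.d/myapp -- cron.d entries include the user to run as:
#   0 2 * * * appuser /path/to/venv/bin/python /path/to/project/manage.py build_report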

Alternative to django-celery to perform asynchronous task in Django?

In my admin I have a form that allows uploading a file to fill the DB.
Parsing and filling the DB take a long time, so I'd like to do it asynchronously.
As recommended by several SO users I tried to install python-celery, but I can't manage to do it (I'm on Webfaction).
Is there any simple, easy-to-install alternative?
If WebFaction supports cron jobs, you can create your own pseudo-broker. You could save your long-running tasks to a 'tasks' table in the DB, which would allow you to return a response to the user instantaneously. Then a cron job that runs very frequently could look for uncompleted tasks and process them.
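A rough sketch of that pseudo-broker (the model, command, and parser names are made up): the upload view inserts a row, and a management command run from cron every minute or so processes whatever is pending.

# uploads/models.py -- a minimal "tasks" table acting as the pseudo-broker
from django.db import models

class UploadTask(models.Model):
    uploaded_file = models.FileField(upload_to="pending/")
    done = models.BooleanField(default=False)
    created = models.DateTimeField(auto_now_add=True)

# uploads/management/commands/process_tasks.py -- run this from cron
from django.core.management.base import BaseCommand
from uploads.models import UploadTask
from uploads.importers import parse_and_fill_db  # your existing parsing code (hypothetical)

class Command(BaseCommand):
    help = "Parse any pending uploads and fill the DB"

    def handle(self, *args, **options):
        for task in UploadTask.objects.filter(done=False):
            parse_and_fill_db(task.uploaded_file)
            task.done = True
            task.save()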
I believe this is roughly what django-mailer does:
https://github.com/jtauber/django-mailer/
https://stackoverflow.com/a/1419640/594589
Try Gearman along with its Python client library.
It's very easy to set up and run Gearman. Try a few examples.

Rather than using crontab, can Django execute something automatically at a predefined time

How can I make Django execute something automatically at a particular time?
For example, my Django application has to upload files via FTP to remote servers at predefined times. The FTP server addresses, usernames, passwords, times, days, and frequencies have been defined in a Django model.
I want to run the file upload automatically based on the values stored in the model.
One way to do this is to write a Python script and add it to the crontab. This script would run every minute and keep an eye on the time values defined in the model.
The other thing I can roughly think of is Django signals. I'm not sure if they can handle this; is there a way to generate signals at predefined times? (I haven't read in depth about them yet.)
Just for the record: there is also Celery, which allows you to schedule messages for future dispatch. It is, however, a different beast from cron, as it requires/uses RabbitMQ and is meant for message queues.
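For reference, a sketch of what that future dispatch looks like (the task and the next_run field are hypothetical; eta and countdown are standard apply_async arguments):

# uploads/tasks.py -- dispatch a task at a time stored on the model
from celery import shared_task

@shared_task
def upload_to_server(server_id):
    # look up the FTP credentials for this server and push the file
    ...

def schedule_upload(server):
    # server.next_run is assumed to be a DateTimeField on the Django model
    upload_to_server.apply_async(args=[server.pk], eta=server.next_run)
    # or, relative to now: upload_to_server.apply_async(args=[server.pk], countdown=3600)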
I have been thinking about this recently and have found django-cron which seems as though it would do what you want.
Edit: Also, if you are not specifically looking for a Django-based solution, I have recently used scheduler.py, which is a small single-file script that works well and is simple to use.
I've had really good experiences with django-chronograph.
You need to set up one crontab task: call the chronograph management command, which then runs your other custom management commands based on an admin-tweakable schedule.
The problem you're describing is best solved using cron, not Django directly. Since it seems that you need to store data about your FTP uploads in your database (using Django to access it for logs, graphs, or whatever), you can write a Python script that uses Django and runs via cron.
James Bennett wrote a great article on how to do this which you can read in full here: http://www.b-list.org/weblog/2007/sep/22/standalone-django-scripts/
The main gist of it is that you can write standalone Django scripts that cron can launch and run periodically, and these scripts can fully utilize your Django database, models, and anything else they want to. This gives you the flexibility to run whatever code you need and populate your database, while not trying to make Django do something it wasn't meant to do (Django is a web framework, and is event-driven, not time-driven).
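The modern shape of the pattern in that article is a short standalone script that configures Django before importing any models (the project and model names here are assumptions), which cron can then launch directly:

# ftp_upload_job.py -- standalone script intended to be run by cron
import os
import django

# point Django at your settings module, then bootstrap it
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
django.setup()

# models may only be imported after django.setup()
from myapp.models import FtpServer  # hypothetical model

for server in FtpServer.objects.all():
    # do the upload based on the schedule stored on the model
    ...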
Best of luck!