Backend tasks and frontend separation - Django

I use Django and Django Rest Framework for my internal API and I use Vue.js for my frontend. The backend (API) and the frontend are totally separated.
I need to run a background task (every time a user is created) and I am considering 2 solutions:
Call (with a post_save signal) a function that runs the task.
Note that this function will call a 3rd party API. The call might fail for various reasons and/or take a long time (~20 seconds).
Create a background task
With Redis or RabbitMQ or django-background-tasks.
Which solution should I go for?
If both solutions are acceptable, what would be the limitations/advantages of each one ?

You might need Celery. It is a great package for background tasks in Django; you can choose either Redis or RabbitMQ as the broker, and in my opinion the choice of broker doesn't matter much here.
Why can this be a good solution for your problem?
It is easy to set up: you just install Celery and Redis (I prefer Redis), configure a few settings, and you now have async functions.
You might soon need scheduled tasks as well, for which you only have to add the scheduler package (e.g. django-celery-beat).
You only need to write a plain function and attach a decorator for it to run asynchronously.
from celery import shared_task

@shared_task
def add(x, y):
    return x + y

and call it anywhere in your code:

add.delay(2, 3)

and you now have a background task.
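Since your task should fire every time a user is created, a minimal sketch of wiring it to the post_save signal could look like the following (the task name call_third_party_api and the module paths are assumptions, not from your code):

# signals.py -- hypothetical wiring; assumes a Celery task named
# call_third_party_api is defined in yourapp/tasks.py
from django.contrib.auth import get_user_model
from django.db.models.signals import post_save
from django.dispatch import receiver

from yourapp.tasks import call_third_party_api

@receiver(post_save, sender=get_user_model())
def on_user_created(sender, instance, created, **kwargs):
    if created:
        # enqueue the work instead of calling the 3rd party API inline,
        # so a slow or failing call (~20 s) never blocks the response
        call_third_party_api.delay(instance.pk)

This way the slow or failing third-party call only affects the worker, not the request/response cycle of your API.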

Related

Background tasks with Django on Heroku

So I'm building a Django app on Heroku. The tasks that the app performs frequently run longer than 30 seconds, and therefore I'm running into Heroku's 30-second request timeout. I first tried solving it by submitting the task from my Django view to AWS Lambda, but in that case the view waits for the AWS Lambda function to finish, so it doesn't solve my problem.
I have already read the tutorials on Heroku on handling background tasks with Django. I'm now faced with a few different options on how to proceed, and would love to get outside input on which one makes the most sense:
Use Celery & Redis to manage the background tasks, and let the tasks be executed on AWS Lambda.
Use Celery & Redis to manage the background tasks, but let the tasks be executed in a Python script on Heroku.
Try to solve it with asyncio in order to keep it leaner (not sure whether this specific case can be solved with asyncio, though?).
Maybe there's an even better solution that I don't see?
Looking forward to any input/suggestions!

Django running long sql process in background

I have a web page where I need to run a long sql process (up to 20 mins or so) when the user clicks on a certain button. The script runs, but the user is then unable to continue browsing the rest of the website.
I would like to have it so that when the button is clicked, it goes into a queue that runs in the background.
I have looked into django-background-tasks, but the problem is that it does not seem possible to start the queued tasks without running python manage.py process_tasks.
I have heard of Celery, but I am using a Windows system and it does not seem to be suitable.
I am new to Django and website infrastructure, and am not sure if this is feasible. I have also seen in older responses that the threading package can do this, but I am unsure whether that approach is outdated.
You can use create_task, provided by asyncio, to run a background task without blocking the view for clients.
Python 3.7+
Asyncio create_task
Disclaimer: I'm not sure myfunc() needs to be async unless you are performing a task that is actually worth running asynchronously.
You could also have a while loop in myfunc() for periodic repeated operations.
import asyncio

async def myfunc():
    await asyncio.sleep(5)
    print("Hi, after 5 seconds.")

# create_task must be called from inside a running event loop
# (for example, within an async Django view)
task = asyncio.create_task(myfunc())
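As a minimal sketch of how that could look in a view (assuming Django 3.1+ async views served under ASGI; long_sql_process is a hypothetical coroutine standing in for the slow SQL work):

# views.py -- sketch only; not durable like a Celery/django-background-tasks job
import asyncio
from django.http import JsonResponse

async def long_sql_process():
    # placeholder for the ~20 min SQL job; real blocking SQL would need to
    # run via asgiref.sync.sync_to_async or in a thread
    await asyncio.sleep(20 * 60)

async def start_long_job(request):
    # schedule the work on the running event loop and return immediately
    asyncio.create_task(long_sql_process())
    return JsonResponse({"status": "started"})

Keep in mind the task lives in the web process, so it is lost if that process restarts.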

Django-Celery - Work in parallel on repetitive requests

I am not familiar with Django-Celery, so I would like to know if it is the right tool for what I need before going deeper into the docs.
My django app has a web service for tiling map images that is called like this: http://host.com/tiling/x/y/z.png
xyz are integer variables that are used in the tiling function to compute the output.
My question is: can Django-Celery create workers that process this tiling function in parallel when repeated requests are detected?
For instance, 10 or more requests could be sent by a user at a time: http://host.com/tiling/0/1/1.png, http://host.com/tiling/1/0/1.png etc...
Can Django-Celery create workers for each request in parallel instead of computing each request one by one? What are the requirements on the server side? Do I need something like NGINX or Gunicorn or WSGI or CGI? I am confused about those things...
In most cases Celery is used for asynchronous task handling, but it also works for concurrent tasks!
By default Celery uses multiprocessing, but you can also use Eventlet, a concurrent networking library for Python.
Reference:
http://docs.celeryproject.org/en/latest/userguide/concurrency/eventlet.html#concurrency-eventlet
http://docs.celeryproject.org/en/latest/userguide/workers.html#concurrency
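As a rough example (the project name proj is an assumption), you pick the pool and the concurrency level when starting the worker:

celery -A proj worker --pool=prefork --concurrency=8
celery -A proj worker --pool=eventlet --concurrency=100

The Celery workers run as separate processes alongside whatever web server (NGINX/Gunicorn) serves the Django views that enqueue the tasks.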

Django Job queue for interfacing with celery

My django web-app logic is heavily geared towards background task execution (both periodic as well as stand alone, synchronous as well as asynchronous). All the research seems to point to using Celery being the most recommended approach. I plan to eventually deploy on Heroku and the fact that it has support for Celery + Redis (what I'm using for local development) is a big plus for me.
However, I need more extensive scheduling capabilities than Celery provides. I need some of my periodic tasks to run on schedules like 'run on the last Sunday of the month', etc. So I've implemented my own models in Django to store a recurrence rule and the other needed parameters.
Now I'm stumped on how to interface my tables with Celery. Ideally what I'd like is my own Job model which holds the schedule, the task that should run when it becomes due, and the parameters for that task. Sort of like a function pointer in C++. Then I would run a daemon which keeps checking the job queue for jobs that have become due; if a job is periodic it creates the next job instance and pushes it into the queue, then runs the associated task with its parameters using Celery's delay method or similar.
questions:
Does this approach even make sense?
If not, what other alternative approach(es) can I use?
If yes, how do I go about designing that Job/Event queue...
I'd love to hear a better approach to doing this or if there's an existing implementation of a job queue that might be suitable or a way to use celery's job queue itself...
Thanks heaps..
Periodic tasks in Celery work pretty much like this: there's a dedicated scheduler process (celery beat) which simply sends off tasks when they are due.
You can also create new schedulers to use with beat by subclassing the celery.beat.Scheduler class, and you can create custom schedules too (like the crontab schedule that is already built-in) by subclassing celery.schedules.schedule.
There's a database-backed scheduler implementation in the django-celery extension (djcelery.schedulers.DatabaseScheduler), which uses many tricks to avoid too frequent polling of the database and so on (sadly it's not well commented).
Scheduler: https://github.com/celery/celery/tree/master/celery/beat.py
schedules: https://github.com/celery/celery/tree/master/celery/schedules.py
DatabaseScheduler: https://github.com/celery/django-celery/tree/master/djcelery/schedulers.py
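For the built-in crontab schedule mentioned above, a minimal beat configuration sketch (the app name proj and the task path jobs.run_monthly_job are assumptions) could look like this:

# sketch using modern Celery setting names
from celery import Celery
from celery.schedules import crontab

app = Celery("proj")
app.conf.beat_schedule = {
    "last-sunday-of-month": {
        "task": "jobs.run_monthly_job",
        # day_of_week and day_of_month are combined, so this only fires on
        # a Sunday that falls within the last seven days of the month
        "schedule": crontab(minute=0, hour=0,
                            day_of_week="sunday", day_of_month="25-31"),
    },
}

For anything crontab can't express, subclassing Scheduler/schedule as described above is the way to go.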

django-notification. How can I use threading email send?

In my Django application I am using django-notification to send notifications. However, I noticed that in some cases (when sending multiple notifications) my web application gives delayed responses. Although I am sending notifications through Ajax requests, I still think it would be best if I could use the mailtools library, which provides threaded email sending.
Has anyone implemented such a thing? Is it easy? How can I use ThreadedMailer from mailtools in django-notification?
or, is there another alternative?
Use Celery for this purpose. It's easy to set up with Django, and you can keep the code you're using right now.
The Ajax request puts the email into the task queue and returns. You could return your task id if you want to check later whether the task succeeded.
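As a rough sketch of that idea (the view and task names here are assumptions), the view could return the task id and a second endpoint could check on it later:

# views.py -- sketch; send_email is assumed to be a Celery task in tasks.py
from celery.result import AsyncResult
from django.http import JsonResponse

from .tasks import send_email

def notify(request):
    result = send_email.delay(request.user.pk, "Hello!")
    # hand the task id back to the Ajax caller
    return JsonResponse({"task_id": result.id})

def notify_status(request, task_id):
    # look the task up later to see whether it succeeded
    return JsonResponse({"state": AsyncResult(task_id).state})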
Update:
Celery simply enables you to call your functions in the background. Say in your Ajax view you called:
send_email(…)
Now in tasks.py you should define the function:

@task
def send_email(…):
And in the view you will call it by:
send_email.delay(…)
And that's it. The email will be sent by a background worker daemon using your existing Python code.
This doesn't make django-notification obsolete. Celery does a completely different thing and can be used with any lib you can imagine.
The only change is that task arguments have to be picklable. That means you have to pass database ids, not whole objects, etc.
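To illustrate that last point, here is a sketch of passing a database id instead of a model instance (the task name is an assumption; email_user is Django's built-in User helper):

# tasks.py -- sketch
from celery import shared_task
from django.contrib.auth import get_user_model

@shared_task
def send_notification_email(user_id, message):
    # fetch the object inside the task rather than passing the instance,
    # because task arguments must be picklable/serializable
    user = get_user_model().objects.get(pk=user_id)
    user.email_user("Notification", message)

# in the view: send_notification_email.delay(user.pk, "Hello!")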