Running a quick async task with Django + Gunicorn

We have coded a system that uses Django + Celery, where our Celery tasks take a few minutes each to complete.
I'm looking for a quick, easy-to-use way to run an immediate async task (a few seconds) when a user logs in, without going through the Celery system (where queued tasks may take ages to finish).
I have read similar questions on SO, but they referred to Apache + uWSGI rather than Gunicorn. Also, questions about Gunicorn mentioned that greenlets are blocking.
This answer suggests using threads or multiprocessing, but I am confused - will those options work with Gunicorn, or will they cause it to hang/crash? What about using fork?

I think I found a solution: I should use celery's "Task Routing" and set up:
A queue for slow tasks
A queue for quick tasks
And two (or more) workers, one of which only executes the quick tasks.
See the example in this sample code (change "windows" to "slow" or "quick" as needed).
(original presentation here)
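
A minimal sketch of that routing setup, assuming Celery 4-style settings and illustrative task names (tasks.process_report as the slow one, tasks.on_user_login as the quick one), neither of which comes from the original post:

```python
# celeryconfig.py (or Django settings with the usual CELERY_ prefix) - sketch only.
# Route slow and quick tasks to separate queues so a dedicated worker
# always has capacity for the quick ones.
task_routes = {
    'tasks.process_report': {'queue': 'slow'},
    'tasks.on_user_login': {'queue': 'quick'},
}
```

You would then start at least two workers, one consuming only the quick queue (e.g. `celery -A proj worker -Q quick` and `celery -A proj worker -Q slow`), so a long-running task can never delay the login-time task.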

Related

Django + Celery with long-term scheduled tasks

I'm developing a Django app which relies heavily on Celery task scheduling, using Redis as the backend. Tasks can be scheduled to run far in the future as well as in a few seconds/minutes.
I've read about the Redis visibility timeout and the consequences of scheduling tasks with a timedelta greater than the visibility timeout (I'm also in the process of dealing with it in a previous project), so I'm interested in whether there's anything neater than my current workaround: have another "helper" task run 5 minutes before the "main" one needs to execute, schedule the "main" task to run at the required time, store its task id in the DB, and then have the "main" task check whether the stored task id matches the one actually being run. The last part (storing the task id) is required because multiple runs of the "helper" task could spawn a lot of "main" task instances, but with this approach each will have a different task id.
I really hate how that approach sounds and how it works: if the task is scheduled to run a month from now, the "helper" and "main" tasks are executed up to a hundred times.
I also know that it's an open issue, so I'm interested in a neat workaround more than a definitive solution.
Having tested the available options, in my opinion only using RabbitMQ as the broker solves the whole problem.
Although it's a viable option for me, the lack of some configuration parameters that the Redis transport offers (e.g. pool size) makes it unusable for those on hosting services that limit the number of open broker connections.
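
For reference, the visibility timeout in question is set through the broker transport options; a minimal sketch (the value itself is illustrative, and with the Django settings namespace the key would typically be CELERY_BROKER_TRANSPORT_OPTIONS):

```python
# Sketch: the Redis visibility timeout discussed above.
# If a task's ETA/countdown lies further in the future than this value,
# Redis may redeliver the message to other workers, causing duplicate runs.
broker_transport_options = {
    'visibility_timeout': 43200,  # 12 hours, in seconds (illustrative value)
}
```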

Django celery, celery-beat: fills the queue without control, scheduling troubles

I have a small project with a couple of tasks that run several times a day.
The project is based on Django 2.1, with Celery 4.2.1 and django-celery-beat 1.3.0, and RabbitMQ installed.
Each task lives in its own project application. It runs, works, and gives some result.
The problem is: on a virtual server leased from a provider, if I set any task to run periodically (every hour or two), it starts running immediately and without end, and, as I suppose, in some kind of parallel threads which mesh with each other.
The command rabbitmqctl list_queues name messages_unacknowledged always shows 8 messages in the celery queue. Purging the celery queue does not change anything; neither does restarting the service.
But setting tasks to run at an exact time works well. Well, almost well. Two tasks are scheduled to run at the beginning of different hours (one even, one odd), yet both run about 30 minutes after the start of the same (odd) hour. At least the tasks don't run more times per day than scheduled, but something is still wrong.
As a newbie with RabbitMQ and Celery I don't know where to look for a solution. The official Celery docs didn't help me; maybe I was not looking in the right place. Any help or advice would be appreciated. Thanks.
It seems this is a bug in django-celery-beat - https://github.com/celery/celery/issues/4041.
If anyone has already found a solution for this, please let me know.
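
Since exact-time schedules behaved better here than plain intervals, here is a minimal sketch of defining a crontab-style entry through django-celery-beat's database models; the task path and timings are illustrative, not taken from the original post:

```python
# Sketch: an exact-time (crontab) schedule via django-celery-beat's models.
from django_celery_beat.models import CrontabSchedule, PeriodicTask

schedule, _ = CrontabSchedule.objects.get_or_create(
    minute='0',
    hour='*/2',            # at the start of every second hour
    day_of_week='*',
    day_of_month='*',
    month_of_year='*',
)
PeriodicTask.objects.get_or_create(
    crontab=schedule,
    name='fetch data every two hours',   # illustrative name
    task='myapp.tasks.fetch_data',       # illustrative task path
)
```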

How to record all tasks information with Django and Celery?

In my Django project I'm using Celery with a RabbitMQ broker for asynchronous tasks. How can I record information about all of my tasks (e.g. creation time (when the task appears in the queue), when a worker consumes the task, execution time, status, ...) to monitor how Celery is doing?
I know there are solutions like Flower, but that seems too much for what I need. django-celery-results looks like what I want, but it's missing some information I need, such as the task creation time.
Thanks!
It seems you often find the answer yourself right after asking on SO. I settled on using Celery signals to do all the recording I want and store the results in a database table.
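
A minimal sketch of that signal-based recording, assuming a hypothetical TaskRecord Django model (with task_id, name, created_at, started_at, finished_at, status fields); the model and field names are illustrative:

```python
# Sketch: record task lifecycle timestamps with Celery signals.
from celery.signals import before_task_publish, task_prerun, task_postrun
from django.utils import timezone

from myapp.models import TaskRecord  # hypothetical model


@before_task_publish.connect
def record_published(sender=None, headers=None, body=None, **kwargs):
    # Fires in the publishing process when the task is sent to the broker.
    info = headers if headers and 'task' in headers else body
    TaskRecord.objects.create(task_id=info['id'], name=info['task'],
                              created_at=timezone.now(), status='QUEUED')


@task_prerun.connect
def record_started(task_id=None, task=None, **kwargs):
    # Fires in the worker just before the task body runs.
    TaskRecord.objects.filter(task_id=task_id).update(
        started_at=timezone.now(), status='STARTED')


@task_postrun.connect
def record_finished(task_id=None, task=None, state=None, **kwargs):
    # Fires in the worker after the task returns (or fails).
    TaskRecord.objects.filter(task_id=task_id).update(
        finished_at=timezone.now(), status=state)
```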

Celery all generated tasks status

Django produces multiple Celery tasks through chains in one script run (e.g. when / is opened in the browser, 1000 tasks are called via the delay method).
I need something that will restrict new task generation if the tasks queued in a previous script run are still running.
You need a distributed lock for this, which Celery doesn't offer natively.
For these kinds of locks I've found redis.Lock useful in most cases. If you need a semaphore, you can use Redis' atomic incr/decr commands along with some kind of watchdog mechanism to ensure your processes are still running.
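
A rough sketch of the incr/decr semaphore idea using redis-py; the key name and the process_item task are illustrative, and the watchdog part (resetting the counter if workers die) is left out:

```python
# Sketch: refuse to enqueue a new batch while the previous batch is unfinished.
import redis

r = redis.Redis()
PENDING_KEY = 'mybatch:pending'   # illustrative key name


def enqueue_batch(items):
    if int(r.get(PENDING_KEY) or 0) > 0:
        return False              # previous batch still has tasks running
    for item in items:
        r.incr(PENDING_KEY)
        process_item.delay(item)  # illustrative Celery task
    return True


# Each task (or a task_postrun signal handler) would call r.decr(PENDING_KEY)
# when it finishes, and a watchdog would reset the key if workers disappear.
```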
You can limit how many tasks of one type run in a given time window by setting:
rate_limit = "1000/m"
=> only 1000 tasks of this type can be started per minute.
(see http://docs.celeryproject.org/en/latest/userguide/tasks.html#list-of-options)
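
For completeness, the rate_limit option is just a task attribute; a minimal sketch (the task body is illustrative):

```python
# Sketch: limit how fast tasks of this type are started.
from celery import shared_task


@shared_task(rate_limit='1000/m')   # at most 1000 executions per minute
def process_item(item_id):
    ...                              # illustrative task body
```

Note that Celery enforces rate limits per worker instance, not globally across all workers.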

Django Job queue for interfacing with celery

My Django web app's logic is heavily geared towards background task execution (periodic as well as standalone, synchronous as well as asynchronous). All the research seems to point to Celery being the most recommended approach. I plan to eventually deploy on Heroku, and the fact that it has support for Celery + Redis (what I'm using for local development) is a big plus for me.
However, I need more extensive scheduling capabilities than Celery provides. Some of my periodic tasks need schedules like 'run on the last Sunday of the month', etc. So I've implemented my own models in Django to store a recurrence rule and other needed parameters.
Now I'm stumped on how to interface my tables with Celery. Ideally I'd like my own Job model which holds the schedule, the task to run when it becomes due, and the parameters for the task - sort of like a function pointer in C++. Then I would run a daemon which keeps checking the job queue for jobs that have become due; if a job is periodic, it creates the next job instance and pushes it onto the queue, then runs the associated task with its parameters using Celery's delay method or similar.
Questions:
Does this approach even make sense?
If not, what alternative approach(es) can I use?
If yes, how do I go about designing that Job/Event queue?
I'd love to hear a better approach to doing this, or whether there's an existing job-queue implementation that might be suitable, or a way to use Celery's own scheduling for this...
Thanks heaps.
Periodic tasks in Celery work pretty much like this: there's a dedicated scheduler process (celery beat) which simply sends off tasks when they are due.
You can create new schedulers to use with beat by subclassing the celery.beat.Scheduler class, and you can create custom schedules too (like the crontab schedule that is already built in) by subclassing celery.schedules.schedule.
There's a database-backed scheduler implementation in the django-celery extension (djcelery.schedulers.DatabaseScheduler), which uses many tricks to avoid too frequent polling of the database and so on (sadly it's not well commented).
Scheduler: https://github.com/celery/celery/tree/master/celery/beat.py
schedules: https://github.com/celery/celery/tree/master/celery/schedules.py
DatabaseScheduler: https://github.com/celery/django-celery/tree/master/djcelery/schedulers.py
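
As a rough illustration of the custom-schedule route (and the 'run on the last Sunday of the month' requirement from the question), here is a hedged sketch of subclassing celery.schedules.schedule; the date arithmetic and the hourly re-check interval are illustrative, and timezone handling is glossed over:

```python
# Sketch: a custom schedule that fires once on the last Sunday of each month.
import calendar
from datetime import timedelta

from celery.schedules import schedule, schedstate


class last_sunday_of_month(schedule):

    def is_due(self, last_run_at):
        now = self.now()
        # Find the last Sunday of the current month.
        last_day = calendar.monthrange(now.year, now.month)[1]
        month_end = now.replace(day=last_day, hour=0, minute=0,
                                second=0, microsecond=0)
        target = month_end - timedelta(days=(month_end.weekday() - 6) % 7)
        due = now >= target and last_run_at < target
        # Re-check roughly every hour; beat calls is_due repeatedly.
        return schedstate(is_due=due, next=3600)
```

A beat entry would then point at an instance of this class instead of a crontab or interval schedule.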