Django-celery - How to execute tasks serially?

I'm using Celery 3.1. I need to execute the next task only when the previous one has finished. How can I make sure that no two tasks are working at the same time? I've read the documentation, but it isn't clear to me.
I have the following scheme:
Task Main
- Subtask 1
- Subtask 2
I need that when I call "Task Main", the process runs to the end (Subtask 2) without any new "Task Main" starting.
How can I ensure this?

One strategy is to use locks. The Celery Task Cookbook has an example at http://docs.celeryproject.org/en/latest/tutorials/task-cookbook.html.
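For example, a minimal sketch of that cache-based lock, assuming Django's cache backend supports an atomic add() (subtask_1 and subtask_2 stand in for your own tasks):

from celery import shared_task
from django.core.cache import cache

LOCK_EXPIRE = 60 * 5  # let the lock expire in case a worker dies mid-task

@shared_task
def subtask_1():
    pass  # placeholder for the real Subtask 1

@shared_task
def subtask_2():
    pass  # placeholder for the real Subtask 2

@shared_task
def task_main():
    lock_id = 'task-main-lock'
    # cache.add() only sets the key if it does not exist yet, so at most
    # one Task Main can hold the lock at any time.
    if not cache.add(lock_id, 'locked', LOCK_EXPIRE):
        return 'Task Main already running; skipping'
    try:
        subtask_1()  # called synchronously, so it finishes first
        subtask_2()  # starts only after subtask_1 has returned
    finally:
        cache.delete(lock_id)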

If I understand correctly, you want to execute MainTask one at a time, and you want to call subtasks inside your MainTask. Without separate queues and at least two separate workers this is impossible, because if you store all tasks in the same queue, they all look the same to Celery.
So the solution is:
- Route MainTask to main_queue and start a dedicated worker for that queue (a routing sketch follows below):
celeryd --concurrency=1 --queue=main_queue
- Route the subtasks to sub_queue and start a separate worker for that queue:
celeryd --queue=sub_queue
This should work!
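For instance, a minimal routing sketch for Celery 3.1 (the proj.tasks module and task names are hypothetical):

# celeryconfig.py -- route the main task and its subtasks to separate
# queues so each can be consumed by its own worker, as described above.
CELERY_ROUTES = {
    'proj.tasks.task_main': {'queue': 'main_queue'},
    'proj.tasks.subtask_1': {'queue': 'sub_queue'},
    'proj.tasks.subtask_2': {'queue': 'sub_queue'},
}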
But I think this is a complicated architecture; you may be able to make it much simpler if you redesign your process.
You may also find this useful (it keeps the subtasks in order, but it could still run MainTask in parallel):
Try using chains; there is an example in Celery's docs: http://docs.celeryproject.org/en/latest/userguide/tasks.html#avoid-launching-synchronous-subtasks.
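For instance, a minimal chain sketch (subtask_1 and subtask_2 are placeholders for your own tasks):

from celery import chain

# subtask_2 starts only after subtask_1 has finished; it receives
# subtask_1's return value as its first argument.
chain(subtask_1.s(), subtask_2.s()).apply_async()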

Related

Reusing a database record created by means of a Celery task

There is a task which creates a database record (R) when it runs for the first time. When the task is started a second time, it should read the database record, perform some calculations and call an external API. The first and second starts happen in a loop.
In the case of a single start of the task there are no problems, but with loops (at each iteration a new task is created and starts at a certain time) there is a problem: in the task queue (we monitor it with Flower) we see a crashed task on every second iteration.
If we add time.sleep(1) at the end of the loop, sometimes the tasks work properly and sometimes they don't. How can we avoid this problem? We are afraid that tasks for different combinations of two users started at the same time will also crash.
Is there some problem with running tasks simultaneously in Celery? Or is there something we should take into account? The tasks handle scheduled payments, so they have to be rock solid.

(Django) RQ scheduler - Jobs disappearing from queue

Since my project has so many moving parts, it's probably best to explain the symptom.
I have 1 scheduler running on 1 queue. I add scheduled jobs (to be executed within seconds of scheduling).
I keep scheduling jobs with NO RQ worker running (in fact, the worker process is completely off). In other words, the queue should just keep piling up.
But all of a sudden the queue gets chopped off (seemingly at random) and the first 70-80% of the jobs just disappear.
Does this have anything to do with:
- the "max length" of the queue? (I don't recall setting any limits)
- the scheduler automatically "discarding" jobs whose start time is BEFORE the current time?
I ran my own experiment: RQ scheduler does indeed remove jobs whose start date is earlier than now.
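A minimal sketch of such an experiment, assuming rq-scheduler's standard API (some_job and the myapp.jobs module are hypothetical):

from datetime import datetime, timedelta

from redis import Redis
from rq_scheduler import Scheduler

from myapp.jobs import some_job  # hypothetical job function

scheduler = Scheduler(connection=Redis())

# Schedule one job dated in the past and one in the future.
scheduler.enqueue_at(datetime.utcnow() - timedelta(hours=1), some_job)
scheduler.enqueue_at(datetime.utcnow() + timedelta(hours=1), some_job)

# After the scheduler process has made a pass, only the future-dated job
# should still be listed, consistent with the observation above.
print([job.id for job in scheduler.get_jobs()])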

How to purge all Celery subtasks from a parent task?

In my shared_task I call several subtasks via subtasks(...).apply_async(). Thus, both the parent task and the subtasks have their own task_id.
When I cancel the entire operation, I call revoke on all active tasks and it works correctly. But as soon as the cores are released, the queue moves on and executes the next subtasks.
How do I programmatically clear the queue, thereby preventing the next subtasks from being executed?
You could try the following:
app.control.revoke(task_id, terminate=True)
and capture the task id when you start the task:
task_id = subtasks(...).apply_async().id
I'm not sure whether saving the task id this way works for subtasks, but it works well for normal tasks, so it is worth trying.
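A minimal sketch of that approach for the subtask case, collecting every child id so they can all be revoked at once (app, parent_task, subtask and the proj modules are hypothetical names):

from celery import shared_task

from proj.celery import app     # hypothetical Celery app instance
from proj.tasks import subtask  # hypothetical subtask

@shared_task
def parent_task(items):
    # Save the id of every subtask we launch so all of them can be
    # revoked later, whether still queued or already running.
    return [subtask.apply_async(args=(item,)).id for item in items]

def cancel_operation(child_ids):
    # revoke() accepts a list of ids; terminate=True also kills subtasks
    # that are already executing, not just the queued ones.
    app.control.revoke(child_ids, terminate=True)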

Jobs optimization in Laravel 5.5

I have the supervisor config below, which keeps polling the jobs table:
[program:laravel-queue-listener]
command=php /var/www/laravel/artisan queue:work --sleep=120 --tries=2 --daemon
Question: Right now it checks the database for pending jobs every 2 minutes. Is there any way to process queues on demand? I mean: when the code below executes, it should process the queue, after first checking whether the queue is already being processed...
Is there any function in the framework to process queues manually and to check whether the queue is currently polling or processing any job?
$User->notify(new RegisterNotification($token, $User));
I understand your question as asking how to process queues on demand in Laravel. There is already a detailed answer here, but the command you are looking for is:
php artisan queue:work --once
However, if what you are trying to do is run the queue worker when an event happens, you can do that by invoking the queue worker from code. Example:
public static function boot()
{
    parent::boot();

    // Process one queued job whenever a new record is created.
    static::creating(function ($user) {
        Artisan::call('queue:work', ['--once' => true]);
    });
}

WinRT C++ task queue

I need to make a task queue in C++/CX, but due to my limited experience I don't know how.
The purpose is:
- create a task in some thread with a lambda ("task1 = [] () {}")
- then add this task to a task queue; the queue executes in another thread
- while a task is waiting in the queue, it does not execute
- tasks execute only after the previously queued task has finished
As I understand it, when you use auto a = concurrency::create_task(lambda), the task starts immediately. Delaying the start of such a task requires a handle to the previous task, but I can't get one, since my tasks are generated in separate threads.
Could anybody help me solve this problem?
It seems that proper use of concurrency::task_group can solve my problem.
Also, a concurrency::task_handle does not execute on creation, so using it may solve my problem too, but it needs its own queue.