Django: running a long SQL process in the background

I have a web page where I need to run a long sql process (up to 20 mins or so) when the user clicks on a certain button. The script runs, but the user is then unable to continue browsing the rest of the website.
I would like to have it so that when the button is clicked, it goes into a queue that runs in the background.
I have looked into django-background-tasks, but the problem is that it does not seem to be possible to start the queued tasks without running python manage.py process_tasks.
I have heard of Celery, but I am using a Windows system and it does not seem to be suitable.
I am new to Django and web infrastructure, and am not sure if this is feasible. I have also seen in an older response that the threading package can be used for this, but I am unsure whether that approach is outdated.

You can use create_task, provided by asyncio, to run a background task without blocking the view for clients.
Requires Python 3.7+; see the asyncio create_task documentation.
Disclaimer: I'm not sure myfunc() needs to be async unless you are performing genuinely asynchronous work inside it.
You could also put a while loop in myfunc() for periodic, repeated operations.
import asyncio

async def myfunc():
    await asyncio.sleep(5)
    print("Hi, after 5 seconds.")

# Note: create_task() needs a running event loop, e.g. inside an async view
task = asyncio.create_task(myfunc())

Related

Return JSONResponse before updating database in django, python2

I have a project running on Python 2.7. The project is old, but it still needs to update the database when a request is received. However, the update process takes time and ends up with a timeout. Is there any way to return a JsonResponse/HttpResponse before updating the database so that the timeout doesn't occur? I know it's not logical to do so, but it's a temporary fix.
Also, I can't use async since it's Python 2.
Use multiprocessing or multithreading: this will execute your task in another process (or thread) so the HTTP response can be sent to the client quickly.
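For example, a minimal sketch of the threading variant, compatible with Python 2.7; slow_update and my_view are hypothetical names standing in for the real update logic and view:

import threading

from django.http import JsonResponse


def slow_update():
    # hypothetical placeholder for the long-running database update
    pass


def my_view(request):
    # Start the update in a separate thread so the response is not delayed.
    # Note: the thread runs inside the same web-server process, so this is
    # only a stopgap, as the question already acknowledges.
    worker = threading.Thread(target=slow_update)
    worker.start()
    return JsonResponse({"status": "accepted"})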

Execute code after Response using Django's Async Views

I'm trying to execute a long-running function (e.g. sleep(30)) after a Django view returns a response. I've tried implementing the solutions suggested in similar questions:
How to execute code in Django after response has been sent
Execute code in Django after response has been sent to the client
However, when using a WSGI server like gunicorn, the client's page load only completes after the long-running function finishes.
Now that Django supports asynchronous views, is it possible to run a long-running query asynchronously?
Obviously, I am looking at the same issue: a view that starts a background task and sends a response to the client without waiting until the started task is finished.
As far as I understand, this is not one of the objectives of async views in Django. The problem is that all executed code is tied to the worker that was started to handle the HTTP request. Once the response has been sent back to the client, that worker can no longer run any code or task started asynchronously in the view. Therefore, all async functions require an "await" in front of them, and consequently the view will only send its response to the client once the awaited function has finished.
As I understand it, all background tasks must be pushed into a task queue from which another worker can pick up each new task. There are several solutions for this, like Django Channels or Django Q. However, I am not sure which is the most lightweight solution.
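For illustration, here is a rough sketch of that queue approach using Django Q; it assumes django-q is installed and configured in settings, that a worker cluster is running via python manage.py qcluster, and that long_running_job is a stand-in for the real work:

# tasks.py -- stand-in for the slow work
import time

def long_running_job(record_id):
    time.sleep(30)  # placeholder for the long-running query


# views.py
from django.http import JsonResponse
from django_q.tasks import async_task

def start_job(request):
    # Push the job onto the queue; a separate qcluster worker executes it,
    # so the response returns immediately.
    task_id = async_task("tasks.long_running_job", 42)
    return JsonResponse({"task_id": task_id})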

Back end tasks and frontend separation

I use Django and Django Rest Framework for my internal API and I use Vue.js for my frontend. The backend (API) and the frontend are totally separated.
I need to run a background task (every time a user is created) and I am considering 2 solutions:
1. Call (with a post_save signal) a function that runs the task. Note that this function will call a 3rd-party API. The call might fail for various reasons and/or run for a long period (~20 s).
2. Create a background task, with Redis or RabbitMQ or django-background-tasks.
Which solution should I go for?
If both solutions are acceptable, what would be the limitations/advantages of each one?
You might need Celery. This is a great package for background tasks in Django; you can choose either Redis or RabbitMQ as the broker, and in my opinion the choice of broker doesn't matter much here.
Why can this be a good solution for your problem?
It is easy to set up: you just install Celery and Redis (I prefer Redis), configure a few settings, and you now have async functions.
If you soon need a scheduled task, you only have to install an additional package for it.
You only need to write a plain function and attach a decorator to make it run asynchronously.
from celery import shared_task

@shared_task
def add(x, y):
    return x + y
and call it anywhere in your code:
add.delay(2, 3)
and it will run as a background task.
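For completeness, a rough sketch of the wiring behind "configure some settings"; the project name myproject and the Redis URL are placeholders for your own setup:

# settings.py (assumed values -- adjust to your environment)
CELERY_BROKER_URL = "redis://localhost:6379/0"

# celery.py, placed next to settings.py in the project package
import os
from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

app = Celery("myproject")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()

Then start a worker with celery -A myproject worker, and the decorated tasks become available for .delay() calls.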

How do I schedule my desktop application (PyQt) to send emails at regular intervals

So, I have built this nice application (using Camelot) and do not know how to implement a scheduled job on it that can send emails, based on some conditions, at regular intervals. I am trying to implement it using schedule, but I don't know how to make my app call it automatically.
Here is my code:
import time, schedule

def job():
    print("I'm working...")

schedule.every(10).seconds.do(job)

while True:
    schedule.run_pending()
    time.sleep(1)
When I start my application, nothing works. How do I make my application aware of this?
If possible, I would like this scheduled job to execute without even starting my application.
If you want to schedule jobs that will interact with the user, you have to use Camelot Actions, which have their own method for executing functions in the model thread, posting the result to the GUI thread, and back...
But you don't need that; you just need to run jobs (functions) that access the database, create emails, and send them, without interacting with the user. This can be done with a completely independent application, without a GUI.
In order to avoid creating two applications, you can change the behavior using a command line parameter. If you start the application without any params, it will open the GUI as usual, but if you run it with -background, it will only start the schedule loop.
Then you can hook your application into system startup, executing it with -background, and the schedule will run without requiring your users to start the application.
If a user later starts the application, you'll have two instances running: the first one running the schedule loop, and the second one with the GUI.
main.py:
if __name__ == '__main__':
    import sys
    if "-background" in sys.argv:
        import background
        background.main()
    else:
        main()  # Camelot's main
background.py:
from camelot.core.orm import Session
import time, schedule

def job():
    session = Session()
    # Use session to access and manipulate the model
    print("I'm working...")
    session.flush()  # flush session to DB

def main():
    from camelot.core.conf import settings
    settings.setup_model()
    schedule.every(10).seconds.do(job)
    while True:
        schedule.run_pending()
        time.sleep(1)
You should run the app with python main.py -background.

HTTP call after a Celery task has changed state

I need a scheduler for my next project, and since I'm coding using Django I went for Celery.
What I am looking for is a way for a task to tell Django when it is done, so I can update the database and use SSE to tell the user. All this can be done fairly simply by just putting all the logic into the task. But what do I do when I am planning to have several Celery workers?
I found a bunch of info online covering the single-worker case, but not much covering the problem if you have more than one worker.
What I thought about was using HTTP callbacks from the workers to the web server to let it know that the task is done. Looking at celery.task.http looked promising, but it didn't do what I needed.
Is the solution to use signals and hook up manual http calls? Or am I on the wrong path? Isn't this a common problem? How can this be solved more elegantly?
So, what do you mean by "tell Django"? If I understand you correctly, the Django request that initialized the Celery task is still alive when the task finishes? In that case you can check some storage (database, memcached, etc.) and send your SSE.
Look, there is one way to do that:
1. Your Django view sends the task to Celery, then goes into a loop (or a loop with a 60-second timeout?) and waits for the result in memcached.
2. Celery gets the task, executes it, and writes the result to memcached.
3. The Django view sees the new result, exits the loop, and sends your SSE.
A rough sketch of this polling variant is shown after these steps.
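Here is a rough sketch of that first variant, using Celery's result backend rather than raw memcached; my_task and the 60-second timeout are illustrative:

from celery.exceptions import TimeoutError
from django.http import JsonResponse

from myapp.tasks import my_task  # hypothetical Celery task


def start_and_wait(request):
    result = my_task.delay()
    try:
        # Block this view until the worker finishes or 60 seconds pass
        value = result.get(timeout=60)
    except TimeoutError:
        return JsonResponse({"status": "pending", "task_id": result.id})
    return JsonResponse({"status": "done", "result": value})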
The next variant is:
1. The Django view sends the task to Celery and returns immediately.
2. Celery executes the task and, after executing it, makes a simple HTTP request to your Django app.
3. Django receives the HTTP request from Celery, parses the params, and sends the SSE to your user.
Here is some code that seems to do what I want:
In django settings:
CELERY_ANNOTATIONS = {
    "*": {
        "on_failure": celery_handlers.on_failure,
        "on_success": celery_handlers.on_success,
    }
}
In the included celery_handlers.py file:
def on_failure(self, exc, task_id, *args, **kwargs):
    # Use urllib or similar to poke e.g. api-int.mysite.com/task_handler/TASK_ID
    pass

def on_success(self, retval, task_id, *args, **kwargs):
    # Use urllib or similar to poke e.g. api-int.mysite.com/task_handler/TASK_ID
    pass
And then you can just set up api-int to use something like:
from celery.result import AsyncResult
task_obj = AsyncResult(task_id)
# Logic to handle task_obj.result and related goes here....
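For illustration, one possible way to fill in the on_success hook with the standard library; the host and path mirror the placeholder in the comments above and should be replaced with your real internal endpoint:

try:
    from urllib.request import urlopen  # Python 3
except ImportError:
    from urllib2 import urlopen  # Python 2

def on_success(self, retval, task_id, *args, **kwargs):
    # Poke the internal API so the web process can look up the result
    # and push an SSE to the user.
    url = "https://api-int.mysite.com/task_handler/%s" % task_id
    try:
        urlopen(url, timeout=5)
    except Exception:
        # A failed callback should not crash the worker
        pass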