Triggering Celery Tasks from C++ with Redis

We have a setup with a web frontend written in Django and a backend written in C++ that parses data for us.
The frontend uses Celery in combination with Redis for asynchronous tasks.
Since it would be convenient in some situations, I was wondering today whether it is possible to trigger a Celery task from within C++.
Since there is a Redis client available for C++, I am pretty sure this is possible if the correct messages are sent to Redis; however, I was not able to find any information on this anywhere.
My next step would be to try to dig the needed information out of the Celery source code, but before I do that:
Does anybody have any information on this subject that could help me or get me started, or has anyone done this before?
Any help is appreciated. (Also, if you have a reason why this will not work, I would like to hear it.)
Thank you.

I had a similar need to trigger a celery task from logstash. Basically, I had to create a message that looked something like this:
{
    "body": "base_64_encoded_string (see below)",
    "content-type": "application/json",
    "properties": {
        "body_encoding": "base64",
        "correlation_id": "f009c9e0-0ca6-42a6-a046-3d0e53e06060",
        "reply_to": "e1eb91f0-6780-4c34-b633-7ef9a46baf5e",
        "delivery_mode": 2,
        "delivery_tag": "7788b924-a7fe-4c9a-839e-1c7ca602dbba",
        "delivery_info": {
            "priority": 0,
            "routing_key": "default",
            "exchange": "default"
        }
    }
}
In this case, the decoded body translates to:
{
    "args": ["meta_val", "doc_value"],
    "task": "goldstone.compliance.tasks.process_fim_event",
    "id": "23deb69e-49c1-4a61-8639-d4627d0fc591"
}
If your task takes keyword arguments, you can add a "kwargs": {"key": "value", ...} entry to the body.
The body above triggers a task called process_fim_event. The task definition looks like:
@task()
def process_fim_event(meta, doc):
    ...
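To make the wire format concrete, here is a minimal Python sketch of pushing such a message onto the queue with redis-py; a C++ client would LPUSH the same JSON string through its own Redis library. The queue name follows the routing_key above (Celery's stock default queue is actually named "celery"), and the identifiers are freshly generated, so treat the details as assumptions to check against your own Celery configuration.
import base64
import json
import uuid

import redis

# Build the base64-encoded task body, matching the decoded example above.
body = base64.b64encode(json.dumps({
    "args": ["meta_val", "doc_value"],
    "task": "goldstone.compliance.tasks.process_fim_event",
    "id": str(uuid.uuid4()),
}).encode()).decode()

# Wrap it in the envelope Celery's Redis transport expects.
message = {
    "body": body,
    "content-type": "application/json",
    "content-encoding": "utf-8",
    "properties": {
        "body_encoding": "base64",
        "correlation_id": str(uuid.uuid4()),
        "reply_to": str(uuid.uuid4()),
        "delivery_mode": 2,
        "delivery_tag": str(uuid.uuid4()),
        "delivery_info": {"priority": 0, "routing_key": "default", "exchange": "default"},
    },
}

r = redis.Redis(host="localhost", port=6379, db=0)
r.lpush("default", json.dumps(message))  # workers pop tasks from this list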

The easiest way of doing this that I know of is to use Flower, an HTTP API for Celery. With Flower you can create a task from anything that can make an HTTP request. One example from the GitHub README:
$ curl -X POST -d '{"args":[1,2]}' http://localhost:5555/api/task/async-apply/tasks.add
So the idea is that your C++ app would make an HTTP request against the Flower API, which would then insert the task into your Redis queue.

Related

Schedule task in Django using schedule package

I am trying to learn how to schedule a task in Django using the schedule package. Here is the code I have added to my view. I should mention that I only have one view, so I need to run the scheduler in my index view. I know there is a problem in the code logic: it only runs the scheduler and gets trapped in the loop. Can you tell me how I can use it?
def job():
    print "this is scheduled job", str(datetime.now())

def index(request):
    schedule.every(10).second.do(job())
    while True:
        schedule.run_pending()
        time.sleep(1)
    objs = objsdb.objects.all()
    template = loader.get_template('objtest/index.html')
    context = {'objs': objs}
    return HttpResponse(template.render(context, request))
You picked the wrong approach. If you want to schedule something that should run periodically, you should not do it within a web request. The request never ends because of the while loop, and browsers and web servers very much dislike this behavior.
Instead you might want to write a management command that runs on its own and is responsible for calling your tasks.
Additionally you might want to read Django - Set Up A Scheduled Job? - it also covers other approaches such as AMQP and cron. But those would replace your choice of the schedule module.
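As an illustration of the management-command suggestion, here is a minimal sketch; the file location <yourapp>/management/commands/run_scheduler.py, the command name, and the job body are made up for the example. You would start it separately from the web server with python manage.py run_scheduler.
import time
from datetime import datetime

import schedule
from django.core.management.base import BaseCommand


def job():
    print("this is a scheduled job", datetime.now())


class Command(BaseCommand):
    help = "Runs the schedule loop outside of any web request"

    def handle(self, *args, **options):
        schedule.every(10).seconds.do(job)
        while True:
            schedule.run_pending()
            time.sleep(1)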

How to show the progress status of a download in a web page in Django

I built an application in Django, and I have a function that gets the attendance log from a fingerprint machine, basically like this:
import requests
from xml.etree import ElementTree

def main(request):
    Key = "xxxxxx"
    url = "http://192.168.2.188:80"
    soapreq = "<GetAttLog><ArgComKey xsi:type=\"xsd:integer\">" + Key + "</ArgComKey><Arg><PIN xsi:type=\"xsd:integer\">All</PIN></Arg></GetAttLog>"
    http_headers = {
        "Accept": "application/soap+xml,multipart/related,text/*",
        "Cache-Control": "no-cache",
        "Pragma": "no-cache",
        "Content-Type": "text/xml; charset=utf-8"
    }
    response = requests.post(url + "/iWsService", data=soapreq, headers=http_headers)
    root = ElementTree.fromstring(response.content)
Now, that process will be repeated for a hundred-plus fingerprint machines, and I need to display some kind of progress status, and also error messages (if any, e.g. when a connection cannot be established), on a page, periodically and sequentially, after each event. I mean something like:
....
"machine 1 : download finished."
"downloading data from machine 2 .. Please wait "
"machine 2 : download finished."
...
Thanks.
I'm not really sure what the main hurdle you're facing is. Can you try framing the question in a more precise way?
From what I understand, you want a page that changes dynamically based on something that's happening on the server. Django may not be the best tool for that, as its basic use is to compute a full view in one go and then display it. However, there are a few things that can be done.
The easiest (but not the best in terms of server load) would be to use Ajax requests from the web page:
Have a "main view" that loads the user-facing page as well as some JS libraries (e.g. jQuery), which will be used to query the server for progress.
Have a "progress view" that displays the current status and that is queried from the main view via AJAX.
This is not the best architecture because you may end up reloading data from the server too often or not often enough. I would suggest having a look at WebSockets, which allow you to keep a client-server connection open and use it only when needed, but there is no native support for them in Django.
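A rough sketch of that main-view/progress-view split follows, assuming the download loop records its status in Django's cache under a key the progress view reads back; the key name, machine list, and view names are illustrative and not from the original post. The page would poll the progress view via AJAX, e.g. with jQuery's $.get on a timer.
import json

from django.core.cache import cache
from django.http import HttpResponse


def download_all(request):
    machines = ["192.168.2.188", "192.168.2.189"]  # assumed list of machine addresses
    for i, host in enumerate(machines, start=1):
        cache.set("fingerprint_progress",
                  "downloading data from machine %d .. Please wait" % i)
        # ... call the SOAP download shown above for `host` ...
        cache.set("fingerprint_progress", "machine %d : download finished." % i)
    return HttpResponse("done")


def progress(request):
    # Polled from the main page via AJAX; returns the latest status line.
    return HttpResponse(json.dumps({"status": cache.get("fingerprint_progress", "")}),
                        content_type="application/json")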

HTTP call after a Celery task has changed state

I need a scheduler for my next project, and since I'm coding in Django I went for Celery.
What I am looking for is a way for a task to tell Django when it is done, so I can update the database and use SSE to tell the user. All this can be done fairly simply by just putting all the logic into the task. But what do I do when I am planning to have several Celery workers?
I found a bunch of info online covering the single-worker case, but not much covering the problem if you have more than one worker.
What I thought about was using HTTP callbacks from the workers to the web server to let it know that the task is done. Looking at celery.task.http looked promising, but it didn't do what I needed.
Is the solution to use signals and hook up manual HTTP calls? Or am I on the wrong path? Isn't this a common problem? How can this be solved more elegantly?
So, what do you mean by "tell Django"? If I understand you correctly, the Django request that initialized the Celery task is still alive at the time the task finishes? In that case you can check some storage (database, memcached, etc.) and send your SSE.
Look, there is one way to do that:
1. Your Django view sends the task to Celery, after which it goes into a loop (or a loop with a 60-second timeout?) and waits for the result in memcached (see the sketch after these steps).
2. Celery gets the task, executes it, and puts the result into memcached.
3. The Django view sees the new result, exits the loop, and sends your SSE.
The other variant is:
1. The Django view sends the task to Celery and returns.
2. Celery executes the task and, after executing it, makes a simple HTTP request to your Django app.
3. Django receives the HTTP request from Celery, parses the params, and sends the SSE to your user.
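A minimal sketch of that first variant, assuming the task writes its result into Django's cache (backed by memcached) under a key derived from the task id; the task import, key format, and timeout are assumptions for illustration only.
import time

from django.core.cache import cache
from django.http import HttpResponse

from myapp.tasks import long_task  # hypothetical Celery task


def start_and_wait(request):
    result = long_task.delay()
    # Poll memcached for up to ~60 seconds; the task stores its result there
    # under this key when it finishes.
    for _ in range(60):
        value = cache.get("task-result-%s" % result.id)
        if value is not None:
            return HttpResponse(value)  # here you would send your SSE instead
        time.sleep(1)
    return HttpResponse(status=504)  # gave up waiting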
Here is some code that seems to do what I want:
In django settings:
CELERY_ANNOTATIONS = {
    "*": {
        "on_failure": celery_handlers.on_failure,
        "on_success": celery_handlers.on_success
    }
}
In the celery_handlers.py file that is included:
def on_failure(self, exc, task_id, *args, **kwargs):
    # Use urllib or similar to poke eg; api-int.mysite.com/task_handler/TASK_ID
    pass

def on_success(self, retval, task_id, *args, **kwargs):
    # Use urllib or similar to poke eg; api-int.mysite.com/task_handler/TASK_ID
    pass
And then you can just set up api-int to use something like:
from celery.result import AsyncResult
task_obj = AsyncResult(task_id)
# Logic to handle task_obj.result and related goes here....
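For completeness, the "poke" in the handlers above could look something like the following sketch, reusing the api-int.mysite.com/task_handler/TASK_ID endpoint from the comments; the exact URL scheme and the use of requests are assumptions.
import requests


def on_success(self, retval, task_id, *args, **kwargs):
    # Notify the web app that the task finished; it can then look the task up
    # via AsyncResult(task_id) as shown above and push an SSE to the user.
    requests.post("http://api-int.mysite.com/task_handler/%s/" % task_id)


def on_failure(self, exc, task_id, *args, **kwargs):
    # Same endpoint; the handler on the web side can inspect the task state.
    requests.post("http://api-int.mysite.com/task_handler/%s/" % task_id)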

django-notification: How can I send emails using threading?

In my Django application I am using django-notification to send notifications. However, I noticed that in some cases (when sending multiple notifications) my web application gives delayed responses. Although I am sending notifications through Ajax requests, I still think it would be best if I could use the mailtools library, which provides threaded emails.
Has anyone implemented such a thing? Is it easy? How can I use ThreadedMailer from mailtools in django-notification?
Or is there another alternative?
Use Celery for this purpose. It's easy to set up with Django and you can keep using the code you have right now.
The Ajax request puts the email into the task queue and returns. You could return your task id if you want to check later whether the task succeeded.
Update:
Celery simply lets you call your functions in the background. Say in the Ajax view you called:
send_email(…)
Now in tasks.py you should define the function:
@task
def send_email(…)
And in the view you will call it with:
send_email.delay(…)
And that's it. The email will be sent by the background worker daemon using your existing Python code.
This doesn't make django-notification obsolete. Celery does a completely different thing and can be used with any lib you can imagine.
The only change is that task arguments have to be picklable. That means you have to pass db ids, not whole objects, etc.
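Put together, a minimal sketch of that pattern could look like the following; the task name, its arguments, and the sender address are illustrative, and shared_task is the modern spelling of the @task decorator.
# tasks.py
from celery import shared_task
from django.contrib.auth.models import User
from django.core.mail import send_mail


@shared_task
def send_email(recipient_id, subject, body):
    # Pass plain, picklable values (ids and strings), not model instances.
    user = User.objects.get(pk=recipient_id)
    send_mail(subject, body, "noreply@example.com", [user.email])
The Ajax view would then call send_email.delay(user.pk, subject, body) and return immediately.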

Creating a web URL that listens to a Redis pub/sub published message

Edit
OK, I have long polling from JavaScript that talks to a Django view. The view looks as follows. It loses some of the messages that I publish from the Redis client on the channel. Also, I should not be connecting to Redis for every request (perhaps the Redis variables can be saved in the session?).
If someone can point out the changes I need to make this view work with long polling, it would be awesome! Thank you!
def listen(request):
    if request.session:
        logger.info('request session: %s' % (request.session))
    channel = request.GET.get('channel', None)
    if channel:
        logger.info('not in cache - first time - constructing redis object')
        r = redis.Redis(host='localhost', port=6379, db=0)
        p = r.pubsub()
        logger.info('subscribing to channel: %s' % (channel))
        p.psubscribe(channel)
        logger.info('subscribed to channel: %s' % (channel))
        message = p.listen().next()
        logger.info('got msg %s' % (message))
        return HttpResponse(json.dumps(message))
    return HttpResponse('')
----Original question---
I am trying to create a chat application (using Django and Python) and am trying to avoid the polling mechanism. I have been struggling with this, so any pointers would be really appreciated!
Since WebSockets are not supported in most browsers, I think long polling is the right choice. Right now I am looking for something that scales better than regular polling and is easy to integrate with the Python/Django stack. Once I am done with this development, I plan to evaluate other Python frameworks (Tornado, Twisted, gevent, etc. come to mind).
I did some research and liked the Redis pub/sub mechanism. The chat message gets published to a channel to which both users have already subscribed. My questions are the following:
From what I understand, Apache would not scale well, since long polling would soon run into process/thread limits. Hence I have decided to switch to nginx. Is this rationale correct? Also, are there any issues with nginx that I should be worried about? In particular, I am worried about the latest version not supporting HTTP 1.1 for proxy passing, as mentioned in the blog post at http://www.letseehere.com/reverse-proxy-web-sockets.
How do I create the client portion of the subscription of messages on the browser side? In my mind, it would be a URL to which the JavaScript code would "long poll". So at the JavaScript level, the client would poll a URL which gets "blocked" in a "non-blocking way" at the server side. When a result (in this case a new chat message) appears, the server returns the result. JavaScript does what it needs to and then polls the same URL again. Is this thinking correct? What happens in the intervals when the JavaScript loop is pausing? Do we lose any messages from the server side?
In essence, I want to create the following:
From Redis, I publish a message to a channel "foo" (I can use redis-cli too; it is easy to incorporate it later in Python/Django).
I want the same message to appear in two browser windows that use the same JS code to poll. Assume that the browser code knows the channel name, for testing purposes.
I publish a second message that again appears in both browser windows.
I am new to real-time apps, so apologies for any question that may not make sense.
Thank you!
Well, answering your question only partly and mentioning one option out of many: Gunicorn used with an async worker class is a solution for long-polling/non-blocking requests that is really easy to set up!
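As a sketch of what that setup might look like, a gunicorn.conf.py along these lines would serve the Django app with gevent workers; the worker count and timeout are illustrative, and the gevent dependency must be installed separately.
# gunicorn.conf.py
bind = "127.0.0.1:8000"
worker_class = "gevent"  # async workers, so long-held polling requests do not each pin a process
workers = 4
timeout = 120            # allow connections to stay open longer than the default 30s
It would then be started with something like gunicorn -c gunicorn.conf.py myproject.wsgi:application, with nginx proxying to it.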