Proper way to run nameko service(s) inside a Django application?

If I have a vanilla Django REST (DRF) application and I would like to integrate a nameko service (specifically an event_handler event listening service), what's the best way to achieve this?
I cannot simply nameko run a service if it's part of a Django application.
I'm considering running the nameko service via a custom Django management command, but would I lose some of nameko's features, such as scalability? E.g. nameko maintains a pool of 10 workers per nameko run (if I remember correctly).

This is how:
https://github.com/sivabudh/djanko/blob/master/services.py
See: django-nameko-standalone
Update: If you want to do microservices with Django, just use Celery. Works like a charm.
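If you do want to keep nameko, here is a minimal sketch of the management-command approach from the question, run through nameko's ServiceRunner. The service class, event source/name and AMQP URI below are placeholders, not part of the original answer; running the container this way keeps nameko's normal per-service worker pool (max_workers, 10 by default).

    # myapp/management/commands/run_nameko.py - a sketch, not a drop-in solution
    from django.core.management.base import BaseCommand
    from nameko.events import event_handler
    from nameko.runners import ServiceRunner


    class PaymentListener:
        name = "payment_listener"  # hypothetical service name

        @event_handler("payments", "payment_received")  # hypothetical source service/event
        def handle_payment(self, payload):
            # Django is already configured by the time the command runs,
            # so models are importable here.
            print("received event payload:", payload)


    class Command(BaseCommand):
        help = "Run the nameko event listener inside the Django project"

        def handle(self, *args, **options):
            config = {"AMQP_URI": "amqp://guest:guest@localhost:5672//"}  # placeholder broker
            runner = ServiceRunner(config)
            runner.add_service(PaymentListener)
            runner.start()
            runner.wait()  # blocks until the runner is stopped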

Related

Django auth with redis

I'm trying to plug my Django application into a microservice architecture.
Authentication is handled by a component written in Spring Boot. It stores sessions in Redis in the following format:
"spring:session:sessions:71f06a1d-b169-4bb9-a4c8-013bb82742ee"
Is it possible to configure some existing Django library (e.g. by configuring the key namespace) to hook this into the built-in auth system? I would like to avoid writing the whole thing from scratch.
Any help will be welcome.
I managed to solve the problem - that kind of functionality is provided by custom authentication:
http://www.django-rest-framework.org/api-guide/authentication/#example
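For illustration, a rough sketch of what such a custom authentication class might look like when it trusts the Spring session stored in Redis. The cookie name, the way the session id maps onto the key prefix above, and the attribute holding the username are all assumptions - Spring Session serializes attributes as Java objects by default, so in practice you may need to store the principal in a plain string attribute or share it some other way.

    # authentication.py - sketch only; adjust cookie name, key layout and attribute names
    import redis
    from django.contrib.auth.models import User
    from rest_framework import authentication, exceptions

    redis_client = redis.Redis(host="localhost", port=6379)  # placeholder connection


    class SpringSessionAuthentication(authentication.BaseAuthentication):
        def authenticate(self, request):
            session_id = request.COOKIES.get("SESSION")  # assumed cookie name
            if not session_id:
                return None  # let other authentication classes run

            key = "spring:session:sessions:%s" % session_id
            session = redis_client.hgetall(key)  # Spring Session keeps one hash per session
            if not session:
                raise exceptions.AuthenticationFailed("Invalid or expired session")

            # Assumed plain-string attribute holding the username.
            username = session.get(b"sessionAttr:principal", b"").decode()
            try:
                user = User.objects.get(username=username)
            except User.DoesNotExist:
                raise exceptions.AuthenticationFailed("No such user")
            return (user, None)

You would then add the class to DEFAULT_AUTHENTICATION_CLASSES in the REST_FRAMEWORK setting.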

Is django server based on blocking code

I read somewhere that Django is blocking code. Does that mean that when I deploy my code, the server will only be able to serve one request at a time? Do I need to use some other framework like Tornado to solve this? Is Django only meant for development and debugging? If it does not work for deployment, why not use node.js or some other framework?
It means each thread or process can only serve one request at a time. But your server is normally configured to spin up multiple processes.
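As an illustration, here is a sketch of a gunicorn config that spins up several worker processes (the numbers and project name are placeholders); you would start it with something like gunicorn myproject.wsgi -c gunicorn.conf.py:

    # gunicorn.conf.py - illustrative values only
    bind = "0.0.0.0:8000"
    workers = 4   # separate processes, each handling one request at a time
    threads = 2   # optional threads per worker for extra concurrency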

Running a meteor app as part of a wider django project

I'm currently working on a project which would require some realtime functionalities such as Multi-user chatrooms etc.
Ideally, I'm looking to have Meteor run the chat application (on a different port) and MongoDB act as a message broker to the Django back-end, which would take care of user registration, management and everything 'non-realtime' related.
This would involve setting up a reverse proxy which would redirect to a different port based on the URL (please let me know if I'm wrong about this).
Would this be possible (or even advisable)? Another option would be to implement the same with Tornado, but I have no experience building Tornado-based apps and would rather do this with a framework I'm comfortable with.
Thanks,
You can have Django serve the Meteor front-end while providing access to its data using django-ddp, giving you some distinct advantages:
Continue to serve your existing Django project/apps.
No extra services or ports to manage.
Scale out by simply adding more front-end Python/Django servers (server to server IPC is done via the existing database connection).
Use django.contrib.auth user accounts in your Meteor app.
Familiar Python/Django code (no "callback" style such as with Tornado).
Use time-tested, trusted relational databases.
Use Django migrations to effectively manage schema changes.
There's a Gitter chat room where I can give you assistance if you need it.
DISCLAIMER: I'm the author of django-ddp.
A meteor application is more than capable of handling the user registration flow and many other things. Why not just build the application entirely in meteor? Your application sounds like a perfect candidate for meteor, with realtime interaction with your database at the core.
The other option would be to use SwampDragon, which adds realtime data binding within Django. It allows for simple bidirectional communication between the server and the client - again, essential for a chat application. It's nice and easy to get set up and running as well.
Are there any specific reasons not to implement your application in one framework alone?

Apache/wsgi/django configuration

We have two django applications running on the same server that interact with an API that uses oauth. They function as expected, communicating with each other, when run under the django development server. However, when deployed using apache/wsgi they don't work together.
(To be more specific, one application is an instance of the Indivo server; the other one is a custom application that interacts with Indivo.)
What is the best way to troubleshoot this?
Make sure that the Django instances are working by themselves first. For example, one app could be started under Apache, and the other using ./manage.py runserver. Reverse which one is running using Apache and verify that all works as expected.
Use the Apache error logs to look for errors such as failed requests.
Since one of your apps appears to implement a web API, use something like the Google Chrome Postman App to exercise the site from a web browser.
Learn how to use the Django logging framework to log information about your apps as they execute.
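As an example of the last point, here is a sketch of a LOGGING setting that sends your apps' messages to a file (the app label and log path are placeholders):

    # settings.py - sketch of a logging configuration
    LOGGING = {
        "version": 1,
        "disable_existing_loggers": False,
        "handlers": {
            "file": {
                "class": "logging.FileHandler",
                "filename": "/var/log/myapp/django.log",  # placeholder path
            },
        },
        "loggers": {
            "django.request": {"handlers": ["file"], "level": "ERROR"},
            "myapp": {"handlers": ["file"], "level": "DEBUG"},  # hypothetical app label
        },
    }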

How to use Tornado work with Django? Is it a good solution?

I have a huge Django project and have to use the Instagram API and its subscriptions model. For the subscriptions, my server has to be very responsive and ready to work asynchronously, setting up a hook to receive notifications once the user posts - or that's what the documentation suggests. Would it be a good idea to use Tornado there, just for that small part, or can I do it using Django in an effective way? If so, how?
You can use the WSGI container on top of Tornado to host any WSGI application, including Django, however, when you do that the WSGI application is still running as a blocking application and will not magically be running as an asynchronous application. So, when Django is handling a request there is no ability to handle another request at the same time within Django. The solution at that point is not much different to running a single threaded WSGI server and you would need to have multiple Tornado instances to handle concurrent requests.
So all really depends on what you mean by asynchronous. You certainly can't make use of Tornado's direct asynchronous programming API in Django. Thus there isn't really any great benefit from using Tornado with Django via the WSGI interface.
As I understand it, you are talking about this paragraph in the Instagram docs:
You should build your system to accept multiple update objects per payload - though often there will be only one included. Also, you should acknowledge the POST within a 2 second timeout--if you need to do more processing of the received information, you can do so in an asynchronous task.
That's another type of "asynchronous" that Tornado provides.
I think Django + Celery will suit this better.
Your application will work in this way:
1. You receive JSON data from Instagram.
2. Create a Celery task, e.g. instagram_process.delay(request.raw_post_data) or instagram_process.delay(request.body), depending on your Django version.
3. Respond to Instagram with a 200 status code.
4. In the instagram_process task you do all your processing - parse the JSON, store it in the database, and anything else you need.
If you want to check X-Hub-Signature you can either do it between steps 1 and 2, or pass this header to the task and verify the signature at step 4.
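A rough sketch of that flow, assuming Celery is already configured for the project; the task name, view wiring and payload handling are placeholders:

    # views.py / tasks.py - sketch of the acknowledge-then-process flow
    import json

    from celery import shared_task
    from django.http import HttpResponse
    from django.views.decorators.csrf import csrf_exempt


    @shared_task
    def instagram_process(raw_body):
        updates = json.loads(raw_body)
        for update in updates:
            # parse the update, store it in the database, fetch media, etc.
            pass


    @csrf_exempt
    def instagram_hook(request):
        # hand the payload off to Celery and return within the 2 second window
        instagram_process.delay(request.body)
        return HttpResponse(status=200)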
You can use tornado.wsgi to integrate Tornado with other WSGI compliant frameworks. Check out this demo project for details:
https://github.com/bdarnell/django-tornado-demo
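The core of that demo boils down to something like this sketch (the settings module and port are placeholders); note that Django still handles each request synchronously inside the container, as described above:

    # run_tornado.py - sketch based on the tornado.wsgi approach
    import os

    import tornado.httpserver
    import tornado.ioloop
    import tornado.wsgi
    from django.core.wsgi import get_wsgi_application

    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")  # placeholder

    container = tornado.wsgi.WSGIContainer(get_wsgi_application())
    server = tornado.httpserver.HTTPServer(container)
    server.listen(8888)
    tornado.ioloop.IOLoop.current().start()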