Django: Send regular updates over websocket

We are currently developing an application with Django and django-omnibus (websockets).
We need to send regular updates from the server (Django) to all connected clients via websockets.
The problem is that we can't use cron or anything similar to do the work. I've written a manage.py command, but due to some limitations it seems omnibus can't send the message to the websocket when launched by python manage.py updateclients.
I know that Django is not designed for this kind of thing, but is there a way to send the updates from within the running Django instance itself?
Thanks for your help!

Is the reason you can't use cron that your hosting environment doesn't have cron? Or is it that "it seems omnibus can't send the message to the websocket when launched by python manage.py"?
If you simply don't have cron then you need to find an alternative such as APScheduler (https://apscheduler.readthedocs.org/en/latest/); Celery also provides scheduled tasks.
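For the scheduler route, a minimal APScheduler sketch might look like this (assuming it starts inside the long-lived Django process, e.g. from an AppConfig.ready(); the 30-second interval is illustrative):
# Run the updateclients command on a schedule, inside the Django process
from apscheduler.schedulers.background import BackgroundScheduler
from django.core.management import call_command

scheduler = BackgroundScheduler()
scheduler.add_job(lambda: call_command('updateclients'), 'interval', seconds=30)
scheduler.start()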
But if the problem is the other side, "a way to send the updates within the running Django instance itself", then I would suggest a simple option: add an HTTP API to your Django app.
For example:
# views.py
from django.core.management import call_command
from django.http import HttpResponse

def update_clients(request):
    call_command('updateclients')
    return HttpResponse(status=204)
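You also need to wire the view into your URLconf; a sketch (the path matches the curl call below, using a modern path() for brevity):
# urls.py
from django.urls import path
from . import views

urlpatterns = [
    path('internalapi/update_clients', views.update_clients),
]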
Then in your crontab you can do something like:
curl 127.0.0.1/internalapi/update_clients
...and this way your updateclients code can run within the Django instance that has the active connection to the omnibus tornado server.
You probably want some access control on this URL, either via your webserver or something like this:
https://djangosnippets.org/snippets/2095/
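One possible in-app check (a sketch of the idea, not the linked snippet): reject anything not coming from loopback, since cron calls curl on 127.0.0.1.
# Simple guard for the internal endpoint (illustrative only)
from django.http import HttpResponseForbidden

def localhost_only(view_func):
    def wrapper(request, *args, **kwargs):
        if request.META.get('REMOTE_ADDR') != '127.0.0.1':
            return HttpResponseForbidden()
        return view_func(request, *args, **kwargs)
    return wrapper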

Related

Django, FastAPI and DRF

I want to make a project which uses Django as the backend, PostgreSQL as the database, and FastAPI together with Django REST Framework for REST.
I don't see any problems with making a project with just Django, DRF and Postgres, but I face difficulties when it comes to FastAPI and DRF at the same time.
There is no problem connecting Postgres to Django, and no problem making endpoints for DRF.
But how can I connect FastAPI? Where do I place the endpoints, and how do I run all this stuff together?
In some examples I saw that the FastAPI instance is created in wsgi.py and then the server runs by calling a command like this:
uvicorn goatfish.wsgi:app
But I am not sure that it works like this when I mix more than just Django and FastAPI.
I want to use FastAPI for optical character recognition and DRF for user registration, logins, etc.
Is there any advice on structuring a project like this? Or maybe someone has a repository with this kind of project on GitHub?
EDIT: I hope to see answers here, but for now I only see the solution of making a classic Django + DRF app, then making a FastAPI app with endpoints, running these apps on different ports, and doing a chain of actions:
From the Django app we load an image into a form, and when we submit this form we send a POST request to the FastAPI endpoint, which runs the OCR process, returns JSON with the recognized text, and then sends this JSON to the Django callback endpoint, which processes it and saves it to the database.
What do you think about such a setup?
I think you may:
1. Mix FastAPI and Django, but only to replace DRF and use FastAPI as the REST framework (see the sketch after this list).
2. Microservices. Run Django on one port and FastAPI on another; both can use one shared database.
3. Microservices. Everything from point 2, but with some API tasks (for example sign-in/sign-up) on Django and the rest on FastAPI.
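For option 1, one way to wire the two together (a sketch only; the goatfish module name comes from the question and the routes are illustrative) is to mount the Django WSGI app inside FastAPI:
# goatfish/wsgi.py: FastAPI owns the process, Django is mounted under it
import os

from django.core.wsgi import get_wsgi_application
from fastapi import FastAPI
from fastapi.middleware.wsgi import WSGIMiddleware

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'goatfish.settings')
django_app = get_wsgi_application()

app = FastAPI()

@app.get('/api/ping')  # illustrative FastAPI-native route
def ping():
    return {'status': 'ok'}

# Everything FastAPI doesn't handle falls through to Django (admin, DRF, ...)
app.mount('/', WSGIMiddleware(django_app))
Then uvicorn goatfish.wsgi:app serves both, as in the question.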
As an alternative you could use django-ninja.
It uses similar concepts to FastAPI (routes, Pydantic model validation, async support) but is a Django app.
That is of course more the monolithic approach, but still valid.
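A minimal django-ninja sketch (assuming pip install django-ninja; the endpoint and schema are illustrative):
# api.py
from ninja import NinjaAPI, Schema

api = NinjaAPI()

class ItemIn(Schema):
    name: str

@api.post('/items')
def create_item(request, payload: ItemIn):
    # Pydantic-style validation has already run by the time we get here
    return {'name': payload.name}

# urls.py would then include: path('api/', api.urls)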
Well, after a few days of thinking about it, I decided that there is no sense in the question I asked :)
Here we should talk about microservice architecture, where this kind of problem just doesn't exist. All we need is to make as many services as our project needs, using any framework (Django, FastAPI, Flask, Ruby, etc.), and make connections between them.
As an example, I can run my main Django server on port 8000, my FastAPI server on port 5000 and my DRF service on port 6000. Then I can do whatever I want from my main Django server by making requests to the FastAPI and DRF endpoints.
That's a very simple example; now I'm diving deeper into microservice architecture, but that's definitely what I need.
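The Django-side call in that setup could be as small as this (a sketch; the port, path and payload shape are assumptions matching the OCR flow described in the edit above):
# Django view that forwards an uploaded image to the FastAPI OCR service
import requests
from django.http import JsonResponse

FASTAPI_OCR_URL = 'http://127.0.0.1:5000/ocr'  # hypothetical endpoint

def submit_image(request):
    image = request.FILES['image']
    resp = requests.post(
        FASTAPI_OCR_URL,
        files={'file': (image.name, image.read())},
    )
    resp.raise_for_status()
    # Save or process the recognized text, then respond to the user
    return JsonResponse(resp.json())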

Proper way to run nameko service(s) inside a Django application?

If I have a vanilla Django REST Framework (DRF) application and I would like to integrate a nameko service (specifically an event_handler event-listening service), what's the best way to achieve this?
I cannot simply nameko run a service if it's part of a Django application.
I'm considering running the nameko service via a custom Django management command, but would I lose some of nameko's features, say, scalability? E.g. nameko maintains a pool of 10 workers per nameko run (if I remember correctly).
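For reference, such a management command might look roughly like this (a sketch, assuming nameko 2.x and a hypothetical MyEventHandlerService; nameko run uses the same ServiceRunner machinery internally):
# myapp/management/commands/run_nameko.py (hypothetical path)
from django.core.management.base import BaseCommand
from nameko.runners import ServiceRunner

from myapp.services import MyEventHandlerService  # hypothetical service class

class Command(BaseCommand):
    help = 'Run the nameko event-handler service inside the Django process'

    def handle(self, *args, **options):
        config = {'AMQP_URI': 'amqp://guest:guest@localhost'}
        runner = ServiceRunner(config)
        runner.add_service(MyEventHandlerService)
        runner.start()
        runner.wait()  # block until the runner is stopped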
This is how:
https://github.com/sivabudh/djanko/blob/master/services.py
See: django-nameko-standalone
Update: If you want to do microservices with Django, just use Celery. Works like a charm.

Consume kafka messages from django app

I'm designing a Django-based web application capable of serving, via websockets, data that needs to be consumed from a Kafka topic.
At this time, I came up with a solution split into two components: one component consumes from Kafka, performs some basic operations on the retrieved data, and sends the result to the Django app using an HTTP request.
After the request has been received, a message is written to a specific Django channel.
Is there a better architecture for this kind of scenario? Should I enclose all the Kafka logic in a "while True" loop inside a Celery async task? Should I spawn a new process when Django starts? If so, can I still use Django signals to send the data via websocket?
Thanks,
Fb
Yes, you can reuse your Django code/repository and build a separate app/program to deal with the Kafka queue and the database through the Django ORM.
Just add, at the beginning of this program, code like:
import os
import sys

import django

sys.path.append(os.getcwd())
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "<your_app>.settings")
django.setup()
and then you can use your models in this program, like:
from <your_app>.models.timeslots import TimeSlotReserve
It's also a good idea to add some multithreading to this separate app.
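Putting it together, a standalone consumer might look like this (a sketch assuming the kafka-python package; the project name, topic and model field are placeholders):
# consumer.py: standalone process feeding Kafka messages into the Django ORM
import json
import os
import sys

sys.path.append(os.getcwd())
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')  # placeholder

import django
django.setup()

from kafka import KafkaConsumer  # pip install kafka-python
from myproject.models.timeslots import TimeSlotReserve  # placeholder model

consumer = KafkaConsumer(
    'timeslots',  # placeholder topic name
    bootstrap_servers='localhost:9092',
    value_deserializer=lambda v: json.loads(v.decode('utf-8')),
)

for message in consumer:
    # Each consumed message becomes a row via the Django ORM
    TimeSlotReserve.objects.create(payload=message.value)  # placeholder field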

How can I run multiple apps that accept requests in Heroku?

Is it possible to configure Heroku where one has two processes that accept HTTP requests?
I would like to run one traditional request/response process (perhaps a Django Gunicorn process), and also run a NodeJS service that provides web-sockets. It would be nice if I could configure Heroku to work to a routing pattern like this:
ws/ # NodeJS websocket process
* # Django Process
Where any request with a URL beginning with ws/ gets routed to the NodeJS websocket process, and everything else gets routed to Django.
Heroku gives an example of something similar.
https://devcenter.heroku.com/articles/realtime-polyglot-app-node-ruby-mongodb-socketio
But I really do not like this approach and will only consider it as a last resort. The issue is that the NodeJS process and the Rails process are in separate Heroku apps, which will cause hassle when it comes to billing, versioning and staging.

python script instead of browser as client for web socket to django server

I want to have a Python script instead of a browser as the client, and the client should be able to communicate with the server via websockets.
Basically I need the server to notify a Python script on some other computer that some new information has arrived. I don't want my client to poll all the time.
Is this possible? Can anybody suggest some reading on how to do this?
Sure, it's possible. One option for Python is Autobahn (https://github.com/tavendo/AutobahnPython), which supports Python 2/3 and both Twisted and asyncio. It lets you write WebSocket clients.
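A minimal asyncio-based Autobahn client sketch (the host, port and path are assumptions; the server pushes and the client just listens):
# client.py: long-lived WebSocket client that waits for server pushes
import asyncio

from autobahn.asyncio.websocket import (
    WebSocketClientFactory,
    WebSocketClientProtocol,
)

class NotifyClient(WebSocketClientProtocol):
    def onOpen(self):
        print('connected, waiting for notifications...')

    def onMessage(self, payload, isBinary):
        if not isBinary:
            print('server says:', payload.decode('utf-8'))

factory = WebSocketClientFactory('ws://127.0.0.1:8000/ws')  # assumed URL
factory.protocol = NotifyClient

loop = asyncio.get_event_loop()
loop.run_until_complete(loop.create_connection(factory, '127.0.0.1', 8000))
loop.run_forever()  # stay connected so the server can push at any time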