Django API beyond simple data handling

I have a Django application that handles the model logic and data entry through the admin site.
The same project also contains a Python file (scriptcl.py) that uses the model data to perform heavy calculations that take some time to process, for example 5 seconds.
I have migrated the project to the cloud and now I need an API that calls this file (scriptcl.py) with parameters, runs the computation according to those parameters and the data in the DB (maintained in the admin), and then responds with the result.
All the Django REST Framework (DRF) examples I've seen so far only cover authentication and CRUD data handling (Create, Read, Update, Delete).
Could anyone suggest an idea to approach this?

In my opinion the correct approach would be to use Celery to perform these calculations asynchronously.

Write a class that inherits from DRF's APIView, which handles authentication; put in whatever logic you want or call whichever function you need, get the final result, and send back a JsonResponse. But, as you mentioned, if the API takes more time to respond, then you might have to think of something else, like handing back a request_id and hitting the server with that request_id every 5 seconds to get the data.

Just to give some feedback on this: the approach I took was to build another API using Flask and plain Python scripts.
I also used SQLAlchemy to access the database and retrieve the necessary data.

Related

Django Channels 2.x or Ajax?

I have a jQuery DataTable with lots of data (10,000+ rows) that can be used/updated in real time by many users, and I need that table to be refreshed and up to date all the time. I've tried Ajax calls every 1 second, and I figured that that's not good, since every user is sending an Ajax request every second, times 50 or more users, and I think there must be a better solution. I've been thinking of using Django Channels, or something similar, and I was wondering: is that possible? Basically I need to set up some kind of listener on the model (database), and pull the new data on change (that is almost every second). I would love to hear an opinion from someone who has more experience than I do. Thank you in advance.
P.S. I've been told to use WebSockets for that.
In short, yes, WebSockets would be the suggested way to do this, provided all the updates to the models you need to subscribe to are being made through Django's ORM.
I would suggest looking into (or taking inspiration from) DjangoChannelsRestFramework as a starting point for subscribing to multiple models in your db.
This library is set up to let you subscribe to multiple object types over a single WebSocket connection and get WS messages whenever those objects change. It also provides some tools to let you reuse your existing DRF views over the WebSocket.
Disclaimer: I am the author of the DjangoChannelsRestFramework.
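Whichever library you pick, the underlying pattern is a publish/subscribe broadcast keyed on the model. Here is a pure-Python sketch of that idea; subscribe/notify_saved are made-up names, and in Channels the broadcast role would be played by group_send over a channel layer:

```python
# Listeners subscribe to a model; every save broadcasts the change to
# all of them. In a real project you would call notify_saved() from a
# post_save signal handler and push over a WebSocket instead of
# invoking plain callbacks.
from collections import defaultdict

_subscribers = defaultdict(list)  # model name -> list of callbacks

def subscribe(model_name, callback):
    """Register a listener (e.g. one connected WebSocket client)."""
    _subscribers[model_name].append(callback)

def notify_saved(model_name, instance_data):
    """Broadcast a change to every subscriber of that model."""
    for callback in _subscribers[model_name]:
        callback(instance_data)
```

This is why one save fans out to 50 clients without 50 polling requests: the work happens once, on change, instead of once per client per second.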

Django - Update a Model every couple of minutes

I am building an app, where I need to fetch some data from an API and update all the models with that data every few minutes.
What would be a clean way to accomplish something like this?
Well, it's quite an open question.
You'll need to create a task that runs every few minutes; you can do this with Celery. Celery has a task scheduler (http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html) which will launch a given function at a configured time, similar to a crontab.
The task would then fetch the data; http://docs.python-requests.org/en/master/ is a very good library for making HTTP requests.
And last but not least, you would need to serialize the fetched data and save it to your models. Django REST Framework's serialization capabilities are a great starting point, but if the data structure is simple enough you can just use Python's built-in json library (json.loads(data)) and write a function that translates the fields of the API to the fields of the model.
By the way, I'm assuming a REST API.
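The last step, translating API fields to model fields, can be sketched like this; the field names and FIELD_MAP are made up for illustration:

```python
# Map remote API field names onto model field names before saving.
# FIELD_MAP and the sample fields are hypothetical.
import json

FIELD_MAP = {"temp_c": "temperature", "ts": "recorded_at"}

def translate(payload):
    """Parse the API's JSON and rename fields to match the model."""
    data = json.loads(payload)
    return {FIELD_MAP.get(key, key): value for key, value in data.items()}
```

The resulting dict can then be passed straight to YourModel.objects.update_or_create() inside the periodic task.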
You can use a task management tool that supports running periodic tasks at whatever interval you specify, like Celery's Periodic Tasks.
Also, if you run your code on a Unix-like system, you can stick with core Django functionality: write your logic as a Django management command and set up a cron job to run it at your preferred interval.
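For the cron route, the crontab entry might look like the line below; the fetch_data command name and the paths are hypothetical:

```
*/5 * * * * /path/to/venv/bin/python /path/to/project/manage.py fetch_data
```

Using the virtualenv's Python directly avoids having to activate the environment inside cron's minimal shell.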

Session based rollback through tastypie API call example code?

Let's say I have django.contrib.sessions.middleware.SessionMiddleware installed in Django, and I'm using the SessionAuthentication class for API authentication in Tastypie. Within a session I make some changes to models through my API, and after that I want to roll back. Can I do it through Tastypie? If yes, what method should I call? I can't find such a method in the Tastypie docs. Do you have a working example of that?
Django supports database transactions, which will commit multiple state changes atomically. (Documentation...)
It is unclear from your question how you want to trigger the rollback. One option is to use per-request transactions, which roll back if an unhandled exception is raised by the view function. If you want more fine-grained control, familiarize yourself with the options in the linked documentation. For example, you may explicitly create a transaction and then roll it back inside your view.
With respect to Tastypie, you may need to place your transaction management inside the appropriate method on the Resource interface.
I hope this gives you some pointers. Please update your question with more details if necessary.
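To illustrate the commit-or-rollback semantics, here is the same behaviour with plain sqlite3 from the standard library; Django's transaction.atomic wraps the equivalent mechanism on whatever database you have configured:

```python
# Demonstrates atomic commit vs. rollback-on-exception, which is what
# transaction.atomic gives you around a Django view.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE item (name TEXT)")

# Commit path: the write is saved when the block exits cleanly.
with conn:
    conn.execute("INSERT INTO item VALUES ('a')")

# Rollback path: an exception inside the block undoes the write,
# just as an unhandled exception in a view rolls back the request
# transaction.
try:
    with conn:
        conn.execute("INSERT INTO item VALUES ('b')")
        raise ValueError("simulated view error")
except ValueError:
    pass

rows = [row[0] for row in conn.execute("SELECT name FROM item")]
```

Only 'a' survives: the second insert was rolled back when the exception propagated out of the block.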
So you want to commit changes to your models to the database, and then roll them back on a future request? That's not something that Tastypie supports (or, for that matter, Django or SQL). It's not really possible to do a clean rollback like that, considering other requests could have interacted with, changed, or built relationships with those objects in the meantime.
The best solution would probably be to integrate something like Reversion that would allow you to restore objects to a previous state.
If you want to be able to roll back all of the operations during a session, you'd probably need to keep track of the session start time and the list of objects that had been changed. If you wanted to do a rollback, you'd just have to iterate over that list and invoke reversion's revert method, like
reversion.get_for_date(your_model, session_start_datetime).revert()
However, again, that will also roll back any changes any other users have made in the same time frame, but that will be the case for any solution to this requirement.
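The bookkeeping described above (session start time plus a list of changed objects) might look like this in outline; SessionChangeLog and revert_fn are hypothetical, with revert_fn standing in for the django-reversion revert call:

```python
# Track which objects a session touched so they can be reverted later.
# All names here are illustrative, not part of any library's API.
import datetime

class SessionChangeLog:
    def __init__(self):
        self.started_at = datetime.datetime.now(datetime.timezone.utc)
        self.changed = []  # (model_name, pk) pairs, in change order

    def record(self, model_name, pk):
        """Call this whenever the API modifies an object."""
        self.changed.append((model_name, pk))

    def rollback(self, revert_fn):
        """Undo in reverse order so later changes are reverted first."""
        for model_name, pk in reversed(self.changed):
            revert_fn(model_name, pk, self.started_at)
        self.changed.clear()
```

The started_at timestamp is what you would feed into reversion to pick the version each object had when the session began.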

How to do "Lazy-write" per-request logging in Django?

Using Django, I need to do some per-request logging that involves database writes.
I understand Django's process_request() and process_response() middleware hooks, but as far as I can tell, those hooks are in the critical path (by design) for rendering the Web page response.
I'd prefer not to have my post-request database write operations holding up the response time for the page.
Is there a simple design pattern for Django that lets me do a "lazy log write", where I can use request hooks to gather information during request processing, but any follow-on operations and the actual log write do not occur until after the response is written to the user?
I'm using WSGI currently, but would prefer the most general solution possible.
Django provides a request_finished signal, which is fired after processing of the response has finished, but the downside is that it does not give you access to the current request object, which makes it not very useful for logging... The latest place to hook into Django's response processing is most probably the HttpResponse class itself. You could, e.g., store the data temporarily in request.session and write it to the database in the close() method.
But I guess there are also other alternatives you should consider: you could use something like Celery to handle your logging tasks asynchronously. Furthermore, there are non-SQL databases like MongoDB that offer handy and performant features for logging; e.g., you don't have to wait until the changes are really committed to the database, which can give you a big performance advantage.
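A minimal sketch of the close() idea, assuming a WSGI server that calls close() after the response body has been streamed to the client; LogFlushingResponse is a made-up stand-in for an HttpResponse subclass:

```python
# Buffer log data during the request and flush it only in close(),
# after the response has been sent. LogFlushingResponse is illustrative;
# in Django you would subclass HttpResponse and override close().
class LogFlushingResponse:
    def __init__(self, body, log_entries, write_log):
        self.body = body
        self._log_entries = log_entries
        self._write_log = write_log  # e.g. a DB write or a Celery .delay()

    def close(self):
        # WSGI servers call close() after streaming the body, so the
        # client never waits on these writes.
        for entry in self._log_entries:
            self._write_log(entry)
```

The middleware would collect entries onto the response during process_response(); the writes then happen outside the user-visible critical path.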

Asynchronous database update in Django?

I have a big form on my site. When the users fill it out and submit it, most of the data just gets dumped to the database, and then they get redirected to a new page. However, I'd also like to use the data to query another site, and then parse the results. That might take a bit longer. It's not essential that the user sees these results right away, so I was wondering if it's possible to asynchronously call a function that will handle this, and then return an HttpResponse from my view like usual without making them wait?
If so... how? Any particular libraries I should look at?
Use RabbitMQ and Celery with Django. If you are deployed on EC2, also look at SQS.
You create a message from the request-response cycle, and a separate process or a cron job keeps checking the messages off.
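The "create a message, let another process check it off" flow can be sketched in plain Python; a queue.Queue stands in for RabbitMQ/SQS and a thread for the Celery worker, and handle_submission() is a made-up name for what the view does:

```python
# The view enqueues a message and returns immediately; a worker picks
# the message up and does the slow remote query in the background.
import queue
import threading

tasks = queue.Queue()
results = []

def worker():
    while True:
        form_data = tasks.get()
        if form_data is None:  # sentinel: shut the worker down
            break
        # here you would query the other site and parse its response
        results.append({"processed": form_data})
        tasks.task_done()

def handle_submission(form_data):
    """What the view does: save to the DB (omitted), enqueue, return."""
    tasks.put(form_data)
    return "redirect to success page"
```

With Celery this whole file collapses to defining a @task and calling it with .delay(form_data) from the view; the broker and worker processes replace the queue and thread.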