I am looking for a way to detect when a data insertion happens in my database. I will use this to create a real-time notification badge, so that the user won't need to refresh the page to see new notifications. I am planning to use Django Channels, but I cannot find any tutorials that fit my need.
I've tried using a timer, but I don't think that is best practice for this.
I have a jQuery DataTable with lots of data (10,000+ rows) that can be updated in real time by many users, and I need that table to stay refreshed and up to date at all times. I've tried Ajax calls every second, and I figured that's not good, since every user is sending an Ajax request every second, times 50 or more users. I think there must be a better solution. I've been thinking of using Django Channels, or something similar, and I was wondering whether that is possible. Basically I need to set up some kind of listener on the model (database) and pull the new data on change (which is almost every second). I would love to hear an opinion from someone who has more experience than I do. Thank you in advance.
P.S. I've been told to use WebSockets for that.
In short, yes: a WebSocket would be the suggested way to do this, provided all the updates to the models you need to subscribe to are made through Django's ORM.
I would suggest looking into using (or being inspired by) DjangoChannelsRestFramework as a starting point for subscribing to multiple models in your db.
This library is set up to let you subscribe to multiple object types over a single WebSocket connection and get WebSocket messages whenever those objects change. It also provides some tools to let you re-use your existing DRF views over the WebSocket.
Disclaimer: I am the author of the DjangoChannelsRestFramework.
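The subscribe-and-notify mechanism that the library builds on can be illustrated in plain Python. This is a hedged sketch, not DjangoChannelsRestFramework's actual API: `ChangeBroker` and its methods are hypothetical names; in a real project the notify step would be a channel-layer `group_send` fired from a `post_save` signal, and the callbacks would be consumers.

```python
# Minimal illustration of the subscribe-and-notify pattern behind
# model subscriptions over a WebSocket. All names here are
# hypothetical -- in Django the "model_saved" call would come from a
# post_save signal handler, and "notify" would be a channel-layer send.

from collections import defaultdict

class ChangeBroker:
    """Maps model names to subscriber callbacks."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, model_name, callback):
        self._subscribers[model_name].append(callback)

    def model_saved(self, model_name, instance_data):
        # Notify every subscriber of this model type.
        for callback in self._subscribers[model_name]:
            callback({"model": model_name, "data": instance_data})


broker = ChangeBroker()
received = []
broker.subscribe("Notification", received.append)

# Simulate an ORM save; every subscriber gets a message.
broker.model_saved("Notification", {"id": 1, "text": "new comment"})
```

The point of routing everything through the ORM (as the answer above notes) is that a signal-based hook like this only fires for ORM writes; raw SQL or another process writing to the same database would bypass it.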
I am currently developing a computer-based test web app with Django, and I am trying to figure out the best way to persist the choices users make during the course of the test.
I want to persist the choices because users might leave the page due to a crash or something else, and I want them to be able to resume from exactly where they stopped.
To implement this I chose Django sessions with the database backend, which in turn saves to the database, but this results in a very bad design, because I don't want about 2,000 users hitting my database every few seconds.
So my question is: are there any other ways to implement this feature that I don't know of? Thanks.
If your application runs in a browser, and especially if you are designing a progressive web application, you can make use of browser storage systems such as localStorage, IndexedDB, cookies, etc.
This way, you wouldn't need to send the user's updated state back and forth to your backend, and you can push the state update based on a specific condition or every n seconds/minutes.
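The "sync only on a condition or every n seconds" throttling can be sketched in plain Python (in the browser you would implement the same idea in JavaScript with localStorage and a timer). `AnswerBuffer` and all its names are hypothetical; the injected clock just makes the example deterministic.

```python
# Sketch of throttled state sync: buffer the user's choices locally and
# only flush a batch to the backend once enough time has passed. All
# names are hypothetical; "flush" stands in for a POST to the server.

import time

class AnswerBuffer:
    def __init__(self, flush_interval=30.0, clock=time.monotonic):
        self.pending = {}          # question_id -> chosen option
        self.flushed = []          # batches actually sent to the backend
        self._clock = clock
        self._last_flush = clock()
        self._interval = flush_interval

    def record(self, question_id, choice):
        self.pending[question_id] = choice
        if self._clock() - self._last_flush >= self._interval:
            self.flush()

    def flush(self):
        if self.pending:
            self.flushed.append(dict(self.pending))  # one batched request
            self.pending.clear()
        self._last_flush = self._clock()


# Fake clock so the example is deterministic.
t = [0.0]
buf = AnswerBuffer(flush_interval=30.0, clock=lambda: t[0])
buf.record("q1", "B")        # buffered only, no request yet
t[0] = 31.0
buf.record("q2", "D")        # interval elapsed -> one batched flush
```

With 2,000 users this turns a request every few seconds per user into one batched request per interval per user, while the latest choices stay recoverable locally.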
I have a Django application that manages the model logic and data handling through the admin site.
In the same project I also have a Python file (scriptcl.py) that uses the model data to perform heavy calculations that take some time, for example 5 seconds, to process.
I have migrated the project to the cloud, and now I need an API to call this file (scriptcl.py), passing parameters, perform the computation according to the parameters and the data in the DB (maintained in the admin), and then respond.
All the Django DRF examples I've seen so far only cover authentication and data handling (Create, Read, Update, Delete).
Could anyone suggest an idea to approach this?
In my opinion, the correct approach would be to use Celery to perform these calculations asynchronously.
Write a class that inherits from the DRF APIView, which handles authentication; write whatever logic you want, or call whichever function; get the final result and send back the JsonResponse. But, as you mentioned, the API may take too long to respond. In that case you might have to think of something else, like returning a request_id and hitting the server with that request_id every 5 seconds to get the data.
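The submit-then-poll pattern described above can be sketched in plain Python, with a background thread standing in for a Celery worker and a dict standing in for a result backend such as Redis. All names (`submit`, `check`, `jobs`) are hypothetical; in a real deployment the two functions would be the bodies of two DRF views.

```python
# Sketch of the submit-then-poll pattern: the API returns a request_id
# immediately, the heavy work runs in the background, and the client
# polls a status endpoint. A thread stands in for a Celery worker; a
# dict stands in for a result backend. All names are hypothetical.

import threading
import uuid

jobs = {}  # request_id -> {"status": ..., "result": ...}

def submit(func, *args):
    """What the APIView would do: start the job, return an id at once."""
    request_id = str(uuid.uuid4())
    jobs[request_id] = {"status": "pending", "result": None}

    def worker():
        result = func(*args)  # the heavy ~5-second calculation
        jobs[request_id] = {"status": "done", "result": result}

    threading.Thread(target=worker).start()
    return request_id

def check(request_id):
    """What the status endpoint would return on each poll."""
    return jobs[request_id]


rid = submit(lambda x: x * x, 12)
# ... the client polls check(rid) every few seconds until "done" ...
```

With Celery the shape is the same: `task.delay(...)` returns an `AsyncResult` whose id plays the role of `request_id`, and the status endpoint looks the result up by that id.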
Just to give some feedback on this: the approach I took was to build another API using Flask and plain Python scripts.
I also used SQLAlchemy to access the database and retrieve the necessary data.
I have a large list of ingredients required for cooking, more than 3,000 items.
I am using Django REST framework as the backend and ReactJS as the frontend.
Each item in the list has a name, id, measurement unit, density in kg/l, and cost per measurement unit.
In Django i have created an api endpoint to supply the data in JSON format.
I want to display the data in a table format with a search filter at the top. I also want to show at most 300 results at a time.
Can someone guide me on how to achieve this? Should I fetch the whole list at once or use pagination from Django? Should I use a separate API for search or do it with ReactJS on the frontend?
Presently I don't need any authorization in Django; the data is for local use only.
3000 records is a lot to send down to the client in one chunk. It is easier to develop against, but it doesn't scale well and is likely to create a measurable load time. If you're OK with that, and you don't expect your data set to grow, then perhaps the pragmatic approach is to keep it as a big list... but it goes against best practices.
Either way, you likely don't want to show a giant 3,000-row list to the user, so the UI will have some sort of pagination mechanism. That might be "next"/"previous" pages, or it might be infinite scrolling. Either way, the data layer should treat it as paged data.
Paging API
Assuming you decide to make your API support paging, you should use the backend framework's paging capabilities, like Django's pagination API or DRF's pagination classes, rather than rolling your own. There are lots of resources out there.
Search
The moment you decide to paginate the API, you are committing to handling search (filtering) on the backend as well. The only way you can manage client-side filtering is if the client has access to all the data. Since that isn't the case, you'll need to include a search filter in your data request. Searching and pagination aren't mutually exclusive, so make sure your API supports both at the same time. A common way to handle this would be like this:
http://yoursite.com/api/ingredients?page=5&page_size=100&search=carrot
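Server-side, those query parameters boil down to "filter first, then slice". DRF's `SearchFilter` and `PageNumberPagination` do this for you; the sketch below only shows the shape of the computation and the response, with a hypothetical `search_and_paginate` helper and made-up field names.

```python
# What ?page=5&page_size=100&search=carrot means server-side: filter
# first, then slice the filtered result. DRF's SearchFilter and
# PageNumberPagination implement this; the helper name here is made up.

import math

def search_and_paginate(items, search="", page=1, page_size=100):
    filtered = [i for i in items if search.lower() in i["name"].lower()]
    total = len(filtered)
    start = (page - 1) * page_size
    return {
        "count": total,
        "num_pages": max(1, math.ceil(total / page_size)),
        "results": filtered[start:start + page_size],
    }


ingredients = [{"id": n, "name": f"ingredient {n}"} for n in range(3000)]
page = search_and_paginate(ingredients, search="", page=5, page_size=100)
```

Note that the slice happens after the filter: page 5 of a search is page 5 of the matches, not of the whole table, which is why search and pagination have to be handled together on the backend.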
Client
On the React side, you can build your own UI (it is easy to do), or you can use a component which abstracts this for you, like react-js-pagination or react-paginate. The client component shouldn't really care if the API is paged or not. Instead, it just notifies you when to display different records and the rest is up to you.
If you decide to keep everything in one big REST call, then you still need to slice the data out of your array to display. If you paginate your API, then you can keep an instance cache on the client side of the pages you've received. If you don't have the data, make the REST call to get it, and populate an array with the data. That way, if a user goes forwards and then backwards, you aren't re-fetching.
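The instance cache described above can be sketched in a few lines. This is Python for consistency with the rest of the thread; in React it would be a dict kept in state with the same fetch-on-miss logic. `PageCache` and `fetch_page` are hypothetical names, and the lambda stands in for the real REST call.

```python
# The fetch-on-miss page cache described above. fetch_page stands in
# for the REST call; pages already seen are served from memory, so
# going back to an earlier page triggers no network request.

class PageCache:
    def __init__(self, fetch_page):
        self._fetch = fetch_page
        self._pages = {}           # page number -> list of rows
        self.fetch_count = 0       # how many REST calls were made

    def get(self, page):
        if page not in self._pages:          # miss -> one REST call
            self._pages[page] = self._fetch(page)
            self.fetch_count += 1
        return self._pages[page]             # hit -> no network


cache = PageCache(fetch_page=lambda p: [f"row {p}-{i}" for i in range(3)])
cache.get(1)
cache.get(2)
cache.get(1)   # user goes back: served from cache, not re-fetched
```

One caveat: if the underlying data changes often (as in the real-time questions earlier in this thread), a cache like this needs an invalidation or expiry strategy, or users will see stale pages.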
Conclusion
I hope this helps a bit. Enjoy :)
I've been trying to learn Django, but I'm still pretty much a web dev newbie, so please bear with me. Maybe something is just fundamentally wrong with this question...
For example, let's say some data exists in a JSON stream that is updated constantly. I'm trying to capture bits of that data, store them in my database, and display them when I visit my Django-built page. I guess there are two ways to do this:
In my views.py, check the data source, update the database, and display the information through an HTML file. This just doesn't seem like the right way to do it: the source would be polled every time the page is viewed.
I would think the correct way is to have an application on the server that polls the data source every minute or so and updates the database. views.py would only display information from the database.
Am I even thinking about this correctly? I haven't found any information/examples on how to write the application that would sit on the server and constantly update the database.
Thanks!!
The second way is the right way to go about this, and the application that you would write to poll the JSON stream does not have to be written with Django.
If you want to use the same models for the application, you can implement it as a custom management command, then run the command with cron at an interval. The command would poll the stream and update the database; your view would then read the database and display the data.
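The body of such a poller can be sketched with only the standard library. This is a hedged sketch: `poll_once`, the `reading` table, and the injected `fetch` function are all made-up names. In a real management command, `handle()` would run this once per cron tick, `fetch` would be an HTTP call (e.g. `json.load(urlopen(...))`), and the upsert would go through a Django model instead of raw SQL.

```python
# Sketch of one polling tick: fetch the current items from the JSON
# source and upsert them into the database. fetch() is injected so the
# example is self-contained; names here are hypothetical, and a real
# management command would use a Django model instead of raw sqlite3.

import sqlite3

def poll_once(fetch, conn):
    """Fetch the current items and upsert them by primary key."""
    rows = fetch()  # in real life: an HTTP request to the JSON stream
    with conn:      # one transaction per tick
        conn.executemany(
            "INSERT INTO reading (id, value) VALUES (:id, :value) "
            "ON CONFLICT(id) DO UPDATE SET value = excluded.value",
            rows,
        )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reading (id INTEGER PRIMARY KEY, value REAL)")

# Two ticks: the second updates id 1 in place and inserts id 2.
poll_once(lambda: [{"id": 1, "value": 2.5}], conn)
poll_once(lambda: [{"id": 1, "value": 3.0}, {"id": 2, "value": 7.0}], conn)
```

Upserting by the source's natural key is what makes repeated cron runs idempotent: re-polling the same data doesn't create duplicate rows.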
If you want to do this in "realtime" (I use the word loosely here), the server hosting the JSON stream should support "push" or a socket connection that remains open.