Integrating django-import-export with React

I have a PostgreSQL database and I want to add data to it.
I want to upload an Excel file containing the data and save it to the database.
I have a Django backend server and a React frontend server.
I am easily able to import the data from the Excel sheet into the database using django_import_export, but only from the Django admin.
What I want is to do this using React, and for normal users (non-superusers) as well. Is there a way to integrate django_import_export with React? Any other way to implement this functionality is also appreciated.

Presumably your backend uses a REST API to handle requests from the frontend. So, you can write an API handler which receives the Excel data posted to it.
The Django API handler can create an import process to handle the upload. Check the documentation for more information.
Note that if you are loading large files, you might want to handle the upload asynchronously. This is a little more tricky, but you could look at Celery to help with this.
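As a rough sketch, a Django REST Framework view can accept the uploaded file and feed it to django-import-export programmatically. Everything here that is not part of DRF or django-import-export (the Book model, the BookResource, the response shapes) is a hypothetical placeholder:

import tablib
from import_export import resources
from rest_framework.parsers import MultiPartParser
from rest_framework.permissions import IsAuthenticated
from rest_framework.response import Response
from rest_framework.views import APIView

from myapp.models import Book  # hypothetical model


class BookResource(resources.ModelResource):
    class Meta:
        model = Book


class BookImportView(APIView):
    # Open to any logged-in user, not just superusers as in the admin.
    permission_classes = [IsAuthenticated]
    parser_classes = [MultiPartParser]

    def post(self, request):
        dataset = tablib.Dataset()
        # Reading .xlsx requires the tablib xlsx extra (openpyxl).
        dataset.load(request.FILES["file"].read(), format="xlsx")
        resource = BookResource()
        # Dry-run first so a bad spreadsheet never half-imports.
        result = resource.import_data(dataset, dry_run=True)
        if result.has_errors():
            return Response({"detail": "import failed validation"}, status=400)
        resource.import_data(dataset, dry_run=False)
        return Response({"imported_rows": len(dataset)}, status=201)

The React side then only needs to POST the file as multipart form data to this endpoint.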

Related

Do we need to render templates when using ReactJS as a front-end?

So I just created a website's front-end using ReactJS. Now all I need is a backend database that I will fetch data from using requests.
The question is whether I need to render templates using my backend or just use my server to handle requests (e.g. GET, POST, etc.)
PS. I will be using Django as my backend.
Thank you everyone who will help me out.
Doing both is recommended: depending on the requirements and use cases, you may need both ways of rendering.
For example, some products serve the initial HTML as a server-side rendered page, with all the essential data inlined as scripts. This helps the primary content load faster. Without it, an application that needs data up front must first fetch the React chunks, run the scripts, make an API request, wait for the response, and only then display the primary content. So when a page needs a lot of data (i.e. more API calls), server-side rendering can be a good choice.
Other scenarios, like fetching user details, can all be handled in React.
No, because you will use DRF (Django Rest Framework) to communicate between the frontend and backend. Basically, you will write your own APIs in views.py that respond with JSON data; in the majority of cases this will be enough. So you don't need templates, since templates are really Django's frontend, which you will not be using at all.
But this heavily depends on what you are doing and what your setup is.
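As a minimal sketch of what such a views.py-based API can look like (the Article model and its field names are hypothetical placeholders):

from rest_framework import generics, serializers

from myapp.models import Article  # hypothetical model


class ArticleSerializer(serializers.ModelSerializer):
    class Meta:
        model = Article
        fields = ["id", "title", "body"]


class ArticleList(generics.ListAPIView):
    # Responds with JSON; no Django template is rendered anywhere.
    queryset = Article.objects.all()
    serializer_class = ArticleSerializer

React then fetches the JSON from whatever route this view is wired to in urls.py and renders it itself.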

How to implement message notification in React App?

I am now building an application using React and Redux as the frontend and Django as the backend. What I am trying to achieve is that whenever an end user uploads a file, all the end users related to this file should receive a notification.
I am thinking of using WebSockets/socket.io but I am not sure if that works well with Django. Any experience or suggestions on other technologies to implement the message notification function?
A simple Google search revealed Django Channels
Channels is a project that takes Django and extends its abilities beyond HTTP - to handle WebSockets, chat protocols, IoT protocols, and more. It’s built on a Python specification called ASGI.
Using the field_field.files[0].file.slice() method in JavaScript you can send a file in chunks over a WebSocket. Using field_field.files[0].size you can get the total size, and dividing what you've sent so far by the size of the file lets you build a progress bar. Make sure you wrap your file writes in sync_to_async, as doing them without it would block the event loop. That wrapper comes from asgiref, which Channels is built on.
Channels Redis can be used to notify any or all of the users that an event has occurred, such as a file being uploaded.
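A rough sketch of such a consumer, assuming Channels with a channels-redis channel layer is already configured; the "uploads" group name, the temporary file path, and the message shape are all illustrative, not a fixed API:

from asgiref.sync import sync_to_async
from channels.generic.websocket import AsyncWebsocketConsumer


class UploadConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        # Every connected user joins one group so all of them can be notified.
        await self.channel_layer.group_add("uploads", self.channel_name)
        await self.accept()

    async def disconnect(self, code):
        await self.channel_layer.group_discard("uploads", self.channel_name)

    async def receive(self, text_data=None, bytes_data=None):
        if bytes_data:
            # Chunks produced client-side with file.slice() arrive as binary
            # frames; wrap the blocking disk write in sync_to_async so it
            # does not block the event loop.
            await sync_to_async(self._write_chunk)(bytes_data)
        elif text_data == "done":
            # Fan the notification out to everyone in the group.
            await self.channel_layer.group_send(
                "uploads",
                {"type": "upload.finished", "filename": "upload.bin"},
            )

    def _write_chunk(self, data):
        with open("/tmp/upload.bin", "ab") as f:
            f.write(data)

    async def upload_finished(self, event):
        # Handler name matches the "upload.finished" message type.
        await self.send(text_data=f"file ready: {event['filename']}")

Every consumer in the same group receives the upload.finished event, which is how all the related users get the notification.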

React-Rest app, where to fetch data from database

I have an app composed of a back-end (Python with Django and Django REST Framework) and a front-end built with React.
Right now I have Excel files with data, which I import with Python in JSON format into the back-end, so the data is available to fetch from the front-end via a REST URL like here.
I am now moving my data into a web-based database to be queried by my app.
But I have questions regarding the structure of my app with this change.
I have url-based queries for my new database.
Should I continue to import the queries in the back-end REST framework and, from there, to React?
Or should I use the url-based queries directly inside my React, substituting the REST url calls?
You can get an idea by referring to this URL:
https://www.andreasreiterer.at/connect-react-app-rest-api/
It describes how to bind data from REST APIs in React.
I have found some sources that presented me with two ways of solving the problem.
Case 1:
Import the JSON query results on the server side, in your back-end, and pass this data on through your API (REST in my case).
Basic source: https://www.valentinog.com/blog/tutorial-api-django-rest-react/
Pros:
No need to change the structure of the rest of my application. The data layer stays the same: before, I was working with an Excel file, and now I just switch to a JSON query.
The connection between server and client continues to be straightforward
Credential systems can be applied more easily, since the data is stored behind your API
Cons:
Harder to implement
The connection between Python and the URL queries needs its own settings (URL queries are usually browser-based, and some queries can't be performed from Python)
Harder to debug
Case 2: Query the data with JavaScript's native fetch method and handle the data on the client side.
Basic sources: https://www.robinwieruch.de/react-fetching-data/
https://blog.hellojs.org/fetching-api-data-with-react-js-460fe8bbf8f2
Pros:
Faster and easier to implement
Easier to debug
JavaScript handles these queries in a simpler way than Python
Cons:
Credential systems can't be applied
Less secure/robust method
Double connection between client and server (client-queries and client-API), because the API would still be maintained to store local information.
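To make Case 1 concrete, here is a rough sketch of a DRF view that runs the URL-based query server-side and relays the result; the upstream URL and the view itself are placeholders, not a prescribed design:

import requests
from rest_framework.response import Response
from rest_framework.views import APIView

SOURCE_URL = "https://example.com/query"  # hypothetical upstream query URL


class ExternalDataView(APIView):
    def get(self, request):
        # Run the URL-based query server-side, so any credentials stay on
        # the backend and the frontend keeps talking only to the REST API.
        upstream = requests.get(SOURCE_URL, params=request.query_params, timeout=10)
        upstream.raise_for_status()
        return Response(upstream.json())

The front-end keeps calling only your own REST API, which is what makes the credential handling in Case 1 easier.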

Serve scraped HTML data as an API using Django Rest Framework

I'm trying to build a public facing API that collects data through scraping HTML (the content of the page is what is important, not the pages themselves). I've elected to use Django-Rest-Framework as my backend. My question is: How exactly would I organize the structure of this project so that the Django ORM stores the scraped content and then it can be accessed using Django-Rest-Framework's API?
I've looked into Scrapy, but that seems less focused on content scraping and more focused on web crawling. Additionally, it deploys as its own project, which conflicts with Django's bootstrapping.
Is my best bet just running cronjobs? That seems inelegant.
Use Celery to create asynchronous and periodic tasks.
If you need something lightweight for scraping, you can use BeautifulSoup. Here is a tutorial.
Overall, this is what you need to do:
Start an ordinary Django project.
Add Celery to it.
Write some scraping code.
Call your custom scraping code from Celery tasks. Save the scraped content to the database.
Use Django-Rest-Framework to create an API which will serve the content from the database.
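A rough sketch of steps 3 and 4, assuming a hypothetical Page model with url/title/content fields; the scraped URL and the field names are placeholders:

import requests
from bs4 import BeautifulSoup
from celery import shared_task

from myapp.models import Page  # hypothetical model


@shared_task
def scrape_page(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Store only the content of the page, which is what the API will serve.
    title = soup.title.string if soup.title else ""
    body = soup.get_text(" ", strip=True)
    Page.objects.update_or_create(
        url=url, defaults={"title": title, "content": body}
    )

A Celery beat schedule (app.conf.beat_schedule) can then trigger the task periodically, replacing the inelegant cronjob.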

How to port from Drupal to Django?

What would be the best way to port an existing Drupal site to a Django application?
I have around 500 pages (mostly books module) and around 50 blog posts. I'm not using any 3rd party modules.
I would like to keep the current URLS (for SEO purposes) and migrate database to Django. I will create a simple blog application, so migrating blog posts should be ok. What would be the best way to serve 500+ pages with Django? I would like to use Admin to edit/add new pages.
All Django development is similar, and yours will fit the pattern.
Define the Django model for your books and blog posts.
Unit test that model using Django's built-in testing capabilities.
Write some small utilities to load your legacy data into Django. At this point, you'll realize that your Django model isn't perfect. Good. Fix it. Fix the tests. Redo the loads.
Configure the default admin interface to your model. At this point, you'll spend time tweaking the admin interface. You'll realize your data model is wrong. Which is a good thing. Fix your model. Fix your tests. Fix your loads.
Now that your data is correct, you can create templates from your legacy pages.
Create URL mappings and view functions to populate the templates from the data model.
Take the time to get the data model right. It really matters, because everything else is very simple if your data model is solid.
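A rough sketch of steps 1 and 3 together (a first-pass model plus a tiny load utility, assuming the legacy pages have been dumped to CSV; every field name here is a guess you will refine as the loads expose problems):

import csv

from django.db import models


class BookPage(models.Model):
    slug = models.SlugField(unique=True, max_length=200)  # preserve old Drupal URLs for SEO
    title = models.CharField(max_length=200)
    body = models.TextField()


def load_pages(path):
    """Load legacy pages; safe to rerun, since it upserts by slug."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            BookPage.objects.update_or_create(
                slug=row["slug"],
                defaults={"title": row["title"], "body": row["body"]},
            )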
It may be possible to write Django models which work with the legacy database (I've done this in the past; see docs on manage.py inspectdb).
However, I'd follow the advice above: design a clean database using Django conventions, and then migrate the data over. I usually write migration scripts that write to the new database through Django and read the old one using the raw Python DB APIs (though it is also possible to tie Django to multiple databases simultaneously).
I also suggest taking a look at the available blogging apps for Django. If the one included in Pinax suits your need, go ahead and use Pinax as a starting point.
S.Lott's answer is still valid after all these years; let me complete the analysis with the tools and format to do the job.
There are many Drupal export tools out there by now, but for this very request I'd go with Views Datasource, choosing JSON as the format. This module is very solid and available for the latest version of Drupal. The JSON format is very fast to both parse and encode, and it's easy to read and very Python-friendly (import json).
Using Views Datasource you can create a node view sorted by node id (nid), show a limited number of elements per page, configure a view path, add a filter identifier to it, and pass it the nid to read all elements until you get an empty JSON response.
When importing into Django you have a wide set of tools as well, starting with loaddata for fixtures. But the JSON exported by Views Datasource is not formatted the way Django expects fixtures to be, so you can write a custom admin command to do the import instead, giving you full control of the import flow.
You can start your command by passing nid=0 as an argument and then let the procedure read, import, and fetch the next page by simply passing the last nid read in the previous HTTP request. You can even restrict access to the view path, though that needs additional configuration on the import side.
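A rough sketch of that command, assuming the Views Datasource view is exposed at a path like /export/nodes/<nid> and returns a "nodes" array; the URL, the target Page model, and the field names are all placeholders:

import requests
from django.core.management.base import BaseCommand

from myapp.models import Page  # hypothetical target model

EXPORT_URL = "https://old-drupal-site.example/export/nodes/{nid}"


class Command(BaseCommand):
    help = "Import Drupal nodes from a Views Datasource JSON view"

    def handle(self, *args, **options):
        nid = 0
        while True:
            data = requests.get(EXPORT_URL.format(nid=nid), timeout=30).json()
            nodes = data.get("nodes", [])
            if not nodes:
                break  # an empty response means everything has been read
            for node in nodes:
                Page.objects.update_or_create(
                    nid=node["nid"],
                    defaults={"title": node["title"], "body": node["body"]},
                )
                nid = max(nid, int(node["nid"]))
            self.stdout.write(f"imported up to nid {nid}")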
Regarding performance, as an example I parsed and imported 15,000+ nodes in less than 10 minutes via a Django 1.8 custom admin command on an 8-core / 8 GB Linux virtual machine with PostgreSQL as the DBMS, logging success and error information into a custom model for each node.
These are the basics of import/export between these two platforms; for detailed information I've described all the major steps for exporting from Drupal and importing into Django in this guide.