Populate web page using API result directly from API server - Django

Firstly, sorry if this question has been asked before; I'm a novice, so even if it has, I'm not sure what terms I'd use to search for it.
I'm beginning to learn about REST APIs, and it got me thinking: is it possible to load the JSON response directly from the API server into the user's browser and bypass your own server?
Imagine you have, say, a Django app running on a server that accesses email messages from Outlook.com using the Graph API.
I assume an ordinary flow would go something like:
User request -> your server -> Graph API -> your server -> user's browser.
It seems like a waste for it to hit your server that second time before it goes on to be presented to the user's browser.
Is there a way the Django app can render a template and effectively tell the browser "expect some data from X source, and place it in Y location in this template"?

You could do that with JavaScript. You'd have to either include a script tag in your template, or create and include some static JavaScript files with the code.
I'd recommend learning and using the jQuery JavaScript library, as it makes what you're talking about much easier to implement. Research AJAX requests; those are what you'll need to make requests directly to another server, bypassing your own.
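As a rough sketch of the Django side of that, the view can render a template that carries the external API's URL (and, if needed, a short-lived access token) in its context; the script tag in that template then calls the API directly and drops the result into the right element. The token helper below is hypothetical:

    # views.py -- a minimal sketch, assuming you already have some way of
    # obtaining a short-lived Graph API token for the signed-in user.
    from django.shortcuts import render

    def inbox(request):
        context = {
            # The script tag in inbox.html reads these values and fetches the
            # messages straight from the Graph API, bypassing this server.
            "graph_url": "https://graph.microsoft.com/v1.0/me/messages",
            "graph_token": get_user_graph_token(request.user),  # hypothetical helper
        }
        return render(request, "inbox.html", context)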

Related

Do we need to render templates when using ReactJS as a front-end?

So I just created a website's front-end using ReactJS. Now all I need is a backend database that I will fetch data from using requests.
The question is whether I need to render templates using my backend, or just use my server to handle requests (e.g. GET, POST, etc.).
P.S. I will be using Django as my backend.
Thank you to everyone who helps me out.
Doing both is recommended. Depending on the requirements and use cases, you may need both ways of rendering.
For example, some products serve the initial HTML as a server-side-rendered page, with all the essential data embedded as scripts. This helps the primary content load faster. If you skip this in an application that needs data up front, it can take longer: the browser has to fetch the React chunks, run the scripts, make the API requests, wait for the data, and only then display the primary content. So when a page needs a lot of data (i.e. more API calls), server-side rendering can be a good approach.
Other scenarios, like fetching user details, can all be handled in React.
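As a rough Django-side sketch of the "essential data embedded as scripts" idea (the view, model and template names here are made up):

    # views.py -- a minimal sketch; Article is a hypothetical model.
    from django.shortcuts import render

    def home(request):
        initial_data = {
            "articles": list(Article.objects.values("id", "title")),
        }
        # In home.html: {{ initial_data|json_script:"initial-data" }}
        # The React bundle reads document.getElementById("initial-data") on load,
        # so the first paint does not wait on an extra API round trip.
        return render(request, "home.html", {"initial_data": initial_data})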
No, because you will use DRF (Django REST Framework) to communicate between the frontend and backend. Basically, you will write your own APIs in views.py that respond with JSON data; in the majority of cases this will be enough. So you don't need templates, since templates are really Django's frontend, which you will not be using at all.
But this heavily depends on what you are doing and what your setup is.
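For illustration, a minimal DRF view that responds with JSON might look like this (the Article model is hypothetical):

    # views.py -- a minimal sketch using Django REST Framework.
    from rest_framework.decorators import api_view
    from rest_framework.response import Response

    @api_view(["GET"])
    def article_list(request):
        # DRF renders this as JSON; the React front-end fetches it and
        # takes care of rendering the UI.
        articles = Article.objects.values("id", "title", "body")
        return Response(list(articles))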

Django: implementing websockets to work with my existing MVT-based app (Channels seems to need me to throw out my entire existing code)

I have an existing Django project in the sports domain with apps that are built on the Model-View-Template structure. Many of the models and views are fairly sophisticated and work well currently. Data from the database (scores etc) is combined with some incoming user inputs through forms (HTTP POST requests) to be displayed on a web page via templates.
However, I now need live data to be displayed to users and be automatically refreshed to all the users continuously, either because one of the users entered something new (in the front-end), or because the scores changed during the game (directly enters the back-end).
I've done some research on Stack Overflow, as well as tutorials on YouTube and the rest of the web, and it appears that to use Django Channels I'd have to start from scratch and build everything from the ground up, which I'd like to avoid. How do I use the websocket protocol in my Django app without having to completely rework everything I've done so far?
You don't really need to start from scratch. You just need to add a module that uses Channels. I am assuming that currently the data is fetched only when the page is refreshed. What you need to do is write a consumer that sends messages directly to the client over the websocket. Then, on the front end, you can update the score widget on each message received from the websocket. You can also stream user actions through the websocket to the server, which the consumer can then broadcast to the relevant clients. You may not even need to change anything in the existing code.
It will be easier to understand how this works, and how you can incorporate it into your project, by reading the Channels tutorial. It became much clearer to me after reading it, so I would advise you to do the same.
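For a rough idea of the shape of such a consumer, here is a minimal sketch (the group and event names are made up); it sits alongside your existing views rather than replacing them:

    # consumers.py -- a minimal Channels sketch; names are illustrative.
    import json
    from channels.generic.websocket import AsyncWebsocketConsumer

    class ScoreConsumer(AsyncWebsocketConsumer):
        async def connect(self):
            # Every connected browser joins the same group, so one
            # group_send reaches all of them.
            await self.channel_layer.group_add("live_scores", self.channel_name)
            await self.accept()

        async def disconnect(self, close_code):
            await self.channel_layer.group_discard("live_scores", self.channel_name)

        async def score_update(self, event):
            # Handles group_send(..., {"type": "score.update", "payload": ...})
            await self.send(text_data=json.dumps(event["payload"]))

Wherever a score changes in your existing code (a view, a management command, etc.), you can then broadcast with async_to_sync(get_channel_layer().group_send)(...) without rewriting the rest of the app.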

How to integrate Django Rest APIs with BackboneJS frontend for single page application

I am not able to work out the pros and cons of the following approaches to building a single-page Backbone application using RESTful APIs from Django REST Framework.
Render the whole app from within Django's templates.
Serve the Backbone app from another server, i.e. a Node server, with nginx in front of both servers.
Serve the HTML/templates and JS from a separate CDN.
What are the points of caution in each strategy? Is there any other way to tie them together that I am missing?
This is a very broad question, and really it has nothing to do with Django or Backbone. What you're really asking about is a "thick-client" architecture vs. a "thin-client" architecture. In other words, having your user interface rendered on the client vs. having it rendered on the server.
First, allow me to recap a few things to make sure we're on the same page. The "thin-client" approach is the traditional/old school model, and the model Django itself is based on. The server renders HTML, sends it to the client, and whenever the client wants to do something it sends data back to the server and asks for fresh HTML.
In contrast the more modern "thick-client" approach lets the client render all of the UI. Whenever the client needs to do something it makes an AJAX request to a (presumably REST-ful) API, powered by a library like Django REST Framework. That API just returns the relevant data, and leaves it up to the client to render it appropriately.
There are advantages and disadvantages to both approaches, but the thick-client approach is becoming more and more popular because:
network transactions are faster: because your server is only sending the exact JSON you need instead of a mess of HTML, the "payload" of the response is much smaller
you can fetch all data "behind the scenes"; this makes things appear faster to the user, and lets you implement UI paradigms (eg. infinite scroll) that a thin client can't
the client/server relationship is simpler, because the people writing server code never have to even think about HTML or any other presentation logic; they get to just focus on the data (which, being server engineers, is probably the part they're most interested in anyway)
This is why a lot of companies (including the one I work for) have all but abandoned Django proper in favor of API endpoints served by Django REST Framework.
So, if you want to go with a thick-client architecture, Django should never serve anything except the very first HTML page (and even that could be served by nginx if you wanted, since it's just static HTML). After that you'd use a Backbone.Router and Backbone.Views to render your site. Whenever you need new information from the server you'd fetch a Backbone.Model or Backbone.Collection (with its url property pointing to your Django REST Framework endpoint).
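On the Django REST Framework side, such an endpoint can be as small as the sketch below (the Item model and route are hypothetical); the Backbone.Collection's url would simply point at it:

    # A minimal DRF endpoint sketch; Item and its fields are hypothetical.
    from rest_framework import serializers, viewsets

    class ItemSerializer(serializers.ModelSerializer):
        class Meta:
            model = Item
            fields = ["id", "name", "description"]

    class ItemViewSet(viewsets.ModelViewSet):
        # Registered in urls.py via a DRF router, e.g. at /api/items/,
        # which is what the Backbone.Collection's url would point to.
        queryset = Item.objects.all()
        serializer_class = ItemSerializer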
I can attest that this whole approach works great; the site I work on is very complex, with many endpoints, and Backbone + Django REST Framework handles it beautifully. The only (slightly) tricky part is caching: in the thin client approach the browser automatically caches pages for you, but since there are no "pages" in a thick client (just AJAX responses with data) there is no automatic caching. This means that if you want to cache data you'll need to do it yourself, for instance with a Backbone.Collection devoted to that purpose.
Hope that helps.
P.S. Back in the day Django REST Framework didn't handle Django authentication stuff (ie. logging in/out) quite the way we wanted, so we wound up serving one other page, our login page, from Django. However I'm pretty sure the current Django REST Framework handles authentication stuff much better now, so this likely won't be an issue for you.

How to POST data to remote server and display result

I would like to submit a POST request from my Django app to a payment service.
Is there a simple way to do so from within a view and then display the remote service instead of my own view (something like being redirected to the remote page with the POST data)?
Currently I am messing with a weird approach of displaying a hidden form and overriding its JavaScript submit call to implement some of my own stuff, but I think it is overcomplicated.
I need something like this:
    def view(request, **kwargs):
        do_my_own_business()
        post_data = create_post_data(**kwargs)
        return post_remote_page("https://example.com", post_data)
You can use Python's urllib2 to make a request to a third-party website. This is essentially emulating, in code, how you or I would visit the website - here's a good tutorial.
If you want to make life easier, there are higher-level libraries that make these external requests easier - namely requests and mechanize. If you are from a PHP background, you can also use pycurl (not based on urllib2).
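For example, with the requests library inside a view (the payment URL and field names below are placeholders):

    # views.py -- a minimal sketch; the payment URL and fields are
    # placeholders for whatever your provider actually expects.
    import requests
    from django.http import HttpResponse

    def pay(request, **kwargs):
        post_data = {"amount": kwargs.get("amount"), "reference": kwargs.get("ref")}
        resp = requests.post("https://example.com/pay", data=post_data, timeout=10)
        # Relay the remote page's HTML back to the user's browser. Relative
        # links and assets on that page will break, which is one reason a
        # provider-hosted redirect flow is usually preferable.
        return HttpResponse(resp.content, status=resp.status_code)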
Check whether the payment service you are using has its own API or client library that you could use instead, as that would be much easier and safer than the above approach, which is essentially screen scraping.
Furthermore, be aware that for every request your user makes to your server, your server then needs to make another request to the external server, which means you can hit timeouts and so on, introducing another layer of complexity in managing the connections.

Emulating a web browser to wrap the functionality of several similar web sites

I'm interested in emulating the functionality of a web browser in C++ so that I can create a wrapper for several web sites. Right now, the biggest issues with these sites are that they make heavy use of JavaScript that interacts with the HTML DOM. Thus, the simple solution of using curl to download the page, and something like RapidXML to parse its contents is out.
Next, I considered using something like v8 with curl, and that solves the issue of interpreting the JavaScript on the page nicely. However, it doesn't solve the issue of connecting the HTML DOM methods with the JavaScript; in other words, document.getElementById() would fail in v8.
Next, I considered WebKit, which seems like it's perfectly suited to emulate a web browser--after all, Chromium and Safari both utilize it in their web browsers. However, it's a little too complete. I don't need all of the rendering aspects it includes.
So, I'd be looking for some way to:
Make an SSL connection to a web site
Interpret the JavaScript on that web site in connection with the HTML DOM
Set the value of the username/password <input> fields with my username and password
Simulate clicking the "Submit" button by calling the formSubmit() function, from <input type="button" onClick="formSubmit()">
Handle the HTTP POST form action and the subsequent HTTP 301 and JavaScript redirects (accomplished using window.location)
Repeat 2-5 as needed
Besides what I've already considered, what other options do I have? Ideally, I'd want this to be extremely lightweight, without requiring linking to many libraries.
I'm primarily concerned with developing for Windows 7 64-bit.
Well, this sounds rather like a brute-force program. Disregarding that, and since you don't seem to need to render any website, I think you should just fetch the page through cURL or something, then parse it: find the form using a regex, retrieve the form action, and then make a request using the method taken from the <form> tag and whichever inputs you want.
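Just to illustrate that flow (sketched in Python rather than C++, with a made-up form layout and field names; a real site will need per-site adjustments):

    # Rough sketch of the fetch / regex-parse / POST flow described above.
    import re
    import requests

    session = requests.Session()  # keeps cookies between the two requests
    page = session.get("https://example.com/login").text

    # Pull the form's action URL out of the HTML (very site-specific).
    action = re.search(r'<form[^>]+action="([^"]+)"', page).group(1)

    # Submit the credentials using whichever field names the form uses.
    resp = session.post(
        "https://example.com" + action,
        data={"username": "me", "password": "secret"},
        allow_redirects=True,  # follow the redirects issued after login
    )
    print(resp.status_code)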
The problem is that there would be no proper way to know whether you've logged in successfully unless you add some kind of per-site check. This comes mainly from the fact that many sites use server-side sessions rather than plain cookies or HTTP auth, and since you can't read those sessions directly, you can't tell when the session state has changed.
That's the most lightweight solution I can come up with right now.