Best way to provide a secure external webpage for importing into an internal database with Django (best way for the tiers to communicate)

I have a Django application at my work, only available on the internal network.
Currently we import data using Excel, but this is a terribly error-prone process and I want to replace it.
I would like to provide a rich web application in JavaScript which exposes some, but not all, of the data from the main Django application (lookup values for menus). This would run on a server visible to the outside world.
So what is a good approach for this?
Management are concerned about the security of making the main Django app available to the outside world, and I would prefer an intermediate tier as well - I think it would be easier to write a small server-side app than to go through the current code and make sure it is secure enough for the outside world. (I learned Django building this app, so some of the older code is not done according to best practices, but works as it needs to.) I would also like it to hold the new data until someone has checked that it looks OK before importing it into the main database. (I am the only developer, so there are time considerations.)
So here are two options I can think of just now:
1: Have a small Django app on an external-facing server. This can communicate with the main app to get the values required for lookups, and store the input before it gets imported. The tables will essentially mirror the main app and need to be updated when the main app's tables change.
2: Have something similar, but rather than use a database, use the external-facing server to contact the REST interface on the internal server - something like using Django non-relational to get data from the REST interface of the main app. Put an import table in the main database server to store the data for approval (a rough sketch of this option follows).
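To make option 2 concrete, here is a minimal sketch of what the external app might look like. The endpoint URL, model and field names are all placeholders of mine, not real code (JSONField needs Django 3.1+):

    # Sketch of option 2 - names and URLs are placeholders only.
    import requests
    from django.db import models

    class PendingImport(models.Model):
        """Staging row held until someone approves it for import."""
        payload = models.JSONField()
        submitted_at = models.DateTimeField(auto_now_add=True)
        approved = models.BooleanField(default=False)

    def fetch_lookup_values():
        # The external app never touches the internal database; it only
        # talks to the internal app's REST interface.
        resp = requests.get("https://internal.example.com/api/lookups/",
                            timeout=5)
        resp.raise_for_status()
        return resp.json()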
Are either of these good / bad approaches?
Any other suggestions?
Are there any good resources for learning about n-tier apps?

If I understand you correctly, you want a small group of trusted users to be able to access an internal database. There is already an internal Django app accessing that database.
Management is concerned about making this app, or an extension of it, available to the general Internet.
I think their concerns are very valid. If you have only a limited set of users accessing the import functionality, push authentication out of the Django web application into the HTTP server / balancer / frontend.
For example, set up an external Apache web server forcing all access to your Django app to be encrypted (HTTPS) and authenticated. Users can be authenticated via HTTP auth using static files on the server. Password changes / user additions have to be done by an admin logging into the server.
Only after completing this login can the Django app, with its own authentication, be accessed. I would opt for a small separate import app instead of extending the main app. This small app could run with reduced permissions on the main database for a defense-in-depth approach, e.g. as sketched below.
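As a minimal sketch of the reduced-permissions idea, assuming PostgreSQL and a dedicated database account for the import app (all names and values are examples):

    # settings.py (sketch) - the import app connects with its own,
    # restricted database account instead of the main app's account.
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": "maindb",
            "USER": "importer",  # e.g. granted only SELECT on lookup
                                 # tables and INSERT on staging tables
            "PASSWORD": "...",
            "HOST": "db.internal",
            "PORT": "5432",
        }
    }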
This setup provides you with a few additional interfaces / points of failure, while maintaining a small attack surface against random Internet users. You can hire a security consultant to audit your Apache config and be assured that you have locked out the greater Internet and only have to worry about HTTP-authenticated users.
I have been running such setups for 15 years now. Users are annoyed by the double authentication, and password saving in Internet cafes is an issue with HTTP auth, but generally it is very seamless once set up.

Related

How to run multiple instances of a Django app?

This question doesn't involve any code. I just want to know a way to run multiple instances of a Django app, and whether it is really possible in the first place.
I make Django apps and host them on Apache. With these apps, I noticed a conflict when multiple users access the web app.
Let us assume it is a web-scraping app. If one user visits the app and runs the scraper, another user accessing the site from a different location doesn't seem to be able to visit the app or run the scraper until the scrape that the first user started finishes.
Is it really possible to make it independent for all the different users accessing the app?
There are a few ways you could approach this. You might consider putting your app into a container (Google search: docker {your stack}).
Then implement something like Docker Swarm or Kubernetes to allocate multiple instances of your app.
That being said, you might consider how to refactor your app to allow multiple users. It sounds like your scraping process locks things up, but in reality there's no reason your server should lock up during this.
It might be better to build your app so that when it receives a request, like someone visiting the site, the server serves the requested web page, and when a user asks for a scrape/task to run, the server calls your scraper service or script asynchronously (a minimal sketch follows).
That way your app can still function while a scrape is in progress. This would be MUCH more resource-efficient (and likely simpler) than spinning up tens or hundreds of instances of your entire app.
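For example, a minimal sketch of that asynchronous hand-off using Celery (this assumes a worker and a broker such as Redis are configured; all names are illustrative, not from your project):

    # tasks.py - the slow work happens in a background worker
    from celery import shared_task

    @shared_task
    def run_scrape(target_url):
        # ... long-running scraping logic goes here ...
        pass

    # views.py - the web process returns immediately
    from django.http import JsonResponse
    from .tasks import run_scrape

    def start_scrape(request):
        # Queue the scrape and answer right away, so other users can
        # keep browsing while the worker does the slow part.
        result = run_scrape.delay(request.GET["url"])
        return JsonResponse({"task_id": result.id})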
tl;dr: containerization for multiple instances.
Refactor app so a single user can't threadlock it.

Is it possible (and how) to lock a domain server behind a password?

I've been working on a Django web app to manage some data for tenants: things like their personal contact information, government ID, tenancy agreements, etc. (in other words, very GDPR-sensitive info), as well as personally private information such as expense reports. All of this is displayed on the front end of the app as intended, seeing as it's supposed to be a private tool for internal company use.
At the moment I run it at home on a local machine server, so there is little risk involved; however, for my convenience I'd like to take the web app live so I can access it while I'm out and about. The issue I'm having is that there really is no reason for anyone other than myself or business associates to use this app, and therefore no reason for anyone else to connect to the domain.
I've considered making the landing page of the website a login page and locking all other views behind it with CSRF protection, but even that is too close for comfort in my opinion, as it would mean allowing external entities to connect to the app. I'd much rather have a server which refuses any connection from the outside world straight away unless it is from me - in other words, a server which does not divulge any part of the app or database until login credentials have been correctly entered.
What I envision is that once you type in the domain and hit enter, the moment a connection is made, the server prompts you with an alert box asking for login credentials before any of the app or templates are loaded.
Is this even possible? I've never hosted any of my software online and do not want to fall into a nasty data breach situation by taking this live. At the same time it isn't ideal that the current system operates on the premise that I'm home all the time.
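What is described here (a browser alert box asking for credentials before anything loads) is essentially HTTP Basic authentication. A minimal Django middleware sketch of the idea, for illustration only - in practice this is usually done in the web server itself (Apache/nginx) and must always run over HTTPS:

    # middleware.py (sketch) - nothing is served without credentials
    import base64

    from django.contrib.auth import authenticate
    from django.http import HttpResponse

    class BasicAuthMiddleware:
        def __init__(self, get_response):
            self.get_response = get_response

        def __call__(self, request):
            header = request.META.get("HTTP_AUTHORIZATION", "")
            if header.startswith("Basic "):
                try:
                    decoded = base64.b64decode(
                        header.split(" ", 1)[1]).decode()
                    username, _, password = decoded.partition(":")
                    if authenticate(request, username=username,
                                    password=password):
                        return self.get_response(request)
                except (ValueError, UnicodeDecodeError):
                    pass
            # No valid credentials: the browser shows its login box and
            # nothing of the app or templates is revealed.
            response = HttpResponse(status=401)
            response["WWW-Authenticate"] = 'Basic realm="private"'
            return response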

How to define URLs only once for server and client?

Hi everyone! I am building a web application, i.e. a server-client application. For the interaction between the two, I have to define the URLs twice (hard-coded strings), both on the backend and the frontend, which makes future changes hard, because they require changing the code in two places rather than just one.
I am using Django and Angular, and so I am looking for a way to specify the back-end endpoints once, then ideally read them and use them for the Angular production build. Changes to the endpoints would then only require a new build, but no further changes.
Should these be defined in some .cfg file, read by the back end on server startup, and maybe somehow added to Angular's build process? Any suggestion would help, because this redundancy comes up in almost every webapp project and there has to be a more clever solution!
Thanks for the help in advance!
Here, it is the backend application that owns and defines the URL mappings to entities. Multiple clients can consume the same API - a web client, an Android client and an iOS client, for example. In this setup, your backend is the single source of truth for the URL mappings, and client applications should be configured to use the URL mappings defined in the backend application.
One possible way to do this is to serve the defined URLs on a known path of the backend application, and have your client applications configure themselves using the data provided there. For example, if you use Django REST Framework, by default the root path of the API ("/") serves the resources along with their URL mappings. You can use such a mechanism to configure your client applications at build time.
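If you don't use DRF's browsable root, a hand-rolled equivalent is easy. A minimal sketch, assuming your endpoints have URL names (the names below are examples):

    # views.py (sketch) - expose the backend's named routes as JSON
    from django.http import JsonResponse
    from django.urls import reverse

    def url_map(request):
        # URL names are examples; list whatever your project defines.
        names = ["user-list", "post-list", "comment-list"]
        return JsonResponse({name: reverse(name) for name in names})

An Angular build step can then fetch this once and bake the mapping into its environment config.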
How many endpoints do you have, and how likely are you to alter them? Most likely you will always have to make more changes than just in one place, since the reason for changing an endpoint is normally that you are trying to POST or GET new data structures. That means you would have to alter the request-handling code anyway to deal with the new data type or with what was being posted.
Also, consider some of the publicly available APIs out there - they don't give you an endpoint that serves a config file of available routes. When they make a change to their endpoints, they usually create a versioned API so that consumers can upgrade in their own time.
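For illustration, versioning usually just means mounting the API under versioned prefixes, so old clients keep working while the new version evolves. A minimal sketch (module paths are examples):

    # urls.py (sketch)
    from django.urls import include, path

    urlpatterns = [
        path("api/v1/", include("myapp.api_v1.urls")),  # frozen for old clients
        path("api/v2/", include("myapp.api_v2.urls")),  # current version
    ]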
In my opinion, unless you are planning a large scale web app, I wouldn't be too worried about trying to implement something like this.

Running a Meteor app as part of a wider Django project

I'm currently working on a project which requires some realtime functionality, such as multi-user chatrooms.
Ideally, I'm looking to have Meteor run the chat application (on a different port) and MongoDB act as a message broker to the Django back end, which would take care of user registration, management and everything 'non-realtime'.
This would involve setting up a reverse proxy which would redirect to a different port based on the URL (please let me know if I'm wrong about this).
Would this be possible (or even advisable)? Another option would be to implement the same with Tornado, but I have no experience with building Tornado-based apps and would rather do this with a framework I'm comfortable with.
Thanks,
You can have Django serve the Meteor front-end while providing access to its data using django-ddp, giving you some distinct advantages:
Continue to serve your existing Django project/apps.
No extra services or ports to manage.
Scale out by simply adding more front-end Python/Django servers (server to server IPC is done via the existing database connection).
Use django.contrib.auth user accounts in your Meteor app.
Familiar Python/Django code (no "callback" style such as with Tornado).
Use time-tested, trusted relational databases.
Use Django migrations to effectively manage schema changes.
There's a Gitter chat room where I can give you assistance if you need it.
DISCLAIMER: I'm the author of django-ddp.
A Meteor application is more than capable of handling the user registration flow and many other things. Why not just build the application entirely in Meteor? Your application sounds like a perfect candidate for it, with realtime interaction with your database at the core.
The other option would be to use SwampDragon, which adds realtime data binding within Django. It allows for simple bidirectional communication between the server and the client - again, essential for a chat application. It's nice and easy to get set up and running as well.
Are there any specific reasons not to implement your application in one framework alone?

Dividing a Django application (social network) into a Django frontend and a RESTful API

I'm writing a Django application (a social network) and thinking about dividing the monolithic project into two projects: UI and API. For example, Django would be used only to render pages, interacting with and taking data from an API written in web.py.
The pros are the following:
I can develop and test the API independently.
In the future, other UIs may appear (mobile, for example), and they will require such a service.
I plan to outsource the web UI development, so if my application has two modules, I can hand over only the UI one, without sharing the application logic.
The cons are the following:
I'm working alone, and developing two projects is harder than one.
I will not be able to use the cool Django admin panel; I will need to write my own.
web.py is more low-level compared with Django.
It's like a brain dump, but I would really appreciate it if you shared your experience of creating a web application with a UI module and an independent API module.
Update (more specific question, as Mike asked)
What Python framework would you use for creating the REST API of a social network which can be used by different client applications? Is using web.py to return JSON only, and rendering it with Django for the web, a good idea?
Thanks,
Boris.
I've been in a situation similar to yours. I ended up writing both the UI and the API part in Django; currently, I am serving them both out of the same process/project. You mentioned you wanted to be able to outsource the UI development, but do hear me out.
In the meantime, I have used django-piston to implement the RESTful front end, but a bit of preparation went into it:
Encapsulate all DB and ORM accesses into a library. You can do that either for your entire project, or on an app-by-app basis. The library is not just a low-level wrapper around your DB accesses, but can also answer higher-level 'questions', such as "all_comments_posted_by_friends()" or something (see the sketch after this list). This accomplishes two things:
You can call your pre-canned queries from UI views as well as API views without having to re-implement them in multiple places.
You will later be able to replace some - if not all - of the underlying DB logic if you ever feel like going to a NoSQL database, for example, or to some other distributed storage model. You can set your entire app up for this ahead of time, without actually having to worry about the complicated details right at the start.
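A minimal sketch of what such a library function can look like (the Comment model and the friends relation are invented names, just for illustration):

    # services.py (sketch)
    from myapp.models import Comment  # hypothetical model

    def all_comments_posted_by_friends(user):
        """Higher-level 'question' hiding the ORM from UI and API views."""
        friend_ids = user.friends.values_list("id", flat=True)
        return Comment.objects.filter(author_id__in=friend_ids)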
The authentication layer for the API was able to accept an HMAC/token-based header for programmatic access as well as normal Django auth. I set up the views in such a way that they would render plain JSON for programmatic clients (based on content type), and would render the data structure in HTML (with clickable links and clickable docstrings) if browsed by a human from a browser. This makes it possible for the API to be fully explorable and clickable by a human without having to read any docs, while at the same time it can be easily processed by a client just via JSON.
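A minimal sketch of that content-type switch in plain Django (the model and template are placeholders; my real implementation was built on django-piston rather than this exact code):

    # views.py (sketch)
    from django.http import JsonResponse
    from django.shortcuts import render
    from myapp.models import Comment  # hypothetical model

    def comment_list(request):
        data = {"comments": list(Comment.objects.values("id", "text"))}
        if "text/html" in request.META.get("HTTP_ACCEPT", ""):
            # A human in a browser gets an explorable HTML rendering.
            return render(request, "api/comments.html", data)
        # Programmatic clients get plain JSON of the same data.
        return JsonResponse(data)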
In effect, the database layer I built serves as the internal API. This same database layer can be used from multiple applications and multiple processes, if you wish to do so. The UI views and the REST views were both implemented in Django. They can be either in the same process or in separate processes (as long as they have access to the same database, for now).