At the moment I'm running a Django application that is the same for all of my clients.
Each client has its own subdomain e.g.:
http://client1.myapp.com/
http://client2.myapp.com/
client1 has its own settings file, client2 has its own settings file, and so on. The Django app itself is the same for everyone.
Nginx proxies the requests for each subdomain to an FCGI instance.
So, every client runs its own Django instance, consuming a lot of memory.
Is it possible to run one FCGI instance that switches to the right settings file based on the requested subdomain?
Thank you very, very much for your time.
Good question. There was an extensive discussion of this exact problem on the django developers mailing list a few weeks ago.
Basically there isn't a good official solution right now but lots of people are solving the problem themselves in various ways. If you read the whole thread you will probably have a better idea of how to proceed or who to talk to.
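One unofficial pattern that comes up in that discussion is to keep a single shared settings module for the one FCGI process and pick the per-client configuration at request time from the Host header. Below is a minimal middleware sketch of that idea; CLIENT_CONFIGS (a dict in settings) and request.client_config are names made up for illustration, not Django features.

```python
# middleware.py -- a minimal sketch of per-subdomain configuration,
# not an official Django solution. CLIENT_CONFIGS (a dict in settings)
# and request.client_config are illustrative names.

from django.conf import settings
from django.http import HttpResponseNotFound


class SubdomainConfigMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        host = request.get_host().split(":")[0]      # e.g. client1.myapp.com
        subdomain = host.split(".")[0]               # e.g. client1
        config = settings.CLIENT_CONFIGS.get(subdomain)
        if config is None:
            return HttpResponseNotFound("Unknown client")
        request.client_config = config               # per-client values for views
        return self.get_response(request)
```

Per-request choices such as which database to hit can be driven from this (e.g. via a database router), but process-level settings can't be swapped per request, which is part of why there is no clean drop-in answer.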
I am currently working on a django project where I need a separate backend application to be able to initiate gRPC requests to my django frontend.
Simply put: I want the backend to be able to access and change objects.
Therefore I need my Django project to also start my gRPC server when it starts up.
I did a lot of research and didn't really find any best practice on that.
As far as I can imagine, I could override Django's startup command to also start my gRPC server on a separate thread. But I am afraid that I will end up with a very bad-practice solution.
Is there any recommendable way to do this?
I am excited about every hint :)
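For what it's worth, here is a rough sketch of the approach described in the question: start the gRPC server in a daemon thread from an AppConfig.ready() hook. This is not an established best practice, and the servicer registration is left as a placeholder for your generated gRPC code.

```python
# apps.py -- a rough sketch of "start the gRPC server alongside Django",
# not an established best practice. The servicer registration line is a
# placeholder for your generated gRPC code.

import threading
from concurrent import futures

import grpc
from django.apps import AppConfig


def _serve_grpc():
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
    # my_pb2_grpc.add_MyServiceServicer_to_server(MyServicer(), server)
    server.add_insecure_port("[::]:50051")
    server.start()
    server.wait_for_termination()


class MyAppConfig(AppConfig):
    name = "myapp"

    def ready(self):
        # ready() can run more than once (management commands, the
        # autoreloader), so a real implementation should guard against
        # starting the server twice.
        threading.Thread(target=_serve_grpc, daemon=True).start()
```

Keep in mind that WSGI servers typically run several worker processes, so each worker would start its own gRPC server; that is one reason running it as a separate process (for example via a custom management command run as its own service) is often considered the cleaner route.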
This question doesn't involve any code. I just want to know a way to run multiple instances of a django app and if it is really possible in the first place.
I make Django apps and host them on Apache. With these apps, I noticed a conflict when multiple users access the web app.
Let us assume it is a web scraping app. If one user visits the app and runs the scraper, another user accessing the site from a different location doesn't seem to be able to visit the app or run the scraper until the scraping that the first user started finishes.
Is it really possible to make it independent for all the different users accessing the app?
There are a few ways you could approach this. You might consider putting your app into a container (Google search: docker {your stack})
Then implement something like Docker Swarm or Kubernetes to allocate multiple instances of your app.
That being said, you might consider how to refactor your app to allow multiple users. It sounds like your scraping process locks things. But in reality, there's no reason your server should lock up during this.
It might be better to build your app so that when it receives a request, like someone visiting the site, the server serves the requested web page. When a user asks for a scrape/task to run, the server calls your scraper service or script asynchronously.
That way your app can still function while a scrape is in progress. This would be MUCH more resource efficient (and likely simpler) than spinning up tens or hundreds of instances of your entire app.
tl;dr: containerization for multiple instances.
Refactor the app so a single user can't lock it up.
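To make the second point concrete, here is a minimal sketch of handing the scrape off asynchronously. A plain thread stands in for a proper task queue such as Celery or RQ, and run_scraper / start_scrape are illustrative names, not part of Django.

```python
# views.py -- a minimal sketch of running the scrape in the background
# so it doesn't block other users' requests. A plain thread stands in
# for a real task queue (Celery, RQ, etc.); run_scraper and start_scrape
# are illustrative names.

import threading

from django.http import JsonResponse


def run_scraper(url):
    ...  # the long-running scraping work goes here


def start_scrape(request):
    url = request.GET.get("url")
    # Respond immediately; the scrape continues in the background.
    threading.Thread(target=run_scraper, args=(url,), daemon=True).start()
    return JsonResponse({"status": "scrape started"})
```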
I am currently creating a website hosted with Django. I plan to use React as my frontend framework. I have done some research on putting them together, but most sources say I should go for the SPA model and have separate web servers for frontend and backend. The problem is that I wish to use Apache as a production server with Django and avoid having two separate servers. I have read about the hybrid model, in which Django serves the static files for React.
My biggest concern is security, as I have already set up Apache for security and I am aware that Node.js is somewhat insecure.
What would the best approach be: the separate SPA model or the hybrid model?
I'd say it's okay to go for the hybrid model if the project is small, you are the only one working on it, and you just want to get things done. I think it's kind of messy to create apps like this unless the project really isn't worth the extra time.
But if it's a big project and more than one developer is working on it (or will be), then I highly recommend going with separate web servers: one serving the frontend app and one the Django app.
Also note that you don't really need two different machines. You can use one machine for both, with two different web servers, and even that is not strictly necessary: you can use a single web server to serve both.
And security is not something that the choice of model degrades or improves by itself. It's up to you to configure the server and write both the frontend and backend apps securely enough to do the work for you.
More than one web server out there is about as secure as it can be and works well with both Django and React. I have used nginx many times to host both Django and React apps and never had a problem caused by nginx itself.
And one last piece of advice, if you will: creating good-quality apps requires a lot of time and energy, and it means working with different technologies that each do really well what they were made for. If you plan to be a really good developer, you should step out of your comfort zone and adapt to the new technologies that come out, and they come out rapidly. That requires constant learning, doing things in ways you are not yet used to, and making things work together even when they don't seem like a good fit at first glance.
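For reference, a minimal sketch of what the hybrid model mentioned above can look like on the Django side: the API lives under /api/ and every other path falls through to React's built index.html. The app name and the assumption that settings.py points the template and static file directories at the React build directory (e.g. frontend/build) are illustrative, not a fixed convention.

```python
# urls.py -- a minimal sketch of the hybrid model. It assumes settings.py
# points TEMPLATES "DIRS" and STATICFILES_DIRS at the React build
# directory (e.g. frontend/build); "myapp" is an illustrative API app name.

from django.urls import include, path, re_path
from django.views.generic import TemplateView

urlpatterns = [
    path("api/", include("myapp.urls")),
    re_path(r"^.*$", TemplateView.as_view(template_name="index.html")),
]
```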
I'm learning Django/backend programming but I'm not exactly sure how to organize all of my stuff. I'm planning on having two websites with different domains, one is a personal one, and another one is for an organization.
What would be the best way to create this with Django? Would I separate this into two projects, and have all of the files in each project? Should I create an app for each section of the site? Or should I put it all under one project, and have an app for each website?
Also, as a quick side question, why do I have to run Nginx and uWSGI instead of just using Django? I don't exactly understand the difference between all these things. Nginx is a proxy server that sends requests to uWSGI, which then passes them to Django, right? It seems excessive. I don't even know where to start in terms of creating a host name router...
Thanks a lot, and if you know any good reading links/books, let me know!
As a beginner, you had better stick with different projects for different domains. When you gain more experience, you will find out how much of the logic is reused in both, and can then decide whether to reuse the apps in a single project.
Please read the section on WSGI: https://docs.djangoproject.com/en/dev/howto/deployment/wsgi/
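To make the chain concrete: Nginx accepts the HTTP connection (and serves static files), uWSGI runs your Django code in worker processes, and each request ends up calling the `application` object in your project's wsgi.py, which looks like this in a standard project (the project name mysite is assumed):

```python
# mysite/wsgi.py -- the standard WSGI entry point ("mysite" is an assumed
# project name). uWSGI loads this module and calls `application` for each
# request that Nginx forwards to it.

import os

from django.core.wsgi import get_wsgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")

application = get_wsgi_application()
```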
What are the risks of running the Django development server like this in production? I understand the documentation says not to do it, but I have password protected all the pages.
The point is that your "password protection" is useless if a hacker can simply bypass that and read your database directly. We don't know if they can, but - as the docs say - the dev server has undergone no security testing whatsoever, so they might well be able to.
Plus, the server is single-threaded. It will only ever be able to serve one request at a time. That makes for a very slow experience for your users.
Seriously, there is no reason to do this. Setting up Apache + mod_wsgi, or whatever your preferred hosting environment is, is a five-minute process if you follow the very detailed instructions.
If you mean you want to deploy your Django project, you should use something like NGINX, etc.
If you are just asking what happens when a development server is public, you have the same risks as with any other project written on any platform.