Optimizing a Django app on Google App Engine

I am facing slow page load times (the list page in the Django admin) for a simple Django app deployed on Google App Engine with a PostgreSQL Cloud SQL database. There are fewer than 20 records, and SQL time is negligible even with 30 queries.
Most of the time is reported as domLoading. I assume domLoading refers to the initial load of the case/ page, which took 3 seconds.
Most of the solutions online refer to tweaking Apache/NGINX settings, but since I am using Google App Engine (a PaaS), I cannot directly control the web server settings; gcloud app deploy is supposed to handle load balancing and code versioning.
How do I improve the basic load time of a Django app on GAE? Both GAE and Cloud SQL are hosted in the same region.
PS: I did find some answers, like "Optimizing my Django app on Google App Engine", but they focus on optimizing SQL queries, which is not the issue here.

This could be due to not having any instances available, since that is roughly the time it takes to start a new instance. You could use warmup requests to improve performance in this case. If that doesn't help, you can use Cloud Trace to analyze the requests and isolate the issue.
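For reference, wiring up a warmup endpoint in Django is straightforward. This is only a minimal sketch, assuming the App Engine standard environment with automatic scaling; the handler name and route layout are illustrative, and warmup requests also have to be enabled in app.yaml (inbound_services: warmup):

```python
# urls.py (sketch) -- minimal App Engine warmup endpoint for a Django project.
# Assumes the standard environment with automatic scaling; warmup requests must
# also be enabled in app.yaml via "inbound_services: - warmup".
from django.http import HttpResponse
from django.urls import path


def warmup(request):
    # Do any expensive one-time work here (open DB connections, prime caches)
    # so the first real user request does not pay the cold-start cost.
    return HttpResponse("ok")


urlpatterns = [
    path("_ah/warmup", warmup),
    # ... your application's other routes ...
]
```

With this in place, App Engine sends a GET to /_ah/warmup before routing live traffic to a new instance, which hides most of the startup latency from users.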

Related

Upgrading PostgreSQL/Redis databases without downtime on GCP

I'm creating a web app in React with a Node.js backend, hosting everything on the Google Cloud Platform. I'm using a PostgreSQL database and a Redis database, and because I know very little about these databases, I'm using the managed options (Cloud SQL and Cloud Memorystore).
These are not the cheapest solutions, but for now, they'll do what I want them to do.
My question is this: imagine my web app is successful and grows; I'll probably want to run the databases myself (for example, a Redis cluster or a PostgreSQL cluster on Compute Engine). Will I be able to migrate my managed databases to the Compute Engine solution without downtime or loss of data?
If things get bigger I'll probably hire someone with more PostgreSQL/Redis knowledge, so that's not the problem. The only thing I want to know is: is it possible to move from a GCP managed solution to a self-managed solution on Compute Engine without loss of data and without downtime? I do not want to lose any data; a little downtime is not a problem.
Using the managed solution is, in fact, a better approach for handling databases. GCP takes over updates, management and maintenance of the database and provides reliable tools for backup and scaling.
But to answer your question: yes, it is possible to migrate with minimal downtime. You would configure primary/replica replication (formerly called master/slave) with synchronous replication, with the replica running on Compute Engine. Once the replica is in sync, you switch over and make it your primary database. That keeps the downtime close to the minimum possible.
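As a hedged illustration of the cutover step, you can watch replication lag on the primary and only promote the Compute Engine replica once it has caught up. The connection parameters below are placeholders, and the query assumes PostgreSQL 10 or later:

```python
# check_replication_lag.py (sketch) -- poll pg_stat_replication on the primary
# before switching your application over to the Compute Engine replica.
# Host/credentials are placeholders; requires psycopg2 and PostgreSQL 10+.
import psycopg2

conn = psycopg2.connect(
    host="PRIMARY_HOST",       # Cloud SQL primary (placeholder)
    dbname="postgres",
    user="replication_check",  # placeholder user
    password="***",
)

with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT application_name,
               state,
               pg_wal_lsn_diff(sent_lsn, replay_lsn) AS lag_bytes
        FROM pg_stat_replication
        """
    )
    for name, state, lag_bytes in cur.fetchall():
        print(f"{name}: state={state}, lag={lag_bytes} bytes")

conn.close()
```

Once the lag stays near zero, you promote the replica (for example with pg_ctl promote) and repoint the application at the new primary.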

Why is Django on Google App Engine very slow?

I have a Django server deployed on Google App Engine. A simple GET request takes around 2 seconds, while the same request takes around 300 ms when run locally. Both servers use the same MySQL database on Google Cloud SQL. I am testing this on my home Wi-Fi (100 Mbps), so I don't think it's a network issue; in any case the payload is pretty small (2.5 KB).
Has anyone seen this slowness when deploying to Google App Engine? Is there any configuration change I could make to Google App Engine that would make it faster?
Any suggestions are welcome.
Thanks!
When comparing Google App Engine's performance with a local run, keep in mind that a fresh GAE instance needs extra time to import all the necessary libraries and set up the Django framework.
The documentation states that instance startup time is on the order of seconds for the Standard environment and minutes for the Flexible environment. There are also a few Stack Overflow posts that shed some light on this.
You can profile your application with Cloud Trace to analyze the requests and isolate what causes the issue, and then improve it.
Beyond that, there are various ways to optimize your application's performance; typical ones include:
Scaling configuration: set min_idle_instances so that idle instances are kept running and ready to serve traffic.
Warmup requests: use them to reduce request and response latency while your app's code is being loaded onto a newly created instance.
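Beyond the scaling options above, a couple of application-level Django settings can also shave per-request latency. This is only a sketch, and the values shown are illustrative:

```python
# settings.py (sketch) -- application-level tweaks; values are illustrative.
DEBUG = False  # DEBUG=True adds noticeable per-request overhead in Django

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        # ... host/credentials for your Cloud SQL instance ...
        # Keep connections open between requests instead of reconnecting
        # to Cloud SQL on every request.
        "CONN_MAX_AGE": 60,
    }
}
```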
Furthermore, the official "Running Django on the App Engine environments" tutorials are worth going through so that you can spot any details you may have missed.
Finally, during my investigation I came across PageSpeed Insights, which analyzes the content of a web page and generates suggestions to make that page faster; it could come in handy.
I hope this information is helpful and points you in the right direction.

Docker Django NGINX Postgres build with scaling

I have worked mostly with monolithic web applications over the years and have not spent very much time looking at Django high-availability scaling solutions.
There are plenty of NGINX/Postgres/Django/Docker builds on hub.docker.com, but I could not find one that would be a good starting point for a scalable Django web application.
Ideally the Docker project would include:
A web container configuration that makes it easy to add new web containers to the web app
A database container configuration that makes it easy to add new database shards
Django microservice examples would also be a plus
Unfortunately, I do not have very much experience with Kubernetes and Docker swarms.
If there are none, I may make this my next github.com project.
Thank you in advance.
That is a very broad request. It depends on many factors:
What is the performance requirement?
Make sure you really need that kind of scaling. In most cases, a single good server with docker-compose can handle your needs. If you want to reduce the risk of downtime, create two servers, add a load balancer between them, and extract shared services like the database or caching into separate services that both server instances use.
How do you want to run your applications?
Docker-Compose:
Checkout https://github.com/pydanny/cookiecutter-django on how to do it with docker-compose on a single server.
Kubernetes:
Django runs quite well in Kubernetes; I am running several applications on it. The catch is that you need to set up all the other services (Redis, DB, Elasticsearch, etc.) properly. Each of those services runs independently of your main Django app and is connected through libraries like haystack or python-redis.
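For example, pointing Django's cache at a Redis service running elsewhere in the cluster is just a settings change. A minimal sketch, assuming Django 4.x and a placeholder service hostname:

```python
# settings.py (sketch) -- Django 4.x built-in Redis cache backend.
# "redis-service" is a placeholder for however the Redis host is exposed
# in your cluster (e.g. a Kubernetes Service name).
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.redis.RedisCache",
        "LOCATION": "redis://redis-service:6379",
    }
}
```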
Heroku & others
There are also providers like Heroku that offer auto scaling for a price. If price is less of a concern, go with these solutions; they are very easy to set up and maintain.
How scalable should your DB be?
If you try to set up your own DB cluster across different servers/regions, you will spend a lot of time building and maintaining it. I would recommend using a managed DB service like AWS RDS instead. They do this for you with easy setup and maintenance, and you can configure how scalable it should be. It costs some money, but it's the least-effort solution.
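If you do end up with a managed primary plus read replicas, Django can split traffic between them with a database router. The sketch below follows the primary/replica pattern from the Django documentation and assumes your DATABASES setting defines "default" and "replica" aliases:

```python
# routers.py (sketch) -- send reads to a replica and writes to the primary.
# Assumes DATABASES defines "default" (primary) and "replica" aliases and that
# DATABASE_ROUTERS = ["path.to.routers.PrimaryReplicaRouter"] is configured.
import random


class PrimaryReplicaRouter:
    def db_for_read(self, model, **hints):
        # Spread reads across the primary and the replica.
        return random.choice(["default", "replica"])

    def db_for_write(self, model, **hints):
        # All writes go to the primary.
        return "default"

    def allow_relation(self, obj1, obj2, **hints):
        return True

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # Run migrations only against the primary.
        return db == "default"
```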

What are some of the most appropriate ways to serve a large-scale Django app on Google Compute Engine?

I am working on a project that will presumably have a lot of user-uploaded content and a fairly large user base. I am now looking to deploy this app to Google Compute Engine.
I have looked up the possible options, and nginx + gunicorn seems to be a good one. In the beginning I am going to use a single ns-1 instance with a 100 GB persistent disk and Google Cloud SQL for my database.
But I want to make things scalable so that I can add more instances and disk storage without any hassle in the future. I am very confused about how to do that, so my main concern is:
I want a setup that lets me extend my disk space and the number of Compute Engine instances whenever I want.
To have a fully scalable architecture, a good approach is to separate computation/serving from file storage, and both from data storage. Going part by part:
file storage - Google Cloud Storage: by storing common service files in a GCS bucket, you get a central repository that is both highly redundant and scalable (see the Django settings sketch below);
data storage - Google Cloud SQL: a highly reliable, scalable MySQL-like database back end that can be resized at will to accommodate increasing database usage;
front ends - GCE instance group: template-generated web/computation front ends, forming a resource pool into which a forwarding rule (load balancer) distributes incoming connections.
In a nutshell, this is one of the most adaptable set-ups I can think of, while you keep control over every aspect of the service and underlying infrastructure.
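On the Django side, the file-storage part of this setup is usually just configuration. A minimal sketch using django-storages, with a placeholder bucket name:

```python
# settings.py (sketch) -- store uploaded media in a GCS bucket via
# django-storages (pip install "django-storages[google]").
# The bucket name is a placeholder.
DEFAULT_FILE_STORAGE = "storages.backends.gcloud.GoogleCloudStorage"
GS_BUCKET_NAME = "my-app-user-uploads"
```

With that in place, FileField/ImageField uploads land in the bucket rather than on an instance's persistent disk, so the front-end instances stay stateless and can be added or removed freely.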
A simpler approach would be to run a Python app on Google App Engine, which will auto-scale your instances (both up and down) and supports Django, as mentioned by @spirulence in the comments.
Here are some starting points:
Django and Cloud SQL support on App Engine
Running Pure Django Projects on Google App Engine
Third-party Libraries in Python 2.7
The last link shows which versions of Django are currently supported.

Django hosting with MySQL

I'm running my Python-based REST services on a free Django hosting site called PythonAnywhere.
So far each web service responded in about 100 ms, so my frontend was very fast, but in the last few days response times have changed drastically: the same REST API now takes around 30 seconds.
With this performance I cannot schedule my demo, so I am planning to set up a new Django/Python/MySQL environment myself.
What are the best ways to host/set up a Django-based application (with MySQL)? Preferably free hosting, but I don't mind spending a few bucks for a better service.
For production deployments, the recommended setup is to serve Django through a WSGI server.
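For reference, the WSGI entry point Django generates looks like this (a sketch; "mysite" is a placeholder project name):

```python
# mysite/wsgi.py (sketch) -- standard Django WSGI entry point;
# "mysite" is a placeholder project name.
import os

from django.core.wsgi import get_wsgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
application = get_wsgi_application()
```

A typical invocation would then be something like gunicorn mysite.wsgi:application, usually behind a web server or the platform's own front end.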
Stack Overflow is not the right place to solicit blanket recommendations, especially since you haven't given any idea of what your expected load/usage is.
If you just need something to run your application online quickly, a PaaS provider should give you the quickest ramp-up time. I have used Heroku before and it's very simple to get started.