Continuously develop and deploy a Django app with Visual Studio Code and Docker

I am developing a Django app locally with Visual Studio Code. In preparation for deployment I "dockerized" everything and now I am already able to run this container locally.
Before I try to build my Docker image somewhere else (I have Google Cloud Run in mind), I want to make sure that I still can debug my code.
With the official 'Python in a container' tutorial I am able to set breakpoints and so on when my app runs locally with Docker.
So I think the workflow will look like this:
I develop my app locally and debug it within Visual Studio Code.
For further debugging I can do this locally with Docker as described above.
When everything looks good I push this container to Google Cloud Run or whatever.
Does that sound like a reasonable plan or did I miss something important? In the end, I am looking for an easy convenient way to continuously develop (and debug) a Django app with Visual Studio Code and deploy it with Docker.

I've never used Google Cloud Run or anything similar, but based on my experience with remote servers I can advise the following approach. You can use GitHub Actions and Docker Hub. Cover your application, or at least its critical parts, with tests which ensure that everything important works properly. You can set GitHub Actions up so that your tests run every time you push to your GitHub repo. If the tests pass, an image of your application (usually named your_app:latest) is updated on Docker Hub, allowing you to build from that image.

It's good practice to have multiple images. For example, you could have a stable version, say your_app:v1.0, and a beta version, your_app:latest. That way you can run the stable version on a production server while the beta version runs on a development server. Do not update stable versions; release new ones and keep the existing ones.

An example of what the GitHub Actions workflow file can look like:
name: your_app_workflow
on: [push]
jobs:
  tests:
    name: Run tests
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # run your tests here, for example:
      - run: python manage.py test
  push_to_docker_hub:
    name: Push Docker image to Docker Hub
    runs-on: ubuntu-latest
    needs: tests
    steps:
      - name: Check out the repo
        uses: actions/checkout@v2
      - name: Push to Docker Hub
        uses: docker/build-push-action@v1
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}
          repository: your_repository_on_dockerhub
          tag_with_ref: true
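Following the stable/beta idea above, one way to publish a pinned stable tag alongside latest is an extra step with an explicit tags input (a sketch, assuming docker/build-push-action v1; note that tag_with_ref: true would also turn a pushed git tag such as v1.0 into an image tag automatically):

      - name: Push stable tag to Docker Hub
        uses: docker/build-push-action@v1
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}
          repository: your_repository_on_dockerhub
          tags: v1.0   # placeholder version; bump this for each release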
Maybe you already know the following, but I will mention it anyway. Django's built-in default database is SQLite, which is not suited to production use, so if you are going to let others use your product you MUST think about another database. The current standard in the web industry is PostgreSQL; there are Mongo, Redis and others, but PostgreSQL is used the most. Also, Django doesn't serve static and media files in production, so you will have to use a proxy server such as Nginx. Nginx cannot talk to your Django app directly, so you will need an intermediary such as Gunicorn. Again, I don't know about Google Cloud Run, but on a typical remote server you would do it this way.
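To make that concrete, here is a minimal docker-compose sketch of such a Postgres + Gunicorn + Nginx stack (the service names, the mysite.wsgi module, and the paths are placeholders, not from the answer above):

version: "3.8"
services:
  db:
    image: postgres:13
    environment:
      POSTGRES_PASSWORD: change-me     # placeholder credential
    volumes:
      - pgdata:/var/lib/postgresql/data
  web:
    build: .
    # Gunicorn sits between Nginx and Django
    command: gunicorn mysite.wsgi:application --bind 0.0.0.0:8000
    volumes:
      - static:/app/static
    depends_on:
      - db
  nginx:
    # serves static/media files and proxies everything else to web:8000;
    # an nginx.conf doing that is assumed and omitted here
    image: nginx:stable
    volumes:
      - static:/app/static
    ports:
      - "80:80"
    depends_on:
      - web
volumes:
  pgdata:
  static: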

Related

Should I use Docker in order to be able to run ChromeDriver on Azure Web App Services in a Django server?

Recently I started a Django server on Azure Web App Service, and now I want to add usage of ChromeDriver for web scraping. I have noticed that for that I need to install some additional Linux packages (not Python) on the machine. The problem is that they get erased on every deployment. Does that mean I should switch to Docker?
A container works, but you can also try to pull down the additional packages in a custom startup file, without messing with the machine after each deployment:
https://learn.microsoft.com/en-us/azure/developer/python/tutorial-deploy-app-service-on-linux-04

Is docker best just for prod environments?

I just decided to jump into using Docker to test out building a microservice application using AWS Fargate.
My question really comes from hearing about many development teams using Docker to avoid people saying the phrase "works on my machine" when committing code. Although I see how Docker solves that problem, I still do not see how Docker images are actually used in a development environment.
The workflow for anything other than production baffles me. An example of my thinking:
A team of 10 devs all use Docker, and each pulls the image from the repo to their machine, along with the source code. If each dev has an individual version of the image, then any edits they make to that image are their own; when they push back to the repo, none of those edits can be merged (and editing an image's source code is not easily done anyway).
I am thinking of it in the same way as git/GitHub, where code is pushed to a branch and then merged to master to create a finished product.
I guess pulling the code from the GitHub master and creating the Docker image from it is the way it gets used, but again that points back to my original assumption that Docker is for production environments rather than development.
Is Docker used in development mostly so a dev can test a feature in the same container every other dev on the team is using, so all the environments match across the team?
I just really do not understand the workflow of development environments with docker.
I'd highlight three cases where I've found Docker particularly useful, prior to a production deploy:
Docker is really useful for installing local dependencies. If your application needs a database, docker run postgres with appropriate options. Need a clean start? Delete the container. Running two microservices that need separate databases? Start two containers. The second microservice is maintained by another team? Run it in a container too. (A compose sketch of this case follows these three points.)
Docker is useful for capturing the build environment in the CI system. Jenkins, for example, can run build steps inside a container, bind-mounting the current work tree in, so it's useful to build an image that just contains build-time dependencies (which can be updated independently of the CI system itself).
If you're running Docker in production, you can test the exact thing you're about to run. You're guaranteed the install environment will be the same in the QA and prod environments, because it's encapsulated inside the same Docker image. A developer can debug problems against the production-installed code without actually being in production.
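A minimal docker-compose sketch of that first case (every service and image name here is illustrative, not taken from the answer):

version: "3.8"
services:
  # your microservice's database; delete the container for a clean start
  orders-db:
    image: postgres:13
    environment:
      POSTGRES_PASSWORD: dev           # throwaway, local-only credential
  # the second microservice's separate database
  billing-db:
    image: postgres:13
    environment:
      POSTGRES_PASSWORD: dev
  # the other team's microservice, consumed as a prebuilt image
  billing:
    image: example/billing-service:latest   # hypothetical image name
    depends_on:
      - billing-db

docker-compose up -d starts the whole set, and docker-compose down -v gives you the clean start described above.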
In the basic scenario you describe, an important detail to note is that you never "edit an image"; you always docker build a new image from its Dockerfile and other source code. In compiled languages (C++, Go, Java, Rust, Haskell) the source code won't be in the image. Even if you're "using Docker in development" the actual source code will be in some other system (frequently Git), and typically you will have a CI system that builds "official" images from that source code.
Where I see Docker proposed for day-to-day development, it's either because the language ecosystem in use makes it hard to have multiple versions concurrently installed, or to avoid installing software on the host system. You need specific tooling support to "develop inside a container", and if developers choose their own IDE, this support is not universal. Conversely, in between OS package managers (APT, Homebrew) and interpreter version managers (rbenv, nvm) it's usually straightforward to install a couple of things on the host. If your application isn't that sensitive to, say, the specific version of Node, it's probably easier to use whichever version is already installed on your host than to try to insert Docker into the process.

Deploy Django app with Docker

I'm attempting to deploy a Django app via docker, first locally, and then to a cloud server. I could not find an answer to my initial question before I attempt this: if I run docker-machine create, I'm guessing this should be run from within my virtualenv, right?
This would then grab all of my specific app dependencies, and begin to build certificates to throw in the container? If not, please explain otherwise.
Yes you are correct.
I will try to help based on my experience, if you want to deploy Django apps via Docker.
First you need to set up Docker Machine on your local machine; please see the instructions. The default driver that will be used is VirtualBox (--driver virtualbox).
List the specific dependency images your app needs, e.g. nginx, postgres, uwsgi. If you need to fetch an image and then modify it, use a Dockerfile (that is the best practice for you).
I suggest you use docker-compose; it really makes a project easy to manage. You have to define all the images you need for your app in the docker-compose file. Please read this reference.
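For instance, a skeletal docker-compose.yml for the stack mentioned above (nginx, postgres, uwsgi) might look like this; the module name and ports are placeholders:

version: "3"
services:
  db:
    image: postgres:13
    environment:
      POSTGRES_PASSWORD: change-me     # placeholder credential
  web:
    build: .                           # built from your project's Dockerfile
    command: uwsgi --http 0.0.0.0:8000 --module mysite.wsgi
    depends_on:
      - db
  nginx:
    image: nginx:stable                # needs a config that proxies to web:8000
    ports:
      - "80:80"
    depends_on:
      - web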
After you finish developing your app and want to deploy it to a production (cloud) server, you just need to copy your whole project over and run docker-compose. All image dependencies will be pulled automatically in the cloud.
As a reference, you can look at this project (an open source project that I developed). In that project I use a Makefile to manage the docker-compose commands, which makes them easy to work with.
An example of dockerfile
An example of docker-compose.yml
An example of Makefile
Hope this will help you.

docker unit test setup

I want to set up a unit test environment for my product. I have a web application built on nginx in Lua, which uses MySQL and Redis. I think Docker will be good for this, although I am new to Docker. My application runs on a CentOS server (the production server).
I am planning to set up separate containers for MySQL, Redis and the web app, and then write a UT application (unit tests for Lua using the Busted framework) on my Mac (my development machine is a Mac) or in a VM to test it. The UT application will talk to the nginx container, and nginx will use the MySQL and Redis containers. Is this good? If yes, can someone guide me on how to do this, maybe with some good links? If no, what would be a better way? I have already tried Vagrant, but that took too much time, which shouldn't be the case for unit tests.
For an example of how we set up our project template, you may have a look at phundament/app and its testing setup.
We are using a dockerized GitLab installation with a customized runner, which is able to execute docker-compose.
Note! The runner itself is running on a separate Docker host.
We are using docker-compose.yml to define the services in a stack with adjustments for development and testing.
The CI configuration is optimized to handle multiple concurrent tests of isolated stacks, this is just done by specifying a custom COMPOSE_PROJECT_NAME.
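As a sketch of what such a job can look like in .gitlab-ci.yml (assuming a runner that can execute docker-compose; the job name and test command are illustrative):

test-stack:
  variables:
    COMPOSE_PROJECT_NAME: "ci_$CI_JOB_ID"        # unique name isolates this stack
  script:
    - docker-compose up -d
    - docker-compose run --rm web ./run-tests.sh # hypothetical test entrypoint
    - docker-compose down -v                     # tear down and drop volumes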
Some in-depth documentation about our testing process, plus useful information about docker-compose and dockerized CI:
#testing README
#testing Docs
CI builds
Extending services and Compose files
Docker-in-Docker for CI?
Finally, Travis CI has also supported Docker for a while now, but I haven't tested this approach at all.
If you are new to Docker-based CI, please look at Drone (a minimal pipeline sketch follows these links):
Official page
Github repo
Tutorial
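For orientation, a minimal .drone.yml (a sketch assuming the Drone 1.x pipeline format; images and commands are placeholders you would swap for your own toolchain):

kind: pipeline
type: docker
name: default

steps:
  - name: unit-tests
    image: alpine:3              # replace with an image containing your test tools
    commands:
      - echo "run busted / your unit tests here"

services:
  - name: mysql                  # background services the tests can talk to
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: test
  - name: redis
    image: redis:6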
There are some drawbacks to this solution (like the size of the images), but it will get you off the ground.

How to use vagrant to develop on django locally and then deploy to EC2/Azure?

I chose Vagrant so that other developers in my team can quickly start contributing to the project. Is there any way we can also make it easy for the developed code to be deployed on EC2 or Azure servers? If there are any articles on the optimal setup, please point me to them. Thanks!
The first video of Getting Started with Django shows how to use Vagrant for local Django development and how to deploy to Heroku; you may want to use the first part of the tutorial (the one related to local development). For the second part it depends on how you are going to deploy, but as long as your code is in a Git repository you could clone it to EC2/Azure from Git.