Dockerfile for .sqlite3 database - django

I currently have a Django application I want to deploy. The database is SQLite3, and I would like to run it in a separate container from the web application. I need to create a Docker image of the database and push it to my repo on AWS ECR. I already have an image of the application there, but I need a separate image of the database so the data will persist. Any advice on creating a Dockerfile or docker-compose file to do this would be great.
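One thing worth noting up front: SQLite is an embedded, file-based database, so there is no SQLite server process to run in its own container; the usual pattern is to keep the .sqlite3 file on a named volume rather than baking it into an image. A minimal docker-compose sketch along those lines (the ECR image URI, port, and paths are placeholders):

    # Persist the SQLite file on a named volume so it survives redeploys.
    version: "3.8"
    services:
      web:
        image: 123456789012.dkr.ecr.us-east-1.amazonaws.com/myapp:latest  # placeholder URI
        ports:
          - "8000:8000"
        volumes:
          - sqlite_data:/app/data   # point Django's DATABASES NAME at /app/data/db.sqlite3
    volumes:
      sqlite_data: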

Related

Permanent Superuser Django Docker

So, I have a Dockerfile that runs Django via CapRover for the backend of my website. Among the components, I have an admin panel that I log into using a superuser I created by SSH-ing into the Docker container. The problem is that whenever I change my backend by uploading it as a .tar via CapRover, I have to create my superuser again.
My question is: is there any way to avoid this by creating some sort of persistence? Or is there any way of automating this via my Dockerfile?
Thanks!
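For reference, one common way to automate this (assuming Django 3.0+, where createsuperuser --noinput reads the DJANGO_SUPERUSER_* environment variables) is to run it from the container's entrypoint, tolerating the "already exists" error on redeploys:

    # Entrypoint sketch -- env var values are placeholders; set them in CapRover instead of hardcoding.
    export DJANGO_SUPERUSER_USERNAME=admin
    export DJANGO_SUPERUSER_EMAIL=admin@example.com
    export DJANGO_SUPERUSER_PASSWORD=change-me
    python manage.py migrate --noinput
    python manage.py createsuperuser --noinput || true  # no-op if the user already exists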

Pushing Fortio in cloud foundry using docker image

I am trying to push Fortio (a load-testing tool for microservices) as an application in Cloud Foundry, using a Docker image from Docker Hub:
cf push <app-name> --docker-image fortio/fortio --random-route
The application crashes when Cloud Foundry tries to start it. cf logs says: executable file not found in $PATH
It works fine in my local Docker setup but not in Cloud Foundry.
Any help?
https://docs.cloudfoundry.org/devguide/deploy-apps/push-docker.html
A Docker image that meets the following requirements:
The Docker image must contain an /etc/passwd file with an entry for the root user. In addition, the home directory and the shell for that root user must be present in the image file system.
Unfortunately, the public Docker image fortio/fortio doesn't meet this particular requirement.
Solutions:
- build a customized Docker app image that includes a shell and /etc/passwd (sketched below)
- use the source code and push it as a native Go app rather than a Dockerized app
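A hedged sketch of the first option: rebase the fortio binary onto a small base image that already ships a shell and an /etc/passwd with a root entry. The binary path /usr/bin/fortio is an assumption; verify it against the upstream image.

    # Dockerfile sketch -- binary path and base image tag are assumptions.
    FROM fortio/fortio AS upstream

    FROM alpine:3.19   # ships /bin/sh and an /etc/passwd with a root entry
    COPY --from=upstream /usr/bin/fortio /usr/bin/fortio
    EXPOSE 8080
    ENTRYPOINT ["/usr/bin/fortio"]
    CMD ["server"]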

How to download updated sqlite Database from AWS EB? (Django app)

I have deployed my Django app on AWS Elastic Beanstalk, using the SQLite database. The site has been running for a while.
Now I have made some changes to the website and want to deploy again, while keeping the changes the database has accumulated (from form inputs).
But I am not able to find a way to download the updated database or any files. The only version of the app I am able to download is the one I uploaded when I first deployed.
How can I download the currently running app version and deploy with the same database?
You may use scp to connect to the remote server and download all your files.
As commented, it is better to use dumpdata and then transfer that file (instead of the raw database or the full site).
However, if the remote system's Python version is not the same as your Django site's, dumpdata may not work (this was my case), so I downloaded the whole site folder back to local and then used dumpdata there.
On AWS Elastic Beanstalk, a Python app's files live in /opt/python/current/app.
Just use scp -r to download the whole site to local, as sketched below.
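Putting that together (host, key file, and project paths are placeholders):

    # Pull the deployed app, including the live SQLite file, off the EB instance.
    scp -r -i ~/.ssh/eb-key.pem ec2-user@<eb-instance-ip>:/opt/python/current/app ./remote_app

    # Dump the data with an environment matching the remote Django version...
    cd remote_app
    python manage.py dumpdata --natural-foreign --natural-primary > data.json

    # ...then load it into the updated project's database.
    cd ../my_updated_project
    python manage.py loaddata ../remote_app/data.json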
I believe your SQLite db file is being packaged from your local machine during eb deploy. If that is the case, then all you need to do is not include the db.sqlite file in your deployments.
Move the db.sqlite file to S3 or an EBS persistent volume so that it persists.
Use .ebextensions to run the db migration in AWS on top of the persistent db file, just the way Django recommends (a config sketch follows below).
To be on the safe side, you can SSH into your EB environment, which is an EC2 instance, and download the db.sqlite file just in case.
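A sketch of the .ebextensions step; the venv path matches the older Amazon Linux Python platform that keeps apps under /opt/python/current/app, so treat it as an assumption and adjust for newer platforms:

    # .ebextensions/01_django.config
    container_commands:
      01_migrate:
        command: "source /opt/python/run/venv/bin/activate && python manage.py migrate --noinput"
        leader_only: true   # run migrations on a single instance only

Keeping db.sqlite3 out of the bundle is then a matter of listing it in .ebignore (or .gitignore, if you deploy from git).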

Deploy Django app with Docker

I'm attempting to deploy a Django app via Docker, first locally, and then to a cloud server. I could not find an answer to my initial question before I attempt this: if I run docker-machine create, should this be run from within my virtualenv?
Would this then grab all of my app's specific dependencies and begin building certificates to put in the container? If not, please explain otherwise.
Yes you are correct.
I will try to help you from my experience deploying Django apps via Docker.
First you need to set up Docker Machine on your local machine; please see the instructions. The default driver that will be used is --driver virtualbox.
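For example (note that docker-machine has since been deprecated in favor of Docker Desktop and docker context, so treat this as the historical workflow):

    docker-machine create --driver virtualbox default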
List the specific dependency images your app needs, e.g. nginx, postgres, uwsgi; if you need to fetch an image and then modify it, use a Dockerfile (that's the best practice).
I suggest you use docker-compose; it really makes a project easy to manage. You define all the images your app needs in the docker-compose file. Please read this reference.
After you finish developing your app and want to deploy to a production server (cloud), you just need to copy your whole project and run docker-compose; all image dependencies will be pulled automatically in the cloud. (A minimal compose sketch follows after the example links below.)
As a reference, you can see this project (an open-source project I developed). In that project, I use a Makefile to manage the docker-compose commands, which makes things easy to manage.
An example of a Dockerfile
An example of a docker-compose.yml
An example of a Makefile
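Along those lines, a minimal docker-compose sketch; service names, image tags, and the gunicorn module path are illustrative assumptions:

    version: "3.8"
    services:
      db:
        image: postgres:16
        environment:
          POSTGRES_DB: app            # placeholder names/credentials
          POSTGRES_USER: app
          POSTGRES_PASSWORD: change-me
        volumes:
          - pgdata:/var/lib/postgresql/data
      web:
        build: .                      # built from your Dockerfile
        command: gunicorn myproject.wsgi:application --bind 0.0.0.0:8000
        ports:
          - "8000:8000"
        depends_on:
          - db
    volumes:
      pgdata: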
Hope this will help you.

How To Stop Overwriting Directories When Pushing to Heroku

This seems simple enough, but I can't find a fix. I have a Django app on Heroku, which I'm using as a staging environment. My app uses a media directory for images and files uploaded from the admin pages.
When I git push heroku master, it destroys this directory even though it's not added in the repo.
How can I stop this?
Heroku doesn't provide persistent storage; the dyno filesystem is ephemeral.
On git push, your code is transformed into a slug, which is then distributed to a dyno that your application runs in. See https://devcenter.heroku.com/articles/slug-compiler for details.
If you want persistent storage, you should integrate your application with AWS S3 (Simple Storage Service) or some other service.
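For Django media files specifically, the usual route is django-storages with the S3 backend. A minimal settings sketch (bucket name and region are placeholders; on Django 4.2+ you would configure this via the STORAGES setting instead):

    # settings.py -- requires: pip install django-storages boto3
    INSTALLED_APPS += ["storages"]
    DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
    AWS_STORAGE_BUCKET_NAME = "my-media-bucket"  # placeholder bucket name
    AWS_S3_REGION_NAME = "us-east-1"             # placeholder region
    # boto3 reads credentials from the environment or an attached IAM role.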