Permanent Superuser Django Docker

So, I have a Dockerfile that I use to run Django via CapRover for the backend of my website. Among the components is an admin panel that I log into using a superuser I created by ssh-ing into the Docker container. The problem is that whenever I update my backend by uploading it as a .tar via CapRover, I have to create my superuser again.
My question is: is there any way to avoid this by creating some sort of persistence? Or is there any way of automating this via my Dockerfile?
Thanks!

Related

Dockerfile for .sqlite3 database

I currently have a Django application I want to deploy. The database being used is SQLite3, and I would like to run it in a separate container from the web application. I need to create a Docker image of the database and push it to my repo on AWS ECR. I already have an image of the application there but need a separate image of the database so the data will persist. Any advice on creating a Dockerfile or docker-compose file to do this would be great.

Is there a way to create superuser for Django apps on Heroku without using the CLI?

I changed my app to use PostgreSQL on Heroku instead of the SQLite database that I have been using in development. Now I want to log into my admin portal on my hosted app, but I am not sure how to create a superuser.
Is there a way to push my local database to my Heroku database? I have been deploying through GitHub (not using the Heroku CLI) so I was wondering if there was a way without using the CLI.
If there isn't a way without using the Heroku CLI, could someone tell me how I would init the Heroku git repository, considering the fact that I already have a GitHub repository that I have been using?
This is my settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'db.sqlite3',
    }
}

import dj_database_url

db_from_env = dj_database_url.config(conn_max_age=600)
DATABASES['default'].update(db_from_env)
A couple of ways you can do this:
Include a data migration in your app's migrations; create an empty migration with ./manage.py makemigrations <app_name> --empty, then write a RunPython operation in that file that adds the superuser (see the first sketch below).
Create a custom start script for the app and, in that script, invoke a management command that ensures a superuser is always there (see the second sketch below).
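For reference, a minimal sketch of the data-migration approach. The app label, migration names, and default username are placeholders, and the password is read from an environment variable rather than hard-coded:

# yourapp/migrations/000X_create_superuser.py (placeholder name), created
# with ./manage.py makemigrations yourapp --empty and then filled in.
import os

from django.db import migrations


def create_superuser(apps, schema_editor):
    # Import the real user model here: create_superuser() lives on the
    # concrete manager, not on the historical model apps.get_model() returns.
    from django.contrib.auth import get_user_model

    User = get_user_model()
    if not User.objects.filter(username="admin").exists():
        User.objects.create_superuser(
            username="admin",
            email="admin@example.com",
            password=os.environ["DJANGO_SUPERUSER_PASSWORD"],
        )


class Migration(migrations.Migration):
    dependencies = [("yourapp", "000X_previous_migration")]
    operations = [migrations.RunPython(create_superuser, migrations.RunPython.noop)]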
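And a sketch of the management-command approach; the file path and environment variable names here are my own assumptions:

# yourapp/management/commands/ensure_superuser.py (hypothetical path)
import os

from django.contrib.auth import get_user_model
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Create the default superuser if it does not exist yet."

    def handle(self, *args, **options):
        User = get_user_model()
        username = os.environ.get("DJANGO_SUPERUSER_USERNAME", "admin")
        if User.objects.filter(username=username).exists():
            self.stdout.write("Superuser already exists; nothing to do.")
            return
        User.objects.create_superuser(
            username=username,
            email=os.environ.get("DJANGO_SUPERUSER_EMAIL", "admin@example.com"),
            password=os.environ["DJANGO_SUPERUSER_PASSWORD"],
        )
        self.stdout.write("Created superuser %s." % username)

The start script can then run python manage.py migrate followed by python manage.py ensure_superuser before starting the web server; because the command is idempotent, it is safe to run on every deploy.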
Is there a way to push my local database to my Heroku database?
Yes, if you are using the same database software locally as you are on Heroku. For example, you can push a database into Heroku Postgres using the Heroku CLI. But if you are still developing with SQLite (which is not recommended, as these products aren't drop-in replacements for each other), there is no simple way to push your database.
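For example, assuming a local Postgres database named mylocaldb (a placeholder), the push would look something like:
heroku pg:push mylocaldb DATABASE_URL --app <your-app-name>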
I know you are asking for a solution without using the Heroku CLI, but I'm still going to recommend using it.
Use of the Heroku CLI and your deployment method are unrelated. In fact, command-line deployment doesn't even use the CLI—it's done via regular git push. You can (and in my opinion should) have the CLI installed even if you are deploying via GitHub.
In this instance, you could use heroku run to create your Django superuser via a one-off dyno:
heroku run python manage.py createsuperuser
This is the most straightforward solution.
Here are some other reasons you might want to have the CLI installed:
To manage addons
To scale your app
To upgrade your app's stack
To configure multiple buildpacks, e.g. to use Node.js and Python in the same app
To view your app's logs
To enable HTTPS on your custom domain using Automated Certificate Management
To use the Docker registry and deployment
...
If you want to go down this path, install the CLI and then log in by running heroku login.
If there isn't a way without using the Heroku CLI, could someone tell me how I would init the Heroku git repository, considering the fact that I already have a GitHub repository that I have been using?
After installing the CLI, cd into your project directory and tell Heroku which app that code belongs to by running heroku git:remote -a <your-app-name>.
This adds a Heroku remote but will not change any existing remotes. For example, if you were previously using git push origin to push to GitHub you'll still be able to do that.
After doing that you should be able to use heroku run as I showed above.

Running Django's createsuperuser in Google Cloud Run

I'm trying to run a Django app on Google Cloud Run. The site itself works nicely and can run migrations and collect static assets via a startup script. The one thing I cannot figure out is how to create a superuser, since that requires interactively typing in a password, or at least setting it via a Django shell. I currently cannot figure out how to do this, and it seems like it might not be possible, which would make Cloud Run unusable for Django. Has anyone been able to achieve this, or found a sustainable workaround? Thanks!
Instead of the Django shell, use the ORM API from a script to create the superuser. Once you have the script, make it part of the container build or startup process.
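A minimal sketch of such a script; the settings module name and the credential environment variables are placeholders, not something Cloud Run provides for you:

# create_superuser.py (hypothetical) -- run with: python create_superuser.py
import os

import django

# Point at your settings module before touching any Django machinery.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
django.setup()

from django.contrib.auth import get_user_model

User = get_user_model()
username = os.environ.get("DJANGO_SUPERUSER_USERNAME", "admin")
if not User.objects.filter(username=username).exists():
    User.objects.create_superuser(
        username=username,
        email=os.environ.get("DJANGO_SUPERUSER_EMAIL", ""),
        password=os.environ["DJANGO_SUPERUSER_PASSWORD"],
    )

On Django 3.0+ you may not even need a custom script: python manage.py createsuperuser --noinput reads DJANGO_SUPERUSER_USERNAME, DJANGO_SUPERUSER_EMAIL and DJANGO_SUPERUSER_PASSWORD from the environment. Either way, run it from the startup script rather than the image build if the database is only reachable at runtime.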

How to decouple local directory and Heroku project

I am developing an app with Django and I have successfully pushed it to Heroku.
This app has a Postgres database and a form that allows users to fill the database.
I have coupled the local directory and Heroku, so that whether I run the server from the command prompt or access the deployed app, submitting the form changes the same database.
Now I want to make some experiments on the local database without changing the one on Heroku.
Is it possible?
Can I do it by just commenting out the database URL in settings.py?
I have searched for this on Google, but I don't know the name for this problem, so I could not find a proper answer.
This is not a question of "decoupling" directories. It is because you are trying to use sqlite on Heroku, and you have added the sqlite file to your git repo.
You cannot use sqlite on Heroku; use the postgres add-on. Additionally, your sqlite file must not be in source control.
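If you switch to that setup, a minimal settings.py sketch (following the dj_database_url pattern shown in the earlier question) keeps local experiments separate from the Heroku database:

import dj_database_url

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'db.sqlite3',
    }
}
# The Heroku Postgres add-on sets the DATABASE_URL config var; config()
# returns {} when it is unset, so local development keeps the SQLite
# default and your experiments never touch the Heroku database.
DATABASES['default'].update(dj_database_url.config(conn_max_age=600))

Also add db.sqlite3 to .gitignore so the file stays out of source control, as the answer above requires.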

How to download updated sqlite Database from AWS EB? (Django app)

I have deployed my Django app on AWS Elastic Beanstalk, using the SQLite database. The site has been running for a while.
Now I have made some changes to the website and want to deploy again, while keeping the changes the database has gone through (from form inputs).
But I am not able to find a way to download the updated database or any files. The only version of the app I am able to download is the one I uploaded when I first deployed.
How can I download the current running app version and deploy with the same database?
You may use scp to connect to the remote server and download all your files.
As commented, it is better to use dumpdata and then transfer that file (instead of the raw database or the full site).
However, if your remote system's Python version is not the same as your Django site's (as was my case), then dumpdata may not work, so I managed to download the whole site folder back to local and then ran dumpdata there.
AWS Elastic Beanstalk keeps a Python app's files in /opt/python/current/app, so just use scp -r to download the whole site to local.
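A sketch of that workflow; the key file, instance address, and directory names are placeholders:

# Copy the deployed site (including its sqlite file) down from the instance.
scp -r -i your-key.pem ec2-user@<instance-ip>:/opt/python/current/app ./app-backup
# Serialize the data, then load it into a fresh local database.
cd app-backup
python manage.py dumpdata --natural-foreign --natural-primary -o data.json
python manage.py migrate
python manage.py loaddata data.json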
I believe your sqlite db file is being packaged from your local machine during eb deploy. If that is the case, then all you need to do is not include the db.sqlite file in your deployments.
Move the db.sqlite file to S3 or an EBS persistent volume so that it persists.
Use .ebextensions to run the db migration in AWS on top of that db file, just the way Django recommends (see the sketch below).
To be on the safe side, you can ssh into your EB environment (which is an EC2 instance) and download the db.sqlite file, just in case.
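For the .ebextensions migration step, a hedged sketch of a config file (the file name is arbitrary; the virtualenv path matches the older Amazon Linux platform's /opt/python layout mentioned above):

# .ebextensions/01_django.config (hypothetical name)
container_commands:
  01_migrate:
    command: "source /opt/python/run/venv/bin/activate && python manage.py migrate --noinput"
    leader_only: true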