Synchronize the local database with a project deployed on Heroku - Django

When I log in to the project deployed on Heroku (Learning Log) with a registered account, it starts fresh, without any of the data I entered in the local project.
I'm aware that Heroku uses Postgres while my local database is Python's bundled SQLite.
Additionally, I ignored *.sqlite3 in the git repository, following the tutorial. (Undoing the ignore doesn't work.)
I would prefer to work locally, push local data to the remote, and fetch remote data back to local.
How can I synchronize remote with local?

You can point your local runserver at your server's database.
Just change your database settings to use your server's IP. (First you have to open port 5432 on the server for your local IP.)
Like below:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'your_server_db_name',
        'USER': 'your_server_user',
        'PASSWORD': 'your_server_db_pw',
        'HOST': 'your_server_ip',
        'PORT': '5432',
    }
}
Then the database is synchronized. However, I don't recommend this approach. Whenever you migrate or do anything else that touches the database, things are likely to break, because you can't keep the code on both sides in sync.
So if you want a more up-to-date copy of the real database, just use pg_dump to dump the SQL and load it into your local database.
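In practice, with Heroku the easiest way to do that dump is via the Heroku CLI's backup commands. A sketch of the workflow, assuming the Heroku CLI and a local Postgres are installed (`your-app` and `your_local_db` are placeholders):

```shell
# Capture and download a backup of the Heroku Postgres database
heroku pg:backups:capture --app your-app
heroku pg:backups:download --app your-app   # writes latest.dump

# Load the dump into a local Postgres database
pg_restore --clean --no-owner --no-acl -d your_local_db latest.dump
```

Going the other way (local to remote) is `heroku pg:push`, but note that it expects the remote database to be empty.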

Related

Heroku Postgres - How come my production database is working, even though not listed in settings.py - Django?

I configured my database on Heroku several months ago, so I don't remember the exact steps I took. I'm using the Heroku Postgres add-on: https://devcenter.heroku.com/articles/heroku-postgresql
I have a DATABASE_PASS listed as a config var in Heroku, and I also have a config var for DATABASE_URL.
In my settings.py file, the following is all I have relating to my database. Why is my app still working in production on Heroku if the DATABASES setting refers only to localhost?
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'my_dev',
        'USER': 'postgres',
        'PASSWORD': os.environ.get('DATABASE_PASS'),
        'HOST': 'localhost',
        'PORT': '5410',
    }
}
The Heroku-Postgres documentation states the following:
The value of your app’s DATABASE_URL config var might change at any time. You should not rely on this value either inside or outside your Heroku app.
Am I doing something wrong? Should I not rely on DATABASE_URL as a config var?
Additional Detail - I am using the django-heroku pip package.
Why is my app still working in production on Heroku if DATABASES variable is referring to localhost only?
Additional Detail - I am using the django-heroku pip package.
django-heroku does quite a lot, including setting up your database from the DATABASE_URL environment variable automatically:
This will automatically configure DATABASE_URL, ALLOWED_HOSTS, WhiteNoise (for static assets), Logging, and Heroku CI for your application.
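To make the mechanism concrete, here is a rough illustration of what that configuration step amounts to: turning a DATABASE_URL such as Heroku provides into a Django DATABASES entry. This is a simplified stand-in written for this answer, not the actual dj-database-url/django-heroku implementation (which handles many more engines and options):

```python
from urllib.parse import urlparse

def parse_database_url(url):
    """Turn postgres://user:password@host:port/name into Django DB settings.

    Simplified sketch; the real dj-database-url parser supports more
    schemes, query-string options, and edge cases.
    """
    parts = urlparse(url)
    return {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': parts.path.lstrip('/'),
        'USER': parts.username or '',
        'PASSWORD': parts.password or '',
        'HOST': parts.hostname or '',
        'PORT': str(parts.port or ''),
    }

# e.g. in settings.py (hypothetical usage):
# DATABASES = {'default': parse_database_url(os.environ['DATABASE_URL'])}
```

This is why the localhost settings above never matter in production: django-heroku overwrites `DATABASES['default']` with the parsed DATABASE_URL after they are defined.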

Data always resets to original state on heroku free account after some time

I have deployed my Django app to a Heroku free account (for the first time).
In my working environment I use SQLite; on Heroku I assume Postgres should be used.
However, I am confused about what's going on.
First of all, the app is up and running, and the data that was in my SQLite database currently shows in my deployed app.
Is that because my SQLite DB was copied to Heroku, and that is what I actually see?
Another sign that something is wrong with the DB is that if I create a superuser via the terminal, I can't log in to my app with it.
Another issue is that the data constantly resets to its original state after some time.
I followed the instructions, and this is what I have in my settings for the DB:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}

import dj_database_url

db_from_env = dj_database_url.config()
DATABASES['default'].update(db_from_env)
Because that is how Heroku works. The file system is ephemeral and you must not store anything on it; but SQLite does store its data there, so it will be reset every time a new dyno is started.
Your reference to Postgres is confusing: you are not using Postgres, but you must.
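The usual fix is to stop relying on the SQLite default on Heroku and let dj_database_url supply the Postgres settings. A minimal sketch of the relevant part of settings.py, assuming the Heroku Postgres add-on is attached (it sets DATABASE_URL for you) and dj-database-url and psycopg2 are in requirements.txt:

```python
import dj_database_url

# On Heroku, DATABASE_URL points at the attached Postgres add-on,
# so no SQLite fallback is used in production.
DATABASES = {
    'default': dj_database_url.config(conn_max_age=600),
}
```

After that, run `heroku run python manage.py migrate` (and `createsuperuser`) so that the tables and the superuser exist in Postgres rather than in the throwaway SQLite file.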

Django migrations: sqlite3 development db, Amazon Elastic Beanstalk and Amazon RDS postgresql live database

I'm wondering how the community would handle this particular scenario.
I have a Django app that I develop locally using an SQLite3 database as my development database.
The live application is hosted on Amazon Elastic Beanstalk and uses an Amazon RDS PostgreSQL database for production.
To deploy the app, I simply push the Django app to Elastic Beanstalk with eb deploy (which pushes the latest committed version from the local git repository).
settings.py configures the database and checks if the environment is live like so:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}

if 'RDS_DB_NAME' in os.environ:
    from settings_live import *
and settings_live.py changes the database configuration to the production settings like so:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': os.environ['RDS_DB_NAME'],
        'USER': os.environ['RDS_USERNAME'],
        'PASSWORD': CREDENTIALS['RDS_PASSWORD'],
        'HOST': os.environ['RDS_HOSTNAME'],
        'PORT': os.environ['RDS_PORT'],
    }
}
This all works fine, but issues come up when it comes to migrations. For example: in my development environment I create a new model in an app's models.py. After making the change, I run manage.py makemigrations myapp and manage.py migrate. The migrations are properly applied to my sqlite3 development database. No problems.
Then I commit my changes in preparation for live deployment. My .gitignore file is configured to ignore db.sqlite3 as well as */migrations (since these migrations are only applicable to the development database).
Then I push my latest commit (which doesn't contain my dev database or the associated migrations) to Elastic Beanstalk with eb deploy. I have configured an .ebextensions file (.ebextensions/02_commands.config) to run migrations on the production database like so:
03_makemigrations:
  command: "django-admin.py makemigrations myapp1 myapp2"
  leader_only: true
04_migrate:
  command: "django-admin.py migrate"
  leader_only: true
Here's the problem: any previous migrations that were generated in the Elastic Beanstalk environment with makemigrations no longer exist in app/migrations since the eb deploy deployment process overwrites the old app with the new one (which only contains a blank migrations directory). This leads to some unexpected behaviour such as tables not being created in the production database.
One solution I've considered (but haven't even begun to implement) is to create a script that copies migration files from an S3 bucket to */migrations and configure 02_commands.config to run this prior to running makemigrations and migrate. Then run another script afterwards that copies the new migrations files back to the S3 bucket. I just wonder if my whole workflow is wrong if it has come to this though.
Your mistake is in saying that the migrations are only applicable to the development database. That's just false. The whole point of migrations is that they are exactly intended to keep your development and production databases in sync. They are part of your code; they should be committed along with all the rest of the code, deployed to production, and run there.
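With the migrations committed, production should only ever apply them, never generate them. A sketch of the corrected .ebextensions/02_commands.config, assuming the same layout as above:

```yaml
container_commands:
  01_migrate:
    command: "django-admin.py migrate --noinput"
    leader_only: true
```

Dropping the makemigrations step on the server is the key change; the committed migration files arrive with each deploy, so no S3 copying is needed.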

Django Web App on Azure, does not pull remote sqlite3 database

I have a Django web app up and running on Azure. Now, when there are changes in the remote database, I cannot pull them down locally (no changes are detected, even though the sqlite3 file is included in the project and the commits). And if I try to sync or push, the remote database gets reset/overwritten by my local database. I can't simply use the .gitignore trick, because I need to push the database after I create new models (I'm assuming the remote database does not have my local migrations). Does anyone know how to fix this issue? Here are my database settings in the settings.py file:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': path.join(PROJECT_ROOT, 'db.sqlite3'),
        'USER': '',
        'PASSWORD': '',
        'HOST': '',
        'PORT': '',
    }
}
My git ignore file:
# Python cache
__pycache__/
*.pyc
# PTVS analysis
.ptvs/
# Build results
/bin/
/obj/
# User-specific files
*.suo
*.user
*.sln.docstates
# Auto-generated virtual environment
/env/
# Auto-generated web.config
/web.config
# Auto-collected static files folder
/static/
# Windows image file caches
Thumbs.db
ehthumbs.db
# Folder config file
Desktop.ini
# Recycle Bin used on file shares
$RECYCLE.BIN/
# Mac desktop service store files
.DS_Store
This is surely a very bad idea. Your remote database will be updated all the time with live data created by the users of your website. Why would you want to overwrite that with stale data from dev?
And this:
I'm assuming the remote database does not have my local migrations
is not only false, it quite literally misses the whole point of migrations, which is that you run the same code in dev and prod to keep the tables in sync.
Finally, SQLite isn't really suitable for production use in anything but a toy site. Use a proper database like Postgres.
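If the goal is just to carry the development data over to a real production database once, Django's fixture commands are the usual route rather than committing the SQLite file. A sketch (the --exclude flags avoid common primary-key clashes; the exact invocation may need adjusting for your apps):

```shell
# Locally: export the data as JSON
python manage.py dumpdata --natural-foreign \
    --exclude contenttypes --exclude auth.permission > data.json

# On the server, after `migrate` has created the schema:
python manage.py loaddata data.json
```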

Django "unable to open database file" using MySQL

I'm a newbie in Django, as well as in Python.
I just successfully configured my first Django site on an Apache server, then configured it to work with a MySQL database by editing the settings.py file and running the following command:
python manage.py syncdb
I started playing a bit with the admin, but occasionally, when making GET or POST requests, I get the following message: OperationalError at "/some/route": unable to open database file.
If I refresh, the page loads fine, but if I keep refreshing any page in the admin the error shows up again, so it seems random.
From what I've searched, this issue is usually related to SQLite, but I'm using MySQL.
My database config in settings.py is the following:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'simulation',
        'USER': 'root',
        'PASSWORD': 'root',
        'HOST': '',
    }
}
I tried specifying the hosts "localhost" and "127.0.0.1", but the result was the same.
It's really important for me to find a solution that uses MySQL as the database engine.
Any help would be really appreciated! Thanks.
I've encountered a problem similar to yours; in my instance it seemed like my settings.py file was being cached on the server. Once I deleted the .pyc files, it started working fine.
Also, what happens if you switch your database to SQLite? It might be worth checking whether it's your MySQL database that's playing up rather than Django.
I ran into the same problem with a Django/MySQL/WSGI/Apache setup. Similar to Serakiel, I think the problem stemmed from server-side caching. Restarting Apache fixed it for me.
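For reference, clearing the stale bytecode is a one-liner; the Apache restart command varies by distribution (a Debian/Ubuntu example is shown):

```shell
# Remove compiled bytecode that can shadow an edited settings.py
find . -name "*.pyc" -delete

# Then restart Apache so mod_wsgi reloads the application, e.g.:
#   sudo service apache2 restart
```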