How to copy a database from one Django project to another

I have a Django project with a very large database in production. Ideally I want to copy a few tables from that environment to my dev environment. How can I do that without jeopardizing the integrity of the production database?

It depends on the database you are using.
For MySQL you have mysqldump.
For Postgres you have pg_dump.
...
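For example, copying only a couple of tables from production into a dev database might look like this (database, user, and table names are placeholders; both dump commands only read from production, so its data is not at risk):
With MySQL:
mysqldump -u prod_user -p prod_db app_table1 app_table2 > tables.sql
mysql -u dev_user -p dev_db < tables.sql
With PostgreSQL:
pg_dump -t app_table1 -t app_table2 prod_db > tables.sql
psql dev_db < tables.sql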

Take a look at Django fixtures:
https://docs.djangoproject.com/en/dev/howto/initial-data/
manage.py has two commands, dumpdata and loaddata, which will dump your data to JSON so it can be imported into your dev server model by model.
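For copying only a few tables, dumpdata can be restricted to individual models. A minimal sketch, assuming an app named myapp with a Product model (dumpdata only reads, so production is untouched):
On the production machine:
python manage.py dumpdata myapp.Product --indent 2 > products.json
On the dev machine:
python manage.py loaddata products.json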

Related

How to reflect changes made on a Postgres database in Django from one PC to another

The three of us are working with Django, and Postgres is the database. Whenever we push the code to GitHub, the database data is not reflected; the data I stored is visible only to me. The Postgres user, password, and database name are the same on all our laptops. How can I make the data reach their databases as well when I push?
If you are talking about changes in the db schema, then take a look at Django migrations - https://docs.djangoproject.com/en/3.1/topics/migrations/. The workflow is as follows:
Change a model (e.g. add a new field, change an existing field...).
Generate a migration file by running python manage.py makemigrations. This creates a migration file in <app_folder>/migrations.
Run python manage.py migrate to apply the model changes to your database.
Add the migration file to your version control (GitHub or whatever) and push it.
Now, when your colleagues get the updated code from version control, they need to run python manage.py migrate to apply the model changes to their local database.
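Putting the cycle together, a rough command sequence (the app name and commit message are placeholders) looks like this:
On the machine where the model changed:
python manage.py makemigrations
python manage.py migrate
git add myapp/migrations/
git commit -m "Add migration for new field"
git push
On every other machine, after pulling:
git pull
python manage.py migrate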
I found that it is not possible to send the Postgres data directly, but SQLite can be used for it. This link helps with that - https://manuelvanrijn.nl/blog/2012/01/18/convert-postgresql-to-sqlite/ . However, I send data by taking a backup, like so:
From Postgres to Postgres, take a backup:
pg_dump dbname > dbname.bak
and then restore it on the new Postgres with:
psql test < dbname.bak

How to connect with postgresql database after deployment to Heroku?

I'm new to Django, but I'm deploying a Django-based website to Heroku, using PostgreSQL. The deployment was successful, and the website is online and has established a connection with the database. However, none of the data from my local database has migrated to the Heroku database, leaving it blank. If I go into the admin section and manually input a data point, it appears on my site, so I know the database is correctly serving data. What is the proper way to migrate data from your local database to your online Heroku version of the database? I thought the following code would migrate the data:
heroku run python manage.py makemigrations
heroku run python manage.py migrate
But apparently I'm missing something.
makemigrations will create a migration that contains your schema, but no data. The migrate command applies the migration to the database.
In order to provide data to be sent over as part of the migrate command you need to either create a data migration or use a fixture.
Another option you have is to dump your local database and do an import into Heroku Postgres
All in all, it depends on how much local data you have that you want copied over. If it's only a few rows, I would use either a data migration or a fixture; if it's hundreds or thousands of rows, an export/import of your dataset is your best bet.
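For the export/import route, one sketch (assuming a local Postgres database named mylocaldb and a Heroku app named myapp; pg:push needs a local Postgres client and an empty remote database) is Heroku's pg:push:
heroku pg:push mylocaldb DATABASE_URL --app myapp
For the few-rows case, a fixture committed to the repo works too (app and file names are placeholders):
mkdir -p myapp/fixtures
python manage.py dumpdata myapp --indent 2 > myapp/fixtures/myapp_data.json
git add myapp/fixtures/myapp_data.json
git commit -m "Add data fixture"
git push heroku main
heroku run python manage.py loaddata myapp_data.json --app myapp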

Migrate Django development database (.sql3) to Heroku

How does one migrate their Django .sql3 development database to Heroku?
Per here and here, I tried heroku pg:psql --app sblic < database.sql3, but my Django admin shows no new uploads (even after syncdb, migrate, or collectstatic).
Perhaps there is a way to directly upload an sql3 file to Heroku, but I went with the path of clearest certainty (convert the local sql3 db to a Postgres db and upload a dump of the Postgres db to Heroku via the pgbackups tool):
Create a dump of your sql3 database as a JSON file.
With PostgreSQL installed and with its bin directory in the Path environment variable, create a Postgres user and database (or just plan on using your initial superuser account created upon installing PostgreSQL).
Update your settings.py with a reference to your newly created Postgres database (note, 'HOST' may need to be set as 'localhost'; 'USER' is your Postgres user login).
Run python manage.py syncdb to initialize your new Postgres db.
Optional: if necessary, truncate the contenttypes table in your Postgres db.
Load your dump from step 1 (if you have fixture issues, see here).
Now you have a working local Postgres db. To migrate, first create a Postgres dump.
Post your Postgres dump somewhere accessible by URL (such as Dropbox or Amazon S3, as suggested in the previous link).
Perform the pgbackups restore command, referencing your dump URL.
Your Heroku app should now reference the contents of your local db.
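Roughly, those steps translate into the following commands (database names, URLs, and the app name are placeholders, and the pgbackups plugin mentioned above has since been folded into heroku pg:backups):
python manage.py dumpdata --exclude contenttypes > dump.json
Point settings.py at the new local Postgres database, then initialize and load it:
python manage.py syncdb
python manage.py loaddata dump.json
Dump the local Postgres database in custom format and host the file somewhere reachable by URL:
pg_dump -Fc --no-acl --no-owner local_db > mydb.dump
heroku pg:backups:restore 'https://example.com/mydb.dump' DATABASE_URL --app myapp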
The Heroku command line tool uses the psql binary. You have to install PostgreSQL on your local development machine to have psql available. From the documentation (https://devcenter.heroku.com/articles/heroku-postgresql#pg-psql):
You must have PostgreSQL installed on your system to use heroku pg:psql.
You can in fact keep using your SQLite database with Heroku, but this is not recommended, as it will be overwritten with your local copy if you re-deploy to another dyno. Migrating the data to Postgres is recommended, as described at https://devcenter.heroku.com/articles/sqlite3

Import MySQL tables into Django

I have recently created a PHP application which connects to my MySQL database and displays the table information. Where Django is concerned, I am only familiar with tables being created in SQLite by defining the relevant models.
Is there a way to do this in reverse, i.e. have Django create models from the tables in MySQL?
You can generate models from your existing schema. The official documentation is at: https://docs.djangoproject.com/en/1.7/howto/legacy-databases/
The key command is:
python manage.py inspectdb
It prints Django model definitions inferred from the existing tables; save them to a file by running:
python manage.py inspectdb > models.py
Django won't know how you want to break these models up into separate application modules, so you'll need to do that on your own, if necessary.
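As a sketch of what this produces, assuming an existing MySQL table named customers and a DATABASES entry pointing at that database (all values below are illustrative), inspectdb would emit something along these lines:

# settings.py: point Django at the existing MySQL database
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'mydb',
        'USER': 'myuser',
        'PASSWORD': 'secret',
        'HOST': 'localhost',
        'PORT': '3306',
    }
}

# models.py: roughly what inspectdb emits for the existing "customers" table
from django.db import models

class Customers(models.Model):
    name = models.CharField(max_length=100)
    email = models.CharField(max_length=254)

    class Meta:
        managed = False        # Django will not create, alter, or drop this table
        db_table = 'customers'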

Why is django manage.py syncdb failing to create new columns on my development server?

I am trying to create a development server from a production server from which I can test out new ideas.
I created a duplicate of my production server's database by dumping it using Postgres' pg_dump and then imported the dump into a new database.
I then copied my production Django directory and altered all .py files so that my import statements refer to server_debug rather than server.
Using the admin interface to alter some data works in that only the development server has its data altered.
However, when I then try adding a new field in my models.py in my development server, manage.py syncdb fails to create it.
Is there something I am neglecting that could cause manage.py to refer to my production server rather than my development server?
syncdb doesn't touch tables that already exist. You need to either reset the app (easiest if you don't care about the data), modify the table manually (more of a quick hack), or use a migration app such as South to version your models.
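For instance, the manual quick hack and the South route might look like this (database, table, column, and app names are placeholders):
psql dev_db -c "ALTER TABLE myapp_mymodel ADD COLUMN new_field varchar(100);"
Or, with South installed and listed in INSTALLED_APPS:
python manage.py schemamigration myapp --auto
python manage.py migrate myapp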