I'm wrestling with an InnoDB MySQL database and not having much luck (normally I'm a MyISAM guy). I'm trying to take the content from a production database and copy it over to the development database (same schema, just hooked up to a different website). Normally with MyISAM I would do this:
1) Dump the prod database: mysqldump -u root -p prod_db_name > prod_dump.sql
2) Log into mysql terminal;
3) drop database dev_db_name;
4) create database dev_db_name;
5) exit mysql terminal
6) mysql -u root -p dev_db_name < prod_dump.sql
et voila.
However, when I do this in this particular situation, all seems well until I try to update a piece of content using the website hooked up to the newly populated dev database, at which point I get this error:
Error Number: 1452
Cannot add or update a child row: a foreign key constraint fails (`abbott/logs`, CONSTRAINT `fk_logs_user` FOREIGN KEY (`user_id`) REFERENCES `users` (`id`))
Obviously this is related to the users table. Although I'm not sure if it makes a difference, one factor may be that I am not in control of the user IDs that populate the users table; they are fed in from an external Single Sign-On / Federation API.
I've tried a few blind guesses: adding --single-transaction to the mysqldump command, and setting FOREIGN_KEY_CHECKS = 0, manually dropping each table, turning the checks back on, and then reading in the dump (rather than dropping the whole database).
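For reference, the disable-checks attempt looked roughly like this (sketching from memory, using the same names as above):

mysql -u root -p -e "DROP DATABASE dev_db_name; CREATE DATABASE dev_db_name;"
# replay the dump in a session that starts with foreign key checks switched off
mysql -u root -p --init-command="SET FOREIGN_KEY_CHECKS=0" dev_db_name < prod_dump.sql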
Any help would be much appreciated.
Reece.
I have a Postgres database on Heroku and I want to have that database hosted on ElephantSQL, but I cannot for the life of me find out how to do it. I have a dump file downloaded from the Heroku Postgres database, but I cannot figure out how to put the data into ElephantSQL.
My database was linked to a Django app, and I already have my ElephantSQL instance linked to a copy of the same Django app, hosted on Render. I can see on ElephantSQL that the instance has all the tables, so I just need to get the data in there somehow.
Any tips or hints are welcome!
The tables that you already see may be the problem. It may also be that the parents of those tables are the problem. An SQL dump is just a series of SQL commands that run in order to write schema and data back into a new database.
The first SQL commands set up the databases, schemas, and tables. If your tables are already there, your import may be failing because it wants to create tables that already exist. Sorry, without more information on the specific errors you are seeing, it's hard to be more specific. Hopefully one of these four options helps you. Let me know in the comments how it goes.
For all of the options below, I would suggest backing up your destination server database to be sure that you don't mess up the Django app that is currently working there, even without your data.
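Backing up the destination first can be as simple as this (connection details are placeholders):

pg_dump -h elephant_host -U elephant_user elephant_database > destination_backup.sql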
Option 1: Look for a way to export your source database without database, schema, and table definitions (there's a sketch of this after Option 4).
Option 2: If you're careful, you can edit your dump file to remove those initial setup commands, and have the dump file start with the commands that only push the data to your tables.
Option 3: (IMPORTANT to backup the destination server first for this one.) Drop the database, schema, and/or tables on the destination server so that nothing is pre-defined, and see if your dump file can reconstruct everything again the way Django needs it on the destination server with your data.
Option 4: It may be a lot of tables, but you can usually export individual tables' data with an option to not include the table definitions. Then you have to import all the tables as separate imports. A mix of Options 3 and 4 may also work: if you can't find an option to exclude the definitions, drop the tables at the destination and import each table independently.
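If your source is PostgreSQL (as it is, coming from Heroku), a minimal sketch of Options 1 and 4 might look like the following, where the hosts, users, and table name are placeholders:

# Option 1: dump only the data, with no CREATE TABLE or other schema statements
pg_dump --data-only -h source_host -U source_user source_db > data_only.sql
# Option 4: dump a single table's data, one file per table
pg_dump --data-only -t public.some_table -h source_host -U source_user source_db > some_table.sql
# load a file into the destination
psql -h elephant_host -U elephant_user -d elephant_database -f data_only.sql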
Here is what worked for me using a Mac terminal:
Create a backup file from Heroku:
pg_dump -h heroku_host -d heroku_database -U heroku_user -p heroku_port -W -Ft > latest.dump
You'll be prompted to input the Heroku PostgreSQL password. Note that the heroku_xxx values are found in the Heroku PostgreSQL credentials.
Restore the backup from Heroku to ElephantSQL:
pg_restore -h elephant_host -d elephant_database -U elephant_user -p 5432 -W -Ft latest.dump
You'll be prompted to input the ElephantSQL password. Note that the elephant_xxx values are found in the ElephantSQL credentials, and that ElephantSQL uses the term "server" instead of host.
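To sanity-check the restore afterwards, you can connect to ElephantSQL and count the rows in one of your tables (some_table is a placeholder):

psql -h elephant_host -d elephant_database -U elephant_user -p 5432 -c "SELECT count(*) FROM some_table;"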
I am trying to transfer my testing server's database to my working PC. So I do this:
(Postgres 12.5)
pg_dump -U nutrition -d mednutrition_db --clean -f mednut_bak.sql
and then I try to restore it like this on my Postgres 9.5
psql -U chris mednutrition_db < mednut_bak.sql
This simple method was working fine for me, but now there seems to be a problem that I cannot solve.
I noticed that Postgres creates the (Django) ID fields with a default value like this (for example):
nextval('django_migrations_id_seq'::regclass)
However, when I restore the database, the default value for the ID fields is empty, and of course I then cannot insert any new data into the restored database.
How can I resolve this?
UPDATE: I just noticed that in my backup file (mednut_bak.sql) there are the proper instructions to set up the default ID sequence:
ALTER SEQUENCE public.django_migrations_id_seq OWNED BY public.django_migrations.id;
and
ALTER TABLE ONLY public.django_migrations ALTER COLUMN id SET DEFAULT nextval('public.django_migrations_id_seq'::regclass);
So I guess the problem happens while restoring, because for some reason Postgres ignores these lines.
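UPDATE 2: Two things I can still try. First, re-running the restore with ON_ERROR_STOP, so psql aborts at the first failing statement instead of silently continuing past it (only a guess, but a dump taken with pg_dump 12.5 can contain statements that a 9.5 server rejects):

psql -U chris -d mednutrition_db -v ON_ERROR_STOP=1 -f mednut_bak.sql

Second, applying the statements from the backup file by hand and bumping the sequence past the rows that were restored (sketched from the lines above; CREATE SEQUENCE IF NOT EXISTS needs Postgres 9.5 or later):

CREATE SEQUENCE IF NOT EXISTS public.django_migrations_id_seq OWNED BY public.django_migrations.id;
ALTER TABLE ONLY public.django_migrations ALTER COLUMN id SET DEFAULT nextval('public.django_migrations_id_seq'::regclass);
SELECT setval('public.django_migrations_id_seq', (SELECT coalesce(max(id), 1) FROM public.django_migrations));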
I am recreating a web app in Django that was running on a server that has since been terminated; fortunately, I made a backup of all the code. My problem is the database: I do not know how to transfer all the data from the old db.sqlite3 Django database into the new web app's database.
I found a similar question to mine, Django: transfer data from one database to another, but that user wanted to transfer data from specific columns because the models.py from their old and new databases were slightly different. In my case, the models.py from the old and new databases are the same.
Alternatives
I am using DB Browser for SQLite to explore the content of the old database, and I could manually add each row through the Django administration, but this would take too much time.
I could copy the old db.sqlite3 and drop it into the new web app, since the models.py file remains the same, but IMO this solution is not appropriate; it is crude and I think it goes against good software practice.
How should I proceed for transferring data from the old database to the new one?
This seems like a one-time copy of one db to another. I don't see how this goes against good software practice unless you have to copy this db frequently. I've done it before when migrating servers, and it doesn't cause any issues, assuming the two instances of the application are the same build.
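Another route, if copying the file feels wrong: Django's built-in fixtures can move the data too. A minimal sketch, assuming the old project can still run with the old db.sqlite3 configured as its database:

# run inside the old project to serialize all app data to JSON
python manage.py dumpdata --natural-foreign --natural-primary > data.json
# run inside the new project, pointed at the new database
python manage.py loaddata data.json

The natural-key flags make the fixture less dependent on the primary keys matching between the two databases.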
I was able to solve my problem with some minor tricks, because there is no straightforward functionality in Django for transferring all your data between two SQLite databases. Here is the trick I did:
1. Download DB Browser for SQLite to explore and export the contents of your old database to a .csv file. Open your database with DB Browser, hit the Tables option to see all your tables, then right-click any of them and hit the export-as-CSV-file option to generate the file (name_of_your_csv_file.csv). The other alternative is to open your database with sqlite3.exe in cmd or PowerShell and do the export with:
.tables #this lets you explore your available tables
.headers on #include the column names in the export
.mode csv #switch the output format to CSV
.output name_of_your_csv_file.csv #write subsequent query results to this file
select * from web_app_navigator_table; #dump the table's rows into the file
2. There are two choices at this point: you can either insert all the records at once into your new database, or you can drop the existing tables from the new database and then recreate them and import the .csv file. I went for the drop option because there were more than 100 records to migrate.
# sqlite3
# check the structure of your table so you can re-create it
.schema web_app_navigator_table
# the result might be something like:
# CREATE TABLE IF NOT EXISTS "web_app_navigator_table" ("id" integer NOT NULL PRIMARY KEY AUTOINCREMENT, "ticket" varchar(120) NOT NULL);
# drop the table
drop table web_app_navigator_table;
# re-create the table
create table web_app_navigator_table (id integer not null primary key autoincrement, ticket varchar(120) not null);
# do the import of the csv file
.import C:/Users/a/PycharmProjects/apps/navigator/name_of_your_csv_file.csv web_app_navigator_table
You might see an error such as csv:1: INSERT failed: datatype mismatch, but this just indicates that the first row of your csv file was not inserted, because it contains the headers of the data exported from your old database.
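Alternatively, newer versions of the sqlite3 shell (3.32 and later, if available on your system) can skip the header row during the import, so the mismatch error never appears:

.import --csv --skip 1 C:/Users/a/PycharmProjects/apps/navigator/name_of_your_csv_file.csv web_app_navigator_table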
I am using Django (version 1.8.2) along with PostgreSQL (v9.4.4) on Python (v2.7.9, Anaconda distribution) to create an application.
I am creating tables in the database by defining them in Django's models.py. Once I have created these tables, I am able to view and add to their content in the browser (http://127.0.0.1:8000/admin/).
However, when I try to access these tables using psql, I get the error relation "xyz_abc" does not exist, where xyz_abc is a table present in the database, as seen by using \dt.
I am not able to understand why this should be the case, and any suggestions on how these tables can be accessed using psql would be really helpful.
Edit
The steps to access the database are:
Open the terminal on the Mac
Type psql in the terminal, which opens psql
\c testdb - this connects to the database which contains the table xyz_abc
Once connected, I try to select everything from the table using select * from xyz_abc;
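Put together, the session looks like this (\conninfo is an extra check, not in my original steps, that shows which database and user the session is actually connected to):

psql
\conninfo
\c testdb
\dt
select * from xyz_abc;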
So I backed up my geodjango postgis database using pg_dump before performing some calculations which I've managed to mess up. I've created a new database using
createdb -T template0 -O proj proj_backup
psql proj_backup < proj_backup.pg
This seemed to work fine (though there were a few errors during the import), and connecting to the database using psql all my tables are there and appear to have the correct numbers of rows etc.
However, changing my settings.py to connect to my newly imported backup db (proj_backup in my example), gives me the following errors:
DatabaseError: relation "appname_model" does not exist
Any ideas? I'm guessing I did the dump wrong, or that I haven't maintained the ForeignKeys somehow. Thanks very much.
Update
So I figured out my first mistake: I had two similarly named backup databases and connected to the wrong one. Connecting to the correct one seems to have fixed everything. However, it's still quite strange that it didn't recognize the tables in the other backup database, which definitely did exist. Running syncdb on the incorrect database ended up duplicating those tables (if I remember correctly, there were duplicate table names when I listed them all from within psql). Unfortunately, the way I discovered my mistake was by dropping the bad table to recreate it, so in order to reproduce this error I'll probably have to use Time Machine. Still very strange; I'll give that a shot when I can get physical access to my work machine.
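For anyone else who hits this, a quick sanity check before pointing settings.py at a database (standard psql, nothing specific to my setup):

psql -l # list every database on the server
psql proj_backup -c '\dt' # list the tables in the one you intend to use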
So, is your appname_model table there, or is it a view? Is it in public or another named schema?
If the table is there, then chances are you have it in a schema that is not in your database search path. Check the search_path of your old database; it might have included something other than the default, or your default search schema is set in postgresql.conf and is non-standard.
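A few standard checks along those lines, run from psql (myschema is a placeholder for whatever schema turns up):

-- what the current session searches
SHOW search_path;
-- which schema actually holds the table
SELECT schemaname FROM pg_tables WHERE tablename = 'appname_model';
-- point the session at that schema
SET search_path TO myschema, public;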