How to change the data storage location in MySQL server - Redmine

I have installed Redmine on my Ubuntu 12.04 system with a MySQL server database. Currently the data is stored in the default database named mysql. I want to create a new database on the same MySQL server and move the data storage to that new database. How can I achieve that?

In MySQL, you could do
RENAME DATABASE mysql TO newDbName;
but be aware that RENAME DATABASE was removed in MySQL 5.1.23 because it could lose data, and the mysql database itself is a system database that should not be renamed. In your code, you can then just change the query in the FROM clause to your new database and table.
Or, if you were trying to start with a blank slate, then, after creating the new database, run the following:
mysqldump -u user -ppass -d olddb | mysql -u user -ppass -D newdb
which will copy your old schema (the -d flag skips the row data) to the new db.
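If you want to move the existing rows as well, not just the schema, the same pipe without the -d flag should work; this is a sketch assuming the new database (newdb here is a placeholder) has already been created with CREATE DATABASE:

# copy both schema and row data from olddb into newdb
mysqldump -u user -ppass olddb | mysql -u user -ppass -D newdb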

Related

Django app running on AWS elastic beanstalk cannot write to database

I have a Django app running on AWS Elastic Beanstalk. It was previously connected to a MySQL RDS instance. I have been through several steps to migrate the data from the MySQL instance to a PostgreSQL instance.
In brief, I connected to the Postgres instance and ran python manage.py migrate
I then ran python manage.py sqlflush and ran the resulting SQL commands
Then I imported a JSON fixture of content types from the MySQL db. I created the fixture (while connected to the MySQL db) with
python manage.py dumpdata contenttypes --indent=4 --natural-foreign > contenttype.json
and imported it into the PostgreSQL db with
python manage.py loaddata contenttype.json
The above preserves PKs and FKs
I then used AWS DMS to load the data from the MySQL instance to the PostgreSQL instance.
Using the Django app (e.g. through Django admin) all the data can be read from the Postgres instance. However, trying to post data to the database results in a 500 error.
Could this be a problem with security settings in AWS? It seems odd that I can read data from the DB but not post to it.
I've tried connecting to the DB using pgAdmin 4 with the same connection settings. I was able to use an INSERT statement to post to the database from there (but I had to state the PK value explicitly).
Update
As suggested by Mark B, the problem was primary keys. Many of the sequences in Postgres had been reset to 1 as the current value. Finding the max value of the ID (PK) in each table and updating the corresponding sequence got everything working. This question on sequences and primary keys was useful.
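For anyone hitting the same symptom, a minimal sketch of the fix, run once per affected table (the table name my_table and its id column are placeholders; the connection flags match the usual psql options):

# point the sequence behind my_table.id at the current max id,
# so the next INSERT gets a fresh primary key
psql -U <user> -d <database_name> -c \
  "SELECT setval(pg_get_serial_sequence('my_table', 'id'), (SELECT MAX(id) FROM my_table));"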

How to load an SQL dump in a containerized Django Docker app which depends on Postgres

I am running a Django container on Docker which depends on Postgres. Now the issue is, when I try to load the PostgreSQL dump:
docker-compose up
cat /tmp/dump.sql | sudo docker exec -i <container_id> psql -U <user> -d <database_name>
This imports the first few tables and then gives me a lot of errors on the terminal. An example:
constraint "table_id" for relation "table_name" already exists
Another example:
null value in column "column_name" violates not-null constraint
Another one:
insert or update on table "table_1" violates foreign key constraint "table_id"
I wanted to know: is this even the right way to import this data? Plus, I can't seem to drop the database as it is already being used by Django.
Note:
I tried with volumes, where I imported the db with PostgreSQL, but every time I run Django the database gets reset. I am guessing it has something to do with the migrations, but I can't touch that part.
The issue was being caused by the SQL dump trying to overwrite the existing db. A quick solution, if you are on dev, would be to
docker volume rm <volume_name>
and then run the cat ... command again.
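Putting it together, a rough sequence for a dev setup (the volume and container names are placeholders; find yours with docker volume ls and docker ps):

docker-compose down              # stop the stack so the volume is no longer in use
docker volume rm <volume_name>   # discard the old database files
docker-compose up -d             # bring the stack back up with a fresh database
cat /tmp/dump.sql | sudo docker exec -i <container_id> psql -U <user> -d <database_name>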

How to reflect changes made on a Django Postgres database from one PC to another

Three of us are working with Django, and Postgres is the database. Whenever we push the code to GitHub, the database data is not reflected; the data I stored is visible only to me. The Postgres user, password and database name are the same on all our laptops. How can I make it so that when I push, the data also goes to their databases?
If you are talking about changes in the db schema, then take a look at Django migrations - https://docs.djangoproject.com/en/3.1/topics/migrations/. The workflow is as follows:
change a model (e.g. add a new field, change an existing field, ...)
generate a migration file by running python manage.py makemigrations. This generates a migration file in <app_folder>/migrations
run python manage.py migrate to apply the model changes to your database
add the migration file to your version control (GitHub or whatever) and push it
Now when your colleagues get the updated code from version control, they need to run python manage.py migrate to apply the model changes to their local database.
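As a concrete sketch of that round trip (the app and migration file names are made up for illustration):

# developer A, after editing a model
python manage.py makemigrations
python manage.py migrate
git add myapp/migrations/0002_add_new_field.py   # hypothetical generated file
git commit -m "Add new field to MyModel"
git push

# developer B
git pull
python manage.py migrate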
I found that it is not possible to share Postgres data directly this way, but SQLite can be used for it. This link helps with that - https://manuelvanrijn.nl/blog/2012/01/18/convert-postgresql-to-sqlite/. But I send data by taking a backup, like this:
From Postgres to Postgres, taking the backup:
pg_dump dbname > dbname.bak
and then restoring on the new Postgres with:
psql test < dbname.bak
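One caveat: the target database must exist before the restore, so on the new machine you may first need something like this (the name test matches the restore command above):

createdb test          # create an empty database to restore into
psql test < dbname.bak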

Migrate Django development database (.sql3) to Heroku

How does one migrate their Django .sql3 development database to Heroku?
Per here and here I tried heroku pg:psql --app sblic < database.sql3, but my Django admin shows no new uploads (even after syncdb, migrate, or collectstatic).
Perhaps there may be a way to directly upload an sql3 file to Heroku, but I went with the path of clearest certainty (convert the local sql3 db to a Postgres db and upload a dump of the Postgres db to Heroku via the pgbackups tool):
Create a dump of your sql3 database as a JSON file
With PostgreSQL installed and its bin directory in the Path environment variable, create a Postgres user and database (or just plan on using the initial superuser account created when installing PostgreSQL)
Update your settings.py with a reference to your newly created Postgres database (note: 'HOST' may need to be set to 'localhost', and 'USER' is your Postgres user login)
Run python manage.py syncdb to initialize your new Postgres db
Optional: if necessary, truncate the contenttypes in your Postgres db
Load your dump from step 1 (if you have fixture issues, see here)
Now you have a working local Postgres db. To migrate, first create a Postgres dump
Post your Postgres dump somewhere accessible by URL (such as Dropbox or Amazon S3, as suggested in the previous link).
Run the pgbackups restore command, referencing your dump URL
Your heroku app should now reference the contents of your local db.
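For reference, a rough sketch of the commands behind those steps (the file names, database names and dump URL are placeholders; the pg_dump flags are the ones Heroku's import docs suggest):

# step 1: dump the sqlite3 data while settings.py still points at it
python manage.py dumpdata --indent=4 --natural-foreign > dump.json

# after switching settings.py to Postgres and running syncdb
python manage.py loaddata dump.json

# create a compressed Postgres dump to post somewhere URL-accessible
pg_dump -Fc --no-acl --no-owner -U myuser mydb > mydb.dump

# restore it into Heroku with the pgbackups tool
heroku pgbackups:restore DATABASE 'https://example.com/mydb.dump' --app sblic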
The Heroku command line tool uses the psql binary. You have to install PostgreSQL on your local development machine to have psql available. From the documentation (https://devcenter.heroku.com/articles/heroku-postgresql#pg-psql):
You must have PostgreSQL installed on your system to use heroku pg:psql.
You can in fact keep using your SQLite database with Heroku, but this is not recommended, as it will be overwritten by your local copy if you re-deploy to another dyno. Migrating the data to Postgres is recommended, as described at https://devcenter.heroku.com/articles/sqlite3

How to copy a database from one Django project to another

I have a Django project with a very large database in production. Ideally I want to copy a few tables from that environment to my dev environment. How can I do that without jeopardizing the integrity of the production database?
It depends on the database that you are using.
For MySQL you have mysqldump
For Postgres you have pg_dump
...
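Since the goal is only a few tables, both tools can restrict the dump; a sketch with placeholder db and table names:

# MySQL: dump just two tables from the production db
mysqldump -u user -p proddb table1 table2 > tables.sql

# Postgres: -t limits the dump to the named tables
pg_dump -t table1 -t table2 proddb > tables.sql

Both commands only read from production, so they do not modify it, though a dump can still add load; running it at a quiet time is safer.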
Take a look at Django fixtures:
https://docs.djangoproject.com/en/dev/howto/initial-data/
manage.py has two commands, "dumpdata" and "loaddata", which will dump your data to JSON so it can be imported to your dev server model by model.
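For example, to copy just a couple of tables (the app and model names myapp.Customer and myapp.Order are hypothetical):

# in the production environment
python manage.py dumpdata myapp.Customer myapp.Order --indent=4 > subset.json

# in the dev environment
python manage.py loaddata subset.json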