Django: Backup and restore

I have been reading about how to do backup and restore in Django.
The best I could come up with was dumpdata, i.e. python manage.py dumpdata > foo.json
To restore this data, we have to delete or drop the existing tables, then load the JSON file as a fixture, i.e. run syncdb and then loaddata.
Is there a standard way of doing this, i.e. a process that can be used every time we do a backup and restore?
I am looking for a tool like South, but for database backup and restore.
I am planning to get my site online, so any help will be highly appreciated.

The django-dbbackup package can do database backups and restores.
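A rough sketch of how that looks, assuming django-dbbackup is installed, added to INSTALLED_APPS, and pointed at a storage backend in settings:

    # back up the default database to the configured storage
    python manage.py dbbackup
    # restore the most recent backup
    python manage.py dbrestore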

For proper backup and restore, use the tools that came with your database.
If you must do it through Django, write your own custom management commands (though I question the wisdom of that).
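For example, with PostgreSQL (the database name here is just a placeholder), a dump-and-restore cycle with the native tools looks something like:

    # dump in the custom format, which pg_restore understands
    pg_dump -Fc mydb -f mydb.dump
    # restore into the database, dropping existing objects first
    pg_restore --clean -d mydb mydb.dump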

Related

Accessing Database from other location

For my job interview, I got an assignment to create a CRUD app with Django and PostgreSQL. I created the database locally and finished my assignment.
Now I have to upload my code to GitHub. The problem is, they asked for some examples of CRUD, and the developer reviewing my code obviously can't access my local DB.
What can I do to solve this? Do I have to upload my DB to the cloud (and if so, what is the best way to do that)? Or is there another way to resolve this?
Thanks in advance
When they download your code, they will need to create their own local DB and run python manage.py makemigrations and python manage.py migrate; that creates all the DB tables. However, there won't be any initial data.
I recommend downloading your own code and running through every step it takes to get the project up and running. This would include things like creating an admin user, etc. Then create a basic README with all the steps, to make it as easy as possible for them to get it running.
Alternatively, you could Dockerize your application and provide a Dockerfile, but that's a bit overkill for a take-home interview in my opinion. It may impress the interviewer nonetheless. They may not even want to download and run your project, just review your code on GitHub.
To provide initial data, you would want to look into writing fixtures. Then have them run python manage.py loaddata to populate the DB.
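As a sketch (the app and fixture names are placeholders), you could commit a fixture to the repo and document the loaddata step in the README:

    # on your machine: dump your app's data into a fixture file
    python manage.py dumpdata myapp --indent 2 > myapp/fixtures/sample_data.json
    # the reviewer, after running migrate: load the fixture into their fresh DB
    python manage.py loaddata sample_data.json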
Just upload it to Heroku or something similar. You already have PostgreSQL as the database, so it's a valid approach, and it is pretty straightforward with Heroku's official guide for Django applications. I had the same issue in my recruitment process and this solution satisfied the recruiter.
The database will obviously be empty unless you prepare some fixtures, which is a very good idea. The Django docs have a section on that.

What's the best approach to maintaining database tables and fields between git branches?

I'm using Django and PostgreSQL to develop a web service.
Suppose we have 3~4 branches for different features or for bugfixes on older versions.
I ran into a problem: while on branch A, I changed a Django model and ran migrate to update the database on my local test machine.
When I switch to another branch that doesn't have that migration file, the database is inconsistent and Django no longer runs, so I have to delete the database and recreate it.
In general, what's the best/common way to deal with this in a development environment?
I understand your situation well and have been in the same shoes several times.
Here is what I prefer (and do):
Suppose I am on branch bug-fix/surname_degrade.
I changed the user data model (which generated migration 0005 for the users app) and then migrated the DB.
Then my boss came and pointed out that users are not able to log in because of a regression in the login flow, so I have to switch branches and fix that first.
I can roll back the migration (0005) which I applied a few moments ago, with something like python manage.py migrate users 0004.
Then I switch branches and start working on hot-fix/login_degrade.
When I switch back to my previous task, I just apply the migration again and proceed.
With this procedure I don't need to delete all my tables, restore an old database, or anything like that (see the sketch below).
I am a newbie, and will be extremely happy to hear your thoughts.
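A sketch of that workflow, assuming the app is called users and the migration numbers are only examples:

    # see which migrations are currently applied
    python manage.py showmigrations users
    # roll back to the last migration both branches share
    python manage.py migrate users 0004
    git checkout hot-fix/login_degrade
    # ...fix, commit, then switch back...
    git checkout bug-fix/surname_degrade
    # re-apply this branch's migration
    python manage.py migrate users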
The major issue here is that your database changes every time you migrate, so you either maintain database consistency across the different branches, or, while developing/testing (after declaring all the models), you can do the following (sketched below):
1) Delete all database tables (assuming you have a backup or only dummy data)
2) Delete all existing migration files in your branch
3) Create new migrations
4) Apply the new migrations
The same steps can be repeated whenever the models are modified.
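Roughly, with PostgreSQL and throwaway data (the database name and paths are placeholders), those steps look like:

    # 1) drop and recreate the dev database
    dropdb mydb && createdb mydb
    # 2) remove existing migration files, keeping the __init__.py files
    find . -path "*/migrations/*.py" -not -name "__init__.py" -delete
    # 3) create fresh migrations
    python manage.py makemigrations
    # 4) apply them
    python manage.py migrate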
Run a different test database in each branch.
When you fork the design, fork the database: make a clone of the database and migrate that.
Make sure that when you push to git you include your migrations; that way, when someone else pulls the branch and runs migrate, Django knows what changes were made to the database.
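One way to do that with PostgreSQL (database names are placeholders) is to clone the current dev database for the new branch and point that branch's settings at the copy:

    # clone the main dev database for a feature branch
    createdb -T myproject_dev myproject_feature_x
    # in that branch, set DATABASES['default']['NAME'] to myproject_feature_x, then
    python manage.py migrate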

retrieving automated backup from google sql

I would like to download an automated backup of my Google Cloud SQL instance. My issue is that a client accidentally deleted some data, and I am hoping that with the latest automated backup I can pick and choose what to restore, either by restoring it to a different DB or with some janky regex.
From the options I see, I can only restore the whole thing.
Is there a way? I am in dire need of this capability.
It turns out you can do it by:
Creating a new instance (not a new database)
Restoring that instance from the backup you want
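With the gcloud CLI this flow looks roughly as follows (the instance names and backup ID are placeholders; check gcloud sql backups --help for the exact flags in your SDK version):

    # list the automated backups of the original instance
    gcloud sql backups list --instance=prod-instance
    # create a fresh instance, then restore the chosen backup into it
    gcloud sql backups restore BACKUP_ID --restore-instance=recovery-instance --backup-instance=prod-instance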

Migrations Plugin for CakePHP

I have a few questions about this plugin.
1- What does it do?
Is it for exchanging databases between teams, changing their schema, creating tables based on models, or something else?
2- If it is not meant to create tables based on models, where can I find a script that does this?
3- Can it work under Windows?
thanks
The Migrations plugin allows versioning of your DB changes, much like what is available in other PHP frameworks and in Rails.
You essentially start with your original schema and create the initial migration. Each time you make a change, you generate a 'diff' that gets stored in the filesystem.
When you run a migration, the database is updated with the changes you made. Think of deployment to a staging or production server, where you want the structure to stay in sync with your code as it moves from environment to environment.
We are starting to look at this plugin so we can automate our deployments, as the DB changes are done manually right now.

How to ensure database changes can be easily moved over DVCS using django

Overview
I'm building a website in Django. I need to allow people to begin adding flatpages and to set some settings in the admin. These changes should be definitive, since that information comes from the client. However, I'm also developing the backend, and as such am creating and migrating tables. I push these changes to the hub.
Tools
django
git
south
postgres
Problem
How can I ensure that I get the database changes from the live site down to my laptop, and how can I push my database changes up to the live site, with a minimum of coordination needed? I am familiar with git hooks, so that option is in play.
Addendum:
I guess I know which tables can be modified via the admin. There should not be much overlap really. As I consider further, the danger really is me pushing data that would overwrite something they have done.
Thanks.
For getting your schema changes up to the server, just use South carefully. If you modify any table they might have data in, make sure you write both a schema migration and, if necessary, a data migration to preserve the sense of their data.
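With South that usually means a pair of generated migrations per change (app and migration names here are only illustrative):

    # generate the schema change for the modified model
    python manage.py schemamigration myapp --auto
    # generate an empty data migration to backfill/convert existing rows
    python manage.py datamigration myapp backfill_new_field
    # apply both locally, and on the server after deploying
    python manage.py migrate myapp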
For getting their updated data back down to you (which doesn't seem critical, but it might be nice to work with up-to-date test data as you're developing), I generally just use Django fixtures and the dumpdata and loaddata commands. It's easy enough to dump a fixture and commit it to your repo, then run loaddata on your end.
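A minimal sketch of that round trip, assuming SSH access to the server (the host, path and app labels are placeholders):

    # on the server: dump the admin-edited apps to a fixture
    ssh user@liveserver "cd /srv/mysite && python manage.py dumpdata flatpages myapp --indent 2" > live_data.json
    # locally: load it into your dev database
    python manage.py loaddata live_data.json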
You could try using git hooks to automate some of this, but if you want automation I do recommend trying something like Fabric instead. Much of this stuff doesn't need to be run every single time you push/pull (in particular, I usually wouldn't want to dump a new data fixture that frequently).
You should probably take a look at South:
http://south.aeracode.org/
It seems to me that you could probably create a git hook that triggers South if you are using some sort of continuous integration system.
Otherwise, every time you do a push you will have to manually execute the migration steps yourself. Don't forget to put up the "site is under maintenance" message. ;)
I recommend that you use mk-table-sync to pull changes from the live server to your laptop.
mk-table-sync takes a lot of parameters, so you can automate the process with Fabric. You would basically create a Fabric task that executes mk-table-sync on each table that you want to pull from the server.
This means that you cannot make database changes yourself locally, because they will be overwritten by the pull.
The only changes you would make to the live database are through South: you push the code to the server and then run migrate to update the database schema.
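In practice that deploy step is small (the host, path and app name are placeholders), e.g.:

    # after pushing the branch that contains the new South migrations
    ssh user@liveserver "cd /srv/mysite && git pull && python manage.py migrate myapp"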