Accessing a database from another location - Django

For a job interview, I got an assignment to create a CRUD app with Django and PostgreSQL. I created the database locally and finished the assignment.
Now I have to upload my code to GitHub. The problem is, they asked for some example data for the CRUD operations, and the developer reviewing my code obviously can't access my local DB.
What can I do to solve this? Do I have to upload my DB to the cloud (if so, what is the best way to do that)? Or is there some other way to resolve this?
Thanks in advance

When they download your code they will need to create their own local DB and run python manage.py migrate; all the DB tables will then be created (commit your migration files so they don't have to run makemigrations themselves). However, there won't be any initial data.
I recommend downloading your own code fresh and running through every step it takes to get the project up and running, including things like creating an admin user. Then write a basic README with all of those steps to make it as easy as possible for them.
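For example, the README's setup section might boil down to something like this (the database name is a placeholder, and this assumes your settings point at it):

createdb crud_demo                  # local PostgreSQL database
pip install -r requirements.txt
python manage.py migrate
python manage.py createsuperuser    # the admin user mentioned above
python manage.py runserver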
Alternatively, you could Dockerize your application and provide a Dockerfile, but that's a bit overkill for a take home interview in my opinion. It may impress the interviewer nonetheless. They may not even want to download and run your project, just review your code in Github.
To provide initial data, look into writing fixtures: run python manage.py dumpdata yourself to export your data into a fixture file, commit it, and have them run python manage.py loaddata to populate the DB.
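As a rough sketch, assuming an app called myapp (the app and file names are illustrative):

python manage.py dumpdata myapp --indent 2 > myapp/fixtures/sample_data.json   # run this yourself and commit the file
python manage.py loaddata sample_data.json                                     # the reviewer runs this after migrate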

Just upload it to Heroku or something similar. You already have Postgres as your database, so it's a valid way to go, and it's pretty straightforward with Heroku's official guide for Django applications. I had the same issue in my recruitment process, and this was the solution that satisfied the recruiter.
The database will obviously be empty unless you prepare some fixtures, which is a very good idea; the Django docs cover that.
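If you go that route, the Django-specific part is small: typically a one-line Procfile like the sketch below (the project name is a placeholder), with the database settings read from Heroku's DATABASE_URL via a helper such as dj-database-url:

web: gunicorn myproject.wsgi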

Related

My new migration will break my database on Heroku (Postgres)

I am facing a challenge here. I inherited the models from previous developers, and the tables were not properly built, so I added some constraints and new tables in order to normalize them. Before pushing the application to Heroku I tested it on my local machine, and it actually broke my database.
The Heroku site is already in production, so there is user data in it. How should I approach this? Do I need to destroy the existing database, create a new one, and run the migrations?
Be very, very careful. Applying migrations on production servers can cause irreversible damage, so you should be prepared for every possible situation.
My best recommendation is to make a complete duplicate of your live DB (with Heroku this is as simple as a pg backup/restore). Then create a staging site running the same code, load the backup into a new database instance, and test against that; live environments are not always the same as local ones. Run your migrations on the staging site and check for unexpected effects (the best way to do this is with Django test cases). If there are any issues, be sure you understand how the rollback process works with Django migrations.
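With the Heroku CLI, copying the live database into a staging app looks roughly like this (the app names are placeholders):

heroku pg:backups:capture --app my-live-app
heroku pg:backups:restore `heroku pg:backups:url --app my-live-app` DATABASE_URL --app my-staging-app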
A good tutorial that is fairly recent can be found here: https://realpython.com/django-migrations-a-primer/

What's the best approach to maintaining database table fields between git branches?

I'm using Django and Postgresql to develop a web service.
Suppose we have 3-4 branches for different features or old-version bugfixes.
I met a problem: while on branch A, I changed a Django model and ran migrate to update the database on my local test machine.
When I switch to another branch that doesn't have that migration file, the database is inconsistent and Django cannot run; I have to delete the database and recreate it.
In general, what's the best/most common way to deal with this kind of situation in a development environment?
I understand your situation well and have been in the same shoes several times.
Here is what I prefer (and do):
I am in branch bug-fix/surname_degrade
I changed the user data model (which generated migration 0005 for the user app) and then migrated the DB.
Then my boss came and pointed out that users are not able to log in, due to the login degrade.
So I have to switch branches and fix that first.
I can roll back the migration (0005) that I ran a few moments ago with something like python manage.py migrate user 0004, i.e. migrate the user app back to its previous migration.
Switched branch and started working on hot-fix/login_degrade
When I switch back to my previous task, I can just run migrate again and proceed.
With this procedure I don't need to delete all my tables, restore an old database, or anything like that.
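Sketched as commands (the app label and migration numbers are illustrative):

git checkout bug-fix/surname_degrade    # branch where migration 0005 was applied
python manage.py migrate user 0004      # roll the user app back one migration
git checkout hot-fix/login_degrade      # go fix the urgent bug
git checkout bug-fix/surname_degrade    # later: back to the original task
python manage.py migrate user           # re-apply 0005 and continue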
I am a newbie, so I will be extremely happy to hear your thoughts.
The major issue here is that your database changes every time you migrate, so either you maintain database consistency among the different branches, or you can do the following while developing/testing (after declaring all the models), sketched as commands after these steps:
1) Delete all database tables (if you have a backup, or it's just dummy data)
2) Delete all existing migration files in your branch
3) Create new migrations
4) Migrate
The above steps can also be repeated whenever the models are modified again.
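As commands, on a throwaway development database only (the app name is illustrative):

python manage.py migrate myapp zero     # unapply everything for the app (or just drop the dev DB)
rm myapp/migrations/0*.py               # keep __init__.py
python manage.py makemigrations myapp
python manage.py migrate myapp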
Run a different test database in each branch.
When you fork the design, fork the database
Make a clone of the database and migrate that.
Make sure that when you push to git you include your migrations; that way, when someone else pulls the branch and runs migrate, Django knows what changes were made to the database.

Techniques to Avoid Problems with Django Migrations?

I'm building an e-commerce website with Django 1.8 and PostgreSQL 9.4. I'm interested in learning what techniques I can use when I change my database to avoid having problems with Django migrations, particularly in the event that I can't get migrations to run and I have to delete my migrations, rebuild my database, and restore from backups.
In development, I've found that when I change my database schema and re-run migrations, they only run successfully about 50% of the time. Clearly I'm doing some things wrong. What's worse is that when migrations don't work, it's not always easy to understand exactly why they failed and how to modify my migration files so that they will run. In those situations, I always have to delete the database and start over. This is acceptable in development but it's not a good strategy when I go into production.
What are some "best practices" or "do's and don'ts" you follow when you modify your model classes/database schema so as to increase the probability that your Django migrations will run? And are there any steps you take to ensure that you can restore your database in the event that your migrations won't run and you have to rebuild the database from scratch? I should add that I'm a one-person startup so I don't have the conflict issues that a team working from the same code base would have.
These are the techniques I'm using:
Work locally in the same environment you run on the server: the same version of Django and the same database server. Then push the migration files themselves (don't gitignore them) and migrate on the server using those migrations.
Another approach I used once is to migrate manually: I created the tables, indices, and relations with SQL commands by hand, and that worked properly too.
I prefer the first one.
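If you do apply SQL by hand, python manage.py sqlmigrate is useful for previewing exactly what a migration would run before you type it in (the app and migration names here are made up):

python manage.py sqlmigrate shop 0002_add_price_index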

Updating Models

Due to my little confidence with Django and my sheer horror at the thought of seriously messing up my beautiful project, I shall ask for proper advice/instructions here.
The database my Django project is sitting on top of has been changed (a few field types were changed) and my models are now out of sync. Funnily enough, my Django site still works (God knows how), but I still want to update the models. How do I go about doing this the proper way? Thank you very much indeed in advance.
Marked as answered. My actual discovery was:
./manage.py inspectdb > <file>
# Hands you models for all the tables in the database.
# Then you update your models accordingly.
SIMPLE! :)
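For reference, inspectdb emits unmanaged model stubs along these lines (the table and field names here are made up):

from django.db import models

class Product(models.Model):
    name = models.CharField(max_length=100)
    price = models.DecimalField(max_digits=10, decimal_places=2)

    class Meta:
        managed = False        # Django won't create or alter this table
        db_table = 'product'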
It's probably a bit late, but you might want to take a look at South, which is a migrations system for Django.
The normal practice for your situation would be to run manage.py reset appname, where appname is the name of the app which contains the models you've changed. You'll obviously want to dump the data in the affected tables first (find out what tables are going to be affected by running manage.py sqlreset appname).
Finally, it's quite possible your site is still running happily because you've not restarted the webserver (I'm assuming you're talking about a production environment; the development server reloads most changes automatically).
If you've already made the changes to the live database, you can probably just change the models and restart your webserver.
As long as your Field names match between the database and the models you shouldn't have any issues.
That being said, it is a much better idea to use a migration tool like South (as Dominic already suggested).

How to ensure database changes can be easily moved over DVCS using Django

Overview
I'm building a website in Django. I need to allow people to begin adding flatpages and setting some settings in the admin. These changes should be definitive, since that information comes from the client. However, I'm also developing the backend, and as such am creating and migrating tables. I push these changes to the hub.
Tools
django
git
south
postgres
Problem
How can I ensure that I get the database changes from the online site down to me on my lappy, and also push my database changes up to the live site, with a minimum of coordination needed? I am familiar with git hooks, so that option is in play.
Addendum:
I guess I know which tables can be modified via the admin. There should not be much overlap really. As I consider further, the danger really is me pushing data that would overwrite something they have done.
Thanks.
For getting your schema changes up to the server, just use South carefully. If you modify any table they might have data in, make sure you write both a schema migration and, as necessary, a data migration to preserve the sense of their data.
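In South terms that pair looks something like this (the app and migration names are placeholders):

./manage.py schemamigration myapp --auto          # picks up the model change
./manage.py datamigration myapp backfill_values   # empty data migration to edit
# in the generated migration's forwards(), transform the existing rows, e.g.
#   for obj in orm['myapp.Thing'].objects.all(): ...
./manage.py migrate myapp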
For getting their updated data back down to you (which doesn't seem critical, but it might be nice to work with up-to-date test data as you develop), I generally just use Django fixtures and the dumpdata and loaddata commands. It's easy enough to dump a fixture, commit it to your repo, then run loaddata on your end.
You could try using git hooks to automate some of this, but if you want automation I do recommend trying something like Fabric instead. Much of this stuff doesn't need to be run every single time you push/pull (in particular, I usually wouldn't want to dump a new data fixture that frequently).
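A rough Fabric (1.x style) sketch of that data pull, with every name a placeholder:

# fabfile.py
from fabric.api import get, local, run

def pull_data():
    # dump the client-edited data on the live server
    run("python manage.py dumpdata flatpages --indent 2 > /tmp/flatpages.json")
    # fetch the fixture and load it into the local dev database
    get("/tmp/flatpages.json", "fixtures/flatpages.json")
    local("python manage.py loaddata fixtures/flatpages.json")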
You should probably take a look at South:
http://south.aeracode.org/
It seems to me that you could probably create a git hook that triggers off South if you are doing some sort of continuous integration system.
Otherwise, every time you do a push you will have to manually execute the migration steps yourself. Don't forget to put up the "site is under maintenance" message. ;)
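A minimal server-side hook for that could look like the following (all paths are illustrative, and this assumes a bare repo on the server):

#!/bin/sh
# .git/hooks/post-receive on the server
GIT_WORK_TREE=/srv/mysite git checkout -f
cd /srv/mysite
python manage.py migrate    # South applies any pending migrations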
I recommend that you use mk-table-sync to pull changes from the live server to your laptop.
mk-table-sync takes a lot of parameters, so you can automate this process using Fabric. You would basically create a Fabric function that executes mk-table-sync on each table that you want to pull from the server.
This means that you cannot make database changes yourself, because they will be overwritten by the pull.
The only changes you would make to the live database are through South: you push the code to the server and then run migrate to update the database schema.