How do you use inspectdb in Django?

I am just starting with Django, and I would like to make an app that uses my existing SQLite db.
I read the docs and found that you can create models from a db using inspectdb, although I can't find an example of how to use that command on an existing db.
I copied the db file into my project directory and ran the command, and I see that a sqlite3 file gets created in the project directory.
But that file has nothing to do with the database I made. I tried to pass the db name to the inspectdb command, but it says it doesn't accept parameters.
So how can I actually tell the command to use my db to create the models for my app?
There must be some obvious step that I am missing... this is what I did:
- created the project
- created the app
- copied my db inside the project folder
- ran inspectdb

But the generated models file is empty, and a new db called db.sqlite3 is created.

Found the answer: there is a setting that defines which database the application will use, the NAME entry under DATABASES in settings.py. The default is "db.sqlite3", which explains why I was getting this behavior.
Once you point NAME at the database you already made, the command runs without issues.
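For example, assuming the existing file is named legacy.sqlite3 (substitute your own filename), the settings.py change and the inspectdb run would look roughly like this:

# settings.py -- point the default connection at the existing database
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'legacy.sqlite3',  # placeholder filename
    }
}

Then generate the models and redirect them into your app:

python manage.py inspectdb > yourapp/models.py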
Not sure if it is just me getting stumped, but the fact that this name has to be changed was not mentioned anywhere...
Thanks

Related

Creating Database in iOS Application

I developed an iOS application using Qt. I am trying to create a DB, sqlitedata.db, but the DB is not getting created.
I have used
QStandardPaths::standardLocations(QStandardPaths::StandardLocation type)
to get the correct path.
The path returned from qDebug() was:
/var/mobile/Applications/262093E8-F9A7-4624-9559-FB3C6BF393E5/Library/Application Support/sqlitedata.db
But inside the Library folder, there is no folder named Application Support and no DB.
Please help with a solution to create a DB and read/update it through the application.
Make sure the path returned from QStandardPaths actually exists. If it doesn't, then create it first.
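The same idea, sketched in Python with PySide6 (which mirrors the Qt C++ API, so the call names are the same in C++). QStandardPaths only reports the conventional location; it does not create it:

from PySide6.QtCore import QDir, QStandardPaths
from PySide6.QtSql import QSqlDatabase

# Ask Qt for the writable Application Support location
base = QStandardPaths.writableLocation(QStandardPaths.StandardLocation.AppDataLocation)

# The location is a convention only: create the folder if it is missing
QDir().mkpath(base)

db = QSqlDatabase.addDatabase('QSQLITE')
db.setDatabaseName(base + '/sqlitedata.db')
if not db.open():
    print(db.lastError().text())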

What is the structure.sql used for?

I'm curious what the point of the structure.sql file is. It seems to be created and updated every time Rails migrations are run, so it appears to be an SQL representation of our database schema. What else can it be used for?
When one runs structure:load, what does it do? What does it mean to load a structure file into a database? Why would you need to do that?
Should one be committing the structure.sql file?
Seems like your Rails app is configured to use the SQL schema format:
#/config/application.rb
...
config.active_record.schema_format = :sql
...
The structure.sql file takes the place of schema.rb.
Running db:structure:load (or db:schema:load) loads your entire database schema into an empty database. You only need to do this when bringing up a new app instance from scratch. After a while your migration files become quite lengthy, and it is better to do a schema load first and then run any remaining migrations when bringing up a new app instance.
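For example, with a standard Rails setup, bootstrapping a fresh instance would look roughly like this (on newer Rails versions db:structure:load is folded into db:schema:load, which respects schema_format):

bundle exec rake db:create
bundle exec rake db:structure:load
bundle exec rake db:migrate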

Running tests with unmanaged tables in Django

My Django app works with tables that are not managed, i.e. my models define the following:
class Meta:
    managed = False
    db_table = 'mytable'
When I run a simple test that imports Person, I get the following:
(person)bob#sh ~/person/dapi $ > python manage.py test
Creating test database for alias 'default'...
DatabaseError: (1060, "Duplicate column name 'db_Om_no'")
The tests.py is pretty simple like so:
from person.management.commands.dorecall import Command
from person.models import Person
from django.test import TestCase

class EmailSendTests(TestCase):
    def test_send_email(self):
        person = Person.objects.all()[0]
        Command().send_email()
I did read in the Django docs where it says "For tests involving models with managed=False, it's up to you to ensure the correct tables are created as part of the test setup.". So I understand that my problem is that I did not create the appropriate tables. Am I supposed to create a copy of the tables in the test_person db that the test framework created?
Every time I run the tests, the test_person db gets destroyed (I think) and set up again, so how am I supposed to create a copy of the tables in test_person? Am I thinking about this right?
Update:
I saw this question on SO and added the ManagedModelTestRunner() in utils.py. Though ManagedModelTestRunner() does get run (confirmed by inserting pdb.set_trace()), I still get the Duplicate column name error. I do not get errors when I do python manage.py syncdb (though this may not mean much, since the tables are already created; I will try removing the table and rerunning syncdb to see if I can get any clues).
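For reference, the runner pattern from that linked question looks roughly like this; a sketch assuming a Django version with DiscoverRunner and the apps registry (older versions subclass DjangoTestSuiteRunner instead):

from django.apps import apps
from django.test.runner import DiscoverRunner

class ManagedModelTestRunner(DiscoverRunner):
    # Treat unmanaged models as managed for the duration of the test
    # run, so the test database gets their tables created.
    def setup_test_environment(self, *args, **kwargs):
        self.unmanaged_models = [m for m in apps.get_models()
                                 if not m._meta.managed]
        for m in self.unmanaged_models:
            m._meta.managed = True
        super().setup_test_environment(*args, **kwargs)

    def teardown_test_environment(self, *args, **kwargs):
        super().teardown_test_environment(*args, **kwargs)
        # Restore the original flags so nothing outside tests is affected
        for m in self.unmanaged_models:
            m._meta.managed = False

Point TEST_RUNNER at this class in settings.py to activate it.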
I had the same issue: an unmanaged legacy database that also had a custom database name set in the model's Meta options.
Running tests with a managed-model test runner, as you linked to, solved half my problem, but I still had the problem of Django not knowing about the custom db name:
django.db.utils.ProgrammingError: relation "custom_db" does not exist
The issue was that ./manage.py makemigrations still creates definitions of all models, managed or not, and includes your custom db names in the definition, which seems to blow up tests. By installing:
pip install django-test-without-migrations==0.2
and running tests like this:
./manage.py test --nomigrations
I was able to write tests against my unmanaged model without getting any errors.

Django fixtures for permissions

I'm creating fixtures for permissions in Django, and I'm able to get them loaded the way it's needed. However, my question is: say I want to load a fixture for the table auth_group_permissions; I need to specify a group_id and a permission_id, and unfortunately fixtures aren't the best way to handle this. Is there an easier way to do this programmatically, so that I can get the id for particular values and have them filled in? How is this normally done?
As of Django >= 1.7, it is possible to store permissions in fixtures, thanks to the introduction of "natural keys" as a serialization option.
You can read more about natural keys in the Django serialization documentation.
The documentation explicitly mentions the use case for natural keys being when..
...objects are automatically created by Django during the database synchronization process, the primary key of a given relationship isn’t easy to predict; it will depend on how and when migrate was executed. This is true for all models which automatically generate objects, notably including Permission, Group, and User.
So for your specific question, regarding auth_group_permissions, you would dump your fixture using the following syntax:
python manage.py dumpdata auth --natural-foreign --natural-primary -e auth.Permission
The auth.Permission model must be explicitly excluded with the -e flag, as its table is populated by the migrate command and will already contain data before fixtures are loaded.
This fixture can then be loaded in the same way as any other fixture.
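For example, assuming the dump above is redirected into a file (the name auth.json is arbitrary):

python manage.py dumpdata auth --natural-foreign --natural-primary -e auth.Permission > auth.json
python manage.py loaddata auth.json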
The proper solution is to create the permissions in the same manner the framework itself does.
You should connect to the built-in post_migrate signal either in the module management.py or management/__init__.py and create the permissions there. The documentation does say that any work performed in response to the post_migrate signal should not perform any database schema alterations, but you should also note that the framework itself creates the permissions in response to this signal.
So I'd suggest that you take a look at the management module of the django.contrib.auth application to see how it's supposed to be done.
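A minimal sketch of that approach; the group name and permission codename below are made up, and on newer Django versions the usual place to wire this up is AppConfig.ready() rather than a management module:

# yourapp/apps.py
from django.apps import AppConfig
from django.db.models.signals import post_migrate

def create_default_groups(sender, **kwargs):
    # Import models inside the handler, once the app registry is ready
    from django.contrib.auth.models import Group, Permission
    editors, _ = Group.objects.get_or_create(name='editors')    # made-up group
    perm = Permission.objects.get(codename='change_article')    # made-up codename
    editors.permissions.add(perm)

class YourAppConfig(AppConfig):
    name = 'yourapp'

    def ready(self):
        post_migrate.connect(create_default_groups, sender=self)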
Just to add to @jonpa's comment, if you are using a multitenant app and you want to save the fixtures directly to a file, you can do:
python manage.py tenant_command dumpdata --schema=<schema_name> --natural-foreign --natural-primary -e auth.Permission --indent 4 > /path/to/fixtures/fixtures.json

How to detect and respond to a database change (INSERT) from a django project?

I am setting up our project to integrate with a shipping platform called Endicia which has the ability to insert new rows into our database when a package is shipped.
How can I detect from python when a new row has been inserted?
My solution for now would be to query the DB every 30 seconds or so for new rows... is there a better way to have Postgres send a signal to Python?
You'd set up a custom command that is run by the manage.py file.
You'd put it in the yourapp/management/commands/ folder. Make sure to add an __init__.py file to both the management and commands folders, or the command won't be discovered. Then you write the code for the custom command.
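A skeleton for such a command might look like this (process_shipment and the module path are made-up names):

# yourapp/management/commands/process_shipment.py
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = 'React to a newly inserted shipment row'

    def handle(self, *args, **options):
        # Query for the new rows and do the real work here
        self.stdout.write('shipment processed')

You would then invoke it as python manage.py process_shipment.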
Then, see this related question about running a shell script when changes are made to a Postgres database. The answer there was to use PL/sh. You'll need to figure that part out on your own, but basically, however you do it, the end result is that the script should call something like /path/to/app/manage.py command_name