How can I run django shell commands from a bash script - django

Instead of repeatedly deleting my tables, recreating them and populating them with data in my dev env, I decided to create a bash script called reset_db that does this for me. I got it to whack the tables and recreate them, but it's not able to populate the tables with data from the Django ORM.
I try to do this by calling the Django shell from the script and then running ORM commands to populate my tables, but it seems like the Django shell commands are not running.
I tried running the Django ORM commands manually/directly in the shell and they run fine, but not from within the bash script.
The errors I get are:
NameError: name 'User' is not defined
NameError: name 'u1' is not defined
NameError: name 'm' is not defined
Here is my script:
#!/bin/bash
set +e
RUN_ON_MYDB="psql -X -U user --set ON_ERROR_STOP=on --set AUTOCOMMIT=off rcamp1"
$RUN_ON_MYDB <<SQL # Whack tables
DROP TABLE rcamp_merchant CASCADE;
DROP TABLE rcamp_customer CASCADE;
DROP TABLE rcamp_point CASCADE;
DROP TABLE rcamp_order CASCADE;
DROP TABLE rcamp_custmetric CASCADE;
DROP TABLE rcamp_ordermetric CASCADE;
commit;
SQL
python manage.py syncdb # Recreate tables
python manage.py shell <<ORM # Start django shell. Problem starts here.
from rcamp.models import Customer, Merchant, Order, Point, CustMetric, OrderMetric
u1 = User.objects.filter(pk=5)
m = Merchant(u1, full_name="Bill Gates")
m
ORM
I'm new to both django and shell scripting. Thanks for your help.

You should look at creating a fixture to populate your db: https://docs.djangoproject.com/en/dev/howto/initial-data/
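For example, here is a minimal sketch of what that could look like for this case. The file path, the pk values, and the assumption that Merchant has a user foreign key are guesses based on your script, so adjust them to your actual models. Save something like this as rcamp/fixtures/dev_data.json:
[
  {
    "model": "rcamp.merchant",
    "pk": 1,
    "fields": {
      "user": 5,
      "full_name": "Bill Gates"
    }
  }
]
Then load it from the bash script right after the syncdb step with:
python manage.py loaddata dev_data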

You need to import User explicitly. The django package and a few other things are automatically imported, but not everything you might want.
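For example, a minimal sketch of the heredoc with the missing imports added; it assumes Merchant has a user foreign key field, which you will need to adjust to your actual model:
python manage.py shell <<ORM
from django.contrib.auth.models import User
from rcamp.models import Merchant
u1 = User.objects.get(pk=5)
m = Merchant(user=u1, full_name="Bill Gates")
m.save()
print(m)
ORM
Note that filter() returns a queryset rather than a single object, so get() (or filter(...).first()) is what you want when assigning a single user to the Merchant.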
Also, to avoid having to keep track of what to import, there are custom management commands. That keeps the work in Django and Python; you can learn shell scripting later.
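A minimal sketch of such a command, assuming an rcamp app and the same hypothetical user foreign key as above; save it as rcamp/management/commands/seed_dev_data.py (with empty __init__.py files in the management and commands directories) and run it with python manage.py seed_dev_data:
from django.core.management.base import BaseCommand
from django.contrib.auth.models import User
from rcamp.models import Merchant


class Command(BaseCommand):
    help = "Populate the dev database with sample data"

    def handle(self, *args, **options):
        # Assumes a User with pk=5 already exists, as in the original script
        u1 = User.objects.get(pk=5)
        # The 'user' field name is a guess; use whatever FK your Merchant model defines
        m = Merchant.objects.create(user=u1, full_name="Bill Gates")
        self.stdout.write("Created %s" % m)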

As can be seen from your errors, User is not recognized as a model class; you are probably missing an import, something like this:
from django.db import models
from django.contrib.auth.models import User
By the way, on the line
u1 = User.objects.filter(pk=5)
I think I would put
u1 = User.objects.filter(pk=5).first()
instead.
Anyway, here are some threads that may be of help:
https://docs.djangoproject.com/en/dev/ref/django-admin/
http://www.stackoverflow.com/questions/6197256/django-user-model-fields-at-adminmodel
https://groups.google.com/forum/?fromgroups=#!topic/django-users/WrVp1DDFrX8
Hope this helps.

Related

Database entries added in background task don't show up on heroku

I have a background task in my Django application which enters all the rows from a CSV into one of the tables in my database. I pass the CSV via the admin site, which creates a background task that I can run with python manage.py process_tasks. This all works locally, but on my Heroku app, for some reason it doesn't.
I thought maybe inputting data is impossible from the Heroku console, but if I run python manage.py shell on the Heroku console, I can input data just fine.
This is the code that inputs the data into the database:
from background_task import background
...
@background(schedule=5)
def save_course_from_df(df):
    df = pandas.read_json(df)
    db = 0
    for index, row in df.iterrows():
        print("%s percent done!" % str(db / df.shape[0]))
        db += 1
        values = dict(row)
        values = {key: values[key] for key in values.keys()
                  if type(values[key]) != float or not math.isnan(values[key])}
        try:
            Course.objects.update_or_create(
                url=row['url'],
                defaults=values
            )
        except IntegrityError:
            pass
    print('done!')
I run this by opening the heroku console and running 'python manage.py process_tasks'. I get the print messages, and no error is being thrown. Still, my database doesn't change.
I expected that after the task runs I would have a full table. Instead, nothing changed.
It seems the problem was with my migrations. For some reason the database on the Heroku app had a NOT NULL constraint on a column whose model field had null=True.
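If you hit something similar, a reasonable first check (assuming Django 1.8+ and the Heroku CLI) is to compare the migration state on Heroku with your local one and re-apply anything missing:
heroku run python manage.py showmigrations
heroku run python manage.py migrate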

How to add a column to an existing table in Flask

I did the following, but no column is added.
Migrate database
python manage.py db migrate
Edit migrations/versions/{version}_.py
def upgrade():
    from alembic import op
    from sqlalchemy import Column, INTEGER
    op.add_column('table_name', Column('column_name', INTEGER))
Update schema
python manage.py db upgrade
The reason is that Alembic stores the current version in a table called alembic_version, and once {version} is already recorded in alembic_version, running the upgrade again does nothing.
The solution is to create a new migration script and run the migration again, as in the sketch below.
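A minimal sketch of that new migration script; the revision identifiers are placeholders that Flask-Migrate/Alembic fills in when you generate a new revision with python manage.py db migrate (or db revision):
"""add column_name to table_name"""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic (placeholders here)
revision = '<new_revision_id>'
down_revision = '<previous_revision_id>'


def upgrade():
    op.add_column('table_name', sa.Column('column_name', sa.Integer()))


def downgrade():
    op.drop_column('table_name', 'column_name')
Then apply it with python manage.py db upgrade as before.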

How can I activate the unaccent extension on an already existing model

When I try to install the unaccent Postgres extension (through the postgresql-contrib package), everything works as per the below:
# psql -U postgres -W -h localhost
Password for user postgres:
psql (9.3.9)
SSL connection (cipher: DHE-RSA-AES256-GCM-SHA384, bits: 256)
Type "help" for help.
postgres=# CREATE EXTENSION unaccent;
CREATE EXTENSION
postgres=# SELECT unaccent('Hélène');
unaccent
----------
Helene
(1 row)
However, when I try to use it with Django 1.8, I get the following error:
ProgrammingError: function unaccent(character varying) does not exist
LINE 1: ...able" WHERE ("my_table"."live" = true AND UNACCENT("...
^
HINT: No function matches the given name and argument types. You might need to add explicit type casts.
Using Postgresql 9.3 and Django 1.8.
A migration file needs to be manually made and applied.
First, create an empty migration:
./manage.py makemigrations myapp --empty
Then open the file and add UnaccentExtension to operations:
from django.db import migrations
from django.contrib.postgres.operations import UnaccentExtension


class Migration(migrations.Migration):

    dependencies = [
        (<snip>)
    ]

    operations = [
        UnaccentExtension()
    ]
Now apply the migration using ./manage.py migrate.
If you get the following error during that last step:
django.db.utils.ProgrammingError: permission denied to create extension "unaccent"
HINT: Must be superuser to create this extension.
... then temporarily grant superuser rights to your user by running postgres# ALTER ROLE <user_name> SUPERUSER;, apply the migration, and then revoke the rights again with its NOSUPERUSER counterpart. pgAdminIII can do this, too.
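For example, a quick sketch of that sequence in a psql session (the role name is a placeholder for your Django database user):
postgres=# ALTER ROLE my_django_user SUPERUSER;
postgres=# -- now run ./manage.py migrate from the project, then revoke:
postgres=# ALTER ROLE my_django_user NOSUPERUSER;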
Now enjoy the unaccent functionality using Django:
>>> Person.objects.filter(first_name__unaccent=u"Helène")
[<Person: Michels Hélène>]

How do I inspectdb 1 table from a database which contains 1000 tables

I have a schema which contains 1000 tables, and many of them I don't need.
How can I run inspectdb on just the tables that I need?
You can generate the model for a single table by running this command:
python manage.py inspectdb TableName > output.py
This also works if you want to generate the model for a view.
You can do it in the Python console, or in a *.py file:
from django.core.management.commands.inspectdb import Command
from django.conf import settings
from your_project_dir.settings import DATABASES  # replace `your_project_dir`

settings.configure()
settings.DATABASES = DATABASES

Command().execute(
    table_name_filter=lambda table_name: table_name in ('table_what_you_need_1', 'table_what_you_need_2', ),
    database='default'
)
https://github.com/django/django/blob/master/django/core/management/commands/inspectdb.py#L32
You can do it with the following command in Django 2.2 or above:
python manage.py inspectdb --database=[dbname] [table_name] > output.py
You can get the models of the tables you want by doing:
python manage.py inspectdb table1 table2 tableN > output.py
This way you can select only the tables you want.
You can generate a model's Python code and write it to the console programmatically.
from django.core.management.commands.inspectdb import Command

command = Command()
command.execute(
    database='default',
    force_color=True,
    no_color=False,
    include_partitions=True,
    include_views=True,
    table=[
        'auth_group',
        'django_session'
    ]
)
Set table=[] (an empty list) to get all tables.

Another South "table does not exist" issue: none of the previously posted solutions working

I am trying to make an app using Django and am using South to handle migrations. After I define the app's models.py, I include south in the "INSTALLED_APPS" in settings.py. Then I sync my database. When I validate the database, I get 0 errors. Then I execute the following commands on the command prompt:
C:\Users\abagaria\Desktop\IntegrateID\website>python manage.py schemamigration website.integrate --initial
Creating migrations directory at 'C:\Users\abagaria\Desktop\IntegrateID\website\website\integrate\migrations'...
Creating __init__.py in 'C:\Users\abagaria\Desktop\IntegrateID\website\website\integrate\migrations'...
+ Added model integrate.Publisher
+ Added model integrate.Author
+ Added model integrate.Book
+ Added M2M table for authors on integrate.Book
Created 0001_initial.py. You can now apply this migration with: ./manage.py migrate integrate
C:\Users\abagaria\Desktop\IntegrateID\website>python manage.py migrate website.integrate
Running migrations for integrate:
- Migrating forwards to 0001_initial.
> integrate:0001_initial
FATAL ERROR - The following SQL query failed: CREATE TABLE "integrate_publisher" ("id" integer NOT NULL PRIMARY KEY, "name" varchar(30) NOT NULL, "address" varchar(50) NOT NULL, "city" varchar(60) NOT NULL, "state_province" varchar(30) NOT NULL, "country" varchar(50) NOT NULL, "website" varchar(200) NOT NULL)
The error was: table "integrate_publisher" already exists
! Error found during real run of migration! Aborting.
! Since you have a database that does not support running
! schema-altering statements in transactions, we have had
! to leave it in an interim state between migrations.
! You *might* be able to recover with:
= DROP TABLE "integrate_publisher"; []
= DROP TABLE "integrate_author"; []
= DROP TABLE "integrate_book"; []
= DROP TABLE "integrate_book_authors"; []
! The South developers regret this has happened, and would
! like to gently persuade you to consider a slightly
! easier-to-deal-with DBMS (one that supports DDL transactions)
! NOTE: The error which caused the migration to fail is further up.
Error in migration: integrate:0001_initial
DatabaseError: table "integrate_publisher" already exists
I know that a lot of people have faced similar problems while using South, but usually in their case they make the mistake of executing the --initial command more than once, thereby causing South to create more than one initial migration file in the migrations directory. But in my case, South thinks that the table already exists even when I run the first migration!
I have also tried:
deleting the migrations directory
deleting ghost migrations
making a "fake" migration
and then running the actual migration
Can someone please tell me how I fix this problem and can start defining my models again?
If you already have tables in the database, do not use --initial; you need the convert_to_south command instead. Delete the "migrations" directory and all the tables from the database, then run the following commands:
python manage.py syncdb
python manage.py convert_to_south appname
python manage.py syncdb --migrate
http://south.readthedocs.org/en/latest/convertinganapp.html