Django and Elastic Beanstalk: a column deleted locally hasn't been deleted on RDS

I deployed my project to Elastic Beanstalk (EB), and a form submission that works locally doesn't work on EB. I found the cause:
an attribute of a model that is not allowed to be null still remains as a column on RDS, even though I removed it from the model in the past.
The error in the log looks like this:
ERROR: null value in column "a_variable" violates not-null constraint
How can I handle this problem? Is there a way to remove the column manually on RDS? I am using PostgreSQL as the DB engine.

Your RDS database is just a normal PostgreSQL database. If you really need to access it directly, you can:
Create a security group that opens up the PostgreSQL port (5432, TCP) to your current IP address.
Add the security group to your RDS instance (temporarily; don't forget to remove it later).
Use psql on your local machine to connect to your RDS instance, with your Django credentials.
Run the normal SQL command (ALTER TABLE table_name DROP COLUMN column_name;) to delete the column. Be careful! You're directly manipulating your database.
To be sure, I would also check the django_migrations table to see whether all migrations ran correctly; you might run into problems with later migrations if that isn't the case.
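If you prefer to stay on the Django side, here is a minimal sketch using Django's raw cursor against the RDS settings; myapp_mymodel is a hypothetical table name, and a_variable is the column from the error above:

# Run from "python manage.py shell" with settings pointing at RDS.
from django.db import connection

with connection.cursor() as cursor:
    # First, see which migrations RDS thinks have been applied.
    cursor.execute("SELECT app, name FROM django_migrations ORDER BY applied;")
    for app, name in cursor.fetchall():
        print(app, name)
    # Then drop the leftover column. Destructive -- take a backup first.
    cursor.execute("ALTER TABLE myapp_mymodel DROP COLUMN a_variable;")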

Related

Django localhost default port already in use with postgres

I am receiving an error when running Postgres, pgAdmin, and Django:
Port is already in use. I know how to use the sudo command to kill the server, but I am not being shown my databases in Postgres; it just shows "running" when I start the server. When I create a new port, the database shows but doesn't migrate the data. Also, when I run Django migrations I get errors: table already exists and/or table doesn't exist. This happens every time, as if there is a duplicate database that is not connected to the one showing in pgAdmin. I tried dropping and deleting tables in pgAdmin, and I still get "table exists".
How do I fix this? How do I show all connected databases, delete the one that overrides my default port, and have Postgres show database servers in the app?
Thank you in advance. Long post, but I'm stressing and can't find a solution.
What fixed it: deleting the tables in pgAdmin, changing the port number, and opening Postgres first, then starting pgAdmin.
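As a quick diagnostic before killing anything, a small sketch like this shows whether something is already listening on a port (5432 and 8000 are the usual Postgres and Django dev-server defaults; adjust to your setup):

import socket

def port_in_use(port, host="127.0.0.1"):
    # True if something is already listening on host:port.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) == 0

print(port_in_use(5432))  # default Postgres port
print(port_in_use(8000))  # default Django dev-server port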

AWS DMS - Migrate - only schema

We have noticed that if a table is empty in SQL Server, the empty table does not come over via DMS; it only starts to show up after a record is inserted.
Just checking, is there a way to get the schema only from DMS?
Thanks
You can use the AWS Schema Conversion Tool (SCT) for moving DB objects and schema. It's a free tool from AWS and can be installed on an on-premises server or on EC2. It gives a good report before you actually migrate the DB schema and other DB objects: it shows how many tables, stored procedures, functions, etc. can be migrated directly, and shows possible solutions too.

How to check whether sqlite database is attached or not?

I am using SQLite to store my data. I have two databases. In my application, each time a new request comes in, I attach the first DB to the second DB. The problem is, if two requests come in, it reports that the DB is already in use (it tries to attach twice with the same alias name 'db'). I want to know if there is any way to check whether a database is attached or not.
PRAGMA database_list;
outputs a result set with the full list of attached databases. The second column is the database name and the third is the database file (empty if the database is not associated with a file). The primary database is always named main, and the temporary db is always temp. For example:
sqlite> attach "foo.db" as foo;
sqlite> pragma database_list;
0|main|
2|foo|/Users/me/tmp/foo.db
I assume you are reusing the same connection to the database for multiple requests. Because databases are attached to the connection object, attaching fails for the second and subsequent requests on the same connection. The solution, I think, is therefore to attach the database immediately after a new connection is made, rather than each time a request is received.
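In Python, for example, a defensive check before attaching could look like this minimal sketch (file and alias names are illustrative):

import sqlite3

conn = sqlite3.connect("main.db")

def is_attached(conn, alias):
    # PRAGMA database_list returns (seq, name, file) rows;
    # compare the name column against the alias we want to use.
    rows = conn.execute("PRAGMA database_list;").fetchall()
    return any(name == alias for _, name, _ in rows)

if not is_attached(conn, "foo"):
    conn.execute("ATTACH DATABASE 'foo.db' AS foo;")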

Tables not recognized on restored geodjango postgis backup

So I backed up my geodjango postgis database using pg_dump before performing some calculations which I've managed to mess up. I've created a new database using
createdb -T template0 -O proj proj_backup
psql proj_backup < proj_backup.pg
This seemed to work fine (though there were a few errors during the import), and when connecting to the database using psql, all my tables are there and appear to have the correct number of rows, etc.
However, changing my settings.py to connect to my newly imported backup db (proj_backup in my example), gives me the following errors:
DatabaseError: relation "appname_model" does not exist
Any ideas? I'm guessing I did the dump wrong, or that I haven't maintained the ForeignKeys somehow. Thanks very much.
Update
So I figured out my first mistake: I had two similarly named backup databases and connected to the wrong one. Connecting to the correct one seems to have fixed everything. However, it's still quite strange that it didn't recognize the tables in the other backup database, which definitely did exist. Running syncdb on the incorrect database ends up duplicating those tables (if I remember correctly, there were duplicate table names when I listed them all from within psql). Unfortunately, the way I discovered my mistake was by dropping the bad table to recreate it, and so in order to reproduce this error I'll probably have to use time machine. Still very strange, I'll give that a shot when I can get physical access to my work machine.
So is your appname_model table there or is it a view? Was it in public or another named schema?
If the table is there, then chances are you have it in a schema that is not in your database search path. Check the search_path of your old database. It might have included something other than the default, or your default search schema is set in postgresql.conf and is non-standard.
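As a sketch of that check, assuming psycopg2 and the names from the question, you can inspect the search path and find which schema the table actually landed in:

import psycopg2

conn = psycopg2.connect(dbname="proj_backup", user="proj")
cur = conn.cursor()

# Which schemas does this session search by default?
cur.execute("SHOW search_path;")
print(cur.fetchone())

# Which schema does the "missing" table actually live in?
cur.execute(
    "SELECT schemaname, tablename FROM pg_tables WHERE tablename = %s;",
    ("appname_model",),
)
print(cur.fetchall())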

Django Multiple Databases - One not always available

I am developing a Django application that will use multiple database backends. I would like to put an SQLite database on the machine running the Django application and sync to a remote MySQL database. The tricky part is that the machine running the application will not always have an internet connection, so the MySQL database is not always available. There will be multiple machines running the application, each with its own local SQLite DB, but all using the same remote MySQL DB.
I haven't written the code yet, but here is what I have in mind. Every time I run an insert or update I would like to write it to both databases, unless the remote DB is unavailable, in which case I will save the sql statement in a table on the local database to run when the remote DB is available.
Can this be done with database routers, or do I need to manually implement this with each db statement?
Note on PK: not directly related, but sure to be asked. The primary key will be generated locally on each machine. In the MySQL DB there will be a field for this key, and a field with a unique identifier for each instance of the application; together these will provide a unique key.
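For illustration, the composite key described in this note could be modeled like this (hypothetical model and field names):

from django.db import models

class SyncedRecord(models.Model):
    local_pk = models.IntegerField()                # pk generated on the local machine
    instance_id = models.CharField(max_length=36)   # unique id of this application instance

    class Meta:
        # together the two fields form the unique key on the remote MySQL DB
        unique_together = ('local_pk', 'instance_id')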
Suppose you have a model called Blog; then you can use the following to store it locally and remotely (assuming that you have configured access to the remote db).
blog = Blog('test')
blog.save()  # assuming that sqlite is the default db
try:
    blog.save(using='mysql')
except NoInternetConnection:  # placeholder exception: catch whatever your connection layer raises
    pass
Make sure you have defined and configured 'mysql' in settings.py and that you handle the case where there is no internet connection.
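A minimal settings.py sketch of such a configuration, with placeholder names and credentials:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'local.sqlite3',
    },
    'mysql': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'remote_db',           # placeholder database name
        'HOST': 'remote.example.com',  # placeholder host
        'USER': 'app_user',            # placeholder credentials
        'PASSWORD': 'secret',
    },
}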
Side note: I am not quite sure why you would actually want to do this. If this is for backup purposes, I would use standard backup procedures. For more information about using multiple databases, see: http://docs.djangoproject.com/en/dev/topics/db/multi-db/#using-raw-cursors-with-multiple-databases
I took DrDee's code and attached it to the post_save signal (+1 for the help).
from django.db import models
from django.dispatch import receiver

@receiver(models.signals.post_save)  # new signal decorator in Django 1.3
def save_to_remote(sender, instance, using, **kwargs):
    if using == 'default' and instance.__module__ == 'mymodel.myapp.models':
        try:
            instance.save(using='remote')
        except Exception:
            # Remember which instance failed to reach the remote db.
            pending_instance = Pending_Remote(
                pk_default=instance.pk,
                model_default=instance.__class__.__name__,
            )
            pending_instance.save()
This also saves a record of what was not saved to the remote database. Note that the model Pending_Remote must not be in 'myapp'; otherwise saving it would match the module check above and re-trigger the handler.
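For completeness, a hypothetical sketch of what Pending_Remote could look like (field names taken from the handler above; put it in an app other than 'myapp'):

from django.db import models

class Pending_Remote(models.Model):
    pk_default = models.IntegerField()                 # pk of the instance that failed to sync
    model_default = models.CharField(max_length=100)   # class name, for looking it up on retry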