flask-migrate: how to add a new database - flask

I have a flask setup with development & staging environments.
Now I want to add a production environment with a production database.
I'm having trouble integrating the new database into Flask-Migrate.
I did these steps:
created fresh postgres DB
ran db.create_all() from the flask app
(resulting in a DB reflecting the latest version of the data model)
Now all flask-migrate commands fail with errors like
sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) column "testfield" of relation "customer_feedback" already exists
because Flask-Migrate seems to think it needs to apply every migration created until today. But they are not necessary, because the DB already fully reflects models.py.
How can I convince Flask-Migrate to accept the current state as fully migrated?
Or what's the standard workflow for this?
In other words:
I am coming from Django, where the migrate command creates and updates the tables as necessary when pointed at a blank DB. How should it be done with Flask?

You need to tell Flask-Migrate that the database has already been created and already matches the latest revision. Try the following command:
flask db stamp head
This marks the database as up to date, so Flask-Migrate will not attempt to re-apply any migrations.
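Under the hood, flask db stamp head does not run any migration at all; it only records the latest Alembic revision id in the alembic_version table of the target database. A minimal sketch of that effect, using an in-memory SQLite database ("abc123" is a placeholder, not a real revision id):

```python
import sqlite3

# "flask db stamp head" only writes the head revision id into the
# alembic_version table; no DDL from the migration scripts is executed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE alembic_version (version_num VARCHAR(32) NOT NULL)")
conn.execute("INSERT INTO alembic_version (version_num) VALUES (?)", ("abc123",))
row = conn.execute("SELECT version_num FROM alembic_version").fetchone()
print(row[0])  # -> abc123: the database now reports itself as fully migrated
```

Once that row exists, flask db upgrade compares the stored revision against the head of the migration scripts, sees they match, and does nothing.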

Related

Flask-migrate doesn't update or migrate changes after environment change

I'm running Flask with Flask-SQLAlchemy on DigitalOcean Apps. I can't get Flask-Migrate to work properly in the production environment. Calling flask db migrate on my production app does nothing. No changes detected. Nothing. Just this below:
INFO [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO [alembic.runtime.migration] Will assume transactional DDL.
As such I tried this tutorial - which is basically a way of 'starting again'. I'd create the initial migration to a local empty DB, then commit that migration script to source and push it live. The changes (the initial migration) were detected just fine on an empty local DB. All that's changed is the DB is empty and I changed the DATABASE_URI env.
This means there's a migration for the 'first migration' on the production instance. As far as I'm aware, it reflects the state of the production DB.
I'd then run flask db stamp head on production and local. Running flask db migrate on production (with the change I want to migrate), nothing happens. No changes detected. flask db upgrade produces the same results. I checked the Migrate instance on production; it has the correct DB connection string. The web app works, but it's not detecting or able to push through new changes.
I have data in my database that I absolutely cannot drop.

How to reflect changes made on a Django Postgres database from one PC to another

Three of us are working with Django, with Postgres as the database. Whenever we push the code to GitHub, the database data is not reflected; the data I store is visible only to me. The Postgres user, password and database name are the same on all our laptops. How can I make my data reach their databases when I push?
If you are talking about changes in the DB schema, then take a look at Django migrations - https://docs.djangoproject.com/en/3.1/topics/migrations/. The workflow is as follows:
change a model (e.g. add a new field, change an existing field...)
generate a migration file by running python manage.py makemigrations. This generates a migration file in <app_folder>/migrations
run python manage.py migrate to apply the model changes to your database
add the migration file to your version control (GitHub or whatever) and push it
Now when your colleagues get the updated code from version control, they need to run python manage.py migrate to apply the model changes to their local database.
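Put together, the round trip sketched above might look like this (app name, field and commit message are placeholders):

```shell
# on your machine: change a model, then generate and apply the migration
python manage.py makemigrations myapp
python manage.py migrate
git add myapp/migrations/
git commit -m "Add new field to MyModel"
git push

# on a colleague's machine: pull the migration file and apply it locally
git pull
python manage.py migrate
```

Note that this only synchronizes the schema; the rows you inserted locally never travel through git, which is why your data stays visible only to you.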
I found that it is not possible to send Postgres data directly, but SQLite can be used for it. This link aids in that - https://manuelvanrijn.nl/blog/2012/01/18/convert-postgresql-to-sqlite/ . But I send data by taking a backup.
From Postgres to Postgres, take a backup:
pg_dump dbname > dbname.bak
and then restore on the new Postgres with:
psql test < dbname.bak
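The plain-SQL dump above works; pg_dump's custom format is an alternative worth knowing (a sketch; database names are placeholders), since it is compressed and lets pg_restore restore tables selectively:

```shell
# dump in Postgres' compressed custom format
pg_dump -Fc dbname > dbname.dump
# restore into an existing (empty) database on the target machine,
# dropping ownership info so it works under a different Postgres user
pg_restore --no-owner -d dbname dbname.dump
```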

How to connect with postgresql database after deployment to Heroku?

I'm new to Django but I'm deploying a Django-based website to Heroku, using Postgresql. The deployment was successful, and the website is online and has established connection with the database. However, none of the data from my local database has migrated to the heroku database, causing it to be a blank database. If I go into the admin section and manually input a datapoint, it appears on my site, so I know that database is correctly serving data. What is the proper way for migrating data from your local database to your online, heroku version of the database? I thought the following code would migrate the data:
heroku run python manage.py makemigrations
heroku run python manage.py migrate
But apparently I'm missing something.
makemigrations will create a migration that contains your schema changes, but no data. The migrate command applies the migrations to the database.
In order to provide data to be sent over as part of the migrate command, you need to either create a data migration or use a fixture.
Another option you have is to dump your local database and import it into Heroku Postgres.
All in all, it depends on how much local data you want copied over. If it's only a few rows, I would use either a data migration or a fixture; if it's 100s or 1000s of rows, an export/import of your dataset is your best bet.
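For the fixture route, the commands might look like this (a sketch; assumes data.json is committed and deployed with the app so the Heroku dyno can see it):

```shell
# locally: export your data to a fixture, skipping Django-internal tables
# that migrate recreates anyway
python manage.py dumpdata --exclude contenttypes --exclude auth.permission > data.json

# on Heroku: apply the schema, then load the fixture
heroku run python manage.py migrate
heroku run python manage.py loaddata data.json
```

For the dump/import route, Heroku's CLI can push an entire local Postgres database in one step with heroku pg:push local_db_name DATABASE_URL.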

Why is django manage.py syncdb failing to create new columns on my development server?

I am trying to create a development server from a production server from which I can test out new ideas.
I created a duplicate of my production server's database by dumping it using Postgres' db_dump and then imported the dump into a new database.
I then copied my production django directory and altered all .py files to refer to server_debug. rather than server in my import statements.
Using the admin interface to alter some data works in that only the development server has its data altered.
However, when I then try adding a new field in my models.py in my development server, manage.py syncdb fails to create it.
Is there something I am neglecting that could cause manage.py to refer to my production server rather than my development server?
syncdb doesn't touch tables that already exist. You need to either reset the app (easiest if you don't care about the data), modify the table manually (more of a quick hack), or use a migration app that versions your models, such as South.

How do I successfully integrate a second database with Django South?

One of our clients needs to add some geolocation data to their
site. Since they already have a database setup without GIS extensions,
I decided to create a new database (with the GIS extensions), which I
intend to use to store only the geolocation data.
I had, at some point, set things up to work alright on my development
machine (meaning, I have migrations for these new models). But now that the code has been written, I imported a DB dump
directly from the server so that my development machine exactly
mirrors the production machine, and now I can't seem to get South to
apply the migrations correctly. South seems to have several features
which allow for multiple databases, but none of them have worked so far.
What I've tried:
Just adding the model and migrating. This gives me the following
error:
AttributeError: 'DatabaseOperations' object has no attribute
'geo_db_type'
OK, so South is trying to create the model on the original database
which doesn't have the GIS extensions.
Adding the model, but specifying the 'geo' database for migrating
the 'geo' app. This gives me the following error:
django.db.utils.DatabaseError: relation "south_migrationhistory"
does not exist
I guess South expects its MigrationHistory table to exist on the 'geo'
database as well?
Allow south's models to exist on my 'geo' database.
$ python manage.py syncdb --database=geo
$ python manage.py migrate
This gives me the following error:
django.db.utils.DatabaseError: relation "<model>" already exists
I'm guessing this is because I already have MigrationHistories stored
in the other database?
South apparently has a sparsely documented feature called 'dbs'
(see:
http://south.aeracode.org/docs/databaseapi.html#accessing-the-api )
So I tried the previous three methods again replacing all instances of
"db" with "dbs['geo']".
a. Migrations run smoothly, but don't actually create any tables
in my 'geo' database.
b. Same error as when not using 'dbs' features.
c. Same error as when not using 'dbs' features.
This entire process has been extremely frustrating. Has anyone got
multiple database support up-and-running when using South?
Whenever I have modified the models, I used South and these commands to migrate the structure, and they always worked:
python manage.py convert_to_south "your_app"
python manage.py migrate "your_app"
I recommend running these commands after running syncdb, so your tables are created.
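One piece missing from the setups above is a database router, which tells Django (and therefore South) which database each app's models live on. A minimal sketch, assuming the second database is named "geo" in DATABASES and the geolocation app's label is also "geo" (both names are placeholders):

```python
class GeoRouter:
    """Route every model in the 'geo' app to the 'geo' database."""

    def db_for_read(self, model, **hints):
        # None means "no opinion"; Django falls through to the default DB
        return "geo" if model._meta.app_label == "geo" else None

    def db_for_write(self, model, **hints):
        return "geo" if model._meta.app_label == "geo" else None

    def allow_syncdb(self, db, model):
        # South-era hook (pre-Django 1.7): keep each app's tables on its
        # own database so syncdb --database=geo only creates 'geo' tables
        if db == "geo":
            return model._meta.app_label == "geo"
        return model._meta.app_label != "geo"
```

The router would be registered via DATABASE_ROUTERS in settings, after which commands like python manage.py syncdb --database=geo and python manage.py migrate geo --database=geo target the intended database instead of the default one.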