When I am trying to run python3 manage.py makemigrations it shows :
You are trying to add a non-nullable field 'topic_id' to journey_template without a default; we can't do that (the database needs something to populate existing rows).
Please select a fix:
1) Provide a one-off default now (will be set on all existing rows with a null value for this column)
2) Quit, and let me add a default in models.py
Select an option:
from django.db import models

class Topic_Table(models.Model):
    topic_id = models.IntegerField(primary_key=True)
    topic_name = models.CharField(max_length=100, blank=True)
    weightage = models.CharField(max_length=15, blank=True)
    grade = models.IntegerField(null=True)
    subject = models.CharField(max_length=100, blank=True)
    sequence = models.IntegerField(null=True)
    month = models.CharField(max_length=15, blank=True)

class Journey_template(models.Model):
    student_id = models.IntegerField(default=1)
    topic_id = models.ForeignKey('Topic_Table', on_delete=models.CASCADE)
    subtopic_id = models.IntegerField()
    journey_template_key = models.IntegerField(primary_key=True)
How would I fix this?
You are adding the topic_id field to the Journey_template model.
This model already has data in your database.
You have a few options:
1 - provide a default value (like entering the number 1)
2 - delete your database and start with a fresh migration
If your table already has data, adding a non-nullable column without any default would violate the NOT NULL constraint. Django's migration framework doesn't check the table for data, so it assumes there is some.
If your table has no rows, you can go ahead and give it any legal default value. The default does not change your model, it just gives Django a value it can fill in. Since there are no rows of data in the table, it will have no effect other than to make Django happy.
If your table has rows and you can think of a sensible value to populate the existing rows with, use it. Your other option is to change the model by adding null=True to the field; Django will then put NULL in that field for existing rows. (Later, you can fill in your own values with Django or other methods and change the field back to null=False if you like. You will get the same question when you migrate, but the answer will have no effect as long as none of the values are null.) null=False is the default for any field in your model.
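For the models in the question, a minimal sketch of both fixes (the default=1 assumes a Topic_Table row with topic_id=1 already exists; pick one option, not both):

# Option 1: give the field a default so existing Journey_template rows point at topic 1
topic_id = models.ForeignKey('Topic_Table', on_delete=models.CASCADE, default=1)

# Option 2: allow NULL so existing rows are simply left empty for now
topic_id = models.ForeignKey('Topic_Table', on_delete=models.CASCADE, null=True, blank=True)

Either change lets makemigrations proceed; with option 2 you can backfill the column later and switch back to null=False.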
I have the following existing twitter field on the extended UserProfile model, and I'd like to change the field type from a URLField to a CharField with a max_length of 20. When I try to migrate the change, I get the error django.db.utils.DataError: value too long for type character varying(20). I do not care about the existing data in that field and would prefer it to be blanked during the migration. How can I change the field type and clear the existing data?
class UserProfile(models.Model):
    user = models.OneToOneField(User, on_delete=models.SET_NULL, null=True)

    # old field
    twitter = models.URLField(verbose_name="Twitter", blank=True)

    # new field
    # twitter = models.CharField(
    #     max_length=20, verbose_name="Twitter Username", null=True, blank=True
    # )
Approach 1
To change the type and maximum length of a field, you should first know the length of the longest URL already stored in your database.
If you are using an SQLite database, run the following query in your dbshell, the sqlite3 shell, or whichever client you have.
SELECT MAX(LENGTH(column_name)) FROM table_name;
and then set the max_length of the new field to that value.
Approach 2
python manage.py shell
and then
# Import your model first (e.g. from <your_app>.models import UserProfile)
profiles = UserProfile.objects.all()
max_length = 0
for profile in profiles:
    if len(profile.twitter) > max_length:
        max_length = len(profile.twitter)
print(max_length)
and then set your max_length to the value printed from the above code.
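Then update the field definition with that number; for example, if the longest value turned out to be 60 characters (a placeholder, use whatever the query or loop above reports):

twitter = models.CharField(
    max_length=60, verbose_name="Twitter Username", null=True, blank=True
)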
This error means that there are values in that column that are longer than 20 characters.
You need to track them down and change them. The query below sets them to an empty string, so the existing data in those rows will be lost for good.
UPDATE <table_name>
SET twitter = ''
WHERE LENGTH(twitter) > 20;
Then you can run your migrations.
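If you would rather do the cleanup from the Django shell instead of raw SQL, a simple (if less efficient) sketch, assuming the UserProfile model from the question:

# Blank out any twitter value longer than 20 characters before migrating
for profile in UserProfile.objects.all():
    if profile.twitter and len(profile.twitter) > 20:
        profile.twitter = ""
        profile.save()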
I am preparing tables for use in an analytics tool (Microsoft Power BI).
It would be helpful to show relationships between the tables to assist in report building.
However, I don't want to use foreign keys with constraints because the method of data updating and integrity checking doesn't need constraints, and it will actually get in the way. Some of the alternate backends this code supports are for cloud databases that don't offer real foreign keys.
I mention that because it means I am trying to define a ForeignKey field like this:
order_guid = models.ForeignKey("Dear_sales_header",to_field="order_guid",db_constraint=False,on_delete=models.DO_NOTHING)
The migration file has this:
operations = [
    migrations.AlterField(
        model_name='sales_fact',
        name='order_guid',
        field=models.ForeignKey(db_constraint=False, on_delete=django.db.models.deletion.DO_NOTHING, to='dear_zoho_analytics.Dear_sales_header', to_field='order_guid'),
    ),
]
This table is routed to a different database
python manage.py migrate --database=dear_analytics
does indicate that the migration file was applied (it is 0026)
Applying dear_zoho_analytics.0026_auto_20210217_2245... OK
But when I inspect the postgresql schema in my IDE, the column in sales_fact is renamed to
order_guid_id
so it looks like I have done something wrong because this seems to reference the id field of the "parent table" dear_sales_header yet I need it to refer to dear_sales_header.order_guid which is unique but not the primary key.
Partial extract of Models:
class AnalyticsTable:
    # A dummy class used to mark the DearAnalytics tables. Creating a subclass
    # of models.Model breaks things in the ORM such as bulk update.
    pass

class Sales_fact(models.Model, AnalyticsTable):
    analytics_table_name = "sales_fact"

    # order_guid = models.CharField(max_length=1024, null=True, blank=True, help_text="")
    order_guid = models.ForeignKey("Dear_sales_header", to_field="order_guid", db_constraint=False, on_delete=models.DO_NOTHING)
    source_dear = models.CharField(max_length=1024, null=True, blank=True,
                                   help_text="Link to Dear instance which is the source of the data")

class Dear_sales_header(models.Model, AnalyticsTable):
    analytics_table_name = "dear_sales_header"
    source_dear = models.CharField(max_length=1024, null=True, blank=True,
                                   help_text="Link to Dear instance which is the source of the data")
    order_guid = models.CharField(max_length=1024, unique=True, help_text="")
    global_sale_status = models.CharField(max_length=1024, null=True, blank=True, help_text="SO Header status")
    order_status = models.CharField(max_length=1024, null=True, blank=True, help_text="")
Django adds the suffix _id to the ForeignKey's column name, as documented:
Behind the scenes, Django appends "_id" to the field name to create
its database column name.
If you want the column name not to include any additions, you should set db_column:
The name of the database column to use for this field. If this isn’t
given, Django will use the field’s name.
order_guid = models.ForeignKey(
    "Dear_sales_header", to_field="order_guid", db_constraint=False,
    db_column='order_guid', on_delete=models.DO_NOTHING
)
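After adding db_column and re-running makemigrations, the new AlterField operation should include it as well, roughly like this (a sketch modelled on the migration shown in the question; the migration number will differ):

migrations.AlterField(
    model_name='sales_fact',
    name='order_guid',
    field=models.ForeignKey(db_column='order_guid', db_constraint=False,
                            on_delete=django.db.models.deletion.DO_NOTHING,
                            to='dear_zoho_analytics.Dear_sales_header',
                            to_field='order_guid'),
),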
In a minimal reproducible example I am not able to reproduce any undesired behaviour:
Django properly maps the ForeignKey to the column you set, and when setting db_constraint=True it properly adds back the right constraint and drops it as the backward operation:
constraint testbench_sales_fact_order_guid_f2dab1c5_fk_testbench
references testbench_dear_sales_header (order_guid)
deferrable initially deferred
I have added a new field to my model, and after that I deleted db.sqlite3 (to make sure I don't get the error below).
agrawalo#:~/myapp> ls
README.md config core manage.py requirements.txt
But still I get this error when I run makemigrations
agrawalo#:~/myapp> ./manage.py makemigrations
You are trying to add a non-nullable field 'high52' to stock without a default; we can't do that (the database needs something to populate existing rows).
Please select a fix:
1) Provide a one-off default now (will be set on all existing rows with a null value for this column)
2) Quit, and let me add a default in models.py
class Stock(models.Model):
    name = models.CharField(max_length=255)
    code = models.CharField(max_length=20, db_index=True)
    price = models.DecimalField(max_digits=19, decimal_places=2)
    diff = models.DecimalField(max_digits=19, decimal_places=2)
    open_price = models.DecimalField(max_digits=19, decimal_places=2)
    previous_close = models.DecimalField(max_digits=19, decimal_places=2)
    low52 = models.DecimalField(max_digits=19, decimal_places=2)
    high52 = models.DecimalField(max_digits=19, decimal_places=2)
    last_updated = models.DateTimeField()

    objects = DataFrameManager()

    def save(self, *args, **kwargs):
        ''' On save, update timestamps '''
        self.last_updated = timezone.now()
        return super(Stock, self).save(*args, **kwargs)

    def __str__(self):
        return str(self.code)
low52 and high52 are the newly added fields. Please note that none of the other existing fields throws this error.
You can either provide a default value for the field
high52 = models.DecimalField(max_digits=19, decimal_places=2, default=0.0)
or you can make it optional
high52 = models.DecimalField(max_digits=19, decimal_places=2, null=True, blank=True)
You can decide based on what makes sense for your use case.
To answer your question about the error: the previously existing fields were most likely created in the initial migration itself, so they don't need a default value. But for a newly added mandatory field, you need a default value, and that default will be used to populate the existing records. This doesn't depend on whether you have deleted the existing database; it depends on the current state of migrations for that model. Since this is not the initial migration, you will need to provide a default value or make the field optional.
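If you go with the default, the AddField operation that makemigrations generates would look roughly like this (a sketch; the kwarg order and migration name may differ):

migrations.AddField(
    model_name='stock',
    name='high52',
    field=models.DecimalField(decimal_places=2, default=0.0, max_digits=19),
),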
It doesn't matter if you deleted the database file or not. makemigrations does not check the database.
You can only add a non-nullable field without a default when the model itself is new and the field is part of its initial migration. This is because, after you make that initial migration, Django has no way of knowing whether you deployed your application somewhere else, so it has no way of knowing whether there are instances of the model out there. A situation where this would go wrong:
Create a model X and makemigrations on your local machine.
Deploy your Django application to a server, where the database is populated with instances of model X.
Delete your local database, add non-nullable field Y to model X, makemigrations.
Deploy your Django application to the server.
Problems occur.
The solution here is to either:
Set the Field to null=True
Add a default to the model.
Provide a default when making the migrations.
In your situation, I would say it is ok to provide a one-off default, because it sounds like you have no populated database yet.
You need to provide blank=True and null=True for the high52 field:
high52 = models.SomeField(blank=True,null=True)
If you don't want that, then you can select either of the two options offered.
For example, if high52 were a CharField, you could choose option 1 and provide some value like '..', or you could set a default in your models.py.
I have a Post model for a blog:
class Post(models.Model):
    author = models.ForeignKey(User,
                               related_name="blog_posts",
                               on_delete=models.CASCADE)
    ........
    ........
And I want to add a Tag to the project, so I made a Tag model:
class Tag(models.Model):
    tag_name = models.CharField(max_length=20,
                                blank=True)

    def __str__(self):
        return self.tag_name
I added this to the Post model:
tag = models.ForeignKey(Tag, related_name="blog_tag",
                        on_delete=models.CASCADE)
But makemigrations gives me an error:
You are trying to add a non-nullable field 'tag' to post without a default; we can't do that (the database needs something to populate existing rows).
Please select a fix:
1) Provide a one-off default now (will be set on all existing rows with a null value for this column)
2) Quit, and let me add a default in models.py
So why do I have to populate existing rows? And how can I leave the rows empty?
P.S. If I choose 1), I still get an error.
You already have data in your database, and now you are adding a new column tag to your table that is NOT NULL.
So Django is saying that either you provide a one-off default for the previously existing rows, or you make this new column (tag) nullable (by adding null=True to the field's arguments) so that already existing rows get NULL in it.
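A minimal sketch of the nullable option for the Post model from the question (other field arguments kept as in the original):

class Post(models.Model):
    author = models.ForeignKey(User,
                               related_name="blog_posts",
                               on_delete=models.CASCADE)
    # null=True lets existing posts keep an empty tag, so makemigrations stops asking for a default
    tag = models.ForeignKey(Tag, related_name="blog_tag",
                            on_delete=models.CASCADE,
                            null=True, blank=True)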
I changed my model from this:
class DistList(models.Model):
    creator = models.ForeignKey(User, related_name='creator')
    created_date = models.DateTimeField(auto_now_add=True)
    name = models.CharField(max_length=200, unique=True)
    description = models.TextField(blank=True, null=True)
    company = models.ForeignKey(Company, blank=True, null=True)
To this:
class DistList(models.Model):
    creator = models.ForeignKey(User, related_name='creator')
    created_date = models.DateTimeField(auto_now_add=True)
    name = models.CharField(max_length=200, unique=True)
    description = models.TextField(blank=True, null=True)
    company = models.ForeignKey(Company)
The only change was turning the company FK relationship from not required to required.
When I run the migration I specify a one off value that corresponds to the pk of the first company.
./manage.py schemamigration distlist --auto
? The field 'DistList.company' does not have a default specified, yet is NOT NULL.
? Since you are making this field non-nullable, you MUST specify a default
? value to use for existing rows. Would you like to:
? 1. Quit now, and add a default to the field in models.py
? 2. Specify a one-off value to use for existing columns now
? Please select a choice: 2
? Please enter Python code for your one-off default value.
? The datetime module is available, so you can do e.g. datetime.date.today()
>>> 1
But when I run the migration I get an error because it has a pending trigger event?
./manage.py migrate distlist
Running migrations for distlist:
- Migrating forwards to 0005_auto__chg_field_distlist_company.
> distlist:0005_auto__chg_field_distlist_company
FATAL ERROR - The following SQL query failed: ALTER TABLE "distlist_distlist" ALTER COLUMN "company_id" SET NOT NULL;
The error was: cannot ALTER TABLE "distlist_distlist" because it has pending trigger events
I'm not doing anything that seems weird from my point of view, so I don't understand this error at all. Can anyone offer insight? I can post the full stack trace if it'll help, but I feel like there's something obvious about South and PostgreSQL that perhaps I'm missing?
So I believe I've found the answer. I think PostgreSQL doesn't like altering schemas and adding data at the same time. I first created a data migration:
./manage.py datamigration distlist add_default_values_to_existing_companies
Then I added this to the forwards method:
def forwards(self, orm):
    "Write your forwards methods here."
    for distlist in orm['distlist.Distlist'].objects.all():
        distlist.company = orm['userprofile.Company'].objects.get(id=1)
        distlist.save()
Then I altered the model to remove the blank and null from company.
Then I ran the schema migration and chose to specify a one off for the value as 1 (as I did in the question).
Then I edited that migration file thusly:
def forwards(self, orm):
    # Changing field 'DistList.company'
    # db.alter_column(u'distlist_distlist', 'company_id', self.gf('django.db.models.fields.related.ForeignKey')(default=1, to=orm['userprofile.Company']))
    db.alter_column(u'distlist_distlist', 'company_id', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['userprofile.Company']))
I just commented out the generated line and removed the default=1 arg.
I don't know... maybe this isn't right but it seemed to work. Hopefully this will help someone.