I have a model similar to this:
class Tree(models.Model):
    description = models.CharField(max_length=255, null=False, blank=False)
    parent = models.ForeignKey("Tree", null=True, blank=True, on_delete=models.CASCADE)

    class Meta:
        ordering = ['description', '-id']
I need to find the latest record for each parent.
I tried with this:
latests = Tree.objects.values("parent").annotate(last = Max("pk"))
The result is not correct because the SQL query is:
SELECT parent_id, MAX(id) AS last FROM tree GROUP BY id;
The ORM seems to follow the foreign key to its target and group by the related model's primary key instead of by the value stored in the parent_id column. Is there a way not to "follow" the foreign key and to group by the column's value instead?
In the PostgreSQL database, the model generated the table named tree with three columns:
Column | Type
------------+-----------------------
id | integer
description | character varying(255)
parent_id | integer
With this data:
id | description | parent_id
----+-------------+----------
1 | A | 1
2 | B | 2
3 | C | 1
4 | D | 1
5 | E | 2
I want this result:
last | parent_id
-----+----------
5 | 2
4 | 1
I can do this simply in SQL with:
select max(id) as last, parent_id from tree group by parent_id
Finally I found a possible workaround: I deleted the ordering in the Meta class, and the result is the expected one.
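The target aggregation can be sanity-checked outside the ORM; here is a minimal sqlite3 sketch using the sample rows from the question:

```python
import sqlite3

# Rebuild the question's sample data in an in-memory database.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE tree (id INTEGER PRIMARY KEY, description TEXT, parent_id INTEGER)"
)
con.executemany(
    "INSERT INTO tree (id, description, parent_id) VALUES (?, ?, ?)",
    [(1, "A", 1), (2, "B", 2), (3, "C", 1), (4, "D", 1), (5, "E", 2)],
)

# The intended query: latest id per parent.
rows = con.execute(
    "SELECT MAX(id) AS last, parent_id FROM tree GROUP BY parent_id ORDER BY parent_id"
).fetchall()
print(rows)  # [(4, 1), (5, 2)]
```

In the ORM itself, an alternative to deleting the Meta ordering is clearing it per query with an empty order_by(), e.g. Tree.objects.values("parent").annotate(last=Max("pk")).order_by(); Django's aggregation documentation warns that a default ordering otherwise adds its columns to the GROUP BY clause.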
After reading "django.db.utils.OperationalError: 3780 Referencing column and referenced column are incompatible" and "SQLSTATE[HY000]: General error: 3780 Referencing column 'user_id' and referenced column 'id' in foreign key are incompatible":
I have two models in Django 3.2.16, declared in different apps; it's a long-running project which started on Django 2.2 and was upgraded over time.
Here are the two model classes:
From city_search.models:

class Città(models.Model):
    nome = models.CharField(max_length=50, db_index=True)
    provincia = models.ForeignKey(Provincia, models.SET_NULL, blank=True, null=True)
    capoluogo = models.BooleanField(default=False)
    regione = models.ForeignKey(Regione, models.SET_NULL, blank=True, null=True, related_name='comuni')
    slug = models.SlugField(null=True)
    latlng = LocationField(null=True, map_attrs={"center": [2.149123103826298, 41.39496092463892], "zoom": 10})

    def __str__(self):
        return "{} - {}".format(self.nome, self.regione)
From eventos.models, which I'm developing, instead:

from schedule.models import events

class Manifestazione(events.Event):
    ciudad = models.ForeignKey('city_search.Città', on_delete=models.CASCADE, verbose_name='Ciudad', related_name='manifestaciones', blank=False, null=False)
The migration of the latter model fails with the following error:
django.db.utils.OperationalError: (3780, "Referencing column 'ciudad_id' and referenced column 'id' in foreign key constraint 'eventos_manifestazio_ciudad_id_74f49286_fk_city_sear' are incompatible.")
These two model declarations translate to the following MySQL tables (the second is only partially created by the faulty migration):
mysql> describe city_search_città;
+--------------+-------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+--------------+-------------+------+-----+---------+----------------+
| id | int | NO | PRI | NULL | auto_increment |
| nome | varchar(50) | NO | MUL | NULL | |
| provincia_id | int | YES | MUL | NULL | |
| capoluogo | tinyint(1) | NO | | NULL | |
| regione_id | int | YES | MUL | NULL | |
| slug | varchar(50) | YES | MUL | NULL | |
| latlng | varchar(63) | YES | MUL | NULL | |
+--------------+-------------+------+-----+---------+----------------+
7 rows in set (0.00 sec)
and
mysql> describe eventos_manifestazione;
+--------------+--------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+--------------+--------+------+-----+---------+-------+
| event_ptr_id | int | NO | PRI | NULL | |
| ciudad_id | bigint | NO | | NULL | |
+--------------+--------+------+-----+---------+-------+
2 rows in set (0.00 sec)
Now I absolutely understand that int and bigint are very different things. However, I have already tried setting DEFAULT_AUTO_FIELD = 'django.db.models.AutoField' in settings.py before making the migrations and then migrating, to no avail.
According to the Django docs on automatic primary key fields I could also use the AppConfig, so I also edited eventos/apps.py like this and migrated again:

class EventosConfig(AppConfig):
    default_auto_field = 'django.db.models.AutoField'
    name = 'eventos'
which didn't work either. I still get the same table schemas as above.
This is the migration that such settings generated.
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        ('city_search', '__first__'),
        ('schedule', '0014_use_autofields_for_pk'),
    ]

    operations = [
        migrations.CreateModel(
            name='Manifestazione',
            fields=[
                ('event_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='schedule.event')),
                ('ciudad', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='manifestaciones', to='city_search.città', verbose_name='Ciudad')),
            ],
            bases=('schedule.event',),
        ),
    ]
As I also read about a possible collation issue I will compare the table collations below:
mysql> select column_name, COLLATION_NAME, CHARACTER_SET_NAME from information_schema.`COLUMNS` where table_name = "city_search_città";
+--------------+-------------------+--------------------+
| COLUMN_NAME | COLLATION_NAME | CHARACTER_SET_NAME |
+--------------+-------------------+--------------------+
| id | NULL | NULL |
| nome | latin1_swedish_ci | latin1 |
| provincia_id | NULL | NULL |
| capoluogo | NULL | NULL |
| regione_id | NULL | NULL |
| slug | latin1_swedish_ci | latin1 |
| latlng | latin1_swedish_ci | latin1 |
+--------------+-------------------+--------------------+
7 rows in set (0.00 sec)
and
mysql> select column_name, COLLATION_NAME, CHARACTER_SET_NAME from information_schema.`COLUMNS` where table_name = "eventos_manifestazione";
+--------------+----------------+--------------------+
| COLUMN_NAME | COLLATION_NAME | CHARACTER_SET_NAME |
+--------------+----------------+--------------------+
| event_ptr_id | NULL | NULL |
| ciudad_id | NULL | NULL |
+--------------+----------------+--------------------+
2 rows in set (0.01 sec)
My hypothesis
Is it more likely that the int/bigint difference is to blame, and that, due to some bug or other weird reason, the migrations are ignoring my preference to switch back to int for the automatic primary key fields? Does it have to do with some difference in how these settings apply to foreign keys? Or do you believe it has more to do with the collation difference? If so, are there differences that I may not be able to see?
Either way, I'm unable to complete a proper migration.
It seems the error text was pointing at the wrong table. It all worked after adding DEFAULT_AUTO_FIELD = 'django.db.models.AutoField' to settings.py and making new project-wide migrations, which included some tweaks to a third app's model fields, where models.AutoField was indeed applied to the id fields/columns.
That third app and the one I'm developing are not connected directly through a ForeignKey, so I don't understand where the relation is, although the error appeared after I created the models mentioned in this thread. Honestly, I'm not sure why this worked or what those third models have to do with the new app I'm adding to the project. I would really like to know if somebody understands more.
I use Doctrine 2.5 and I'm struggling to build a multiple-count request with the QueryBuilder.
(MDD: model diagram omitted.)
As it shows, the AbstractArticle entity has a ManyToOne relationship with Tag, named mainTag, and a ManyToMany relationship with the same entity, named tags.
What I want to do
I want to make a request that, from a list of tag ids, counts per tag the number of AbstractArticles that are main-tagged AND the number that are default-tagged.
Here is the kind of result I want:
+--------+-------------------------+----------------------------+
| TagId | mainTaggedArticleCount | defaultTaggedArticleCount |
+--------+-------------------------+----------------------------+
| 1 | 2 | 0 |
| 2 | 0 | 5 |
| 3 | 2 | 2 |
+--------+-------------------------+----------------------------+
My current attempts
I did it successfully with the following MySQL request, which returns exactly what I want:
SELECT
    tag.id AS tagId,
    (SELECT COUNT(DISTINCT aaMain.id)) AS mainTaggedArticleCount,
    (SELECT COUNT(DISTINCT aaDefault.id)) AS defaultTaggedArticleCount
FROM tag tag
/* Left join on the ManyToOne named `mainTag` */
LEFT JOIN abstract_article aaMain ON aaMain.main_tag_id = tag.id
/* Left join on the ManyToMany named `tags`, via the junction table */
LEFT JOIN abstract_article_tag aat ON aat.tag_id = tag.id
LEFT JOIN abstract_article aaDefault ON aaDefault.id = aat.abstract_article_id
WHERE tag.id IN (3, 1, 5, 6) /* my list of tag ids */
GROUP BY tag.id
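The join/group-by shape above can be verified outside MySQL with a sqlite3 sketch; the scalar (SELECT COUNT(...)) wrappers reduce to plain COUNT(DISTINCT ...) expressions, and the schema and data below are made up to mimic the question's tables:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE tag (id INTEGER PRIMARY KEY);
CREATE TABLE abstract_article (id INTEGER PRIMARY KEY, main_tag_id INTEGER);
CREATE TABLE abstract_article_tag (abstract_article_id INTEGER, tag_id INTEGER);
-- toy data: tag 1 is the main tag of articles 10 and 11,
-- tag 2 is a default tag of article 10
INSERT INTO tag VALUES (1), (2);
INSERT INTO abstract_article VALUES (10, 1), (11, 1);
INSERT INTO abstract_article_tag VALUES (10, 2);
""")

# Same joins as the question's SQL; COUNT(DISTINCT ...) keeps the two
# counts independent despite the combined joins multiplying rows.
rows = con.execute("""
    SELECT tag.id,
           COUNT(DISTINCT aaMain.id)    AS mainTaggedArticleCount,
           COUNT(DISTINCT aaDefault.id) AS defaultTaggedArticleCount
    FROM tag
    LEFT JOIN abstract_article aaMain ON aaMain.main_tag_id = tag.id
    LEFT JOIN abstract_article_tag aat ON aat.tag_id = tag.id
    LEFT JOIN abstract_article aaDefault ON aaDefault.id = aat.abstract_article_id
    WHERE tag.id IN (1, 2)
    GROUP BY tag.id
    ORDER BY tag.id
""").fetchall()
print(rows)  # [(1, 2, 0), (2, 0, 1)]
```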
But with Doctrine it's far more complicated. I wrote the left join for the ManyToOne relationship like this:
$qb->leftJoin(AbstractArticle::class,'mainTaggedArticle',Join::WITH,'mainTaggedArticle.mainTag = t.id')
But it doesn't work for the ManyToMany, because the junction table abstract_article_tag is invisible through Doctrine.
Any ideas for me?
Thanks in advance :)
I did it!
With two subqueries; here is my solution, inside my TagRepository:
$repoAbstractArticle = $this->getEntityManager()->getRepository("AbstractArticle");

// Subquery for main-tagged articles
$countMainTaggedArticleSubQuery = $repoAbstractArticle->createQueryBuilder("abstract_article");
$countMainTaggedArticleSubQuery->select('COUNT(DISTINCT abstract_article.id)')
    ->leftJoin('abstract_article.mainTag', 'mainTag')
    ->andWhere($countMainTaggedArticleSubQuery->expr()->eq('mainTag.id', 'tag.id'));

// Subquery for default-tagged articles
$countDefaultTaggedAbstractArticleSubQuery = $repoAbstractArticle->createQueryBuilder("default_tagged_abstract_article");
$countDefaultTaggedAbstractArticleSubQuery->select('COUNT(DISTINCT default_tagged_abstract_article.id)')
    ->leftJoin('default_tagged_abstract_article.tags', 'tags')
    ->andWhere($countDefaultTaggedAbstractArticleSubQuery->expr()->eq('tags.id', 'tag.id'));

// Main request (concatenating a QueryBuilder casts it to its DQL)
$qb = $this->createQueryBuilder("tag");
$qb->select('
        tag.id AS tagId,
        (' . $countMainTaggedArticleSubQuery . ') AS mainTaggedArticleCount,
        (' . $countDefaultTaggedAbstractArticleSubQuery . ') AS defaultTaggedArticleCount'
    )
    ->groupBy('tag.id')
    ->andWhere($qb->expr()->in('tag.id', ':tagIds'))
    ->setParameter('tagIds', $tagIds);

return $qb->getQuery()->getResult();
I’m using Rails 4.2.3. I have the following model:

class MyObjectTime < ActiveRecord::Base
  belongs_to :my_object
  has_one :state
  has_one :country
end
I’m having an issue when I try to save my entity with non-empty state and country objects. I have:

state = State.find_by_iso_and_country_id(hometown_parts[1].strip!, us_country.id)
country = state.country
my_object_time = MyObjectTime.new(:name => my_object_time[2],
                                  :age => my_object_time[7],
                                  :time_in_ms => time_in_ms,
                                  :overall_rank => my_object_time[1],
                                  :city => city,
                                  :state => state,
                                  :country => country,
                                  :gender_rank => my_object_time[7],
                                  :my_object_id => my_object_id)
but unless both "state" and "country" are nil, I get the error:
ActiveModel::MissingAttributeError
can't write unknown attribute `my_object_time_id`
Everything saves fine if both of these fields are nil. What's going on, and how can I fix it?
Edit: Here is the Postgres description of the table in question:
davedb=> \d my_object_times;
Table "public.my_object_times"
Column | Type | Modifiers
----------------+-----------------------------+---------------------------------------------------------
id | integer | not null default nextval('my_object_times_id_seq'::regclass)
my_object_id | integer | not null
first_name | character varying |
last_name | character varying |
time_in_ms | bigint |
created_at | timestamp without time zone | not null
updated_at | timestamp without time zone | not null
name | character varying |
age | integer |
city | character varying |
state_id | integer |
country_id | integer |
overall_rank | integer |
age_group_rank | integer |
gender_rank | integer |
Indexes:
"my_object_times_pkey" PRIMARY KEY, btree (id)
"index_my_object_times_on_country_id" btree (country_id)
"index_my_object_times_on_my_object_id" btree (my_object_id)
"index_my_object_times_on_state_id" btree (state_id)
Foreign-key constraints:
"fk_rails_0fe1d25967" FOREIGN KEY (country_id) REFERENCES countries(id)
"fk_rails_a8771b3575" FOREIGN KEY (state_id) REFERENCES states(id)
"fk_rails_ba656ceafa" FOREIGN KEY (my_object_id) REFERENCES my_bojects(id) ON DELETE CASCADE
Referenced by:
TABLE "user_my_object_time_matches" CONSTRAINT "fk_rails_2e7860946c" FOREIGN KEY (my_object_time_id) REFERENCES my_object_times(id)
I am using Django with Postgres and am trying to load data from a csv file into a table. However, since one field is a geometry field, I have to leave it blank when I load the table (otherwise \copy from will fail).
Here's my model:
name = models.CharField(max_length=200)
lat = models.DecimalField(max_digits=6, decimal_places=4, default=Decimal('0.0000'))
lon = models.DecimalField(max_digits=7,decimal_places=4, default=Decimal('0.0000'))
geom = models.PointField(srid=4326, blank=True, null=True, default=None)
and after the migration, I ran this in psql:
mydb=>\copy target(name,lat,lon) from 'file.csv' DELIMITER ',' CSV HEADER ;
and I get error like this:
ERROR: null value in column "geom" violates not-null constraint
DETAIL: Failing row contains (name1, 30.4704, -97.8038, null).
CONTEXT: COPY target, line 2: "name1, 30.4704, -97.8038"
and here's the portion of csv file:
name,lat,lon
name1,30.4704,-97.8038
name2,30.3883,-97.7386
and here's the \d+ target:
mydb=> \d+ target
Table "public.target"
Column | Type | Modifiers | Storage | Stats target | Description
------------+------------------------+-----------+----------+--------------+-------------
name | character varying(200) | not null | extended | |
lat | numeric(6,4) | not null | main | |
lon | numeric(7,4) | not null | main | |
geom | geometry(Point,4326) | | main | |
Indexes:
"target_pkey" PRIMARY KEY, btree (name)
"target_geom_id" gist (geom)
So I guess geom is set to null when loading the csv into the table? How can I fix it? I want the default of the geom field to be null so that I can update it with another query later.
Thanks very much!!
A couple of different options, described here. You can either:
1) use a placeholder (0,0), which you can then overwrite,
or:
2) set spatial_index=False.
I would like to save (US) phone numbers in the database via Django. I have:

from django.db import models

class Number(models.Model):
    phone_number = models.CharField("Phone Number", max_length=10, unique=True)
When I ran:
python manage.py sql myapp
I got
BEGIN;
CREATE TABLE `demo_number` (
`id` integer AUTO_INCREMENT NOT NULL PRIMARY KEY,
`phone_number` varchar(10) NOT NULL UNIQUE
)
;
When I validated it, there were no errors:
python manage.py validate
0 errors found
So I did
python manage.py syncdb
In MySQL console, I see:
mysql> select * from myapp_number;
Empty set (0.00 sec)
mysql> describe myapp_number;
+--------------+-------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+--------------+-------------+------+-----+---------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| phone_number | varchar(10) | NO | UNI | NULL | |
+--------------+-------------+------+-----+---------+----------------+
2 rows in set (0.03 sec)
Then, in python manage.py shell, I did:
from demo.models import Message, Number, Relationship, SmsLog
n=Number(phone_number='1234567890')
n.save()
When I check again in MySQL console, I see:
mysql> select * from myapp_number;
+------------+--------------+
| id | phone_number |
+------------+--------------+
| 2147483647 | 1234567890 |
+------------+--------------+
1 row in set (0.01 sec)
Why is the id such a big number? In fact, because of that I cannot insert phone numbers anymore. For example, in python manage.py shell:
n=Number(phone_number='0987654321')
n.save()
IntegrityError: (1062, "Duplicate entry '2147483647' for key 'PRIMARY'")
I am new to Django (using Django 1.5 and MySQL server version 5.1.63). If someone could point out the obvious mistake I'm making, I would very much appreciate it. On a side note, if I would like to extend the max_length of the CharField to 15, what is the simplest and cleanest way (that is, one not screwing up the existing setup) to accomplish that? Thank you.
I can't see any mistakes in your code. If you don't have any data in the table, I would try dropping the demo_number table and running syncdb again.
If you don't have any data in the table, the easiest way to change to max length 15 is to change the model, drop the table in the db shell, then run syncdb again.
If you do have data in the table, you can change the model and then update the column in the db. In your case (MySQL-specific):
ALTER TABLE demo_number CHANGE phone_number phone_number varchar(15) NOT NULL UNIQUE;
For more complex migrations in Django, use South.
Turns out @Alasdair is right. I had to reset the app. In case anyone is wondering how to do it (I searched StackOverflow, but might as well post it here since it's relevant), this answer https://stackoverflow.com/a/15444900/1330974 works for Django > 1.3.
My follow-up question is whether the AutoField id is incremented in case of error. For example, I did this in the shell:
from demo.models import Number
n=Number(phone_number='1234567890')
n.save()
n=Number(phone_number='1234567890')
n.save()
I got an IntegrityError, as expected. So I tried a new number:
n=Number(phone_number='0987654321')
n.save()
Now when I check MySQL console, I see:
mysql> select * from demo_number;
+----+--------------+
| id | phone_number |
+----+--------------+
| 3 | 0987654321 |
| 1 | 1234567890 |
+----+--------------+
2 rows in set (0.00 sec)
Is it normal for Django to skip an id in an AutoField when there is an error? Thank you.
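The gap is expected with MySQL/InnoDB: the auto-increment counter is advanced when an insert is attempted and is not rolled back when the statement fails, so the duplicate insert consumes id 2. A toy Python sketch of that bookkeeping (a simulation for illustration, not MySQL itself):

```python
class AutoIncrementTable:
    """Toy model of an InnoDB-style auto-increment id with a UNIQUE column."""

    def __init__(self):
        self.counter = 0   # the next id handed out is counter + 1
        self.rows = {}     # id -> phone_number
        self.unique = set()  # enforces UNIQUE(phone_number)

    def insert(self, phone_number):
        # The counter is consumed *before* constraints are checked,
        # mirroring how InnoDB hands out auto-increment values.
        self.counter += 1
        if phone_number in self.unique:
            raise ValueError("Duplicate entry %r" % phone_number)  # id is lost
        self.rows[self.counter] = phone_number
        self.unique.add(phone_number)
        return self.counter

t = AutoIncrementTable()
t.insert("1234567890")         # gets id 1
try:
    t.insert("1234567890")     # consumes id 2, then fails on UNIQUE
except ValueError:
    pass
print(t.insert("0987654321"))  # prints 3, matching the question's table
```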