I'm having trouble getting only a single record per foreign key ID on the model below. I have tried the query underneath, but it doesn't seem to be doing anything at all.
I have used a raw query, which works, but I can't apply a filter to it. I've also created a list and removed duplicates from the QuerySet, but again I can't filter it because it's a list.
Query:
queryset = BGPData.objects.annotate(max_timestamp=Max('timestamp')).filter(timestamp=F('max_timestamp')).select_related(
'device_circuit_subnet__subnet',
'device_circuit_subnet__device',
'device_circuit_subnet__circuit',
'device_circuit_subnet__device__site',
)
Model:
class BGPData(models.Model):
device_circuit_subnet = models.ForeignKey(DeviceCircuitSubnets, verbose_name="Device", on_delete=models.CASCADE)
bgp_peer_as = models.CharField(max_length=20, verbose_name='BGP Peer AS', blank=True, null=True)
bgp_session = models.CharField(max_length=10, verbose_name='BGP Session', blank=True, null=True)
bgp_routes = models.CharField(max_length=10, verbose_name='BGP Routes Received', blank=True, null=True)
service_status = models.CharField(max_length=10, verbose_name='Service Status', blank=True, null=True)
timestamp = models.DateTimeField(auto_now=True, blank=True, null=True)
Sample data I'm testing with (printed as a dict). There should only be one record for "device_circuit_subnet_id": "10", the newest one.
I would like the latest record per device_circuit_subnet_id, so the query should return 3 results instead of 4, as there are 2 items with the same device_circuit_subnet_id.
I've read that distinct is used for this, but we're running MySQL. Is there another way?
Thanks
[{
"id": 4,
"device_circuit_subnet_id" : "10",
"hostname": "EDGE",
"circuit_name": "MPLS",
"subnet": "172.1.1.1",
"subnet_mask": "/30",
"bgp_session": "1w2d",
"bgp_routes": "377",
"bgp_peer_as": "1",
"service_status": "Up",
"timestamp": "2019-11-18 16:16:17"
},
{
"id": 5,
"device_circuit_subnet_id" : "11",
"hostname": "INT-GW",
"subnet": "1.1.1.1",
"subnet_mask": "/24",
"bgp_session": null,
"bgp_routes": null,
"bgp_peer_as": null,
"service_status": "unknown",
"timestamp": "2019-08-07 14:46:00"
},
{
"id": 8,
"hostname": "EDGE",
"device_circuit_subnet_id" : "20",
"circuit_name": "MPLS 02",
"subnet": "172.2.1.1",
"subnet_mask": "/30",
"bgp_session": null,
"bgp_routes": null,
"bgp_peer_as": null,
"service_status": "unknown",
"timestamp": "2019-11-15 16:18:30"
},
{
"id": 9,
"hostname": "EDGE",
"device_circuit_subnet_id" : "10",
"circuit_name": "MPLS",
"subnet": "172.1.1.1",
"subnet_mask": "/30",
"bgp_session": "1w3d",
"bgp_routes": "385",
"bgp_peer_as": "1",
"service_status": "Up",
"timestamp": "2019-11-18 16:16:44"
}
]
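To pin down the expected result: keeping only the newest row per device_circuit_subnet_id on the sample data above leaves three rows. A plain-Python sketch of that dedup (just the intended semantics, not the ORM query):

```python
from datetime import datetime

# Trimmed copies of the sample rows above: (id, device_circuit_subnet_id, timestamp)
rows = [
    (4, "10", "2019-11-18 16:16:17"),
    (5, "11", "2019-08-07 14:46:00"),
    (8, "20", "2019-11-15 16:18:30"),
    (9, "10", "2019-11-18 16:16:44"),
]

latest = {}  # device_circuit_subnet_id -> (timestamp, id)
for pk, dcs_id, ts in rows:
    stamp = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
    if dcs_id not in latest or stamp > latest[dcs_id][0]:
        latest[dcs_id] = (stamp, pk)

kept_ids = sorted(pk for _, pk in latest.values())
print(kept_ids)  # ids of the 3 surviving rows
```

For "10", id 9 wins because its timestamp (16:16:44) is newer than id 4's (16:16:17).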
Have you tried this?
from django.db.models import Max, F
max_timestamp = Max('device_circuit_subnet__bgpdata__timestamp')
result = BGPData.objects.annotate(ts=max_timestamp).filter(timestamp=F('ts')).select_related(...)
I'm not sure about the performance of this query, but it should work :)
Django querysets are evaluated lazily, so indexing with [0] fetches only one record from the database (a LIMIT 1 query). The - prefix orders timestamp in descending order, so the record with the latest timestamp comes first.
queryset = BGPData.objects.all().order_by(
#prefix field name to order by with `-` to use Descending order
'-timestamp'
).select_related(
'device_circuit_subnet__subnet',
'device_circuit_subnet__device',
'device_circuit_subnet__circuit',
'device_circuit_subnet__device__site',
)[0]
I'm trying to dump my postgres data and load it locally, but I'm ending up with an error
These are my two models:
class User(AbstractUser):
pass
class Profile(models.Model):
user = models.OneToOneField(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
primary_key=True
)
dob = models.DateField(null=True, blank=True)
This is the code I'm executing for the data dump:
call_command('dumpdata', indent=4, exclude=['contenttypes', 'auth.Permission', 'sessions', 'admin.logentry'])
However, when trying to load it, I get the following error:
django.db.utils.IntegrityError: Problem installing fixtures: insert or update on table "profiles_profile" violates foreign key constraint "profiles_profile_user_id_a3e81f91_fk_accounts_user_id"
DETAIL: Key (user_id)=(1) is not present in table "accounts_user".
What I noticed when trying to read the fixture is that there are no pks for the user, and no references from the profile:
{
"model": "accounts.user",
"fields": {
"password": "xyz",
"last_login": "2022-11-27T17:28:45.854Z",
"is_superuser": true,
"username": "JohnDoe",
"first_name": "John",
"last_name": "Doe",
"is_staff": true,
"is_active": true,
"date_joined": "2020-09-02T16:28:13.329Z",
"groups": [],
"user_permissions": []
}
},
{
"model": "profiles.profile",
"pk": 1,
"fields": {
"dob": "1980-06-20",
}
},
Is that normal? Note that I also tried passing the natural_foreign and natural_primary arguments to call_command as well, without any effect.
What am I doing wrong?
class Order(models.Model):
product = models.ForeignKey(Product, on_delete=models.CASCADE)
category = models.ForeignKey(
Category, null=True, on_delete=models.SET_NULL
)
user = models.ForeignKey(User, null=True, on_delete=models.SET_NULL)
placed = models.DateTimeField(auto_now=True)
shipped = models.DateTimeField(null=True)
delivered = models.DateTimeField(null=True)
I want to calculate statistics on how fast the order has been processed for each category, where process time is delivered - shipped.
As a result, I want to achieve something like this:
[
{
"category": <category 1>
"processed_time": <average processed time in seconds>
},
{
"category": <category 2>
"processed_time": <average processed time in seconds>
},
{
"category": <category 3>
"processed_time": <average processed time in seconds>
},
]
I can calculate this outside of the ORM but I'd like to achieve this somehow with annotation/aggregation
delivered = delivered_qs.annotate(first_processed=Min("delivered"), last_processed=Max("delivered")) \
.aggregate(processed_time=F("last_processed")-F("first_processed"))
This queryset returns a single time across all categories, and I don't know how to retrieve the time for each individual category.
You want to do a group by, which in Django works a bit unusually; for more information see the documentation.
By first calling .values('category') you tell the queryset to group by category. Then you determine the min, the max, and the difference.
delivered = (
delivered_qs
.values('category')
.annotate(
first_processed=Min("delivered"),
last_processed=Max("delivered"),
processed_time=F("last_processed") - F("first_processed"),
)
)
Which, in my expectation, would return (first_processed and last_processed as datetimes, processed_time as the timedelta between them):
[{
"category": 1,
"first_processed": datetime(...),
"last_processed": datetime(...),
"processed_time": timedelta(...)
}, ...]
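The grouping that .values('category') plus annotate() performs can be mimicked in plain Python. A sketch on made-up delivered timestamps (hypothetical data: three orders in two categories):

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical (category_id, delivered) pairs
orders = [
    (1, datetime(2023, 1, 1, 10, 0)),
    (1, datetime(2023, 1, 1, 12, 30)),
    (2, datetime(2023, 1, 2, 9, 0)),
]

by_category = defaultdict(list)
for category, delivered in orders:
    by_category[category].append(delivered)

# Per group: Min, Max, and their difference, as in the annotate() call above
stats = {
    category: {
        "first_processed": min(ts),
        "last_processed": max(ts),
        "processed_time": max(ts) - min(ts),
    }
    for category, ts in by_category.items()
}
print(stats[1]["processed_time"])  # 2:30:00
```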
I have a jsonfield in Postgres db and data like below:
income_info = [
{
"id": "1",
"name": "A",
"min_income": 22000
},
{
"id": "2",
"name": "B",
"min_income": 40000
},
{
"id": "3",
"name": "C",
"min_income": 22000
}
]
Now I want to use gte and lte lookups in the Django ORM queryset. I have already tried
Employee.objects.filter(income_info__min_income__lte = 4000000)
but it did not work at all.
models.py:
class Employee(models.Model):
institute = models.ForeignKey(Institute, on_delete=models.DO_NOTHING)
income_info = JSONField(default=list)
others = models.TextField(null=True)
From Django's documentation on querying JSONField:
If the key is an integer, it will be interpreted as an index lookup in an array
As your JSON data is a list of objects, you need a query like this:
Employee.objects.filter(income_info__0__min_income__lte=4000000)
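As a sanity check on what that path addresses: __0__min_income is the key-path equivalent of income_info[0]['min_income'], i.e. only the first list element is examined (plain-Python illustration on the sample data above):

```python
income_info = [
    {"id": "1", "name": "A", "min_income": 22000},
    {"id": "2", "name": "B", "min_income": 40000},
    {"id": "3", "name": "C", "min_income": 22000},
]

# income_info__0__min_income walks: field -> index 0 -> key "min_income"
first_min_income = income_info[0]["min_income"]
matches = first_min_income <= 4000000  # the __lte comparison from the query
print(first_min_income, matches)
```

To match on any element of the list rather than just the first, a different strategy (such as a contains lookup or restructuring the data into rows) would be needed.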
I have response data from an API; it looks like this:
{
"api": {
"results": 1,
"fixtures": {
"65": {
"fixture_id": "65",
"event_timestamp": "1533927600",
"event_date": "2018-08-10T19:00:00+00:00",
"league_id": "2",
"round": "Premier League - 1",
"homeTeam_id": "33",
"awayTeam_id": "46",
"homeTeam": "Manchester United",
"awayTeam": "Leicester",
"status": "Match Finished",
"statusShort": "FT",
"goalsHomeTeam": "2",
"goalsAwayTeam": "1",
"halftime_score": "1 - 0",
"final_score": "2 - 1",
"penalty": null,
"elapsed": "95",
"firstHalfStart": "1533927660",
"secondHalfStart": "1533931380"
}
}
}
}
Now I am trying to build a fixture model to store the above data in a PostgreSQL database. I haven't found any example of a model with a timestamptz field. I need to store the event_date key as timestamptz. Can anyone show me how I should create this field?
Django does not have a dedicated timestamp field; use a DateTimeField, which PostgreSQL stores as timestamptz when USE_TZ=True. For an automatically-set value you can add the following model field:
event_date = models.DateTimeField(auto_now_add=True)
EDIT
Or alternatively, something a little more up to date:
from django.utils import timezone
....
event_date = models.DateTimeField(default=timezone.now)
Make sure it's timezone.now and not timezone.now(), i.e. pass the callable, not a value evaluated once at import time.
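Since the API already delivers event_date as an ISO-8601 string with a UTC offset, it can also be parsed into an aware datetime before assignment, and a plain models.DateTimeField() will hold it. A small parsing sketch (assuming Python 3.7+ for fromisoformat):

```python
from datetime import datetime, timezone

raw = "2018-08-10T19:00:00+00:00"  # the event_date value from the payload above

# fromisoformat keeps the "+00:00" offset, producing an aware datetime,
# which PostgreSQL stores as timestamptz when USE_TZ=True
event_date = datetime.fromisoformat(raw)

print(event_date.tzinfo)  # the datetime carries its UTC offset
```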
There is my model:
class Category(models.Model):
.....
slug = models.SlugField(verbose_name=_('Slug'))
description = RedactorField(verbose_name=_('Description'))
parent = models.ForeignKey('self', null=True, blank=True,
default=None, verbose_name=_('Parent'))
The thing is, I need to make an API resource using Django REST Framework, and the serializer should contain the count of children for each category.
Something like this is what I made with built-in DRF tools like generics.ListAPIView:
[
{
"id": 1,
"created": "01-08-2017 10:42 UTC",
"modified": "01-08-2017 10:55 UTC",
"name": "Category_1",
"slug": "",
"description": "",
"parent": null,
"child_count": 12,
},
{
"id": 2,
"created": "01-08-2017 10:42 UTC",
"modified": "01-08-2017 10:55 UTC",
"name": "SubCategory_1_1",
"slug": "",
"description": "",
"parent": 1,
"child_count": 0,
},
...
]
so the queryset
Category.objects.annotate(child_count=models.Count('parent'))
is only going to show the count of parents (and it is always equal to 1 or 0).
There is the MPTTModel library, which could possibly solve this, but I can't use it because of some project-specific issues.
This should work:
Category.objects.annotate(child_count=Count('category'))
'category' is the reverse query name Django auto-generates for the self-referencing foreign key (the instance accessor is category_set, but query lookups use the lowercase model name). Better still, add related_name='children' to the parent field; then you can use Count('children').
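The counting itself, done in plain Python over a hypothetical parent column, mirrors what the annotation computes (made-up rows shaped like the sample output above):

```python
from collections import Counter

# Hypothetical categories: (id, parent_id), parent_id None for roots
categories = [
    (1, None),   # Category_1
    (2, 1),      # SubCategory_1_1
    (3, 1),      # SubCategory_1_2
    (4, None),   # Category_2
]

# Count('children') groups child rows by their parent id
children_per_parent = Counter(parent for _, parent in categories if parent is not None)

child_count = {cid: children_per_parent.get(cid, 0) for cid, _ in categories}
print(child_count)
```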