I am attempting to update timestamps for 100s of objects in Postgres 12 using the following query:
UPDATE foo_bar AS c SET
    created_at = c2.created_at
FROM (VALUES
    (101, '2021-09-27 14:54:00.0+00'),
    (153, '2021-06-02 14:54:00.0+00')
) AS c2(id, created_at)
WHERE c.id = c2.id;
Here, created_at is a Django DateTimeField:
created_at = models.DateTimeField(auto_now_add=True)
I am receiving the following error:
ERROR: column "created_at" is of type timestamp with time zone but expression is of type text
I have tried many variations of the created_at values to no avail. Any idea why this is not working?
Thank you to @a_horse_with_no_name for the answer. I had to cast the values to timestamp with time zone, as shown below:
UPDATE foo_bar AS c SET
    created_at = c2.created_at
FROM (VALUES
    (101, '2021-09-27 14:54:00.0+00'::timestamptz),
    (153, '2021-06-02 14:54:00.0+00'::timestamptz)
) AS c2(id, created_at)
WHERE c.id = c2.id;
Postgres docs explaining how casting works (link):
A cast specifies how to perform a conversion between two data types
By default, a cast can be invoked only by an explicit cast request, that is an explicit CAST(x AS typename) or x::typename construct.
Postgres docs explaining timestamptz (link):
timestamptz is accepted as an abbreviation for timestamp with time zone
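As an aside, if these updates originate in Django anyway, passing real datetime objects through the ORM sidesteps the cast entirely, because the driver binds proper timestamptz parameters. A minimal sketch, assuming the table belongs to a model named FooBar (the model name is my assumption, not from the question):
from datetime import datetime, timezone

new_times = {
    101: datetime(2021, 9, 27, 14, 54, tzinfo=timezone.utc),
    153: datetime(2021, 6, 2, 14, 54, tzinfo=timezone.utc),
}

objs = list(FooBar.objects.filter(id__in=new_times))
for obj in objs:
    obj.created_at = new_times[obj.id]  # auto_now_add only fires on insert, so this sticks
FooBar.objects.bulk_update(objs, ['created_at'])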
I'm using Django and Python 3.7. I'm writing a Django query to be run against a Postgres 9.4 database, but I'm having trouble figuring out how to form my ExpressionWrapper so that I add a number of seconds (an integer) to an existing date column. I tried the following:
hour_filter = ExtractHour(
    ExpressionWrapper(
        F("article__created_on") + timedelta(0, F("article__website__avg_time_in_seconds_to_reach_ep")),
        output_field=models.DateTimeField(),
    )
)
but I'm getting the error
unsupported type for timedelta seconds component: F
Any ideas how I can rewrite my ExpressionWrapper so the date math is done inside the Postgres query?
Edit: here are the models and relevant fields...
class Website(models.Model):
    ...
    avg_time_in_seconds_to_reach_ep = models.FloatField(default=0, null=True)

class Article(models.Model):
    objects = ArticleManager()
    website = models.ForeignKey(Website, on_delete=models.CASCADE, related_name='articlesite')
You can add custom database functions to Django; for this, define a Func for Postgres's INTERVAL expression:
class IntervalSeconds(Func):
    function = 'INTERVAL'
    template = "(%(expressions)s * %(function)s '1 seconds')"
You can then use this function in your queries to add seconds to a datetime:
YourModel.objects.annotate(
    attr=ExpressionWrapper(
        F("article__created_on") + IntervalSeconds(F("article__website__avg_time_in_seconds_to_reach_ep")),
        output_field=models.DateTimeField(),
    ),
)
The output of the IntervalSeconds function is a one-second Postgres interval multiplied by the expression passed to it, which can be added to or subtracted from a timestamp. You could also make a generic Interval function that handles units other than seconds; that is a little more involved (see the sketch below).
The ExpressionWrapper is required so that Django treats the result as a datetime.
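For the generic version hinted at above, a rough sketch could look like the following; the class name, the unit whitelist and the template handling are my own assumptions rather than tested code:
from django.db.models import Func

class Interval(Func):
    function = 'INTERVAL'
    # default: treat the wrapped expression as a number of seconds
    template = "(%(expressions)s * %(function)s '1 seconds')"

    def __init__(self, expression, unit='seconds', **extra):
        # whitelist the unit because it is interpolated into raw SQL
        if unit not in {'seconds', 'minutes', 'hours', 'days'}:
            raise ValueError('unsupported interval unit: %s' % unit)
        extra['template'] = "(%%(expressions)s * %%(function)s '1 %s')" % unit
        super().__init__(expression, **extra)

Interval(F("article__website__avg_time_in_seconds_to_reach_ep"), unit='seconds') then renders the same SQL as IntervalSeconds, while unit='days' swaps in a day-sized interval.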
I created date fields as follows:
start_date = models.DateField()
end_date = models.DateField()
When trying to create a constraint on the table with
ALTER TABLE analytics
    ADD EXCLUDE USING gist (campaign WITH =, tstzrange(start_date, end_date) WITH &&);
I get an error
ERROR: functions in index expression must be marked IMMUTABLE
Does anyone know how to fix this issue?
You are casting date to timestamp with time zone, and that cast is not immutable, only stable: its result depends on the current TimeZone setting, so it will not always give the same output for the same input.
I see two options here:
1) Change the constraint to use daterange (or timestamp without time zone); a model-level sketch of this option follows the list:
EXCLUDE USING gist (campaign WITH =, daterange(start_date, end_date) WITH &&)
2) Change the type of those columns in the table to timestamptz.
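If you would rather express option 1 from Django instead of raw SQL, newer Django versions (3.0+) ship an ExclusionConstraint; the sketch below is untested and uses illustrative model, field and constraint names, and the btree_gist extension is still needed for the = part on a scalar column:
from django.contrib.postgres.constraints import ExclusionConstraint
from django.contrib.postgres.fields import DateRangeField, RangeOperators
from django.db import models
from django.db.models import Func

class DateRangeFunc(Func):
    function = 'daterange'          # Postgres's built-in range constructor
    output_field = DateRangeField()

class Analytics(models.Model):
    campaign = models.CharField(max_length=100)  # illustrative; the real column type is not shown
    start_date = models.DateField()
    end_date = models.DateField()

    class Meta:
        constraints = [
            ExclusionConstraint(
                name='exclude_overlapping_campaign_dates',
                expressions=[
                    ('campaign', RangeOperators.EQUAL),
                    (DateRangeFunc('start_date', 'end_date'), RangeOperators.OVERLAPS),
                ],
            ),
        ]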
I have a simple query on Django's built-in comments model and am getting the error below with Heroku's PostgreSQL database:
DatabaseError: operator does not exist: integer = text
LINE 1: ... INNER JOIN "django_comments" ON ("pins_pin"."id" = "django_...
                                                            ^
HINT: No operator matches the given name and argument type(s).
You might need to add explicit type casts.
After googling around, it seems this error has been addressed many times before in Django, but I'm still getting it (all related issues were closed 3-5 years ago). I am using Django 1.4 and the latest build of tastypie.
The query is made through ORM filters and works perfectly with my development database (SQLite3):
class MyResource(ModelResource):
    comments = fields.ToManyField('my.api.api.CmntResource', 'comments', full=True, null=True)

    def build_filters(self, filters=None):
        if filters is None:
            filters = {}
        orm_filters = super(MyResource, self).build_filters(filters)
        if 'cmnts' in filters:
            orm_filters['comments__user__id__exact'] = filters['cmnts']
        return orm_filters

class CmntResource(ModelResource):
    user = fields.ToOneField('my.api.api.UserResource', 'user', full=True)
    site_id = fields.CharField(attribute='site_id')
    content_object = GenericForeignKeyField({
        My: MyResource,
    }, 'content_object')
    username = fields.CharField(attribute='user__username', null=True)
    user_id = fields.CharField(attribute='user__id', null=True)
Anybody have any experience with getting around this error without writing raw SQL?
PostgreSQL is "strongly typed" - that is, every value in every query has a particular type, either defined explicitly (e.g. the type of a column in a table) or implicitly (e.g. the values input into a WHERE clause). All functions and operators, including =, have to be defined as accepting specific types - so, for instance there is an operator for VarChar = VarChar, and a different one for int = int.
In your case, you have a column which is explicitly defined as type int, but you are comparing it against a value which PostgreSQL has interpreted as type text.
SQLite, on the other hand, is "weakly typed" - values are freely treated as being of whatever type best suits the action being performed. So in your dev SQLite database the operation '42' = 42 can be computed just fine, where PostgreSQL would need a specific definition of VarChar = int (or text = int, text being the type for unbounded strings in PostgreSQL).
Now, PostgreSQL will sometimes be helpful and automatically "cast" your values to make the types match a known operator, but more often, as the hint says, you need to do it explicitly. If you were writing the SQL yourself, an explicit type cast could look like WHERE id = CAST('42' AS INT) (or WHERE CAST(id AS text) = '42').
Since you're not, you need to ensure that the input you give to the query generator is an actual integer, not just a string which happens to consist of digits. I suspect this is as simple as using fields.IntegerField rather than fields.CharField, but I don't actually know Django, or even Python, so I thought I'd give you the background in the hope you can take it from there.
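In tastypie terms, that suggestion amounts to declaring the id field as an integer; a one-line sketch (untested, and as the next answer explains, not the whole story for generic foreign keys):
user_id = fields.IntegerField(attribute='user__id', null=True)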
Building on IMSoP's answer: this is a limitation of Django's ORM layer when a generic foreign key uses a text field for the object_id and the object's id field is not a text field. Django does not want to make any assumptions or cast the object's id as something it's not. I found an excellent article on this: http://charlesleifer.com/blog/working-around-django-s-orm-to-do-interesting-things-with-gfks/.
The author of the article, Charles Leifer, came up with a very cool solution for queries that are affected by this, which will be very useful in dealing with this issue moving forward.
Alternatively, I managed to get my query to work as follows:
if 'cmnts' in filters:
    comments = Comment.objects.filter(user__id=filters['cmnts'], content_type__name='my', site_id=settings.SITE_ID).values_list('object_pk', flat=True)
    comments = [int(c) for c in comments]
    orm_filters['pk__in'] = comments
Originally I was searching for a way to modify the SQL, similar to what Charles has done, but it turns out all I had to do was break the query into two parts and convert the string IDs to integers.
To avoid hacking your ORM and external software, Postgres also lets you register your own casts and comparison operators. Please look at the example in this similar question.
I must be missing something obvious, as the behavior is not as expected for this simple requirement. Here is my model class:
class Encounter(models.Model):
    activity_type = models.CharField(max_length=2,
                                     choices=(('ip', 'ip'), ('op', 'op'), ('ae', 'ae')))
    cost = models.DecimalField(max_digits=8, decimal_places=2)
I want to find the total cost for each activity type. My query is:
>>> Encounter.objects.values('activity_type').annotate(Sum('cost'))
Which yields:
>>> [{'cost__sum': Decimal("140.00"), 'activity_type': u'ip'},
{'cost__sum': Decimal("100.00"), 'activity_type': u'op'},
{'cost__sum': Decimal("0.00"), 'activity_type': u'ip'}]
In the result set there are two 'ip' type encounters. This is because the query is grouped not only by activity_type but by activity_type AND cost, which does not give the intended result. The generated SQL query for this is:
SELECT "encounter_encounter"."activity_type",
SUM("encounter_encounter"."total_cost") AS "total_cost__sum"
FROM "encounter_encounter"
GROUP BY "encounter_encounter"."activity_type",
"encounter_encounter"."total_cost" <<<< THIS MESSES THINGS
ORDER BY "encounter_encounter"."total_cost" DESC
How can I make this query work as expected (and as implied by the docs if I am not getting it wrong) and make it only do a group by on activity_type?
As @Skirmantas correctly pointed out, the problem was related to order_by. Although it is not explicitly stated in the query, the default ordering from the model's Meta class is added to the query, and that ordering column is then added to the GROUP BY clause because SQL requires it.
The solution is either to remove the default ordering or add an empty order_by() to reset ordering:
>>> Encounter.objects.values('activity_type').annotate(Sum('cost')).order_by()
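Equivalently, the implicit ordering can be dropped on the model itself; a sketch, assuming the (unshown) Meta class currently defines a default ordering on cost:
class Encounter(models.Model):
    activity_type = models.CharField(max_length=2,
                                     choices=(('ip', 'ip'), ('op', 'op'), ('ae', 'ae')))
    cost = models.DecimalField(max_digits=8, decimal_places=2)

    class Meta:
        ordering = []  # no default ordering, so nothing extra is pushed into GROUP BY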
# model
class Promotion(models.Model):
    name = models.CharField(max_length=200)
    start_date = models.DateTimeField()
    end_date = models.DateTimeField()

# view
def promo_search(request):
    ...
    results = Promotion.objects.filter(start_date__gte=start_date).filter(end_date__lte=end_date)
    ...
(The code above obviously isn't going to work; I'm just using it to help illustrate my problem.)
I want to show all active promotions between the start date and end date.
So if a promotion starts on 01/01/09 and ends on 30/01/09, and a person searches from 01/12/08 to 01/02/09, it will still return a result. Also, if they search from inside the date range, e.g. 02/01/09 - 03/01/09, they would get the same result.
Is there some magical Django way of achieving this without looping over each day?
You have four dates to consider: start_search, end_search, start_promo and end_promo. You are looking for promotions where start_promo <= end_search and end_promo >= start_search:
results = Promotion.objects.\
filter(start_date__lte=end_date).\
filter(end_date__gte=start_date)
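A usage sketch with the dates from the question, assuming they have already been parsed into date objects in the view (with USE_TZ enabled you would pass aware datetimes instead):
from datetime import date

start_date = date(2008, 12, 1)  # search window 01/12/08 ...
end_date = date(2009, 2, 1)     # ... to 01/02/09

results = Promotion.objects.filter(start_date__lte=end_date).filter(end_date__gte=start_date)
# A promotion running 01/01/09 - 30/01/09 satisfies both conditions and is returned;
# a narrower search such as 02/01/09 - 03/01/09 matches it as well.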