I want to compare a DateField and a TimeField in a queryset with the current date and time. I searched for hours but did not find anything, and I tried a lot with Q and F objects without finding a solution either. Now I am here and hope someone knows how to solve this :) - By the way, splitting into date and time is not my choice, and there is no way to change it into a DateTimeField (too many dependencies in other projects).
class Model(models.Model):
    date = models.DateField()
    time = models.TimeField()
In MySQL I would do something like:
SELECT * FROM app_model WHERE CAST(CONCAT(CAST(date as CHAR),' ',CAST(time as CHAR)) as DATETIME) >= NOW()
Thanks for any suggestions!
You can do this with OR'd queryset constraints (Q objects):
import datetime

from django.db.models import Q

now = datetime.datetime.now()
future_models = Model.objects.filter(Q(date__gt=now.date()) | (Q(date=now.date()) & Q(time__gte=now.time())))
That selects all instances whose date is after today, plus all instances whose date is today and whose time is greater than or equal to the current time.
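If you do want to push the date/time concatenation into the database, as in the raw MySQL above, here is a hedged sketch using a RawSQL annotation (the SQL is MySQL-specific and assumes the columns are literally named date and time; the Q-object filter above is usually the simpler choice):

import datetime

from django.db.models import DateTimeField
from django.db.models.expressions import RawSQL

# Rebuild the CAST(CONCAT(...)) expression from the question as an annotation.
combined = RawSQL(
    "CAST(CONCAT(CAST(date AS CHAR), ' ', CAST(time AS CHAR)) AS DATETIME)",
    [],
    output_field=DateTimeField(),
)
future_models = Model.objects.annotate(dt=combined).filter(dt__gte=datetime.datetime.now())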
Context
There is a dataframe of customer invoices and their due dates (identified by customer code).
A number of weeks needs to be added to each due date, depending on the customer code.
A model is created to persist the list of customers and the number of weeks to add.
What is done so far:
models.py
class BpShift(models.Model):
    bp_name = models.CharField(max_length=50, default='')
    bp_code = models.CharField(max_length=15, primary_key=True, default='')
    weeks = models.IntegerField(default=0)
helper.py
import datetime

from .models import BpShift

# used in views later
def week_shift(self, df):
    df['DueDateRange'] = df['DueDate'] + datetime.timedelta(
        weeks=BpShift.objects.get(pk=df['BpCode']).weeks)
I realised my understanding of DataFrames is seriously flawed.
df['A'] and df['B'] return Series, so of course timedelta doesn't work with weeks=BpShift.objects.get(pk=df['BpCode']).weeks.
Dataframe
d = {'BpCode':['customer1','customer2'],'DueDate':['2020-05-30','2020-04-30']}
df = pd.DataFrame(data=d)
Customer List csv
BP Name,BP Code,Week(s)
Customer1,CA0023MY,1
Customer2,CA0064SG,1
Error
BpShift matching query does not exist.
Commentary
I used these methods in the hope of changing the whole DataFrame at once, instead of
using df.iterrows(). I have recently been avoiding for loops like the plague and wonder if this
is the "correct" mentality. Is there any recommended way of doing this? Thanks in advance for any guidance!
The question Python & Pandas: series to timedelta will help you get from a Series to a timedelta. And although
pandas.Series(
    BpShift.objects.filter(
        pk__in=df['BpCode'].tolist()
    ).values_list('weeks', flat=True)
)
will give you a Series of integers, I doubt the order is the same as in df['BpCode'], because it depends on the Django model's ordering and the database backend.
So you might be better off explicitly creating not a Series but a DataFrame with pk and weeks columns, so you can use df.join. Something like this
pandas.DataFrame(
    BpShift.objects.filter(
        pk__in=df['BpCode'].tolist()
    ).values_list('pk', 'weeks'),
    columns=['BpCode', 'weeks'],
)
should give you a DataFrame that you can join with.
So combined this should be the gist of your code:
import pandas as pd

django_response = [('customer1', 1), ('customer2', '2')]
d = {'BpCode':['customer1','customer2'],'DueDate':['2020-05-30','2020-04-30']}
df = pd.DataFrame(data=d).set_index('BpCode').join(
    pd.DataFrame(django_response, columns=['BpCode', 'weeks']).set_index('BpCode')
)
df['DueDate'] = pd.to_datetime(df['DueDate'])
df['weeks'] = pd.to_numeric(df['weeks'])
df['new_duedate'] = df['DueDate'] + df['weeks'] * pd.Timedelta('1W')
print(df)
DueDate weeks new_duedate
BpCode
customer1 2020-05-30 1 2020-06-06
customer2 2020-04-30 2 2020-05-14
You were right to want to avoid looping. This approach gets all the data in one SQL query from your Django model by using filter, then does a left join with the DataFrame you already have, casts the dates and weeks to the right types, and computes the new due date on whole columns instead of looping over rows.
NB: the left join will give NaN and NaT for customers that don't exist in your Django database. You can either drop those rows by passing how='inner' to df.join, or handle them however you like.
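For reference, here is a hedged sketch that drives the same join directly from the queryset instead of the hard-coded django_response; it assumes BpShift is importable from your app and that the BpCode values in the DataFrame match the primary keys stored in the database.

import pandas as pd

from .models import BpShift  # assumed import path

d = {'BpCode': ['CA0023MY', 'CA0064SG'], 'DueDate': ['2020-05-30', '2020-04-30']}
df = pd.DataFrame(data=d)

# One SQL query for all customers present in the DataFrame.
shifts = pd.DataFrame(
    list(BpShift.objects.filter(pk__in=df['BpCode'].tolist()).values_list('pk', 'weeks')),
    columns=['BpCode', 'weeks'],
).set_index('BpCode')

# Inner join drops customers that have no BpShift row (instead of NaN/NaT).
df = df.set_index('BpCode').join(shifts, how='inner')
df['DueDate'] = pd.to_datetime(df['DueDate'])
df['weeks'] = pd.to_numeric(df['weeks'])
df['new_duedate'] = df['DueDate'] + df['weeks'] * pd.Timedelta('1W')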
I have a field in my model:
class Order(BaseModel):
    created_at = models.DateTimeField(auto_now_add=True)
I need to count all Order objects created in current month.
How can I do this in my views?
One possible way:
from datetime import datetime
current_month = datetime.now().month
Order.objects.filter(created_at__month=current_month)
See https://docs.djangoproject.com/en/stable/ref/models/querysets/#month for reference.
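Note that created_at__month on its own matches that month in any year, a point raised in the answers below. A hedged sketch that pins both the year and the month, and returns the count the question asks for (assuming the Order model above):

from django.utils import timezone

now = timezone.now()
count = Order.objects.filter(
    created_at__year=now.year,
    created_at__month=now.month,
).count()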
The (currently) accepted answer is incorrect. As stated in the comments, the OP presumably wants the current month in the current year, not the current month in any year; most people want the former.
So I would rather do
from django.utils import timezone

Order.objects.filter(created_at__gte=timezone.now().replace(day=1, hour=0, minute=0, second=0, microsecond=0))
The above also gets around timezone issues.
If you're only after the current month, it's easy, because you don't have to worry about an end date - nothing can be created after now, after all.
import datetime

start_of_month = datetime.date.today().replace(day=1)
orders_this_month = Order.objects.filter(created_at__gte=start_of_month)
This worked best on my side, adding the .count() operation:
import datetime
Order.objects.filter(created_at__gte=datetime.datetime.today().replace(day=1, hour=0, minute=0, second=0, microsecond=0)).count()
I encountered a model like this:
class Task(models.Model):
    timespan = models.IntegerField(null=True, blank=True)

class Todo(models.Model):
    limitdate = models.DateTimeField(null=True, blank=True)
    task = models.ForeignKey(Task)
I need to extract all Todos with a limitdate that is lower than or equal to today's date plus the timespan defined in the related Task model.
Something like (dummy example):
today = datetime.datetime.now()
Todo.objects.filter(limitdate__lte=today + F('task__timespan'))
Now, I could do that with a loop, but I'm looking for a way to do it with F(), and I can't find one.
I'm starting to wonder whether it's possible with F() at all. Maybe I should use extra()?
Please note that I don't have the luxury of changing the model code.
The main issue is that the database does not support date + integer, and it's hard to get the ORM to emit date + integer::interval (for PostgreSQL, for example), where the integer is the value of the task's timespan column, counted in days.
However, as
limitdate <= today + task__timespan
is equivalent to
limitdate - today <= task__timespan
we could transform the query to
Todo.objects.filter(task__timespan__gte=F('limitdate') - today).distinct()
thus the SQL becomes something like integer >= date - date, which should work in PostgreSQL because date - date yields an interval that can be compared with an integer count of days.
In other DBs such as SQLite it's more complicated, because dates need to be cast with julianday() first... and I think you need to play with extra() or even raw() to get the correct SQL.
Also, as Chris Pratt suggests, if you could use timestamps in all the relevant fields, the query might become easier because add and subtract operations are less limited.
P.S. I don't have an environment to verify this right now; you could try it first.
The problem is that there's no TIMESPAN type in the database, so F cannot return something that you can actually work with in this context. I'm not sure what type of field you actually used in your database, but the only way I can think of to do this is to store the timespan as an integer number of seconds, add that to "today" as a timestamp, and then convert it back into a datetime that you can compare with limitdate. However, I'm unsure whether Django will accept such complex logic with F.
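For what it's worth, newer Django versions (1.8+) support duration expressions, which can express this without touching the models, at least on a backend with native interval arithmetic such as PostgreSQL. A hedged sketch, not verified against the exact setup above:

from datetime import timedelta

from django.db.models import DateTimeField, ExpressionWrapper, F
from django.utils import timezone

# Annotate each Todo with a deadline of now + task.timespan days,
# then keep the ones whose limitdate is on or before that deadline.
deadline = ExpressionWrapper(
    timezone.now() + timedelta(days=1) * F('task__timespan'),
    output_field=DateTimeField(),
)
todos = Todo.objects.annotate(deadline=deadline).filter(limitdate__lte=F('deadline'))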
This is driving me crazy. I've used all the lookup_types and none seem to work.
I need to select an object that was created two weeks ago from today.
Here's what I've got:
twoweeksago = datetime.datetime.now() - datetime.timedelta(days=14)
pastblast = Model.objects.filter(user=user, created=twoweeksago, done=False)
The model has a created field that does this: created = models.DateTimeField(auto_now_add=True, editable=False)
But my query isn't returning everything. Before you ask, yes, there are records in the db with the right date.
Can someone make a suggestion as to what I'm doing wrong?
Thanks
DateTimeField is very different from DateField. If you do
twoweeksago = datetime.datetime.now() - datetime.timedelta(days=14)
that is going to return the current date and time minus 14 days, and the result will also include hours, minutes, seconds, etc. So the query:
pastblast = Model.objects.filter(user=user, created=twoweeksago, done=False)
is only going to find instances created at that exact moment. If you only care about the day, and not hours, minutes, and seconds, you can do something like
pastblast = Model.objects.filter(user=user, created__year=twoweeksago.year, created__month=twoweeksago.month, created__day=twoweeksago.day, done=False)
Check the django docs:
https://docs.djangoproject.com/en/1.4/ref/models/querysets/#year
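Another option, purely as a hedged sketch (user and Model as in the question): filter on a half-open datetime range that covers the whole day two weeks ago, instead of spelling out the year/month/day lookups.

import datetime

target_day = datetime.date.today() - datetime.timedelta(days=14)
start = datetime.datetime.combine(target_day, datetime.time.min)  # midnight on that day
end = start + datetime.timedelta(days=1)                          # midnight the next day

pastblast = Model.objects.filter(user=user, created__gte=start, created__lt=end, done=False)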
What would be the App Engine equivalent of this Django statement?
return Post.objects.get(created_at__year=bits[0],
                        created_at__month=bits[1],
                        created_at__day=bits[2],
                        slug__iexact=bits[3])
I've ended up writing this:
Post.gql('WHERE created_at > DATE(:1, :2, :3) AND created_at < DATE(:1, :2, :4) and slug = :5',
         int(bit[0]), int(bit[1]), int(bit[2]), int(bit[2]) + 1, bit[3])
But it's pretty horrific compared to Django. Any other more Pythonic/Django-magic way, e.g. with Post.filter() or created_at.day/month/year attributes?
How about
from datetime import datetime, timedelta
created_start = datetime(year, month, day)
created_end = created_start + timedelta(days=1)
slug_value = 'my-slug-value'
posts = Post.all()
posts.filter('created_at >=', created_start)
posts.filter('created_at <', created_end)
posts.filter('slug =', slug_value)
# You can iterate over this query set just like a list
for post in posts:
    print post.key()
You don't need 'relativedelta' - what you describe is a datetime.timedelta. Otherwise, your answer looks good.
As far as processing time goes, the nice thing about App Engine is that nearly all queries have the same cost-per-result - and all of them scale proportionally to the records returned, not the total datastore size. As such, your solution works fine.
Alternatively, if you need your one inequality filter for something else, you could add a 'created_day' DateProperty and do a simple equality check on that.
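A minimal sketch of that suggestion, assuming the legacy google.appengine.ext.db API used above (the model definition here is illustrative): the extra created_day property takes the equality check, so the single allowed inequality filter stays free for something else.

import datetime

from google.appengine.ext import db

class Post(db.Model):
    slug = db.StringProperty()
    created_at = db.DateTimeProperty(auto_now_add=True)
    created_day = db.DateProperty(auto_now_add=True)  # day-only copy of created_at

# Equality on created_day; year, month, day and slug_value as in the answer above.
posts = Post.all().filter('created_day =', datetime.date(year, month, day)).filter('slug =', slug_value)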
I ended up using the relativedelta library plus chaining the filters jQuery-style, which, although not very Pythonic yet, is a tad more comfortable to write and much DRYer. :) I'm still not sure if it's the best way to do it, as it'll probably require more database processing time?
date = datetime(int(year), int(month), int(day))
... # then
queryset = (Post.objects_published()
            .filter('created_at >=', date)
            .filter('created_at <', date + relativedelta(days=+1)))
...
and passing slug to the object_detail view or yet another filter.
By the way, you could use datetime.timedelta. That lets you compute date ranges or date deltas.
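For example, a small sketch of the chained query above with datetime.timedelta in place of relativedelta (year, month, day and Post.objects_published as in the earlier snippet):

from datetime import datetime, timedelta

date = datetime(int(year), int(month), int(day))
queryset = (Post.objects_published()
            .filter('created_at >=', date)
            .filter('created_at <', date + timedelta(days=1)))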