Django celery crontab not working when CELERY_TIMEZONE='Asia/Calcutta'

I am trying to schedule a task in Django using Celery. Everything works fine when CELERY_TIMEZONE='UTC', but it doesn't work when I change it to CELERY_TIMEZONE='Asia/Calcutta'.
# settings.py
CELERY_TIMEZONE = 'UTC'
CELERY_ENABLE_UTC = True

# tasks.py
@periodic_task(run_every=crontab(day_of_month="1-31", hour=6, minute=8), name="newtask1")
def elast():
    print "test"
This works just fine, but when I change my settings to
CELERY_TIMEZONE = 'Asia/Calcutta'
CELERY_ENABLE_UTC = False

# tasks.py
@periodic_task(run_every=crontab(day_of_month="1-31", hour=11, minute=38), name="newtask1")
def elast():
    print "test"
This doesn't work. I can't seem to figure out the issue. Am I missing something? Any help would be appreciated.

Configure Celery to use a custom time zone. The timezone value can be any time zone supported by the pytz library.
Kindly refer to the Celery reference guide.
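If django-celery is in use, note that beat stores each task's last run time in the database, and the Celery docs recommend resetting that stored schedule after changing CELERY_TIMEZONE. A minimal sketch (run in python manage.py shell; assumes djcelery is installed):
from djcelery.models import PeriodicTask

# Clear the stored run times so beat recomputes them in the new timezone
PeriodicTask.objects.update(last_run_at=None)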

Related

Problem with Running okta-jwt-verifier in docker container

I am writing a Flask API where token verification is done via the okta-jwt-verifier package.
I have this code to verify tokens:
import asyncio
from flask import g  # needed for g.decoded_token below
from okta_jwt_verifier import AccessTokenVerifier, JWTVerifier

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)

def is_access_token_valid(token, issuer, client_id):
    jwt_verifier = JWTVerifier(issuer=issuer, client_id=client_id,
                               audience='api://default', leeway=60)
    try:
        verified_token = jwt_verifier.verify_access_token(token)
        parsed = jwt_verifier.parse_token(token)
        g.decoded_token = parsed
        loop.run_until_complete(verified_token)
        return True
    except Exception:
        print("Exception")
        return False
It works great when I run this on my machine, but when I run it inside a Docker container (still on my machine; I have a docker-compose.yml file with two services: flask and db (PostgreSQL)), the process fails at loop.run_until_complete(verified_token). I am not sure how to work around that issue. Please help if you have any ideas! Thanks in advance!
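One commonly suggested direction (a sketch only, not a verified fix for this setup) is to drop the module-level loop and let asyncio.run() create and close a fresh event loop per call, so no loop state is shared between requests or across the container's worker processes:
import asyncio
from flask import g
from okta_jwt_verifier import JWTVerifier

def is_access_token_valid(token, issuer, client_id):
    jwt_verifier = JWTVerifier(issuer=issuer, client_id=client_id,
                               audience='api://default', leeway=60)
    try:
        # asyncio.run() manages its own event loop for each call
        asyncio.run(jwt_verifier.verify_access_token(token))
        g.decoded_token = jwt_verifier.parse_token(token)
        return True
    except Exception:
        return False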

Testing Celery Beat

I work on a Celery beat task within a Django project which creates database entries periodically. I know so because when I set the task up like this:
celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from celery.schedules import crontab

app = Celery("clock-backend", broker=os.environ.get("RABBITMQ_URL"))
app.config_from_object("django.conf:settings", namespace="CELERY")
app.conf.beat_schedule = {
    'create_reports_monthly': {
        'task': 'project_celery.tasks.create_reports_monthly',
        'schedule': 10.0,
    },
}
app.autodiscover_tasks()
and start my project, it really creates an object every 10 seconds.
But what I really want to do is set it up to run on the first day of every month.
To do so I would change the schedule to 'schedule': crontab(0, 0, day_of_month="1").
Here comes my actual problem: how do I test that this really works?
And by testing I mean actual (unit) tests.
What I've tried is to work with a package called freezegun.
A test with this looks like this :
def test_start_of_month_report_creation(self, user_object, contract_object, report_object):
    # set time to the last day of January
    with freeze_time("2019-01-31 23:59:59") as frozen_time:
        # let one second pass
        frozen_time.tick()
        # give the celery task some time
        time.sleep(20)
        # Test logic to check whether the object was created
        # Example: assert MyModel.objects.count() > 0
But this did not work. I suspect that Celery beat does not use the time set via freezegun/Python but the real "hardware" clock.
I've also tried setting the hardware clock as described here, but this did not work in my setup.
I'm thankful for any comments, remarks or help on this topic, since I'd really like to implement a test for this.
Unit tests cannot test third-party libraries.
You can use the system log to keep track.
You can also check whether your task is already in the PeriodicTask model. This model defines a single periodic task to be run. It must be associated with a schedule, which defines how often the task should run.
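Your suspicion is right in one respect: beat runs in its own process, so freezegun cannot patch its clock. A common workaround is to keep beat out of the unit test entirely and test the two halves separately: that the crontab entry is due at the expected moment, and that the task body creates the objects. A hedged sketch (create_reports_monthly is the task from the question; Report is a hypothetical model):
from datetime import datetime
from celery.schedules import crontab
from freezegun import freeze_time

def test_schedule_is_due_on_first_of_month():
    schedule = crontab(minute=0, hour=0, day_of_month="1")
    with freeze_time("2019-02-01 00:00:30"):
        # is_due(last_run_at) returns (is_due, next_check_in_seconds);
        # freezegun patches datetime, so the schedule sees the frozen clock
        due, _ = schedule.is_due(last_run_at=datetime(2019, 1, 1))
        assert due

def test_task_creates_report(db):
    create_reports_monthly()  # call the task function directly, no beat involved
    assert Report.objects.count() == 1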

Update database fields hourly with Python/Django

Suppose I have 1000 user_ids in a table, and every hour I would fetch info from the Google API and update 3 fields in that table. What would the impact be, and how can it be done efficiently?
I've seen this variant:
m = Module.objects.filter(user_id=1).update(field_one=100, field_two=200, field_three=300)
And this one:
m = Module.objects.get(user_id=1)
m.field_one = 100
m.field_two = 200
m.field_three = 300
m.save()
Also, how can it be done so that it will run every hour and grab that information? I've never done something like this.
Use Redis and Celery to set up an asynchronous task queue that runs every hour. Look here https://realpython.com/blog/python/asynchronous-tasks-with-django-and-celery/ for more info on how to set up an async task queue system for Django.
Here is the code for tasks.py:
from celery.task import periodic_task
from celery.schedules import crontab

@periodic_task(run_every=crontab(minute=0, hour='*/1'))
def get_data_from_google_api():
    data_from_google = ping_google_api()  # ping google api to get data
    return Module.objects.filter(user_id=1).update(
        field_one=data_from_google['field_one'],
        field_two=data_from_google['field_two'],
        field_three=data_from_google['field_three'])
Look here for more info: https://www.caktusgroup.com/blog/2014/06/23/scheduling-tasks-celery/
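On the efficiency side: for 1000 users, saving each row individually issues 1000 separate UPDATE queries. If your Django version has it (2.2+), bulk_update collapses those into batches. A hedged sketch reusing the names above (ping_google_api is assumed to accept a user id):
@periodic_task(run_every=crontab(minute=0, hour='*/1'))
def refresh_all_modules():
    modules = list(Module.objects.all())
    for m in modules:
        data = ping_google_api(m.user_id)
        m.field_one = data['field_one']
        m.field_two = data['field_two']
        m.field_three = data['field_three']
    # One UPDATE statement per batch of 500 rows instead of one per row
    Module.objects.bulk_update(
        modules, ['field_one', 'field_two', 'field_three'], batch_size=500)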
How to run a Django Celery task at 6am and 6pm daily?
For this purpose you need to run background queries with periodic tasks.
Here are the most popular Django task-queue libs.
For example, if you decide to use Celery, you can write a simple periodic task:
from celery.schedules import crontab
from celery.task import periodic_task

@periodic_task(
    name='UPDATE_USER',
    run_every=crontab(
        minute='1',
        hour='1,4,7,10,13,16,19,22'))
def update_user():
    # get some value from api
    Module.objects.filter(user_id=1).update(
        field_one=value, field_two=value, field_three=value)
All the Celery settings for Django can be found in the Celery docs.
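For the 6am/6pm case in the title specifically, the schedule would be (a sketch following the same pattern; the task name is illustrative):
@periodic_task(
    name='UPDATE_USER_TWICE_DAILY',
    run_every=crontab(minute=0, hour='6,18'))
def update_user_twice_daily():
    # runs at 06:00 and 18:00 in CELERY_TIMEZONE
    pass  # same update logic as above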

Correct time settings in Django for Celery

I'm wondering how to correctly use time settings in Django and Celery.
Here is what I have:
TIME_ZONE = 'Europe/Oslo'
CELERY_TIMEZONE = 'Europe/Oslo'
CELERY_ENABLE_UTC = True
USE_TZ = True
TZINFO = 'UTC'
But the timestamp on my Celery task is ahead by two hours. How can I fix it?
Using:
Django - 1.6b2
celery - 3.0.23
django-celery - 3.0.23
You can use the TZ default environment variable. Django will automatically pick it up by calling time.tzset(): http://docs.python.org/2/library/time.html#time.tzset
If your Celery runs from Django, it will work there too.
Also, you could use something like:
os.environ['TZ'] = 'your timezone'
at the beginning of manage.py or wsgi.py in your local installation.
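Put together, a minimal sketch ('Europe/Oslo' is taken from the question's settings; time.tzset() is Unix-only):
# at the top of manage.py or wsgi.py
import os
import time

os.environ['TZ'] = 'Europe/Oslo'
time.tzset()  # re-read the TZ environment variable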
I think you might be hitting a bug in django-celery that I am also running into. There were timezone-related changes in the last few releases of django-celery, and this bug first showed up for me when I updated from 3.0.19 to 3.0.23.
I asked about this on the #celery IRC chat and was told that the django-admin-based Celery task view is not that great and that I should be using something like Flower (https://github.com/mher/flower) to monitor my tasks.
I installed and ran Flower, and it did not suffer from the same timestamp issues that the django-celery admin-based view does.

Celery PeriodicTask won't expire

I'm trying to set up a PeriodicTask that should expire after some time. I'm using Django 1.5.1, celery 3.0.19 and django-celery 3.0.17 (everything from pip).
This is an excerpt of the code that creates the task:
from django.utils import timezone
from datetime import timedelta, datetime
from djcelery.models import PeriodicTask, IntervalSchedule

interval = IntervalSchedule.objects.get(pk=1)  # added through fixture - 3 sec interval
expiration = timezone.now() + timedelta(seconds=10)

task = PeriodicTask(name='fill_%d' % profile.id,
                    task='fill_album',
                    args=[instance.id],
                    interval=interval,
                    expires=expiration)
task.save()
And I'm running celery with ./manage.py celeryd -B
The task is being created just fine, and beat is running it every 3 seconds, but after 10 seconds it doesn't expire. At first I thought it was some timezone issue between Django and Celery, so I let it run for 3 hours (my offset from UTC), but it still wouldn't expire.
During my tests I've actually managed to make it expire once (and the logger kept repeating it was expired, every 3 seconds) but I haven't been able to reproduce it since.
Can anyone shed some light on what I'm doing wrong?
Thanks!
I'm having the same problem, and I think celery beat is not honoring the expires. If you set a breakpoint in your task, take a look at the current_task.request object and see if expires has a value (or just print current_task.request from within the task).
For me, if I manually run the task, current_task.request.expires has a value, but if celery beat schedules it, it is None.
I'm using celery 3.1.11
I filed a bug: https://github.com/celery/celery/issues/2283
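A sketch of that check, written against the question's 'fill_album' task (the @task form is the celery 3.x API):
from celery import current_task
from celery.task import task

@task(name='fill_album')
def fill_album(album_id):
    # When beat schedules the run, request.expires may be None (the bug above);
    # when sent manually with apply_async(expires=...), it has a value.
    print(current_task.request.expires)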
You can try using last_run_at, as in:
task = PeriodicTask(name='fill_%d' % profile.id,
                    task='fill_album',
                    args=[instance.id],
                    interval=interval,
                    expires=expiration,
                    last_run_at=expiration)
task.save()