Can I use Celery eventlet pools with Django+Postgres?

Does celery do any magic to make Django queries non-blocking when using an eventlet pool?
If not, is there a known good way to make it so?

Eventlet provides monkey_patch() to make as much of the standard library as possible non-blocking, including sockets (which covers any pure-Python database library), with special cases for MySQLdb and psycopg2. As far as I know, Celery's eventlet worker pool calls that patcher on startup. If your queries are still blocking, try monkey_patch(psycopg=True).
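For example, a minimal sketch of applying the patch explicitly, assuming it runs before Django or psycopg2 are imported (it mirrors the MySQLdb example further down):

import eventlet

eventlet.monkey_patch()              # patch sockets, threads, select, time, ...
eventlet.monkey_patch(psycopg=True)  # additionally install psycopg2's green wait callback

# Import Django / Celery only after patching so their sockets are cooperative.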

Related

APScheduler running multiple times, once for each gunicorn worker

I have a django project with APScheduler built into it. I have now moved to the production environment and bound it with gunicorn and nginx in the process. Gunicorn has 3 workers. The problem is that gunicorn initiates APScheduler for each worker and runs the scheduled job 3 times instead of only once.
I have seen similar questions here; it seems to be a common problem. Even the official APScheduler documentation acknowledges the problem and offers no way of fixing it:
https://apscheduler.readthedocs.io/en/stable/faq.html#how-do-i-share-a-single-job-store-among-one-or-more-worker-processes
I saw in other threads that people recommended adding --preload to the gunicorn settings. But I read that --preload starts the workers with the current code and does not reload when the code changes (see "when not to preload" in the link below):
https://www.joelsleppy.com/blog/gunicorn-application-preloading/
I also saw someone recommend binding a TCP socket for APScheduler. I did not fully understand it, but the idea is to try to bind a socket each time APScheduler is initiated; the second and third workers then hit that already-bound socket, get a socket error, and skip starting the scheduler. Something like:
try:
    # "bind socket somehow"
    ...
except OSError:  # socket.error
    print("socket already exists")
else:
    # "run apscheduler module"
    ...
Does anyone know how to set this up, or whether it would actually work?
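For reference, a minimal sketch of that socket-lock idea (the port number is an arbitrary illustrative choice, and start_scheduler_once is a hypothetical helper, not part of APScheduler):

import socket

from SCHEDULER import dailypuzzleschedule


def start_scheduler_once():
    # Whichever worker binds the port first starts the scheduler; the others skip it.
    lock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        lock.bind(("127.0.0.1", 47200))  # arbitrary port used only as a process-wide lock
    except OSError:
        print("scheduler already started by another worker")
    else:
        dailypuzzleschedule.start()
    return lock  # keep a reference alive so the port stays bound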
Another workaround I thought of is simply removing APScheduler and doing it with the server's cron instead. I am using DigitalOcean, so I could delete APScheduler and add a cron job that runs the module instead. However, I do not want to go that way because it would break the "unity" of the whole project and make it server-dependent. Does anyone have any more ideas?
Schedule module:
from apscheduler.schedulers.background import BackgroundScheduler
from RENDER.views import dailypuzzlefunc

def start():
    scheduler = BackgroundScheduler()
    scheduler.add_job(dailypuzzlefunc, 'cron', day="*", max_instances=2, id='dailyscheduler')
    scheduler.start()
In the app:
from django.apps import AppConfig

class DailypuzzleConfig(AppConfig):
    default_auto_field = "django.db.models.BigAutoField"
    name = "DAILYPUZZLE"

    def ready(self):
        from SCHEDULER import dailypuzzleschedule
        dailypuzzleschedule.start()
web:
python manage.py collectstatic --no-input; gunicorn MasjidApp.wsgi --timeout 15 --preload

Use --preload. With it, gunicorn loads the application once in the master process before forking the workers, so ready() (and with it the scheduler) is initiated only once instead of once per worker. It's working well for me.

How to log django-kronos tasks with logging module?

I have switched from celery to dramatiq, and from celery beat to django-kronos, and now I am stuck: I cannot figure out how to make the tasks run by kronos log via the logging module.
Is it even possible, and what is the best practice for logging the progress of django-kronos tasks?

Decouple and Dockerize Django and Celery

I am wondering what the best way is to decouple Celery from Django in order to dockerize the two parts and run them as docker swarm services. Typically one starts the celery workers and celery beat using a command that references their Django application:
celery worker -A my_app
celery beat -A my_app
From this I believe celery picks up its config from the settings file and a celery.py file, which is easy to move to a microservice. What I don't totally understand is how the tasks would leverage the Django ORM. Or is that not really the microservices mantra, and Celery should instead be designed to make GET/POST calls to a Django REST Framework API for the data it needs to complete the task?
I use a setup where the code for both the django app and its celery workers is the same (as in a single repository).
When deploying I make sure to have the same code release everywhere, to avoid any surprises with the ORM, etc...
Celery starts with a reference to the django app, so that it has access to the models, etc...
Communication between the workers and the main app happens either through the messaging queue (rabbitmq or redis...) or via the database (as in, the celery worker works directly in the db, since it knows the models, etc...).
I'm not sure if that follows the microservices mantra, but it does work :)
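A minimal sketch of that layout, assuming the standard celery.py placed next to settings.py (the project name my_app is illustrative); the web image and the worker image ship the same code and differ only in the command they run:

import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "my_app.settings")

app = Celery("my_app")
app.config_from_object("django.conf:settings", namespace="CELERY")  # read CELERY_* settings
app.autodiscover_tasks()  # find tasks.py in installed Django apps, so tasks can use the ORM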
Celery's .send_task or .signature might be helpful:
https://www.distributedpython.com/2018/06/19/call-celery-task-outside-codebase/
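As a minimal sketch of the send_task approach (the broker URL and task name are illustrative), the caller only needs the task's registered name, not its code:

from celery import Celery

# Point at the same broker the workers use; no Django or task code is imported here.
app = Celery(broker="redis://localhost:6379/0")

# Dispatch by name; whichever worker has my_app.tasks.process_order registered runs it.
result = app.send_task("my_app.tasks.process_order", args=[42])
print(result.id)  # AsyncResult handle; .get() requires a result backend to be configured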

Lots of socket errors with celery eventlet tasks

I'm getting a lot of "IOError: Socket closed" exceptions from amqplib.client_0_8.method_framing.read_method when running my celery workers with the --pool=eventlet option. I'm also seeing a lot of timeout exceptions from eventlet.hubs.hub.switch.
I'm using an async_manage.py script similar to the one at https://gist.github.com/821848, running the workers like:
./async_manage.py celeryd_detach -E --pool=eventlet --concurrency=120 --logfile=<path>
Is this a known issue, or is there something wrong with my configuration or setup?
I'm running djcelery 2.2.4, Django 1.3, and eventlet 0.9.15.
The problem was a side effect of some code that was blocking. I managed to detect the blocking code using the eventlet option described in this article.
There were 2 places where blocking was occurring: DNS lookups and MySQL database access. I managed to resolve the first by installing the dnspython package, and the second by using the undocumented MySQLdb option in eventlet:
import eventlet

eventlet.monkey_patch()              # patch sockets, DNS, threads, etc.
eventlet.monkey_patch(MySQLdb=True)  # MySQLdb patching is off by default, so enable it explicitly
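For anyone looking for the blocking-detection switch mentioned above, eventlet ships one in its debug module; a sketch, assuming this is the option the article refers to:

import eventlet.debug

# Emit a traceback whenever a greenlet blocks the event loop for too long.
eventlet.debug.hub_blocking_detection(True)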

Does django's runserver option provide a hook for running other restart scripts?

I've recently been playing around with django and celery. One annoying thing during development is the fact that I have to restart the celery daemon each time I modify a task. When I'm developing, I usually like to use 'manage.py runserver' which automatically reloads the django framework on modifications to my apps.
Is there a way to add a hook to the reloading process that runserver does so that it automatically restarts the celery daemon I have running?
Alternatively, does celery have a similar monitor-and-reload-on-change mode that I should be using for development?
Django-supervisor works very well for this purpose. You can have it start the Django server, Celery, and anything else you need, and have different configurations for development and production servers. It also knows to reload the celery daemon when your code changes.
https://github.com/rfk/django-supervisor
I believe you can set CELERY_ALWAYS_EAGER to True; tasks then execute synchronously inside the Django process, so the runserver autoreloader picks up your changes and there is no separate worker to restart.
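For example, in a development-only settings module (a sketch; newer Celery versions spell this task_always_eager in lowercase):

# settings_dev.py (illustrative): run tasks inline instead of sending them to a worker
CELERY_ALWAYS_EAGER = True
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True  # re-raise task errors so they show up in runserver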
Yes. Django provides an autoreload hook, which can be used to restart other scripts.
Here is a simple management command which prints a message on reload
import subprocess

from django.core.management.base import BaseCommand
from django.utils import autoreload


def reload():
    print('Code changed. Auto reloading...')


class Command(BaseCommand):
    def handle(self, *args, **options):
        autoreload.main(reload)
Save this as reload.py inside a management/commands directory and run it with python manage.py reload. A management command to reload celery workers is available here.
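A rough sketch of what the reload() hook could do with that subprocess import to bounce the worker as well (the pkill pattern and my_app are illustrative assumptions):

import subprocess


def reload():
    print('Code changed. Restarting celery worker...')
    # Stop any running worker, then start a fresh detached one that picks up the new code.
    subprocess.call(['pkill', '-f', 'celery worker'])
    subprocess.call(['celery', 'worker', '-A', 'my_app', '--detach'])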
Celery doesn't have any feature to reload code or to auto-restart when the code changes, so you have to restart it manually.
There isn't a way to add a hook, and I don't think it's worthwhile to edit Django's source code just to perform a restart.
Personally, while developing I prefer to watch Celery's colorized shell output rather than tailing the logs; it's more readable.
Celery 2.5 has an experimental runtime option --autoreload that could be used for this purpose, too. There's more detail in the release notes. That being said, I think django-supervisor (via @Lee Semel) looks like the better way of doing things. I thought I would post this alternative here in case other readers do not want to configure another app for asynchronous processing.