Not able to reload Redis cache in Superset using Celery - apache-superset

I am new to Superset. I have created a few dashboards and cached their data using Redis. I am trying to reload the cache using Celery jobs. The Celery worker runs successfully, but the cache is not reloaded.
Please find attached screenshots of:
the Celery config
the Celery worker completing successfully
the Superset terminal errors when the Celery job runs
Flask-SQLAlchemy is already installed.
Does anyone have an idea how to fix this?
I just want to reload the data in the cache.
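For reference, the documented way to refresh Superset's cache on a schedule is a Celery beat entry that triggers Superset's built-in cache-warmup task. Below is a minimal sketch of the relevant part of superset_config.py, assuming Redis on localhost and the top_n_dashboards warm-up strategy; the schedule, strategy values, and Redis URL are placeholders to adapt:

from celery.schedules import crontab

class CeleryConfig:
    broker_url = "redis://localhost:6379/0"
    result_backend = "redis://localhost:6379/0"
    # superset.tasks must be imported so the cache-warmup task is registered
    imports = ("superset.sql_lab", "superset.tasks")
    beat_schedule = {
        "cache-warmup-hourly": {
            "task": "cache-warmup",
            "schedule": crontab(minute=0, hour="*"),  # hourly, as an example
            "kwargs": {
                "strategy_name": "top_n_dashboards",  # warm the most-visited dashboards
                "top_n": 5,
                "since": "7 days ago",
            },
        },
    }

CELERY_CONFIG = CeleryConfig

Note that both a worker and the beat scheduler have to be running against the same broker (something like celery --app=superset.tasks.celery_app:app worker and the matching beat command); if only the worker is up, the scheduled warm-up never fires.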

Related

Where to run Celery on AWS

In my Django web app I am running a calculation that takes 2-10 minutes, and the AWS server times out with a 504 error.
It seems the best way to fix this is to offload the calculation to a worker. From my research, Celery with a Redis broker (or maybe AWS SQS instead?) seems best for Django.
However, I only see tutorials for setups that run locally: the Redis server is hosted by Railway, and Celery runs in a separate terminal from Django (on the same local machine). I'm curious if/where Celery and Redis should be running on AWS.
This answer says you should try to run Celery as a daemon in the background: AWS Elastic Beanstalk already uses supervisord to run some daemon processes, so you can leverage that to run celeryd and avoid creating a custom AMI for this, and it works nicely for them. But don't the Celery and Redis servers still need to run somehow?
Where does the Celery server run?
How do I start it?
How can I leverage supervisord to run daemon processes? The documentation isn't helping me very much with AWS integration.
You can configure a Procfile to run multiple processes (the main Django app, a Celery worker, and celery beat) in parallel, as documented here:
web: <command to start your django app>
celery: celery -A <path_to_celery_app> worker
celery_beat: celery -A <path_to_celery_app> beat
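For context, <path_to_celery_app> in those entries is whatever module defines your Celery application. A minimal sketch of that module under the standard Celery-for-Django layout, using a hypothetical project name myproject:

# myproject/celery.py  ("myproject" is a placeholder project name)
import os
from celery import Celery

# Make sure Django settings are available before the Celery app is created.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

app = Celery("myproject")
# Pick up any CELERY_*-prefixed settings from Django's settings module.
app.config_from_object("django.conf:settings", namespace="CELERY")
# Discover tasks.py modules in all installed Django apps.
app.autodiscover_tasks()

With that in place, the Procfile entries become e.g. celery: celery -A myproject worker and celery_beat: celery -A myproject beat.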

How to deploy a Django project to Google Cloud with Celery workers?

So I have a Django project for which I installed Celery and Heroku Redis, and I use Google Cloud for deployment. Everything works fine locally, but I need my Celery workers to run 24/7 for the live site. I looked into supervisor and installed it too; I start supervisor from my command line, and the Celery workers run as long as supervisor is running. But there is a problem: I cannot keep my PC on all the time, and when I shut it down, supervisor stops too. I have not figured out Cloud Tasks either. Lastly, I read some information about Kubernetes and Celery. Is it possible to use Celery with Kubernetes, and how can I set up a Kubernetes-Celery-Django deployment?
You need to be running your Django server with Gunicorn, your Redis service as a separate service, and your Celery worker as a third service.
Alternatively, if you want a single container instance (a Pod in k8s), you can set up supervisor to run Gunicorn and your Celery worker inside the same pod.

Celery tasks are executed by different program binaries

I have a Django web application that executes tasks via Celery. It is run with a combination of Apache, uWSGI, and Redis. For some reason, one of the tasks is being executed by the uWSGI server and the other is executed by the Python interpreter. This is causing permissions issues, as uWSGI doesn't run as the same user as the Python interpreter.
What could cause the tasks to be run by different programs? Is there a setting somewhere?
It turns out I needed to call the task with .delay() to get the Celery daemon to execute the task instead of uWSGI.
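In other words, calling the task function directly executes it synchronously in whatever process made the call (here, the uWSGI worker), while .delay() or .apply_async() sends a message to the broker so a Celery worker process runs it. A quick illustration with a hypothetical task:

from celery import shared_task

@shared_task
def generate_report(report_id):
    ...  # hypothetical task body

generate_report(42)                     # runs inline, inside the calling process (e.g. uWSGI)
generate_report.delay(42)               # queued on the broker, executed by a Celery worker
generate_report.apply_async(args=[42])  # same as delay(), with more options available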

Celery ensure a task is run on each server

I have a Flask/Django webapp running on multiple web servers; currently the Celery task runs and updates one server at random. I'm in a situation where I need the task to run once on every server, so that each server is updated by the task.
How can I run a Celery task and ensure it is run once on all web servers?
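The thread doesn't include an answer, but one common approach (not from this thread) is Celery's broadcast routing: every worker bound to a broadcast queue receives its own copy of each message, so running one worker per server makes the task execute once on every server. A minimal sketch, assuming a hypothetical tasks.update_server task and a fanout-capable broker such as RabbitMQ:

from celery import Celery
from kombu.common import Broadcast

app = Celery("myapp", broker="amqp://guest@localhost//")  # placeholder broker URL

# Each worker consuming this broadcast queue gets its own copy of every message.
app.conf.task_queues = (Broadcast("broadcast_updates"),)
app.conf.task_routes = {
    "tasks.update_server": {
        "queue": "broadcast_updates",
        "exchange": "broadcast_updates",
    },
}

@app.task(name="tasks.update_server")
def update_server():
    ...  # hypothetical per-server update logic

Each server then starts a worker that consumes the broadcast queue, e.g. celery -A myapp worker -Q broadcast_updates, and a single update_server.delay() call is delivered to all of them.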

Celery how to stop tasks being processed on a development machine

I have my production and development machines set up using Celery in Django. Everything works. My issue is that when I boot up my development machine it starts to consume tasks, and if I have a bug (which I often do in my development version) this can give me lots of headaches.
I still want my development machine to send tasks to be consumed, but how do I stop it from processing tasks and leave just my production server for that?
PS: I don't want to have to remove the Celery settings on my development machine, as I still want my dev machine to send tasks to be consumed, if that makes sense.
These are my current settings...
Celery
BROKER_URL = 'amqp://admin:xyz@54.111.111.11:5672//'
CELERY_SEND_TASK_ERROR_EMAILS = True
I'm using RabbitMQ.
You can try:
celery purge
And if you have django-celery you can do
python manage.py celery purge
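Keep in mind that purge only discards messages already sitting in the queue; it won't stop a running dev worker from consuming new ones. If the goal is for the development machine to only send tasks, an approach not mentioned in this thread is to simply not start a worker on dev, or to give production its own queue that only the production worker consumes. A sketch with hypothetical queue names, using the old-style setting names that match the config above:

# settings.py on both machines: route all tasks to the "production" queue by default.
CELERY_DEFAULT_QUEUE = "production"   # task_default_queue in newer Celery versions

# Production server consumes the real queue:
#   celery worker -Q production
# Development machine (if it must run a worker at all) consumes a scratch queue:
#   celery worker -Q dev_scratch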