Celerybeat not recognizing a new model even though the Django app does

When starting celerybeat I get the following error:
Restarting celery periodic task scheduler
Stopping celerybeat... NOT RUNNING
Starting celerybeat...
Error: One or more models did not validate:
collections.collection: 'language' has a relation with model <class 'languages.models.Language'>, which has either not been installed or is abstract.
collections.translation: 'language' has a relation with model <class 'languages.models.Language'>, which has either not been installed or is abstract.
But the languages app has definitely been added to my Django settings; uWSGI and Celery start up fine, and everything except celerybeat works as it should.
It's as if celerybeat is working off an old settings file, but that shouldn't be possible... or is it? I have also recently moved my settings file.

Found the problem. I had moved my settings files earlier, but had not updated this in the Celery configuration files. So the solution was to find the files:
celeryd
celerybeat
in /etc/default/
and change the path to where the settings files have been moved, e.g.:
sudo nano /etc/default/celeryd
and edit the path there.
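For reference, the lines to look for in /etc/default/celeryd are something like the following sketch; the paths and module name here are hypothetical, not taken from the question:
# where the Django project (and its settings module) now lives
CELERYD_CHDIR="/srv/myproject"
# point the init scripts at the moved settings module
export DJANGO_SETTINGS_MODULE="myproject.settings"
celerybeat reads its own defaults file (/etc/default/celerybeat), so both daemons need the updated path before a restart.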

Related

Active Django settings file from Celery worker (how to set DJANGO_SETTINGS_MODULE dynamically)

So I have already looked around a lot for this but couldn't find a good answer. I'm using Celery and Django 3.2.13, without the django-celery package, since newer versions of Celery don't require it anymore. I managed to set up tasks and execute them using Redis; everything is working as it should there. However, I am integrating this into an existing, quite large, Django project. There we specified a couple of Django settings files, not just one, and we run a different one depending on the environment, for instance one for local machines and one for the server. My problem is that I can't seem to be able to track down which settings file is "active" from the celery worker, which runs the celery.py file in my project root (as the documentation specifies). There the documentation requires you to specify the Django settings file like this:
os.environ.setdefault('DJANGO_SETTINGS_MODULE', "celery_test.settings.development")
Now this works, but if I move the stuff locally I need to change it to settings.local to make it work, and that every time. Reading the settings object at runtime like I do in standard Django files didn't work, since the celery worker executes in a different process. So, given this situation, does anyone have any idea how to dynamically fetch the active Django settings file from the celery worker? Or perhaps pass it in as a variable when starting the worker (like for Django, e.g. --settings=project.settings.local)? Thanks!
I found the command-line solution.
When initializing the celery worker on the command line, just set the environment variable prior to the celery command:
DJANGO_SETTINGS_MODULE='proj.settings' celery -A proj worker -l info
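This works because os.environ.setdefault in celery.py only fills in DJANGO_SETTINGS_MODULE when the variable is not already set, so a value exported in the shell wins. A sketch of the documented celery.py, using the project names from the question:
import os
from celery import Celery

# only used when DJANGO_SETTINGS_MODULE isn't already set in the environment
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'celery_test.settings.development')

app = Celery('celery_test')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()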
But I'm getting an error. My command line:
DJANGO_SETTINGS_MODULE='celery_test.settings.development' celery -A celery_test worker -l info --pool=solo
DJANGO_SETTINGS_MODULE=celery_test.settings.development : The term
'DJANGO_SETTINGS_MODULE=celery_test.settings.development' is not recognized as the name of a cmdlet, function,
script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path
is correct and try again.
At line:1 char:1
DJANGO_SETTINGS_MODULE='celery_test.settings.development' celery -A c ...
+ CategoryInfo : ObjectNotFound: (DJANGO_SETTINGS...ngs.development:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
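That error is PowerShell, which doesn't support the Bash-style VAR=value prefix before a command. On PowerShell the equivalent is to set the environment variable first, then run the worker:
$env:DJANGO_SETTINGS_MODULE = 'celery_test.settings.development'
celery -A celery_test worker -l info --pool=solo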

Can you daemonize Celery through your Django site?

Reading the Celery daemonization documentation: if you're running Celery on Linux with systemd, you set it up with two files:
/etc/systemd/system/celery.service
/etc/conf.d/celery
I'm using Celery in a Django site with django-celery-beat, and the documentation is a little confusing on this point:
Example Django configuration
Django users now uses [sic] the exact same template as above, but make sure that the module that defines your Celery app instance also sets a default value for DJANGO_SETTINGS_MODULE as shown in the example Django project in First steps with Django.
The docs don't just come out and say: put your daemonization settings in settings.py and it will all work out, bla, bla. From another SO post, this user seems to have run into the same confusion, where the Django instructions imply you should use the init.d method.
Bonus points if you can say whether it's possible to run Celery and RabbitMQ both configured along with the Django instance (if that makes sense).
I'm thinking not, if only because the daemon variables include CELERYD_ ones, and First Steps with Django says: "...all Celery configuration options must be specified in uppercase instead of lowercase, and start with CELERY_"
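For reference, the documented template puts the daemon variables in the environment file, not in settings.py. A minimal sketch of /etc/conf.d/celery along the lines of the generic template (binary path, app name, and options are assumptions, not from the question):
# name(s) of the worker nodes to start
CELERYD_NODES="w1"
# absolute path to the celery binary and the app it should load
CELERY_BIN="/usr/local/bin/celery"
CELERY_APP="proj"
# extra worker options
CELERYD_OPTS="--time-limit=300 --concurrency=4"
# pid and log files (%n expands to the node name)
CELERYD_PID_FILE="/var/run/celery/%n.pid"
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
CELERYD_LOG_LEVEL="INFO"
The CELERYD_ variables here configure the daemon wrapper itself rather than the Celery app, which is why they don't follow the uppercase CELERY_-prefixed Django settings convention quoted above.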

Django migrations not persisting

My Django app is containerized alongside PostgreSQL. The problem is that migrations do not seem to be persisting in the directory. Whenever I run docker exec -it <container_id> python manage.py makemigrations forum, the same migrations are detected. If I spin down the stack, spin it back up, and run makemigrations again, I see the same migrations detected. Changes to the fields, adding models, deleting models: none ever get detected. The migrations that do appear seem to be getting written to the database, as when I try to migrate I get an error that the fields already exist. But if I look at my migrations folder, only the __init__.py file is present. None of the migration commands add any changes to the migrations folder.
I also tried unregistering the Post model from the admin and spinning up the stack, yet I still see it present in the admin. Same thing with changes to the templates. No change I make from inside Docker seems to stick.
*Note: this problem started after I switched to WSL 2 and enabled it in Docker Desktop (Windows)
**Update: migrations can be made from bash inside the docker container
I found out what the problem was. My docker-stack.yml file pointed to a directory that did not exist in the Dockerfile.
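In other words, the bind mount has to map the host source tree onto the directory the app actually runs from in the image. A sketch of the relevant compose/stack section, assuming the Dockerfile sets WORKDIR /app (service name and paths are hypothetical):
services:
  web:
    build: .
    volumes:
      # mount the host source over /app so files written in the container,
      # such as generated migrations, appear on the host and survive restarts
      - ./:/app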

Custom celery settings file for django-celery

django-celery seems to use the current Django project's settings.py file to read the configuration for celery, and, correct me if I am wrong, there is no way to override this behavior. The problem is that my celery configuration is in a different file outside of the Django project, and I need to somehow tell django-celery to use that file instead. How do I do this?
Generally, django-celery is used for one project (but many applications). Of course, you can make a symlink to the settings.py from one project to the other and run django-celery as:
./manage.py celeryd --settings=path.to.symlink
but in my opinion the better decision would be to run celery as a daemon with common settings, using CELERY_IMPORTS to pull in tasks from any Django project, as sketched below.
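A minimal sketch of that approach, with a standalone config module living outside any one project (module names and broker URL are hypothetical):
# celeryconfig.py -- shared Celery settings, outside the Django projects
BROKER_URL = 'redis://localhost:6379/0'
CELERY_IMPORTS = (
    'project_a.tasks',   # task modules from each Django project
    'project_b.tasks',
)
The daemon can then be pointed at this module with the --config option instead of any one project's settings.py.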

Permission problems prevent celery from running as daemon?

I'm currently having some trouble running celery as a daemon. I use Apache to serve my Django application, so I set the uid and gid in the celery settings to "www-data". There are two places I know of so far that need access permission: /var/log/celery/*.log and /var/run/celery/*.pid, and I have already set them to be owned by "www-data". However, celery won't start when I run sudo service celeryd start. If I get rid of the --uid and --gid options for the command, celery starts fine as user "root".
One other thing I noticed is that when I start celery as "root", it puts some files like celery.bak, celery.dat, and celery.dir in my CELERYD_CHDIR, which is my Django application directory. I also changed the application directory to be owned by "www-data", but celery still couldn't start. I copied all the settings files from another machine on which celery runs fine, so I suppose it's not a problem with my settings. Does anyone have any clue? Thanks.
Su to the celery user and start celery from the command line. Most likely it's an app log, not celery itself, that you need permission for; running in the foreground will print the real error, for example as below.
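A sketch of that debugging step, with hypothetical paths (invoking bash explicitly because service accounts like www-data usually have a nologin shell):
sudo -u www-data bash            # become the uid celery is configured to run as
cd /path/to/django/project       # the directory used as CELERYD_CHDIR
python manage.py celeryd -l info # run in the foreground; permission errors print to the console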