Django signals and Celery: are signals synchronous?

I am using Celery to scrape some URLs and push the scraped data to an API,
but because of the rapid calls to this API,
I want to use Django signals, if they are synchronous.

You can use periodic tasks in Celery. Can you give more specific details about what exactly you want to do?
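For reference, Django signals are synchronous: `Signal.send()` calls every connected receiver in order, in the caller's thread, and only returns once all receivers have finished. A minimal stand-in in plain Python (no Django required; the signal name and receiver are made up) illustrating that behavior:

```python
# Minimal stand-in for Django's synchronous signal dispatch.
# Django's real Signal.send() behaves analogously: receivers run inline.

class Signal:
    def __init__(self):
        self._receivers = []

    def connect(self, receiver):
        self._receivers.append(receiver)

    def send(self, sender, **kwargs):
        # Each receiver runs in the caller's thread, one after another;
        # send() does not return until every receiver has finished.
        return [(r, r(sender=sender, **kwargs)) for r in self._receivers]

calls = []
scraped = Signal()  # hypothetical "a URL was scraped" signal
scraped.connect(lambda sender, **kw: calls.append(("push_to_api", kw["url"])))

scraped.send(sender="scraper", url="http://example.com")
print(calls)  # the receiver has already run by the time send() returns
```

Because receivers run inline, a slow API call inside a signal handler blocks the caller; dispatching a Celery task from the handler instead keeps the handler fast.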

Related

Can I edit cronjobs using my Django application?

I am creating a Django application for a school project. I want to schedule jobs (every workday at 9:00 and 17:00).
I am trying to do it with Celery right now, but I am stuck on it, and as the deadline is in sight I want to use an alternative option: just a cronjob. I think a plain cronjob would work fine, but the user should be able to edit the times of the cronjobs from the Django web application (so without logging in over SSH and editing the crontab manually).
Is this possible? I can't find anything about it on the internet.
You need the django-celery-beat plugin, which adds new models to the Django admin named "Periodic tasks", where you can manage the cron schedules for your tasks.
Alternatively, if you really do not want to run a background task, you can create Django management commands and use a library like python-crontab to add/modify/remove cron jobs on the system.
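The django-celery-beat route can be sketched as below. The ORM calls need a configured Django project, so they live inside the function; the task paths and names are placeholders. The cron fields themselves are plain data:

```python
# Sketch: a workday 9:00/17:00 schedule via django-celery-beat.
# Requires a configured Django project for the ORM calls; the task
# paths and schedule names below are hypothetical examples.

def workday_cron_fields(hour):
    """Cron fields for 'every workday (Mon-Fri) at <hour>:00'."""
    return {
        "minute": "0",
        "hour": str(hour),
        "day_of_week": "1-5",  # Monday through Friday
        "day_of_month": "*",
        "month_of_year": "*",
    }

def schedule_task(task_path, hour, name):
    """Create (or reuse) a CrontabSchedule and attach a PeriodicTask."""
    from django_celery_beat.models import CrontabSchedule, PeriodicTask

    schedule, _ = CrontabSchedule.objects.get_or_create(
        **workday_cron_fields(hour)
    )
    return PeriodicTask.objects.get_or_create(
        name=name,
        defaults={"task": task_path, "crontab": schedule},
    )

# e.g. schedule_task("myapp.tasks.morning_job", 9, "morning-run")
#      schedule_task("myapp.tasks.evening_job", 17, "evening-run")
```

Since these are ordinary model rows, a user-facing form can edit the `CrontabSchedule` hour/minute fields, which covers the "edit times from the web app" requirement.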

Access to Django ORM from remote Celery worker

I have a Django application and a Celery worker, each running on its own server.
Currently, the Django app uses SQLite to store the data.
I'd like to access the database using Django's ORM from the worker.
Unfortunately, it is not completely clear to me; thus I have some questions.
Is it possible without hacks/workarounds? I'd like a simple solution (I would not like to implement a REST interface for object access). I imagine this could be achieved if I started using a PostgreSQL instance accessible from both servers.
Which project files (there's just Django + a tasks.py file) are required on the worker's machine?
Could you provide me with an example or tutorial? I tried looking it up but found only tutorials/answers about local Celery workers.
I have been searching for ways to do this simply, but... Your best option is to attach a callback to the task function that calls another function on the Django server to carry out the database update.
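As the question itself suggests, the straightforward route is to point both machines at one shared PostgreSQL instance; the worker's copy of the project then only needs the settings module, the app's models, and tasks.py, and can use the ORM directly. A sketch of the shared settings (the hostname, database name, and credentials are placeholders):

```python
# settings.py on BOTH machines: point at one shared PostgreSQL instance.
# "db.internal.example.com" and the credentials below are placeholders.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "myproject",
        "USER": "myproject",
        "PASSWORD": "secret",
        "HOST": "db.internal.example.com",  # reachable from both servers
        "PORT": "5432",
    }
}

# On the worker, initialize Django before importing models in tasks.py:
# import os, django
# os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
# django.setup()
# from myapp.models import ScrapedItem  # hypothetical model
```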

User management of periodic tasks in django celery app

Django Celery (djcelery) is considered obsolete these days, but to me it still seems to offer some unique features. For example, with djcelery one can let the user add a periodic task easily by creating an instance of its PeriodicTask model. Is this the best way to do it? If not, how else could I allow users to manage periodic tasks in a Django + Celery app?
(Just to be clear, I know I can hardcode periodic tasks in the celery beat schedule in settings.py with djcelery, but I'm talking about user management of periodic tasks, i.e. creating/updating/listing/deleting them, via the admin or some mini app that interfaces with a model like djcelery's PeriodicTask model.)

Alternative to django-celery to perform asynchronous task in Django?

In my admin I have a form that allows uploading a file to fill the DB.
Parsing the file and filling the DB take a long time, so I'd like to do it asynchronously.
As recommended by several SO users I tried to install python-celery, but I can't manage to do it (I'm on WebFaction).
Is there any simple, easy-to-install alternative?
If WebFaction supports cron jobs, you can create your own pseudo-broker. You could save your long-running tasks to a 'tasks' table in the DB, which would allow you to return a response to the user instantaneously. Then a cron job that runs very frequently could look for uncompleted tasks and process them.
I believe this is what django-mailer does:
https://github.com/jtauber/django-mailer/
https://stackoverflow.com/a/1419640/594589
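The pseudo-broker idea above can be sketched with a plain task table (SQLite here so the sketch is self-contained; the table and column names are made up):

```python
# Sketch of a cron-driven pseudo-broker: the web request inserts a row
# and returns immediately; a frequent cron job processes pending rows.
import sqlite3

def init(conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS tasks ("
        "id INTEGER PRIMARY KEY, payload TEXT, status TEXT DEFAULT 'pending')"
    )

def enqueue(conn, payload):
    # Called from the request handler: cheap, so the response is instant.
    conn.execute("INSERT INTO tasks (payload) VALUES (?)", (payload,))
    conn.commit()

def process_pending(conn, handler):
    # Called by cron: drain all uncompleted tasks, marking each done.
    rows = conn.execute(
        "SELECT id, payload FROM tasks WHERE status = 'pending'"
    ).fetchall()
    for task_id, payload in rows:
        handler(payload)  # e.g. parse the uploaded file and fill the DB
        conn.execute("UPDATE tasks SET status = 'done' WHERE id = ?", (task_id,))
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
init(conn)
enqueue(conn, "uploaded-file-1.csv")
done = process_pending(conn, handler=lambda payload: None)
print(done)  # 1
```

In a Django project the table would naturally be a model and `process_pending` a management command invoked from the crontab.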
Try Gearman along with its Python client library.
It's very easy to set up and run Gearman; try a few of the examples.

Django and services

I'm building a simple website with Django that requires constant monitoring of text-based data from another website; that's the way it has to be.
How could I run this service on my web host alongside Django? Would I have to start a separate app and run it via SSH so that it updates the database used by Django, or is there an easier/better way?
You could use celery to schedule a job that would read data from that other website and do whatever you want with it.
As an alternative to Celery, you could also create a cron job that executes a custom django-admin command. That would give you full access to your Django install and ORM. The downside is that cron's smallest time resolution is 1 minute, so if you need it to be real-time, you won't be able to do that.
If you do need real-time behavior, then writing a Python daemon might be a better option.
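The cron + management command route might look like the sketch below. The change-detection helpers are plain Python; the Django wiring is shown in comments because it needs a configured project, and the command name `poll_site` and the fetch/update helpers are hypothetical.

```python
# Change-detection core for a cron-driven monitoring command.
import hashlib

def fingerprint(text):
    """Stable hash of the monitored page, used to detect changes."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def page_changed(previous_hash, current_text):
    """True when the page differs from the previously stored hash."""
    return previous_hash != fingerprint(current_text)

# In myapp/management/commands/poll_site.py (hypothetical layout):
#
# from django.core.management.base import BaseCommand
# class Command(BaseCommand):
#     def handle(self, *args, **options):
#         text = fetch_page()                    # hypothetical HTTP fetch
#         if page_changed(load_last_hash(), text):
#             update_database(text)              # hypothetical ORM update
#
# Crontab entry (every minute, cron's smallest resolution):
# * * * * * /path/to/venv/bin/python /path/to/manage.py poll_site
```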