Simple custom queue without 3rd party software - django

First time posting here, so my apologies if something is incorrect.
I am trying to build a simple queue system using Python's inbuilt Queue class in a Django application.
Python Queue Documentation
https://docs.python.org/3/library/queue.html
I think I know how to construct and initialize a queue on site startup, so there is only one queue I add jobs to, based on this post: Execute code when Django starts ONCE only?
However, my problem and question is after constructing a queue on Django site startup how would I grab that queue from my various Django apps in order to add jobs to it in the Views files?
I don't see a way to get the constructed queue. I could construct a queue at the time I want to add a job, but then I would run into the issue where each job added would construct its own queue, which is not the functionality I want.
So far I have a queue_construction.py file in the base dir of my Django site with some test functions in it, and it works great when I execute it on the command line. Now I need to get it working in the Django environment. I do not want to use Celery or any other 3rd-party software to do this.
Thanks in advance
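For what it's worth, one common way to get a single shared queue is to rely on the fact that Python caches imported modules, so a module-level Queue is effectively a process-wide singleton. Below is a minimal sketch of what queue_construction.py could look like; the function names and the idea of starting the worker from an AppConfig.ready() hook are illustrative assumptions, not something from the original question:

```python
# queue_construction.py -- a minimal sketch of a process-wide job queue.
import queue
import threading

# Module-level singleton: Python caches imported modules, so every view
# that does `from queue_construction import job_queue` gets this object.
job_queue = queue.Queue()

def _worker():
    # Pull (callable, args) pairs off the queue forever.
    while True:
        func, args = job_queue.get()  # blocks until a job is available
        try:
            func(*args)
        finally:
            job_queue.task_done()

def start_worker():
    # In Django you would call this once, e.g. from AppConfig.ready();
    # daemon=True lets the process exit without joining the thread.
    t = threading.Thread(target=_worker, daemon=True)
    t.start()
    return t

if __name__ == "__main__":
    results = []
    start_worker()
    job_queue.put((results.append, ("job ran",)))
    job_queue.join()  # wait until the job is processed
    print(results)    # ['job ran']
```

A view would then just do `from queue_construction import job_queue` and `job_queue.put(...)`; every app sees the same queue as long as everything runs in one process.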

Related

Options for running on user demand asynchronous / background tasks in Django?

My Django app generates a complex report that can take up to 5 minutes to create. Therefore it runs once a night using a scheduled management command.
That's been ok, except I now want the user to be able to select the date range for the report, which means the report needs to be created while the user waits.
What are my options for running the task in the background? So far I've found these:
Celery - might work but is complex
django-background-tasks looks like the right tool for the job but hasn't been updated for years; the last supported Django version is 2.2
The report/background task could be generated by AWS Lambda, basically as a microservice. Django calls the microservice, which executes the background task and then calls the Django app back once finished. This is what I did last time, but I'm not sure it would work now, as I'd need to send the microservice 10 MB of data to process.
Use subprocess.Popen, which someone here said worked for them, but other reports say it doesn't work from Django.
EDIT: Looks like Django 3.1 onwards supports async views, which may be the simple solution for this.
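For completeness, the simplest zero-dependency option is a background thread started from the view: the view returns immediately while the report runs, and a second view (not shown) polls for the finished result. A rough sketch, with made-up names and a dict standing in for wherever the real app would store the report:

```python
import threading

# Stand-in for persistent storage of finished reports; a real app would
# write to the database instead of a module-level dict.
REPORTS = {}

def generate_report(report_id, date_range):
    # Placeholder for the 5-minute report generation.
    REPORTS[report_id] = f"report for {date_range}"

def start_report(report_id, date_range):
    # Called from the view: kick off the work and return immediately.
    t = threading.Thread(target=generate_report,
                         args=(report_id, date_range), daemon=True)
    t.start()
    return t

t = start_report("r1", "2024-01-01..2024-01-31")
t.join()  # only for the demo; a real view would not wait
print(REPORTS["r1"])  # report for 2024-01-01..2024-01-31
```

The usual caveat applies: threads die with the worker process, so anything in flight is lost on a restart or deploy, which is part of why tools like Celery exist.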

Alternative to django-celery to perform asynchronous task in Django?

In my admin I have a form allowing to upload a file to fill the DB.
Parsing and filling the DB take a long time, so I'd like to do it asynchronously.
As recommended by several SO users I tried to install python-celery, but I can't manage to do it (I'm on Webfaction).
Is there any simple, easy-to-install alternative?
If Webfaction supports cron jobs, you can create your own pseudo-broker: save your long-running tasks to a 'tasks' table in the DB, which lets you return a response to the user instantaneously. Then a cron job that runs very frequently can look for uncompleted tasks and process them.
I believe this is what django-mailer does
https://github.com/jtauber/django-mailer/
https://stackoverflow.com/a/1419640/594589
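A rough sketch of that pseudo-broker idea, using plain sqlite3 here purely for illustration; in Django the 'tasks' table would be a model, enqueue() would live in the view, and process_pending() would be a management command run from cron:

```python
import sqlite3

# In-memory DB for the demo; Django would use its configured database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, "
             "payload TEXT, done INTEGER DEFAULT 0)")

handled = []

def handle(payload):
    # Placeholder for the real long-running work (parsing, DB fill, ...).
    handled.append(payload)

def enqueue(payload):
    # Called from the view: record the task and return immediately.
    conn.execute("INSERT INTO tasks (payload) VALUES (?)", (payload,))
    conn.commit()

def process_pending():
    # Called from cron every minute: pick up uncompleted tasks.
    rows = conn.execute(
        "SELECT id, payload FROM tasks WHERE done = 0").fetchall()
    for task_id, payload in rows:
        handle(payload)
        conn.execute("UPDATE tasks SET done = 1 WHERE id = ?", (task_id,))
    conn.commit()
    return len(rows)

enqueue("parse upload 1")
enqueue("parse upload 2")
print(process_pending())  # 2
```

Marking a task done only after it is handled means a crashed run gets retried on the next cron tick, which is most of what a simple broker buys you.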
Try Gearman along with its Python client library
It's very easy to set up and run Gearman. Try a few examples.

Django: can functions within views run continuously even as other requests are made?

I'm trying to create a function that, when called, will extract information from an external source at irregular (and undefined) intervals. This data will then be placed in a database for later retrieval. I want this to run in the background even as other page requests are made. Is this possible?
The best way to run a Django function outside the request/response cycle is to implement it as a custom management command, which you can then set to run periodically using cron.
If you're already using it, celery supports periodic tasks using celerybeat, but this requires configuring and running the celerybeat daemon, which can be a headache. Celery also supports long-running tasks (things started in a view, but completing in their own time), as described in your question title.
Since you seem to need the function to be called when a page is loaded, you can put it inside your view:

def my_view(request):
    # Call the long-running function
    long_running_function()
    # Do view logic and return
    return HttpResponse(...)
To handle the long_running_function you could use celery and create a tasks.py which implements your external data source logic. Creating tasks, adding them to the queue, and configuring celery are summarized here
If you just need a simpler solution for trying it out, take a look at the subprocess module.
A very similar answer here Django: start a process in a background thread?
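If you go the subprocess route mentioned above, the shape is roughly this; sys.executable with -c is used only so the demo is self-contained, and in a real view you would launch something like ["python", "manage.py", "my_command"] (the command name being yours, not a real one here):

```python
import subprocess
import sys

def launch_background_job(arg):
    # Fire and forget: Popen returns immediately while the child runs,
    # so the view can send its HttpResponse without waiting.
    proc = subprocess.Popen(
        [sys.executable, "-c", f"print('working on {arg}')"],
        stdout=subprocess.PIPE, text=True)
    return proc

proc = launch_background_job("job-1")
out, _ = proc.communicate()  # demo only; a real view would not wait
print(out.strip())           # working on job-1
```

The child process survives the request/response cycle, but you get no retries or result tracking for free, which is the usual argument for a task queue instead.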

Django + execute asynchronous process?

I am implementing a feature in a new project and I was wondering what the optimal solution to it would be. The feature consists of the following sub-functionality: starting a process, stopping a process, and checking if the process is running, all done in a non-blocking way with Django. I am trying to avoid stuff like RabbitMQ, etc. I was thinking maybe of using threading or cron.
EDIT: this functionality needs to be triggered from a view.
Any comments or suggestions are the most welcomed. Thanks.
You can surely use celery with the database backend instead of RabbitMQ. Personally, for simple tasks I tend to just write a custom management command launched from cron that gets its input from some database table (i.e. a Django model) which is populated by Django view(s).

Rather than using crontab, can Django execute something automatically at a predefined time

How can I make Django execute something automatically at a particular time?
For example, my Django application has to upload files via FTP to remote servers at predefined times. The FTP server addresses, usernames, passwords, times, days, and frequencies have been defined in a Django model.
I want to run a file upload automatically based on the values stored in the model.
One way to do this is to write a Python script and add it to the crontab. This script runs every minute and keeps an eye on the time values defined in the model.
The other thing I can roughly think of is Django signals. I'm not sure if they can handle this issue. Is there a way to generate signals at predefined times? (I haven't read about them in depth yet.)
Just for the record: there is also celery, which allows you to schedule messages for future dispatch. It's, however, a different beast than cron, as it requires/uses RabbitMQ and is meant for message queues.
I have been thinking about this recently and have found django-cron which seems as though it would do what you want.
Edit: Also, if you are not specifically looking for a Django-based solution, I have recently used scheduler.py, which is a small single-file script that works well and is simple to use.
I've had really good experiences with django-chronograph.
You need to set up one crontab task to call the chronograph management command, which then runs other custom management commands based on an admin-tweakable schedule.
The problem you're describing is best solved using cron, not Django directly. Since it seems that you need to store data about your FTP uploads in your database (using Django to access it for logs, graphs, or whatever), you can make a Python script that uses Django and runs via cron.
James Bennett wrote a great article on how to do this which you can read in full here: http://www.b-list.org/weblog/2007/sep/22/standalone-django-scripts/
The main gist of it is that you can write standalone Django scripts that cron can launch and run periodically, and these scripts can fully utilize your Django database, models, and anything else they want to. This gives you the flexibility to run whatever code you need and populate your database, while not trying to make Django do something it wasn't meant to do (Django is a web framework and is event-driven, not time-driven).
Best of luck!