I am building a web app with Django that has two parts:

1. deploy Docker containers and store their expiry time in the database;
2. kill each container at its expiry time, while letting a user of the app choose to extend a container's life.

How do I implement the second part without polling the database? I tried using asyncio and implemented a custom middleware in Django, but it blocks execution. Is there another way to do the job asynchronously?
import asyncio
from threading import Thread


def callback_func(eventloop):
    """
    Check the DB:
    if the expiry time has arrived:
        kill the container
    else:
        register a new callback at the updated expiry time
    """
    # Logic to kill a container goes here
    print("Inside callback")


class KillerMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response
        # Run a private event loop on a daemon thread so scheduled
        # callbacks fire outside the request/response cycle.
        self._eventloop = asyncio.new_event_loop()
        asyncio.set_event_loop(self._eventloop)
        self._t = Thread(target=lambda: self._eventloop.run_forever())
        self._t.daemon = True
        self._t.start()

    def __call__(self, request):
        response = self.get_response(request)
        # Schedule the expiry check 24 hours (86400 s) from now.
        self._eventloop.call_later(86400, callback_func, self._eventloop)
        return response
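For concreteness, the docstring above could be fleshed out roughly like this (a sketch; get_expiry and kill_container are hypothetical helpers):

import time

def callback_func(eventloop, container_id):
    # Hypothetical helpers: get_expiry() reads the expiry timestamp from
    # the DB, kill_container() stops the Docker container.
    expiry = get_expiry(container_id)
    now = time.time()
    if expiry <= now:
        kill_container(container_id)
    else:
        # The lifetime was extended: re-arm the timer for the new expiry.
        eventloop.call_later(expiry - now, callback_func, eventloop, container_id)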
I really don't know much about python-asyncio, but it sounds like Django Channels would be right for this.
Related
In my Django project I have made a view class based on TemplateView. I am also using Django Channels and have made a consumer class. Now I am trying to use a Celery worker to pull queryset data whenever a user refreshes the page. The problem is that if the user refreshes the page again before the task finishes, another task is created, which causes overload.
So I used revoke to terminate the previously running task. But I see that revoke permanently revokes the task id, and I don't know how to clear it, because I want to run the task again whenever the user calls it.
views.py
class Analytics(LoginRequiredMixin, TemplateView):
    template_name = 'app/analytics.html'
    login_url = '/user/login/'

    def get_context_data(self, **kwargs):
        app.control.terminate(task_id=self.request.user.username + '_analytics')
        print(app.control.inspect().revoked())
        context = super().get_context_data(**kwargs)
        context['sub_title'] = 'Analytics'
        return context
consumers.py
class AppConsumer(AsyncJsonWebsocketConsumer):
    async def connect(self):
        await self.accept()
        analytics_queryset_for_selected_devices.apply_async(
            args=[self.scope['user'].username],
            task_id=self.scope['user'].username + '_analytics'
        )
Right now I am solving the problem in the following way: in consumers.py I added a disconnect method that revokes the task when the WebSocket is closed.
counter = 0

class AppConsumer(AsyncJsonWebsocketConsumer):
    async def connect(self):
        await self.accept()
        analytics_queryset_for_selected_devices.apply_async(
            args=[self.scope['user'].username],
            task_id=self.scope['user'].username + str(counter))

    async def disconnect(self, close_code):
        global counter
        app.control.terminate(task_id=self.scope['user'].username + str(counter),
                              signal='SIGKILL')
        counter += 1
        await self.close()
counter is used to build a new unique task id. But with this method every request adds a new task id to the revoke list, which grows in memory. To limit the issue I capped the revoke list size at 20:
from celery.utils.collections import LimitedSet
from celery.worker import state
state.revoked = LimitedSet(maxlen=20, expires=3600)
I have a view which uses threading, e.g.
import threading

from django.contrib import messages
from django.shortcuts import redirect, render

from .forms import my_model_form  # assumed location of the form class
from .utils import my_heavy_function

def my_view(request):
    if request.method == "POST":
        form = my_model_form(request.POST)
        if form.is_valid():
            # create and start a background thread for the heavy work
            thr = threading.Thread(target=my_heavy_function, args=(form,))
            thr.start()
            messages.success(request, "Processing ...")
            return redirect("my_template")
    else:
        form = my_model_form()
    return render(request, "my_app/my_template.html")
and it works like a charm; it processes my_heavy_function in the background while letting the user continue using the webpage. I just need a way to show a message when my_heavy_function is done.
Is there a way to make Django display a message even when a new request is not made, but based on some other condition? E.g. on a page when a file is done loading, etc.
(I have on purpose not used Django-Q, Celery, or background-tasks for this threading since I find them overkill.)
It's not very clear what you want to do. If you want to notify the user without them making an HTTP request, your only bet in web technologies is to set up a WebSocket so you can push things from the server.
If it's OK for the user to get the message the next time they open a page, you can put something in the DB when your heavy task is done. Then on each request you check whether there is something in the DB for that user; if so, call messages.add_message and remove that row from the DB.
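A minimal sketch of that DB-flag idea, assuming a hypothetical PendingMessage model with user, level and text fields:

from django.contrib import messages

from .models import PendingMessage  # hypothetical model filled in by the background task

class PendingMessageMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        if request.user.is_authenticated:
            # Flush any messages the background task left for this user.
            for pending in PendingMessage.objects.filter(user=request.user):
                messages.add_message(request, pending.level, pending.text)
                pending.delete()
        return self.get_response(request)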
If you want to show the message with the result of an action that finished after the response to the request was returned, you have this option to show it synchronously.
The way it works: the next time the user requests a resource, this implementation checks for any messages for this user that were created in the meantime (e.g. after your async code finished execution and added a message).
This solution builds a superstructure on top of the synchronous Django messaging framework, using a simple Memcached container.
Install memcached as your cache backend.
docker run -p 11211:11211 --name local-memcache -d memcached memcached -m 64
Then pip install django-async-messages django-pymemcache.
Add this to your middleware in settings.py file:
'async_messages.middleware.AsyncMiddleware'
Ensure it comes after 'django.contrib.messages.middleware.MessageMiddleware'
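For example, the relevant part of MIDDLEWARE might look like this:

MIDDLEWARE = [
    # ...
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
    # must come after MessageMiddleware:
    'async_messages.middleware.AsyncMiddleware',
]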
Then add this to your settings.py file:
CACHES = {
    'default': {
        'BACKEND': 'djpymemcache.backend.PyMemcacheCache',
        'LOCATION': [
            '127.0.0.1:11211',
        ],
    },
}
Where you want to use this async messaging, do from async_messages import message_user and substitute your classic messages.add_message(...) calls with message_user(request.user, "your message"), where the first argument is a user object.
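For example, adapting the question's my_heavy_function (a sketch; passing the user in explicitly is an assumption about how you resolve the recipient):

from async_messages import message_user

def my_heavy_function(form, user):
    # ... the long-running work ...
    # notify the user once done; message_user stores the message
    # until their next request
    message_user(user, "Your processing is done.")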
Finally, go into the django-async-messages package itself; it is slightly obsolete and needs a small update.
Locate the middleware.py file in the package (likely in venv/Lib/site-packages/async_messages/middleware.py)
Change it from this:
from django.contrib import messages
from async_messages import get_messages

class AsyncMiddleware(object):
    def process_response(self, request, response):
        """
        Check for messages for this user and, if it exists,
        call the messages API with it
        """
        if hasattr(request, "session") and hasattr(request, "user") and request.user.is_authenticated():
            msgs = get_messages(request.user)
            if msgs:
                for msg, level in msgs:
                    messages.add_message(request, level, msg)
        return response
to this:
from django.contrib import messages
from async_messages import get_messages
from django.utils.deprecation import MiddlewareMixin

class AsyncMiddleware(MiddlewareMixin):
    def process_response(self, request, response):
        """
        Check for messages for this user and, if it exists,
        call the messages API with it
        """
        if hasattr(request, "session") and hasattr(request, "user") and request.user.is_authenticated:
            msgs = get_messages(request.user)
            if msgs:
                for msg, level in msgs:
                    messages.add_message(request, level, msg)
        return response
That is it - you have asynchronous messaging!
PS: because you just edited a package, which would be overwritten when deploying, I extracted the package and, with the above changes, included it directly in my project structure.
I have a Django app that uses django-wkhtmltopdf to generate PDFs on Heroku. Some of the responses exceed the 30-second timeout. Because this is a proof of concept running on the free tier, I'd prefer not to tear apart what I have to move to a worker/poll process. My current view looks like this:
def dispatch(self, request, *args, **kwargs):
    do_custom_stuff()
    return super(MyViewClass, self).dispatch(request, *args, **kwargs)
Is there a way I can override the dispatch method of the view class to fake a streaming response like this, or with the Empty Chunking approach mentioned here, to send an empty response until the PDF is rendered? Sending an empty byte will restart the timeout, giving plenty of time to send the PDF.
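For what it's worth, a rough sketch of that empty-chunking idea with Django's StreamingHttpResponse, assuming a hypothetical render_pdf() that produces the PDF bytes. Note two caveats: WSGI servers may drop truly empty chunks, so a space is yielded instead, and strict PDF viewers may object to the leading padding.

import threading
import time
from django.http import StreamingHttpResponse

def pdf_view(request):
    result = {}

    def render():
        result['pdf'] = render_pdf(request)  # hypothetical: the wkhtmltopdf call

    worker = threading.Thread(target=render)
    worker.start()

    def stream():
        # Keep the connection alive while the PDF renders, then send it.
        while worker.is_alive():
            yield b' '
            time.sleep(5)
        yield result['pdf']

    return StreamingHttpResponse(stream(), content_type='application/pdf')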
I solved a similar problem using Celery, something like this.
from django.http import HttpResponse

def start_long_process_view(request, pk):
    task = do_long_processing_stuff.delay()
    # return the task id as JSON so the client can poll for status
    return HttpResponse(f'{{"task": "{task.id}"}}')
Then you can have a second view that checks the task state:
from celery.result import AsyncResult
from django.http import HttpResponse

def check_long_process(request, task_id):
    result = AsyncResult(task_id)
    return HttpResponse(f'{{"state": "{result.state}"}}')
Finally, using JavaScript you can fetch the status right after the task is started. Updating every half second will be more than enough to give your users good feedback.
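The two views might be wired up like this (URL names and paths are illustrative):

from django.urls import path
from . import views

urlpatterns = [
    path('process/start/<int:pk>/', views.start_long_process_view, name='start-long-process'),
    path('process/status/<str:task_id>/', views.check_long_process, name='check-long-process'),
]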
If you think Celery is too much, there are lighter alternatives that will work just great: https://djangopackages.org/grids/g/workers-queues-tasks/
I need to import data from several public APIs for a user after he signs up. django-allauth is included, and I have registered a signal handler to call the right methods after allauth emits user_signed_up.
Because the data import takes too much time and the request is blocked by the signal, I want to use Celery to do the work.
My test task:
from time import sleep

@app.task()
def test_task(username):
    print('##########################Foo#################')
    sleep(40)
    print('##########################' + username + '#################')
    sleep(20)
    print('##########################Bar#################')
    return 3
I'm calling the task like this:
from allauth.account.signals import user_signed_up
from django.dispatch import receiver

from game_studies_platform.taskapp.celery import test_task

@receiver(user_signed_up)
def on_user_signed_in(sender, request, *args, **kwargs):
    test_task.apply_async(args=['John Doe'])
The task should be put onto the queue and the response returned immediately, but the request blocks and I have to wait a minute.
The project is set up with https://github.com/pydanny/cookiecutter-django and I'm running it in a Docker container. Celery is configured to use the Django database in development but will use Redis in production.
The solution was to switch CELERY_ALWAYS_EAGER from True to False in local.py. I was pointed to that solution in the Gitter channel of cookiecutter-django. The calls mentioned above were already correct.
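For reference, the relevant line in local.py:

# With eager mode on, Celery runs tasks synchronously in the calling process,
# which is why the signal handler blocked the request for a full minute.
CELERY_ALWAYS_EAGER = False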
In my Django application I want to keep track of whether a response has been sent to the client successfully. I am well aware that there is no "watertight" way in a connectionless protocol like HTTP to ensure the client has received (and displayed) a response, so this will not be mission-critical functionality, but still I want to do this at the latest possible time. The response will not be HTML so any callbacks from the client (using Javascript or IMG tags etc.) are not possible.
The "latest" hook I can find would be adding a custom middleware implementing process_response at the first position of the middleware list, but to my understanding this is executed before the actual response is constructed and sent to the client. Are there any hooks/events in Django to execute code after the response has been sent successfully?
The method I am going for at the moment uses a subclass of HttpResponse:
from django.http import HttpResponse

# use custom response class to override HttpResponse.close()
class LogSuccessResponse(HttpResponse):
    def close(self):
        super(LogSuccessResponse, self).close()
        # do whatever you want, this is the last codepoint in request handling
        if self.status_code == 200:
            print('HttpResponse successful: %s' % self.status_code)

# this would be the view definition
def logging_view(request):
    response = LogSuccessResponse('Hello World', content_type='text/plain')
    return response
By reading the Django code I am very much convinced that HttpResponse.close() is the latest point to inject code into the request handling. I am not sure if there really are error cases that are handled better by this method compared to the ones mentioned above, so I am leaving the question open for now.
The reasons I prefer this approach to the others mentioned in lazerscience's answer are that it can be set up in the view alone and does not require middleware to be installed. Using the request_finished signal, on the other hand, wouldn't allow me to access the response object.
If you need to do this a lot, a useful trick is to have a special response class like:
from rest_framework import status
from rest_framework.response import Response

class ResponseThen(Response):
    def __init__(self, data, then_callback, **kwargs):
        super().__init__(data, **kwargs)
        self.then_callback = then_callback

    def close(self):
        super().close()
        # runs after the response body has been sent to the client
        self.then_callback()

def some_view(request):
    # ...code to run before response is returned to client

    def do_after():
        # ...code to run *after* response is returned to client
        pass

    return ResponseThen(some_data, do_after, status=status.HTTP_200_OK)
...helps if you want a quick/hacky "fire and forget" solution without bothering to integrate a proper task queue or split off a separate microservice from your app.
I suppose when talking about middleware you are thinking about the middleware's process_request method, but there's also a process_response method that is called when the HttpResponse object is returned. I guess that will be the latest moment where you can find a hook that you can use.
Furthermore there's also a request_finished signal being fired.
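A minimal sketch of hooking that signal (note it does not give you access to the response object):

from django.core.signals import request_finished
from django.dispatch import receiver

@receiver(request_finished)
def on_request_finished(sender, **kwargs):
    # Runs once Django has finished delivering the response.
    print('request finished')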
I modified Florian Ledermann's idea a little bit, so someone can just use HttpResponse normally while still being able to bind a function to a specific HttpResponse.
from django.http import HttpResponse

# keep a reference to the original close() and monkey-patch it
old_response_close = HttpResponse.close
HttpResponse.func = None

def new_response_close(self):
    old_response_close(self)
    # call the per-response hook, if one was attached
    if self.func is not None:
        self.func()

HttpResponse.close = new_response_close
It can be used via:
def myview(request):
    def myfunc():
        print("stuff to do")

    resp = HttpResponse(status=200)
    resp.func = myfunc  # runs after the response is closed
    return resp
I was looking for a way to send a response, then execute some time-consuming code after... but if I can get a background task (most likely Celery) to run, that renders this useless to me: I will just kick off the background task before the return statement. It should be asynchronous, so the response will be returned before the code finishes executing.
---EDIT---
I finally got Celery to work with AWS SQS. I basically posted a "how to". Check out my answer on this post:
Cannot start Celery Worker (Kombu.asynchronous.timer)
I found a filthy trick to do this by accessing a protected member of HttpResponse.
def some_view(request):
    # ...code to run before response is returned to client

    def do_after():
        # ...code to run *after* response is returned to client
        pass

    response = HttpResponse()
    # register do_after as a "resource closer"; it runs when the response is closed
    response._resource_closers.append(do_after)
    return response
It works in Django 3.0.6; check the close() method in the source of HttpResponse:
def close(self):
    for closer in self._resource_closers:
        try:
            closer()
        except Exception:
            pass
    # Free resources that were still referenced.
    self._resource_closers.clear()
    self.closed = True
    signals.request_finished.send(sender=self._handler_class)