How does one perform the following (Django 0.96) dispatcher hooks in Django 1.0?
import logging

import django.core.signals
import django.db
import django.dispatch.dispatcher

def log_exception(*args, **kwds):
    logging.exception('Exception in request:')

# Log errors.
django.dispatch.dispatcher.connect(
    log_exception, django.core.signals.got_request_exception)

# Unregister the rollback event handler.
django.dispatch.dispatcher.disconnect(
    django.db._rollback_on_exception,
    django.core.signals.got_request_exception)
Incidentally, this code is from Google's Article on Using Django on GAE. Unfortunately the dispatch code in Django was rewritten between 0.96 and 1.0, and Google's example does not work with Django 1.0.
Of course, the Django people provided a helpful guide on how to do exactly this migration, but I'm not keen enough to figure it out at the moment. :o)
Thanks for reading.
Brian
The basic difference is that you no longer ask the dispatcher to connect you to some signal, you ask the signal directly. So it would look something like this:
import logging

from django.core.signals import got_request_exception
from django.db import _rollback_on_exception

def log_exception(*args, **kwds):
    logging.exception('Exception in request:')

# Log errors.
got_request_exception.connect(log_exception)

# Unregister the rollback event handler.
got_request_exception.disconnect(_rollback_on_exception)
Hey guys, I am quite new to django and django channels, and I have a small question about channels. For example, if I have a post model and I create a post, then for the user to view that post they either need to reload the page or we need to redirect them to the list page to see all the posts.
What if I want to push the newly created post to the front end (the client side) without making them reload the page? Can we use channels for that? And if we use channels for that purpose, do we need to rewrite all the code we wrote in normal views, or will adding a small snippet, like sending a signal on creation and running an async function, do the trick? Is it difficult to implement?
Thanks
What if I want to push the newly created post to the front end (the client side) without making them reload the page?
YES, but with a caveat: you would have to cache all of the posts written before that point if the user (on the client) is on the <post_list> page. This won't be an issue for a small project, but if there are too many posts it will take too long to load.
Can we use channels for that? YES
And if we use channels for that purpose, do we need to rewrite all the code we wrote in normal views, or will adding a small snippet, like sending a signal on creation and running an async function, do the trick?
YES and no. Since you are using [django-rest-framework (DRF)], note that the rest framework only works over the HTTP protocol; with websockets you are on the WS protocol, so to handle those events django-channels has consumers, which play the same role that views do in django & DRF. But you can (and should, to keep the code robust) reuse the serializers that you wrote for [django-rest-framework].
To demonstrate that, in a scenario where a user writes a post and you receive it in your django-channels AsyncConsumer, you can use something like this:
from channels.db import database_sync_to_async

from .serializers import PostSerializer  # assumed location of your DRF serializer

@database_sync_to_async
def save_post(self, data):
    serializer = PostSerializer(data=data)
    serializer.is_valid(raise_exception=True)
    x = serializer.create(serializer.validated_data)  # this will create the post
    return PostSerializer(x).data  # this will return the serialized post data
Since the django-channels AsyncConsumer has all of its event handlers written as async functions, and saving a new post is a synchronous operation, we need to wrap it with @database_sync_to_async (make sure to use the await keyword when calling the save_post function).
To answer the latter half of the question: can you use django-signals?
Yes. Modify the above code to send a django-signal instead of using the serializer directly in the "save_post" function; you can then serialize the data inside the signal handler.
But I believe the above method would do the trick.
From my understanding, you want to notify users about a new post. For that, every user should be connected to the same channel group, and once the save_post function has completed you dispatch an event to that group with the notification.
# inside the signals.py file
from .models import <Post-model>
from django.db.models.signals import post_save
from django.dispatch import receiver
from channels.layers import get_channel_layer
from asgiref.sync import async_to_sync  # since everything in django
                                        # is running synchronously

# When the post gets created, the receiver function below will run.
@receiver(post_save, sender=<Post-model>)
def notify_new_post(sender, instance, created, **kwargs):
    if created:
        channel_layer = get_channel_layer()
        async_to_sync(channel_layer.group_send)(
            group=<group_name>,  # the group/room that will receive the broadcast
            message={
                'type': "<name of the event>",
                'payload': {<whatever notification you want to send>},
                # Before writing that payload, check what type of consumer you
                # are using: with an "AsyncWebsocketConsumer" the payload has to
                # be in the appropriate form, so use json.dumps to convert it to
                # JSON; with an "AsyncJsonWebsocketConsumer" you don't have to
                # do that (this is just a precaution).
            }
        )
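For completeness, here is a minimal sketch of the consumer side that would receive that group event. The class name PostConsumer, the group name "posts", and the event name new_post_notification are placeholders of mine and must match whatever you use for <group_name> and 'type' in the signal above:
from channels.generic.websocket import AsyncJsonWebsocketConsumer

class PostConsumer(AsyncJsonWebsocketConsumer):
    group_name = "posts"  # must match <group_name> used in group_send above

    async def connect(self):
        # join the broadcast group so this client receives new-post events
        await self.channel_layer.group_add(self.group_name, self.channel_name)
        await self.accept()

    async def disconnect(self, close_code):
        await self.channel_layer.group_discard(self.group_name, self.channel_name)

    async def new_post_notification(self, event):
        # called for group messages whose 'type' is "new_post_notification";
        # forward the payload to the websocket client as JSON
        await self.send_json(event["payload"])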
Hope this helps; if not, keep asking on this thread.
I have djoser integrated in my django project, and I need to create a stripe customer_id on account activation. How can I do this?
I've been searching the djoser docs, but there is nothing about customizing activation or passing a callback method.
Djoser provides a user_activated signal. It is usable just like an ordinary django signal.
It's undocumented but working.
Example usage
from django.dispatch import receiver
from djoser.signals import user_activated

@receiver(user_activated)
def my_handler(user, request, **kwargs):
    # do what you need here
    ...
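Applied to the original question, a handler along these lines could create the Stripe customer on activation. This is only a sketch: it assumes the stripe package is installed and configured, and that your user model (or profile) has a stripe_customer_id field.
import stripe
from django.dispatch import receiver
from djoser.signals import user_activated

@receiver(user_activated)
def create_stripe_customer(user, request, **kwargs):
    # assumes stripe.api_key is configured elsewhere and the user model
    # has a stripe_customer_id field -- adjust to your project
    customer = stripe.Customer.create(email=user.email)
    user.stripe_customer_id = customer.id
    user.save(update_fields=["stripe_customer_id"])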
I'm writing tests for django views, and some of the views make external HTTP requests. While running the tests I don't want to execute these HTTP requests, since during tests the data being used is dummy data and these HTTP requests will not behave as expected.
What are the possible options for this?
You could override settings in your tests and then check for that setting in your view. Here are the docs to override settings.
from django.conf import settings

if not settings.TEST_API:
    ...  # api call here
Then your test would look something like this
from django.test import TestCase, override_settings

class LoginTestCase(TestCase):
    @override_settings(TEST_API=True)
    def test_api_func(self):
        ...  # Do test here
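For the check in the view to work outside of tests, TEST_API also needs a default value in your settings module; presumably just:
# settings.py
TEST_API = False  # flipped to True by @override_settings in tests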
Since it would be fairly messy to have those all over the place I would recommend creating a mixin that would look something like this.
from django.conf import settings

class SensitiveAPIMixin(object):
    def api_request(self, url, *args, **kwargs):
        request = None
        if not settings.TEST_API:
            request = api_call(url)  # Do the api request in here
        return request
Then, through the power of multiple inheritance, in the views where you need to make a request to this api you could do something similar to this.
from django.views import generic

class View(generic.ListView, SensitiveAPIMixin):
    def get(self, request, *args, **kwargs):
        data = self.api_request('http://example.com/api1')
This is where mocking comes in. In your tests, you can use libraries to patch the parts of the code you are testing to return the results you expect for the test, bypassing what that code actually does.
You can read a good blog post about mocking in Python here.
If you are on Python 3.3 or later, the mock library is included in the standard library as unittest.mock. If not, you can download it from PyPI.
The exact details of how to mock the calls you're making will depend on what exactly your view code looks like.
Ben is right on, but here's some pseudo-ish code that might help. The patch here assumes you're using requests, but change the path as necessary to mock out what you need.
from unittest import mock

from django.test import TestCase
from django.core.urlresolvers import reverse

class MyTestCase(TestCase):
    @mock.patch('requests.post')  # this is all you need to stop the API call
    def test_my_view_that_posts_to_an_api(self, mock_post):
        response = self.client.get(reverse('my-view-name'))
        self.assertEqual('my-value', response.data['my-key'])
        # other assertions as necessary
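If the view actually consumes the API response, you will usually also want to control what the patched call returns. A sketch along the same lines (the stubbed payload and view name here are made up for illustration):
from unittest import mock

from django.test import TestCase
from django.core.urlresolvers import reverse

class MyStubbedApiTestCase(TestCase):
    @mock.patch('requests.post')
    def test_my_view_with_a_stubbed_response(self, mock_post):
        # make the patched requests.post return a canned response
        mock_post.return_value.status_code = 200
        mock_post.return_value.json.return_value = {'my-key': 'my-value'}
        response = self.client.get(reverse('my-view-name'))
        mock_post.assert_called_once()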
I use:
Celery
Django-Celery
RabbitMQ
I can see all my tasks in the Django admin page, but at the moment it has just a few states, like:
RECEIVED
RETRY
REVOKED
SUCCESS
STARTED
FAILURE
PENDING
It's not enough information for me. Is it possible to add more details about a running process to the admin page? Like a progress bar or a finished-jobs counter, etc.
I know how to use the Celery logging function, but a GUI is better in my case for some reasons.
So, is it possible to send some tracing information to the Django-Celery admin page?
Here's my minimal progress-reporting Django backend using your setup. I'm still a Django n00b and it's the first time I'm messing with Celery, so this can probably be optimized.
from time import sleep

from celery import task, current_task
from celery.result import AsyncResult
from django.http import HttpResponse, HttpResponseRedirect
from django.core.urlresolvers import reverse
from django.utils import simplejson as json
from django.conf.urls import patterns, url

@task()
def do_work():
    """ Get some rest, asynchronously, and update the state all the time """
    for i in range(100):
        sleep(0.1)
        current_task.update_state(state='PROGRESS',
                                  meta={'current': i, 'total': 100})

def poll_state(request):
    """ A view to report the progress to the user """
    if 'job' in request.GET:
        job_id = request.GET['job']
    else:
        return HttpResponse('No job id given.')
    job = AsyncResult(job_id)
    data = job.result or job.state
    return HttpResponse(json.dumps(data), mimetype='application/json')

def init_work(request):
    """ A view to start a background job and redirect to the status page """
    job = do_work.delay()
    return HttpResponseRedirect(reverse('poll_state') + '?job=' + job.id)

urlpatterns = patterns('webapp.modules.asynctasks.progress_bar_demo',
    url(r'^init_work$', init_work),
    url(r'^poll_state$', poll_state, name="poll_state"),
)
I am starting to figure this out myself. Start by defining a PROGRESS state exactly as explained in the Celery user guide; then all you need is to insert a bit of js in your template that updates your progress bar.
Thanks @Florian Sesser for your example!
I made a complete Django app that shows users the progress of creating 1000 objects, at http://iambusychangingtheworld.blogspot.com/2013/07/django-celery-display-progress-bar-of.html
Everyone can download and use it!
I would recommend a library called celery-progress for this. It is designed to make it as easy as possible to drop a basic end-to-end progress bar setup into a django app with minimal scaffolding, while also supporting heavy customization on the front end if desired. There are lots of docs and references for getting started in the README.
Full disclosure: I am the author/maintainer of said library.
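As a rough idea of what the task side looks like with celery-progress (based on its README; ProgressRecorder is the library's helper, everything else here is a placeholder of mine):
from celery import shared_task
from celery_progress.backend import ProgressRecorder

@shared_task(bind=True)
def process_items(self, item_count):
    # ProgressRecorder stores current/total in the task's result backend
    progress = ProgressRecorder(self)
    for i in range(item_count):
        # ... do one unit of work here ...
        progress.set_progress(i + 1, item_count)
    return 'done'
The library then provides a polling view and a small front-end helper that read this state and render the bar, so very little extra code is needed on the page.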
I've been struggling with this problem for 5 hours and I have a feeling it's a simple solution that I'm just overlooking.
I'm trying to tie in a third party module (Django Activity Stream) that uses a series of senders and receivers to post data about user activity to a database table. Everything is set up and installed correctly, but I get a 'Signal' object has no attribute 'save' error when I try to run it.
I suspect the problem is in my syntax somewhere. I'm just getting started with Signals, so am probably overlooking something a veteran will spot immediately.
In views.py I have:
from django.db.models.signals import pre_save
from actstream import action  # This is the third-party app
from models import Bird

def my_handler(sender, **kwargs):
    action.save(sender, verb='was saved')
    # return HttpResponse("Working Great")

pre_save.connect(my_handler, sender=Bird)

def animal(request):
    animal = Bird()
    animal.name = "Douglas"
    animal.save()
The Django Activity Stream app has this signals.py file:
from django.dispatch import Signal
action = Signal(providing_args=['actor','verb','target','description','timestamp'])
And then this models.py file:
from datetime import datetime
from operator import or_

from django.db import models
from django.db.models.query import QuerySet
from django.core.urlresolvers import reverse
from django.utils.translation import ugettext_lazy as _
from django.utils.timesince import timesince as timesince_
from django.contrib.contenttypes import generic
from django.contrib.contenttypes.models import ContentType
from django.contrib.auth.models import User

from actstream import action

...

def action_handler(verb, target=None, **kwargs):
    actor = kwargs.pop('sender')
    kwargs.pop('signal', None)
    action = Action(actor_content_type=ContentType.objects.get_for_model(actor),
                    actor_object_id=actor.pk,
                    verb=unicode(verb),
                    public=bool(kwargs.pop('public', True)),
                    description=kwargs.pop('description', None),
                    timestamp=kwargs.pop('timestamp', datetime.now()))
    if target:
        action.target_object_id = target.pk
        action.target_content_type = ContentType.objects.get_for_model(target)
    action.save()

action.connect(action_handler, dispatch_uid="actstream.models")
Your main problem is a matter of discipline in maintaining a coding style, or rather, in this case, the lack of it. You will find it easier to identify problems in your code if you do not use the same name to refer to multiple things within the same module; give each object a unique, meaningful name and refer to it using only that name.
The bottom line here is that the docs for that project contain bad code. This line:
action.save(sender, verb='was saved')
isn't ever going to work. The from actstream import action ultimately imports a signal from actstream.signals, and signals do not have, and never have had, a save method. Especially not with such an odd signature of sender, verb.
At first I thought maybe the author had done something odd with subclassing Signal, but after looking at the rest of the codebase, that's just not the case. I'm not entirely sure what the intention of those docs was supposed to be, but the right thing to do in your handler will either be to save a new Action (imported from actstream.models) instance, or to do something with your model.
Sadly, the project's repository has a pretty sorry set of tests/examples, so without downloading and trying the app myself, I can't tell you what needs to happen there. You might try contacting the author or simply try finding a better-documented/better-maintained Activity Streams app.
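For what it's worth, given the signals.py and models.py shown above (where actstream connects its own action_handler to the action signal), one hedged guess at a working handler is to send the action signal rather than call save on it, and to do so on post_save so the Bird instance already has a primary key. A sketch, not verified against the app:
from django.db.models.signals import post_save
from actstream import action
from models import Bird

def my_handler(sender, instance, created, **kwargs):
    # sending the signal lets actstream's action_handler create and
    # save the Action row for us
    action.send(instance, verb='was saved')

post_save.connect(my_handler, sender=Bird)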