I ran into a problem while experimenting with Django signals. I have a project with the following structure:
authexample/
    manage.py
    posts/    # Django app
    func.py   # sender and receiver logic lives here
In the posts app's models.py I created a simple Post model:
class Post(models.Model):
    title = models.CharField(max_length=30)
    body = models.CharField(max_length=50)
In func.py, which lives outside the posts app, I implemented the signal logic with the following code:
from posts.models import Post
from django.db.models.signals import post_save
from django.dispatch import receiver

# my sender function
def func_sender(title, body):
    a = Post(title=title, body=body)
    a.save()

# receiver function
@receiver(post_save, sender=func_sender)
def func_receiver(sender, **kwargs):
    print("article was saved")
Then I tried to create a test article. For this purpose I ran:

python manage.py shell

and in the shell:

from func import *
func_sender("test_title", "test_body")
When this code was executed, the test article was created, but I expected that after the article was saved my receiver function func_receiver would run and print the message in its print statement. Why doesn't this happen? Please guide me.
After some googling I found the solution. The problem was that in the receiver decorator I passed func_sender as the sender keyword argument, but the proper value is the model class Post itself. Here is my solution:
@receiver(post_save, sender=Post)
def func_receiver(sender, **kwargs):
    print("post was saved")
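The fix works because signal dispatch filters receivers by sender identity. Here is a minimal sketch (a toy model, not Django's actual implementation) of that filtering, showing why a receiver connected with sender=func_sender never fires when the model class is the one sending:

```python
class MiniSignal:
    """Toy dispatcher that filters receivers by sender, like post_save does."""
    def __init__(self):
        self._receivers = []  # (callback, sender_filter) pairs

    def connect(self, callback, sender=None):
        self._receivers.append((callback, sender))

    def send(self, sender, **kwargs):
        for callback, wanted in self._receivers:
            # Only call receivers registered for this exact sender (or for any).
            if wanted is None or wanted is sender:
                callback(sender=sender, **kwargs)

class Post:            # stand-in for the model class
    pass

def func_sender():     # stand-in for the sending function
    pass

post_save = MiniSignal()
fired = []
post_save.connect(lambda sender, **kw: fired.append("wrong sender"), sender=func_sender)
post_save.connect(lambda sender, **kw: fired.append("right sender"), sender=Post)

# Saving a Post effectively does post_save.send(sender=Post, ...):
post_save.send(sender=Post)
print(fired)  # ['right sender'] -- the receiver keyed on func_sender never ran
```

Since Django's model save machinery sends post_save with the model class as sender, only receivers connected with that class (or with no sender filter) are invoked.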
I realize there are many other questions related to custom django signals that don't work, and believe me, I have read all of them several times with no luck for getting my personal situation to work.
Here's the deal: I'm using django-rq to manage a lengthy background process that is set off by a particular http request. When that background process is done, I want it to fire off a custom Django signal so that the django-rq can be checked for any job failure/exceptions.
Two applications, both on the INSTALLED_APPS list, are at the same level. Inside of app1 there is a file:
signals.py
import django.dispatch
file_added = django.dispatch.Signal(providing_args=["issueKey", "file"])
fm_job_done = django.dispatch.Signal(providing_args=["jobId"])
and also a file jobs.py
from app1 import signals
from django.conf import settings

jobId = 23
issueKey = "fake"
fileObj = "alsoFake"

try:
    pass
finally:
    signals.file_added.send(sender=settings.SIGNAL_SENDER, issueKey=issueKey, fileName=fileObj)
    signals.fm_job_done.send(sender=settings.SIGNAL_SENDER, jobId=jobId)
then inside of app2, in views.py
from app1.signals import file_added, fm_job_done
from django.conf import settings
import logging

# Set up signal handlers
def fm_job_done_callback(sender, **kwargs):
    print "hellooooooooooooooooooooooooooooooooooo"
    logging.info("file manager job done signal fired")

def file_added_callback(sender, **kwargs):
    print "hellooooooooooooooooooooooooooooooooooo"
    logging.info("file added signal fired")

file_added.connect(file_added_callback, sender=settings.SIGNAL_SENDER, weak=False)
fm_job_done.connect(fm_job_done_callback, sender=settings.SIGNAL_SENDER, weak=False)
I don't get any feedback whatsoever, though, and am at a total loss. I know for a fact that jobs.py is executing, and therefore that the block of code that fires the signals executes as well, since it is in a finally block. (No, the try is not actually empty; I just put pass there for simplicity.) Please feel free to ask for more information and I'll respond asap.
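The weak=False in the connect() calls above is worth understanding. A toy sketch (my own illustration, not Django's actual implementation) of why that flag exists: by default a Signal holds receivers by weak reference, so a handler that nothing else references can be garbage-collected and silently stop firing.

```python
import gc
import weakref

class WeakSignal:
    """Toy dispatcher holding receivers weakly by default, like django.dispatch.Signal."""
    def __init__(self):
        self._receivers = []

    def connect(self, callback, weak=True):
        # weak=True stores only a weak reference; weak=False keeps the callback alive.
        self._receivers.append(weakref.ref(callback) if weak else (lambda: callback))

    def send(self, **kwargs):
        fired = 0
        for ref in self._receivers:
            callback = ref()
            if callback is not None:   # dead weak references are skipped
                callback(**kwargs)
                fired += 1
        return fired

sig = WeakSignal()

def make_handler():
    def handler(**kwargs):
        pass
    sig.connect(handler, weak=True)    # only a weak reference survives this scope

make_handler()
gc.collect()                            # the handler is unreachable and gets collected
print(sig.send())                       # 0: the weakly-held receiver is gone
```

With weak=False (or a module-level handler that stays referenced), the receiver survives and keeps firing.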
Here is the solution for Django > 2.0.

settings.py: change your app's entry in INSTALLED_APPS from 'app2' to 'app2.apps.App2Config'.
app2/apps.py:

from django.apps import AppConfig
from app1.signals import file_added, fm_job_done

class App2Config(AppConfig):
    name = 'app2'

    def ready(self):
        from .views import fm_job_done_callback, file_added_callback
        file_added.connect(file_added_callback)
        fm_job_done.connect(fm_job_done_callback)
Alternatively, use Django's receiver decorator:

from django.dispatch import receiver
from app1.signals import file_added, fm_job_done

@receiver(fm_job_done)
def fm_job_done_callback(sender, **kwargs):
    print("helloooooooooooooo")

@receiver(file_added)
def file_added_callback(sender, **kwargs):
    print("helloooooooooooooo")
Also, I prefer to handle signals in models.py
I have a requirement to write a DB row when a user logs in. The following code is in models.py (at the end of the file, after the model definitions):
models.py:

from django.contrib.auth.signals import user_logged_in
from utils import *

def rec_login(sender, request, user, **kwargs):
    u_audit('some text here', user)

user_logged_in.connect(rec_login)
There's a utilities module which is imported in models.py. The following function in utils.py is called from the handler above:
utils.py:

from app.models import *

def u_audit(msg, u):
    ua = UserLog(action=msg, user=u, actiontime=datetime.now())
    ua.save()
I'm reusing the u_audit() function in several other places (post-login).
When a user logs in, I get a NameError for the UserLog object (i.e. it looks like the model definitions can't be accessed by the signal callback function).
The UserLog object referred to above is just a simple models.Model.
Anyone know what I'm missing?
I've tried putting a simple file write in the callback function in models.py, and the same thing in the u_audit() function instead, and that works fine, so I know the callback is being invoked properly. I've got other signal callback functions registered (all post-login), and they use models and work fine.
Isn't this a circular import? utils refers to models and models to utils; that will not work. Change utils.py to:
from datetime import datetime

def u_audit(msg, u):
    # Deferred import breaks the circular dependency between models and utils
    from app.models import UserLog
    ua = UserLog(action=msg, user=u, actiontime=datetime.now())
    ua.save()
But something like this also suggests that models.py may be a better home for u_audit; you could simply move it there.
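The deferred-import trick can be demonstrated without Django at all. Here is a sketch using two throwaway modules (hypothetical names mod_a and mod_b, standing in for models and utils): the function-level import is resolved at call time, after mod_a has finished loading, so the cycle never raises an error.

```python
import os
import sys
import tempfile
import textwrap

# Create two throwaway modules in a temporary directory.
tmp = tempfile.mkdtemp()
sys.path.insert(0, tmp)

with open(os.path.join(tmp, "mod_a.py"), "w") as f:
    f.write(textwrap.dedent("""
        import mod_b          # top-level import, like models.py importing utils
        VALUE = "from a"
    """))

with open(os.path.join(tmp, "mod_b.py"), "w") as f:
    f.write(textwrap.dedent("""
        def get_value():
            # Deferred import: resolved at call time, after mod_a has
            # finished loading, so the cycle causes no ImportError.
            from mod_a import VALUE
            return VALUE
    """))

import mod_a
print(mod_a.mod_b.get_value())  # from a
```

Had mod_b done `from mod_a import VALUE` at the top level instead, importing mod_a would fail, because VALUE is not yet defined when mod_b starts loading, which is the same shape of failure as the UserLog NameError above.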
I have my models in individual files:

models/
    __init__.py
    event.py
    a_thing.py
    ...
In __init__.py I import each model and then set up the signal handling.
For the Event model I need some post_save handling.
This is the truncated version of __init__.py:
from django.db.models.signals import post_save
from django.dispatch import receiver
from core.models.event import Event

# Event
@receiver(post_save, sender=Event)
def event_post_save(sender, dispatch_uid='nope', **kwargs):
    print kwargs.get('created')
    print '------'
Whenever I save an Event via the console, the message in the post_save handler is printed once, but whenever I use the admin interface it gets printed twice. This may be because I import the models inside admin.py as well.
Is there a workaround for this so that I can save Event objects from the admin interface without the post_save firing twice?
It's probably from Django/Python import silliness. You need dispatch_uid like you have, but I think it needs to be an argument to the decorator, not the handler itself.
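To see what moving dispatch_uid into the decorator buys you, here is a toy dispatcher (my own sketch, not Django's real code) that collapses duplicate registrations under the same uid, which is exactly what stops a handler firing twice when the wiring module is imported more than once. The corrected Django form would be `@receiver(post_save, sender=Event, dispatch_uid='event_post_save')`.

```python
class DedupSignal:
    """Toy dispatcher deduplicating receivers by dispatch_uid, like Django does."""
    def __init__(self):
        self._receivers = {}  # uid (or callback id) -> callback

    def connect(self, callback, dispatch_uid=None):
        key = dispatch_uid if dispatch_uid is not None else id(callback)
        # Re-connecting under the same uid overwrites instead of appending.
        self._receivers[key] = callback

    def send(self, **kwargs):
        for callback in list(self._receivers.values()):
            callback(**kwargs)

calls = []
def handler(**kwargs):
    calls.append(kwargs.get('created'))

post_save = DedupSignal()
post_save.connect(handler, dispatch_uid="event_post_save")
post_save.connect(handler, dispatch_uid="event_post_save")  # e.g. a second import
post_save.send(created=True)
print(len(calls))  # 1: the duplicate registration was collapsed
```

Without a dispatch_uid, a second import that re-executes the connect() call registers a second (distinct) receiver object, and the handler runs twice per save.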
I managed to fix it by moving the signal handling to the views' __init__.py instead of the models' __init__.py.
I have a requirement that whenever a model gets added/changed/deleted, it should send a mail notification. The content will be much like the django_admin_log entries. I just need to extend this functionality in my model to send the mail. Any suggestions?
The admin log (django_admin_log) only tracks changes made through the admin interface; if the model is changed anywhere else, the log is not updated. However, if you are OK with admin-only changes, you can use a combination of the admin log and the post_save signal to do the trick. Put this in your management.py:
from django.db.models.signals import post_save
from django.dispatch import receiver
from django.contrib.admin.models import LogEntry
from django.core.mail import mail_admins
from django.template.loader import render_to_string

# Triggered every time a LogEntry is saved, i.e. every time an admin action is logged.
@receiver(post_save, sender=LogEntry)
def send_notification_email(sender, instance, **kwargs):
    mail_admins(
        subject="model %(model)s has been changed by %(user)s" %
                {'model': instance.content_type, 'user': instance.user},
        message=render_to_string('change_email.html', {'change': instance}))
note to self: wow, django really includes all the batteries :D
You should look at Django's signals. In your case, you'll connect your handlers to the post_save and post_delete signals, for starters. Look through the built-in signal documentation for others you may want to tap. No need to hack into admin.
I've been struggling with this problem for 5 hours and I have a feeling it's a simple solution that I'm just overlooking.
I'm trying to tie in a third-party module (Django Activity Stream) that uses a series of senders and receivers to post data about user activity to a database table. Everything is set up and installed correctly, but I get a "'Signal' object has no attribute 'save'" error when I try to run it.
I suspect the problem is in my syntax somewhere. I'm just getting started with Signals, so am probably overlooking something a veteran will spot immediately.
In views.py I have:

from django.db.models.signals import pre_save
from actstream import action  # This is the third-party app
from models import Bird

def my_handler(sender, **kwargs):
    action.save(sender, verb='was saved')  # this is the line that raises the error
    #return HttpResponse("Working Great")

pre_save.connect(my_handler, sender=Bird)

def animal(request):
    animal = Bird()
    animal.name = "Douglas"
    animal.save()
The Django Activity Stream app has this signals.py file:
from django.dispatch import Signal
action = Signal(providing_args=['actor','verb','target','description','timestamp'])
And then this models.py file:
from datetime import datetime
from operator import or_
from django.db import models
from django.db.models.query import QuerySet
from django.core.urlresolvers import reverse
from django.utils.translation import ugettext_lazy as _
from django.utils.timesince import timesince as timesince_
from django.contrib.contenttypes import generic
from django.contrib.contenttypes.models import ContentType
from django.contrib.auth.models import User
from actstream import action
...
def action_handler(verb, target=None, **kwargs):
    actor = kwargs.pop('sender')
    kwargs.pop('signal', None)
    action = Action(
        actor_content_type=ContentType.objects.get_for_model(actor),
        actor_object_id=actor.pk,
        verb=unicode(verb),
        public=bool(kwargs.pop('public', True)),
        description=kwargs.pop('description', None),
        timestamp=kwargs.pop('timestamp', datetime.now()))
    if target:
        action.target_object_id = target.pk
        action.target_content_type = ContentType.objects.get_for_model(target)
    action.save()

action.connect(action_handler, dispatch_uid="actstream.models")
Part of your problem is a lack of discipline in coding style: you will find it easier to identify problems in your code if you do not use the same name (here, action) to refer to multiple things within the same module. Give each object a unique, meaningful name, and refer to it using only that name.
The bottom line here is that the docs for that project contain bad code. This line:
action.save(sender, verb='was saved')
isn't ever going to work. The from actstream import action ultimately imports a signal from actstream.signals, and signals do not and never have had a save method. Especially not with such an odd signature of sender, verb.
At first I thought maybe the author had done something odd with subclassing Signal, but after looking at the rest of the codebase, that's just not the case. I'm not entirely sure what the intention of those docs was supposed to be, but the right thing to do in your handler will either be to save a new Action (imported from actstream.models) instance, or to do something with your model.
Sadly, the project's repository has a pretty sorry set of tests/examples, so without downloading and trying the app myself, I can't tell you what needs to happen there. You might try contacting the author or simply try finding a better-documented/better-maintained Activity Streams app.
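To make the distinction concrete: a signal is dispatched with .send(...), and it simply has no .save() method. Below is a minimal stand-in for a dispatch-style signal (my own sketch, not actstream's or Django's code) showing the shape of the working pattern: fire the signal with send, and let a connected handler do the persistence.

```python
class FakeSignal:
    """Minimal stand-in for a dispatch signal: fired via .send(), no .save() exists."""
    def __init__(self):
        self._receivers = []

    def connect(self, callback, **kwargs):
        self._receivers.append(callback)

    def send(self, sender, **named):
        # Returns (receiver, response) pairs, mirroring Signal.send's contract.
        return [(cb, cb(sender=sender, **named)) for cb in self._receivers]

action = FakeSignal()
records = []

def action_handler(sender, verb=None, **kwargs):
    # In the real app this is where an Action row would be saved.
    records.append((sender, verb))

action.connect(action_handler)
action.send(sender="Bird", verb="was saved")  # fire the signal: send, not save
print(records)                  # [('Bird', 'was saved')]
print(hasattr(action, "save"))  # False: action.save(...) raises AttributeError
```

So the handler in views.py should either send the signal (action.send(sender, verb='was saved')) and let actstream's connected action_handler persist the Action, or construct and save an Action model instance directly; calling save on the signal object can never work.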