How to not run a Celery task during `python manage.py test` - Django

I have a model, and in its post_save signal I send an email and an SMS to the user. In my tests I create the model multiple times, so the email and SMS get sent multiple times.
I am also planning to write new tests for the SMS and email themselves.
def send_activation_mail_sms(sender, instance, created, **kwargs):
    if created:
        mobile_activation = UserMobileActivation.objects.create(
            user=instance, randomword=randomword(50), ref=ref)
        email_activation = UserEmailActivation.objects.create(
            user=instance, randomword=randomword(50), ref=ref)
        url_email = "{0}view/v1/email/activation/{1}/".format(
            HOSTNAME, email_activation.randomword)
        short_url_email = url_shortener(url_email)
        url_sms = "{0}view/v1/mobile/activation/{1}".format(
            HOSTNAME, mobile_activation.randomword)
        app.send_task("apps.tasks.send_sms",
                      args=[TEXTLOCAL_APIKEY, mobile_activation.stockuser.user.username,
                            'TXTLCL', 'Activate your mobile here {0}'.format(url_sms)])
        app.send_task("apps.tasks.send_email",
                      args=[email_activation.user.user.email, EMAIL_VERIFICATION_SUBJECT,
                            EMAIL_VERIFICATION_TEMPLATE,
                            {"host": HOSTNAME, "verify_email_url": url_email}])
I am already receiving the created arg in the post_save signal; is there any way to pass an extra arg here so that sending the SMS and email is skipped while running python manage.py test? One idea was to use versioning and have a different version of the API for testing, but since no request reaches this signal I cannot check request.version here. Please suggest.

First, set a variable in your settings.py that identifies the environment you are currently working in:
# settings.py
MY_ENV = "DEVELOPMENT"
Then run the Celery tasks / additional scripts based on MY_ENV:
from django.conf import settings

def send_activation_mail_sms(sender, instance, created, **kwargs):
    if created and settings.MY_ENV == "DEVELOPMENT":
        ...  # do your stuff
Django lets us override settings during testing; see the Override Settings documentation. So you can override the MY_ENV value in the test itself.
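For example, a test could flip that flag so the signal skips the Celery calls entirely. A minimal sketch, assuming the signal fires on user creation and your user model accepts these fields (adjust the create call to whatever actually triggers your handler):

from django.contrib.auth import get_user_model
from django.test import TestCase, override_settings

@override_settings(MY_ENV="TESTING")
class ActivationSignalTests(TestCase):
    def test_no_sms_or_email_is_queued(self):
        # The post_save handler checks settings.MY_ENV and only calls
        # app.send_task(...) when it equals "DEVELOPMENT", so creating
        # the instance here queues nothing.
        get_user_model().objects.create_user(username="alice", password="secret")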

Related

Django Celery: Task never executes

In my Django application I am using Celery. In a post_save signal, I update the index in Elasticsearch. But for some reason the task hangs and never actually executes the code.
What I use to run celery:
celery -A collegeapp worker -l info
The Signal:
@receiver(post_save, sender=University)
def university_saved(sender, instance, created, **kwargs):
    """
    University save signal
    """
    print('calling celery task')
    update_university_index.delay(instance.id)
    print('finished')
The task:
@task(name="update_university_index", bind=True, default_retry_delay=5, max_retries=1, acks_late=True)
def update_university_index(instance_id):
    print('updating university index')
The only output I get is calling celery task. After waiting over 30 minutes, it never reaches the other print statements and the view just keeps waiting. Nothing ever shows up in the Celery terminal.
Versions:
Django 3.0,
Celery 4.3,
Redis 5.0.9,
Ubuntu 18
UPDATE:
After doing some testing, using the debug_task defined inside the celery.py file in place of update_university_index does not lead to hanging; it behaves as expected. I thought it might be the app.task vs. task decorator, but it seems that's not it.
@app.task(bind=True)
def debug_task(text, second_value):
    print('printing debug_task {} {}'.format(text, second_value))
This happened to me once; I had made the dumbest error. Django/Celery expects Celery tasks to be specified in a tasks.py file and uses that file for task discovery. After I did that, it worked. Could you provide more insight into the directory structure, e.g. the output of the tree command?
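For reference, the layout Celery's autodiscovery expects looks roughly like this (a sketch, assuming the project is called collegeapp and the app holding University is called university):

collegeapp/
    __init__.py      # usually imports the Celery app so it loads with Django
    settings.py
    celery.py        # app = Celery('collegeapp'); app.autodiscover_tasks()
university/
    __init__.py
    models.py
    tasks.py         # @app.task definitions live here so they get discovered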
This tutorial is for Flask, but the same can be achieved in Django. Where that particular tutorial shines is that after you tell Celery to execute a task, it also gives you a UUID, and you can ping a URL with it to monitor the progress of the task you triggered.
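That idea translates to Django in a few lines. A rough sketch (not the tutorial's code; the views and the tasks import path are assumptions): queue the task, hand its id back, and poll the state from a second endpoint.

from celery.result import AsyncResult
from django.http import JsonResponse

from .tasks import update_university_index  # assumed location of the task

def start_reindex(request, university_id):
    # queue the task and return its id to the caller
    result = update_university_index.delay(university_id)
    return JsonResponse({"task_id": result.id})

def reindex_status(request, task_id):
    # state is PENDING / STARTED / SUCCESS / FAILURE; result holds the return value
    result = AsyncResult(task_id)
    return JsonResponse({"state": result.state, "result": result.result})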
Verify that the tasks have been registered with Celery (do make sure Celery is running first):
from celery.task.control import inspect
i = inspect()
i.registered_tasks()
Or from bash:
$ celery inspect registered
$ celery -A collegeapp inspect registered
From https://docs.celeryproject.org/en/latest/faq.html#the-worker-isn-t-doing-anything-just-hanging
Why is Task.delay/apply*/the worker just hanging?
Answer: There’s a bug in some AMQP clients that’ll make it hang if it’s not able to authenticate the current user, the password doesn’t match or the user doesn’t have access to the virtual host specified. Be sure to check your broker logs (for RabbitMQ that’s /var/log/rabbitmq/rabbit.log on most systems), it usually contains a message describing the reason.
Change this:
@task(name="update_university_index", bind=True, default_retry_delay=5, max_retries=1, acks_late=True)
def update_university_index(instance_id):
    print('updating university index')
To
@task(name="update_university_index", bind=True, default_retry_delay=5, max_retries=1, acks_late=True)
def update_university_index(self, instance_id):
    print('updating university index')
That is, add self as the first parameter of the task definition, since bind=True makes Celery pass the bound task instance as the first argument.
I'm still not sure why it doesn't work with the plain task decorator, but I found a solution by replacing task with app.task.
Importing app from my celery.py seems to have resolved the issue.
from collegeapp.celery import app

@app.task(name="update_university_index", bind=True, default_retry_delay=5, max_retries=1, acks_late=True)
def update_university_index(self, instance_id):
    print('updating university index')
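For completeness, the app imported there is the one defined in celery.py. A minimal sketch of the usual layout (the module and settings names are assumptions based on the project name above):

# collegeapp/celery.py
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'collegeapp.settings')

app = Celery('collegeapp')
app.config_from_object('django.conf:settings', namespace='CELERY')
# looks for a tasks.py module in every app listed in INSTALLED_APPS
app.autodiscover_tasks()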

Separation of Concerns / DI in Django Rest Framework Applications

I'm trying to create more separation of concerns between endpoints in my Django apps, but I'm not sure how to implement a service layer. I've read through a popular Python article on this topic, but it doesn't seem to address the testing issue.
For instance, if I have a request come in to save a user and want to send an email after the user has been saved, it's popular to handle this logic by overriding save like this:
**models.py**
from services import activate_user

class User(AbstractBaseUser, PermissionsMixin):
    ...
    def save(self, *args, **kwargs):
        if self._state.adding:
            activate_user(user=self)
        super(User, self).save(*args, **kwargs)
**services.py**
from services import iNotificationService
...

def activate_user(user):
    user.active = True
    iNotificationService().send_message(user)
In the above example, iNotificationService would be an interface whose implementation the application chooses at runtime. If this user were saved in a production environment, the application would provide a class like the following
import mandrill

class EmailClient(iNotificationService):
    def send_message(self, user):
        message = create_message(user)
        mandrill.send_email(message)
to the services.py module so that an email is sent. But if the application were running in a testing environment, the test would swap out the EmailClient for an instance of the interface itself so that no email is actually sent:
class iNotificationService(object):
    def send_message(self, user):
        pass
What I'm wondering is how I supply the instance to the services.py module so that activate_user knows which type of notification to send. My only idea so far is to pass some arg to the save method so that it knows which type of notification to use, but I wonder how scalable that would be given the different places one might use a service.
If the choice of notification provider class depends only on the environment you are running the application in (testing / development / production etc., as you mentioned), I would suggest providing the notification class to your application through settings. I use this pattern myself, have seen it in many projects, and find it very useful.
Say you have a services package with a notifications module in it containing all the notification provider classes, like this:
services/
    __init__.py
    notifications.py
        iNotificationService
        EmailClient
Add the notification provider's class path to your settings.py file.
settings.py
...
# this value can be changed / overridden based on the environment,
# programmatically or statically.
NOTIFICATION_PROVIDER = 'services.notifications.iNotificationService'
...
services/__init__.py
from django.conf import settings
from django.core.exceptions import ImproperlyConfigured
from django.utils.module_loading import import_string

def get_notification_provider(*args, **kwargs):
    try:
        notification_class = import_string(settings.NOTIFICATION_PROVIDER)
    except ImportError as e:
        raise ImproperlyConfigured('Error loading notification provider: %s' % e)
    return notification_class(*args, **kwargs)
Now you can import get_notification_provider from the services package and use it across your application to get an instance of the notification class based on your settings. You can plug in different providers in settings depending on your needs / environment.
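A sketch of how the call site and a test could then look (paths follow the settings value above; the test body is only illustrative):

# services.py
from services import get_notification_provider

def activate_user(user):
    user.active = True
    get_notification_provider().send_message(user)

# tests.py
from django.test import TestCase, override_settings

# point the setting at the no-op provider so nothing is actually sent
@override_settings(NOTIFICATION_PROVIDER='services.notifications.iNotificationService')
class ActivateUserTests(TestCase):
    def test_activation_sends_no_real_email(self):
        ...

In production settings you would point NOTIFICATION_PROVIDER at 'services.notifications.EmailClient' instead.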

Calling a django function periodically

This is a rookie question about web development. I am looking for a secure and better way for developers to call and run a function that sends emails from my Django application, which they can trigger manually and which can also be time-activated, sending the email periodically at fixed times of the week.
I found an answer suggesting the use of Celery in Running a function periodically in Django.
But I want to be able to change the periods without redeploying my application. I have done some research on AWS tools, and I think a combination of AWS API Gateway, AWS Lambda and AWS CloudWatch could send a request (a GET request) to a URL/endpoint on my web app to activate the function.
At the moment I have something like below.
views.py
def api_send_email(request):
    # insert send email function here
    print("sending")
    return redirect('/')
urls.py
urlpatterns = [
    url(r'^send_email$', api_send_email, name="api_send_email"),
]
So the above can either be triggered manually by going to the URL https://xxx/send_email, or by sending a GET request to that URL periodically from AWS. I have thought about doing a POST request instead, which would be more secure, but I am not sure the AWS tools can do that because it requires the CSRF token from my app itself.
Any suggestions on the best way to do this are welcome.
Thank you
I think you can accomplish this with Celery as well. For that, you can add a periodic task. Let's say you have a periodic task which runs every 5 minutes.
Then you can have the logic in a model that determines whether the email should be sent at that time. For example:
import datetime

from django.db import models
from django.utils import timezone

class YourTaskConfig(models.Model):
    SEND_CHOICES = (
        ('minute', 'minute'),
        ('hour', 'hour'),
        ('day', 'day'),
    )
    send_every = models.CharField(max_length=25, choices=SEND_CHOICES)
    interval_amount = models.IntegerField()
    last_executed = models.DateTimeField(auto_now_add=True)
    is_active = models.BooleanField(default=True)

    def should_run(self):
        now = timezone.now()
        if self.send_every == 'minute':
            td = datetime.timedelta(seconds=self.interval_amount * 60)
        elif self.send_every == 'day':
            td = datetime.timedelta(days=self.interval_amount)
        ...  # rest of the logic on the time delta
        if now - self.last_executed >= td:
            self.last_executed = now
            self.save()  # updates the last execution time
            return True
        return False
Your Email model can have a FK to this configuration(if you have one):
class Email(models.Model):
    config = models.ForeignKey(YourTaskConfig, on_delete=models.DO_NOTHING)
And use it in the periodic task:
from celery.task.schedules import crontab
from celery.decorators import periodic_task

@periodic_task(run_every=(crontab(minute='*/5')), name="some_task", ignore_result=True)  # runs every 5 minutes
def some_task():
    for i in YourTaskConfig.objects.filter(is_active=True):  # run only active configs
        should_send_email = i.should_run()
        if should_send_email:
            i.email_set.all()  # here you go: the emails you want to send
FYI: it's untested code, but you should get the general idea behind this solution. Hope it helps!
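One caveat worth adding: @periodic_task schedules only fire if a beat scheduler is running next to the worker, e.g. (project name is a placeholder):
celery -A yourproject beat -l info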

Easiest strat to add a Subscribe module to a Mezzanine blog

I have a Mezzanine blog and I would like to add a little form to every page so users can type their email addresses and click 'subscribe', so that from that moment on an email is sent to announce any new post to the blog.
I don't see that built in, or any existing module for that purpose... Should I program it from scratch? Any ideas?
Since there are no answers, I will offer my best guess as to the easiest strategy in this situation. I don't know of any built-in Mezzanine functionality that performs this specific task, so here's how I would approach it:
python manage.py startapp subscriber
Build out basic model - email = models.EmailField(unique=True), etc.
Create basic admin.py
Update settings.py -> INSTALLED_APPS and ADMIN_MENU_ORDER
ADMIN_MENU_ORDER = (
    (_("Web Content"), (
        (_("Pages"), "pages.Page"),
        (_("Subscribers"), "subscriber.Subscriber"),
    )),
)
Create a forms.py - something like this:
class SubscriberForm(forms.ModelForm):
    class Meta:
        model = Subscriber
        fields = ['email']
Set up a views.py to handle GET/POST of the above form
Reconfigure urls.py to route to the new view function (a sketch of both steps follows below)
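A minimal sketch of those last two steps (the template name and URL pattern are placeholders; the url() style matches the Django/Mezzanine versions of that era):

# subscriber/views.py
from django.shortcuts import redirect, render

from .forms import SubscriberForm

def subscribe(request):
    form = SubscriberForm(request.POST or None)
    if request.method == "POST" and form.is_valid():
        form.save()
        return redirect("/")
    return render(request, "subscriber/subscribe.html", {"form": form})

# urls.py
from django.conf.urls import url

from subscriber.views import subscribe

# appended to Mezzanine's existing urlpatterns
urlpatterns += [
    url(r"^subscribe/$", subscribe, name="subscribe"),
]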
Then, perhaps the only interesting part of my response: I would copy the Mezzanine blog app directory into the project directory and create a signals.py. Maybe add something like this:
@receiver(pre_save, sender=BlogPost, dispatch_uid="save_blogpost")
def save_blogpost(sender, instance, **kwargs):
    """ Every time a new blog post is created, notify all subscribers."""
    if instance.published:
        subs = Subscriber.objects.all()
        for sub in subs:
            email = EmailMultiAlternatives(
                subject="New Blog Post!",
                body="A new blog post has been added!",
                from_email="example@email.com",
                to=[sub.email],
            )
            # attach the post content as the HTML alternative
            email.attach_alternative(instance.content, "text/html")
            email.send()
Update apps.py in the blog app directory to contain this under the app's Config class:
def ready(self):
    import blog.signals  # noqa
If you've got Django configured to send mail through an external SMTP mail server, that's the easier route. If not, you'll likely want to set up Postfix with OpenDKIM and configure SPF to reduce the probability that the outgoing mail ends up in spam.
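The external-SMTP route is just the standard Django email settings; a sketch with placeholder host and credentials:

# settings.py
EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
EMAIL_HOST = "smtp.example.com"
EMAIL_PORT = 587
EMAIL_USE_TLS = True
EMAIL_HOST_USER = "notifications@example.com"
EMAIL_HOST_PASSWORD = "change-me"
DEFAULT_FROM_EMAIL = "notifications@example.com"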
(Also, you'll obviously need to handle situations where an admin changes a draft post to a published post, etc.)
Not sure if this helps, but hopefully!

Call celery task from signal

I need to import data from several public APIs for a user after they sign up. django-allauth is included, and I have registered a signal handler to call the right methods after allauth emits user_signed_up.
Because the data import takes too much time and the request is blocked by the signal, I want to use Celery to do the work.
My test task:
from time import sleep

@app.task()
def test_task(username):
    print('##########################Foo#################')
    sleep(40)
    print('##########################' + username + '#################')
    sleep(20)
    print('##########################Bar#################')
    return 3
I'm calling the task like this:
from game_studies_platform.taskapp.celery import test_task

@receiver(user_signed_up)
def on_user_signed_in(sender, request, *args, **kwargs):
    test_task.apply_async(args=('John Doe',))
The task should be put into the queue and the request should return immediately, but it is blocked and I have to wait a minute.
The project is set up with https://github.com/pydanny/cookiecutter-django and I'm running it in a Docker container.
Celery is configured to use the Django database in development but will use Redis in production.
The solution was to change CELERY_ALWAYS_EAGER from True to False in local.py. I was pointed to that solution in the Gitter channel of cookiecutter-django.
The calls mentioned above were already correct.
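For reference, the change amounts to one line in local.py (cookiecutter-django's local settings module; the setting name is the old-style uppercase form used by that setup):

# local.py
# True makes .delay()/.apply_async() run the task synchronously in the web
# process, which is what blocked the request; False hands it to the worker.
CELERY_ALWAYS_EAGER = False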