This is a rookie question on web development. I am looking for a secure, clean way to have a function that sends emails from my Django application, which developers can trigger manually on demand, and which can also be time-activated so the email goes out periodically at fixed times of the week.
I found an answer suggesting the use of Celery in Running a function periodically in Django.
But I want to be able to change the schedule without redeploying my application. I have done some research on AWS tools, and I think a combination of AWS API Gateway, AWS Lambda, and AWS CloudWatch could send a GET request to a URL/endpoint on my web app to activate the function.
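Roughly, I imagine the Lambda side (triggered by a CloudWatch/EventBridge schedule rule) could be a small function like the sketch below; the environment variable names are placeholders I made up:

import os
import urllib.request

def lambda_handler(event, context):
    # Hit the Django endpoint; SEND_EMAIL_URL and TASK_TOKEN are placeholder env vars.
    req = urllib.request.Request(
        os.environ["SEND_EMAIL_URL"],
        headers={"X-Task-Token": os.environ["TASK_TOKEN"]},  # shared secret, see below
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return {"status": resp.status}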
At the moment I have something like below.
views.py
from django.shortcuts import redirect

def api_send_email(request):
    # insert send email function here
    print("sending")
    return redirect('/')
urls.py
urlpatterns = [
    url(r'^send_email$', api_send_email, name="api_send_email"),
]
So the above can either be triggered manually by going to the URL https://xxx/send_email, or by sending a GET request to that URL periodically from AWS. I have thought about using a POST request instead, which would be more secure, but I am not sure the AWS tools can do that, because the view would require the CSRF token from my app itself.
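One idea I had was to exempt the view from CSRF and instead require a shared secret header, something like the sketch below (the X-Task-Token header and the TASK_TOKEN setting are names I made up for illustration):

from django.conf import settings
from django.http import HttpResponseForbidden
from django.shortcuts import redirect
from django.views.decorators.csrf import csrf_exempt

@csrf_exempt
def api_send_email(request):
    # only run when the caller knows the shared secret
    if request.headers.get("X-Task-Token") != settings.TASK_TOKEN:
        return HttpResponseForbidden("invalid token")
    # insert send email function here
    print("sending")
    return redirect('/')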
Any suggestions on the best way to do this are welcome.
Thank you
I think you can accomplish this with Celery as well. For that, you can add a periodic task. Let's say you have a periodic task that runs every 5 minutes.
Then you can put the logic in a model to determine whether the email should be sent at that time. For example:
import datetime

from django.db import models
from django.utils import timezone

class YourTaskConfig(models.Model):
    SEND_CHOICES = (
        ('minute', 'minute'),
        ('hour', 'hour'),
        ('day', 'day'),
    )
    send_every = models.CharField(max_length=25, choices=SEND_CHOICES)
    interval_amount = models.IntegerField()
    last_executed = models.DateTimeField(auto_now_add=True)
    is_active = models.BooleanField(default=True)

    def should_run(self):
        now = timezone.now()
        if self.send_every == 'minute':
            td = datetime.timedelta(seconds=self.interval_amount * 60)
        elif self.send_every == 'day':
            td = datetime.timedelta(days=self.interval_amount)
        ...  # rest of the logic on the time deltas ('hour', etc.)
        if now - self.last_executed >= td:
            self.last_executed = now
            self.save()  # updates the last execution time
            return True
        return False
Your Email model can have a FK to this configuration (if you have one):
class Email(models.Model):
    config = models.ForeignKey(YourTaskConfig, on_delete=models.DO_NOTHING)
And use it in the periodic task:
from celery.task.schedules import crontab
from celery.decorators import periodic_task

@periodic_task(run_every=(crontab(minute='*/5')), name="some_task", ignore_result=True)  # runs every 5 minutes
def some_task():
    for i in YourTaskConfig.objects.filter(is_active=True):  # run only active tasks
        should_send_email = i.should_run()
        if should_send_email:
            i.email_set.all()  # here you go, these are the emails you want to send
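To actually dispatch them, something like this could go inside that loop using Django's send_mail; the subject/body/recipient field names are assumptions about your Email model, not something from your code:

from django.core.mail import send_mail

for email in i.email_set.all():
    # field names below are assumed – adapt them to your Email model
    send_mail(email.subject, email.body, "noreply@example.com", [email.recipient])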
FYI: it's untested code, but it should give you the general idea behind this solution. Hope it helps!
I have a Django app that performs a rather time-consuming statistical model run within my views.py.
As the computation progresses in the view, I would like to inform the user periodically before the final HttpResponse, e.g.:
Step 1 completed
Step 2 running...
Is there a way to display a message to the front-end while the view is running?
Long-running tasks should be executed asynchronously. You can use django-celery for async tasks: from your view, start the task and redirect the user to a page where you can display the progress. From your Celery task, you can update a progress value as well.
class Job(models.Model):
    ...
    progress = models.PositiveSmallIntegerField(default=0)
    ...
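For example, the task doing the heavy lifting could update that field as it goes; a rough sketch, where run_model is a hypothetical task name and the step count stands in for your statistical model run:

from celery import shared_task

@shared_task
def run_model(job_id):
    job = Job.objects.get(pk=job_id)  # Job is the model defined above
    total_steps = 100                 # placeholder for however many steps your run has
    for step in range(total_steps):
        # ... do one chunk of the computation here ...
        job.progress = int((step + 1) * 100 / total_steps)
        job.save(update_fields=["progress"])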
If you want to display the progress value dynamically, you need an API, or at least a view that you hit via AJAX. Something like this:
import json

from django.http import HttpResponse

def progress_view(request, job_id):
    value = 0
    try:
        job = Job.objects.get(pk=job_id)
    except Job.DoesNotExist:
        job = None
    if job is not None:
        value = job.progress
    response = {"job": job_id, "progress": value}
    return HttpResponse(json.dumps(response), content_type='application/json')
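And the view that kicks the job off could look roughly like this; run_model is the hypothetical task sketched above, and "progress_page" is an assumed URL name for the page that polls progress_view via AJAX:

from django.shortcuts import redirect

def start_model_run(request):
    job = Job.objects.create()   # progress starts at the default 0
    run_model.delay(job.pk)      # hand the heavy work to the Celery worker
    return redirect("progress_page", job_id=job.pk)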
I have a model, and I am sending an email and an SMS to the user in a post_save signal. In my tests I am creating the model multiple times, so it is sending the email and SMS multiple times.
I am planning to write new tests for the SMS and email.
def send_activation_mail_sms(sender, instance, created, **kwargs):
    if created:
        mobile_activation = UserMobileActivation.objects.create(user=instance, randomword=randomword(50), ref=ref)
        email_activation = UserEmailActivation.objects.create(user=instance, randomword=randomword(50), ref=ref)
        url_email = "{0}view/v1/email/activation/{1}/".format(HOSTNAME, email_activation.randomword)
        short_url_email = url_shortener(url_email)
        url_sms = "{0}view/v1/mobile/activation/{1}".format(HOSTNAME, mobile_activation.randomword)
        app.send_task("apps.tasks.send_sms",
                      args=[TEXTLOCAL_APIKEY, mobile_activation.stockuser.user.username, 'TXTLCL',
                            'Activate your mobile here {0}'.format(url_sms)])
        app.send_task("apps.tasks.send_email",
                      args=[email_activation.user.user.email, EMAIL_VERIFICATION_SUBJECT,
                            EMAIL_VERIFICATION_TEMPLATE, {"host": HOSTNAME, "verify_email_url": url_email}])
I am passing the created arg in the post_save signal; is there any way I can pass an extra arg here so that when running python manage.py test it will skip sending the SMS and email? One idea I had was versioning, i.e. a different version of the API for testing, but since there is no request coming in here I cannot check request.version. Please suggest.
First, set a variable in your settings.py to identify the environment you are currently working in:
# settings.py
MY_ENV = "DEVELOPMENT"
Then, run the Celery tasks/additional scripts based on MY_ENV:
from django.conf import settings

def send_activation_mail_sms(sender, instance, created, **kwargs):
    if created and settings.MY_ENV == "DEVELOPMENT":
        ...  # do your stuff
Django provides a way to override settings configs during testing; see the Override Settings doc. So you can override the MY_ENV value in the test itself.
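For instance, a test could look like this; "TESTING" is just any value other than "DEVELOPMENT", and it assumes the signal is registered on the User model:

from django.contrib.auth.models import User
from django.test import TestCase, override_settings

class ActivationSignalTests(TestCase):
    @override_settings(MY_ENV="TESTING")  # anything other than "DEVELOPMENT"
    def test_signup_does_not_send_email_or_sms(self):
        # post_save still fires, but the handler skips the email/SMS branch
        User.objects.create_user(username="testuser")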
I've written a custom admin action and, through model permissions, have granted it to two users.
But now I only want one of them to be able to run it at any one time. So I was thinking that whenever they select the action checkbox and press the action button, it could check whether a request.POST for that action is already being processed.
So my question is: can I find out whether another such HTTP request is already in flight, before the user is taken to the intermediary page, and display a message, without having to mine the server logs?
To take a step back, I think what you're really asking is how to share state across entry points to your application, e.g. if you only want one person at a time to be able to trigger an action from a button.
One strategy for doing this is to take some deployment-wide accessible datastore (like a cache or a message queue that ALL instances of your deployment have access to) and put a message in it that acts like a lock. This relies on the datastore supporting atomic reads and writes. Within Django, something like Redis or memcached works well for this purpose (especially if you're already using it as your cache backend).
You might have something that looks like this (example taken from the Celery docs):
from datetime import datetime, timedelta
from contextlib import contextmanager

from django.core.cache import cache

LOCK_EXPIRE = 600  # let the lock time out in case your code crashes

@contextmanager
def memcache_lock(lock_id):
    timeout_at = datetime.now() + timedelta(seconds=LOCK_EXPIRE)
    # cache.add fails if the key already exists
    status = cache.add(lock_id, 'locked', LOCK_EXPIRE)
    try:
        yield status
    finally:
        if datetime.now() < timeout_at and status:
            # don't release the lock if we exceeded the timeout
            # to lessen the chance of releasing an expired lock
            # owned by someone else
            # also don't release the lock if we didn't acquire it
            cache.delete(lock_id)
def my_custom_action(self, *args, **kwargs):
    lock_id = "my-custom-action-lock"
    with memcache_lock(lock_id) as acquired:
        if acquired:
            return do_stuff()
        else:
            do_something_else_if_someone_is_already_doing_stuff()
            return
I need to import data from several public APIs for a user after they sign up. django-allauth is included, and I have registered a signal handler to call the right methods after allauth emits user_signed_up.
Because the data import takes too much time and the request is blocked by the signal, I want to use Celery to do the work.
My test task:
from time import sleep

@app.task()
def test_task(username):
    print('##########################Foo#################')
    sleep(40)
    print('##########################' + username + '#################')
    sleep(20)
    print('##########################Bar#################')
    return 3
I'm calling the task like this:
from game_studies_platform.taskapp.celery import test_task

@receiver(user_signed_up)
def on_user_signed_in(sender, request, *args, **kwargs):
    test_task.apply_async('John Doe')
The task should be put into the queue and the request should return immediately. But it is blocked and I have to wait a minute.
The project is setup with https://github.com/pydanny/cookiecutter-django and I'm running it in a docker container.
Celery is configured to use the Django database in development, but it will use Redis in production.
The solution was to switch CELERY_ALWAYS_EAGER = True to False in local.py. I was pointed to that solution in the Gitter channel of cookiecutter-django.
The calls mentioned above were already correct.
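For reference, the line that changed in the development settings looks like this (local.py in the cookiecutter-django layout):

# local.py – development settings
CELERY_ALWAYS_EAGER = False  # tasks now go to the broker/worker instead of running inline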
I use:
Celery
Django-Celery
RabbitMQ
I can see all my tasks in the Django admin page, but at the moment it has just a few states, like:
RECEIVED
RETRY
REVOKED
SUCCESS
STARTED
FAILURE
PENDING
It's not enough information for me. Is it possible to add more details about a running process to the admin page? Like a progress bar or a finished-jobs counter, etc.
I know how to use the Celery logging function, but a GUI is better in my case for some reasons.
So, is it possible to send some tracing information to the Django-Celery admin page?
Here's my minimal progress-reporting Django backend using your setup. I'm still a Django n00b and it's the first time I'm messing with Celery, so this can probably be optimized.
from time import sleep

from celery import task, current_task
from celery.result import AsyncResult
from django.http import HttpResponse, HttpResponseRedirect
from django.core.urlresolvers import reverse
from django.utils import simplejson as json
from django.conf.urls import patterns, url

@task()
def do_work():
    """ Get some rest, asynchronously, and update the state all the time """
    for i in range(100):
        sleep(0.1)
        current_task.update_state(state='PROGRESS',
                                  meta={'current': i, 'total': 100})

def poll_state(request):
    """ A view to report the progress to the user """
    if 'job' in request.GET:
        job_id = request.GET['job']
    else:
        return HttpResponse('No job id given.')
    job = AsyncResult(job_id)
    data = job.result or job.state
    return HttpResponse(json.dumps(data), mimetype='application/json')

def init_work(request):
    """ A view to start a background job and redirect to the status page """
    job = do_work.delay()
    return HttpResponseRedirect(reverse('poll_state') + '?job=' + job.id)

urlpatterns = patterns('webapp.modules.asynctasks.progress_bar_demo',
    url(r'^init_work$', init_work),
    url(r'^poll_state$', poll_state, name="poll_state"),
)
I am starting to figure this out myself. Start by defining a PROGRESS state exactly as explained in the Celery user guide, then all you need is to insert some JS in your template that updates your progress bar.
Thanks @Florian Sesser for your example!
I made a complete Django app that shows users the progress of creating 1000 objects, at http://iambusychangingtheworld.blogspot.com/2013/07/django-celery-display-progress-bar-of.html
Everyone can download and use it!
I would recommend a library called celery-progress for this. It is designed to make it as easy as possible to drop a basic end-to-end progress-bar setup into a Django app with as little scaffolding as possible, while also supporting heavy customization on the front-end if desired. There are lots of docs and references for getting started in the README.
Full disclosure: I am the author/maintainer of said library.
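For a rough idea of what the task side looks like with celery-progress (this sketch assumes the ProgressRecorder helper described in its README; see the README for the front-end wiring):

from celery import shared_task
from celery_progress.backend import ProgressRecorder

@shared_task(bind=True)
def heavy_job(self, total=100):
    progress_recorder = ProgressRecorder(self)
    for i in range(total):
        # ... do one unit of work here ...
        progress_recorder.set_progress(i + 1, total)
    return "done"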