Prevent Django send_mail timing attack - django

I have different REST-API views where I either send a mail (if an account exists) or do not send a mail.
For example, the user can input the email in the forgot-password form and a mail is sent if the account exists.
I am using from django.core.mail import send_mail to send the mail.
The problem is that this takes some time, so requests for valid emails generally take longer than requests for non-existing emails.
This allows an attacker to compare request times to find out whether an account exists or not.
Is there any way that I can call send_mail() without sending the mail?
Or what would be the fix to make request times equally long for both cases?
Note: I could measure how long send_mail() takes on average and wait that long when I do not send a mail. Since the app runs on different servers with different configurations, though, this cannot be done generically in my case, and I would rather not store the average execution time per server in a database.

It is common practice to use Celery for tasks that take some time to finish. Celery runs a task in a separate worker process, so the user does not need to wait for it to finish. In your specific case, this is what happens if you use Celery:
You send a send_mail task to Celery and immediately return a successful response to the user.
Celery receives the task and runs it in a separate worker.
This way, the response time is the same in both cases.
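For illustration, a minimal sketch of that setup; the task name, the account_exists helper, and the response wording are assumptions, not code from the question:

# tasks.py
from celery import shared_task
from django.core.mail import send_mail

@shared_task
def send_reset_mail(email):
    # Runs inside a Celery worker, outside the request/response cycle.
    send_mail(
        "Password reset",
        "Follow this link to reset your password: ...",
        None,  # falls back to DEFAULT_FROM_EMAIL
        [email],
    )

# views.py
from django.http import JsonResponse

def forgot_password(request):
    email = request.POST.get("email", "")
    if account_exists(email):  # hypothetical lookup helper
        send_reset_mail.delay(email)  # enqueue and return immediately
    return JsonResponse({"detail": "If the account exists, a mail was sent."})

Since .delay() only pushes a message onto the broker, both branches return in roughly the same time.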

So this is similar to an issue I had, and my solution was actually to always send the email. If the address isn't registered, the email reads something like: You tried to reset your password, but this email isn't registered to an account with us.
From a user's perspective, it can be annoying to wait for an email that may or may not arrive and to spend time checking spam/junk folders. Telling them they don't have an account with that email address is quicker and cleaner for them.
We saw a big drop in users enquiring with us about why they hadn't received a PW reset email.
(Sorry for not actually answering the question; I dislike it when people do this on SO, but since I experienced the same issue, I thought I'd weigh in.)
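A rough sketch of that approach, assuming a hypothetical account_exists helper; exactly one send_mail call runs on either path, which also keeps the timings comparable:

from django.core.mail import send_mail

def handle_forgot_password(email):
    # One send_mail call on both paths, so neither branch is faster.
    if account_exists(email):  # hypothetical lookup helper
        body = "Click the link below to reset your password: ..."
    else:
        body = (
            "You tried to reset your password, but this email isn't "
            "registered to an account with us."
        )
    send_mail("Password reset", body, None, [email])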

Related

django with heavy computation and long runtime - offline computation and send results

I have a Django app where the user sends a request, the server does an SQL lookup, followed by computation on the results, and finally shows the results to the user.
The SQL lookup and the computation that follows can take a long time, maybe 30+ minutes. I have seen some web pages ask for an email address in such cases and then send you the URL later, but I'm not sure how this can be done in Django, or whether there are other options for this situation. Any pointer will be very helpful.
(I'm sorry, but as I said it's a rather general question, so I don't know how I could provide minimal runnable code for this.)
One way to accomplish this would be to use something like Celery, which is a distributed task queue. The processing task would go into the queue, and when it completes it would call a function that sends an email alerting the user that the result is ready.
Documentation: https://docs.celeryproject.org/en/stable/django/first-steps-with-django.html
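A rough sketch of how that could fit together; the task body, the do_sql_lookup_and_computation helper, and the results URL are placeholders:

from celery import shared_task
from django.core.mail import send_mail

@shared_task
def run_report(user_email, params):
    # Hypothetical long-running lookup plus computation (30+ minutes).
    result_id = do_sql_lookup_and_computation(params)
    send_mail(
        "Your results are ready",
        "Download them here: https://example.com/results/%s/" % result_id,
        None,  # falls back to DEFAULT_FROM_EMAIL
        [user_email],
    )

The view then only does run_report.delay(request.user.email, params) and returns immediately.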

Django: Send reminder email

My app has a list of events, each with a start time (date and time). I want to set up a scheduled task that sends a reminder email to all users participating in an event one hour before the event starts. (Note: an admin can change the time of an event.)
I currently use Celery to send an email to the list of participants when an admin changes the time of an event.
Please suggest some solutions for this. Thanks.
Here's a recent(ish) discussion where a potential solution is proposed for celery: https://github.com/celery/celery/issues/4522.
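With plain Celery, one common pattern (with the caveats for long etas discussed in that issue) is to schedule the task with an eta an hour before the event and revoke/reschedule it whenever the admin changes the time. A sketch, assuming the Event model stores a reminder_task_id:

from datetime import timedelta

from celery import shared_task
from celery.result import AsyncResult

@shared_task
def send_event_reminder(event_id):
    ...  # look up the event's participants and email them

def schedule_reminder(event):
    # If a reminder is already scheduled, revoke it before rescheduling.
    if event.reminder_task_id:  # assumed field on the Event model
        AsyncResult(event.reminder_task_id).revoke()
    result = send_event_reminder.apply_async(
        args=[event.id],
        eta=event.start_time - timedelta(hours=1),
    )
    event.reminder_task_id = result.id
    event.save(update_fields=["reminder_task_id"])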
I built Posthook to make solving these kinds of problems easier for developers. In your case, when a new event is created or the event time changes, you can schedule a request back to your app for one hour before the start time. Then, when you get the request from Posthook, you can send out the reminder after validating that it still needs to be sent.

Django Celery issue with multiple instances

I have two instances of Celery. I want to be able to notify all users of a particular event via email, push, etc. However, I want to make sure that each user gets notified only ONCE. Is there an example of how to loop through users and guarantee that each user gets contacted exactly once?
The solution I have is to simply mark the user as having received the notification... but that would be very inefficient, and there is a race condition where the user could be notified again before the mark is saved.
I tried to read the following regarding this:
http://docs.celeryproject.org/en/latest/userguide/routing.html
http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html
[EDIT]
By 2 instances I mean 1 worker on each of two EC2 instances, so 2 workers.
Have you read this one? Ensuring a task is only executed one at a time
I think your approach is good, but storing the flag in a database is slow; you should use the cache instead, since writes to it are very fast. As a simple setup, you could cache the (hashed) email before sending the email. If the cache key already exists, don't send the email.
So it would be something like:
from hashlib import md5

from celery import shared_task
from django.core.cache import cache
from django.core.mail import send_mail

# 1 day, because I expect the whole email task to finish within a day.
# The same email may be sent again if the task runs on the next day.
LOCK_TIME = 24 * 60 * 60

@shared_task
def notify_user(email):
    uid = md5(email.encode("utf-8")).hexdigest()
    # cache.add() returns False if the key already exists in the cache,
    # so only the first worker to add it sends the mail.
    if cache.add(uid, "notified", LOCK_TIME):
        send_mail("Subject", "Message", None, [email])
NB: I don't think deleting the cache entry is necessary, since you only care about sending once, until it expires.

Sustain an http connection while django processes a big request (20mins+)

I've got a Django site that produces a CSV download. The content of the CSV is dictated by user-defined parameters. It's possible that users will set parameters that require significant thinking time on the server. I need a way of sustaining the HTTP connection so the browser doesn't throw an error message. I've heard that it's possible to send intermittent HTTP headers to do this. Can anyone point me in the right direction to set this up on a Django site?
(Unfortunately, I'm stuck with the possibility of slow reports - improving my SQL won't mitigate this.)
Don't do it online. Trigger an offline task, use a bit of JavaScript to repeatedly call a view that checks whether the task has finished, and redirect to the finished file when it's ready.
Instead of blocking the user and their browser for 20 minutes (which is not a good idea), do the time-consuming task in the background. When the task finishes and generates the result, simply notify the user so that they only need to download the ready result.
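A minimal sketch of that polling pattern with Celery; the generate_csv task and the URL wiring are assumptions:

from celery.result import AsyncResult
from django.http import JsonResponse

def start_report(request):
    # Enqueue the slow CSV generation and hand the task id to the browser.
    result = generate_csv.delay(request.GET.dict())  # hypothetical task
    return JsonResponse({"task_id": result.id})

def report_status(request, task_id):
    # JavaScript on the page polls this view until done is true,
    # then redirects to the returned file URL.
    result = AsyncResult(task_id)
    if result.ready():
        # Assumes the task returns the URL/path of the generated file.
        return JsonResponse({"done": True, "url": result.get()})
    return JsonResponse({"done": False})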

Sending 1000+ emails in Django

Here is my setup right now:
from django.core import mail

connection = mail.get_connection()
maillist = []
# My real setup is a slightly more complex for-loop, but basically I add
# all recipients to a list.
for person in object_list:
    mail_subject = "Mail subject here"
    mail_body = "Mail body text... bla bla"
    email_sender = "me@example.com"
    maillist.append((mail_subject, mail_body, email_sender, [person.email]))

# send_mass_mail wants a tuple, so we convert the list
mailtuple = tuple(maillist)
mail.send_mass_mail(mailtuple, fail_silently=False, connection=connection)
However, the for-loop iterates over 1000+ objects/persons, and when I try this method I'm able to send 101 emails before it stops. No errors (that I can see) anywhere.
A fellow developer mentioned that maybe the POST size was too big? Any ideas from the SO community?
Your SMTP server probably has some send limits. For example, I believe Gmail limits outgoing mail to 100 recipients.
As Micah suggested, there is a good chance you are hitting server limits.
Generally, when dealing with mass mail, it is always a good idea to throttle the sending. Sending 50 mails every 5 seconds for 300 seconds beats sending 3,000 mails at once, for many practical reasons including SMTP server limitations.
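Reusing the question's maillist, a rough sketch of such throttling; the batch size and delay are arbitrary:

import time

from django.core import mail

BATCH_SIZE = 50
DELAY_SECONDS = 5

connection = mail.get_connection()
for start in range(0, len(maillist), BATCH_SIZE):
    batch = maillist[start:start + BATCH_SIZE]
    mail.send_mass_mail(tuple(batch), fail_silently=False, connection=connection)
    time.sleep(DELAY_SECONDS)  # give the SMTP server a breather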
Since you mentioned a POST limit - do you send out the emails in a view? I'm wondering how you handle canceled requests in your setup.
I'm using a management command to send out 1000+ newsletters, but instead of send_mass_mail I use the normal send method in a loop. It takes about 5 minutes (I don't have an exact count atm) to send out the mails, and I haven't run into any server limits yet.
My plan is to switch to Celery to handle sending through a web interface. Perhaps you want to have a look at it in case you haven't already.
http://celeryproject.org/
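For reference, such a management command could look roughly like this; the Subscriber model, the command name, and the mail contents are assumptions:

# myapp/management/commands/send_newsletter.py -- illustrative path
from django.core.mail import send_mail
from django.core.management.base import BaseCommand

from myapp.models import Subscriber  # assumed model

class Command(BaseCommand):
    help = "Send the newsletter to every subscriber, one mail at a time."

    def handle(self, *args, **options):
        for person in Subscriber.objects.all():
            send_mail(
                "Newsletter subject",
                "Newsletter body... bla bla",
                None,  # falls back to DEFAULT_FROM_EMAIL
                [person.email],
            )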