Here is my setup right now:
from django.core import mail

connection = mail.get_connection()
maillist = []
# My real setup is a slightly more complex for-loop, but basically I add all recipients to a list.
for person in object_list:
    mail_subject = "Mail subject here"
    mail_body = "Mail body text...bla bla"
    email_sender = "me@example.com"
    maillist.append((mail_subject, mail_body, email_sender, [person.email]))
# send_mass_mail() wants a tuple, so we convert the list
mailtuple = tuple(maillist)
mail.send_mass_mail(mailtuple, fail_silently=False, connection=connection)
However, the for loop iterates over 1000+ objects/persons, and when I try this method I'm only able to send 101 emails before it stops. No errors anywhere that I can see.
A fellow developer suggested that maybe the POST size was too big. Any ideas from the SO community?
Your SMTP server probably has some send limits. For example, I believe Gmail limits outgoing mail to 100 recipients.
As Micah suggested, there is a good chance you are hitting server limits.
Generally, when dealing with mass mail, it is always a good idea to throttle the sending. Sending 50 emails every 5 seconds for 300 seconds beats firing off 3,000 at once for many practical reasons, including SMTP server limitations.
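For illustration, here is a minimal sketch of that kind of throttling with Django's send_mass_mail, assuming a list of message tuples built like the one in the question (the chunk size and delay are just the numbers from above, not hard requirements):

import time
from django.core import mail

def send_throttled(messages, chunk_size=50, delay=5):
    # Send the message tuples in small batches, pausing between batches
    # so the SMTP server's limits are never hit in one burst.
    connection = mail.get_connection()
    for start in range(0, len(messages), chunk_size):
        chunk = tuple(messages[start:start + chunk_size])
        mail.send_mass_mail(chunk, fail_silently=False, connection=connection)
        time.sleep(delay)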
Since you mentioned a POST limit - do you send out the emails in a view? I'm wondering how you handle canceled requests in your setup.
I'm using a management command to send out 1000+ newsletters, but instead of send_mass_mail I use the normal send() method in a loop. It takes about 5 minutes (I don't have an exact timing at the moment) to send out the mails, and I haven't run into any server limits yet.
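For reference, a rough sketch of such a management command, assuming a hypothetical Subscriber model (the app name, model, and addresses are placeholders):

# yourapp/management/commands/send_newsletter.py
from django.core.management.base import BaseCommand
from django.core.mail import send_mail
from yourapp.models import Subscriber  # hypothetical model

class Command(BaseCommand):
    help = "Send the newsletter to every subscriber, one message at a time."

    def handle(self, *args, **options):
        for subscriber in Subscriber.objects.all():
            send_mail(
                "Newsletter subject",
                "Newsletter body...",
                "newsletter@example.com",
                [subscriber.email],
                fail_silently=False,
            )
            self.stdout.write("Sent to %s" % subscriber.email)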
My plan is to switch to celery to handle sending through a web interface. Perhaps you want to have a look at it in case you haven't already.
http://celeryproject.org/
I have different REST-API views where I either send a mail (if an account exists) or do not send a mail.
For example, the user can input the email in the forgot-password form and a mail is sent if the account exists.
I am using from django.core.mail import send_mail to send the mail.
The problem is that this takes some time, so requests for valid emails generally take longer than requests for non-existing emails.
This allows an attacker to compare the request times to find out if an account exists or not.
Is there any way that I can call send_mail() without sending the mail?
Or what would be the fix to make request times equally long for both cases?
Note: I could measure how long send_mail() takes on average and wait that long when I do not send a mail. But as the app runs on different servers with different configs, this cannot be done in a general way in my case. I would rather not store the average execution time per server in a database to solve this.
It's common practice to use Celery for tasks that take some time to finish. Celery runs the task in a separate worker, so the user doesn't need to wait for it to complete. In your specific case, this is what happens if you use Celery:
You send a send_mail task to Celery and immediately return a successful response to the user.
Celery receives the task and runs it in a separate worker.
This way, the response time is the same for both cases.
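A minimal sketch of that approach, assuming a working Celery setup (the task name, module, and addresses are illustrative):

# tasks.py
from celery import shared_task
from django.core.mail import send_mail

@shared_task
def send_password_reset_mail(recipient, reset_link):
    # Runs in a Celery worker, outside the request/response cycle.
    send_mail(
        "Password reset",
        "Use this link to reset your password: %s" % reset_link,
        "noreply@example.com",
        [recipient],
        fail_silently=False,
    )

In the view you queue the task only when the account exists, e.g. send_password_reset_mail.delay(user.email, reset_link), and return the same response either way; the slow SMTP work happens in the worker, so the response time no longer reveals whether the account exists.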
So this is something similar to an issue I had, and my solution was actually to always send the email. If they don't have an account, the email reads something like "You tried to reset your password, but this email isn't registered to an account with us."
From a user's perspective, it can be annoying to have to wait for an email that may or may not arrive, and spend time checking spam/junk etc. Telling them they don't have an account with that email address is quicker and cleaner for them.
We saw a big drop in users enquiring with us about why they hadn't received a PW reset email.
(Sorry for not actually answering the question, I dislike it when people do this on SO but since I experienced the same issue, I thought I'd weigh in.)
I am running a mailing list with Mailman which uses Postfix to send the emails through Mailgun. I'm on Mailgun's (paid) Flex plan with a verified (non-sandbox) account.
The mailing list has about 30 members and the traffic is low. Still, when an email is sent, the log (/var/log/mail.log) contains entries with "Domain ... is not allowed to send large batches yet":
postfix/smtp[33682]: 019053E815: to=<...>, relay=smtp.mailgun.org[44.231.238.210]:587,
delay=1181, delays=1179/0.02/2/0.13, dsn=4.0.0, status=deferred (host smtp.mailgun.org[44.231.238.210]
said: 421 Domain ... is not allowed to send large batches yet (in reply to DATA command))
I was unable to find any information about this error message. What's a good workaround for this issue – should the outgoing emails be throttled or should I request an increase of the batch sizes from Mailgun?
The solution was to contact Mailgun support. They raised the limits on my account and after that, the emails got delivered – the mail.log file now has status=sent (250 Great success) entries.
I recently tried my hand at the new Gmail API, and all seems to work fine except one thing. My issue is as follows:
I'm working on a receptionist project that may need to generate more than one email in less than a minute during busy hours. So, just for testing purposes, I ran the following code, which works fine:
if __name__ == '__main__':
    service = setup()  # Simply a helper function to do the basic credential check. Works fine!
    print('service: ' + str(service))
    for counter in range(1, 10):
        print('Sending message ' + str(counter))
        message = create_message(<SENDER_EMAIL_ID>, <RECEIVER_EMAIL_ID>, "Email Number: " + str(counter), "Sample text")
        response = send_message(service, 'me', message)
        print(response)
The setup() function is as follows:
def setup():
    credentials = get_credentials()
    http = credentials.authorize(httplib2.Http())
    service = discovery.build('gmail', 'v1', http=http)
    return service
Now, when I run the code say three times consecutively in less than a minute, the code runs fine and I am able to see all 27 emails in the sent folder of the SENDER_EMAIL_ID using a web browser. So the Gmail API is sending every message through whenever a request is made. However, only some of these emails are received at the RECEIVER_EMAIL_ID, and the rest are just dropped.
However, if I run the program with, say, a 2-5 minute delay between runs, then all the mails are received.
I have no idea why this is.
Any help would be really appreciated. :)
To expound more on @ken-y-n's response in the comments section, the Gmail API has usage limits. Specifically for this product, the limits are about:
1 Billion quota units / day
250 quota units / user / second
You may have encountered the rateLimitExceeded error during your tests.
Since you're sending emails through a loop, each call to send will cost you about 100 units (plus other costs depending on the methods you're calling). This is why some emails seemed to be dropped. You can counter this by implementing exponential backoff on the messages that failed to send.
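A rough sketch of such a backoff wrapper around the send call, assuming the google-api-python-client service object from the question (the retry count and status codes are illustrative):

import random
import time
from googleapiclient.errors import HttpError

def send_with_backoff(service, user_id, message, max_retries=5):
    # Retry the send with exponentially growing delays when the API rate-limits us.
    for attempt in range(max_retries):
        try:
            return service.users().messages().send(userId=user_id, body=message).execute()
        except HttpError as error:
            if error.resp.status in (403, 429, 500, 503):
                time.sleep((2 ** attempt) + random.random())  # 1s, 2s, 4s, ... plus jitter
            else:
                raise
    raise RuntimeError("Giving up after %d attempts" % max_retries)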
Another alternative, instead of running it through a loop, is to use batch requests, which group your API calls together to reduce the number of HTTP connections your app is making.
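A minimal sketch of such a batch request with google-api-python-client, assuming a list of message bodies built by create_message() as in the question:

def handle_response(request_id, response, exception):
    # Called once per message in the batch; failures arrive here as exceptions.
    if exception is not None:
        print('Message %s failed: %s' % (request_id, exception))
    else:
        print('Message %s sent, id=%s' % (request_id, response['id']))

batch = service.new_batch_http_request(callback=handle_response)
for message in messages:
    batch.add(service.users().messages().send(userId='me', body=message))
batch.execute()

Note that batching reduces HTTP overhead, but each send still costs quota units, so backoff may still be needed.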
I ran into an issue where my mail server only accepts 100 connections every 5 minutes. My current code loops over my database, calling cfmail for each person on the list. I suppose the problem is I'm opening a new connection each time I use cfmail?
<CFLOOP QUERY="Customer">
    <!--- send mail to Admin --->
    <cfmail to = "#cstEmail#"
        from = "#FORM.fromAddressEmail#"
        subject = "#FORM.subjectEmail#"
        server = "#var.mailserver#"
        port = "#var.mailport#"
        username = "#var.mailuser#"
        password = "#var.mailpass#"
        failto = "#var.failEmail#"
        type = "html">
        <!--- mail body goes here --->
    </cfmail>
</CFLOOP>
What I ran into was that only 100 mails were being sent at a time; the rest were sent to CF's undelivered folder. I would move them back to the spool and again only 100 would get through.
Now, I've read that in older versions of CF there is a checkbox in the CF Administrator to "maintain connection" - I'm running CF9 and don't see this option.
Would using cfmail's query attribute force cfmail to connect to the mail server only once to send all the emails?
<cfmail query="Customer"
from = "#FORM.fromAddressEmail#"
to = "#cstEmail#"
subject = "#FORM.subjectEmail#">
I'm not even sure how to test this without sending a couple hundred emails. Any thoughts on whether this is a viable solution to the problem?
Thanks for your help!
Biscotti
I ended up compromising by using a scheduled task to move the files every 5 minutes back over to the Spool dir from the Undelivr dir. I'm not thrilled with this solution, but it works.
Thanks to Russ's Respooler extension. http://cfrespooler.riaforge.org/
By using the above code with the query attribute on CFMAIL, I only succeeded in speeding up the client-side process. The mail server still rejected the mail after the 100th connection, leading me to conclude there is no server-side benefit to this method over simply looping CFMAIL as in my first example. It seems the only answer is to run the code within the Enterprise edition of the CF environment, which has the "maintain connection" feature enabled.
I have a Django project that needs to send out around 600 emails. I have Celery set up and it works, for the most part. I have the Django project set up to use my Google Apps (Business version, i.e. paid) email account as the sending account. For testing purposes, I have every email sent to me -- not to the client.
The issue I am having is that Celery seems to randomly skip people in the list. When I start the process of sending all 600 emails, Celery works away, sending emails (I can see them show up in my inbox) but I only receive a total of about 420 emails. When Celery finishes, there are still 180 or so people that need the email. If I click "send emails" again with ONLY the remaining 180 people, it will finish the job and, at the end of two attempts, will have sent emails to all 600 people.
Why would Celery be skipping people?
Yes, you will get those Gmail errors, and they're not particularly predictable.
You could just use django-mailer instead -- I do, and I deal with those Gmail connection errors by letting django-mailer automatically retry the failed sending attempts until they succeed.
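A minimal sketch of that setup, assuming django-mailer is installed (the backend path and cron lines reflect django-mailer's usual usage; check its docs for your version):

# settings.py: queue outgoing mail in the database instead of sending it inline.
EMAIL_BACKEND = "mailer.backend.DbBackend"

# Then a cron job (or scheduled task) flushes the queue and retries failures:
#   */5 * * * *  python manage.py send_mail
#   0 * * * *    python manage.py retry_deferred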
Check out this SO question for more folks suggesting you just use django-mailer vs celery for mail.
Advice on Python/Django and message queues