Django post-office setup - django

Perhaps it is just because I've never set up an email system in Django before, or maybe I'm missing it... but does anyone have any insight on how to properly configure django-post_office for sending queued emails?
I've got a mailing list of 1,500+ people and am hosting my app on Heroku. Using the standard email system doesn't work because I need to send a customized email to each user, and connecting to the server one by one leads to a timeout.
I've installed django-post_office via pip, added the app to settings.py, and I've even been able to get an email to send with:
mail.send(['recipient'], 'sender', subject='test', message='hi there', priority='now')
However, if I try to schedule it for, say, 30 seconds from now:
nowtime = datetime.datetime.now()
sendtime = nowtime + datetime.timedelta(seconds=30)
and then
mail.send(['recipient'], 'sender', subject='test', message='hi there', scheduled_time=sendtime)
Nothing happens. Time passes, the email is still listed as queued, and I never receive it.
I have a feeling it's because I ALSO need Celery / RQ / cron set up, but the documentation seems to suggest it should work out of the box. What am I missing?
Thanks, folks.

Actually, you can find this in the documentation (at the time I'm writing this comment):
Usage
If you use post_office’s EmailBackend, it will automatically queue emails sent using django’s send_mail in the database.
To actually send them out, run python manage.py send_queued_mail. You can schedule this to run regularly via cron:
* * * * * (/usr/bin/python manage.py send_queued_mail >> send_mail.log 2>&1)
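In other words, an email created with scheduled_time (or the default queued priority) just sits in the database until send_queued_mail runs. A minimal sketch, assuming the documented post_office.EmailBackend setting and placeholder addresses; on Heroku the management command could be run from the Scheduler add-on instead of cron:

# settings.py -- route outgoing mail through django-post_office's queue
EMAIL_BACKEND = 'post_office.EmailBackend'

# elsewhere -- queue an email for roughly 30 seconds from now
import datetime
from post_office import mail

send_at = datetime.datetime.now() + datetime.timedelta(seconds=30)
mail.send(
    ['recipient@example.com'],   # placeholder recipient
    'sender@example.com',        # placeholder sender
    subject='test',
    message='hi there',
    scheduled_time=send_at,      # stays queued until the command below runs
)

# something has to flush the queue periodically, e.g. cron or Heroku Scheduler:
#   python manage.py send_queued_mail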

Related

Cron for sending emails after 3 days in django

I don't have any knowledge about cron. In my project, once a task is assigned to an employee, an email is sent to their email address. If the employee does not complete the task within the deadline, I want to send them a reminder email every 3 days until they complete it. Can anyone give me an idea of what I should do? The project is on my local environment; will that work, or do I need a server?
APScheduler would do the job.
It would look something like this:
from apscheduler.schedulers.background import BackgroundScheduler

bg_scheduler = BackgroundScheduler()
bg_scheduler.add_job(send_email, 'interval', days=3)  # call send_email every 3 days
bg_scheduler.start()
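For the deadline part, the scheduled send_email function would need to look up which tasks are still open before mailing anyone. A rough sketch, assuming a hypothetical Task model with assignee, deadline and completed fields (none of these names come from the question):

from django.core.mail import send_mail
from django.utils import timezone

from myapp.models import Task   # hypothetical app and model

def send_email():
    # Remind the assignee of every task that is past its deadline and not done.
    overdue = Task.objects.filter(completed=False, deadline__lt=timezone.now())
    for task in overdue:
        send_mail(
            'Task reminder',
            'Please complete the task "%s".' % task.name,
            'noreply@example.com',        # placeholder sender
            [task.assignee.email],
        )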

Background Job and Scheduling with Resque

I have a Ruby on Rails 4.0 and PostgreSQL app hosted on an Ubuntu VPS. In this application I want to send emails based on data in the database; for example, a background job checks a table's contents every hour and, depending on the content, sends an email to the user or not. I decided to do this with Resque.
How can I do that?
Should I do it in the Rails app or in an independent service?
And how can I schedule this job?
There are a couple more options I'd advise you to try:
1. Cron: one of the most common ways for a Unix developer to run a task at some interval; you can read more about it here.
FYI: if you have trouble understanding cron settings, there is a gem that handles them for you, called whenever.
2. Resque-Scheduler: you may have missed the Resque plugin that provides exactly the feature you need; it's called resque-scheduler. It too provides cron-like settings for you to work with.
Please check the links above for more info.
Hope this helps.
I did not end up using Resque, because I only want a process on the Ubuntu server that runs on a schedule (every hour); for example, every hour it checks the table contents and sends an alert to the users by email.
I made a daemon process and used rufus-scheduler for the scheduling.
require 'rufus-scheduler'

class TaskTest
  def task
    scheduler = Rufus::Scheduler.new
    # Run this block every hour
    scheduler.every '1h' do
      msg = "Message"
      mailer = MailerProcess.new   # my own mailer class
      mailer.send_mail('email-address', 'password', 'to-email', 'Subject', msg)
      puts Time.now
    end
    scheduler.join
  end
end

# Detach from the terminal and run the scheduler in a forked child process
Process.daemon(true)
task_test = TaskTest.new
pid = Process.fork do
  task_test.task
end

Idling a Worker in Heroku, with Django

I am using Django on Heroku, and my site sends batch emails every month through Celery. Since I only use this worker once a month, I don't want to pay for it all the time. I can stop the worker with heroku scale workers=0 and scale it back up with heroku scale workers=1 manually, before and after I send my emails.
However, since other non-technical staff will be sending email from Django as well, they cannot run these commands. Can I stop and start a worker by executing a command from Python in my Heroku web process? I could execute any commands before sending the email.
There is a bug with heroku.py; see these issues: https://github.com/heroku/heroku.py/issues/10 and https://github.com/heroku/heroku.py/issues/4
I made a quick workaround, which calls the HTTP resource directly:
import heroku
from django.conf import settings

cloud = heroku.from_key(settings.HEROKU_APIKEY)
# scale the worker process up...
cloud._http_resource(method='POST', resource=('apps', 'appname', 'ps', 'scale'),
                     data={'type': 'processname', 'qty': 1})
# ...and back down when the work is done
cloud._http_resource(method='POST', resource=('apps', 'appname', 'ps', 'scale'),
                     data={'type': 'processname', 'qty': 0})
You could do this with heroku.py, the Python API client. It's available on PyPI, with source at https://github.com/heroku/heroku.py
You could also use the Scheduler add-on and have a management command scheduled to run once a month to send out your emails, without having to scale up a worker process at all.
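For the Scheduler route, the scheduled job would just call a custom management command. A minimal sketch, assuming a hypothetical send_monthly_emails command and an existing send_batch_emails() helper (both names are illustrative, not from the question):

# myapp/management/commands/send_monthly_emails.py
from django.core.management.base import BaseCommand

from myapp.emails import send_batch_emails   # hypothetical helper doing the actual sending

class Command(BaseCommand):
    help = "Send the monthly batch emails (intended to be run by the Heroku Scheduler add-on)."

    def handle(self, *args, **options):
        sent = send_batch_emails()
        self.stdout.write("Sent %d emails" % sent)

Since the Scheduler add-on only offers every-10-minutes, hourly and daily frequencies, the command would need to check the day of the month itself and exit early on the other days.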

Django - sending 600 emails with celery -- some are skipped?

I have a Django project that needs to send out around 600 emails. I have Celery set up and it works, for the most part. The Django project is set up to use my Google Apps (Business version, i.e. paid) email account as the sending account. For testing purposes, I have every email sent to me, not to the client.
The issue I am having is that Celery seems to randomly skip people in the list. When I start the process of sending all 600 emails, Celery works away, sending emails (I can see them show up in my inbox) but I only receive a total of about 420 emails. When Celery finishes, there are still 180 or so people that need the email. If I click "send emails" again with ONLY the remaining 180 people, it will finish the job and, at the end of two attempts, will have sent emails to all 600 people.
Why would Celery be skipping people?
Yes, you will get those Gmail errors, and they're not particularly predictable.
You could just use django-mailer instead. I do, and I deal with those Gmail connection errors by letting django-mailer automatically retry the failed sending attempts until they succeed.
Check out this SO question for more folks suggesting django-mailer over Celery for mail:
Advice on Python/Django and message queues
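If memory serves, the django-mailer switch is mostly a backend setting plus two cron jobs; a rough sketch (the backend path and command names are from django-mailer's documentation as I recall it, so double-check them against the version you install):

# settings.py -- queue outgoing mail in the database instead of sending it inline
EMAIL_BACKEND = "mailer.backend.DbBackend"

# crontab -- flush the queue every minute and retry deferred messages periodically
#   * * * * *       python manage.py send_mail
#   0,20,40 * * * * python manage.py retry_deferred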

django-notifications not sending any email

I'm trying to use django-notification and I can't get it to send any emails. The notifications appear in the Notices list, but when I run python manage.py emit_notices this is what I get:
acquiring lock...
acquired.
(0.001) SELECT `notification_noticequeuebatch`.`id`, `notification_noticequeuebatch`.`pickled_data` FROM `notification_noticequeuebatch`; args=()
releasing lock...
released.
0 batches, 0 sent
done in 0.00 seconds
This is the code sending the Notice:
admin = User.objects.get(id=1)
notification.send_now([admin], "order_form_created", {"from_user": admin})
The notice settings look right; my user is subscribed to this kind of notice. From what I understand, the app looks in the notice queue batches, which are always empty...
Edit: Looks like the problem was the email server... I used an external SMTP server instead and it worked.
Try notification.queue(...) instead of send_now(...), then run python manage.py emit_notices.
You are calling send_now(), so why run the emit_notices management command? That command only sends notices that were queued, which is why it reports 0 batches.
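To tie the two answers together: send_now() delivers immediately through whatever email backend is configured, while emit_notices only flushes notices stored by queue(). A minimal sketch of the queued variant, assuming the classic django-notification API and the same "order_form_created" label as above:

from django.contrib.auth.models import User
from notification import models as notification

admin = User.objects.get(id=1)
# Store the notice in a NoticeQueueBatch instead of emailing right away
notification.queue([admin], "order_form_created", {"from_user": admin})

# Later, from cron or by hand, flush the queued batches:
#   python manage.py emit_notices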