Background Job and Scheduling with Resque - ruby-on-rails-4

I have a Ruby on Rails 4.0 and PostgreSQL app hosted on an Ubuntu VPS. In this application I want to send emails based on data in the database. For example, a background job checks a table's contents every hour and, depending on the contents, sends an email to the user or not. I decided to do this with Resque.
How can I do that?
Should I do this in the Rails app or in an independent service?
And how can I schedule this job?

There are a couple of other options I'd advise you to try:
1. Cron: the most common approach for any Unix developer to run a task on a set interval.
FYI: if you have trouble with cron syntax, there is a gem called whenever that writes the crontab entries for you.
2. Resque-Scheduler: you may have missed the Resque plugin that provides exactly the feature you need, resque-scheduler. It also supports cron-like schedule settings.
Please check the gems above for more info; quick sketches of both approaches are below.
Hope this helps.
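To make the two options above more concrete, here are minimal, hedged sketches of each. The task name, job class, model, and mailer (emails:check_and_send, HourlyEmailJob, User, AlertMailer) are hypothetical placeholders for your own code.

Option 1, with the whenever gem, which generates the crontab entry for you:

# config/schedule.rb (whenever gem)
every 1.hour do
  # Hypothetical rake task that checks the table and sends emails
  rake "emails:check_and_send"
end

Option 2, with resque-scheduler, defining the schedule in an initializer and a plain Resque job:

# config/initializers/resque_scheduler.rb
require 'resque'
require 'resque-scheduler'

Resque.schedule = {
  'hourly_email_check' => {
    'every'       => '1h',
    'class'       => 'HourlyEmailJob',   # hypothetical job class, defined below
    'queue'       => 'mailers',
    'description' => 'Check the table and email users if needed'
  }
}

# app/jobs/hourly_email_job.rb
class HourlyEmailJob
  @queue = :mailers

  def self.perform
    # Hypothetical: replace with your own table check and mailer
    User.where(needs_alert: true).find_each do |user|
      AlertMailer.alarm_email(user).deliver
    end
  end
end

With resque-scheduler you also need to run the scheduler process (via its rake task) alongside your normal Resque workers.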

I did not end up using Resque, because I want a process on the Ubuntu server that runs on a schedule (every hour), for example checking the table contents each hour and sending an alert email to the users.
I built the process as a daemon and used rufus-scheduler for the scheduling:
require 'rufus-scheduler'

class TaskTest
  def task
    scheduler = Rufus::Scheduler.new

    # Check the table and send the alert email once per hour
    scheduler.every '1h' do
      msg = "Message"
      mailer = MailerProcess.new
      mailer.send_mail('email-address', 'password', 'to-email', 'Subject', msg)
      puts Time.now
    end

    # Block so the scheduling thread keeps running
    scheduler.join
  end
end

# Detach from the terminal, then fork a child that runs the scheduler
Process.daemon(true)

task_test = TaskTest.new
pid = Process.fork do
  task_test.task
end

Related

Rails how to schedule a rake email task from Development environment through Mailer for different timezones

I need to add a new email that can be sent to each user in the morning.
We have a multi-tenant per client system, and each client has multiple users with their respective timezones.
I have created a Rake task which loops through each user and sends the email.
Is there any way I can schedule the task for each user at a particular time (say 10 a.m.) in their respective timezones? Can I add a trigger?
I would prefer not to use another gem, and I'm using a Google Cloud Platform server.
You might be able to use ActiveJob for this, but you would still need something external to kick off the rake task (like cron). With ActiveJob you could do something like:
class EmailJob < ApplicationJob
  def perform
    # Your calc to figure out the hours between the user's local time and your server time
    hour_offset = 0
    schedule_job(hour_offset: hour_offset)
    do_work
  end

  def do_work
    # Your mailer code
  end

  def schedule_job(hour_offset:)
    self.class.set(wait: hour_offset.hours).perform_later
  end
end
Depending on your setup, you may want to install an additional queue backend gem like DelayedJob or Resque unless you're OK using the default adapter.
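If you do pick a backend, ActiveJob is pointed at it through the Rails config. A minimal sketch, assuming the Resque adapter and a hypothetical application module name:

# config/application.rb
module YourApp                        # hypothetical application module name
  class Application < Rails::Application
    # Tell ActiveJob which backend to enqueue into; swap :resque for
    # :delayed_job, :sidekiq, etc., depending on which gem you install.
    config.active_job.queue_adapter = :resque
  end
end

Something external such as cron (or the whenever gem mentioned earlier on this page) would still need to run the rake task that enqueues the first EmailJob for each user.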

Cron for sending emails after 3 days in django

I don't have any knowledge about cron. In my project, once a task is assigned to an employee, an email is sent to their email address. If the employee does not complete the task within the deadline, I want to send a reminder email every 3 days until the task is completed. Can anyone give me an idea of what I should do? The project is on my local environment; will that work, or do I need a server?
APScheduler would do the job. It would look like this:
from apscheduler.schedulers.background import BackgroundScheduler

bg_scheduler = BackgroundScheduler()
bg_scheduler.add_job(send_email, 'interval', days=3)  # run send_email every 3 days
bg_scheduler.start()

Django app with multiple instances - how to ensure daily email is only sent once?

I am building a Django app that uses APScheduler to send out a daily email at a scheduled time each day. Recently the decision was made to bump up the number of instances to two in order to always have something running in case one of the instances crashes. The problem I am now facing is how to prevent the daily email from being sent out by both instances. I've considered having it set some sort of flag on the database (Postgres) so the other instance knows not to send, but I think this method would create race conditions--the first instance wouldn't set the flag in time for the second instance to see or some similar scenario. Has anybody come up against this problem and how did you resolve it?
EDIT:
def start():
    scheduler = BackgroundScheduler()
    scheduler.add_job(send_daily_emails, 'cron', hour=11)
    scheduler.start()
So this is run when my app initializes--this creates a background scheduler that runs the send_daily_emails function at 11am each morning. The send_daily_emails function is exactly that--all it does is send a couple of emails. My problem is that if there are two instances of the app running, two separate background schedulers will be created and thus the emails will be sent twice each day instead of once.
You can use your proposed database solution with select_for_update
If you're using celery, why not use celery-beat + django-celery-beat?
You can use something like the following. Note the max_instances param.
def start():
    scheduler = BackgroundScheduler()
    scheduler.add_job(send_daily_emails, trigger='cron', hour='23', max_instances=1)
    scheduler.start()

How do I poll a web service from a GAE service in short intervals?

I'm developing a client app that relies on a GAE service. This service needs to get updates by polling a remote web service at an interval of less than one minute, so cron jobs are probably not the way to go here.
From the GAE service I need to poll the web service in intervals of a couple of seconds and then update the client app. So to break it down:
GAE service polls the remote web service in 5 sec intervals.
If a change is made, update the client app instantly.
Step 2 is solved already, but I'm struggling to find a good way to do this kind of polling. I have no control over the remote web service, so I can't make any changes on that end.
I've looked at the Task Queue API, but the documentation specifically says that it is unsuitable for interactive applications where a user is waiting for the result.
What would be the best way to solve this issue?
Use cron to schedule a bunch of task queue tasks with staggered ETAs:
from google.appengine.ext import deferred

def cron_job():  # scheduled to run every 5 minutes
    for i in xrange(0, 60 * 5, 5):
        deferred.defer(poll_web_service, _countdown=i)

def poll_web_service():
    pass  # do stuff
Alternatively, at this level of frequency, you might as well have a dedicated instance for this. You can do that with a manual-scaling microservice and have the request handler for /_ah/start/ never return, which lets it run forever (aside from periodic restarts). See this: https://cloud.google.com/appengine/docs/standard/python/how-instances-are-managed#instance_scaling
import time

import webapp2
from webapp2_extras.routes import RedirectRoute
from google.appengine.api import taskqueue

app = webapp2.WSGIApplication()

def on_change_detected(params=None):
    queue = taskqueue.Queue('default')
    task = taskqueue.Task(
        url='/some-url-on-your-default-service/',
        countdown=0,
        target='default',
        params={'params': params})
    queue.add(task)

class Start(webapp2.RequestHandler):
    def get(self):
        # Never return, so this manual-scaling instance keeps polling
        while True:
            time.sleep(5)
            if change_detected:  # YOUR LOGIC TO DETECT A CHANGE GOES HERE
                on_change_detected()

_routes = [
    RedirectRoute('/_ah/start', Start, name='start'),
]

for r in _routes:
    app.router.add(r)

django celery rabbitmq execute delay

I use Django-Celery + RabbitMQ to execute some async tasks. I define a queue 'sendmail' to execute the send-email task; sending the mail is triggered by a specific task (which has its own queue). But now I've run into a problem: after the specific task finishes, the mail is sometimes sent at once and sometimes takes 5-20 minutes. I want to know what causes this.
Django-Celery packages the task name and parameters into a message for RabbitMQ when task.delay() is called.
I want to know when the message reaches RabbitMQ, but with the web management tool I can only see the total number of messages, not each message's details, especially the time it arrived. In the Django-Celery log I can only see when the worker got the task from the broker and when it executed it. I want to know all the related timestamps so I can tell which step consumes most of the time.
Django-Celery does (I believe) report task data on a per-task basis. When you sync your database, it creates a bunch of monitoring tables which are accessible via the admin. However, in order for these tasks to be recorded in those tables, you need to run the celerycam program in the Django context (python ./manage.py celerycam). The celerycam program takes "snapshots" of your tasks every second or so (by default) and records information about them. Another useful tool for monitoring is the celerymon program (which also has to run in the Django context); it is a command-line ncurses program that reports real-time information about tasks as they occur. Finally, rabbitmqctl has a bunch of options that might help with monitoring.
This is a particularly useful page in the docs:
http://celery.github.com/celery/userguide/monitoring.html
Anyway, this is what I use to monitor my tasks when using celery.