Sidekiq - job stuck in enqueued - ruby-on-rails-4

I have a rails 4 app. I'm using paperclip and delayed_paperclip with sidekiq. The background job is being picked up by sidekiq but it just hangs forever in the Enqueued queue.
This is what I see on sidekiq:
{"job_class"=>"DelayedPaperclip::Jobs::ActiveJob", "job_id"=>"946856ca-c90e-4bb4-9f41-1f1e59269acb", "queue_name"=>"paperclip", "arguments"=>["User", 109, "avatar"]}
In my User model I have:
process_in_background :avatar
Procfile
worker: bundle exec sidekiq
I have read all the issues related to this question and still couldn't figure it out. Thoughts, anyone?
Thanks.

You've enqueued the job on the paperclip queue, but you haven't configured Sidekiq to pull jobs from that queue; by default it only watches the default queue. Start Sidekiq with both queues:
bundle exec sidekiq -q default -q paperclip
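Equivalently, the queues can be declared in config/sidekiq.yml so that a bare bundle exec sidekiq worker command picks them up (a sketch; listing both queues gives them equal weight):

```yaml
# config/sidekiq.yml
:queues:
  - default
  - paperclip
```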

Related

Sidekiq Upstart on AmazonLinux 2018.03

My goal is to add the sidekiq service to upstart on AmazonLinux 2018.03.
Since I want to upgrade Sidekiq to version 6, the process needs to be managed by the OS, e.g. by upstart.
I put a file to /etc/init/sidekiq.conf from here.
After that, the initctl list | grep sidekiq command showed nothing, so I tried sudo initctl reload-configuration, but nothing changed.
The status sidekiq command shows status: Unknown job: sidekiq.
What else do I need to do to add the sidekiq service to upstart?

Using Celery and Redis with my Django application

I would like to ask a question about Celery and Redis in my Django application.
As I will explain in detail below, I am running into some intermittent issues with these applications.
My environment:
I'm using :
Django 1.11.20
django-redis 4.7.0 / redis >= 2.10.5
celery 4.2.1
My local context: (Ubuntu virtual machine)
I have a Celery task which sends an email with an export file when the file contains more than 70,000 objects. The process works fine and I get the expected email with a link to download my file.
Celery is started manually : celery -A main worker -l info
My dev context: (FreeBSD server)
I have exactly the same process. But celery is daemonized on my server. I can execute celery service with : service celeryd_app start
When I launch my Celery task, sometimes I need to click the button several times before I see:
Received task: app.tasks.get_xls_export[64d31ba5-73d9-4048-b19a-a4902fd904d7]
But the main issue I have is this: my task sends an email using a specific template located at /templates/email/email.html.
Sometimes it sends this template, and sometimes it sends an old template which no longer exists in my project.
My question:
Is it possible that Celery/Redis has kept an old template in memory? Is there a way to clear the cache for this specific service only? I ask because I have other Celery services on this server for other applications.
Thank you very much !
It looks like you have pending tasks. If you want to clear them, you can purge from within Python:
from main.celery import app
app.control.purge()
or from the command line:
celery -A main purge
If you want to discard the tasks of a specific queue, you can do:
celery amqp queue.purge <queue name>

Sidekiq: How to assign process to a worker?

I'm having a bit of a struggle in Sidekiq with multiple workers and multiple Sidekiq processes.
I'd like to run three Sidekiq processes for an environment.
I have three worker classes (let's say "worker1", "worker2" and "worker3") and three Sidekiq processes (let's say "process1", "process2" and "process3").
In my current setup the workers run on whichever process is available. What I want is for worker1 to run only on process1, worker2 only on process2, and so on.
I'm a bit confused about how to achieve this, and I'd be glad to learn how to pin a worker to a particular Sidekiq process.
Sidekiq processes are:
process1: bundle exec sidekiq -q default
process2: bundle exec sidekiq -C config/myapp_sidekiq.yml
process3: bundle exec sidekiq -q process3
Thanks in advance...
You assign processes to pull jobs from queues, and you assign workers to queues:
process1: bundle exec sidekiq -q queue1

class Worker1
  include Sidekiq::Worker
  sidekiq_options queue: 'queue1'
end

# will only be processed by process1
Worker1.perform_async
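Extending the same idea to all three processes in the question, each process is started against exactly one queue and each worker class is pinned to the matching queue (a sketch, assuming each worker class includes Sidekiq::Worker and the queue names are yours to choose):

```ruby
# process1: bundle exec sidekiq -q queue1
# process2: bundle exec sidekiq -q queue2
# process3: bundle exec sidekiq -q queue3

class Worker1
  include Sidekiq::Worker
  sidekiq_options queue: 'queue1' # only process1 pulls from queue1
end

class Worker2
  include Sidekiq::Worker
  sidekiq_options queue: 'queue2' # only process2 pulls from queue2
end

class Worker3
  include Sidekiq::Worker
  sidekiq_options queue: 'queue3' # only process3 pulls from queue3
end
```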

Cron job issue with Rails - gem whenever

I am trying to do a simple Cron task using the gem whenever for rails.
How can I tell whenever to trigger a controller action ?
What I want to do:
every 1.minute do
  runner "Mycontroller.index", :environment => 'development'
end
I want to trigger the index action in my DataController every minute. The index action triggers a mailer.
I run : whenever --update-crontab football
Also, when I start/restart my server I get a small message as follows:
You have new mail in /var/mail/Antoine
[2015-04-12 18:38:17] INFO WEBrick 1.3.1
[2015-04-12 18:38:17] INFO ruby 2.1.3 (2014-09-19) [x86_64-darwin14.0]
[2015-04-12 18:38:17] INFO WEBrick::HTTPServer#start: pid=22476 port=3000
^C[2015-04-12 18:38:52] INFO going to shutdown ...
[2015-04-12 18:38:52] INFO WEBrick::HTTPServer#start done. Exiting
You have new mail in /var/mail/Antoine
Okay, I figured it out:
every 1.day, :at => '4:30 am' do
  command "curl http://localhost:3000", :environment => 'development'
end
I use the command curl to go to the route that triggers the controller action.
I also understood that I need to run whenever -w to write the cron task. I can then run crontab -l to see my current cron tasks, and crontab -r to remove them.
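For the original goal of running code every minute without going through HTTP, whenever's runner job type can call a class method directly; controllers aren't designed to be invoked outside the request cycle, so calling the mailer is the more idiomatic route. A sketch of config/schedule.rb, where DataMailer and digest_email are hypothetical names standing in for your own mailer:

```ruby
# config/schedule.rb
every 1.minute do
  # Invoke the mailer directly instead of curling a controller action;
  # DataMailer.digest_email is a placeholder for your actual mailer method.
  runner "DataMailer.digest_email.deliver_now", :environment => 'development'
end
```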

Sidekiq logging to both terminal and log file

I'm using Sidekiq to queue up some jobs in my Rails server. As per the Logging wiki, it's as simple as adding the following in config/sidekiq.yml
---
:verbose: false
:pidfile: ./tmp/pids/sidekiq.pid
:logfile: ./log/sidekiq.log
:concurrency: 25
However, now this only logs to that log file. What if I want to write to STDOUT as well (at least in development)?
Get rid of the logfile statement and pipe the output through tee instead:
bundle exec sidekiq | tee ./log/sidekiq.log
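If you'd rather solve it inside Ruby than in the shell, a minimal sketch is a fan-out log device that forwards every write to several IO targets. MultiIO here is a hypothetical helper, not part of Sidekiq; the stdlib Logger accepts any object responding to write and close:

```ruby
require "logger"
require "stringio"

# Hypothetical helper: forwards writes/closes to every underlying IO.
class MultiIO
  def initialize(*targets)
    @targets = targets
  end

  def write(*args)
    @targets.each { |t| t.write(*args) }
  end

  def close
    @targets.each(&:close)
  end
end

# Log to STDOUT and to an in-memory buffer standing in for log/sidekiq.log.
buffer = StringIO.new
logger = Logger.new(MultiIO.new($stdout, buffer))
logger.info("hello from sidekiq")
```

In an initializer you could then assign something like Sidekiq.logger = Logger.new(MultiIO.new($stdout, File.open("log/sidekiq.log", "a"))) so both destinations receive the same lines.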