Sidekiq job execution context - ruby-on-rails-4

I want to perform some tasks in the background, but with something like "run as". In other words, as if the task had been launched by the user from the context of their session.
Something like
def perform
  env['warden'].set_user(@task_owner_user)
  MyService::current_user_dependent_method
end
but I'm not sure it won't collide with other tasks. I'm not very familiar with Sidekiq.
Can I safely perform separate tasks, each with a different user context, somehow?

I'm not sure what you're shooting for with the "run as" context, but I've always set up Sidekiq jobs that need a unique object by passing the id to perform. This way the worker always knows which object it is trying to work on. Maybe this is what you're looking for?
def perform(id)
  user = User.find(id)
  user.current_user_dependent_method
end
Then set up a route in a controller for triggering this worker to start, something like:
def custom_route_for_performing_job
  @users = User.where(your_conditions)
  @users.each do |user|
    YourWorker.perform_async user.id
  end
  redirect_to :back, notice: "Starting background job for users_dependent_method"
end

The proper design is to use a server-side middleware plus a thread-local variable to set the current user context per job.
class MyServerMiddleware
  def call(worker, message, queue)
    Thread.current[:current_user] = message['uid'] if message['uid']
    yield
  ensure
    Thread.current[:current_user] = nil
  end
end
You'd create a client-side middleware to capture the current uid and put it in the message. In this way, the logic is encapsulated distinctly from any one type of Worker. Read more:
https://github.com/mperham/sidekiq/wiki/Middleware
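For illustration, here is a minimal sketch of what that client-side middleware and the registration could look like; the MyClientMiddleware name and the CurrentUser.id helper are assumptions, only the 'uid' key comes from the answer above.
class MyClientMiddleware
  def call(worker_class, message, queue, redis_pool)
    # Stamp the job payload with the enqueuing user's id. CurrentUser.id is a placeholder
    # for however your app exposes the logged-in user at enqueue time.
    message['uid'] ||= CurrentUser.id
    yield
  end
end

Sidekiq.configure_client do |config|
  config.client_middleware { |chain| chain.add MyClientMiddleware }
end

Sidekiq.configure_server do |config|
  config.server_middleware { |chain| chain.add MyServerMiddleware }
end
With that in place, any job enqueued from a request context carries the uid along, and the server middleware restores it into Thread.current[:current_user] before the job runs.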

Related

How to schedule a celery task without blocking Django

I have a Django service that registers a lot of clients and renders a payload containing a timer (let's say 800s), after which the client should be suspended by the service (change status REGISTERED to SUSPENDED in MongoDB).
I'm running celery with rabbitmq as broker as follows:
celery/tasks.py
@app.task(bind=True, name='suspend_nf')
def suspend_nf(pk):
    collection.update_one({'instanceId': str(pk)},
                          {'$set': {'nfStatus': 'SUSPENDED'}})
and calling the task inside Django view like:
api/views.py
def put(self, request, pk):
    now = datetime.datetime.now(tz=pytz.timezone(TIME_ZONE))
    timer = now + datetime.timedelta(seconds=response_data["heartBeatTimer"])
    suspend_nf.apply_async(eta=timer)
    response = Response(data=response_data, status=status.HTTP_202_ACCEPTED)
    response['Location'] = str(request.build_absolute_uri())
What am I missing here?
Are you saying that your view blocks completely, or that the view waits for the "ETA" before completing execution?
Did you receive any error?
Try using countdown parameter instead of eta.
In your case it's better because you don't need to manipulate dates.
Like this: suspend_nf.apply_async(countdown=response_data["heartBeatTimer"])
Let's see if your view behaves differently.
I finally found a workaround. Since I'm working on a small project, I don't really need Celery + RabbitMQ; simple threading does the job.
The task looks like this:
def suspend_nf(pk, timer):
    time.sleep(timer)
    collection.update_one({'instanceId': str(pk)},
                          {'$set': {'nfStatus': 'SUSPENDED'}})
And it's called inside the view like this:
timer = int(response_data["heartBeatTimer"])
thread = threading.Thread(target=suspend_nf, args=(pk, timer), kwargs={})
thread.setDaemon(True)
thread.start()

django viewflow - StartFunction not assigning task owner info

I followed the answer provided to "How to create a django ViewFlow process programmatically".
However, it is not assigning (or persisting) the owner info in the activation record.
@flow_start_view
def start_process(request):
    request.activation.prepare(request.POST or None,)
    request.activation.flow_task.owner = request.user
    request.activation.flow_task.task_title = "Start Process"
I also tried the code below, and it results in the error "'ManagedStartViewActivation' object has no attribute 'assign'".
@flow_start_view
def start_process(request):
    request.activation.prepare(request.POST or None,)
    request.activation.assign(request.user)
    request.activation.flow_task.task_title = "Start Process"
It's hard to understand what you are trying to achieve with that. @flow_start_view is the decorator for a Django view. That means the process is started manually by a user through the browser.
The StartActivation class has no assign method.
http://docs.viewflow.io/viewflow_core_activation.html#viewflow.activation.StartActivation
Assigning a task means preventing an existing task from being executed by another user. Start task instances do not exist in the database; each new start view invocation creates a new process instance, started with a new start task instance.
If you need to track the user who performed a start task, you can directly initialize the start activation with a user instance:
self.activation.prepare(request.POST or None, user=request.user)
Or just use the viewflow StartFlowMixin for your class-based view.

Waiting for AJAX with Capybara + Poltergeist

I've seen various implementations of the wait_for_ajax method that makes Capybara wait until all AJAX requests are completed before moving forward.
I've just switched to using Poltergeist as my JavaScript driver and I'm having trouble getting it to wait for the AJAX to complete on a test (see below)
Below is the implementation that was previously working with Selenium. The only thing I modified was the evaluation script:
Previous: page.evaluate_script("jQuery.active")
Updated: page.evaluate_script("$.active").to_i
If I insert a sleep statement it passes because it has enough time to finish the AJAX call, so I definitely know that's the issue.
Is there any error in this approach?
Thanks!
it "user can log in", js: true do
visit root_path
click_tab("log-in")
# Fill in form
fill_in "user[email]", with: "al-horford#hawks.com"
fill_in "user[password]", with: "sl4mdunkz"
# Click submit and wait for AJAX
within("#log-in") { click_button("Log In")) }
wait_for_ajax
# Expectations
expect(current_path).to eq(home_index_path)
end
def wait_for_ajax
  Timeout.timeout(Capybara.default_max_wait_time) do
    loop until finished_all_ajax_requests?
  end
end

def finished_all_ajax_requests?
  request_count = page.evaluate_script("$.active").to_i
  request_count && request_count.zero?
rescue Timeout::Error
end
In my experience, there are some situations wait_for_ajax will not wait for.
In our app, we have a 'to cart' button that triggers Ajax. It then waits for a callback event before it loads the next page. This is not picked up by wait_for_ajax.
What fixed it, and what is generally a better approach than waiting a given amount of time with sleep(), is to verify that you're on the next page using one of Capybara's waiting finders.
Simply replace expect(current_path).to eq(home_index_path)
with something like expect(page).to have_content("content_home_page_should_have")
That should work.
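As a sketch of how that looks in the spec above (the expected content string is the same placeholder as in the line above, not something from the original app):
within("#log-in") { click_button("Log In") }
# Capybara retries this waiting finder until the content appears or the wait time runs out,
# so the separate wait_for_ajax call is no longer needed.
expect(page).to have_content("content_home_page_should_have")
expect(current_path).to eq(home_index_path)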

Updating a value in rails

I am trying to update one value in Rails from inside a Sidekiq worker. The first time, whenever I restart the worker, it works, but after that it doesn't update the value. I also tried putting that logic into the model and calling that method from the worker itself, but the same thing happened.
def update_value
  self.update :compressing => false
end
name.update_value
OR
name.update_attribute(:compressing, 0)
OR
name.update_attribute(:compressing, false)
Nothing seems to work after the first time, but there is no error. Any hint would be really helpful.
Is this in a controller? You need to get the name instance first e.g.
def update_value
  @name = Name.find(params[:id])
  @name.update_attributes(compressing: false)
end
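If the update happens inside the Sidekiq worker itself, the same idea applies: re-fetch the record by id inside perform instead of reusing an instance captured when the worker booted. A minimal sketch under that assumption; CompressionWorker is an illustrative name, not from the original code:
class CompressionWorker
  include Sidekiq::Worker

  def perform(name_id)
    # Load a fresh copy of the record on every run so we never write to a stale object.
    name = Name.find(name_id)
    name.update_attribute(:compressing, false)
  end
end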

Rails controllers, rendering view from different controller, saving form inputs and error messages

Two controllers: Users and Tasks.
Main page for Users = Users/user_id.
Form on main page used to input data into tasks model.
This process is handled by the Tasks controller.
Successful input: redirect and load tasks from the database; OK, all working.
Unsuccessful input: I just need to re-render the main page so we keep the form input and the specialised (non-flash) error messages.
I can't seem to get the Tasks controller to deliver the original page. The error is Missing template users/1 (which is the correct path if I were to visit it in my browser).
Should I be calling an action and passing params? Any help for this beginner would be really appreciated.
def create
  @task = current_user.tasks.build(task_params)
  if @task.save
    flash[:success] = "New task created!"
    redirect_to user_url(current_user)
  else
    flash[:error] = "Task not saved! Please see guidance by form labels"
    render "users/#{current_user.id}"
  end
end

private

def task_params
  params.require(:task).permit(:label, :address, :content)
end
users/1 is not a template, it's a path. users/show is the template in this case.
Mind that the only reason to use render is to render a template in the scope of your current controller action rather than the normal one.
i.e. you probably need to have @user etc. set, or the users/show template will be upset about missing variables.
In this case it might be easier just to redirect_to user_path(id) and let the UsersController#show action set up the @user variable etc.
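If keeping the form input matters (as in the unsuccessful case above), the else branch needs to set up whatever users/show expects before rendering it. A minimal sketch under that assumption; the exact instance variables depend on what the users/show template actually uses:
def create
  @task = current_user.tasks.build(task_params)
  if @task.save
    flash[:success] = "New task created!"
    redirect_to user_url(current_user)
  else
    # Re-render the user's page with the invalid @task still in memory,
    # so the form keeps its input and @task.errors is available to the view.
    @user = current_user
    flash.now[:error] = "Task not saved! Please see guidance by form labels"
    render "users/show"
  end
end
flash.now is used instead of flash so the error message applies only to this rendered response and doesn't linger into the next request.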