Is there a way to hook into Elixir's built-in Mix tasks to execute a task after another one has completed?
I know you can do something like this:
defmodule Mix.Tasks.Other.Get do
  use Mix.Task

  @shortdoc "Other dependencies?"

  def run(_) do
    Mix.Task.run("deps.get")
  end
end
But I kind of want a task to run right after mix deps.get. I'm considering using make to wrap the commands if that makes the most sense (i.e. make deps would run mix deps.get and then mix other.get).
You can use a Mix alias for that:
defmodule MyApp.MixProject do
  use Mix.Project

  def project do
    [
      app: :my_app,
      version: "1.0.0",
      aliases: aliases()
    ]
  end

  defp aliases do
    [
      "deps.get": ["deps.get", "custom.task"]
    ]
  end
end
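With that alias in place, running the standard command fetches dependencies and then runs the second task in the list (custom.task above is a placeholder; it could be the other.get task from the question):
$ mix deps.get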
I have a pluggable app I'm developing for a Django system. In it, I have a task for creating notifications that looks something like so:
installable_app.tasks
@app.task(name='tasks.generate_notifications')
def generate_notifications(...):
    clients = get_list_of_clients()
    for client in clients:
        client_generate_notification.delay(client['name'], client['id'])
    return "Notification Generation Complete"

@app.task
def client_generate_notification(client_name, client_id):
    ...
    return result
Now I want this to run periodically, which can be accomplished with Celery Beat in settings. I also want it to run on its own queue:
settings.py:
CELERYBEAT_SCHEDULE = {
    'generate_schedule_notifications': {
        'task': 'tasks.generate_notifications',
        'schedule': crontab(hour=6, minute=0),
        'options': {'queue': 'notification_gen'},
        'args': ('schedule', 'Equipment', 'HVAC'),
    },
}
The first task, generate_notifications, runs correctly on the notification_gen queue, but the client_generate_notification subtasks run on the default queue.
I know I can specify the queues explicitly in the @task decorator, but since this is a Django app, I would rather they be specified where the tasks are actually run.
I've looked into using the CELERY_ROUTES option but when I tried it, it seemed to overwrite the queues for other tasks I was running.
Is the best practice to define all the possible queues in CELERY_ROUTES or is there a better way to set up my task so that they will both run on the same queue?
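One pattern that might help here, sketched against the task names above (this is only a sketch, with the signature abbreviated just like in the question, and it assumes a Celery version where Task.request.delivery_info is populated by the broker transport), is to have the parent task pass its own queue along when it dispatches the subtasks:
@app.task(bind=True, name='tasks.generate_notifications')
def generate_notifications(self, *args):
    clients = get_list_of_clients()
    # reuse the routing key this task was delivered with, falling back to the
    # queue named in CELERYBEAT_SCHEDULE when the info is unavailable
    queue = (self.request.delivery_info or {}).get('routing_key') or 'notification_gen'
    for client in clients:
        client_generate_notification.apply_async(
            args=(client['name'], client['id']),
            queue=queue)
    return "Notification Generation Complete"
This keeps the queue choice in the calling code rather than in the @task decorator or a global CELERY_ROUTES entry.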
Do you want something like this?
Given a simple custom generator:
# lib/generators/custom_model/custom_model_generator.rb
class CustomModelGenerator < Rails::Generators::NamedBase
  def rails_generate_model
    generate 'model', "#{file_name} #{args.join(' ')}"
  end
end
Which is used like so:
$ rails generate custom_model ModelName
How can I define the destroy behavior for this custom generator?
$ rails destroy custom_model ModelName
Actually, my problem is that this generator uses the generate method to call an existing Rails generator, but I couldn't find any way to reverse what that generate call did.
I used to use the following for my own generators (which don't call any existing generator) and write my own "destroy" routines:
case self.behavior
when :invoke
  # do that stuff
when :revoke
  # undo it!
end
I read a lot about this across the web, but found nothing relevant or up-to-date, so any advice is more than welcome.
Thanks for reading.
You can use the following piece of code (of course you can replace :scaffold with any other generator):
case self.behavior
when :invoke
  generate :scaffold, "#{file_name} #{attributes}"
  # Or equally:
  # Rails::Generators.invoke :scaffold, args, :behavior => :invoke
when :revoke
  Rails::Generators.invoke :scaffold, [file_name], :behavior => :revoke
end
I want to perform some tasks in the background, but with something like "run as". In other words, as if the task were launched by a user from the context of their session.
Something like:
def perform
  env['warden'].set_user(@task_owner_user)
  MyService::current_user_dependent_method
end
but I'm not sure it won't collide with other tasks. I'm not very familiar with Sidekiq.
Can I safely perform separate tasks, each with a different user context, somehow?
I'm not sure what you're shooting for with the "run as" context, but I've always set up Sidekiq jobs that need a specific object by passing its id into perform. This way the worker always knows which object it is working on. Maybe this is what you're looking for?
def perform(id)
  user = User.find(id)
  user.current_user_dependent_method
end
Then set up a route in a controller to trigger this worker, something like:
def custom_route_for_performing_job
  @users = User.where(your_conditions)
  @users.each do |user|
    YourWorker.perform_async(user.id)
  end
  redirect_to :back, notice: "Starting background job for users_dependent_method"
end
The proper design is to use a server-side middleware plus a thread-local variable to set the current user context per job.
class MyServerMiddleware
  def call(worker, message, queue)
    # expose the enqueuing user's id to the job via a thread-local
    Thread.current[:current_user] = message['uid'] if message['uid']
    yield
  ensure
    # always clear it so the value never leaks into the next job on this thread
    Thread.current[:current_user] = nil
  end
end
You'd create a client-side middleware to capture the current uid and put it in the message. In this way, the logic is encapsulated distinctly from any one type of Worker. Read more:
https://github.com/mperham/sidekiq/wiki/Middleware
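For completeness, here is a minimal sketch of the client-side half described above, assuming the enqueuing code has already put the acting user's id into Thread.current[:current_user] (the class name and the 'uid' key are illustrative and must match whatever the server middleware reads):
class MyClientMiddleware
  def call(worker_class, message, queue, redis_pool)
    # copy the acting user's id into the job payload so the server
    # middleware can restore it when the job is performed
    message['uid'] ||= Thread.current[:current_user]
    yield
  end
end

Sidekiq.configure_client do |config|
  config.client_middleware do |chain|
    chain.add MyClientMiddleware
  end
end

Sidekiq.configure_server do |config|
  config.server_middleware do |chain|
    chain.add MyServerMiddleware
  end
end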
Yeah, I know this question is silly, newbie and simple, but I still can't figure it out.
I've created a class (in the app/minions/ directory) to parse auth hashes from 3rd-party services (like Google, Twitter, etc.). It looks like this:
class AuthHash
  def initialize(hash)
    @hash = hash
    @provider = hash[:provider]
    @uid = hash[:uid]
    create_user_hash
  end

  def create_user_hash
    @user_hash = send("parse_hash_from_" << @hash[:provider], @hash)
  end

  def credentials
    {provider: @provider, uid: @uid}
  end

  def user_hash
    @user_hash
  end

  private

  # parse_hash_from_* methods here
end
I've added that directory to the autoload path, so I can use it in my controllers. Now I want to write some tests for it.
I'm using RSpec with FactoryGirl for testing, so I started by adding a factory called auth_hashes.rb to spec/factories/, but I can't define a hash as a field in a factory.
So I moved the declaration into spec/minions/auth_hash_spec.rb.
require 'spec_helper'

describe AuthHash do
  before(:each) do
    @auth_hash = AuthHash.new({
      :provider => "google_oauth2",
      :uid => "123456789",
      :info => { :name => "JohnDoe", :email => "john@company_name.com", :first_name => "John",
                 :last_name => "Doe", :image => "https://lh3.googleusercontent.com/url/photo.jpg" },
      :credentials => { :token => "token", :refresh_token => "another_token",
                        :expires_at => 1354920555, :expires => true },
      :extra => { :raw_info => { :id => "123456789", :email => "user@domain.example.com",
                                 :verified_email => true, :name => "JohnDoe", :given_name => "John",
                                 :family_name => "Doe", :link => "https://plus.google.com/123456789",
                                 :picture => "https://lh3.googleusercontent.com/url/photo.jpg",
                                 :gender => "male", :birthday => "0000-06-25", :locale => "en",
                                 :hd => "company_name.com" } }
    })
  end
end
But still it does not seem to work.
I know this should be a lot simpler than what I'm trying to do, but I can't figure it out.
Add something like this at the top of that new spec file (spec/minions/auth_hash_spec.rb):
require Rails.root.to_s + '/app/minions/auth_hash.rb'
And then write your tests.
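From there, a spec might look something like this. This is only a sketch: the sample hash is trimmed (in a real spec you would probably pass the full hash from the question so the private parse_hash_from_google_oauth2 method has everything it needs), and the expect syntax assumes RSpec 2.11 or newer.
require 'spec_helper'
require Rails.root.to_s + '/app/minions/auth_hash.rb'

describe AuthHash do
  let(:raw_hash) do
    { :provider => "google_oauth2",
      :uid => "123456789",
      :info => { :name => "JohnDoe", :email => "john@company_name.com" } }
  end

  subject(:auth_hash) { AuthHash.new(raw_hash) }

  it "exposes the provider and uid as credentials" do
    expect(auth_hash.credentials).to eq(:provider => "google_oauth2", :uid => "123456789")
  end
end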
How do I forcibly skip a unit test in Django?
@skipIf and @skipUnless are all I found, but I just want to skip a test right now for debugging purposes while I get a few things straightened out.
Python's unittest module has a few decorators:
There is plain old @skip:
from unittest import skip

@skip("Don't want to test")
def test_something():
    ...
If you can't use @skip for some reason, @skipIf should work. Just trick it into always skipping by passing True as the first argument:
from unittest import skipIf

@skipIf(True, "I don't want to run this test yet")
def test_something():
    ...
See the unittest docs on skipping tests for details.
If you are looking to simply not run certain test files, the best way is probably to use fab or another tool and only run the particular tests you care about.
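For example, Django's test runner accepts dotted test labels, so you can name exactly what you want to run (the app, class and method names here are hypothetical):
$ ./manage.py test myapp.tests.SomeTestCase.test_something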
Django 1.10 allows use of tags for unit tests. You can then use the --exclude-tag=tag_name flag to exclude certain tags:
from django.test import TestCase, tag

class SampleTestCase(TestCase):
    @tag('fast')
    def test_fast(self):
        ...

    @tag('slow')
    def test_slow(self):
        ...

    @tag('slow', 'core')
    def test_slow_but_core(self):
        ...
In the above example, to exclude your tests with the "slow" tag you would run:
$ ./manage.py test --exclude-tag=slow
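The inverse works too: if the goal were instead to run only certain tagged tests, the --tag flag selects them rather than excluding them:
$ ./manage.py test --tag=core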