How to pass parameters to hooks in python grequests - python-2.7

According to the Requests documentation, event hooks can be added to the .get() function.
requests.get('http://httpbin.org', hooks=dict(response=print_url))

def print_url(r, *args, **kwargs):
    print(r.url)
This is fine, but how do I set *args with custom parameters? For example, I want to pass some custom values to print_url(). How do I set those in *args? Something like this fails:
args = ("search_item", search_item)
rs = (grequests.get(u, hooks={'response': [parse_books],'args': [args]}) for u in urls)

You cannot specify extra arguments to pass to a response hook. If you want extra information to be available inside the hook, write a factory function that captures it and returns the function to be passed as the hook, e.g.,
def hook_factory(*factory_args, **factory_kwargs):
    def response_hook(response, *request_args, **request_kwargs):
        # use factory_args / factory_kwargs
        # etc.
        return None  # or the modified response
    return response_hook

grequests.get(u, hooks={'response': [hook_factory(search_item=search_item)]})

The response_hook function has to return a response object. The simplest workaround is to modify the response object inside the hook returned by hook_factory:
def hook_factory(*factory_args, **factory_kwargs):
    def response_hook(response, *request_args, **request_kwargs):
        # use factory_kwargs
        # etc.
        response.meta1 = 'meta1'  # add data
        response.meta2 = 'meta2'
        # etc.
        return response  # the modified response
    return response_hook
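Putting this together with the factory from the previous answer, a minimal end-to-end sketch (the URLs and the 'books' value below are just placeholders) could look like this:

import grequests

def hook_factory(search_item):
    def response_hook(response, *request_args, **request_kwargs):
        # search_item is captured from the enclosing scope
        response.search_item = search_item  # attach custom data to the response
        return response
    return response_hook

urls = ['http://httpbin.org/get', 'http://httpbin.org/headers']  # placeholder URLs
rs = (grequests.get(u, hooks={'response': [hook_factory('books')]}) for u in urls)

for response in grequests.map(rs):
    print('{} -> {}'.format(response.url, response.search_item))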
hope this helps.

Related

Can I pass arguments to a function in monkeypatch.setattr for a function used multiple times in one view?

My web application makes API calls to Spotify. In one of my Flask views I use the same method with different endpoints. Specifically:
sh = SpotifyHelper()
...

@bp.route('/profile', methods=['GET', 'POST'])
@login_required
def profile():
    ...
    profile = sh.get_data(header, 'profile_endpoint')
    ...
    playlist = sh.get_data(header, 'playlist_endpoint')
    ...
    # There are 3 more like this to different endpoints -- history, top_artists, top_tracks
    ...
    return render_template(
        'profile.html',
        playlists=playlists['items'],
        history=history['items'],
        ...
    )
I do not want to make an API call during testing so I wrote a mock.json that replaces the JSON response from the API. I have done this successfully when the method is only used once per view:
class MockResponse:
    @staticmethod
    def profile_response():
        with open(path + '/music_app/static/JSON/mock.json') as f:
            response = json.load(f)
        return response

@pytest.fixture
def mock_profile(monkeypatch):
    def mock_json(*args, **kwargs):
        return MockResponse.profile_response()
    monkeypatch.setattr(sh, "get_data", mock_json)
My problem is that I need to call get_data to different endpoints with different responses. My mock.json is written:
{'playlists': {'items': [# List of playlist data]},
'history': {'items': [# List of playlist data]},
...
So for each API endpoint I need something like
playlists = mock_json['playlists']
history = mock_json['history']
I can write mock_playlists(), mock_history(), etc., but how do I write a monkeypatch for each? Is there some way to pass the endpoint argument to monkeypatch.setattr(sh, "get_data", mock_???)?
from unittest.mock import MagicMock

# other code...

mocked_response = MagicMock(side_effect=[
    # write it in the order of the calls you need
    profile_response_1, profile_response_2, ..., profile_response_n
])
monkeypatch.setattr(sh, "get_data", mocked_response)
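If keeping the exact call order in sync with the view becomes awkward, another option is to dispatch on the endpoint argument inside a single replacement function. This is only a sketch reusing path and sh from above: it assumes get_data receives the endpoint name as its second argument (as in the view), and the ENDPOINT_TO_KEY mapping is hypothetical and must match how mock.json is actually keyed.

import json
import pytest

# hypothetical mapping from the endpoint names used in the view
# to the top-level keys in mock.json -- adjust to the real names
ENDPOINT_TO_KEY = {
    'profile_endpoint': 'profile',
    'playlist_endpoint': 'playlists',
    'history_endpoint': 'history',
}

@pytest.fixture
def mock_get_data(monkeypatch):
    with open(path + '/music_app/static/JSON/mock.json') as f:
        mock_data = json.load(f)

    def mock_json(header, endpoint, *args, **kwargs):
        # return the slice of mock.json that matches the requested endpoint
        return mock_data[ENDPOINT_TO_KEY[endpoint]]

    monkeypatch.setattr(sh, "get_data", mock_json)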

Use pytest fixture in a function decorator

I want to build a decorator for my test functions which has several uses. One of them is helping to add properties to the generated junitxml.
I know there's a built-in pytest fixture for this, record_property, that does exactly that. How can I use this fixture inside my decorator?
def my_decorator(arg1):
    def test_decorator(func):
        def func_wrapper():
            # hopefully somehow use record_property with arg1 here
            # do some other logic here
            return func()
        return func_wrapper
    return test_decorator

@my_decorator('some_argument')
def test_this():
    pass  # do actual assertions etc.
I know I can pass the fixture directly into every test function and use it in the tests, but I have a lot of tests and it seems extremely redundant to do this.
Also, I know I can use conftest.py and create a custom marker and call it in the decorator, but I have a lot of conftest.py files and I don't manage all of them alone so I can't enforce it.
Lastly, trying to import the fixture directly into my decorator module and then using it results in an error, so that's a no-go as well.
Thanks for the help
It's a bit late, but I came across the same problem in our code base. I did find a solution, but it is rather hacky, so I can't guarantee that it works with older versions or will keep working in the future.
Hence I asked if there is a better solution. You can check it out here: How to use pytest fixtures in a decorator without having it as argument on the decorated function
The idea is to basically register the test functions which are decorated and then trick pytest into thinking they would require the fixture in their argument list:
import functools
from inspect import signature
from typing import Any, List

import pytest
from _pytest.config import Config
from _pytest.nodes import Item


class RegisterTestData:
    # global testdata registry
    testdata_identifier_map = {}  # Dict[str, List[str]]

    def __init__(self, testdata_identifier, direct_import=True):
        self.testdata_identifier = testdata_identifier
        self.direct_import = direct_import
        self._always_pass_my_import_fixture = False

    def __call__(self, func):
        if func.__name__ in RegisterTestData.testdata_identifier_map:
            RegisterTestData.testdata_identifier_map[func.__name__].append(self.testdata_identifier)
        else:
            RegisterTestData.testdata_identifier_map[func.__name__] = [self.testdata_identifier]

        # We need to know if we decorate the original function, or if it was already
        # decorated with another RegisterTestData decorator. This is necessary to
        # determine if the direct_import fixture needs to be passed down or not
        if getattr(func, "_decorated_with_register_testdata", False):
            self._always_pass_my_import_fixture = True
        setattr(func, "_decorated_with_register_testdata", True)

        @functools.wraps(func)
        @pytest.mark.usefixtures("my_import_fixture")  # register the fixture to the test in case it doesn't have it as argument
        def wrapper(*args: Any, my_import_fixture, **kwargs: Any):
            # Because of the signature of the wrapper, my_import_fixture is not part
            # of the kwargs which is passed to the decorated function. In case the
            # decorated function has my_import_fixture in the signature we need to pack
            # it back into the **kwargs. This is always and especially true for the
            # wrapper itself even if the decorated function does not have
            # my_import_fixture in its signature
            if self._always_pass_my_import_fixture or any(
                "my_import_fixture" in p.name for p in signature(func).parameters.values()
            ):
                kwargs["my_import_fixture"] = my_import_fixture
            if self.direct_import:
                my_import_fixture.import_all()
            return func(*args, **kwargs)
        return wrapper


def pytest_collection_modifyitems(config: Config, items: List[Item]) -> None:
    for item in items:
        if item.name in RegisterTestData.testdata_identifier_map and "my_import_fixture" not in item._fixtureinfo.argnames:
            # Hack to trick pytest into thinking my_import_fixture is part of the argument list of the original function
            # Only works because of @pytest.mark.usefixtures("my_import_fixture") in the decorator
            item._fixtureinfo.argnames = item._fixtureinfo.argnames + ("my_import_fixture",)
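For completeness, usage then looks roughly like this (a sketch; "some_testdata" is a placeholder identifier and my_import_fixture is assumed to be defined in a conftest.py):

@RegisterTestData("some_testdata")
def test_something():
    # my_import_fixture.import_all() has already run inside the decorator,
    # even though the fixture never appears in the test's own signature
    assert True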

Removing boilerplate logging setup

I keep writing this very common code to set up logging:
def has_host_running(self):
    log = CustomLogger.action_logger(name=sys._getframe().f_code.co_name, **self.menvargs)
    result = self.bash_query.check_is_server_available(log)
    self.results[sys._getframe().f_code.co_name] = result
    log.debug('result: {}'.format(result))
    return result
I'm looking for a DRY way to implement this behavior. The key is that I need to be able to reference/call a log statement from within the function and any child function calls.
******************************edit2*******************************
The most elegant sloppy solution I could get to work.
Loosely adapted from (and heavily based on) https://wiki.python.org/moin/PythonDecoratorLibrary#Controllable_DIY_debug.
class ActionLog:
    def __init__(self):
        pass

    def __call__(self, f):
        log = self.get_actionlogger(name=f.func_name)

        def newf(log, *args, **kwds):
            # pre-function call actions:
            log.debug('Start.')
            log.debug(' info: params= {args}, {kwds}'.format(args=args, kwds=kwds))
            # function call
            f_result = f(log, *args, **kwds)
            # post-function call actions:
            log.debug(' info: result= {result}'.format(result=f_result))
            log.debug('Complete.')
            return f_result

        # changes to be made to returned function
        newf.__doc__ = f.__doc__
        return newf(log)

    def get_actionlogger(self, name, **kwargs):
        import logging
        import ast
        from Helper import ConfigManager

        logname = 'action.{func_name}'.format(func_name=name)
        logger = logging.getLogger(logname)
        # value stored in ini file.
        # either DEBUG or ERROR right now.
        # todo: store actual logging_level
        # todo: store an array/dict for log_name in .ini
        #       this will allow multiple parameters to be stored within the single entry, ex:
        #       action.check_stuff: logging_level=DEBUG,handler_stream=TRUE,handler_file=stuff.log,formatter='{name} - {message}'
        conf_logging_level = ConfigManager('config.ini').get_section_dict('CustomLogging_Level').get(logname, 'DEBUG')
        logging_level = logging.DEBUG
        if conf_logging_level == 'DEBUG':
            logging_level = logging.DEBUG
        if conf_logging_level == 'ERROR':
            logging_level = logging.ERROR
        logger.setLevel(logging_level)
        # very hacky:
        # while logging.getLogger is a singleton, adding the handler is not.
        # without this check, this code will result in duplicate handlers added.
        # currently will not edit/replace the existing handler.
        # currently will not allow another handler to be added after the first.
        # main issue here is that I can't figure out how to determine labels/names within logger.handlers
        # todo: properly label handler
        # todo: check for existing labels & types (file, stream, etc)
        if len(logger.handlers) == 0:
            ch = logging.StreamHandler()
            ch.setLevel(logging_level)
            ch.set_name(logname)
            # create formatter
            formatter = logging.Formatter(' %(name)s - %(message)s')
            ch.setFormatter(formatter)
            logger.addHandler(ch)
        return logger


@ActionLog()
def check_stuff(log, *args, **kwds):
    result = True
    log.debug(' info: text call from within function.')
    return result

if check_stuff:
    print 'check_stuff is true.'
So it works with the one parameter log being passed into the decorated function; it does not work if the function does not have the log parameter. I'm not sure how to handle further parameters... likely with *args or **kwargs, but this solution doesn't handle that.
Apologies on the code formatting... I can't seem to get the class decorator in the same block as the decorated func and func call.
* v3.0 *
v2 had a problem with multiple arguments. Solved that and streamlined v3 quite a bit.
def ActionLog(wrapped):
    def _wrapper(*args, **kwargs):
        log = CustomLogger.action_logger(wrapped.func_name)
        # append the logger as an extra positional argument
        newargs = list(args)
        newargs.append(log)
        result = wrapped(*newargs, **kwargs)
        # log the result without clobbering the wrapped function's return value
        if result is None:
            logged = ''
        else:
            logged = '\tResult: {}'.format(result)
        log.debug('Complete.{}'.format(logged))
        return result
    return _wrapper
This works better and has replaced most of my boilerplate logging calls for actions that take a single argument.
Still having problems with named args vs kwargs. I'd like to just pass args through and add my custom items to kwargs, but that had a few issues.
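One possible way around the named args vs kwargs problem (just a sketch, not tested as part of the versions above) is to pass the logger through **kwargs and let functools.wraps keep the wrapped function's metadata; it assumes the decorated functions agree to pop the logger out of their kwargs:

import functools

def ActionLog(wrapped):
    @functools.wraps(wrapped)
    def _wrapper(*args, **kwargs):
        log = CustomLogger.action_logger(wrapped.__name__)
        log.debug('Start.')
        # positional args pass through untouched; the logger rides along in kwargs
        kwargs['log'] = log
        result = wrapped(*args, **kwargs)
        log.debug('Complete.\tResult: {}'.format(result))
        return result
    return _wrapper

@ActionLog
def check_stuff(*args, **kwds):
    log = kwds.pop('log')
    log.debug(' info: text call from within function.')
    return True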

Using tastypie api from other views

I am calling the Tastypie API from normal Django views.
def test(request):
    view = resolve("/api/v1/albumimage/like/user/%d/" % 2)
    accept = request.META.get("HTTP_ACCEPT")
    accept += ",application/json"
    request.META["HTTP_ACCEPT"] = accept
    res = view.func(request, **view.kwargs)
    return HttpResponse(res._container)
Using tastypie resource in view and Call an API on my server from another view achieve the same thing, but both seem harder.
Is my way of calling the API acceptable? Besides, it would be awesome if I could get the result as a Python dictionary instead of JSON. Is that possible?
If you need a dictionary, it means that you must design your application better. Don't do important stuff in your views, nor in the Tastypie methods. Refactor it to have common functionality.
As a general rule, views must be small. No more than 15 lines. That makes the code readable, reusable and easy to test.
I'll provide an example to make it clearer. Suppose that in that Tastypie method you are creating a Like object and maybe sending a signal:
class AlbumImageResource(ModelResource):
    def like_method(self, request, **kwargs):
        # Do some method checking
        Like.objects.create(
            user=request.user,
            object=request.data.get("object")
        )
        signals.liked_object(request.user, request.data.get("object"))
        # Something more
But if you need to reuse that behavior in a view, the proper thing would be to factor it out into a separate function:
# myapp.utils
def like_object(user, object):
    like = Like.objects.create(
        user=user,
        object=object
    )
    signals.liked_object(user, object)
    return like
Now you can call it from your API method and your view:
class AlbumImageResource(ModelResource):
    def like_method(self, request, **kwargs):
        # Do some method checking
        like_object(request.user, request.data.get("object"))  # Here!
And in your view...
# Your view
def test(request, object_id):
    obj = get_object_or_404(Object, id=object_id)
    like_object(request.user, obj)
    return HttpResponse()
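As for getting a plain dictionary when you do call a resource from a view, deserializing the response body yourself is enough. A sketch based on the test view from the question (same imports):

import json

def test(request):
    view = resolve("/api/v1/albumimage/like/user/%d/" % 2)
    request.META["HTTP_ACCEPT"] = "application/json"
    res = view.func(request, **view.kwargs)
    data = json.loads(res.content)  # a plain Python dict instead of a JSON string
    # ... use data here ...
    return HttpResponse(res.content, content_type="application/json")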
Hope it helps.

How to write a request filter / preprocessor in Django

I am writing an application in Django which uses [year]/[month]/[title-text] in the URL to identify news items. To manage the items I have defined a number of URLs, each starting with the above prefix.
urlpatterns = patterns('msite.views',
    (r'^(?P<year>[\d]{4})/(?P<month>[\d]{1,2})/(?P<slug>[\w]+)/edit/$', 'edit'),
    (r'^(?P<year>[\d]{4})/(?P<month>[\d]{1,2})/(?P<slug>[\w]+)/$', 'show'),
    (r'^(?P<year>[\d]{4})/(?P<month>[\d]{1,2})/(?P<slug>[\w]+)/save$', 'save'),
)
I was wondering if there is a mechanism in Django which allows me to preprocess a given request to the views edit, show and save. It could parse the parameters, e.g. year=2010, month=11, slug='this-is-a-title', and extract a model object out of them.
The benefit would be, that I could define my views as
def show(news_item):
    '''does some stuff with the news item, doesn't have to care
    about how to extract the item from request data'''
    ...
instead of
def show(year, month, slug):
    '''extract the model instance manually inside this method'''
    ...
What is the Django way of solving this?
Or in a more generic way, is there some mechanism to implement request filters / preprocessors such as in JavaEE and Ruby on Rails?
Maybe you need date-based generic views and create/update/delete generic views?
One way of doing this is to write a custom decorator. I tested this in one of my projects and it worked.
First, a custom decorator. This one will have to accept other arguments besides the function, so we declare another decorator to make it so.
decorator_with_arguments = lambda decorator: lambda *args, **kwargs: lambda func: decorator(func, *args, **kwargs)
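For readability, that one-liner is equivalent to the following expanded form:

def decorator_with_arguments(decorator):
    def receive_arguments(*args, **kwargs):
        def apply_to_function(func):
            # bind the extra arguments and hand the function to the real decorator
            return decorator(func, *args, **kwargs)
        return apply_to_function
    return receive_arguments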
Now the actual decorator:
@decorator_with_arguments
def parse_args_and_create_instance(function, klass, attr_names):
    def _function(request, *args, **kwargs):
        model_attributes_and_values = dict()
        for name in attr_names:
            value = kwargs.get(name, None)
            if value:
                model_attributes_and_values[name] = value
        model_instance = klass.objects.get(**model_attributes_and_values)
        return function(model_instance)
    return _function
This decorator expects two additional arguments besides the function it is decorating. These are respectively the model class for which the instance is to be prepared and injected and the names of the attributes to be used to prepare the instance. In this case the decorator uses the attributes to get the instance from the database.
And now, a "generic" view making use of a show function.
def show(model_instance):
    return HttpResponse(model_instance.some_attribute)

show_order = parse_args_and_create_instance(Order, ['order_id'])(show)
And another:
show_customer = parse_args_and_create_instance(Customer, ['id'])(show)
In order for this to work the URL configuration parameters must contain the same key words as the attributes. Of course you can customize this by tweaking the decorator.
# urls.py
...
url(r'^order/(?P<order_id>\d+)/$', 'show_order', {}, name='show_order'),
url(r'^customer/(?P<id>\d+)/$', 'show_customer', {}, name='show_customer'),
...
Update
As @rebus correctly pointed out, you also need to investigate Django's generic views.
Django is Python after all, so you can easily do this:
def get_item(*args, **kwargs):
    year = kwargs['year']
    month = kwargs['month']
    slug = kwargs['slug']
    # return item based on year, month, slug...

def show(request, *args, **kwargs):
    item = get_item(request, *args, **kwargs)
    # rest of your logic using item
    # return HttpResponse...
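A concrete version of get_item could look like the sketch below; NewsItem, its pub_date field and the title attribute are hypothetical names standing in for whatever the real model defines:

from django.shortcuts import get_object_or_404
from django.http import HttpResponse

from msite.models import NewsItem  # hypothetical model

def get_item(**kwargs):
    # look up the news item from the captured URL parameters
    return get_object_or_404(
        NewsItem,
        pub_date__year=int(kwargs['year']),
        pub_date__month=int(kwargs['month']),
        slug=kwargs['slug'],
    )

def show(request, *args, **kwargs):
    item = get_item(**kwargs)
    return HttpResponse(item.title)  # hypothetical field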