
When defining a decorator using a class, how do I automatically transfer over __name__, __module__ and __doc__? Normally, I would use the @wraps decorator from functools. Here's what I did instead for a class (this is not entirely my code):

class memoized: """Decorator that caches a function's return value each time it is called. If called later with the same arguments, the cached value is returned, and not re-evaluated. """ def __init__(self, func): super().__init__() self.func = func self.cache = {} def __call__(self, *args): try: return self.cache[args] except KeyError: value = self.func(*args) self.cache[args] = value return value except TypeError: # uncacheable -- for instance, passing a list as an argument. # Better to not cache than to blow up entirely. return self.func(*args) def __repr__(self): return self.func.__repr__() def __get__(self, obj, objtype): return functools.partial(self.__call__, obj) __doc__ = property(lambda self:self.func.__doc__) __module__ = property(lambda self:self.func.__module__) __name__ = property(lambda self:self.func.__name__) 

Is there a standard decorator to automate the creation of __name__, __module__ and __doc__? Also, is there one to automate the __get__ method (I assume that's for creating bound methods)? Are there any missing methods?


6 Answers


Everyone seems to have missed the obvious solution. Using functools.update_wrapper:

    >>> import functools
    >>> class memoized(object):
    ...     """Decorator that caches a function's return value each time it is called.
    ...     If called later with the same arguments, the cached value is returned,
    ...     and not re-evaluated.
    ...     """
    ...     def __init__(self, func):
    ...         self.func = func
    ...         self.cache = {}
    ...         functools.update_wrapper(self, func)  ## TA-DA! ##
    ...     def __call__(self, *args):
    ...         pass  # Not needed for this demo.
    ...
    >>> @memoized
    ... def fibonacci(n):
    ...     """fibonacci docstring"""
    ...     pass  # Not needed for this demo.
    ...
    >>> fibonacci
    <__main__.memoized object at 0x0156DE30>
    >>> fibonacci.__name__
    'fibonacci'
    >>> fibonacci.__doc__
    'fibonacci docstring'
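For completeness, here is a minimal sketch (not part of the original answer) of the same class with a working __call__, so the metadata copy and the caching can be seen together:

    import functools

    class memoized:
        """Decorator that caches a function's return value for each set of
        positional arguments it is called with."""
        def __init__(self, func):
            self.func = func
            self.cache = {}
            # Copies __name__, __doc__, __module__ (and more) and sets __wrapped__.
            functools.update_wrapper(self, func)

        def __call__(self, *args):
            try:
                return self.cache[args]
            except KeyError:
                value = self.func(*args)
                self.cache[args] = value
                return value
            except TypeError:
                # Unhashable arguments (e.g. a list): skip caching.
                return self.func(*args)

    @memoized
    def fibonacci(n):
        """fibonacci docstring"""
        return n if n < 2 else fibonacci(n - 1) + fibonacci(n - 2)

    print(fibonacci(10))          # 55
    print(fibonacci.__name__)     # 'fibonacci'
    print(fibonacci.__wrapped__)  # the undecorated function, also set by update_wrapper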

5 Comments

The __name__ and __doc__ are set on the instance, but not on the class, which is what is always used by help(instance). To fix this, a class-based decorator implementation cannot be used; instead the decorator must be implemented as a function (see the sketch after these comments). For details see stackoverflow.com/a/25973438/1988505.
I'm not sure why my answer was suddenly marked down yesterday. No one asked about getting help() to work. In 3.5, inspect.signature() and inspect.Signature.from_callable() got a new follow_wrapped option; perhaps help() should do the same?
Fortunately, IPython's fibonacci? does show both the doc from the wrapper and the memoized class, so you get both.
This does not produce picklable class decorators
Update: Starting from Python 3.9, help() now shows the wrapped function's docstring, though it still lacks the argument list
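To make the help() point concrete, here is a small sketch (mine, not from the comments; help()'s exact behavior varies by Python version, as noted above): help(instance) documents type(instance), so with a class-based wrapper it is the wrapper class's docstring that matters, whereas a plain function wrapper decorated with functools.wraps exposes the wrapped docstring directly.

    import functools

    class class_wrapper:
        """Wrapper class docstring."""
        def __init__(self, func):
            self.func = func
            functools.update_wrapper(self, func)
        def __call__(self, *args, **kwargs):
            return self.func(*args, **kwargs)

    def function_wrapper(func):
        @functools.wraps(func)
        def inner(*args, **kwargs):
            return func(*args, **kwargs)
        return inner

    @class_wrapper
    def f():
        """f docstring"""

    @function_wrapper
    def g():
        """g docstring"""

    print(type(f).__doc__)  # 'Wrapper class docstring.' -- what help(f) documents
    print(g.__doc__)        # 'g docstring' -- what help(g) shows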

I'm not aware of such a thing in the stdlib, but we can create our own if we need to.

Something like this can work:

    from functools import WRAPPER_ASSIGNMENTS

    def class_wraps(cls):
        """Update a wrapper class `cls` to look like the wrapped."""

        class Wrapper(cls):
            """New wrapper that will extend the wrapper `cls` to make it look like `wrapped`.

            wrapped: Original function or class that is being decorated.
            assigned: A list of attributes to assign to the wrapper, by default they are:
                      ['__doc__', '__name__', '__module__', '__annotations__'].
            """
            def __init__(self, wrapped, assigned=WRAPPER_ASSIGNMENTS):
                self.__wrapped = wrapped
                for attr in assigned:
                    setattr(self, attr, getattr(wrapped, attr))
                super().__init__(wrapped)

            def __repr__(self):
                return repr(self.__wrapped)

        return Wrapper

Usage:

    import functools

    @class_wraps
    class memoized:
        """Decorator that caches a function's return value each time it is called.
        If called later with the same arguments, the cached value is returned,
        and not re-evaluated.
        """
        def __init__(self, func):
            super().__init__()
            self.func = func
            self.cache = {}

        def __call__(self, *args):
            try:
                return self.cache[args]
            except KeyError:
                value = self.func(*args)
                self.cache[args] = value
                return value
            except TypeError:
                # uncacheable -- for instance, passing a list as an argument.
                # Better to not cache than to blow up entirely.
                return self.func(*args)

        def __get__(self, obj, objtype):
            return functools.partial(self.__call__, obj)

    @memoized
    def fibonacci(n):
        """fibonacci docstring"""
        if n in (0, 1):
            return n
        return fibonacci(n-1) + fibonacci(n-2)

    print(fibonacci)
    print("__doc__: ", fibonacci.__doc__)
    print("__name__: ", fibonacci.__name__)

Output:

    <function fibonacci at 0x14627c0>
    __doc__: fibonacci docstring
    __name__: fibonacci

EDIT:

And if you are wondering why this wasn't included in the stdlib, it's because you can wrap your class decorator in a function decorator and use functools.wraps, like this:

    def wrapper(f):
        memoize = memoized(f)
        @functools.wraps(f)
        def helper(*args, **kws):
            return memoize(*args, **kws)
        return helper

    @wrapper
    def fibonacci(n):
        """fibonacci docstring"""
        if n <= 1:
            return n
        return fibonacci(n-1) + fibonacci(n-2)
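As a follow-up (not part of the original answer), the same trick can be captured in a small reusable helper; the name as_function_decorator and the simplified memoized class below are mine, shown as a sketch:

    import functools

    def as_function_decorator(class_decorator):
        """Wrap a class-based decorator so the result is a plain function
        that carries the original function's metadata."""
        def decorator(func):
            instance = class_decorator(func)
            @functools.wraps(func)
            def helper(*args, **kwargs):
                return instance(*args, **kwargs)
            return helper
        return decorator

    class memoized:
        """Class-based memoizing decorator (hashable positional args only)."""
        def __init__(self, func):
            self.func = func
            self.cache = {}
        def __call__(self, *args):
            if args not in self.cache:
                self.cache[args] = self.func(*args)
            return self.cache[args]

    @as_function_decorator(memoized)
    def fibonacci(n):
        """fibonacci docstring"""
        return n if n < 2 else fibonacci(n - 1) + fibonacci(n - 2)

    print(fibonacci(10), fibonacci.__name__, fibonacci.__doc__)
    # 55 fibonacci fibonacci docstring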

8 Comments

Thanks mouad. Do you know what the purpose of the __get__ method is?
Oh, I see: it makes the decorator work with methods? It probably should be in class_wraps then?
@Neil: Yes. For more detail: stackoverflow.com/questions/5469956/… . IMO, I don't think so, because it would violate one of the principles I believe in for a function or class, which is single responsibility; in the case of class_wraps that responsibility is to update a wrapper class to look like the wrapped one, no less, no more :)
@mouad: Thanks a lot. I have a couple more questions (for you or anyone else) if you don't mind: 1. Isn't it true that we will want to override __get__ for all "callable class" decorators? 2. Why do we use functools.partial instead of returning a bound method with types.MethodType(self.__call__, obj)?
@Neil: 1. Yes, if you want to be able to decorate methods as well (not just functions), as you already said; I strongly believe it's good practice to also implement the __get__ method for a class decorator, so you don't run into weird problems later :) 2. I think it's just a question of preference (beauty is in the eye of the beholder, right?); I prefer functools.partial in cases like this one, and I mostly use types.* to test the types of an object (see the sketch below for a comparison). Hope that answers your questions :)
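For anyone following the functools.partial vs. types.MethodType discussion, here is a small self-contained sketch (mine, not from the comments): both __get__ implementations give equivalent bound-call behavior; the main visible difference is the type of the bound object.

    import functools
    import types

    class with_partial:
        def __init__(self, func):
            self.func = func
        def __call__(self, obj, *args):
            return self.func(obj, *args)
        def __get__(self, obj, objtype=None):
            return functools.partial(self.__call__, obj)

    class with_methodtype:
        def __init__(self, func):
            self.func = func
        def __call__(self, obj, *args):
            return self.func(obj, *args)
        def __get__(self, obj, objtype=None):
            return types.MethodType(self.__call__, obj)

    class C:
        @with_partial
        def double(self, x):
            return 2 * x
        @with_methodtype
        def triple(self, x):
            return 3 * x

    c = C()
    print(c.double(4), c.triple(4))        # 8 12
    print(type(c.double), type(c.triple))  # functools.partial vs. bound method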

Turns out there's a straightforward solution using functools.wraps itself:

    import functools

    def dec(cls):
        @functools.wraps(cls, updated=())
        class D(cls):
            decorated = 1
        return D

    @dec
    class C:
        """doc"""

    print(f'{C.__name__=} {C.__doc__=} {C.__wrapped__=}')
    $ python3 t.py
    C.__name__='C' C.__doc__='doc' C.__wrapped__=<class '__main__.C'>

Note that updated=() is needed to prevent an attempt to update the class's __dict__ (this is the output without updated=()):

    $ python t.py
    Traceback (most recent call last):
      File "t.py", line 26, in <module>
        class C:
      File "t.py", line 20, in dec
        class D(cls):
      File "/usr/lib/python3.8/functools.py", line 57, in update_wrapper
        getattr(wrapper, attr).update(getattr(wrapped, attr, {}))
    AttributeError: 'mappingproxy' object has no attribute 'update'
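The reason, roughly (my illustration, not part of the answer): update_wrapper's updated argument defaults to functools.WRAPPER_UPDATES, i.e. ('__dict__',), and it calls wrapper.__dict__.update(...); a class's __dict__ is a read-only mappingproxy with no update method, hence the AttributeError. Passing updated=() skips that step.

    import functools

    print(functools.WRAPPER_UPDATES)       # ('__dict__',)

    class C:
        pass

    print(type(C.__dict__))                # <class 'mappingproxy'>
    print(hasattr(C.__dict__, 'update'))   # False -- hence the AttributeError above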

3 Comments

I don't understand what this has to do with my question. How do I implement memoized as a class and provide the wrapping functionality?
@NeilG if it makes you feel better, this at least answered my question, which mapped to the title of the OP question but not the specific memoize example.
This solved my problem of searching for an equivalent of functools.wraps for classes.

I needed something that would wrap both classes and functions and wrote this:

    import functools
    import inspect

    _MISSING = object()  # sentinel for "attribute not present"

    def wrap_is_timeout(base):
        '''Adds `.is_timeout=True` attribute to objects returned by `base()`.

        When `base` is a class, it returns a subclass with the same name and adds
        a read-only property.

        Otherwise, it returns a function that sets the `.is_timeout` attribute on
        the result of the `base()` call.

        Wrappers make a best effort to be transparent.
        '''
        if inspect.isclass(base):
            class wrapped(base):
                is_timeout = property(lambda _: True)

            for k in functools.WRAPPER_ASSIGNMENTS:
                v = getattr(base, k, _MISSING)
                if v is not _MISSING:
                    try:
                        setattr(wrapped, k, v)
                    except AttributeError:
                        pass
            return wrapped

        @functools.wraps(base)
        def fun(*args, **kwargs):
            ex = base(*args, **kwargs)
            ex.is_timeout = True
            return ex
        return fun
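A hypothetical usage sketch (the names ReadTimeout and make_error are mine, not from the answer), assuming the wrap_is_timeout definition above, wrapping both an exception class and a factory function:

    # Wrapping a class: returns a subclass that copies the base's metadata.
    ReadTimeout = wrap_is_timeout(ConnectionError)
    print(ReadTimeout.__name__)              # 'ConnectionError'
    print(ReadTimeout('boom').is_timeout)    # True

    def make_error(msg):
        """Build a plain OSError."""
        return OSError(msg)

    # Wrapping a function: the wrapper tags each returned object.
    timeout_error = wrap_is_timeout(make_error)
    err = timeout_error('timed out')
    print(err.is_timeout, timeout_error.__doc__)   # True Build a plain OSError.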

1 Comment

Side note: I invite everybody to use this .is_timeout=True idiom to mark your timeout-caused errors, and to accept this API from other packages.

All we really need to do is modify the behavior of the decorator so that it is "hygienic", i.e. it is attribute-preserving.

    #!/usr/bin/python3

    def hygienic(decorator):
        def new_decorator(original):
            wrapped = decorator(original)
            wrapped.__name__ = original.__name__
            wrapped.__doc__ = original.__doc__
            wrapped.__module__ = original.__module__
            return wrapped
        return new_decorator

This is ALL you need. In general. It doesn't preserve the signature, but if you really want that you can use a library to do it. I also went ahead and rewrote the memoization code so that it works on keyword arguments as well. Also, the original had a bug: failing to convert the arguments to a hashable key meant it didn't work in 100% of cases.

Demo of the rewritten memoized decorator with @hygienic modifying its behavior. memoized is now a function that wraps the original class, though you can (like the other answer) write a wrapping class instead, or, even better, something which detects whether it's a class and, if so, wraps the __init__ method (a sketch of that variant follows the demo below).

    @hygienic
    class memoized:
        def __init__(self, func):
            self.func = func
            self.cache = {}

        def __call__(self, *args, **kw):
            try:
                key = (tuple(args), frozenset(kw.items()))
                if key not in self.cache:
                    self.cache[key] = self.func(*args, **kw)
                return self.cache[key]
            except TypeError:
                # uncacheable -- for instance, passing a list as an argument.
                # Better to not cache than to blow up entirely.
                return self.func(*args, **kw)

In action:

    @memoized
    def f(a, b=5, *args, keyword=10):
        """Intact docstring!"""
        print('f was called!')
        return {'a':a, 'b':b, 'args':args, 'keyword':10}

    x = f(0)           #OUTPUT: f was called!
    print(x)           #OUTPUT: {'a': 0, 'b': 5, 'keyword': 10, 'args': ()}
    y = f(0)           #NO OUTPUT - MEANS MEMOIZATION IS WORKING
    print(y)           #OUTPUT: {'a': 0, 'b': 5, 'keyword': 10, 'args': ()}
    print(f.__name__)  #OUTPUT: 'f'
    print(f.__doc__)   #OUTPUT: 'Intact docstring!'
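Here is a sketch (mine, following the suggestion above, not part of the original answer) of the "detect if it's a class and wrap __init__" variant. It assumes the class-based decorator takes the wrapped function as the first __init__ argument, and it keeps the decorator a class, so class attributes still work (cf. the comment below about .level).

    import functools

    def hygienic(decorator):
        """Attribute-preserving version that keeps class-based decorators as classes."""
        if isinstance(decorator, type):
            original_init = decorator.__init__

            def patched_init(self, func, *args, **kwargs):
                original_init(self, func, *args, **kwargs)
                # Copy metadata from the wrapped function onto the instance.
                self.__name__ = func.__name__
                self.__doc__ = func.__doc__
                self.__module__ = func.__module__

            decorator.__init__ = patched_init
            return decorator

        # Fall back to the function-wrapping version from above.
        def new_decorator(original):
            wrapped = decorator(original)
            wrapped.__name__ = original.__name__
            wrapped.__doc__ = original.__doc__
            wrapped.__module__ = original.__module__
            return wrapped
        return new_decorator

    @hygienic
    class memoized:
        level = 0                     # class attributes survive
        def __init__(self, func):
            self.func = func
            self.cache = {}
        def __call__(self, *args):
            if args not in self.cache:
                self.cache[args] = self.func(*args)
            return self.cache[args]

    @memoized
    def f(x):
        """Intact docstring!"""
        return x * x

    print(f(3), f.__name__, f.__doc__)   # 9 f Intact docstring!
    memoized.level += 1                  # still a class, so this works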

1 Comment

The @hygienic approach does not work for code where the wrapped decorator class has a class attribute. Mouad's solution works, though. The problem reported is AttributeError: 'function' object has no attribute 'level' when trying to do decoratorclassname.level += 1 inside __call__.

Another solution using inheritance:

    import functools
    import types

    class CallableClassDecorator:
        """Base class that extracts attributes and assigns them to self.

        By default the extracted attributes are:
        ['__doc__', '__name__', '__module__'].
        """
        def __init__(self, wrapped, assigned=functools.WRAPPER_ASSIGNMENTS):
            for attr in assigned:
                setattr(self, attr, getattr(wrapped, attr))
            super().__init__()

        def __get__(self, obj, objtype):
            return types.MethodType(self.__call__, obj)

And, usage:

    class memoized(CallableClassDecorator):
        """Decorator that caches a function's return value each time it is called.
        If called later with the same arguments, the cached value is returned,
        and not re-evaluated.
        """
        def __init__(self, function):
            super().__init__(function)
            self.function = function
            self.cache = {}

        def __call__(self, *args):
            try:
                return self.cache[args]
            except KeyError:
                value = self.function(*args)
                self.cache[args] = value
                return value
            except TypeError:
                # uncacheable -- for instance, passing a list as an argument.
                # Better to not cache than to blow up entirely.
                return self.function(*args)
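A brief usage sketch (not in the original answer), assuming the two definitions above; the Squares class is mine for illustration. It shows that both the metadata and method binding (via __get__) carry over:

    @memoized
    def fibonacci(n):
        """fibonacci docstring"""
        return n if n < 2 else fibonacci(n - 1) + fibonacci(n - 2)

    print(fibonacci(10))         # 55
    print(fibonacci.__name__)    # 'fibonacci'
    print(fibonacci.__doc__)     # 'fibonacci docstring'

    class Squares:
        @memoized
        def square(self, n):     # __get__ turns this into a bound method
            return n * n

    s = Squares()
    print(s.square(7))           # 49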

4 Comments

The reason you shouldn't use this is that, as you show, you have to call the __init__ method of the parent classes (not necessarily just super(); you should google "method resolution order" for Python).
@ninjagecko: Isn't it up to the super class to call the __init__ method of the other parent classes?
It's somewhat of an open question, as far as I know; I may be wrong though. fuhm.net/super-harmful Also, stackoverflow.com/questions/1385759/… does not seem to indicate any consensus.
@ninjagecko: Yes, I've read the first article. What I've been doing is to always call super().__init__ from every class no matter what. This way I can count on all __init__ methods being called as long as everyone I inherit from does this. Unfortunately, I've discovered that PyQt classes don't do this. I really thought that this was how co-operative inheritance had to work, but from what you're saying it sounds like I might be the only one!
