50

I would like to define some generic decorators to check arguments before calling some functions.

Something like:

@checkArguments(types = ['int', 'float'])
def myFunction(thisVarIsAnInt, thisVarIsAFloat):
    ''' Here my code '''
    pass

Side notes:

  1. Type checking is just here to show an example
  2. I'm using Python 2.7, but Python 3 would be interesting too

EDIT 2021: it's funny that type checking did not turn out to be anti-Pythonic in the long run, given type hinting and mypy.
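For comparison, a rough sketch of the same example with type hints, which a tool such as mypy checks statically rather than at runtime:

def myFunction(thisVarIsAnInt: int, thisVarIsAFloat: float) -> None:
    ''' Here my code '''
    pass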

14
  • 5
    As a note, this is generally a really bad idea - it goes against the grain of Python. Type checking is a bad thing in almost all cases. It's also worth noting that it might make more sense to use argument annotations to do this if you are in 3.x. Commented Mar 8, 2013 at 17:35
  • 6
    @Lattyware: Enforcing function arguments and return types is one of examples in the original pep for decorators Commented Mar 8, 2013 at 17:43
  • 1
    @Lattyware: it's unpythonic, but if you really want to do it, decorator and argument annotation is the best way to do it. Commented Mar 8, 2013 at 17:57
  • 4
    You guys are trolling, right? Or is it me becoming quite touchy on this philosophical point? ;) Commented Mar 8, 2013 at 18:00
  • 1
    I'm not trolling, I'm just making the point that most of the time, if you are type checking, you are doing it wrong, and would be better off doing it another way. It's really common to see people on SO type checking and producing inflexible functions that don't work as well or as efficiently thanks to type checking. Commented Mar 8, 2013 at 18:00

10 Answers

55

From the Decorators for Functions and Methods:

Python 2

def accepts(*types):
    def check_accepts(f):
        assert len(types) == f.func_code.co_argcount
        def new_f(*args, **kwds):
            for (a, t) in zip(args, types):
                assert isinstance(a, t), \
                    "arg %r does not match %s" % (a, t)
            return f(*args, **kwds)
        new_f.func_name = f.func_name
        return new_f
    return check_accepts

Python 3

In Python 3 func_code has changed to __code__ and func_name has changed to __name__.

def accepts(*types):
    def check_accepts(f):
        assert len(types) == f.__code__.co_argcount
        def new_f(*args, **kwds):
            for (a, t) in zip(args, types):
                assert isinstance(a, t), \
                    "arg %r does not match %s" % (a, t)
            return f(*args, **kwds)
        new_f.__name__ = f.__name__
        return new_f
    return check_accepts

Usage:

@accepts(int, (int, float))
def func(arg1, arg2):
    return arg1 * arg2

func(3, 2)    # -> 6
func('3', 2)  # -> AssertionError: arg '3' does not match <type 'int'>

arg2 can be either int or float
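As a side note, here is a sketch of the same decorator using functools.wraps, which copies __name__, __doc__, etc. automatically and should work on both Python 2.6+ and 3 (where __code__ is available as well):

import functools

def accepts(*types):
    def check_accepts(f):
        # __code__ exists on Python 2.6+ and 3
        assert len(types) == f.__code__.co_argcount
        @functools.wraps(f)  # copies __name__, __doc__, __module__, ...
        def new_f(*args, **kwds):
            for (a, t) in zip(args, types):
                assert isinstance(a, t), "arg %r does not match %s" % (a, t)
            return f(*args, **kwds)
        return new_f
    return check_accepts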


10 Comments

@AsTeR: Create a minimal complete code example that reproduces your problem and post it as a new question.
I recommend using this solution; it has good readability when there are many input params. code.activestate.com/recipes/…
@IAbstract In Python 3, func_code (as I know) has been replaced with the magic attribute __code__ instead.
@user4447514 func_name has also been replaced by __name__
Doesn't this solution break if I call func(3, arg2=2)? Then, 3 is in *args and 2 is in **kwargs because the latter was provided as keyword argument.
19

On Python 3.3+, you can use function annotations and inspect:

import inspect

def validate(f):
    def wrapper(*args):
        fname = f.__name__
        fsig = inspect.signature(f)
        vars = ', '.join('{}={}'.format(*pair) for pair in zip(fsig.parameters, args))
        params = {k: v for k, v in zip(fsig.parameters, args)}
        print('wrapped call to {}({})'.format(fname, params))
        for k, v in fsig.parameters.items():
            p = params[k]
            msg = 'call to {}({}): {} failed {})'.format(fname, vars, k, v.annotation.__name__)
            assert v.annotation(params[k]), msg
        ret = f(*args)
        print(' returning {} with annotation: "{}"'.format(ret, fsig.return_annotation))
        return ret
    return wrapper

@validate
def xXy(x: lambda _x: 10 < _x < 100, y: lambda _y: isinstance(_y, float)) -> ('x times y', 'in X and Y units'):
    return x * y

xy = xXy(10, 3)
print(xy)

If there is a validation error, prints:

AssertionError: call to xXy(x=12, y=3): y failed <lambda>) 

If there is not a validation error, prints:

wrapped call to xXy({'y': 3.0, 'x': 12})
 returning 36.0 with annotation: "('x times y', 'in X and Y units')"

You can use a function rather than a lambda to get a name in the assertion failure.
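For example, a small sketch with named predicates instead of lambdas (the names are made up for illustration):

def between_10_and_100(x):
    return 10 < x < 100

def is_float(y):
    return isinstance(y, float)

@validate
def xXy(x: between_10_and_100, y: is_float) -> ('x times y', 'in X and Y units'):
    return x * y

# A failing call now reports the predicate's name, e.g.:
# AssertionError: call to xXy(x=5, y=3.0): x failed between_10_and_100)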

3 Comments

Looks interesting but really tough to understand at first glance. I'll take a look when I'm less tired.
This is an incredibly obfuscatory implementation. Technically, it works. But it makes the eyes bleed. For a far more readable (albeit slightly less powerful) alternative, see sweeneyrod's concise @checkargs decorator under a similar question.
@CecilCurry: Can you elaborate on why you think it is so bad? I think the checking via lambda is very sensible.
12

As you certainly know, it's not pythonic to reject an argument based only on its type.
The Pythonic approach is rather "try to deal with it first".
That's why I would rather write a decorator to convert the arguments:

def enforce(*types):
    def decorator(f):
        def new_f(*args, **kwds):
            # we need to convert args into something mutable
            newargs = []
            for (a, t) in zip(args, types):
                newargs.append(t(a))  # feel free to have a more elaborate conversion
            return f(*newargs, **kwds)
        return new_f
    return decorator

This way, your function is fed the type you expect. But if the parameter can quack like a float, it is accepted.

@enforce(int, float)
def func(arg1, arg2):
    return arg1 * arg2

print(func(3, 2))        # -> 6.0
print(func('3', 2))      # -> 6.0
print(func('three', 2))  # -> ValueError: invalid literal for int() with base 10: 'three'

I use this trick (with a proper conversion method) to deal with vectors.
Many methods I write expect the MyVector class, as it has plenty of functionality; but sometimes you just want to write

transpose((2, 4))
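A sketch of what that might look like; MyVector here is only a minimal stand-in for the real class (the part that matters is a constructor that accepts a tuple):

class MyVector:
    """Minimal stand-in: accepts an (x, y) tuple or another MyVector."""
    def __init__(self, data):
        self.coords = tuple(data.coords if isinstance(data, MyVector) else data)

@enforce(MyVector)
def transpose(v):
    # v is guaranteed to be a MyVector here; the body is just illustrative
    return MyVector(reversed(v.coords))

print(transpose((2, 4)).coords)            # -> (4, 2)
print(transpose(MyVector((2, 4))).coords)  # -> (4, 2)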

3 Comments

"As you certainly know, it's not pythonic to reject an argument only based on its type.". Do you have reference for that?
I believe he's referencing "duck typing": if it quacks like a duck and walks like a duck, then it's a duck... but then I would argue with primitive types, such as float and Decimal, e.g. Decimal(1.3) is not the same as Decimal('1.3')
Doesn't this solution break if I call func(3, arg2=2)? Then, 3 is in *args and 2 is in **kwargs because the latter was provided as keyword argument.
9

The typeguard package provides a decorator for this; it reads the type information from type annotations. It requires Python >= 3.5.2, though. I think the resulting code is quite nice.

import typeguard

@typeguard.typechecked
def my_function(this_var_is_an_int: int, this_var_is_a_float: float):
    ''' Here my code '''
    pass
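For illustration, a couple of hypothetical calls; the exact exception raised on a mismatch depends on the typeguard version (TypeError in 2.x, typeguard.TypeCheckError in newer releases):

my_function(1, 2.0)    # passes the runtime check
my_function(1, "2.0")  # fails: the second argument is not a float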

Comments

2

I think the Python 3.5+ answer to this question is beartype. As explained in this post, it comes with handy features. Your code would then look like this:

from beartype import beartype

@beartype
def sprint(s: str) -> None:
    print(s)

and results in

>>> sprint("s")
s
>>> sprint(3)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<string>", line 13, in func_beartyped
TypeError: sprint() parameter s=3 not of <class 'str'>

Comments

1

To enforce string arguments to a parser that would throw cryptic errors when provided with non-string input, I wrote the following, which tries to avoid allocation and function calls:

from functools import wraps

def argtype(**decls):
    """Decorator to check argument types.

    Usage:

    @argtype(name=str, text=str)
    def parse_rule(name, text): ...
    """
    def decorator(func):
        code = func.func_code
        fname = func.func_name
        names = code.co_varnames[:code.co_argcount]
        @wraps(func)
        def decorated(*args, **kwargs):
            for argname, argtype in decls.iteritems():
                try:
                    argval = args[names.index(argname)]
                except ValueError:
                    argval = kwargs.get(argname)
                if argval is None:
                    raise TypeError("%s(...): arg '%s' is null" % (fname, argname))
                if not isinstance(argval, argtype):
                    raise TypeError("%s(...): arg '%s': type is %s, must be %s"
                                    % (fname, argname, type(argval), argtype))
            return func(*args, **kwargs)
        return decorated
    return decorator

1 Comment

I ended up using this one: relatively simple, uses only the standard library, and it works with a variable number of *args and **kwargs. The only caveat is that func_code was renamed to __code__ in Python 3; I don't know if there's a cross-version way to do this.
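One cross-version option (a sketch, assuming Python 2.6+, where the dunder aliases already exist) is to prefer the new attribute names and fall back to the old ones:

# inside decorator(), instead of func.func_code / func.func_name:
code = getattr(func, '__code__', None) or func.func_code
fname = getattr(func, '__name__', None) or func.func_name
# on Python 3, decls.iteritems() also needs to become decls.items()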
1

I have a slightly improved version of @jbouwmans' solution, using the Python decorator module, which makes the decorator fully transparent and keeps not only the signature but also the docstring in place. It might be the most elegant way of using decorators.

from decorator import decorator

def check_args(**decls):
    """Decorator to check argument types.

    Usage:

    @check_args(name=str, text=str)
    def parse_rule(name, text): ...
    """
    @decorator
    def wrapper(func, *args, **kwargs):
        code = func.func_code
        fname = func.func_name
        names = code.co_varnames[:code.co_argcount]
        for argname, argtype in decls.iteritems():
            try:
                argval = args[names.index(argname)]
            except IndexError:
                argval = kwargs.get(argname)
            if argval is None:
                raise TypeError("%s(...): arg '%s' is null" % (fname, argname))
            if not isinstance(argval, argtype):
                raise TypeError("%s(...): arg '%s': type is %s, must be %s"
                                % (fname, argname, type(argval), argtype))
        return func(*args, **kwargs)
    return wrapper

Comments

1

All of these posts seem out of date: pint now provides this functionality built in. See here. Copied here for posterity:

Checking dimensionality

When you want pint quantities to be used as inputs to your functions, pint provides a wrapper to ensure units are of the correct type - or more precisely, that they match the expected dimensionality of the physical quantity.

Similar to wraps(), you can pass None to skip checking of some parameters, but the return parameter type is not checked.

>>> mypp = ureg.check('[length]')(pendulum_period) 

In the decorator format:

>>> @ureg.check('[length]')
... def pendulum_period(length):
...     return 2*math.pi*math.sqrt(length/G)
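To try the excerpt above, here is a minimal setup sketch; the registry, G, and the unit values are illustrative rather than part of the original snippet, and ** 0.5 is used so the square root works on a dimensionful Quantity:

import math
import pint

ureg = pint.UnitRegistry()
G = 9.80665 * ureg.meter / ureg.second ** 2  # illustrative value for standard gravity

@ureg.check('[length]')
def pendulum_period(length):
    # ** 0.5 instead of math.sqrt so the root works on a Quantity with units
    return 2 * math.pi * (length / G) ** 0.5

print(pendulum_period(1 * ureg.meter))  # ~2.006 second
# pendulum_period(1 * ureg.second)      # raises pint.DimensionalityError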

Comments

1

You could try the pydantic validation decorator. From the pydantic documentation:

Data validation and settings management using python type annotations. pydantic enforces type hints at runtime, and provides user friendly errors when data is invalid. In benchmarks pydantic is faster than all other tested libraries.

from pydantic import validate_arguments, ValidationError

@validate_arguments
def repeat(s: str, count: int, *, separator: bytes = b'') -> bytes:
    b = s.encode()
    return separator.join(b for _ in range(count))

a = repeat('hello', 3)
print(a)
#> b'hellohellohello'

b = repeat('x', '4', separator=' ')
print(b)
#> b'x x x x'

try:
    c = repeat('hello', 'wrong')
except ValidationError as exc:
    print(exc)
    """
    1 validation error for Repeat
    count
      value is not a valid integer (type=type_error.integer)
    """

Comments

1

For me, the code shared above looks complicated. Here is what I did to define a 'generic decorator' for type checking:

I used the *args, **kwargs feature; it's a little extra work when calling the function/method, but easy to manage.

An example type definition for testing:

argument_types = {
    'name': str,
    'count': int,
    'value': float
}

Decorator definition

from functools import wraps

def azure_type(func):
    @wraps(func)
    def type_decorator(*args, **kwargs):
        for key, value in kwargs.items():
            if key in argument_types:
                if type(value) != argument_types[key]:
                    # handle the mismatch here
                    return 'Error Message or whatever you like to do'
        return func(*args, **kwargs)
    return type_decorator

A simple sample in code

# all other definitions

@azure_type
def stt(name: str, value: float) -> int:
    # some calculation and creation of int output
    count_output = ...  # something int
    return count_output

# call the function:
stt(name='ati', value=32.90)  # can test from that

Comments
