A Layer characterized by iteratively given functions.
Inherits From: Layer
    tfp.experimental.nn.Sequential(
        layers, also_track=None, validate_args=False, name=None
    )
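For context, here is a minimal usage sketch. The layer choices, the variable names, and the way also_track is used are illustrative assumptions, not an example taken from the library itself:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfn = tfp.experimental.nn

# Hypothetical setup: the `layers` argument is a sequence of callables that
# are applied to the input one after another.
scale = tf.Variable(2., name='scale')

model = tfn.Sequential(
    layers=[
        lambda x: tf.cast(x, tf.float32) / 255.,  # Preprocess.
        lambda x: scale * x,                      # Closes over an outside variable.
        tf.math.softplus,                         # Any callable can act as a "layer".
    ],
    also_track=scale,  # Assumption: extra variables/modules the layer should track.
    name='demo_model')

y = model(tf.zeros([4, 3]))  # Each function is applied in turn; output shape [4, 3].
```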
Attributes

| Attribute | Description |
|---|---|
| also_track |  |
| layers |  |
| name | Returns the name of this module as passed or determined in the ctor. |
| name_scope | Returns a tf.name_scope instance for this class. |
| non_trainable_variables | Sequence of non-trainable variables owned by this module and its submodules. |
| submodules | Sequence of all sub-modules. Submodules are modules which are properties of this module, or found as properties of modules which are properties of this module (and so on). |
| trainable_variables | Sequence of trainable variables owned by this module and its submodules. |
| validate_args | Python bool indicating possibly expensive checks are enabled. |
| variables | Sequence of variables owned by this module and its submodules. |
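Because Sequential is a tf.Module, the attributes above behave as they do for any module. Continuing the hypothetical model from the sketch above:

```python
print(model.name)                      # 'demo_model'
print(len(model.trainable_variables))  # Assumed to include variables passed via also_track.
for sub in model.submodules:           # Nested modules found among the layers, if any.
    print(type(sub).__name__)
```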
Methods
load

    load(filename)

save

    save(filename)

set_trace

    set_trace(trace)

summary

    summary()
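A hedged sketch of the persistence helpers above, assuming save writes the layer's variable values under filename and load restores them into an identically structured layer (the on-disk format is not specified here):

```python
# Hypothetical paths; assumes `model` has already been called at least once
# so that all of its variables exist before saving.
model.save('/tmp/demo_model_weights')

# Later, on a freshly constructed model with the same layers:
model.load('/tmp/demo_model_weights')

# Assumed behavior: prints a short textual description of the layer.
model.summary()
```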
with_name_scope

    @classmethod
    with_name_scope(method)
Decorator to automatically enter the module name scope.
    class MyModule(tf.Module):
      @tf.Module.with_name_scope
      def __call__(self, x):
        if not hasattr(self, 'w'):
          self.w = tf.Variable(tf.random.normal([x.shape[1], 3]))
        return tf.matmul(x, self.w)
Using the above module would produce tf.Variables and tf.Tensors whose names included the module name:
    mod = MyModule()
    mod(tf.ones([1, 2]))
    <tf.Tensor: shape=(1, 3), dtype=float32, numpy=..., dtype=float32)>
    mod.w
    <tf.Variable 'my_module/Variable:0' shape=(2, 3) dtype=float32, numpy=..., dtype=float32)>
| Args | |
|---|---|
| method | The method to wrap. |
| Returns | |
|---|---|
| The original method wrapped such that it enters the module's name scope. |
__call__
    __call__(inputs, **kwargs)

Call self as a function.
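Calling a Sequential folds the input through layers in order. The following is a rough functional equivalent of that behavior, offered as a sketch of the semantics rather than the library's actual implementation:

```python
import functools

def sequential_call(layers, inputs):
    # Apply each layer/callable to the previous output, left to right.
    return functools.reduce(lambda x, layer: layer(x), layers, inputs)
```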
__getitem__
    __getitem__(i)
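Indexing is assumed to return the i-th entry of layers, so individual stages can be inspected or applied on their own. Continuing the hypothetical model above:

```python
first = model[0]        # Assumed: the first callable passed in `layers`.
first(tf.ones([4, 3]))  # Can be invoked independently of the other layers.
```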