chainer.links.MLPConvolution2D

class chainer.links.MLPConvolution2D(self, in_channels, out_channels, ksize=None, stride=1, pad=0, activation=relu.relu, conv_init=None, bias_init=None)
Two-dimensional MLP convolution layer of Network in Network.
This is an "mlpconv" layer from the Network in Network paper. This layer is a two-dimensional convolution layer followed by 1x1 convolution layers and interleaved activation functions.
Note that the activation function is not applied to the output of the last 1x1 convolution layer.
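The 1x1 convolution layers are what make this an "MLP" convolution: each one acts as a fully connected layer applied independently at every spatial position. A minimal NumPy sketch of this equivalence (illustrative only, not Chainer code):

```python
import numpy as np

# A 1x1 convolution with C_in input and C_out output channels is a
# per-pixel matrix multiply: every spatial location is transformed by
# the same (C_out, C_in) weight matrix.
rng = np.random.default_rng(0)
c_in, c_out, h, w = 3, 5, 4, 4
x = rng.standard_normal((c_in, h, w))
weight = rng.standard_normal((c_out, c_in))  # 1x1 kernel, squeezed

# "Convolution" form: contract over the channel axis at each pixel.
conv_out = np.einsum('oi,ihw->ohw', weight, x)

# Fully connected form: flatten pixels, apply the matrix, restore shape.
fc_out = (weight @ x.reshape(c_in, h * w)).reshape(c_out, h, w)

assert np.allclose(conv_out, fc_out)
```

Stacking such layers with activations in between gives the per-pixel multilayer perceptron described in the paper.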
- Parameters
in_channels (int or None) – Number of channels of input arrays. If it is None or omitted, parameter initialization is deferred until the first forward pass, at which time the size is determined.
out_channels (tuple of ints) – Tuple of numbers of channels. The i-th integer indicates the number of filters of the i-th convolution.
ksize (int or pair of ints) – Size of filters (a.k.a. kernels) of the first convolution layer. ksize=k and ksize=(k, k) are equivalent.
stride (int or pair of ints) – Stride of filter applications at the first convolution layer. stride=s and stride=(s, s) are equivalent.
pad (int or pair of ints) – Spatial padding width for input arrays at the first convolution layer. pad=p and pad=(p, p) are equivalent.
activation (callable) – Activation function for internal hidden units. You can specify one of the built-in activation functions or your own function. It should not be an activation function with parameters (i.e., a Link instance). The function must accept one argument (the output of each child link) and return a Variable derived from the input Variable, so that backpropagation can be performed on it. Note that this function is not applied to the output of this link.
conv_init – An initializer of weight matrices passed to the convolution layers. This option must be specified as a keyword argument.
bias_init – An initializer of bias vectors passed to the convolution layers. This option must be specified as a keyword argument.
See: Network in Network.
- Variables
activation (callable) – Activation function. See the description in the arguments for details.
Methods
add_link(link)
Registers a child link and adds it to the tail of the list.
- Parameters
link (Link) – The link object to be registered.
add_param(name, shape=None, dtype=<class 'numpy.float32'>, initializer=None)
Registers a parameter to the link.
- Parameters
name (str) – Name of the parameter. This name is also used as the attribute name.
shape (int or tuple of ints) – Shape of the parameter array. If it is omitted, the parameter variable is left uninitialized.
dtype – Data type of the parameter array.
initializer – If it is not None, the data is initialized with the given initializer. If it is an array, the data is directly initialized by it. If it is callable, it is used as a weight initializer. Note that in these cases, the dtype argument is ignored.
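The initializer dispatch described above (None vs. array vs. callable) can be sketched in plain Python with NumPy; this is an illustrative analogue, not Chainer's actual implementation:

```python
import numpy as np

def init_param(shape, dtype=np.float32, initializer=None):
    """Illustrative analogue of the dispatch described above."""
    if initializer is None:
        # No initializer: allocate an uninitialized array with the dtype.
        return np.empty(shape, dtype=dtype)
    if callable(initializer):
        # Callable: used as a weight initializer that fills an array.
        data = np.empty(shape)
        initializer(data)
        return data
    # Array-like: the data is taken from it directly (dtype is ignored).
    return np.asarray(initializer)

w = init_param((2, 3), initializer=lambda a: a.fill(0.5))
assert np.all(w == 0.5)
```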
add_persistent(name, value)
Registers a persistent value to the link.
The registered value is saved and loaded on serialization and deserialization. The value is set to an attribute of the link.
- Parameters
name (str) – Name of the persistent value. This name is also used for the attribute name.
value – Value to be registered.
addgrads(link)
Accumulates gradient values from the given link.
This method adds each gradient array of the given link to the corresponding gradient array of this link. The accumulation is performed even across hosts and different devices.
- Parameters
link (Link) – Source link object.
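The accumulation amounts to an elementwise addition of corresponding gradient arrays; a NumPy sketch of the idea (not Chainer's implementation):

```python
import numpy as np

# Gradients of the same parameters computed on two different
# workers/devices, keyed by parameter name.
grads_self = {"W": np.array([0.1, 0.2]), "b": np.array([0.3])}
grads_other = {"W": np.array([0.4, 0.1]), "b": np.array([-0.3])}

# addgrads-style accumulation: add each gradient array of the other
# link into the corresponding gradient array of this link.
for name, g in grads_other.items():
    grads_self[name] += g

assert np.allclose(grads_self["W"], [0.5, 0.3])
```

This additive form is what makes data-parallel training work: per-device gradients can be summed before a single parameter update.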
append(value)
Append value to the end of the sequence.
children()
Returns a generator of all child links.
- Returns
A generator object that generates all child links.
clear() → None
Remove all items from the sequence.
cleargrads()
Clears all gradient arrays.
This method should be called before the backward computation at every iteration of the optimization.
copyparams(link, copy_persistent=True)
Copies all parameters from the given link.
This method copies the data arrays of all parameters in the hierarchy. The copy is performed even across hosts and devices. Note that this method does not copy the gradient arrays.
From v5.0.0, this method also copies persistent values (e.g. the moving statistics of BatchNormalization). If a persistent value is an ndarray, its elements are copied. Otherwise, it is copied using copy.deepcopy(). The old behavior (not copying persistent values) can be reproduced with copy_persistent=False.
count(value) → integer
Return the number of occurrences of value.
count_params()
Counts the total number of parameters.
This method counts the total number of scalar values included in all the Parameters held by this link and its descendants. If the link contains uninitialized parameters, this method raises a warning.
- Returns
The total size of parameters (int)
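Concretely, the count is the sum of the element counts of every parameter array. A small sketch with plain NumPy arrays standing in for parameters (the shapes below are hypothetical, for illustration only):

```python
import numpy as np

# Stand-ins for the parameter arrays of an mlpconv-style stack.
params = [
    np.zeros((96, 3, 11, 11)),  # first conv: 96 filters, 3 channels, 11x11
    np.zeros((96,)),            # its bias
    np.zeros((96, 96, 1, 1)),   # a 1x1 conv: 96 in, 96 out
    np.zeros((96,)),            # its bias
]

# count_params-style total: sum of scalar elements across all arrays.
total = sum(p.size for p in params)
assert total == 44256
```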
delete_hook(name)
Unregisters the link hook.
- Parameters
name (str) – The name of the link hook to be unregistered.
disable_update()
Disables update rules of all parameters under the link hierarchy.
This method sets the enabled flag of the update rule of each parameter variable to False.
enable_update()
Enables update rules of all parameters under the link hierarchy.
This method sets the enabled flag of the update rule of each parameter variable to True.
extend(values)
Extend the sequence by appending elements from the iterable.
from_chainerx()
Converts parameter variables and persistent values from ChainerX to NumPy/CuPy devices without any copy.
index(value[, start[, stop]]) → integer
Return the first index of value. Raises ValueError if the value is not present.
init_scope()
Creates an initialization scope.
This method returns a context manager object that enables registration of parameters (and links for Chain) by assignment. A Parameter object can be automatically registered by assigning it to an attribute under this context manager.

Example

In most cases, parameter registration is done in the initializer method. Using the init_scope method, we can simply assign a Parameter object to register it to the link.

class MyLink(chainer.Link):
    def __init__(self):
        super().__init__()
        with self.init_scope():
            self.W = chainer.Parameter(0, (10, 5))
            self.b = chainer.Parameter(0, (5,))
links(skipself=False)
Returns a generator of all links under the hierarchy.
- Parameters
skipself (bool) – If True, the generator skips this link and starts with the first child link.
- Returns
A generator object that generates all links.
namedlinks(skipself=False)
Returns a generator of all (path, link) pairs under the hierarchy.
- Parameters
skipself (bool) – If True, the generator skips this link and starts with the first child link.
- Returns
A generator object that generates all (path, link) pairs.
namedparams(include_uninit=True)
Returns a generator of all (path, param) pairs under the hierarchy.
- Parameters
include_uninit (bool) – If True, it also generates uninitialized parameters.
- Returns
A generator object that generates all (path, parameter) pairs. The paths are relative to this link.
params(include_uninit=True)
Returns a generator of all parameters under the link hierarchy.
- Parameters
include_uninit (bool) – If True, it also generates uninitialized parameters.
- Returns
A generator object that generates all parameters.
pop([index]) → item
Remove and return the item at index (default last). Raises IndexError if the list is empty or the index is out of range.
register_persistent(name)
Registers an attribute of a given name as a persistent value.
This is a convenient method to register an existing attribute as a persistent value. If name has already been registered as a parameter, this method removes it from the list of parameter names and re-registers it as a persistent value.
- Parameters
name (str) – Name of the attribute to be registered.
remove(value)
Remove the first occurrence of value. Raises ValueError if the value is not present.
repeat(n_repeat, mode='init')
Repeats this link multiple times to make a Sequential.
This method returns a Sequential object which contains the same Link multiple times. The mode argument controls how this link is copied for each repetition.

Example

You can repeat the same link multiple times to create a longer Sequential block like this:

class ConvBNReLU(chainer.Chain):
    def __init__(self):
        super(ConvBNReLU, self).__init__()
        with self.init_scope():
            self.conv = L.Convolution2D(
                None, 64, 3, 1, 1, nobias=True)
            self.bn = L.BatchNormalization(64)

    def forward(self, x):
        return F.relu(self.bn(self.conv(x)))

net = ConvBNReLU().repeat(16, mode='init')

The net object contains 16 blocks, each of which is a ConvBNReLU. Since the mode was init, each block is re-initialized with different parameters. If you give copy instead, each block has the same values for its parameters, but its object ID differs from the others. If it is share, the blocks are identical to each other in terms of not only parameters but also object IDs, because they are shallow-copied; when a parameter of one block is changed, the corresponding parameters of all the others change as well.
- Parameters
n_repeat (int) – Number of times to repeat.
mode (str) – It should be either init, copy, or share. init means the parameters of each repeated element in the returned Sequential are re-initialized, so that all elements have different initial parameters. copy means that the parameters are not re-initialized but the object itself is deep-copied, so that all elements have the same initial parameters but can be changed independently. share means all the elements of the resulting Sequential are the same object, because they are shallow-copied, so that all parameters of the elements are shared with each other.
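The difference between copy and share comes down to deep versus shallow copying; a plain-Python sketch of the two semantics (an analogy, not Chainer code):

```python
import copy

# A stand-in for a link holding a mutable parameter array.
block = {"W": [1.0, 2.0, 3.0]}

# mode='copy' analogue: deep copies start with the same values
# but are independent afterwards.
deep = [copy.deepcopy(block) for _ in range(3)]
deep[0]["W"][0] = 99.0
assert deep[1]["W"][0] == 1.0  # other copies are unaffected

# mode='share' analogue: shallow copies alias the same parameter
# object, so a change in one is visible in all.
shallow = [copy.copy(block) for _ in range(3)]
shallow[0]["W"][0] = 99.0
assert shallow[1]["W"][0] == 99.0  # change is visible everywhere
```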
reverse()
Reverse the sequence in place.
serialize(serializer)
Serializes the link object.
- Parameters
serializer (AbstractSerializer) – Serializer object.
to_chainerx()
Converts parameter variables and persistent values to ChainerX without any copy.
This method does not handle non-registered attributes. If some of such attributes must be copied to ChainerX, the link implementation must override this method to do so.
Returns: self
to_cpu()
Copies parameter variables and persistent values to CPU.
This method does not handle non-registered attributes. If some of such attributes must be copied to CPU, the link implementation must override Link.to_device() to do so.
Returns: self
to_device(device)
Copies parameter variables and persistent values to the specified device.
This method does not handle non-registered attributes. If some of such attributes must be copied to the device, the link implementation must override this method to do so.
- Parameters
device – Target device specifier. See get_device() for available values.
Returns: self
to_gpu(device=None)
Copies parameter variables and persistent values to GPU.
This method does not handle non-registered attributes. If some of such attributes must be copied to GPU, the link implementation must override Link.to_device() to do so.
- Parameters
device – Target device specifier. If omitted, the current device is used.
Returns: self
zerograds()
Initializes all gradient arrays by zero.
Deprecated since version v1.15: Use the more efficient cleargrads() instead.
Attributes
device
local_link_hooks
Ordered dictionary of registered link hooks.
Unlike chainer.thread_local.link_hooks, which registers its elements to all functions, link hooks in this property are specific to this link.
update_enabled
True if at least one parameter has an update rule enabled.
within_init_scope
True if the current code is inside an initialization scope.
See init_scope() for the details of the initialization scope.