bn – Batch Normalization

theano.tensor.nnet.bn.batch_normalization(inputs, gamma, beta, mean, std, mode='low_mem')

This function builds the symbolic graph for applying batch normalization to a set of activations. It also works on the GPU.
New in version 0.7.1.
Parameters: - inputs (symbolic tensor) – Mini-batch of activations
- gamma (symbolic tensor) – BN scale parameter, must be of same dimensionality as inputs and broadcastable against it
- beta (symbolic tensor) – BN shift parameter, must be of same dimensionality as inputs and broadcastable against it
- mean (symbolic tensor) – means of the inputs, must be of same dimensionality as inputs and broadcastable against it
- std (symbolic tensor) – standard deviations of the inputs, must be of same dimensionality as inputs and broadcastable against it
- mode (‘low_mem’ or ‘high_mem’) – Specifies which batch_normalization implementation will be used. Because it stores no intermediate representations for back-propagation, the ‘low_mem’ implementation lowers memory usage; however, it is 5-10% slower than the ‘high_mem’ implementation. Note that this 5-10% difference concerns the batch_normalization operation alone; the difference between implementations is likely to matter less over the full model’s fprop/bprop.
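The operation applies the usual batch-normalization transform, roughly (inputs - mean) * gamma / std + beta, elementwise. The sketch below shows one way to call the function on a 2D mini-batch, normalizing per feature; the shapes, the per-feature statistics, and the epsilon added to std are illustrative assumptions of this sketch, not requirements of the API.

```python
import numpy
import theano
import theano.tensor as T
from theano.tensor.nnet.bn import batch_normalization

x = T.matrix('x')          # mini-batch of activations, shape (batch, features)
gamma = T.vector('gamma')  # BN scale, one value per feature
beta = T.vector('beta')    # BN shift, one value per feature

# Per-feature mini-batch statistics, kept broadcastable against `x`
# by preserving the batch axis. A small epsilon is added to the
# standard deviation for numerical stability (an assumption of this
# sketch; the function itself does not add one).
mean = x.mean(axis=0, keepdims=True)
std = x.std(axis=0, keepdims=True) + 1e-6

# dimshuffle('x', 0) turns the per-feature vectors into (1, features)
# row vectors so they are broadcastable against `x`.
out = batch_normalization(inputs=x,
                          gamma=gamma.dimshuffle('x', 0),
                          beta=beta.dimshuffle('x', 0),
                          mean=mean,
                          std=std,
                          mode='low_mem')

f = theano.function([x, gamma, beta], out)

batch = numpy.random.randn(8, 4).astype(theano.config.floatX)
print(f(batch,
        numpy.ones(4, dtype=theano.config.floatX),
        numpy.zeros(4, dtype=theano.config.floatX)))
```

With gamma set to ones and beta to zeros as above, the output is simply the standardized mini-batch; in a trained network, gamma and beta would be learned shared variables instead of inputs.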