chainer.no_backprop_mode
chainer.no_backprop_mode()
Make a context manager which disables back-propagation.
In this context, Chainer does not make a computational graph. This has the benefit of reducing memory consumption. However, a Variable created in this context does not hold a reference to the FunctionNode that created it, so no gradients are accumulated by backward().

In the following example, y is created in this context, which means that calling backward() on y has no effect on the gradients of x.

>>> x = chainer.Variable(np.array([1,], np.float32))
>>> with chainer.no_backprop_mode():
...     y = x + 1
>>> y.backward()
>>> x.grad is None
True
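Because no graph is recorded, the Variable created inside the context also does not retain its creator. A minimal sketch illustrating this, assuming numpy is imported as np and x is defined as above:

>>> with chainer.no_backprop_mode():
...     y = x + 1
>>> y.creator is None  # no FunctionNode is retained for y
True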
See also
force_backprop_mode() for details on how to override this context.
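As a minimal sketch of overriding the context, nesting force_backprop_mode() inside no_backprop_mode() re-enables graph construction for the inner block, so backward() propagates to x again (assuming numpy is imported as np):

>>> x = chainer.Variable(np.array([1,], np.float32))
>>> with chainer.no_backprop_mode():
...     with chainer.force_backprop_mode():
...         y = x + 1
>>> y.backward()
>>> x.grad is None
False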