chainer.gradient_check.check_double_backward
chainer.gradient_check.check_double_backward(func, x_data, y_grad, x_grad_grad, params=(), params_grad_grad=(), eps=0.001, atol=0.0001, rtol=0.001, no_grads=None, dtype=None, detect_nondifferentiable=False)

Test twice differentiation of a given procedure.
This function automatically checks if the backward procedure of func is correctly implemented for further differentiation. It first computes the gradient of func w.r.t. its inputs in the same way as check_backward(). This function then further invokes the backward procedure against the gradient variables, starting from the initial gradient given by x_grad_grad. It also computes the second gradient using numerical_grad(). The resulting gradients are compared to confirm whether the second-order gradients are approximately correct.

Note that this function DOES NOT check if the first-order differentiation is correct; the numerical gradient assumes that the first-order gradient given by the usual chainer.Variable.backward() is correct. The implementation of each differentiable function should be tested by check_backward() first, and then tested by this function if necessary.

For the details of the arguments, see check_backward(). The additional arguments x_grad_grad and params_grad_grad are (tuples of) Variable(s) that hold the initial gradient corresponding to the first-order gradient of each input and parameter. Note that the default error tolerances atol and rtol are slightly larger than those of check_backward(), because the numerical gradients of the second-order differentiation are less accurate than those of the first-order gradients.
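As a rough sketch of the idea behind this check, the following NumPy-only example (independent of Chainer; the choice of sigmoid and the helper names are purely illustrative) compares an analytic second-order gradient against a central-difference approximation of the first-order backward procedure, perturbed along the initial second-order gradient ggx:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def first_grad(x, gy):
    # Analytic first-order backward (vector-Jacobian product) of sigmoid:
    # d/dx sigmoid(x) = s * (1 - s), scaled by the upstream gradient gy.
    s = sigmoid(x)
    return gy * s * (1.0 - s)

def second_grad(x, gy, ggx):
    # Analytic gradient of sum(first_grad(x, gy) * ggx) w.r.t. x:
    # d/dx [s * (1 - s)] = (1 - 2s) * s * (1 - s).
    s = sigmoid(x)
    return ggx * gy * (1.0 - 2.0 * s) * s * (1.0 - s)

rng = np.random.RandomState(0)
x = rng.uniform(-1.0, 1.0, (3, 2))    # inputs (plays the role of x_data)
gy = rng.uniform(-1.0, 1.0, (3, 2))   # initial gradient (y_grad)
ggx = rng.uniform(-1.0, 1.0, (3, 2))  # initial second-order gradient (x_grad_grad)

# Numerical directional derivative of the first-order backward along ggx,
# by central differences; this is what the second-order check compares against.
eps = 1e-3
num = (first_grad(x + eps * ggx, gy) - first_grad(x - eps * ggx, gy)) / (2.0 * eps)

# The analytic and numerical second-order gradients should agree closely.
print(np.allclose(num, second_grad(x, gy, ggx), atol=1e-4, rtol=1e-3))
```

The second-order tolerances here are looser than typical first-order ones for the same reason the docstring gives: two levels of differentiation amplify the truncation and rounding error of the finite-difference estimate.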