chainer.cross_entropy
chainer.cross_entropy(dist1, dist2)
Computes cross entropy.
For two continuous distributions \(p(x), q(x)\), it is expressed as
\[H(p,q) = - \int p(x) \log q(x) dx\]
For two discrete distributions \(p(x), q(x)\), it is expressed as
\[H(p,q) = - \sum_x p(x) \log q(x)\]
Since \(H(p,q) = H(p) + D_{KL}(p \| q)\), this function calls kl_divergence() and entropy() of dist1. Therefore, it is necessary to register the KL divergence function with the register_kl() decorator and to define entropy() in dist1 (see the example below).
- Parameters
  - dist1 (Distribution) – Distribution to calculate cross entropy \(p\). This is the first (left) operand of the cross entropy.
  - dist2 (Distribution) – Distribution to calculate cross entropy \(q\). This is the second (right) operand of the cross entropy.
- Returns
Output variable representing cross entropy \(H(p,q)\).
- Return type
Variable
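The snippet below is a minimal usage sketch, assuming chainer.distributions.Normal, which already defines entropy() and has its KL divergence registered via register_kl(), so cross_entropy can be applied to it directly. The manual recomputation from entropy() and kl_divergence() only illustrates the relationship described above.

```python
import numpy as np

import chainer
import chainer.distributions as D

# Two diagonal Normal distributions with batch shape (3,).
p = D.Normal(loc=np.zeros(3, dtype=np.float32),
             scale=np.ones(3, dtype=np.float32))
q = D.Normal(loc=np.ones(3, dtype=np.float32),
             scale=np.full(3, 2.0, dtype=np.float32))

# H(p, q), computed internally from p's entropy and KL(p || q).
ce = chainer.cross_entropy(p, q)

# The same value assembled from the two pieces mentioned above.
ce_manual = p.entropy + chainer.kl_divergence(p, q)

print(ce.array)         # per-element cross entropy, shape (3,)
print(ce_manual.array)  # matches ce up to floating-point error
```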