Deriving a New Divergence Measure from Extended Cross-Entropy Error Function
International Journal of Contents, pISSN 1738-6764, eISSN 2093-7504
2015, v.11 no.2, pp.57-62
https://doi.org/10.5392/ijoc.2015.11.2.057
Oh, Sang-Hoon (Division of Information Communication Engineering, Mokwon University)
Wakuya, Hiroshi (Graduate School of Science and Engineering, Saga University)
Park, Sun-Gyu (Division of Architecture, Mokwon University)
Noh, Hwang-Woo (Department of Visual Design, Hanbat National University)
Yoo, Jae-Soo (School of Information and Communication Engineering, Chungbuk National University)
Min, Byung-Won (Division of Information Communication Engineering, Mokwon University)
Oh, Yong-Sun (Division of Information Communication Engineering, Mokwon University)
Oh, Sang-Hoon, Wakuya, Hiroshi, Park, Sun-Gyu, Noh, Hwang-Woo, Yoo, Jae-Soo, Min, Byung-Won, & Oh, Yong-Sun. (2015). Deriving a New Divergence Measure from Extended Cross-Entropy Error Function. International Journal of Contents, 11(2), 57-62. https://doi.org/10.5392/ijoc.2015.11.2.057
Abstract
Relative entropy is a divergence measure between two probability density functions of a random variable. When the random variable takes only two values, the relative entropy reduces to the cross-entropy error function, which can accelerate the training convergence of multi-layer perceptron neural networks. Furthermore, the n-th order extension of the cross-entropy (nCE) error function improves performance in terms of both learning convergence and generalization capability. In this paper, we derive a new divergence measure between two probability density functions from the nCE error function, and we compare the new divergence measure with the relative entropy through three-dimensional plots.
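For reference, a minimal sketch of the standard definitions the abstract builds on, in our own notation (t and y denote the probabilities that the two distributions assign to one of the two symbols; they are not necessarily the paper's symbols):

\[
D(p\,\|\,q) \;=\; \sum_{x} p(x)\,\log\frac{p(x)}{q(x)},
\qquad
p=(t,\,1-t),\; q=(y,\,1-y)\;\Rightarrow\;
D(p\,\|\,q) \;=\; t\log\frac{t}{y} + (1-t)\log\frac{1-t}{1-y}.
\]

Expanding the binary case gives
\[
D(p\,\|\,q) \;=\; \underbrace{-\,t\log y \;-\; (1-t)\log(1-y)}_{\text{cross-entropy error } E_{\mathrm{CE}}(t,\,y)} \;-\; H(t),
\qquad
H(t) \;=\; -\,t\log t - (1-t)\log(1-t),
\]
so for a fixed target t, minimizing the relative entropy over the network output y is equivalent to minimizing the cross-entropy error. This is the correspondence the paper runs in the reverse direction, starting from the nCE error function to obtain a new divergence measure.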
Keywords: Cross-Entropy, The n-th Order Extension of Cross-Entropy, Divergence Measure, Information Theory, Neural Networks