
Deriving a New Divergence Measure from Extended Cross-Entropy Error Function

INTERNATIONAL JOURNAL OF CONTENTS, (P)1738-6764; (E)2093-7504
2015, v.11 no.2, pp.57-62
https://doi.org/10.5392/IJoC.2015.11.2.057

Hiroshi Wakuya

Abstract

Relative entropy is a divergence measure between two probability density functions of a random variable. When the random variable has only two alphabets, the relative entropy reduces to the cross-entropy error function, which can accelerate the training convergence of multi-layer perceptron neural networks. The n-th order extension of the cross-entropy (nCE) error function also exhibits improved performance in terms of learning convergence and generalization capability. In this paper, we derive a new divergence measure between two probability density functions from the nCE error function, and compare it with the relative entropy through three-dimensional plots.
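
As a brief illustration of the connection stated in the abstract (a sketch using illustrative notation, not taken from the paper): for a binary random variable with target distribution (t, 1-t) and network-output distribution (y, 1-y), the relative entropy differs from the binary cross-entropy error only by a term that does not depend on y, so minimizing either criterion drives the output toward the same value.

% Relative entropy (Kullback-Leibler divergence) between densities p and q:
%   D(p || q) = \sum_i p_i \log (p_i / q_i)
% Two-alphabet case with p = (t, 1-t) and q = (y, 1-y):
\begin{align}
D\bigl(p \,\|\, q\bigr)
  &= t \log\frac{t}{y} + (1-t)\log\frac{1-t}{1-y} \\
  &= \underbrace{-\,t\log y - (1-t)\log(1-y)}_{\text{cross-entropy error}}
     \;+\; \underbrace{t\log t + (1-t)\log(1-t)}_{\text{independent of } y}.
\end{align}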

keywords
Cross-Entropy, The n-th Order Extension of Cross-Entropy, Divergence Measure, Information Theory, Neural Networks
