Deriving a New Divergence Measure from Extended Cross-Entropy Error Function

INTERNATIONAL JOURNAL OF CONTENTS, (P)1738-6764; (E)2093-7504
2015, v.11 no.2, pp.57-62
https://doi.org/10.5392/IJoC.2015.11.2.057
Sang-Hoon Oh (Mokwon University)
Hiroshi Wakuya (Saga University)
Sun-Gyu Park (Mokwon University)
Hwang-Woo Noh (Hanbat National University)
Jae-Soo Yoo (Chungbuk National University)
Byung-Won Min (Mokwon University)
Yong-Sun Oh (Mokwon University)

Abstract

Relative entropy is a divergence measure between two probability density functions of a random variable. When the random variable has an alphabet of only two symbols, the relative entropy reduces to the cross-entropy error function, which can accelerate the training convergence of multi-layer perceptron neural networks. Furthermore, the n-th order extension of the cross-entropy (nCE) error function improves both learning convergence and generalization capability. In this paper, we derive a new divergence measure between two probability density functions from the nCE error function, and we compare the new measure with the relative entropy through three-dimensional plots.
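As a brief illustration of the reduction mentioned above (standard information-theoretic facts, not material reproduced from the paper body), the relative entropy between two probability mass functions and its binary special case can be written as

\[
D(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)},
\qquad
D\bigl((t, 1-t) \,\big\|\, (y, 1-y)\bigr)
= -\bigl[\, t \log y + (1-t) \log (1-y) \,\bigr] - H(t),
\]

where \(H(t) = -t \log t - (1-t) \log (1-t)\) does not depend on the network output \(y\). Minimizing the cross-entropy error \(-[\, t \log y + (1-t) \log (1-y)\,]\) with respect to \(y\) is therefore equivalent to minimizing the relative entropy between the target distribution \((t, 1-t)\) and the output distribution \((y, 1-y)\).

The three-dimensional comparison plots referred to in the abstract can be reproduced for the relative entropy itself with a few lines of NumPy/Matplotlib. The following is a minimal sketch, assuming a Bernoulli parameterization over the unit square; the closed form of the new nCE-based divergence is given in the paper body and is not reproduced here.

```python
# Minimal sketch (not the authors' code): 3-D surface of the relative entropy
# D(p || q) between two Bernoulli distributions with parameters p and q.
import numpy as np
import matplotlib.pyplot as plt

eps = 1e-3                              # keep away from the log(0) boundary
p = np.linspace(eps, 1 - eps, 200)
q = np.linspace(eps, 1 - eps, 200)
P, Q = np.meshgrid(p, q)

# Relative entropy for binary random variables
D = P * np.log(P / Q) + (1 - P) * np.log((1 - P) / (1 - Q))

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(P, Q, D, cmap="viridis")
ax.set_xlabel("p (target probability)")
ax.set_ylabel("q (output probability)")
ax.set_zlabel("D(p || q)")
plt.show()
```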

Keywords
Cross-Entropy, The n-th Order Extension of Cross-Entropy, Divergence Measure, Information Theory, Neural Networks
