
Controlling Neural Network Training via Modification of Cross-Entropy Loss Function

INTERNATIONAL JOURNAL OF CONTENTS / INTERNATIONAL JOURNAL OF CONTENTS, (P)1738-6764; (E)2093-7504
2025, v.21 no.4, pp.133-139
SANG HOON OH

Abstract

Various loss functions have been proposed to improve the training of neural networks with sigmoid activation output nodes. For neural networks with softmax activation output nodes, the cross-entropy loss function is commonly used for training, and several attempts have been made to improve the performance of such networks by modifying the standard cross-entropy loss. However, rather than simply aiming to improve overall classification accuracy, it is often necessary to treat misclassification costs differently according to their real-world importance. In practice, the cost of errors can vary greatly across domains such as finance, security, insurance, and healthcare. From this perspective, this paper proposes a modified cross-entropy loss function designed to control the training of neural networks with softmax output nodes. The effectiveness of the proposed loss function is demonstrated through simulations on the CEDAR handwritten digit recognition task, showing that classification performance can be adjusted according to the order of the modified loss function. This approach can serve as the basis for designing a novel cost-sensitive learning method tailored to neural networks with softmax outputs.
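The abstract does not give the exact form of the proposed loss, only that it is an order-controlled modification of cross-entropy for softmax outputs. As a rough illustration of the idea, the sketch below shows standard softmax cross-entropy alongside a hypothetical order-n variant that reweights each term by (1 - y)^(n-1); the function names, the specific weighting, and the parameter `n` are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over the last axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(y, t, eps=1e-12):
    # standard cross-entropy loss for softmax outputs y and one-hot targets t
    return -np.sum(t * np.log(y + eps), axis=-1)

def modified_cross_entropy(y, t, n=2, eps=1e-12):
    # HYPOTHETICAL order-n variant (illustration only): each term is
    # reweighted by (1 - y)^(n-1), so the relative emphasis on
    # low-confidence outputs changes with the order n; n = 1 recovers
    # the standard cross-entropy loss above.
    return -np.sum(t * (1.0 - y) ** (n - 1) * np.log(y + eps), axis=-1)

z = np.array([2.0, 1.0, 0.1])      # example logits
t = np.array([1.0, 0.0, 0.0])      # one-hot target
y = softmax(z)
print(cross_entropy(y, t))          # standard loss
print(modified_cross_entropy(y, t)) # hypothetical order-2 variant
```

Because the weighting factor lies in [0, 1], the order-2 variant never exceeds the standard loss here; a cost-sensitive scheme would additionally scale terms by class-dependent misclassification costs.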

keywords
Error Back-propagation, Cross-entropy Loss Function, Cost-sensitive Learning, Neural Networks, Softmax Activation
