The EBP (Error Back-Propagation) algorithm was initially proposed for training MLPs (Multi-Layer Perceptrons) and is now widely used for training deep neural networks. This supervised learning algorithm minimizes the error function between the actual outputs of an MLP and their desired values. In contrast, ICA (Independent Component Analysis) is an unsupervised learning algorithm that aims to maximize the independence among the outputs of neural networks. ICA has been shown to reproduce visual features of the V1 area of the human brain when learning from natural scenes, and cochlear features of the human ear when learning from auditory signals. In this paper, we propose merging the supervised EBP algorithm with the unsupervised ICA algorithm to enhance the performance of neural networks by learning independent features in the initial learning stage. This approach mirrors the feature-learning process observed in mammals during the early stages of life. The proposed approach is verified through simulations on isolated-word recognition tasks, achieving improved classification performance with faster learning convergence. Specifically, with 100 hidden nodes, EBP with ICA reaches a misclassification ratio of 2.78% on the test data after 160 epochs, while EBP alone reaches 3.28% after 300 epochs.
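
The following is a minimal sketch of the general idea described above: unsupervised ICA pretraining of the hidden-layer weights followed by supervised error back-propagation. It is not the authors' exact formulation; the use of scikit-learn's FastICA, the toy data, the network architecture, and all hyperparameters are illustrative assumptions.

```python
# Sketch only: ICA-initialized hidden layer + EBP fine-tuning.
# All shapes, data, and hyperparameters are assumptions for demonstration.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Toy data: non-Gaussian latent sources mixed into observed features,
# with arbitrary integer class labels for the supervised stage.
n_samples, n_features, n_hidden, n_classes = 500, 256, 100, 10
S = rng.laplace(size=(n_samples, n_hidden))     # non-Gaussian sources
A = rng.normal(size=(n_hidden, n_features))     # mixing matrix
X = S @ A                                       # observed inputs
y = rng.integers(0, n_classes, size=n_samples)
Y = np.eye(n_classes)[y]                        # one-hot targets

# --- Stage 1: unsupervised ICA pretraining of the hidden layer ---
ica = FastICA(n_components=n_hidden, random_state=0)
ica.fit(X)
W1 = ica.components_.T                          # (n_features, n_hidden)
b1 = np.zeros(n_hidden)

# Output layer starts from small random weights, as usual.
W2 = rng.normal(scale=0.01, size=(n_hidden, n_classes))
b2 = np.zeros(n_classes)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# --- Stage 2: supervised EBP fine-tuning of the whole network ---
lr = 0.05
for epoch in range(200):
    H = np.tanh(X @ W1 + b1)                    # hidden activations
    P = softmax(H @ W2 + b2)                    # class probabilities

    # Back-propagate the cross-entropy error.
    dZ2 = (P - Y) / n_samples
    dW2, db2 = H.T @ dZ2, dZ2.sum(axis=0)
    dH = dZ2 @ W2.T
    dZ1 = dH * (1.0 - H ** 2)                   # tanh derivative
    dW1, db1 = X.T @ dZ1, dZ1.sum(axis=0)

    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print("training accuracy:", (P.argmax(axis=1) == y).mean())
```

The design choice illustrated here is that the ICA unmixing matrix serves only as an initialization: after pretraining, all weights, including the ICA-derived first layer, are updated by back-propagation in the supervised stage.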
