Learning criteria for training neural network classifiers

P Zhou, J Austin

Research output: Contribution to journal › Article › peer-review

Abstract

This paper presents a study of two learning criteria and two approaches to using them for training neural network classifiers, specifically Multi-Layer Perceptron (MLP) and Radial Basis Function (RBF) networks. The first, traditional approach relies on two popular learning criteria: learning via minimising a Mean Squared Error (MSE) function or a Cross Entropy (CE) function. It is shown that the two criteria have different characteristics in learning speed and sensitivity to outliers, and that this approach does not necessarily result in a minimal classification error. To make the method better suited to classification tasks, our second approach introduces an empirical classification criterion for the testing process while using the MSE or CE function for training. Experimental results on several benchmarks indicate that the second approach, compared with the first, leads to improved generalisation performance, and that the use of the CE function, compared with the MSE function, gives faster training and improved or equal generalisation performance.
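The distinction the abstract draws between the two learning criteria (MSE and CE) and the empirical classification criterion can be illustrated with a minimal numpy sketch. This is not the paper's implementation; the function names and the small 1-of-K example below are illustrative assumptions. It shows how the two loss criteria assign different penalties to the same correctly classified outputs, while the classification criterion (arg-max error rate) treats them identically:

```python
import numpy as np

def mse_loss(y_true, y_pred):
    # Mean Squared Error: average squared difference between
    # the 1-of-K target and the network output.
    return np.mean((y_true - y_pred) ** 2)

def ce_loss(y_true, y_pred, eps=1e-12):
    # Cross Entropy for 1-of-K targets; eps guards against log(0).
    p = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(p), axis=1))

def classification_error(y_true, y_pred):
    # Empirical classification criterion: fraction of samples whose
    # predicted class (arg-max of the outputs) differs from the target.
    return np.mean(np.argmax(y_pred, axis=1) != np.argmax(y_true, axis=1))

# Two hypothetical network outputs for the same two-class targets:
targets = np.array([[1.0, 0.0], [0.0, 1.0]])
out_a = np.array([[0.6, 0.4], [0.4, 0.6]])   # correct but unconfident
out_b = np.array([[0.9, 0.1], [0.1, 0.9]])   # correct and confident

# Both loss criteria penalise out_a more heavily than out_b,
# yet the classification criterion rates them equally (zero error).
```

This makes the paper's point concrete: minimising MSE or CE keeps driving outputs toward the targets even after every sample is already classified correctly, so the loss minimum need not coincide with the minimum classification error.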

Original language: English
Pages (from-to): 334-342
Number of pages: 9
Journal: Neural Computing & Applications
Volume: 7
Issue number: 4
Publication status: Published - 1998

Keywords

  • benchmarks
  • learning criteria
  • multilayer perceptron networks
  • pattern classification
  • radial basis function networks
  • training methods
  • CLASSIFICATION
  • ERROR
