Research Repository

Investigating the performance of MLP classifiers where limited training data are available for some classes

Parikh, CR; Pont, MJ; Li, Y; Jones, NB

Abstract

The standard implementation of the back-propagation training algorithm for multi-layer Perceptron (MLP) neural networks assumes that there are an equal number of training samples for each of the required classes. Where limited training data are available for one (or more) classes, sub-optimal performance may be obtained. We demonstrated in a previous study [Parikh et al., 1999. Proceedings of Condition Monitoring 1999, Swansea, UK] that, where unequal training class sizes cannot be avoided, the performance of the classifier may be substantially improved by duplicating the available patterns in the smaller class. In this study, we investigate whether the addition of random noise to the 'duplicated' training patterns further improves classification performance. We conclude that the addition of noise does not give a consistent improvement in performance.
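The class-balancing scheme described in the abstract (duplicating minority-class patterns, optionally jittered with random noise) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the function name, the use of Gaussian noise, and the `noise_std` parameter are assumptions for the sketch.

```python
import numpy as np

def balance_by_duplication(X, y, minority_label, noise_std=0.0, seed=0):
    """Oversample the minority class by duplicating its patterns until class
    counts match; optionally perturb the duplicates with Gaussian noise.

    Hypothetical helper illustrating the scheme in the abstract; the paper
    does not specify the noise distribution or sampling details.
    """
    rng = np.random.default_rng(seed)
    X_min = X[y == minority_label]
    n_min = len(X_min)
    n_maj = len(X) - n_min
    deficit = n_maj - n_min  # how many duplicates are needed to balance

    # Draw duplicates from the minority class with replacement.
    idx = rng.integers(0, n_min, size=deficit)
    dupes = X_min[idx]
    if noise_std > 0:
        # Jitter only the duplicated patterns, leaving originals untouched.
        dupes = dupes + rng.normal(0.0, noise_std, size=dupes.shape)

    X_bal = np.vstack([X, dupes])
    y_bal = np.concatenate([y, np.full(deficit, minority_label)])
    return X_bal, y_bal
```

With eight majority-class and two minority-class patterns, the function appends six (possibly jittered) duplicates so both classes contribute eight patterns to training.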

Citation

Parikh, C., Pont, M., Li, Y., & Jones, N. (1999, July). Investigating the performance of MLP classifiers where limited training data are available for some classes. Presented at Recent Advances in Soft Computing Techniques and Applications, Leicester, UK.

Presentation Conference Type Other
Conference Name Recent Advances in Soft Computing Techniques and Applications
Conference Location Leicester, UK
Start Date Jul 1, 1999
End Date Jul 2, 1999
Publication Date Jul 1, 2000
Deposit Date Jul 27, 2015
Series Title Advances in Soft Computing
Additional Information Event Type: Conference