
Improving the performance of multilayer perceptrons where limited training data are available for some classes

Parikh, CR; Pont, MJ; Li, Y; Jones, NB


Abstract

The standard multi-layer perceptron (MLP) training algorithm implicitly assumes that equal numbers of examples are available to train each of the network classes. However, in many condition monitoring and fault diagnosis (CMFD) systems, data representing fault conditions can only be obtained with great difficulty: as a result, training classes may vary greatly in size, and the overall performance of an MLP classifier may be comparatively poor. We describe two techniques which can help ameliorate the impact of unequal training set sizes. We demonstrate the effectiveness of these techniques using simulated fault data representative of that found in a broad class of CMFD problems.
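
The abstract does not detail the paper's two techniques, but a common way to compensate for unequal class sizes is to weight each class's contribution to the training loss in inverse proportion to its frequency. The sketch below (an illustrative assumption, not the authors' method) shows such class weighting applied to a cross-entropy loss using NumPy:

```python
# Minimal sketch (not the paper's method): offsetting unequal class sizes
# by weighting each class's loss contribution inversely to its frequency.
import numpy as np

def class_weights(labels, n_classes):
    """Weights inversely proportional to class frequency, normalised so
    the average weight over the training set is 1."""
    counts = np.bincount(labels, minlength=n_classes).astype(float)
    counts = np.maximum(counts, 1.0)  # guard against empty classes
    return len(labels) / (n_classes * counts)

def weighted_cross_entropy(probs, labels, weights):
    """Mean weighted cross-entropy; `probs` is an (n_samples, n_classes)
    array of softmax outputs, `labels` holds integer class indices."""
    eps = 1e-12
    per_sample = -np.log(probs[np.arange(len(labels)), labels] + eps)
    return np.mean(weights[labels] * per_sample)

# Example: 200 'normal' examples but only 10 examples of a fault class,
# as might occur in a condition monitoring / fault diagnosis data set.
labels = np.array([0] * 200 + [1] * 10)
w = class_weights(labels, n_classes=2)
print(w)  # the rare fault class receives a much larger weight
```

In this sketch the rare fault class is weighted roughly twenty times more heavily than the majority class, so misclassifying its few examples costs the network correspondingly more during training.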

Citation

Parikh, C., Pont, M., Li, Y., & Jones, N. (1999, September). Improving the performance of multilayer perceptrons where limited training data are available for some classes. Presented at the 9th International Conference on Artificial Neural Networks: ICANN '99, Edinburgh.

Presentation Conference Type: Other
Conference Name: 9th International Conference on Artificial Neural Networks: ICANN '99
Conference Location: Edinburgh
Start Date: Sep 7, 1999
End Date: Sep 10, 1999
Publication Date: Sep 7, 1999
Deposit Date: Jul 27, 2015
Publisher: Institution of Engineering and Technology (IET)
Publisher URL: http://dx.doi.org/10.1049/cp:19991113
Additional Information: Event Type: Conference
