Investigating the performance of MLP classifiers where limited training data are available for some classes

Parikh, CR, Pont, MJ, Li, Y and Jones, NB 2000, Investigating the performance of MLP classifiers where limited training data are available for some classes, in: Recent Advances in Soft Computing Techniques and Applications, 1-2 July 1999, Leicester, UK.

Full text not available from this repository.

Abstract

The standard implementation of the back-propagation training algorithm for multi-layer Perceptron (MLP) neural networks assumes that an equal number of samples is available for training each of the required classes. Where limited training data are available for one (or more) classes, sub-optimal performance may be obtained. We demonstrated in a previous study [Parikh et al., 1999. Proceedings of Condition Monitoring 1999, Swansea, UK] that, where unequal training class sizes cannot be avoided, the performance of the classifier may be substantially improved by duplicating the available patterns in the smaller class. In this study, we investigate whether the addition of random noise to the `duplicated' training patterns will further improve the classification performance. From the experiments conducted here, we conclude that the addition of noise does not give a consistent improvement in performance.
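The two resampling strategies compared in the abstract (plain duplication of minority-class patterns, and duplication with added random noise) can be sketched as follows. This is a minimal illustration of the general idea, not the authors' implementation; the function name, parameters, and Gaussian noise model are assumptions for the sketch.

```python
import random


def balance_by_duplication(patterns, target_size, noise_std=0.0, seed=0):
    """Grow a minority class to target_size by duplicating its patterns.

    patterns   -- list of feature vectors (lists of floats) for the small class
    noise_std  -- if > 0, each duplicate is jittered with zero-mean Gaussian
                  noise of this standard deviation (the variant under test);
                  if 0, duplicates are exact copies (the baseline strategy)
    """
    rng = random.Random(seed)
    balanced = list(patterns)
    while len(balanced) < target_size:
        # Cycle through the original patterns when choosing what to copy.
        base = patterns[len(balanced) % len(patterns)]
        if noise_std > 0:
            balanced.append([x + rng.gauss(0.0, noise_std) for x in base])
        else:
            balanced.append(list(base))
    return balanced


# Example: a two-pattern minority class padded out to six training samples.
minority = [[0.1, 0.2], [0.3, 0.4]]
exact_copies = balance_by_duplication(minority, 6)
jittered = balance_by_duplication(minority, 6, noise_std=0.05)
```

The balanced set would then be merged with the majority-class samples before back-propagation training, so that each class contributes a comparable number of patterns per epoch.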

Item Type: Conference or Workshop Item (Paper)
Schools: Schools > School of Computing, Science and Engineering
Journal or Publication Title: Advances in Soft Computing
Publisher: Physica-Verlag: Heidelberg
Refereed: Yes
Series Name: Advances in Soft Computing
Funders: Non funded research
Depositing User: Yuhua Li
Date Deposited: 27 Jul 2015 11:00
Last Modified: 05 Apr 2016 18:18
URI: http://usir.salford.ac.uk/id/eprint/33144
