
Proceedings Paper

Cross-validation techniques for n-tuple-based neural networks
Author(s): Christian Linneberg; Thomas Martini Joergensen

Paper Abstract

Despite its simple classification concept, impressive performance has been reported for the n-tuple architecture in combination with a very simple training strategy. In general, however, the performance of the n-tuple classifier depends strongly on the choice of input connections and on the encoding of the input data. Accordingly, the simple architecture needs to be accompanied by design tools for obtaining a suitable architecture. The simplicity of the architecture makes it easy to perform leave-one-out cross-validation tests and extensions of that concept. It is therefore also feasible to use design methods that make extensive use of such tests. This paper describes such design algorithms and, in particular, introduces a simple design strategy that allows the n-tuple architecture to perform satisfactorily in cases with skewed class priors. It can also help to resolve conflicts in the training material. The described methods are evaluated on classification problems from the European StatLog project, showing that the design tools extend the competitiveness of the n-tuple classification method.
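To illustrate why leave-one-out cross-validation is cheap for this architecture, the sketch below implements a basic RAM-based n-tuple classifier with counting memories: because training only accumulates per-cell counts, a sample's own contribution can simply be subtracted before it is scored, so no retraining is needed. This is a minimal illustration under assumed design choices (random input connections, counting rather than binary memories, hypothetical function names), not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_tuples(n_bits, n_tuples, tuple_size):
    """Random input connections: each tuple reads `tuple_size` input bits."""
    return [rng.choice(n_bits, size=tuple_size, replace=False)
            for _ in range(n_tuples)]

def addresses(x, tuples):
    """Turn a binary input vector into one RAM address per tuple."""
    return [int("".join(str(int(b)) for b in x[idx]), 2) for idx in tuples]

def train(X, y, tuples, n_classes, tuple_size):
    """Counting memories: rams[c, t, a] = number of class-c training samples
    that addressed cell a of tuple t."""
    rams = np.zeros((n_classes, len(tuples), 2 ** tuple_size), dtype=int)
    for x, c in zip(X, y):
        for t, a in enumerate(addresses(x, tuples)):
            rams[c, t, a] += 1
    return rams

def score(rams, x, tuples, exclude_class=None):
    """Per-class score = number of tuples whose addressed cell is non-empty.
    If `exclude_class` is given, the queried sample's own count (one per
    tuple) is subtracted first -- the leave-one-out trick."""
    addrs = addresses(x, tuples)
    counts = rams[:, np.arange(len(tuples)), addrs].copy()
    if exclude_class is not None:
        counts[exclude_class] -= 1
    return (counts > 0).sum(axis=1)

def leave_one_out_error(X, y, tuples, n_classes, tuple_size):
    """Leave-one-out error from a single training pass over the data."""
    rams = train(X, y, tuples, n_classes, tuple_size)
    errors = 0
    for x, c in zip(X, y):
        s = score(rams, x, tuples, exclude_class=c)
        errors += int(np.argmax(s) != c)
    return errors / len(X)
```

In this sketch the leave-one-out estimate for a whole training set costs little more than a single training pass plus one classification per sample, which is the property that makes cross-validation-driven design searches over connections and encodings practical.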

Paper Details

Date Published: 22 March 1999
PDF: 12 pages
Proc. SPIE 3728, Ninth Workshop on Virtual Intelligence/Dynamic Neural Networks, (22 March 1999); doi: 10.1117/12.343045
Author Affiliations:
Christian Linneberg, Intellix A/S (Denmark) and Riso National Lab. (Denmark)
Thomas Martini Joergensen, Riso National Lab. (Denmark)


Published in SPIE Proceedings Vol. 3728:
Ninth Workshop on Virtual Intelligence/Dynamic Neural Networks
Thomas Lindblad; Mary Lou Padgett; Jason M. Kinser, Editor(s)
