
Proceedings Paper

Comparing the computational complexity of the PNN, the PDM, and the MMNN (M2N2)
Author(s): Samir R. Chettri; Yoshimichi Murakami; Isamu Nagano; Jerry Garegnani

Paper Abstract

In classification, the goal is to assign an input vector to one of a discrete number of output classes. Classifier design has a long history, and classifiers have been put to a large number of uses. In this paper we continue the task, begun in earlier work, of categorizing classifiers by their computational complexity. In particular, we derive analytical formulas for the number of arithmetic operations in the probabilistic neural network (PNN), its polynomial expansion, also known as the polynomial discriminant method (PDM), and the mixture model neural network (M2N2). In addition, we test the classification accuracy of the PDM against the PNN and the M2N2 and find that all three are close in accuracy. Based on this research we can now choose among them on the basis of computational complexity, memory requirements, and training-set size, which is a great advantage in an operational environment. We also discuss the extension of such methods to hyperspectral data and find that only the M2N2 is suitable for application to such data.
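The paper derives operation counts rather than implementations, but the PNN it analyzes is the standard Parzen-window classifier: each class score is a sum of Gaussian kernels centered on that class's training vectors, so the arithmetic cost per query grows linearly with the training-set size. A minimal sketch of that classification rule (function and parameter names are ours, not from the paper; `sigma` is the kernel smoothing width):

```python
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=1.0):
    """Assign x to the class whose Gaussian kernel density estimate is largest."""
    classes = np.unique(train_y)
    scores = []
    for c in classes:
        Xc = train_X[train_y == c]                     # training vectors of class c
        d2 = np.sum((Xc - x) ** 2, axis=1)             # squared distances to x
        # average of Gaussian kernels centered on each training vector
        scores.append(np.exp(-d2 / (2.0 * sigma ** 2)).sum() / len(Xc))
    return classes[int(np.argmax(scores))]

# Two well-separated clusters, one per class
train_X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
train_y = np.array([0, 0, 1, 1])
label = pnn_classify(np.array([0.05, 0.0]), train_X, train_y)
```

Counting operations in the inner loop (one subtraction, square, and accumulation per training vector and dimension, plus one exponential per vector) is the kind of tally the paper formalizes for the PNN, and analogously for the PDM and M2N2.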

Paper Details

Date Published: 1 March 1998
PDF: 7 pages
Proc. SPIE 3240, 26th AIPR Workshop: Exploiting New Image Sources and Sensors, (1 March 1998); doi: 10.1117/12.300049
Author Affiliations:
Samir R. Chettri, NASA Goddard Space Flight Ctr. (United States)
Yoshimichi Murakami, JVC (Japan)
Isamu Nagano, Kanazawa Univ. (Japan)
Jerry Garegnani, NASA Goddard Space Flight Ctr. (United States)


Published in SPIE Proceedings Vol. 3240:
26th AIPR Workshop: Exploiting New Image Sources and Sensors
J. Michael Selander, Editor(s)

© SPIE.