
Proceedings Paper

Partial least-squares regression neural network (PLSNET) with supervised adaptive modular learning

Paper Abstract

We present an adaptive linear neural network architecture called PLSNET, based on partial least-squares (PLS) regression. The architecture is a modular network whose stages correspond to the number of PLS factors to be retained. PLSNET actually consists of two separate but coupled architectures: PLSNET-C for PLS calibration, and PLSNET-P for prediction (or estimation). We show that PLSNET-C can be trained by supervised learning with three standard Hebbian learning rules that extract the PLS weight loading vectors, the regression coefficients, and the loading vectors for the univariate output case (single target values). The PLS information extracted by PLSNET-C after training, i.e., the three sets of synaptic weights, is used by PLSNET-P as fixed weights (through the coupling) in its architecture. PLSNET-P can then yield predictions of the output variable given test measurements as its input. Two examples are presented: the first illustrates the typically improved predictive capability of PLSNET compared with classical least-squares, and the second shows how PLSNET can be used for parametric system identification.
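For readers who want to see the underlying computation, the following is a minimal NumPy sketch of standard univariate PLS (PLS1) calibration and prediction, i.e., the quantities the abstract says PLSNET-C extracts (weight loading vectors, loading vectors, regression coefficients) and PLSNET-P then applies as fixed weights. This is not the authors' network implementation; the function names, the mean-centering convention, and the per-factor deflation loop are illustrative assumptions based on the classical PLS1 algorithm, not on the paper itself.

import numpy as np

def pls1_calibrate(X, y, n_factors):
    # Classical PLS1 calibration; X and y are assumed mean-centered.
    # Each pass of the loop plays the role of one PLSNET stage (module),
    # extracting one PLS factor.
    E = X.astype(float).copy()   # deflated data matrix, n samples x m variables
    f = y.astype(float).copy()   # deflated output vector, length n
    W, P, b = [], [], []
    for _ in range(n_factors):
        w = E.T @ f
        w /= np.linalg.norm(w)   # weight loading vector
        t = E @ w                # score vector
        tt = float(t @ t)
        p = E.T @ t / tt         # X-loading vector
        bk = float(t @ f) / tt   # inner regression coefficient
        E -= np.outer(t, p)      # deflate X
        f -= bk * t              # deflate y
        W.append(w); P.append(p); b.append(bk)
    return np.column_stack(W), np.column_stack(P), np.array(b)

def pls1_predict(X_new, W, P, b):
    # Prediction with the fixed weights obtained during calibration,
    # analogous to PLSNET-P reusing PLSNET-C's synaptic weights.
    E = X_new.astype(float).copy()
    y_hat = np.zeros(E.shape[0])
    for k in range(W.shape[1]):
        t = E @ W[:, k]
        y_hat += b[k] * t
        E -= np.outer(t, P[:, k])
    return y_hat

Retaining fewer factors than the rank of X is what gives PLS its regularizing effect relative to classical least-squares; the number of loop passes corresponds directly to the number of PLSNET stages.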

Paper Details

Date Published: 22 March 1996
PDF: 12 pages
Proc. SPIE 2760, Applications and Science of Artificial Neural Networks II, (22 March 1996); doi: 10.1117/12.235962
Author Affiliations:
Fredric M. Ham, Florida Institute of Technology (United States)
Ivica Kostanic, Florida Institute of Technology (United States)


Published in SPIE Proceedings Vol. 2760:
Applications and Science of Artificial Neural Networks II
Steven K. Rogers; Dennis W. Ruck, Editors

© SPIE.