
Proceedings Paper

Multifunctional hybrid optical/digital neural net
Author(s): David P. Casasent

Paper Abstract

A multi-functional hybrid neural net is described. It is hybrid since it uses a digital hardware Hecht-Nielsen Corporation (HNC) neural net for adaptive learning and an optical neural net for on-line processing/classification. It is also hybrid in its combination of pattern recognition and neural net techniques. The system is multi-functional: it can function as an optimization and adaptive pattern recognition neural net as well as an auto- and heteroassociative processor.

1. INTRODUCTION

Neural nets (NNs) have recently received enormous attention [1-2], with increasing attention to the use of optical processors and a variety of new learning algorithms. Section 2 describes our hybrid NN with attention to its fabrication and the roles of the optical and digital processors. Section 3 details its use as an associative processor. Section 4 highlights its use in three optimization NN problems (a mixture NN, a multitarget tracker (MTT) NN, and a matrix inversion NN). Section 5 briefly notes its use as a production NN system and symbolic NN. Section 6 describes its use as an adaptive pattern recognition (PR) NN (that marries PR and NN techniques).

2. HYBRID ARCHITECTURE

Figure 1 shows our basic hybrid NN [3]. The optical portion of the system is a matrix-vector (M-V) processor whose vector output at P3 is the product of the vector at P1 and the matrix at P2. An HNC digital hardware NN is used during learning to determine the interconnection weights for P2. If P2 is a spatial light modulator (SLM), its contents can be updated (using gated learning) from the digital NN. The operations in most adaptive PR NN learning algorithms are sufficiently complex that they are best implemented digitally. In addition, the learning operations required are often not well suited for optical realization; for optimization NNs the weights are fixed, and in adaptive learning, learning is off-line and, once completed, the weights can often be fixed. Four gates are shown that determine the final output or the new P1 input neurons (depending on the application). We briefly discuss these cases now and detail how each arises in subsequent sections. In most optimization NNs an external vector a is added to the P3 output (Gate 1 achieves this). In all NNs, a nonlinear thresholding (P3 outputs are 0 or 1), truncation (all P3 outputs lie between 0 and 1), or maximum selection (the maximum P3 output is set to 1 and all other P3 outputs set to 0) …
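
The gated matrix-vector operation described above can be sketched in a few lines of NumPy. This is a minimal illustrative simulation, not the paper's implementation or the HNC hardware interface: the function names (mv_pass, nonlinearity), the prototype weight matrix, and the one-pass classification example are assumptions made for the sketch, with the weights standing in for the P2 interconnections computed off-line by the digital learning stage.

```python
# Minimal sketch of the gated matrix-vector pass (illustrative only).
import numpy as np

def nonlinearity(v, mode):
    """P3 post-processing: hard thresholding (outputs 0 or 1), truncation
    (outputs clipped to [0, 1]), or maximum selection (the largest P3
    output set to 1 and all others set to 0)."""
    if mode == "threshold":
        return (v > 0).astype(float)
    if mode == "truncate":
        return np.clip(v, 0.0, 1.0)
    if mode == "max_select":
        out = np.zeros_like(v)
        out[np.argmax(v)] = 1.0
        return out
    raise ValueError(f"unknown mode: {mode}")

def mv_pass(W, x0, a=None, mode="threshold", iters=1):
    """Optical-style processing: P3 = W @ P1 (the matrix-vector product),
    optionally add an external vector a at P3 (Gate 1), apply the chosen
    nonlinearity, and (for square W) feed the result back as the new P1
    input neurons.  W plays the role of the P2 interconnection weights,
    determined off-line by the digital learning stage and held fixed here."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        v = W @ x                        # optical matrix-vector product
        if a is not None:
            v = v + a                    # external vector added to the P3 output
        x = nonlinearity(v, mode)        # gated output / new P1 input
    return x

# One-pass classification example: rows of W are stored prototype vectors,
# so W @ x gives similarity scores and maximum selection picks the winner.
W = np.array([[1, 0, 1, 0, 1, 0],
              [1, 1, 0, 0, 1, 1]], dtype=float)
probe = [1, 0, 1, 1, 1, 0]               # noisy copy of the first prototype
print(mv_pass(W, probe, mode="max_select"))   # -> [1. 0.]
```

Run as shown, the matrix-vector product gives similarity scores of 3 and 2 for the probe against the two stored prototypes, and maximum selection returns the one-hot winner; switching the mode to "threshold" or "truncate" exercises the other two P3 nonlinearities noted above.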

Paper Details

Date Published: 1 August 1990
PDF: 11 pages
Proc. SPIE 1294, Applications of Artificial Neural Networks, (1 August 1990); doi: 10.1117/12.21154
David P. Casasent, Carnegie Mellon Univ. (United States)


Published in SPIE Proceedings Vol. 1294:
Applications of Artificial Neural Networks
Steven K. Rogers, Editor(s)

© SPIE.