
Proceedings Paper

Relaxation properties and learning paradigms in complex systems
Author(s): Gianfranco Basti; Antonio Luigi Perrone; Giovanna Morgavi

Paper Abstract

With respect to the three paradigms of neural networks generally studied (convergent, oscillatory, chaotic), a fourth is proposed which, in a general sense, contains the preceding three as particular cases. It is defined as a nonstationary model of a spin-glass-like neural net with dynamics on both the spins and the weights, so that the net continuously redefines its phase space on a purely dynamical basis. The system thus displays different behaviors (noisy, chaotic, stable) as a function of its finite temporal order parameter, i.e., of a finite correlation among the spins acting on the weight dynamics. A first analysis of this model, which makes the probability distribution function on the spins nonstationary, is developed in comparison with several paradigms of relaxation neural nets formulated in the classical framework of statistical mechanics. The nonstationary, analytically unpredictable, yet deterministic and hence computable behavior of such a model is useful for making a neural net able to handle recognition tasks involving nonsteady inputs and semantic problems.
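The sketch below is a minimal illustration of the general idea described in the abstract, not the authors' equations: a spin-glass-like net in which the spins evolve under a standard Glauber-style update while the couplings are themselves updated from spin correlations accumulated over a finite temporal window. All parameter names (N, TAU, ETA, BETA) and the specific relaxation rule for the weights are hypothetical choices made for the example.

```python
# Illustrative sketch only (assumed dynamics, not the paper's model): a
# spin-glass-like net with dynamics on both the spins and the weights,
# the weight update driven by a finite-window spin correlation.
import numpy as np

rng = np.random.default_rng(0)

N = 64       # number of spins (hypothetical)
TAU = 10     # finite temporal window, standing in for the "temporal order parameter"
ETA = 0.01   # learning rate for the weight dynamics
BETA = 2.0   # inverse temperature for the spin dynamics

# Symmetric random couplings with zero diagonal (spin-glass-like initialization).
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
J = (J + J.T) / 2.0
np.fill_diagonal(J, 0.0)

s = rng.choice([-1.0, 1.0], size=N)   # spin configuration
history = []                          # recent spin states for the finite-window correlation

for t in range(1000):
    # Spin dynamics: asynchronous Glauber-style update at inverse temperature BETA.
    i = rng.integers(N)
    field = J[i] @ s
    p_up = 1.0 / (1.0 + np.exp(-2.0 * BETA * field))
    s[i] = 1.0 if rng.random() < p_up else -1.0

    # Keep only the last TAU spin states (the finite correlation window).
    history.append(s.copy())
    if len(history) > TAU:
        history.pop(0)

    # Weight dynamics: Hebbian-like relaxation of the couplings toward the
    # time-averaged spin-spin correlation over the finite window, making the
    # couplings (and hence the phase space) nonstationary.
    if len(history) == TAU:
        H = np.array(history)        # shape (TAU, N)
        C = H.T @ H / TAU            # finite-time correlation matrix
        np.fill_diagonal(C, 0.0)
        J += ETA * (C - J)

print("final mean coupling magnitude:", np.abs(J).mean())
```

Depending on BETA, ETA, and TAU, such a toy system can settle, wander noisily, or drift chaotically, which is the qualitative point the abstract makes about behavior depending on the finite temporal order parameter.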

Paper Details

Date Published: 1 August 1991
PDF: 18 pages
Proc. SPIE 1469, Applications of Artificial Neural Networks II, (1 August 1991); doi: 10.1117/12.45011
Author Affiliations:
Gianfranco Basti, Pontifical Gregorian Univ. (Italy)
Antonio Luigi Perrone, Univ. of Rome "Tor Vergata" (Italy)
Giovanna Morgavi, Institute for Electronic Circuits/NRC (Italy)


Published in SPIE Proceedings Vol. 1469:
Applications of Artificial Neural Networks II
Steven K. Rogers, Editor(s)

© SPIE.