
Proceedings Paper
Multiplicative versus additive noise in multistate neural networks
Paper Abstract
The effects of a variable amount of random dilution of the synaptic couplings in Q-Ising multi-state neural networks with Hebbian learning are examined. A fraction of the couplings is explicitly allowed to be anti-Hebbian. Random dilution represents the dying or pruning of synapses and, hence, a static disruption of the learning process, which can be considered as a form of multiplicative noise in the learning rule. Both parallel and sequential updating of the neurons can be treated. Symmetric dilution in the statics of the network is studied using the mean-field theory approach of statistical mechanics. General dilution, including asymmetric pruning of the couplings, is examined using the generating functional (path integral) approach of disordered systems. It is shown that random dilution acts as additive Gaussian noise in the Hebbian learning rule, with zero mean and a variance depending on the connectivity of the network and on the symmetry. Furthermore, a scaling factor appears that essentially measures the average amount of anti-Hebbian couplings.
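The multiplicative-to-additive correspondence described in the abstract can be illustrated numerically. The sketch below (a minimal NumPy illustration; the network size, pattern count, and the 1/c rescaling are illustrative choices, not taken from the paper) builds Hebbian couplings from random three-state patterns, applies symmetric random dilution as a multiplicative Bernoulli mask, and checks that the difference between the diluted and undiluted couplings behaves like zero-mean noise whose variance depends on the connectivity c:

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 2000, 10                  # neurons, stored patterns (illustrative sizes)
states = np.array([-1.0, 0.0, 1.0])  # Q = 3 Ising-like states

# Hebbian learning rule: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu
xi = rng.choice(states, size=(P, N))
J = xi.T @ xi / N
np.fill_diagonal(J, 0.0)

# Symmetric random dilution: keep each coupling with probability c.
# This is multiplicative (Bernoulli) noise on the learning rule.
c = 0.6
mask = rng.random((N, N)) < c
mask = np.triu(mask, 1)
mask = mask | mask.T             # enforce symmetric pruning
J_diluted = J * mask / c         # rescale so the mean coupling is preserved

# The abstract's claim in miniature: J_diluted looks like J plus
# additive noise with zero mean and connectivity-dependent variance.
noise = J_diluted - J
print("noise mean:", noise.mean())
print("noise variance:", noise.var())
```

Here the empirical noise variance scales like (1 - c)/c times the variance of the Hebbian couplings, so it vanishes for full connectivity (c = 1) and grows as the network is pruned, consistent with a variance that depends on the connectivity.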
Paper Details
Date Published: 25 May 2004
PDF: 11 pages
Proc. SPIE 5471, Noise in Complex Systems and Stochastic Dynamics II, (25 May 2004); doi: 10.1117/12.546293
Published in SPIE Proceedings Vol. 5471:
Noise in Complex Systems and Stochastic Dynamics II
Zoltan Gingl, Editor(s)
Author Affiliations:
Désiré Bollé, Katholieke Univ. Leuven (Belgium)
Jordi Busquets Blanco, Katholieke Univ. Leuven (Belgium)
Toni Verbeiren, Katholieke Univ. Leuven (Belgium)
© SPIE. Terms of Use
