
Proceedings Paper

Neural information transfer in a noisy environment
Author(s): Mark D. McDonnell; Charles E. M. Pearce; Derek Abbott

Paper Abstract

For an array of N summing comparators, each with the same internal noise, how should the set of thresholds, θi, be arranged to maximize the information at the output, given that the input signal, x, has an arbitrary probability density, P(x)? This problem is easy to solve when there is no internal noise. In that case, the transmitted information is equal to the entropy of the output signal, y. For N comparators there are N+1 possible output states, and hence y can take on N+1 values. The transmitted information is maximized when all output states have the same probability of occupation, namely 1/(N+1). In this paper we address some preliminary considerations relating to the maximization of the transmitted information I = H(y) - H(y|x) when there is finite internal noise.
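The noiseless solution described in the abstract can be checked numerically: placing threshold θi at the i/(N+1) quantile of P(x) makes all N+1 output states equiprobable. Below is a minimal sketch assuming a standard Gaussian input density — a choice made here purely for illustration, since the paper treats an arbitrary P(x).

```python
from statistics import NormalDist

# Noiseless case: with N comparators the output y takes N+1 states.
# H(y) is maximized when each state has probability 1/(N+1), which
# places threshold theta_i at the i/(N+1) quantile of P(x).
# Assumption (for illustration only): P(x) is standard Gaussian.

N = 4  # number of comparators
P = NormalDist(mu=0.0, sigma=1.0)

# Thresholds at the quantiles i/(N+1), i = 1..N
thresholds = [P.inv_cdf(i / (N + 1)) for i in range(1, N + 1)]

# Check: probability mass between consecutive thresholds is 1/(N+1)
edges = [float("-inf")] + thresholds + [float("inf")]
probs = [P.cdf(b) - P.cdf(a) for a, b in zip(edges, edges[1:])]
print(thresholds)
print(probs)  # each state occupied with probability 1/(N+1) = 0.2
```

With equiprobable states the output entropy reaches its maximum, H(y) = log2(N+1) bits; the paper's subject is how this picture changes once H(y|x) is nonzero due to internal comparator noise.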

Paper Details

Date Published: 21 November 2001
PDF: 11 pages
Proc. SPIE 4591, Electronics and Structures for MEMS II, (21 November 2001); doi: 10.1117/12.449175
Author Affiliations:
Mark D. McDonnell, Adelaide Univ. (Australia)
Charles E. M. Pearce, Adelaide Univ. (Australia)
Derek Abbott, Adelaide Univ. (Australia)

Published in SPIE Proceedings Vol. 4591:
Electronics and Structures for MEMS II
Neil W. Bergmann; Derek Abbott; Alex Hariz; Vijay K. Varadan, Editor(s)

© SPIE.