Paper · 21 November 2001
Neural information transfer in a noisy environment
Mark D. McDonnell, Charles E. M. Pearce, Derek Abbott
Proceedings Volume 4591, Electronics and Structures for MEMS II; (2001) https://doi.org/10.1117/12.449175
Event: International Symposium on Microelectronics and MEMS, 2001, Adelaide, Australia
Abstract
For an array of N summing comparators, each with the same internal noise, how should the set of thresholds, theta_i, be arranged to maximize the information at the output, given that the input signal, x, has an arbitrary probability density, P(x)? This problem is easy to solve when there is no internal noise. In this case, the transmitted information is equal to the entropy of the output signal, y. For N comparators there are N+1 possible output states, and hence y can take on N+1 values. The transmitted information is maximized when all output states have the same probability of occupation, that is, 1/(N+1). In this paper we address some preliminary considerations relating to the maximization of the transmitted information I = H(y) - H(y|x) when there is finite internal noise.
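The noiseless case described in the abstract can be sketched numerically: placing the thresholds at the quantiles theta_i = F^{-1}(i/(N+1)) of the input distribution gives every one of the N+1 output states occupation probability 1/(N+1), so H(y) approaches log2(N+1). The sketch below assumes a standard Gaussian input density purely for illustration (the paper allows an arbitrary P(x)); the function names and the bisection inverse-CDF are this example's own constructions, not from the paper.

```python
import math
import random

# Noiseless case: with N comparators, the summed output y = (number of
# thresholds below x) takes N+1 values, and I = H(y) is maximized when
# every state has occupation probability 1/(N+1). For a known input CDF
# F, this is achieved by thresholds at theta_i = F^{-1}(i/(N+1)).

def gaussian_cdf(x, mu=0.0, sigma=1.0):
    """CDF of a Gaussian input density (illustrative choice of P(x))."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def inverse_cdf(p, lo=-10.0, hi=10.0, tol=1e-9):
    """Invert the Gaussian CDF by bisection (simple, not fast)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if gaussian_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def optimal_thresholds(n):
    """theta_i = F^{-1}(i/(N+1)) for i = 1..N equalizes state occupancy."""
    return [inverse_cdf(i / (n + 1)) for i in range(1, n + 1)]

def output_entropy(thresholds, samples):
    """Estimate H(y) in bits from samples of the input signal x."""
    counts = [0] * (len(thresholds) + 1)
    for x in samples:
        y = sum(1 for t in thresholds if x > t)  # summed comparator output
        counts[y] += 1
    total = len(samples)
    return -sum(c / total * math.log2(c / total) for c in counts if c)

if __name__ == "__main__":
    random.seed(0)
    N = 7
    thetas = optimal_thresholds(N)
    samples = [random.gauss(0.0, 1.0) for _ in range(200_000)]
    H = output_entropy(thetas, samples)
    print(f"H(y) = {H:.3f} bits, log2(N+1) = {math.log2(N + 1):.3f} bits")
```

With internal noise, H(y|x) > 0 and the uniform-occupancy placement is no longer guaranteed to be optimal, which is the question the paper goes on to consider.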
© (2001) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Mark D. McDonnell, Charles E. M. Pearce, and Derek Abbott "Neural information transfer in a noisy environment", Proc. SPIE 4591, Electronics and Structures for MEMS II, (21 November 2001); https://doi.org/10.1117/12.449175
CITATIONS
Cited by 1 scholarly publication.
KEYWORDS
Interference (communication), Stochastic processes, Neurons, Signal to noise ratio, Solids, Quantization, Electronic components
