Journal article

Analytical Calculation of Mutual Information between Weakly Coupled Poisson-Spiking Neurons in Models of Dynamically Gated Communication

Abstract

Mutual information is a commonly used measure of communication between neurons, but little theory exists describing the relationship between mutual information and the parameters of the underlying neuronal interaction. Such a theory could help us understand how specific physiological changes affect the capacity of neurons to communicate synaptically and, in particular, could help us characterize the mechanisms by which neuronal dynamics gate the flow of information in the brain. Here we study a pair of linear-nonlinear-Poisson neurons coupled by a weak synapse. We derive an analytical expression describing the mutual information between their spike trains in terms of synapse strength, the neuronal activation function, the time course of postsynaptic currents, and the time course of the background input received by the two neurons. This expression makes mutual information calculations feasible that would otherwise be computationally intractable. We use it to analytically explore the interplay of excitation, information transmission, and the convexity of the activation function. Then, using the expression to quantify mutual information in simulations, we illustrate the information-gating effects of neural oscillations and oscillatory coherence, which may either increase or decrease the mutual information across the synapse depending on parameters. Finally, we show analytically that our results can quantitatively describe the selection of one information pathway over another when multiple sending neurons project weakly to a single receiving neuron.
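
Illustrative model sketch

The following minimal Python sketch illustrates the class of model the abstract describes: two linear-nonlinear-Poisson neurons, a sender and a receiver, coupled by a weak synapse whose spikes are filtered by an exponential postsynaptic-current kernel. The discrete-time Poisson approximation, the exponential activation function, and all parameter values are illustrative assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

dt = 1e-3          # time step (s); assumed for illustration
T = 10.0           # simulated duration (s)
n = int(T / dt)

base_rate = 20.0   # background firing rate of the sending neuron (Hz)
epsilon = 0.05     # weak synaptic coupling strength (assumed value)
tau_syn = 0.01     # postsynaptic current decay time constant (s)

def activation(x):
    # Exponential nonlinearity mapping synaptic drive to firing rate (Hz);
    # the actual activation function in the paper may differ.
    return 20.0 * np.exp(x)

# Sender: homogeneous Poisson spiking at base_rate.
sender_spikes = rng.random(n) < base_rate * dt

# Receiver: LNP neuron driven by the sender's spike train filtered through
# an exponentially decaying postsynaptic-current kernel, scaled by epsilon.
drive = 0.0
receiver_spikes = np.zeros(n, dtype=bool)
for t in range(n):
    drive += (-drive / tau_syn) * dt + epsilon * sender_spikes[t]
    rate = activation(drive)
    receiver_spikes[t] = rng.random() < rate * dt

print(f"sender rate:   {sender_spikes.sum() / T:.1f} Hz")
print(f"receiver rate: {receiver_spikes.sum() / T:.1f} Hz")

Estimating the mutual information between the two simulated spike trains numerically is exactly the kind of computation the paper's analytical expression is intended to replace; this sketch only generates the coupled spike trains themselves.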

Authors

Cannon J

Journal

Neural Computation, Vol. 29, No. 1, pp. 118–145

Publisher

MIT Press

Publication Date

January 1, 2017

DOI

10.1162/neco_a_00915

ISSN

0899-7667
