
Deterministic networks for probabilistic computing

Abstract

Neuronal-network models of high-level brain function often rely on the presence of stochasticity. The majority of these models assume that each neuron is equipped with its own private source of randomness, often in the form of uncorrelated external noise. In biological neuronal networks, the origin of this noise remains unclear. In hardware implementations, the number of noise sources is limited by space and bandwidth constraints. Hence, neurons in large networks have to share noise sources. We show that the resulting shared-noise correlations can significantly impair the computational performance of stochastic neuronal networks, but that this problem is naturally overcome by generating noise with deterministic recurrent neuronal networks. By virtue of the decorrelating effect of inhibitory feedback, a network of a few hundred neurons can serve as a natural source of uncorrelated noise for large ensembles of functional networks, each comprising thousands of units.
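
The core claim of the abstract, that drawing noise from a small shared pool induces correlations while a deterministic recurrent network can supply many nearly uncorrelated channels, can be illustrated with a minimal NumPy sketch. This is not the authors' model: a generic deterministic chaotic rate network (gain g > 1) stands in for their inhibition-dominated spiking networks, and all sizes and parameters below are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the authors' model):
# compare the pairwise correlations of noise channels drawn from a small
# shared pool with channels read out of a deterministic recurrent network.
import numpy as np

rng = np.random.default_rng(0)
T = 5000        # number of time steps (assumed)
N_chan = 100    # noise channels needed by the functional network (assumed)

# --- Shared pool: only M independent sources, each channel taps one ---
M = 10
pool = rng.standard_normal((M, T))        # M independent white-noise traces
assignment = rng.integers(0, M, N_chan)   # random source per channel
shared = pool[assignment]                 # (N_chan, T); two channels share a source with prob. 1/M

# --- Deterministic recurrent network as noise source ---
# A generic chaotic rate network (tanh units, g > 1) stands in for the
# paper's inhibition-dominated spiking networks; the dynamics are fully deterministic.
N_net, g, dt = 500, 1.5, 0.1
J = g * rng.standard_normal((N_net, N_net)) / np.sqrt(N_net)
x = 0.1 * rng.standard_normal(N_net)      # randomness enters only via the initial state
rates = np.empty((N_net, T))
burn_in = 500
for t in range(T + burn_in):
    x = x + dt * (-x + J @ np.tanh(x))    # deterministic update, no injected noise
    if t >= burn_in:
        rates[:, t - burn_in] = np.tanh(x)
network = rates[:N_chan]                  # read out N_chan units as noise channels

def mean_pairwise_corr(sig):
    c = np.corrcoef(sig)
    return c[~np.eye(len(c), dtype=bool)].mean()

print("shared pool      :", mean_pairwise_corr(shared))   # approx. 1/M
print("deterministic net:", mean_pairwise_corr(network))  # close to zero
```

In the paper itself the decorrelation is attributed to inhibitory feedback in spiking networks rather than to chaotic rate dynamics; the sketch only illustrates why a recurrent network of a few hundred units can replace a large number of independent noise generators.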

This paper was published in Juelich Shared Electronic Resources.
