Professor Peter Latham, Gatsby Computational Neuroscience Unit, UCL
Chair: Mark van Rossum
Thu 23 Apr 2015, 11:00 - 12:00
Informatics Forum (IF-4.31/4.33)

If you have a question about this talk, please contact: Mary-Clare Mackay (mmackay3)

Organisms face a hard problem: based on noisy sensory input, they must
set a large number of synaptic weights. However, they do not receive
enough information in their lifetime to learn the correct, or optimal,
weights (i.e., the weights that ensure the circuit, the system, and
ultimately the organism function as effectively as possible). Instead,
the best they could possibly do is compute a probability distribution
over the optimal weights. Based on this observation, we hypothesize
that synapses represent probability distributions over weights --- in
contrast to the widely held belief that they represent point
estimates. From this hypothesis, we derive learning rules for
supervised, reinforcement and unsupervised learning. This introduces a
new feature: the more uncertain the brain is about the optimal weight
of a synapse, the more plastic it is. This is consistent with current
data, and yields several testable predictions.
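
A minimal sketch of the idea, assuming a diagonal Gaussian posterior over
weights and a Kalman-filter-style supervised update (the abstract does not
specify the learning rules actually derived in the talk): each synapse's
effective learning rate is proportional to its posterior variance, so the
more uncertain a synapse, the more plastic it is. The function name and the
drift term below are illustrative assumptions, not part of the talk.

import numpy as np

def bayesian_synapse_update(mu, s2, x, y, sigma_y=1.0, drift=1e-4):
    """One update of a diagonal-Gaussian posterior over synaptic weights.

    mu, s2  : per-synapse posterior mean and variance, shape (n,)
    x       : presynaptic input vector, shape (n,)
    y       : observed (target) postsynaptic output, scalar
    sigma_y : assumed output noise standard deviation
    drift   : small variance added each step so uncertainty, and hence
              plasticity, never collapses to zero (an assumption)
    """
    delta = y - mu @ x                      # prediction error
    total_var = sigma_y**2 + (s2 * x**2).sum()
    gain = s2 * x / total_var               # per-synapse learning rate,
                                            # proportional to that synapse's
                                            # posterior variance s2
    mu = mu + gain * delta                  # uncertain synapses move more
    s2 = s2 * (1.0 - gain * x) + drift      # data shrinks the uncertainty
    return mu, s2

# Toy run: infer a fixed "optimal" weight vector from noisy examples.
rng = np.random.default_rng(0)
n = 5
w_opt = rng.normal(size=n)
mu, s2 = np.zeros(n), np.ones(n)            # broad prior = high plasticity
for _ in range(500):
    x = rng.normal(size=n)
    y = w_opt @ x + 0.5 * rng.normal()
    mu, s2 = bayesian_synapse_update(mu, s2, x, y, sigma_y=0.5)
print(np.round(mu - w_opt, 2), np.round(s2, 3))

In this sketch the posterior variances start large, so early updates are big;
as evidence accumulates the variances shrink and the synapses become less
plastic, which is the qualitative behaviour the abstract describes.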

Peter Latham's web site: http://www.gatsby.ucl.ac.uk/~pel/