Matthias Hennig and Alireza Alemi
Tue 03 May 2016, 11:00 - 12:00
IF Room 4.31/4.33

If you have a question about this talk, please contact: Gareth Beedham (gbeedham)

Matthias Hennig

"Unsupervised spike sorting for large scale, high density multielectrode arrays"

Over the past few years, we have developed a set of methods and tools to analyse large-scale, high-density multielectrode array recordings. I will give an overview of this work and explain, in particular, how we solved the high-dimensional clustering problem of assigning the millions of events detected in these recordings to single neurons.
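For readers unfamiliar with this kind of pipeline, the sketch below illustrates the generic shape of the problem, not the speaker's actual method: detect threshold-crossing events in a filtered extracellular trace, reduce the waveforms to a few features, and cluster them into putative units. The noise estimate, window length, and choice of clusterer here are all illustrative assumptions.

```python
# Illustrative sketch of a generic spike-sorting pipeline (not the speaker's
# actual method). Assumes `trace` is a 1-D, band-pass-filtered extracellular
# signal from one channel; thresholds and window length are guesses.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import MeanShift

def detect_events(trace, thresh_sd=5.0, window=30):
    """Return waveform snippets around negative threshold crossings."""
    noise = np.median(np.abs(trace)) / 0.6745          # robust noise estimate
    crossings = np.where(trace < -thresh_sd * noise)[0]
    # keep only the first sample of each contiguous crossing
    starts = crossings[np.insert(np.diff(crossings) > window, 0, True)]
    starts = starts[(starts > window) & (starts < len(trace) - window)]
    return np.stack([trace[i - window:i + window] for i in starts])

def sort_channel(trace):
    """Cluster detected events into putative single units."""
    waveforms = detect_events(trace)
    features = PCA(n_components=3).fit_transform(waveforms)   # low-dim features
    labels = MeanShift().fit_predict(features)                # unsupervised clusters
    return waveforms, labels
```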


Alireza Alemi

"Optimizing information storage in recurrent neural networks"

Recurrent neural networks can store memory patterns as fixed-point attractors of their dynamics. A standard measure of performance for attractor networks is their storage capacity, which is 2N patterns for a conventional recurrent network of N binary neurons. The three-threshold learning rule (3TLR) is able to store patterns up to this maximal capacity without relying on an explicit “error signal”. However, this storage capacity is equivalent to a maximal information capacity of 2 bits per weight for unconstrained weights, which is far from ideal: in a noiseless theoretical scenario, a single unconstrained weight could store an infinite number of bits, a capacity that conventional attractor networks cannot achieve.
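To make "storing patterns as fixed points" concrete, here is a minimal sketch in which a plain perceptron rule stands in for the 3TLR (whose thresholds are not specified here): each neuron's incoming weights are adjusted until every stored pattern maps to itself under the sign-of-local-field update.

```python
# Minimal sketch: storing random binary patterns as fixed points of a
# recurrent network. A plain perceptron rule stands in for the 3TLR
# (assumption: the goal is only that each pattern satisfies sign(W @ x) == x).
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 50                        # neurons and patterns (P < 2N)
patterns = rng.choice([-1, 1], size=(P, N))

W = np.zeros((N, N))
for _ in range(200):                  # perceptron sweeps over all patterns
    converged = True
    for x in patterns:
        wrong = np.sign(W @ x) != x   # neurons whose update would flip
        if wrong.any():
            W[wrong] += np.outer(x[wrong], x) / N
            np.fill_diagonal(W, 0.0)  # keep self-connections at zero
            converged = False
    if converged:
        break

# Every stored pattern is now a fixed point of x -> sign(W @ x).
assert all((np.sign(W @ x) == x).all() for x in patterns)
```

At the maximal load P = 2N, the N² weights hold 2N patterns of N bits each, which is the 2 bits per weight quoted above.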

Here, I propose a hierarchical attractor network that can achieve an ultra-high information capacity. The network has two layers: a visible layer with Nv neurons and a hidden layer with Nh neurons. The visible-to-hidden connections are set at random and kept fixed during the training phase, in which the memory patterns are stored as fixed points of the network dynamics. The hidden-to-visible connections, initially normally distributed, are learned via the 3TLR. My simulations suggest that the maximal information capacity grows exponentially with the expansion ratio Nh/Nv.

As a first-order approximation to understand the mechanism providing this high capacity, I performed a naive mean-field approximation (nMFA) of the network. The nMFA captures the rapid increase in capacity, revealing that a key underlying factor is the correlation between the hidden and the visible units. The nMFA can be reformulated as a perceptron problem, which is amenable to an analytical calculation (the so-called replica method) whose result agrees with the simulations. Additionally, at maximal capacity, the degree of symmetry of the connectivity between the hidden and the visible neurons increases with the expansion ratio. These results highlight the role of hierarchical neural architecture in information storage.
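Under stated assumptions (layer sizes, a sign nonlinearity for the hidden units, and again a perceptron rule standing in for the 3TLR), the architecture described above can be sketched as follows: fixed random visible-to-hidden weights, trained hidden-to-visible weights, and a check that each pattern is a fixed point of the composed dynamics.

```python
# Schematic sketch of the hierarchical network described above. Sizes, the
# sign nonlinearity, and the perceptron stand-in for the 3TLR are assumptions.
import numpy as np

rng = np.random.default_rng(1)
Nv, Nh, P = 50, 200, 150                   # note P > 2*Nv, the conventional bound
xi = rng.choice([-1, 1], size=(P, Nv))     # memory patterns on the visible layer

F = rng.standard_normal((Nh, Nv))          # visible-to-hidden: random, fixed
H = np.sign(xi @ F.T)                      # hidden codes of the stored patterns

G = rng.standard_normal((Nv, Nh)) / np.sqrt(Nh)   # hidden-to-visible: learned
for _ in range(500):                       # perceptron sweeps (3TLR stand-in)
    converged = True
    for x, h in zip(xi, H):
        wrong = np.sign(G @ h) != x        # visible units the update would flip
        if wrong.any():
            G[wrong] += np.outer(x[wrong], h) / Nh
            converged = False
    if converged:
        break

# Check that each pattern is a fixed point of the composed dynamics
# x -> sign(G @ sign(F @ x)).
recalled = np.sign(np.sign(xi @ F.T) @ G.T)
print("fraction stored as fixed points:", (recalled == xi).all(axis=1).mean())
```

Since P exceeds 2·Nv, a conventional network on the visible layer alone could not store these patterns; the random expansion turns each visible neuron's learning task into a perceptron with Nh inputs, which is the perceptron reformulation mentioned in connection with the nMFA.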