https://www.wwtns.online - on Twitter: wwtns@TheoreticalWide

You are cordially invited to the lecture 

Riccardo Zecchina

Bocconi University, Milano


Local Deep Learning without Gradients in Asymmetric Recurrent Networks


The lecture will be held on Zoom on Wednesday, June 18, 2025, at 11:00 am EDT.


Abstract:  We introduce a statistical physics framework for learning in neural architectures composed of single or interconnected asymmetric attractor networks. These systems can exhibit a manifold of global fixed points capable of implementing sophisticated input-output mappings, which we characterize analytically. Learning from extensive datasets is achieved through the stabilization of fixed points via a fully distributed and local learning process, implemented at the single-neuron level. This simple mechanism yields performance comparable to that of conventional feedforward deep neural networks trained using gradient-based methods. The effectiveness of the model stems from the dense and accessible manifolds of stable fixed points, which encode the internal representations of data. Unlike other approaches to deep learning without backpropagation, our method does not attempt to estimate gradients.
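To make the idea of stabilizing fixed points through a fully local, gradient-free rule concrete, here is a minimal toy sketch in NumPy. It is an illustration of the general principle only, not the speaker's actual algorithm: each neuron in an asymmetric recurrent network independently adjusts its own incoming weights with a perceptron-style update until a target state becomes a stable fixed point of the dynamics. All names, the margin parameter, and the update rule are assumptions made for this sketch.

```python
import numpy as np

# Toy sketch (hypothetical, not the lecture's method): stabilize one target
# state as a fixed point of an asymmetric recurrent network using only
# local, single-neuron updates -- no gradients, no backpropagation.

rng = np.random.default_rng(0)
N = 100                                      # number of neurons
J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # asymmetric couplings (J != J.T)
np.fill_diagonal(J, 0.0)                     # no self-connections
target = rng.choice([-1.0, 1.0], size=N)     # desired fixed point (+-1 states)

eta, margin = 0.05, 0.5                      # learning rate and stability margin
for _ in range(200):
    h = J @ target                           # local field seen by each neuron
    unstable = target * h < margin           # neurons whose field is too weak
    if not unstable.any():
        break
    # Each unstable neuron i updates only its own incoming row J[i, :],
    # pushing its field toward the sign of target[i] -- a purely local rule.
    J[unstable] += eta * np.outer(target[unstable], target)
    np.fill_diagonal(J, 0.0)

# Stability check: one synchronous step of the dynamics s -> sign(J s)
print(np.all(np.sign(J @ target) == target))
```

Because each row of `J` is trained independently against the same target state, the update decomposes into N independent perceptron problems, which is what makes the rule fully distributed at the single-neuron level.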

About VVTNS: Launched as the World Wide Theoretical Neuroscience Seminar (WWTNS) in November 2020 and renamed in homage to Carl van Vreeswijk, in memoriam (April 20, 2022), the series gives speakers the opportunity to discuss theoretical aspects of their work that cannot be presented in a setting where the majority of the audience consists of experimentalists. The seminars, held on Wednesdays at 11 am ET, are 45-50 minutes long, followed by a discussion. The talks are recorded with the speaker's authorization and are available to everybody on our YouTube channel.