
In this second part of my discussion with Wolfgang (check out the first part), we talk about spiking neural networks in general and the principles of brain computation he finds promising for building better network models. We also briefly survey some of his recent work applying these principles: models with biologically plausible learning mechanisms, a spiking-network analog of the well-known LSTM recurrent network, and meta-learning using reservoir computing.