
In this second part of my discussion with Wolfgang (check out the first part), we talk about spiking neural networks in general and the principles of brain computation he finds promising for building better network models. We also briefly overview some of his recent work that uses these principles: models with biologically plausible learning mechanisms, a spiking-network analog of the well-known LSTM recurrent network, and meta-learning using reservoir computing.
Mentioned in the show: Adam’s website. Follow him on Twitter. He made Technology Review’s 35 Innovators Under 35. The paper we discuss: Toward an...
Part 3 in our 100th episode celebration. Previous guests answered the question: Given the continual surprising progress in AI powered by scaling up parameters...
Rodrigo and I discuss concept cells and his latest book, NeuroScience Fiction. The book is a whirlwind of many of the big questions in...