
In this second part of my discussion with Wolfgang (check out the first part), we talk about spiking neural networks in general and the principles of brain computation he finds promising for building better network models. We also briefly overview some of his recent work applying those principles: models with biologically plausible learning mechanisms, a spiking-network analog of the well-known LSTM recurrent network, and meta-learning using reservoir computing.
Part 3 in our 100th episode celebration. Previous guests answered the question: Given the continual surprising progress in AI powered by scaling up parameters...
The white paper we discuss: Semantic Folding Theory And its Application in Semantic Fingerprinting. A nice talk Francisco gave: Semantic fingerprinting:...
Support the show to get full episodes and join the Discord community. Sri and Mei join me to discuss how including principles of neuromodulation...