
In this first part of our conversation, Brad and I discuss the state of neuromorphic computing and its relation to neuroscience and artificial intelligence. He describes neurogenesis deep learning, his work adding new neurons to deep learning networks during training, inspired by how neurogenesis in the dentate gyrus of the hippocampus supports learning new information while keeping previous memories intact. We also talk about his method for transforming deep learning networks into spiking neural networks so they can run on neuromorphic hardware, and about the Neuro Inspired Computational Elements (NICE) workshop, the annual neuromorphics workshop he organizes.
Show Notes:
Mentioned in the show, the two papers we discuss:
- The Roles of Supervised Machine Learning in Systems Neuroscience
- Machine learning for neural decoding
Kording...