
In this first part of our conversation, Brad and I discuss the state of neuromorphics and its relation to neuroscience and artificial intelligence. He describes neurogenesis deep learning, his work on adding new neurons to deep learning networks during training, inspired by how neurogenesis in the dentate gyrus of the hippocampus lets us learn new things while keeping previous memories intact. We also talk about his method for transforming deep learning networks into spiking neural networks so they can run on neuromorphic hardware, and about the Neuro Inspired Computational Elements (NICE) workshop, the neuromorphics meeting he organizes every year.
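To make the neurogenesis idea concrete, here is a minimal illustrative sketch in PyTorch, not the algorithm from Brad's neurogenesis deep learning paper: it grows a hidden layer mid-training while keeping the weights already learned, so the network's existing behavior is undisturbed when the new units are added. The function name `grow_hidden_layer` and all specifics are hypothetical.

```python
# Illustrative sketch only: grow a hidden layer without disturbing what the
# network has already learned. This is not the published neurogenesis deep
# learning algorithm; it just shows the general "add neurons, keep old
# weights" idea.
import torch
import torch.nn as nn

def grow_hidden_layer(fc_in: nn.Linear, fc_out: nn.Linear, new_units: int):
    """Return enlarged copies of two stacked Linear layers with `new_units`
    extra hidden neurons. Old weights are copied over; the new output
    columns start at zero, so the network's outputs are initially unchanged."""
    old_hidden = fc_in.out_features
    grown_in = nn.Linear(fc_in.in_features, old_hidden + new_units)
    grown_out = nn.Linear(old_hidden + new_units, fc_out.out_features)
    with torch.no_grad():
        # Copy the old input->hidden weights into the first rows;
        # the remaining rows keep their random initialization (the "new neurons").
        grown_in.weight[:old_hidden] = fc_in.weight
        grown_in.bias[:old_hidden] = fc_in.bias
        # Copy the old hidden->output weights into the first columns and
        # zero the new columns so previously learned behavior is preserved.
        grown_out.weight[:, :old_hidden] = fc_out.weight
        grown_out.weight[:, old_hidden:] = 0.0
        grown_out.bias[:] = fc_out.bias
    return grown_in, grown_out

# Usage: expand a small two-layer net from 64 to 80 hidden units mid-training.
fc1, fc2 = nn.Linear(784, 64), nn.Linear(64, 10)
fc1, fc2 = grow_hidden_layer(fc1, fc2, new_units=16)
```

After growing, training continues on the enlarged network; the new units can then specialize on new data while the copied weights retain what was learned before.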