This is the 5th in a series of panel discussions in collaboration with Neuromatch Academy, the online computational neuroscience summer school, and the 2nd of 3 in the deep learning series. In this episode, the panelists discuss their experiences "doing more with fewer parameters": convnets, RNNs, attention & transformers, and generative models (VAEs & GANs).
The other panels:
Mentioned during the show:
The paper we discuss: Hippocampal place cell encoding of sloping terrain.
Blake's website, where he writes his blog Noise mystery...
Grace's website
Twitter: @neurograce
Models of the Mind: How Physics, Engineering and Mathematics Have Shaped Our Understanding of the Brain
We talked about Grace's work using convolutional...
Show notes: This is the first in a series of episodes where I interview keynote speakers at the upcoming Cognitive Computational Neuroscience conference in...