This is the 5th in a series of panel discussions in collaboration with Neuromatch Academy, the online computational neuroscience summer school, and the 2nd of 3 in the deep learning series. In this episode, the panelists discuss their experiences "doing more with fewer parameters": ConvNets, RNNs, attention and transformers, and generative models (VAEs and GANs).