This is the 5th in a series of panel discussions in collaboration with Neuromatch Academy, the online computational neuroscience summer school, and the 2nd of 3 in the deep learning series. In this episode, the panelists discuss their experiences “doing more with fewer parameters: ConvNets, RNNs, attention & transformers, generative models (VAEs & GANs).”
The other panels: