This is the 5th in a series of panel discussions in collaboration with Neuromatch Academy, the online computational neuroscience summer school, and the 2nd of 3 in the deep learning series. In this episode, the panelists discuss their experiences "doing more with fewer parameters": convnets, RNNs, attention and transformers, and generative models (VAEs and GANs).
The other panels: