This is the 5th in a series of panel discussions in collaboration with Neuromatch Academy, the online computational neuroscience summer school, and the 2nd of 3 in the deep learning series. In this episode, the panelists discuss their experiences “doing more with fewer parameters: Convnets, RNNs, attention & transformers, generative models (VAEs & GANs).”
The other panels:
Thomas and I discuss the role of recurrence in visual cognition: how brains somehow excel with so few “layers” compared to deep nets, how...
What is creativity? How do we measure it? How do our brains implement it, and how might AI? Those are some of the questions John,...
Olaf and I discuss the explosion of network neuroscience, which uses network science tools to map the structure (connectome) and activity of the brain...