This is the 5th in a series of panel discussions in collaboration with Neuromatch Academy, the online computational neuroscience summer school, and the 2nd of 3 in the deep learning series. In this episode, the panelists discuss their experiences "doing more with fewer parameters": convnets, RNNs, attention & transformers, and generative models (VAEs & GANs).
The other panels:
In the intro, I mention the Bernstein conference workshop I'll participate in,...
Stefan and I discuss creativity and constraint in artificial and biological intelligence. We talk about his Asimov Institute and its goal of artificial creativity...
Randy and I discuss his LEABRA cognitive architecture that aims to simulate the human brain, plus his current theory about how a loop between...