BI 147 Noah Hutton: In Silico

September 13, 2022 01:37:08
Brain Inspired

Show Notes

Check out my free video series about what's missing in AI and Neuroscience

Support the show to get full episodes and join the Discord community.

Noah Hutton writes, directs, and scores documentary and narrative films. On this episode, we discuss his documentary In Silico. In 2009, Noah watched a TED talk by Henry Markram, in which Henry claimed it would take 10 years to fully simulate a human brain. This claim inspired Noah to chronicle the project, visiting Henry and his team periodically throughout. The result was In Silico, which tells the science, human, and social story of Henry's massively funded projects - the Blue Brain Project and the Human Brain Project.

0:00 - Intro
3:36 - Release and premiere
7:37 - Noah's background
9:52 - Origins of In Silico
19:39 - Recurring visits
22:13 - Including the critics
25:22 - Markram's shifting outlook and salesmanship
35:43 - Promises and delivery
41:28 - Computer and brain terms interchange
49:22 - Progress vs. illusion of progress
52:19 - Close to quitting
58:01 - Salesmanship vs. bad at estimating timelines
1:02:12 - Brain simulation science
1:11:19 - AGI
1:14:48 - Brain simulation vs. neuro-AI
1:21:03 - Opinion on TED talks
1:25:16 - Hero worship
1:29:03 - Feedback on In Silico

Episode Transcript

Speaker 1 00:00:03 It was the first time I'd really heard a scientist plant a flag, uh, in terms of a timeline, on a certain time in the future when a great insight will be gained. This was a crazy idea, that we could do a human brain in 10 years. And I continued to run into that criticism, and to this day, I mean, people who have seen this film and reacted to it think that that's a crazy landmark to be thinking of in any kind of timeframe of a decade. You wanna feel like the work you're doing has an actual impact in your lifetime, in the community of humans you live amongst. And when you're doing such a long-term, speculative project that's building a model from the bottom up over so much time, what keeps you in it? What keeps you bound to the work? Speaker 0 00:00:57 This is Brain Inspired. Speaker 2 00:01:12 That was Noah Hutton, and I am Paul Middlebrooks. Welcome to Brain Inspired. This is a different kind of episode for Brain Inspired, partly because Noah is not officially a scientist or philosopher but a filmmaker, and starting today his documentary In Silico is available to stream on Vimeo. In Silico is the result of Noah's 10-plus-year journey capturing the progress of massively funded projects in neuroscience, the Blue Brain Project and the Human Brain Project, which we discuss in the episode. The reason it took about 10 years is because that's the timeframe that Henry Markram set for himself and a large team of scientists, given enough funding, to build a full simulation of a human brain in all its gnarly detail and complexity. That aspiration was first articulated in a 2009 TED talk by Henry, a TED talk that drew Noah in and sparked what has become In Silico. Noah was generous and allowed myself and Brain Inspired Patreon supporters to screen the film.
Speaker 2 00:02:18 And he agreed to record this episode live with a bunch of those Patreon supporters, to field their questions and mine about his experiences and the science and the people involved in the science. So a different kind of episode on that front as well. Especially if you enjoy Brain Inspired, you'll enjoy In Silico, and you're doing yourself a disservice if you don't watch it. I'm actually recording this before I have the Vimeo link, but I will put that link in the show notes at braininspired.co/podcast/147, or you can just go straight to the film's website, which is insilicofilm.com, and you'll find links there. So I said that Noah is not a scientist, but you might not know that by listening to him. As you'll hear, he's always been interested in neuroscience and is quite knowledgeable about it. So enjoy. Noah, Noah Hutton. In Silico. Congratulations. First of all, congratulations on the film, and a huge thank you for sharing it with my podcast supporters. You were gracious enough to give us a screening, and now many of us are here to ask you questions and learn more about the film. So thanks. Speaker 1 00:03:28 Thank you so much for having me. I'm an avid listener of your show, so I'm really honored to be here, and I appreciate you all watching the film. Speaker 2 00:03:37 So, uh, I guess this will come out either the day of, or maybe the day after, the release of the film, but are you gonna have like a premiere or an event? Speaker 1 00:03:48 There is an event in New York City on the premiere date, so that's September 13th, and Sandbox Films, the company that financed the film, mostly (we also got some funding from the Alfred P. Sloan Foundation), is going to host this event at the Angelika theater in New York. So I'm excited for that. I've never seen the film with a real audience.
We had a virtual theatrical run, which was a big COVID term, and I had two films come out during COVID: this one, and also a narrative sci-fi feature called Lapsis. And both of them had these virtual theatrical runs where I never got to see them with an audience. So I'm particularly excited to be able to be in a room with people watching the film, cuz that's how I'm used to releasing the work, and it's been a funny time to do it all virtually. Speaker 2 00:04:40 And I'm really naive about how film works, but is it gonna be in theaters after that, in some limited number of theaters, or is it just gonna be the premiere event, or is that something that gets determined after? Speaker 1 00:04:51 It might have been, but because we did this virtual theatrical, that kind of blew our theatrical run. We were in theaters virtually all across the country. Yeah. You know, but they were hosting it on streaming services. So it won't be in physical theaters. That day it's released on streaming platforms, so people can access it on iTunes, for example, on Apple TV. And the one that I'm pointing people to the most is Vimeo On Demand. I don't know how many people watch or rent movies on Vimeo, but it's the only platform for us where the film will be available all over the world, so it's sort of the widest reach. So, Vimeo On Demand on September 13th. Speaker 2 00:05:32 And what's after the premiere, what's the after party gonna be like? Speaker 1 00:05:37 <laugh> I don't know. You know, it's also a strange thing, having not gone out much, and I live in upstate New York, so it's sort of gonna be quite a shock for me to go into the big city after all this time <laugh> and have a party. I dunno. Some people might be well used to this already by now, but I've been up in the country.
Speaker 2 00:05:58 And, uh, last question about the premiere. Are you going to invite Markram? Is he gonna be there? Speaker 1 00:06:05 <laugh> I don't think so. We can talk about this more, but it's been a pretty icy response from him and the project to this film, which I totally expected and understand. Speaker 2 00:06:15 Well, so the film ends with you showing them making their own movie, which may or may not be a response to your documentary, which we'll get into more here in just a moment. But I mean, did that come out, or are they in production on that? Do you know? Speaker 1 00:06:35 I don't know. I've actually heard rumors at some of these virtual screenings I've been doing that they are working on a response documentary. I'd be eager to see that. I haven't heard anything more about it; I don't know when that film will come out. As you mentioned, at the end, not to give away too much, but at the end of In Silico, my film, there is a film that they are working on, and at the time it struck me as just another in a long series of promotional films that they make about their work. And, you know, that's how I got into making this film in the first place. I saw one of their, you know, stunning fly-through videos of this little piece of cortex, and I decided to make my film. So I understand well the power of the visualization; it's what got me involved in the first place. But I came to be quite critical, and we can talk more about that, of the promotional efforts and the emphasis on them. And I think that film that they're making near the end of In Silico is just sort of another one of those. Speaker 2 00:07:37 So you started off, I guess, as an undergraduate taking a bunch of neuroscience courses, and I understand that you took enough of those courses to have majored in neuroscience.
Should you have wanted to, is that right? Speaker 1 00:07:52 Well, no. I went to Wesleyan University, which is, you know, a liberal arts college, and I was actually an art history major, although I took as many neuroscience classes as art history classes. So I felt like I was double majoring, but I actually didn't get a neuroscience degree, because to get that degree you do have to do a year of orgo, a year of physics, a year of biology. And I didn't really want to do that. I wanted just to take the upper-level neuroscience classes, which I did, and I took like, you know, nine of those, I think, and I think you had to take nine art history courses to major. So I felt like I was double majoring, but I wasn't; I was just getting an art history degree. But it felt like a good preparation to go make this film, because I was taking a mammalian cortical circuits class, and we were assigned the Markram papers from the nineties, with Bert Sakmann and others. And, you know, I was already enamored by that work. And then, to see his TED talk, and to have that foundation of trust built, it allowed me, I think, to be swept away in a certain way. Speaker 2 00:08:59 But you didn't fall in love with neuroscience enough to be swept away with neuroscience, because your real passion is filmmaking, I suppose. And art. Speaker 1 00:09:09 Yeah, I guess my craft is filmmaking, but I think some people go to, you know, school for film. A lot of people go to film school and are swept away in a certain more direct way by film and film history. And I never got into all that. I never was, like, a film nerd, and I never got into the craft in that way. I was always trying to make films about the things I was interested in.
And so neuroscience was really the thing I was interested in. It was my intellectual passion, and film for me was a craft. And art history I never did anything with, so it's one of those stale liberal arts degrees sitting on my, you know, shelf. I don't know, I thought maybe sometime I would be, like, working at a museum or something, but I never went down that road. Speaker 2 00:09:52 Okay. So, well, you read these papers by Markram and others, and then you saw this TED talk. The film kind of starts with Markram's now-infamous, I suppose, TED talk, where he describes a project whereby within 10 years they will fully simulate a human brain. And you were, oh, well, optimistic, I suppose, in the beginning, and this sort of set off the beginnings of the project. Do I have that right? Speaker 1 00:10:24 Yeah, that's right. I mean, as I mentioned, I came out of reading these papers, and I went over there very wide-eyed. I'd also read, maybe listeners remember, there was this profile by Jonah Lehrer at the time, who was writing, I believe, in Seed magazine, and wrote a very glowing portrait of Markram as this kind of rock star of neuroscience, and had, you know, gone over there himself and had gotten a tour around their machine room. So I had read that, I had seen their videos online, I had seen the TED talk, and I went over there. And this came out of also a place of, I did wanna make a film about neuroscience. I had already made a documentary about the oil boom in North Dakota, and that was my first film. I made that as a senior in college, called Crude Independence, and I'd gone and lived in North Dakota for a while. Wow. And that Speaker 2 00:11:18 Was, wait, hang on.
Just, how was that experience, living in North Dakota? Speaker 1 00:11:22 It was, um, really the beginnings of this, like, wild west oil boom out there. It was the beginning of fracking and oil drilling in North Dakota, which continued, you know, and continues to this day. But people were lining up at dawn at the courthouse to, you know, get land deeds, and it really had this boomtown feel. And it was great for me to kind of cut my teeth as a new documentary filmmaker on my own out there, getting out of my east coast bubble, for sure. And, you know, I learned a ton making that film. But, you know, I wasn't necessarily, like, passionate about that subject; it wasn't my intellectual passion. I think I had seen, like, There Will Be Blood, you know, that movie. Speaker 1 00:12:08 And I think I was like, oh my God, this is like the real-world version of that happening. And that swept me away, you know, enough to make the film. But, um, no, I met some wonderful people out there, and I actually ended up making another film there in 2015 called Deep Time. So I kind of had this parallel track all these years of making these environmental films about North Dakota. But I really wanted to make a film about neuroscience. I wanted to make a brain documentary of some kind, but it's difficult, I found. Because, you know, when you go into making a documentary (certainly if you're writing a screenplay, you need to think about this), I think for a lot of people who go into documentaries, it is a little risky to go into it. Speaker 1 00:12:51
You need to write what the equivalent of grant applications are to get someone to believe that you actually have a film here about something. But for me, I was, um, with, with my brain documentary with, with the beatings of this film, I was self-funding and I was doing freelance work then in New York, doing commercials and so forth. And I, so it was very risky for me to think about, um, beginning a film with my own money, where, where was this gonna go? And I couldn't figure out with neuroscience stories where the ending would be. I, I, I really found it difficult to, to see out of anything any lab was doing, like, you know, in the discussion sections and conclusions of papers, there's, there's stuff dangled about, you know, curing a cure for Alzheimer's or this will lead to, you know, um, or certainly about consciousness. Speaker 1 00:13:47 You know, there's like the, the, the sort of like dangling fruit of will we, will we crack the secret of consciousness? I thought, well, I can make, maybe I can make a film about these things, but it, this was so far fetched and was always kind of punted over the horizon of whatever research I, you know, I was tracking that year or something. So I found it difficult to think about what would the third act be for a documentary about, about this subject. And it was when I saw this Ted talk in, um, 2009 by Henry, where it was the first time, I'd really heard a scientist plant a flag, uh, on, in terms of a timeline on a certain time in the future, when a great insight will be gained. And I, you know, in retrospect, listen, I was 22. Um, I was very, I, I, I was an, as I mentioned, I had read the papers and I was enamored by the scientist. Speaker 1 00:14:37 And I, I think I was much more easily swept away then than I would be now in my, you know, 30 something brain. But I, now that you're cynical and pessimistic <laugh>. Yeah, right. 
We, um, so I was taken by the idea that 10 years felt like long enough that anything could happen, but yet, for a 22-year-old, somewhat short enough that I could imagine finishing this film. You know, 32 felt like so far away for me, but still, a decade didn't seem crazy. And I thought, if I just stick with it, if he says he's gonna do this in 10 years, and if I go once a year at least and I track this, that's an interesting timeline for a film. And I liked the idea of longitudinal films. I was inspired by this series by Michael Apted, who had done the Seven Up series, if people are familiar with that, where he tracked a group of people every seven years for their entire lives and made this ongoing documentary about it. And so there have been some wonderful longitudinal documentaries that were kind of reference points for me, and I thought, okay, I'm gonna jump in and do this. I'm gonna go over there. The first year, I had done a music video for Joe LeDoux's band, the Amygdaloids. Speaker 2 00:15:48 Oh, I saw that he was like a producer in the credits, and I was wondering how that came about. So is that... Speaker 1 00:15:54 Yeah, so I was doing music videos and commercials in New York, and I was into neuroscience, so I crossed paths with Joe LeDoux. Speaker 2 00:16:02 Real cool. That's a real, real cool band, right? Yeah. Speaker 1 00:16:05 Yeah. So he's at NYU, and he had this band called the Amygdaloids, and I did a music video for them, which is on YouTube somewhere. And Joe knew Henry, you know, I think not super well, but knew him from the community of scientists working at a high level. And so he helped me get in touch with Henry. I had emailed Henry directly a couple times and he hadn't written back, so Joe helped me, vouched for me to Henry. And that's kind of how I got my first interview with Henry.
And then, um, I had just come from going to a, like, neuroaesthetics conference in Copenhagen with my camera. I was just so, like, wide-eyed. I was taking everything in; I was trying to see what I could make my film about. Speaker 1 00:16:52 So I think Henry was impressed I had come from this conference. And, you know, but I really was genuinely... I wasn't there to get the scoop or show the real Blue Brain Project or something. I really was genuinely going in thinking that this was gonna happen, and that I was gonna, you know, be there to document it along the way. I had not an iota yet of a sense of what the criticisms were. You know, I'm sure on that TED talk, if I had just scrolled down into the comment section, I could have seen some early criticisms. It probably would've helped me, you know, sharpen that side of the film a little bit earlier. But it took me a few years actually to even realize that there were criticisms of this project. So that's how into it I was in the beginning. Yeah. Speaker 2 00:17:35 So, I mean, you know, you were working under a lot of uncertainty, which must not have felt great at times, right? But so, okay, there's already a couple questions in the chat that I'll weave in here. One is just, you know, you went... was it, I guess it was 10 years, or was it beyond 10 years? Speaker 1 00:17:53 It went a little bit beyond. I mean, I did 10 visits to the project, so there was a nice sort of, like, wholeness in the end about that. It was once a year for 10 years, but, you know, it takes a couple years to finish. This was a big edit; I had a lot of footage. Speaker 2 00:18:12 Oh my God, I was gonna ask you about that. Yeah. Speaker 1 00:18:13 Yeah. So it took a good year to just get the film edited after all that, and then it takes a while to get the film out.
So that's where we are now. But at a certain point I got carried away, which I put in the film purposefully, cuz I wanted to be a little self-aware about the way I was selling my film to the world. When the Human Brain Project happened, and Henry Markram leads this team that gets a billion euros from the European Commission as part of this flagship award, I was really like, oh my God, what have I... I've really stumbled into it here. I've hit gold. I'm part of this even bigger project now; I'm making a film about this billion-dollar global project. Speaker 1 00:18:57 Oh my God. And so I went on MSNBC and I said, I'm making a 15-year film, cuz at that point I was five years in, and they had launched the Human Brain Project, which had a new 10-year timeline. So I thought, okay, wow, I'm making a 15-year film now. So I got carried away, and I wanted to put that in the film itself. But in the end, I went back to my original promise to myself and, you know, to Henry, which was a 10-year film, so I cut my 15-year timeline short. The Human Brain Project continues now, you know, and so does the Blue Brain Project. This was, you know, an arbitrary timeline in the end, set by the TED talk and then followed very closely by me, and maybe nobody else <laugh>. Speaker 2 00:19:40 Okay. So here's a couple questions from the chat already. So you did these 10 visits to the site. Someone's wondering how much time you would actually spend on site when you visited, and then I'll add: what was the nature of those visits? Were you having to try to pull people in, get people's attention, or did you have set meetings, that sort of thing? Speaker 1 00:20:06 Yeah, so I went 10 years in a row. Some years I did two visits.
So it wasn't an orthodox 10 trips and only 10 trips. There were several years where I went twice, because there were notable things that would come up. They would sometimes let me know about them, and sometimes I would realize that, oh, the yearly summit, you know, is happening this year in Barcelona or something; I'm gonna go there, and then I'm gonna go back with the team to Lausanne or something. So I would sometimes try to hit some notable event, to see them outside of their labs and their offices too, and there's a bit of that in the film. But yeah, when I would go on these trips, the first eight years I self-funded everything, and that was difficult. Speaker 1 00:20:55 Jesus. Because it's very expensive to travel to Europe, and especially Switzerland was quite expensive. I'm not sure what the exchange rate is now; the dollar is much stronger now <laugh>. But at the time, Swiss francs, it was just very expensive to get there and to stay for any amount of time. So I would freelance in New York, I would save up, and I could afford like a four-day trip, usually. I would stay maybe three nights, so I would usually have like three full days or so at their building. And, you know, they would help me create a sort of schedule of people to interview, and they had new people that might have joined the project that year. Speaker 1 00:21:41 They wanted to make sure I would include them. And, you know, it always toed a bit of a line. I was very careful to make sure that I was retaining my independence and my own control of what I was interested in, what I was filming.
I did not wanna become the de facto videographer for this project, who would come over and they would think was capturing something for them. And that is the case, you know: I did retain my independence. We never had any agreement that was signed or any kind of... Speaker 2 00:22:14 Was it implicitly understood from them that you might be sort of on their side? Speaker 1 00:22:21 I think they hoped that. I don't think I led them on in any way. In fact, we had to have a couple tough conversations, because in the first five years of this project I was releasing yearly updates. And I don't know if anyone had seen those back then, but at the time my film was called Blue Brain, and Scientific American premiered one, Vice premiered one. They were, you know, like 10- or 15-minute edits that I would make from that year's footage and post online. And the first couple probably felt to many people like they were just promotional videos for the Blue Brain Project, cuz they had no critics in them. It was just Henry talking, showing the visualizations, blah blah. Then, year three, I included... Speaker 1 00:23:06 and this all got thrown up in the air and re-edited for the film in the end, plus the end of the five years as well. But I had Sebastian Seung's interview in year three, and that was the first piece of criticism, because Sebastian had included in his book, Connectome, a bit of criticism about Henry and the Blue Brain Project. I'd read that, so I had gone and sought out Sebastian and interviewed him. And then in that year's edit, I put that in there, and I didn't tell Henry and the project I was gonna do that. And I posted that publicly.
And they were, you know, taken aback by that, and their project manager at the time, sent by Henry when I got there, sat me down and sort of had a talking-to with me about how, if I was gonna include critics in the future, I would need to tell them about it, run the material by them. Speaker 1 00:23:57 Still, to their credit, they understood my independence and respected it and never challenged it directly. But I think they were, you know, a little frustrated that they had given me all this access, and I think that's fair on their part. They'd given me all this access, and then I had kind of blindsided them with this criticism. They were, you know, quick to tell me: we know all the criticism, it's not like this is new to us, but just let us know, because when you make your edit, we're gonna be in dialogue with these critics when you cut back and forth, and we wanna know who we're in dialogue with. So we had that, you know, friendly tension throughout the years, I would call it, of what I'm doing and what their awareness of what I'm doing is. And again, to their credit, they kept letting me come back to do what I was doing. Speaker 1 00:24:49 And in the end, you know, I wonder if they knew the extent of the criticism I had captured, because I didn't release anything the last five years. A lot of the material, the most critical material in the film, comes from people responding to the, you know, the open letter and all of the fallout over the Human Brain Project. And that was all material I just kept until the end. So we had a very interesting relationship along the way, but there was no implicit promise or anything about what this film was gonna be. Speaker 2 00:25:22 Okay.
So this leads nicely to the next question. I'm curious how you perceived the shift, if any, in Henry's attitude from year one to year 10, having spent a lot of time with him. I'm wondering because it struck me as a bit arrogant to be so confident about a 10-year timeline to build one of the most complex things known to science (in parentheses: just like Elon Musk's attitude with regards to brain-computer interfaces). Did you see a shift in how he approached his project idea slash vision as a scientist, and not as a Messiah? Speaker 1 00:25:53 Over the whole course of the film, yeah? Or the whole course of the 10 years? Speaker 2 00:25:57 Yeah. Um, wait, can I just say, before you answer that, because of the Messiah aspect: I don't know, my reading of Henry during the film was that he just remained sort of focused and confident, and that the salesmanship part of his delivery was definitely there, but I don't know the full extent of, you know, how arrogant and salesman-like he was. I don't know if you wanna comment on that, but, um, yeah, maybe just as a preamble to the answer to the question. Speaker 1 00:26:36 Yeah. Um, right. So you're wondering if there was material left on the cutting room floor? Are you wondering if I saw stuff that didn't make it in the film that spoke more to those qualities, or... Speaker 2 00:26:48 Well, no, this is from the chat. So that question's from the chat. But just from my reading of it, he seemed sincere and focused the entire time, and not bitter; he didn't seem terribly bitter. But okay, that was just my reading of it. Speaker 1 00:27:07 Yeah.
I, um... it's complex, and a lot of the dynamics of salesmanship are complex, because as a salesman, you often believe in the product. You believe in the possibility of the product. But you also, especially, I think, when you're garnering funding for a dream like this, a grand vision of where something could get if all of the variables line up: if the computing track continues on the way you expect it to, and you continue to get better and better machines that can simulate more and more detail, that's one track, you know. You also need to have access and to standardize on the informatics track; you need the data to fit in the right way, right, and be able to load massive amounts of data into the simulation. So there were all these tracks. And when he does the TED talk in 2009, does he know for sure that the tracks are gonna line up in 10 years? I don't... fuck. Speaker 2 00:28:04 No, fuck no. Speaker 1 00:28:05 <laugh> I don't think so. And I don't think anyone, you know, would think that he did. And I didn't... I mean, I trusted him, though, because of the papers and because of his career before that. I trusted that, like other scientists, he wouldn't say something unless he really did think that these tracks could land there in 10 years. And that's where I wonder if I was a little mistaken in that trust, because I think there's a degree of salesmanship that went on with this project over the years. Potentially there was a little bit of: we need to run out and get a bunch of funding so that we can try to get there, but we're not really sure if we could get there in 10 years. And all of the criticism I ran into at first, a few years in, was really that: that this was a crazy idea, that we could do a human brain in 10 years.
Speaker 1 00:28:57 And I continued to run into that criticism, and to this day, people who have seen this film and reacted to it think that's a crazy landmark to be thinking of in any kind of timeframe of a decade. You know, maybe you could get a generalized model of a human brain with very abstracted point neurons or whatever, but to think that the level of detail they were talking about and proposing in that initial proof-of-concept cortical column they simulated back in 2009 could be scaled to a human brain? I don't think anyone would believe that. And that's where it's complicated, because I'm not sure even Henry really thought that could happen in a decade. It's impossible. It's a fiction; you're kind of writing science fiction as you make the proclamation, in a way. Speaker 1 00:29:46 And you're maybe hoping that the tracks catch up and science fiction becomes reality. He might have really thought that it could be. In my time with him, Henry's thinking is much closer to the kind of futurism of someone like Ray Kurzweil than it is to the various traditional academic biologists, many of whom I interview in this film speaking about the Blue Brain Project. But that kind of futurism is a way of thinking where anything could be possible in a decade if we just all got together and worked on the same vision in concert. It's just not how the world works, though. So it's a little divorced from reality. I think that becomes problematic when public funds are at stake. When the salesmanship is funded from the largesse of one billionaire, and you want to go out and run and try to solve a problem, it's a little bit of a different conversation, I think.
Speaker 1 00:30:46 And it's interesting to see what the Allen Institute did with the money from Paul Allen. They made tools that have been widely used and are well regarded in the community as sort of cornerstone, foundational maps and so forth. So it's difficult, you know, but my point is that if they had gone out and tried to do the same project, I don't know if I would've been as critical. But when you're dealing with state-funded science, money coming originally from taxpayers, I think you open yourself up to the public gaze, and you should be responsible enough to absorb that criticism, to deal with it and respond to it. You should be in dialogue with it. So I think in many ways this film audits that kind of salesmanship with regard to public funds, in a great dream like this. But I've gone off on a bit of a tangent here; your original question was about how Henry himself changed. Speaker 2 00:31:49 Well, if you could, yeah. The sense you had of his change. Speaker 1 00:31:53 Right, right. Speaker 2 00:31:55 As a scientist, not a salesman. Speaker 1 00:31:57 So as a scientist, I saw Henry really continue to focus on the mouse brain the whole time I was working on the film. We never even got to the human brain; the work going on there continues to be completely focused on the mouse brain. They're still trying to work towards a full-scale simulation of that mouse brain, and the Cell paper in 2015 was their crowning achievement to date, from their point of view.
And you know, you had a guest on your podcast a few episodes ago who used to work on the project, who told you about this great calcium triumph they had, which I didn't put in the film. I thought it was a little too technical to get into in the film. Speaker 2 00:32:43 By the way, that guest will remain unnamed. However, I was going to bring up the Blue Brain and Human Brain Projects, and I was asked not to. Speaker 1 00:32:53 Yeah. Well, you still did. You still tried to, right? You still asked him a couple of questions anyway, whatever <laugh>. I was just interested that a lot of people don't wanna talk about this project. Certainly people who still work there, and people who used to work there. Listen, I kind of get it. It's been such a lightning rod for so many years for these people. And a lot of them are humble scientists who came up through the project and wanted to just do their work and move on with their careers. They shouldn't necessarily have to answer to this stuff directly. So that's fair. But okay, let's get back to the changing as a scientist. Speaker 1 00:33:38 So one thing that has changed that I've noticed is that there is no real mention of the human brain anymore. They've scrubbed it from their website. When this started, Henry gave the TED talk and it was human brain in 10 years; those were the headlines, that was the press they were ginning up. And there was a roadmap they were gonna follow: mouse brain, cat brain, primate brain, and then human brain. I saw PowerPoints with that roadmap laid out. That kind of got scrubbed and has evaporated and disappeared.
And now, I think Henry has realized that being more realistic and targeted with the milestones helps dissuade people from being so vocally critical towards it. Probably if he just talks about simulating the mouse brain, it's a little more feasible. Still a pretty big pipe dream, but a little more feasible than throwing out the human-brain-in-10-years timeline. That's been one change. Speaker 1 00:34:36 I don't think his core beliefs in the work changed. I didn't see him waver one inch. And I was looking for that in the last couple of years of making this film. In the last interviews I'm asking him: do you still believe in this project, in this attempt to simulate at this level of detail, the same way you did when you began? And the answer was always yes. And I assume it continues to be. You know, I think in order to run a project like this, and to continue to get the funding that they do from the Swiss government, you have to believe. Everyone around the project speaks to the vision of the project. There's an element of that leadership that has to remain inspired and positive, in order to keep the people around him believing in it and keep the funding coming. Speaker 2 00:35:43 I'm gonna jump in with a question of my own. And by the way, there are lots of questions in the chat, so nice job, guys, and keep those questions coming. Just a little anecdote. I think I've told this on the podcast before, but when I was interviewing for postdocs, I had a conversation with a fairly new faculty member. And he said that in the lab he came from, his advisor gave him this advice: just say crazy shit, and eventually you'll get the funding, and you might be right.
And that always stuck with me. Likewise, I had a friend growing up, in high school and in college, and it always bothered me so much that in a group conversation he would just overtake and get attention. He would just speak louder mm-hmm <affirmative>, and he would just yell above everyone else. It always really bothered me, and that problem exists in science as well. Essentially, attention seekers get attention. It's the same with children. And I don't know how to fix the problem, and I don't even know if you wanna comment on this, but maybe, reflecting on your experience, do you have the same sense that it's an open problem? And if so, how do we fix it? <laugh> Speaker 1 00:37:10 Yeah, I think you've put your finger on a universal problem with human psychology, but we can drill down on it in science in particular. You know, the scientists in your crowd probably know more about this than I do, in terms of the day-to-day pressures that you feel as a scientist to attention-seek. I do think scientists operate in a system that incentivizes metrics, pushing out publications, and being loud about the implications of your publications. So you seek attention for your metrics, and then write books that pump it up even further, probably. So I don't know. I don't design policy. Speaker 1 00:38:03 I don't know how to change that system, but just to affirm what you're saying: yes, people do that, but there is this sort of danger.
I think this is obvious to people, but when you don't deliver, when you over-promise and under-deliver, that's when public trust and faith in science can potentially erode. You can get all this attention for promising nice, shiny things, but then the stultifying slog of the realities of research sets in. And you're in institutions that, from my point of view (and I tried to include some of this critical context in the film), have a sort of culture of corporate bureaucracy around science in many ways. It's hard, I think, to take risks and play and fail in smaller ways. Many people are just pushed to be loud and produce, and to produce what I've heard other people call salami publishing: just churning stuff out, sort of mimicking what other people are doing, changing it slightly. Speaker 1 00:39:17 And listen, there's a mountain of science that comes out of that that is valuable. But there are maybe not as many null results; the sort of underside of that mountain, we could use to see a little bit more of the less loud, less flashy side. What about those experiments that didn't go the way they felt they would go? So I don't know. I'm not prescriptive with policy or anything. Speaker 2 00:39:46 Come on. You gotta fix it, man. Okay. So, another question here. Chris asks, if I remember correctly: what did it feel like when he did not sit down for the last meeting? He meaning Henry. If you could just briefly take us through that visit, please. Speaker 1 00:40:03 Yeah.
That was... so he did sit down for the last two meetings. The one where he stood me up was actually the third-to-last interview that I did with them. So again, to his credit, he came back even after that. Maybe just because he felt bad. And even that year that he stood me up, he made sure to do a video call with me to sort of catch up when I got back to New York mm-hmm <affirmative>. But yeah, it was tough to go over there again, self-funded at that point still, and make that trip and spend thousands, and then get stood up. And it was interesting, because we'd been in this rhythm of doing it for like seven years at that point. I was becoming increasingly critical with my angle, and I think he could probably feel that. And listen, I don't know how intentional it was, but from my point of view... Speaker 2 00:41:13 People Speaker 1 00:41:13 are busy. We had this on the calendar for months. You knew I was only coming for a few days. It was a tough one. And to me it sort of felt like my relationship with them was changing a little bit. Speaker 2 00:41:29 This is from Rokus: I like how you pinpointed that simulating imperfections is super hard and might be the necessary ingredient.
Do you think using computer science terms to explain neuroscience concepts is harmful for the field? By the way, you don't have to answer any of these questions if it's gonna give away too much of the film. Feel free to keep it in your pocket, of course. Speaker 1 00:41:54 <laugh> I would love an example of the kinds of computer science terms that are in that Speaker 2 00:42:05 question. Okay. If you wanna jump in, Rokus, to clarify. Speaker 1 00:42:08 Because something that I talk about sometimes is... I worry sometimes, and listen, your show is called Brain Inspired; this is a two-way street. So we were talking about computer science inspired by brains, and now, in the context of this question, brain science where people are using computer language to talk about it. I actually sometimes bristle at the overuse of brain language in computer science. I think that people sometimes dignify their computer science to too large a degree by extracting the language of biology, which has been hard-won by millions of years of biological evolution, and just plopping that onto the network they've designed and saying: this thing is a neural network, and it learns. And it's like, man. Speaker 1 00:43:06 Wow. Okay. So evolution took millions of years to do that thing, and you've done it in the computer. So I think people just need to be a little careful about bringing the language of biology into computer science. And I get it, listen, it's a little bit of a stickler move to say, well, it's not a neural network. I get why people call it that.
And of course, people calling the point neurons in their deep learned networks "neurons," using the word neuron, it's like, wow, a neuron has a lot more going on with it than what you have going on in that model. And so, yes, there's a degree of what neurons do and how they fit into networks, and that's why we're using it. Speaker 1 00:43:51 But I worry about what it leads to, and this is why I'm a little critical of it: it potentially overinflates our sense of where we are with the way in which we're capturing biology in machines. And so one of the things that the question pulled out of the film, and I intentionally put this line of criticism in the film about imperfections (or "tiny mistakes," as I call them in the film), is something I was hunting around for in the 10 years of making this film: how are they capturing true variability and chaos from biology, stochasticity? What are the models to capture that in a computer simulation of biology? Speaker 1 00:44:41 And, you know, I found in the end that yes, they use very state-of-the-art random number generators to generate some of that noise, and they call it jittering in the Blue Brain simulation. But ultimately a scientist tells me in the film that, well, we can never know the right kind of variability. And I just found that to be a pretty revealing answer. So ultimately, if you don't know the right kind of variability, then you're really creating a model to do what you want it to do. You're injecting a bit of jitter, just enough. Okay.
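For readers curious what the "jittering" described above might look like in practice, here is a minimal, purely illustrative sketch. This is not Blue Brain code; the model, parameters, and function name are hypothetical. It drives a leaky integrate-and-fire point neuron with input spike times that have been shifted by seeded Gaussian noise, the crude stand-in for biological variability being discussed.

```python
import random

def simulate_lif(spike_times_ms, jitter_ms=1.0, seed=0,
                 tau_m=20.0, v_rest=-70.0, v_thresh=-54.0,
                 v_reset=-70.0, w=8.0, dt=0.1, t_max=200.0):
    """Toy leaky integrate-and-fire neuron driven by jittered input spikes.

    Each input spike time is shifted by Gaussian noise ("jitter") before
    delivery. Returns the output spike times in milliseconds.
    """
    rng = random.Random(seed)  # seeded, so a run is reproducible
    jittered = sorted(t + rng.gauss(0.0, jitter_ms) for t in spike_times_ms)
    v = v_rest
    out_spikes = []
    idx = 0
    for step in range(int(t_max / dt)):
        t = step * dt
        # passive leak toward resting potential
        v += dt * (v_rest - v) / tau_m
        # deliver any input spikes due by now as instantaneous voltage jumps
        while idx < len(jittered) and jittered[idx] <= t:
            v += w
            idx += 1
        if v >= v_thresh:
            out_spikes.append(t)
            v = v_reset
    return out_spikes
```

The point the interview makes is visible here: the shape and scale of the noise (`jitter_ms`, Gaussian) are choices the modeler makes, not facts recovered from biology, so changing the seed or the distribution changes the output spike train without any principled way to say which version is "right."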
That seems somewhat realistic, but really, I'm gonna fit this model to the question that I, as the human researcher, want to figure out. And it's important to realize, to draw a bit of a line in the sand, that you have not necessarily captured the actual variability that has driven random natural selection over millions of years. Speaker 2 00:45:30 Hmm. How did you debate how much neuroscience to put in the film versus making it more accessible to the general public? Speaker 1 00:45:40 Yes, this was a difficult balance to strike. I had to put a little bit of basic neuroscience in the film to be able to reach that general public audience. I wanted this to not just be a film for the field. But I also wanted the film to appeal to the field, and to have conversations like this with people who think deeply about this in their professional lives and are very knowledgeable about the subject. I don't know if we hit the right balance, but yes, that was an active debate in the edit room. Do we need a little bit of a primer here on what axons and dendrites are? Probably yes, we need a little bit of that. Speaker 2 00:46:22 Gotta do it. Yep. Speaker 1 00:46:23 Do we need a little section on models? Yep. We gotta have a little bit of a thing about what models are. But man, there could have been so many more primers. This film could have been so much longer. It was a real challenge. I always wanted it to be less than 90 minutes. For a documentary, I don't like asking people to watch something longer than that. And listen, that's my own thing.
I think people these days are used to watching things maybe longer than that, but that was my challenge. Yeah. Speaker 2 00:46:53 I mean, you know, like all good stories, it's about people. Yes. And so I guess you had to have the science in the background at least, to sort of extract where people were coming from, perhaps. Harry says: I like the sequence where they debated using 2030 versus 2050. I guess this is toward the end of the film. I wonder if you think Blue Brain could have gotten funded with a 30-plus-year timeline. I guess that wasn't on the table, though, was it? I mean, it was a 10-year grant that was being awarded from the government. Speaker 1 00:47:28 Yeah. So the 10-year grant was the Human Brain Project, where there was actually money earmarked for a specific decade window of a project. The Blue Brain Project, which continues to this day, and I wasn't privy to their inner finances, but my understanding was they were actually funded just yearly, and they had yearly reviews. Or maybe it was every two years; I'm not sure exactly the pace of the reviews. But I don't believe the Blue Brain Project was technically funded for 10 years from the beginning. They were funded on a reviewing cycle. It's more that the dream that was set out, of what this project would do, was 10 years. The actual benchmarks had to be reviewed by whatever committee would come in and look at it. So I don't know that the excitement would've been generated by saying we are gonna do this by 2030. That would've felt, in 2009, like so far off. I just don't think you get the same kind of Speaker 2 00:48:34 response.
Well, you see in the film, I believe Henry says something like, well, 2050 is just too far away. Let's bump it Speaker 1 00:48:43 up. Right. So that would be the equivalent. That's how 2030, I think, would've felt in 2009. In that scene they're sort of talking about the session titles for an upcoming summit. And so I thought it was very interesting to hear them talking about time again. At one point he had been talking about 10 years from 2009, and now here they are. Speaker 2 00:49:05 But that's pure sales. Speaker 1 00:49:07 That's right. Speaker 2 00:49:08 That's not even, yeah. Speaker 1 00:49:10 Yeah. It's like, what feels better? What just feels closer and more tangible? It doesn't really have anything to do with what can actually be accomplished by a certain date. Yeah. Speaker 2 00:49:22 At what point, Chris asks, did you become aware that they were kind of doing a bait and switch, dazzling you with the pinstripes rather than the progress and performance, so to speak? That was year three, right? Speaker 1 00:49:35 Year three is when I started to realize there are critics out here. But to be fair, I didn't yet think, in my mind, of a bait and switch. And I also don't know that I would go so far as to call this a bait and switch. People look at this film and they look at Henry, and when they don't know much about it, or they're coming into seeing my film for the first time, I think there's an expectation, given what's gone on in our world in the last decade, of a kind of story like an Elizabeth Holmes type story, a Theranos type story, of a real snake oil salesman who was selling a product that was completely vacuous inside in the end. That isn't this story.
So this is more complicated. It is. Speaker 2 00:50:18 Yeah. Speaker 1 00:50:19 And yes, as I just mentioned, they have been reviewed very regularly by very serious scientists. I've just seen that there's so much politics in science. I mean, my goodness, the people who come Speaker 2 00:50:32 Isn't that so disheartening? Speaker 1 00:50:34 <laugh> It's just incredible. I'll go and talk to scientists who think the whole thing is a fraud and that the people who come in to review it are in the back pocket of this project. It's amazing, the stuff I've heard that people won't tell me on camera, because of course they want... you know, it's like, if I were some muckraking investigative journalist writing an article, maybe I could have gotten those in quotes. But again, that's their feeling; I don't know that there's proof of that. I didn't find any proof of that. And people making claims that there were journalists who were friendly to this all those years, and stuff. I don't know. I think I never felt a true bait and switch, as the question implies, but I did start to feel like, oh wow. Speaker 1 00:51:23 There's a critical response out there that I wasn't familiar with in the beginning, and that I was becoming more familiar with. As of year three of this film, I really felt the fallout after the Human Brain Project open letter. You have over 800 scientists speaking out against this thing in an open letter. That was when I thought, wow, is this completely hot air? And should I stop making this film? Oh yeah.
I really thought about stopping after all that, because that was really when my dream was crushed about being there. And honestly, it was more about: there's no third act. There's no end here. I'm gonna just continue making this film, and there will be no moment of the thing I had dreamed of, which is that at the end of the decade, the switch is turned on and the simulation comes alive. That was my own science fiction dreaming, you know. Speaker 2 00:52:20 But how close did you come to quitting? And were you working in a vacuum, like super solo? Were you in constant contact with your producers and editor? Did anyone try to dissuade you from continuing, or did people try to pump you up and say, come on, man, you can do it, something will happen? Speaker 1 00:52:41 Just my family. I had no one else. I wasn't working with anyone else on this film for eight years. I was completely solo. Speaker 2 00:52:50 Did you ever cry? Speaker 1 00:52:51 <laugh> I definitely had some dark flights home, from feeling like I got no footage out of that year. You know, nothing interesting happened. I had to kind of lean into that, into the kind of tantric quality of time in this project. There was nothing going there. I mean, listen, from their point of view, it was very exciting. Every year there were things going on; they were getting closer to a model of the entire mouse neocortex, which they continue to work on. And I would come home and be like, well, okay, that might be true for them, and they seem very earnest about it, but I was here to make a film about a 10-year simulation of the human brain, and we're not gonna get there.
Speaker 1 00:53:34 And so I had to kind of fall from that, in a very classic arc, like a coming-of-age story in a way. My heroes had to fall from grace for me to decide what I was interested in. And the rebirth for me, in climbing back into this project, was getting interested in these questions of chaos and variability. That was what got me back in intellectually mm-hmm <affirmative>: to explore how they were capturing that, at this level of detail, in a simulation. And once I got interested in that, I felt like I could now explore something in a genuine way each year again. So yeah, I didn't have anyone. I had wonderful producers on this project, but they came in in the last couple of years of the film, once we got funding and I could actually afford to pay people to work on this project. And then I worked with another editor at the end, too. So at that point there was no quitting anymore. We had funding. We were gonna finish the film. Speaker 2 00:54:34 How did you know when it was the end? Speaker 1 00:54:37 I just decided, somewhat arbitrarily, that I was gonna do 10 years. But I still needed to feel like I had that last trip to the Blue Brain Project that tied it up. By year nine, I could feel the ending. There's a particularly potent moment for me in the film. I don't know if it resonated for other people, but for me it's a potent moment: we have a long sequence of people, including Henry, making the 10-year promise over and over again. And Kurzweil is in that sequence too. Yeah. Forecasting where we're gonna be by certain years.
And I just wanted to show that there is a Silicon Valley streak in the kind of salesmanship that Henry was doing all of those years, and the kinship with Kurzweil. Speaker 1 00:55:28 And so it resolves with Henry kind of saying, well, a 10-year project, a 15-year project, a 20-year project, well, what do you think? He was kind of asking me. And we end there on that scene, and it moves on to, I think, the final year after that. So honestly, at that point I was like, okay, we're wrapping up here. It's kind of all arbitrary in a way, these timelines. I needed to just decide how my story wraps up. And part of that journey, too, was realizing I needed to put my own journey into this film. I did not think that from the beginning, and my other two documentaries, which I made about North Dakota, do not have my journey in the film. I was allergic to it. Yeah. Speaker 2 00:56:10 My daughter, who's nine, asked me to ask you... I asked her if she had any questions, and that was her question: why you decided to put yourself in the film. Sorry to interrupt. Yeah. Speaker 1 00:56:19 Good question. I didn't want to. I was allergic. I just never liked documentaries where the filmmaker decides that their story is as important as, or more important than, whatever's going on in the film with the subjects they started following initially. So I didn't wanna do it. And I realized by the end that there's something we talk about in filmmaking: the reliability of a narrator. We often critique an unreliable narrator, for example. And this is something I started to think about. I was like, who's gonna narrate? Who's telling the story of this film? If I'm not telling the story, it's gonna have to be Henry, really.
I mean, he's the one I have the most interviews with. Speaker 1 00:57:06 I'm gonna have to just keep showing interviews with him, and he'll narrate where this project is each year. But because he maintained his positivist belief in this project the whole time, and continues to this day, as far as I understand, to believe in it, and hasn't become self-critical or self-effacing at all, I felt like I had to be self-critical and self-effacing in this story, in terms of my original salesmanship and awe, and sort of being swept away by the salesmanship. I had to confront that myself in the film and be self-reflective about it. And I didn't know how that was gonna happen unless I jumped in and became a character, in a way, in this film. It was really because I felt like I had to be critical, and I had to be self-critical. Hmm. Speaker 4 00:58:01 Could I jump in with a question about the salesmanship? I'm wondering, especially because you mentioned that montage of all these different people saying 10 years, 10 years, 10 years for different projects, although I think they were all computer science related. I wonder how much of it is salesmanship and how much of it is just the perennial human flaw of underestimating how hard things are in the short term, but then also underestimating how impactful they can be in the long term. It seems like five or ten years (or twenty, if you wanna be conservative) is always the timeframe for the thing, whether it's electric cars or whatever. So I'm wondering if you think it really was salesmanship, in the maybe slightly dishonest sense, or if we are just bad at estimating timelines and always feel like 10 years is the right number. Speaker 1 00:58:59 Yes.
I think actually you have a good point there. Ten years, listen, it felt like the right number for me too. As I said, it was long enough that anything could happen in my film, and short enough that I could imagine actually doing it. And I think that's probably how Henry felt about his promise: long enough that, gimme that amount of time, I might do it, but also not too much time, so you're not gonna forget about it. <laugh> Like, you're gonna keep giving me funding to do it. So there's a lot of truth in that, psychologically. I'd love to see some sort of study about why the 10 year horizon is attractive in that way. It probably has some proportional relationship to our lifetime or something. You know, the way we think in decades and generations; a lot of our thought is organized around that. Um, totally arbitrarily, right? But it's a great point. Speaker 2 00:59:55 The thought just occurred to me a moment ago that your mental journey parallels that of a lot of people in neuroscience PhD programs. You get beat down, sort of. You come in wide-eyed and think you're gonna really do something, think you're gonna learn a lot about the brain. And then you start to focus on these narrow questions, things don't work, and it kind of beats you down. So it's interesting that there are people like Henry who remain super optimistic. I wouldn't call them wide-eyed, but they seem to have escaped this sort of beat-down that a lot of us go through. Anyway, that was just a comment.
Um, <laugh> you know, thinking about timelines also: I had an idea for a business, I don't know, five or 10 years ago. It was a business to hold construction companies accountable for their timelines, because in construction, and in film, and in all the sciences, you're always over budget and past the deadline, but there doesn't seem to be any accountability for it. Speaker 2 01:01:09 And I don't know how I would make a profit in this business. I don't know who would pay for it; it would have to be publicly funded. But, um, yeah. Did you go over budget and over time? <laugh> Speaker 1 01:01:22 Um, you know, there was no budget. I was self-funding. Speaker 2 01:01:27 It was your budget. Speaker 1 01:01:27 Yeah, it was my budget. So I couldn't go over budget and still pay my rent in New York, it's so expensive. But on the timeline, I did renege on promises. I said 15 years, and then I ended up coming in under that; I rebooted <inaudible> arbitrarily with the human brain project. Um, and then we got funding; we ended up getting these two grants, one from the Sloan Foundation, one from the Simons Foundation through their entity Sandbox Films. And I'm proud to say we didn't go over budget with our grants. We stayed under, and we have a little bit left over for marketing now, which is nice. <laugh> So, yeah. Speaker 2 01:02:12 Okay, so back to the science a little bit. There are two questions that are kind of related here, so I'm gonna ask them both, and then you can comment on both or either. One's from Michael: I was curious, did the final brain simulation do anything productive, and how long could they run it for?
And then a related question from Hannah: how close did they get to matching the number of cortical columns to the scale of the mouse brain? Speaker 1 01:02:38 Yeah, those are good questions. So, I'll just reframe it a little: there wasn't a final simulation in my film, even, that I captured. It's ongoing. They don't have a full simulation of a mouse brain yet, and that's what they're working towards still. I've stopped really following the research so closely, honestly. I felt like I did my part for 10 years. Speaker 2 01:03:00 You're really out. Speaker 1 01:03:01 I'm not gonna continue to monitor their releases and everything. It's okay; I've done my part. But to answer the question a little bit: the last major release of work they had was really the Cell paper in 2015. That was when they released their cortical column simulation and showed what it could do. In that paper, which is vast (it's a huge paper, actually much more about the methodology of doing the simulation, with a huge methods section), they were able to basically mimic some classic electrophysiology papers from in vivo work. That was how they showed it, and I think why they got the paper published eventually. And as we mentioned before, people can listen to the previous episode with a former researcher at blue brain: they were having an issue with calcium levels. Speaker 1 01:03:57 They realized that in vitro and in vivo calcium levels are actually slightly different. They made a tweak in the simulation, and all of a sudden the simulation is behaving. Well, not behaving, but showing...
It's not an animal, but it's showing signatures of electrophysiology that you would expect to see in a typical cortical slice, like the one they simulated. So that's what it does, that's what it did. That was the big moment, as far as what I captured. But people were even skeptical of that. I think people are skeptical of how much parameter fitting happens: if the calcium thing could be tweaked so easily, what else could be tweaked to get this kind of excitation they're showing, and these little oscillations they're showing? I don't think there's anyone who thinks it's some sort of scam, as we were talking about, like it doesn't actually do this stuff. Speaker 1 01:04:54 I just think people are skeptical of the significance of what this model will actually be able to reveal about the brain. So they haven't had results that have, from my point of view, broken new ground and reshaped our understanding of the brain in any way, in the way that, you know, a discovery like grid cells and the hippocampus, something monumental like that, did. Some sort of foundational work has not come out of this project yet. Not to say it won't, but that's basically as far as the work got that I captured. And then in terms of the question about the columns, my understanding is, you know, there is even controversy about the neocortical column as a unit. People are skeptical that the column, which comes originally from work by Vernon Mountcastle, is actually a unit that could be meaningfully scaled up in a simulation like this.
Speaker 1 01:05:50 Um, and that maybe there is actually more intricate stuff happening at mesoscales, where there are different types of geometric formations and topologies, and it isn't just this, like, clean column. When you see their simulation, and there's some of this in the film, it really feels like they're treating the column as a microprocessor; they're treating each one of these things as a sort of chip, and then they're gonna scale it up to the size of a neocortex. And listen, that might work. We'll see. I have no idea. No one really knows yet. But it's just important to point out that that isn't even a widely agreed-upon concept, that the column is the foundational unit in the neocortex. Speaker 2 01:06:37 Let's say it's fairly agreed upon, but not universally. Speaker 1 01:06:40 Okay. Not universally. Yeah. Speaker 2 01:06:43 Uh, Milan says: I really appreciated your metaphor of Cajal at the end. I'll say Ramón y Cajal, Milan; you're supposed to always say the Ramón y with the Cajal, is what I learned at some point. Uh, did you spend some time with Henry's family and son? Speaker 1 01:06:59 I didn't get to spend time with his son, 'cause he lives in Israel, I believe, and I never made it there for the film. But I spent a bit of time with him and his family in Lausanne: his two daughters, and his wife Kamila, who's in the film briefly. They were very warm to me, and, um, yeah, I was at their home; I spent some time with them. Speaker 2 01:07:28 Uh, so back to science real quick, and I don't know if you'll have an answer for this. Sammy asks: how scientifically relevant does the community consider those oscillations that emerged after reaching a certain scale?
Speaker 1 01:07:42 I can't speak for the community, but I can speak to the response to that work that I included in the film. When that work came out, Cori Bargmann and Zach Mainen were quoted, Zach Mainen, you know, being one of the fiercest critics of this project, coming from his original co-authorship of the open letter with Alexandre Pouget. Speaker 2 01:08:08 He was very enjoyable in the film, by the way. Speaker 1 01:08:10 Yeah, he was a wonderful interview, and I thought he had great stuff to say. And for me it was nice to interview an American over there; he was in Portugal, so I went to Lisbon to interview him. But, you know, I just noted that he was less than impressed by what they had shown, the oscillations they had shown in the slice. And Cori Bargmann also sort of said, oh, it's a nice start, but there's a lot to do before this airplane takes off, or some quote to that effect. So I did monitor the response a bit, and I felt like it was tepid. I felt like the community didn't feel this was a revelatory signature of life or something inside the simulation. I didn't get that sense, and maybe people can correct me if they feel otherwise. But I felt like people saw this as a magnum opus of methodology, of how to do a hyper-detailed simulation, not necessarily that the effects coming out of it were all that revelatory or impressive. Speaker 2 01:09:26 Okay. So, by the way, in a weird twist of fate or coincidence, next, I guess on Monday, I think it's on Monday, I'm interviewing someone from the human brain project.
<laugh> Someone who gets funding from the human brain project, and who does this kind of super detailed, bottom-up modeling. So it'll be interesting. I'm gonna send him this video, and it'll be interesting to get his reaction to the movie as well, for sure. It would be fun to do another one of these live things with him, but I'm not sure that's gonna happen. Speaker 1 01:10:02 Just to say briefly before you move on: I have found, in some of the early screenings we've done with this film, that some people have thought this is, like, the film about the human brain project. It really isn't. I only went into the human brain project briefly with Henry, when he was steering the ship there, when he proposed the idea and went through all the hoops to get it funded. And a lot of people give him credit for that, rightly so. I mean, he led a neuroscience project to winning this flagship grant that could have gone to any other field, and a lot of scientists continue to get funding from it and do really interesting work, as I'm sure your guest will talk about. So Henry did that. Speaker 1 01:10:44 There were a lot of problems people had with his leadership style, and it's a mixed bag, as much of this is. But I didn't continue to follow the human brain project in any way. As soon as Henry was mediated out of his leadership position, I went right back to the blue brain project and continued to make my film about that. And I don't purport to have told that story; the human brain project is its own can of worms, and it has retooled itself in what looks like a more promising direction, and they're doing other work now.
So I just wanted to say that. Speaker 2 01:11:13 <laugh> And generating methodologies and technologies that are clearly gonna be useful. Okay, this is a comment from Anna, and then I'll follow up with a question I was gonna ask you anyway. She says: I'm in a social environment where people are very confident about, quote, short AGI (artificial general intelligence) timelines. So this is back to the timelines. I'm curious to see how that will actually develop. So my question for you is, future-projects-wise, I know that you're kind of taking a break from neuroscience documentation, but, you know, the AGI community is filled with many colorful characters and very smart people, of course. So maybe it's not a question, maybe it's a comment: that would be interesting. You should be the 10-year-timeline documentary person. Speaker 1 01:12:07 Auditor. Speaker 2 01:12:08 Auditor, yeah. Okay, that's a good way to put it. But do you have a sense of the AGI community? Is that something you're paying any attention to? And thinking about your experience with this neuroscience-based project, do you have opinions or perspectives on the AGI community and their timelines? I guess, you know, you put Ray Kurzweil in the film, so... Speaker 1 01:12:39 Yes, right. Um, I do pay attention to it a bit. I mean, I find myself agreeing with the skepticism of Gary Marcus more than, you know... and people could probably expect that, given the tone of my film. I understand why people... again, it goes back to the question before, about the psychology of these 10 year horizons.
I just saw a little Twitter debate open up the other day between Gary and someone, maybe from OpenAI, but maybe... Speaker 2 01:13:10 He's always debating someone; it's like 17 people at a time. Speaker 1 01:13:14 That's right. Someone who was willing to place a bet that AGI would be here in 10 years. And Gary said, okay, well, what do you wanna bet? And he ended up just saying, like, a t-shirt or something. So I was like, okay, that's not much of a bet. But I think I would love to make a film about that. It's tough to think about doing that now; I'm a little burnt out from doing this. Speaker 2 01:13:43 I dunno, I can... Speaker 1 01:13:44 I am paying attention to it, though. I wouldn't have anything to say about it that your listeners don't know already. Just that I definitely am a skeptic, and come at it from that point of view. Speaker 2 01:13:56 But I think I've heard you mention that you're interested in sci-fi right now, in producing sci-fi. Speaker 1 01:14:02 Creating sci-fi, right. So I wrote and directed this narrative feature called Lapsis, which came out last year. And I'm writing more sci-fi now, and that is sort of what I'm heading towards next. But I do want to go back and forth, as time moves on here, between documentary and fiction. I find both to be rewarding in different ways. So I would love to make another documentary at some point. I probably need a little beat, a little time off from it. And then, you know, if there's an AGI story to be told, I would love to get involved, although I might have a somewhat critical take on the timelines. Speaker 2 01:14:41 <laugh> There are always AGI storytellers.
I don't know if there's a story to be told, however. Speaker 1 01:14:44 Right, exactly. Speaker 2 01:14:46 Okay, so this takes me to my next question. This podcast is ostensibly about, although we talk about a lot of different things, the intersection of neuroscience and artificial intelligence. And I don't know how much you have followed the artificial intelligence boom since, you know, 2012, the deep learning revolution. And now you have all these large language models that are just scaling and scaling. So, just generally, do you have a perspective on the artificial intelligence push? I know that you are sympathetic with Gary Marcus, but that's AGI, which we were just talking about. And then I'm curious... so the deep learning revolution, quote unquote, took off in 2012, although there were many advances before that as well, but that's when convolutional neural networks really came on the scene. Speaker 2 01:15:37 And from then there have been other milestones and landmarks. And, you know, neuroscientists are using these artificial intelligence and machine learning models to, say, try to understand how the brain is doing things, to look at the dynamics and compare the activity of the models to brain areas, et cetera. Was there any talk about that within the project? Did AI ever come up as an alternative means to understand our brains? Because the flip side of that, you know, is these super detailed simulations, right? In some sense you could say it's the flip side. Were there any conversations about that? And then, I don't know if you just have a general perspective on it. Speaker 1 01:16:24 Yeah, it is. It is the flip side.
So what did come up often was a real skepticism and a disdain for what was seen as the overhyping. And it was so funny to hear that; it was sort of like, yeah, but you guys think they're overselling something? That's interesting. So there was a lot of criticism from them, and I put a little bit of it in the film, about Google's big triumph with AlphaGo, you know, beating that player at Go, right? I put a little bit of that in the film, because I thought it was interesting to see their disdain for this project that was announcing this amazing achievement in AI. Speaker 1 01:17:08 So there was talk about that. And going back even further: in the beginning, when I first got involved in the project, and this was my first whiff of criticism, though I didn't even take it seriously, was when Henry Markram was beginning and was really involved with IBM, 'cause they were supplying the machinery. There was an IBM researcher named Dharmendra Modha, I believe, who had come out around that time with what he was calling a point-neuron model of a cat brain, or something; there was some sort of point-neuron model he had come out with, and it was getting a lot of hype. Henry wrote some sort of open letter about Modha's point-neuron model, being very critical of it. And I remember it because it was an early taste of what became an even wider rift between these two projects. Speaker 1 01:17:59 And these two efforts: a really hyper-detailed simulation of the brain from the bottom up, and a deep-learned convolutional neural net model that was extracting principles of biology, as we've talked about, but wasn't worrying about glia and vasculature <laugh>.
So, um, I don't know. I think that might be a bit of a false rift, as you've identified: there are ways in which the tools of AI now will be able to cross the rift and help. And I've heard of that from other labs; it's not really something I follow in this film. But, you know, automated techniques to improve understanding of the brain, basically foisting AI upon... I know this is something you talk about a lot on the show, but using the techniques of AI to improve biological neuroscience, to speed things up and to automate things so that humans don't have to spend hours at the bench doing them, and all of that fun stuff. Speaker 1 01:18:53 I will say, something else to note here, which happened a bit after my film was done but might be interesting to your audience, is that Henry ended up doing a spinoff company called INAIT. They have a website up people can check out; it's I-N-A-I-T, and it's an AI startup he has in Lausanne, where he's using insights from the blue brain project to basically... it feels like, from their website, what they're doing is creating a sort of Watson-styled AI model that could be useful for businesses. I don't know if they have any actual use cases yet or anything. But it's not a Watson; it's not a deep-learned network. Speaker 1 01:19:42 I don't know exactly what it is, but they're sort of marketing it as different from AI, because the insights are generated from these hyper-detailed models of real brains. So Henry is now dabbling, it seems, in this sort of booming business of business services from AI. And, you know, there's a long history of publicly funded research being spun off into patents.
And so I also noticed that Henry now has a few patents he's filed related to the blue brain simulation. You know, there's a long history of that, of scientists spinning off into private businesses and filing patents and what have you. But I also noticed, as I talk about in the film, that a number of researchers have left blue brain and moved to AI. Eilif Muller, who was the head of the simulation platform, moved to Mila in Montreal, and a couple of others have now gone to work more in AI. So there is actually this sort of highway between the two, and I think this rift is, you know, more of a competitive energy than a real academic or intellectual rift; there's a lot of crosstalk going on. Speaker 2 01:21:04 Uh, do you still consume, enjoy, and/or believe TED talks? Speaker 1 01:21:13 Uh, I haven't watched a TED talk in maybe a decade <laugh>. So I actually don't. Listen, I don't have anything against them. Maybe it's a little dangerous: if I watch another TED talk, I'll make another 10 year film. I don't know. I consume more podcasts these days. <laugh> Speaker 2 01:21:36 You're gonna be drawn in? Yeah. It's interesting, because right now in neuroscience there's this... well, at least from the slice that I see, I talk with a lot of people who are into these low-dimensional dynamical structures, and it's all about how to relate those to cognition. And, you know, I've been in and around enough to know that there's this kind of hype right now in neuroscience.
And because there's been a lot of traction using these kinds of methods to relate populations of neural activity to ongoing behaviors and cognition. But I'm immediately skeptical, and I don't know how to articulate my skepticism, but it feels good that I'm skeptical, because I don't want to make a 10 year documentary about it. I'm also excited about it. That's kind of the way I feel about artificial intelligence too, and neuroscience writ large. So, um, yeah. Do you sense that you have a more measured discernment when things are being hyped up, just in general? Speaker 1 01:22:46 Yes, I would definitely say measured, if not... I am just more critical now. And I don't think I'm the type of person who would start making a 10 year film again. Yes, when you do have that kind of criticism as a filter for what comes in, you have to check that too. It's important to check your own criticism and your own bias, and not overlook something that is, you know, very... I listened with care, because I really respect her work, and it's someone I've been interested in for a long time in neuroscience, who I've felt is very different from Henry. Speaker 2 01:23:25 Eve... Speaker 1 01:23:25 Marder. Ah, you got it. I thought you might. Yeah, that's Eve Marder. So I listened with care to your recent talk with her, 'cause I just think she's fantastic. And, you know, I was interested in what she had to say about modeling and about where these machines might take us. And she seems a bit on the positive side, surprisingly. Speaker 1 01:23:48 Yeah. Yeah.
So, um, that's nice to hear, and it actually made me think to myself for a second, like, I gotta be open to this stuff still. Because, you know, look at this amazing scientist who's stayed focused on the same little network for all these years, and it's only little from our point of view, it's a vastly complex network in the gut of a lobster. She's stayed open to the possibilities of machine learning and modeling, and she's continued to use modeling in her work very effectively. So it's important to stay open. My criticism actually comes more from the sort of cultural and political implications of some of this work, where I worry that the AI hype machine kind of leads to a culture where these things are used for, um, you know, labor surveillance. Like, the actual applications of AI in our world have so much more to do with the ways in which the marketplace, our economy, and the way labor is organized these days are broken. Speaker 1 01:24:55 And I have much more sympathy with the kind of ethical criticisms of AI, and I hope those are taken more seriously as time goes on too, and aren't just ethics for ethics' sake, sort of nice dressing on top of the AI cake, but actually get baked into the cake itself. Speaker 2 01:25:17 Uh, something else that has bothered me, and again, this is just more of a comment: something that's bothered me is hero worship in science. Because, you know, there is hero worship across all different sectors of society. But I think especially in AI, talking about the godfathers of deep learning, people worship, like, Geoff Hinton and LeCun and stuff.
And I have a really measured take on that as well. I think, you know, admiration is one thing, but it's borderline worship in many cases. And did you, within blue brain, see any kind of hero worship of its leader? Speaker 1 01:26:00 Yes. And this is probably important for continuing to do speculative work. I think, as a human being, when you're doing work, and I felt this with my film too, it's hard not to... this is why I was releasing stuff in those first five years. You wanna have a dialogue with other people in your community. You wanna feel like you're helping them, by providing insights through the podcast you're releasing, or the episodes of your yearly film update you're releasing. And as a scientist, I can imagine, although I've never done it myself, you wanna feel like the work you're doing has an actual impact in your lifetime, in the community of humans you live amongst. And when you're doing such a long-term, speculative project, one that's building a model from the bottom up over so much time, and is starting with a mouse, and you might not even get to the... it's like a cathedral, you know? You might not, in your generation, ever see this thing finished. Speaker 1 01:26:56 What keeps you in it? What keeps you bound to the work? And I think it is a deep part of our psychology, that hero worship, the following of a powerful leader, is something that can keep you doing work. And I think we've all felt that to a degree. And it might feel a little dirty after the fact, like when you get out of it and you go, oh God, what was I taken with there? But I think we've all been there, and I did feel a bit of that at the project.
I mean, everyone talked with this same vocabulary about the vision of the project. And there's actually a wonderful anthropologist named Tara Mahfoud who did her dissertation on the blue brain project. Speaker 1 01:27:43 And she was there, overlapping with me a bit. So she's written a very academic text, but nicely academic; it's able to be much more detailed than my film can be about the kind of thing we're talking about: the sociological dynamics within that project, the leadership style, and this use of the vision as a kind of shining light on a hill that everyone was walking towards. And I think that happens with these figures you're talking about: it helps us walk towards something that is undefined. Speaker 2 01:28:18 All right, Noah, it's been almost an hour and a half here. Before my last question, and if anyone else has any more questions, put them in the chat or just speak up: is there anything else from the film that you'd like to highlight? Joys, concerns, more terror? <laugh> And then I'll ask you a final question. Speaker 1 01:28:39 Um, no, I think we've covered a good deal of it. And in fact, I don't wanna give away too much, 'cause I would like people to still go out and watch this movie we've been talking about. Yeah. Speaker 2 01:28:48 Yeah, yeah. It's a great film. And again, I wanna thank you for letting my podcast supporters screen it. I got a couple of emails already saying it was a great film, and people are appreciative that you did that. So thank you for that.
My last question to you is: this film has taken you a really long time, and you're kind of done communicating with the Blue Brain Project, and you're kind of done with the neuroscience for now, but you're going to release this film. I know you've done a lot of screenings, so you've already gotten a lot of feedback, but it's gonna go out to a wider audience. Are you prepared for blowback? And maybe my question also is, what has been the nature of the feedback? Has it been half positive, half blowback, or has it been a small proportion of blowback? And are you prepared to reengage, and I know we've already just had this conversation, but reengage with those kinds of discussions? Speaker 1 01:29:50 Yeah. I feel like when people watch a film and wanna engage in good-faith criticism or a dialogue, I'm here for it. They've watched the film, they've engaged. I owe it to people to engage a bit. So I'm not just gonna walk away from this. And if there are conversations to be had, I'm more than glad to have them. I have had a good deal of feedback already, yes, because we've been doing a lot of screenings around at universities and other institutions for about a year now. And they've generally been warm, receptive audiences, with some probing questions about my methodology, how I went about it, and my relationship with the project. And that's all wonderful. Certainly the toughest thing has been the dialogue with the Blue Brain Project and with Henry, and I think you're kind of asking about that in a way too.
Speaker 1 01:30:50 That is difficult, but I had a feeling it was coming, because I had seen, over the years, every time there was something written, every time a journalist went in and tried to do a piece. And usually the way journalism and documentaries are done is on a much shorter time scale. So they come in, they do interviews, they write a piece in Wired or wherever. And that happened many times over the time scale I was there making my film. So I got to see other journalists come in, talk to Henry, talk to other people, write their piece. And there wasn't much positive press over that decade, to be honest. And it culminated in Ed Yong writing a fairly critical piece not too long ago, also calling out the fact that this 10-year promise didn't come to fruition, just to remind people of that. Speaker 1 01:31:40 So anytime something like that would happen, I would hear from Henry and the project when I would get over there the next time: ugh, these journalists, no one gets it right. Basically a sweeping critique of journalists trying to understand the project; no one was taking the time, no one was really engaging with the science, and they all got it wrong. And I had a feeling the same would come for me at some point too, even though I was treated a little differently because of my time scale. I was the one who hadn't released the thing yet; I was still trying to understand; I kept coming back, you know. So when I did release my thing, they had a very similar sharp response, directed right at me. And I sent the film, before I had locked it, Speaker 1 01:32:25 to Henry, as a sort of good-faith agreement I had with him, where I was gonna show him the film, really, for me, for fact checking more than for his editorial feedback.
I wanted to know if I'd gotten basic things wrong, and they pointed out a couple things, like, oh, you used this clip of a celebration, celebrating the wrong moment. And good, I changed some of that; I made sure I fact-checked and got that stuff right. But they had much more sweeping editorial feedback about my take on the project and my interpretation of, let's say, the tepidness of the reception of their Cell paper, stuff like that. Which is, you know, they purported to know the objective truth of how their project stands in the world. Speaker 1 01:33:11 There is no objective truth about subjective responses to this work. And I'm not purporting to have made an objective film. I'm someone who believes documentaries are always, to a degree, subjective. Every time you make a cut and assemble something, you're bending time; you're putting someone's words after another person's words, changing the context they're heard in. And that is the editorial process. It's subjective. This film is my point of view, and I anticipated the blowback. It happened. I believe Henry really doesn't like the film. There was almost a little bit of a legal threat, although I feel very secure in the way I went about it, and it doesn't come even close to slander. It's not that. Like, yeah. Speaker 1 01:33:59 Yeah, I mean, I think that they feel like it is, but sure, it's sensitive. So yeah, I'm pretty secure in feeling like it isn't. And I'm also buffered by conversations I had, and hope to have, with Henry, and with someone like Eilif Muller, who's in the film and was Henry's right-hand man for many years. Eilif, you know, left the project.
And when he viewed the film, he texted me right after. He said, you know, this is tough for me to watch, but I think you did a really good job. And we had a dialogue for a film festival, an hour-long debate, kind of, about the film. And that was actually wonderful. I love doing that, and it's productive, cuz people get to see both sides talking to each other. Speaker 1 01:34:42 The behind-closed-doors, threatening sort of animosity isn't helpful, because I feel like I did stuff above board and was respectful. And by the way, I always credit them for keeping the door open for me and giving me access, cuz that's a difficult thing to do too. Yeah, I mean, if you're a scientist giving a documentarian access for 10 years, boy, that's a big ask. So they did that. They let me in, and they're to be credited for that. So I anticipate maybe some more criticism, sure. But it can't be harder for me to deal with than the fallout I've had with Henry and the project. Cuz after 10 years, it's hard to see it just crumble like that. But I kind of had to do it for the independent angle that I took with the film. That was the sacrifice. Speaker 2 01:35:32 Noah, impressive. It's an overused phrase, tour de force, but kudos to you for sticking it through. What a hell of a project, man. So congrats again. I hope the premiere's fun; send me a picture with a drink in your hand at the after party, please. So thanks again, and I appreciate you being here. Speaker 1 01:35:55 Thank you so much, Paul. It was a great chat, and thanks for having me on. Speaker 2 01:36:14 I alone produce Brain Inspired. If you value this podcast, consider supporting it through Patreon to access full versions of all the episodes and to join our Discord community.
Or if you wanna learn more about the intersection of neuroscience and AI, consider signing up for my online course, Neuro-AI: The Quest to Explain Intelligence. Go to braininspired.co to learn more. To get in touch with me, email [email protected]. You're hearing music by The New Year. Find them at [email protected]. Thank you for your support. See you next time.
