I can’t wait for you to watch this episode of Are You Human, where my guest is Reed Albergotti, the Technology Editor at Semafor, a New York Times bestselling author of Wheelmen: Lance Armstrong, the Tour de France, and the Greatest Sports Conspiracy Ever, and a former Wall Street Journal reporter.
We talk tech, AI, the future of journalism, sports scandals, Lance Armstrong, getting people to talk on the record, prostitution, public opinion manipulation, and everything else you’d want from a good drama movie – except that it all happened for real.
Transcript
00:00:00.080 I never really thought that proving that
00:00:02.360 he doped was something worthwhile
00:00:04.880 because I thought you know you’re never
00:00:06.480 going to get like the true inside story
00:00:08.599 it’s always going to be circumstantial
00:00:10.120 evidence and he can always deny it and
00:00:11.840 he has his army of followers and so why
00:00:14.759 even why even bother so we focused on
00:00:17.640 the business of of Lance Armstrong right
00:00:20.840 I I wrote a story about how he actually
00:00:23.480 tried to along with some of his his uh
00:00:26.519 wealthy businessmen who backed his team
00:00:29.119 he tried to buy the Tour de
00:00:39.360 France and eventually we did it we broke
00:00:41.879 the story with Vanessa O’Connell who was
00:00:44.480 my co-author on that book we broke the
00:00:46.680 story and that was the beginning of the
00:00:48.640 end for
00:00:52.480 Armstrong are you
00:00:55.640 is that a motorcycle helmet in the
00:00:58.079 background are you yes but I’m I’m a
00:01:00.640 passenger oh you’re the passenger okay
00:01:03.079 yeah and I can see your book there in in
00:01:05.560 in the background oh yeah I have to
00:01:07.439 promote got to promote that that’s
00:01:09.840 actually that’s what we are going to
00:01:11.080 talk about um so finally finally finally
00:01:14.799 we managed to make I know I’m so sorry
00:01:17.520 about that the other day you know my
00:01:19.640 internet after that my internet was
00:01:21.880 going out like once a week it like
00:01:24.119 really yeah they kept screwing up I went
00:01:26.320 outside I heard them they were working
00:01:28.159 on it out in the street and I heard
00:01:30.479 saying oh this is a
00:01:32.520 show something was wrong I don’t know I
00:01:35.040 have a Starlink satellite over there
00:01:36.759 sitting in my every couple days I’d have
00:01:38.560 to go out and put the Starlink out there
00:01:40.399 to like get internet crazy okay why
00:01:42.600 don’t you use it all the time you know
00:01:44.960 it’s still better to have cable you
00:01:47.840 know I mean I guess I guess you could I
00:01:49.880 guess you could but it’s it’s it’s more
00:01:52.159 expensive and slower than cable Internet
00:01:55.200 it’s like it’s like
00:01:56.479 $150 a month you know okay good fair
00:02:00.320 fair enough uh so I hope they gave you
00:02:02.799 at least a discount or some some kind of
00:02:05.439 uh extra yes they gave me a discount
00:02:08.598 of
00:02:10.000 $5 doesn’t quite make up for the $150
00:02:12.879 not
00:02:13.599 really so you want to look out for
00:02:16.160 a different provider
00:02:17.879 huh I mean it’s a monopoly there’s only
00:02:20.959 one where I live it’s terrible oh yeah
00:02:23.800 you can always move to Europe it’s a
00:02:25.879 it’s much better with internet a little
00:02:27.760 better yeah yeah yeah I’d love that
00:02:30.840 actually yeah if you can get me an EU
00:02:32.440 passport that’d be
00:02:33.920 great you know like Brits after brexit
00:02:37.519 suddenly they found their
00:02:40.159 great-great-grandparents living in Ireland right
00:02:44.239 totally yeah yeah I go to Italy I have I
00:02:47.360 have an Italian last name so I’ll go to
00:02:49.360 Italy and exactly that’s
00:02:52.560 what
00:02:55.599 and no no no I’m not really Italian but
00:02:59.800 they came from Italy like so so long ago
00:03:02.920 okay so but you can do 23andMe and
00:03:06.319 then you can just claim yeah that you
00:03:08.319 are a close relative if that’s what I have
00:03:10.920 to do to get a passport I’ll learn
00:03:12.840 Italian it’s fine I’ll do it you know
00:03:15.159 there are some other ways like I guess
00:03:16.799 you are married right so that’s that’s
00:03:19.400 kind I’m married to I’m married to an
00:03:20.879 Australian so um so uh I guess you don’t
00:03:25.319 know how I found you it wasn’t about the
00:03:27.360 book but I then I started reading about
00:03:31.120 what what kind of work you are um doing
00:03:33.840 and it’s it’s on my list I haven’t read
00:03:36.640 it I I’ll be honest but it sounds
00:03:39.239 amazing from the reviews and everything
00:03:41.080 but we will talk about it how I found
00:03:43.400 you I I signed up to the Neuron you know
00:03:46.159 this um newsletter about AI and and new
00:03:48.480 tech and then there was some interview
00:03:50.760 about uh like you you did with with the
00:03:53.640 guys and yeah and then and then I
00:03:56.239 started reading a little bit uh of of
00:04:00.040 your newsletter as well that’s why I I
00:04:03.840 saw that one landed earlier today so I
00:04:06.920 have to still uh read it then I checked
00:04:10.560 your book why did you uh why did you
00:04:13.400 choose this subject and why uh Louis
00:04:16.918 Armstrong yeah I mean sorry L Lance
00:04:19.839 Lance yeah Louis Armstrong is great too
00:04:22.759 the other guy he didn’t cheat as far as I
00:04:25.280 know so I was at the Wall Street Journal
00:04:28.680 uh I basically started my career at the
00:04:30.680 Wall Street Journal and then we started
00:04:32.960 a sports page there and I was sort of
00:04:36.120 you know roped in to covering Sports it
00:04:38.360 wasn’t uh it wasn’t like I had set out
00:04:40.680 to be a sports reporter but it was like
00:04:43.120 Hey you know this is a good opportunity
00:04:44.759 to kind of like move up and you know
00:04:47.120 become a become a real reporter at the
00:04:49.240 journal I was a news assistant at the
00:04:51.320 time and it really didn’t start out as a
00:04:55.759 story about Lance Armstrong we were
00:04:57.400 looking into you know just corruption I
00:05:00.440 guess in in the Olympic you know as as
00:05:03.000 you probably know there’s like a lot of
00:05:05.080 corruption in the IOC and it goes back
00:05:07.960 many many years and so there was this
00:05:10.639 you know the UCI which is the the cycling
00:05:14.320 governing body within the Olympics had
00:05:16.840 all these issues and I did a bunch of
00:05:19.720 reporting on it um you know it was Lance
00:05:22.880 Armstrong was actually retired at the
00:05:25.199 time that I was doing this reporting and
00:05:28.160 everybody sort of knew that he had doped
00:05:30.440 it was kind of a shock to me that in the
00:05:32.800 US I think his reputation was like
00:05:36.000 so great and people were talking about
00:05:37.720 him you know running for governor of
00:05:39.440 Texas and becoming a you know president
00:05:41.720 one day and all this stuff and I
00:05:44.680 realized that like actually we had just
00:05:46.759 totally like the American Press had
00:05:49.560 totally ignored this story and it which
00:05:52.400 in Europe I think was different I think
00:05:54.240 people he was much bigger yeah people
00:05:55.800 were much more skeptical of of him and
00:05:59.600 so I kind of thought well it’s been
00:06:02.080 covered the guy’s basically I mean the
00:06:04.759 information is out there right it’s just
00:06:06.639 a question of whether you choose to
00:06:08.759 believe it or not to like to look at the
00:06:10.720 truth right like take the green pill or
00:06:12.440 the red pill or whatever since he was
00:06:14.639 retired it wasn’t I I thought well
00:06:16.639 there’s not much of a story here and
00:06:18.759 then he decided to come out of
00:06:20.560 retirement and so I had all this
00:06:23.000 reporting in my notebook about him and
00:06:25.919 so we just started writing about him and
00:06:28.919 I never really thought that proving that
00:06:31.280 he doped was something worthwhile
00:06:33.720 because what are you going to do like
00:06:35.840 you know you’re never going to get the I
00:06:38.000 thought you know you’re never going to
00:06:39.120 get like the true inside story it’s
00:06:41.120 always going to be circumstantial
00:06:42.479 evidence and he can always deny it and
00:06:44.199 he has his army of followers and so why
00:06:47.080 even why even bother so we focused on
00:06:49.960 the business of of Lance Armstrong I
00:06:52.800 wrote a story about how he actually
00:06:55.160 tried to along with some of his his uh
00:06:58.199 wealthy businessmen who backed his team
00:07:00.800 he tried to buy the Tour de France um
00:07:03.479 and turn it into this league and of
00:07:05.360 course like the subtext of all this is
00:07:07.080 always there’s always a doping subtext
00:07:09.680 right like I think they felt like well
00:07:12.120 we should make cycling a Pro Sport like
00:07:14.879 you know kind of like the NFL or Major
00:07:16.639 League Baseball in the US where drug
00:07:18.680 testing is not as strict like you can
00:07:20.680 pretty much do whatever you want in in
00:07:22.639 these Sports and you if you know the
00:07:24.960 right protocols that you know you can
00:07:26.680 get away with it and I think and if you
00:07:28.960 even if get caught I think a lot of
00:07:30.520 times you know it just gets sort of
00:07:32.240 rushed under the rug and that’s what
00:07:34.960 they wanted to do and I I just thought
00:07:37.440 all this stuff was fascinating but then
00:07:39.599 you know one day one of my sources said
00:07:42.000 you know Floyd Landis who was Lance
00:07:44.680 Armstrong’s lieutenant in the in the
00:07:47.000 tour who I had talked to before has gone
00:07:50.639 totally crazy he’s gone off the deep end
00:07:52.319 and he’s writing these emails accusing
00:07:55.000 Armstrong of of doping but like to a
00:07:57.759 small group of people and and that was
00:08:00.879 the that was the moment I was like oh my
00:08:02.479 God this is like like I never thought
00:08:05.120 any this would ever happen but of course
00:08:07.680 you know then I had to try to find the
00:08:09.360 emails and then you know Landis wasn’t
00:08:12.720 talking to anyone and you know I
00:08:15.319 couldn’t really had to then try to
00:08:16.919 confirm that the emails were actually
00:08:18.840 real and legitimate so it was this whole
00:08:21.520 like probably like month of reporting to
00:08:24.639 try to uncover this and eventually we
00:08:28.280 did it we broke the story with Vanessa
00:08:30.479 O’Connell who was my co-author on that
00:08:32.880 book you know we we broke the story and
00:08:35.799 it was like that was the beginning of
00:08:38.120 the end for Armstrong it was this huge
00:08:41.479 turning point and then there was this
00:08:42.839 Federal criminal investigation into the
00:08:46.160 team and it sort of shook a bunch of
00:08:48.680 things loose started because of the
00:08:50.600 book of what you discovered yeah I
00:08:52.640 mean basically it was sort of the
00:08:54.600 investigation was kind of into cycling
00:08:57.519 into you know doping cycling and then I
00:08:59.959 think after after we broke that story I
00:09:03.360 think it really kind of like shifted
00:09:05.079 focus and became about about about
00:09:07.640 Armstrong it was actually focused on
00:09:09.480 another team uh lower level team called
00:09:12.360 Rock Racing before that there were a lot
00:09:14.519 of crazy characters in that in that book
00:09:17.320 you know which is the best the the best
00:09:19.399 stories how did you make them speak how
00:09:23.120 how how did you get to the source and
00:09:25.920 and how did you make them agree to good
00:09:29.519 I mean that’s like the question of uh
00:09:31.160 journalism right it’s like how do you
00:09:33.040 get people to talk to you you don’t have
00:09:35.040 you don’t have subpoena power like law
00:09:37.320 enforcement officials um but I think
00:09:39.800 that you know first of all there were
00:09:41.800 whistleblowers in that in that case
00:09:44.440 there were people who really wanted the
00:09:46.079 story to come out Betsy Andreu um who
00:09:49.519 was the wife of one of Lance’s former
00:09:52.279 teammates people like that were really
00:09:55.200 you know willing to talk of course they
00:09:57.320 didn’t have all the information so you
00:09:59.160 do have to get you do have to get people
00:10:01.480 on the inner circle to talk and you know
00:10:04.200 Flo Floyd eventually you know he I think
00:10:08.200 he wanted to get this off of his
00:10:10.680 shoulders you know he felt the weight of
00:10:12.680 all of these lies you know kind of
00:10:15.440 bringing him down and I spent a lot of
00:10:19.640 time like going around and meeting with
00:10:21.279 him just off the record you know just
00:10:23.320 talking with him before he eventually
00:10:25.640 decided to do you know to go on the
00:10:27.440 record and and tell his story and it was
00:10:30.320 interesting cuz like I remember sitting
00:10:32.279 with him you know like he I met him at
00:10:34.800 his cabin in Idyllwild California which
00:10:37.519 is out in the out in the woods and you
00:10:40.000 know we’re sitting on his deck and he
00:10:42.800 just is like you could tell he was just
00:10:45.360 like a different he was he was this
00:10:46.880 weight you could almost see the weight
00:10:48.399 coming off of his shoulders you know he
00:10:50.000 just and then the story just starts
00:10:52.440 coming out you know and he’s talking and
00:10:54.519 it’s just like and it was we had like
00:10:56.680 basically it was almost like a three-day
00:10:58.560 long conversation where we would talk
00:11:01.399 all day and and you know a mix of like
00:11:05.160 talking and telling the story and then
00:11:06.760 also just sort of like getting to know
00:11:09.079 each other and you know the tape
00:11:11.279 recorder is running and then it stopped
00:11:13.079 and then you know and then we’d go meet
00:11:14.399 to the new location and um he didn’t
00:11:17.320 what he didn’t tell me was at the time
00:11:18.920 he had he was actually wearing in that
00:11:21.399 investigation into rock racing he was
00:11:23.680 actually wearing a wire like he he had
00:11:25.880 been like meeting with these guys
00:11:27.200 wearing like an FBI wire you know to try
00:11:30.320 to catch this like drug ring you know
00:11:33.399 and then and then I think the other I
00:11:35.000 think a lot of other people in you know
00:11:37.800 in that sport also wanted to talk you
00:11:40.920 know like they didn’t all go on the
00:11:42.240 record but they felt this was sort of
00:11:45.760 time to tell the story and and I think
00:11:48.440 also to kind of like if the story is
00:11:50.800 going to come out then maybe they should
00:11:53.519 just have their you know make sure that
00:11:55.560 their part of it is true right and that
00:11:57.360 they get their two cents in and and
00:12:00.800 then there’s also a bunch of ancillary
00:12:02.600 characters right there’s not it’s not
00:12:04.440 just like you can get you know you can
00:12:07.639 get information from you know secondhand
00:12:10.800 sources and then eventually use that to
00:12:14.000 to get the firsthand sources to tell you
00:12:16.320 things and yeah it’s that sort of thing
00:12:18.880 but I remember one of the anecdotes in
00:12:21.600 the in the story they would have these
00:12:23.279 training camps in in Austin Texas and
00:12:26.920 the point of this training camp in
00:12:28.360 Austin was really not to train it was
00:12:30.399 kind of like to you know get the
00:12:33.320 teammate like figure out who should be
00:12:34.920 on the inner circle you know of this
00:12:36.880 team right and so they would start off
00:12:39.320 they’d go for a bike ride with some
00:12:40.839 sponsors and then they would go to like
00:12:43.120 a party and then they’d go to like
00:12:45.199 another party and then as the night went
00:12:47.880 on they would go you know then they went
00:12:50.079 to a strip club called the Yellow Rose
00:12:52.360 and then they went back to Stapleton you
00:12:54.639 know his his agent’s office Bill
00:12:57.000 Stapleton’s office and you had this
00:12:59.120 other party where they were like
00:13:01.600 prostitutes and like cocaine and stuff
00:13:04.079 like that and like the you know the
00:13:06.480 people who ended up at that party I
00:13:09.199 think that it was sort of like they
00:13:10.519 could be trusted you know to keep the
00:13:12.800 secret and so it was sort of like this
00:13:15.079 test and so I thought it was kind of I
00:13:17.680 thought it was a really I thought it was
00:13:18.920 an important anecdote to yeah you know
00:13:22.079 to get into the into the story and I was
00:13:25.880 at the Wall Street Journal at the time
00:13:27.160 and we’re trying to get the story into
00:13:28.440 the Wall Street Journal and they’re like I
00:13:30.800 remember sitting in the office um with
00:13:33.560 uh with Jerry Baker who was the managing
00:13:35.720 editor of the journal at the time and he
00:13:38.040 was like this seems like tabloidy you know
00:13:40.639 you’re talking about like cocaine and
00:13:43.160 hookers and it was he was like worried
00:13:44.800 that it was like going to make the
00:13:45.839 journal look tabloidy and of course like
00:13:48.360 Rupert Murdoch had just had just bought
00:13:50.759 the paper and people might talk
00:13:54.399 people might say it’s the you know it’s
00:13:56.959 like Murdoch’s making the paper tabloid so
00:13:59.240 we had to make this argument that it was
00:14:00.720 an important part of the story because
00:14:02.360 it it wasn’t just frivolous it it had a
00:14:05.320 lot to do with like the psychology of of
00:14:07.880 the whole thing and I but the bar was so
00:14:10.800 high to get that published how many
00:14:12.600 people actually saw this and you know
00:14:14.800 can we trust Floyd Landis he had already
00:14:17.600 lied about his own doping at one point I
00:14:20.480 remember just calling anybody I could
00:14:22.480 possibly think of who might have been at
00:14:24.199 that party and then finally I just found
00:14:27.279 this one guy who was like a kid at the
00:14:30.639 time you know a junior bike racer and he
00:14:33.519 ended up at this party and he ended up
00:14:36.040 quitting cycling he wasn’t a pro racer
00:14:38.120 anymore confirmed it all it was like
00:14:40.000 yeah I was there and it was like he he
00:14:41.959 just didn’t think much about it and then
00:14:44.560 I remember him calling me later and
00:14:46.079 saying like oh I I want to take it back
00:14:49.000 like I’m just like sorry like the
00:14:52.600 information’s out like this is going in
00:14:55.560 but you know it was like that kind of
00:14:57.440 stuff just sorry that it was a long-
00:14:59.079 winded answer to your question but no
00:15:00.839 no no no it’s it’s it’s great and
00:15:02.480 actually that’s what I read like the
00:15:04.360 Atlantic said that um your anecdotes and
00:15:07.959 all the information you collected they
00:15:09.880 said a chilling tale and and they sound
00:15:13.519 like they were actually crafted in a TV
00:15:16.720 drama writer’s room because it is it
00:15:18.920 sounds like you know the even the what
00:15:23.880 what was the the movie based on on the
00:15:26.759 Theranos right or
00:15:29.279 even House of Cards kind of thing kind
00:15:31.279 of like it it reminds me of the in the
00:15:34.440 end it’s based all about on human
00:15:37.759 interactions trust and and mistrust and
00:15:41.279 um and I guess maybe those um those
00:15:45.120 meetings in those very controversial
00:15:47.759 places um could be a hook uh for for for
00:15:52.240 them to use against those people right I
00:15:54.920 don’t I don’t know how how they um got
00:15:57.920 to maybe record something but that’s
00:16:00.800 actually sometimes you know what happens
00:16:02.880 in politics right they then they like
00:16:05.839 opposition finds out or like the the
00:16:08.160 ruling party finds out that they they
00:16:10.959 were recorded yeah for sure and I do I
00:16:13.680 do agree I think this it was like yeah
00:16:16.560 it was like Theranos but like with with
00:16:18.759 like professional athletes and you know
00:16:21.160 and celebrities and Sheryl Crow and you
00:16:23.600 know it’s just and like all over the
00:16:25.800 world you know it’s just it was so much
00:16:27.839 fun to to cover yeah I I’m jealous like
00:16:31.319 you you got to live through this and and
00:16:33.720 speak with all those people like it’s
00:16:35.720 it’s an amazing story and and you did
00:16:38.040 you did a great job right because you
00:16:40.800 you recognized is it a bestseller in the
00:16:43.959 New York Times yeah it was a
00:16:45.720 New York Times bestseller yeah
00:16:47.600 yeah well done well done thank you so um
00:16:50.600 yeah is is there any other book in the
00:16:53.000 making I don’t know maybe there’s a book
00:16:55.160 on AI to be done that’s what I’ve been
00:16:57.480 writing about lately I think there’s
00:16:59.680 actually some similarities in the sense
00:17:01.639 that yes it’s it’s a very wide group of
00:17:05.959 just fascinating people you know with I
00:17:09.679 don’t know if there’s some deep dark
00:17:11.439 secret you know that everybody’s trying
00:17:13.240 to protect but I think that part’s
00:17:15.919 different I mean it would be great if I
00:17:17.400 could find something like that because
00:17:18.880 that always creates a great story but um
00:17:21.480 but there is tension right there is
00:17:23.599 tension
00:17:24.599 around you know what is the direction
00:17:27.000 that this technology should go
00:17:30.039 and are there is it a threat to humanity
00:17:32.160 and all of this stuff I think you know
00:17:34.280 some of that I think is a bit you know
00:17:36.320 some of it may be kind of silly and you
00:17:38.799 know I don’t I don’t really believe that
00:17:41.799 but but it does create some human drama
00:17:44.559 you know and I think any great story
00:17:47.120 ultimately is about people you know not
00:17:49.840 not artificial people but real people
00:17:52.280 yeah real people yeah and it’s it it is
00:17:55.919 a problem it can be a problem if uh
00:17:59.960 because AI is such a powerful technology
00:18:03.240 right that if it ends up in in the hands
00:18:06.480 of few then it can become a problem
00:18:09.760 because they direct how we think what we
00:18:13.200 read how we feel like it’s it’s it’s
00:18:17.480 Unthinkable yeah yeah I don’t and I
00:18:19.760 don’t think that will happen I mean I
00:18:21.480 think this is actually I think the the
00:18:24.679 direction that I think it’s going is
00:18:26.840 probably more open source
00:18:28.880 and you know these models themselves are
00:18:32.200 are probably kind of um commoditized I
00:18:35.880 think we’re going to see a lot of
00:18:36.919 disruption before we see concentration
00:18:39.679 of power I mean these these waves of
00:18:42.440 Technology they it’s it’s almost
00:18:44.720 impossible if you if you really think
00:18:46.400 about it it’s almost impossible for
00:18:48.640 incumbents to somehow hold on to their
00:18:51.559 seat as incumbents into the next wave of
00:18:54.280 Technology because it’s just so you know
00:18:56.960 it’s when truly disruptive technology
00:18:59.320 comes along you know big big companies
00:19:02.280 are just too slow to you know to be able
00:19:05.640 to capitalize and then it’s before they
00:19:07.799 know it it’s too late and I I know a lot
00:19:10.280 of these big companies even developed
00:19:11.960 the technology you know helped develop
00:19:13.440 it but I think there’s way more to come
00:19:17.000 into I think we’re only at the very
00:19:18.440 beginning of of this AI Revolution and
00:19:24.000 you
00:19:24.679 know I think people’s eyes
00:19:28.039 are not on the ball like yesterday it
00:19:30.720 was the announcement of Sora you know
00:19:32.799 the the OpenAI text-to-video model and
00:19:37.320 it’s really stunning I mean you look at
00:19:38.880 the videos and it’s like oh my gosh like
00:19:42.720 this is so real and it is it’s amazing
00:19:45.679 and I’m impressed that they’ve been
00:19:47.240 able to accomplish it at the same time
00:19:49.559 it’s kind of the shiny object right like
00:19:52.280 it’s a it’s a that is a a an
00:19:54.799 evolutionary product that comes from
00:19:57.360 that is you know as it is great but it’s
00:20:00.799 not a revolutionary product and I think
00:20:03.240 at on the same day Meta released their
00:20:07.280 V-JEPA model um or framework for for
00:20:11.240 training AI models and open-sourced it
00:20:14.720 right and I think and that got very
00:20:16.960 little attention right because it’s it
00:20:19.000 wasn’t very but I I read that and I
00:20:20.960 thought this is actually I don’t know if
00:20:22.960 this is if this in itself this paper
00:20:25.720 will be the some breakthrough but I
00:20:28.640 think it probably will have more impact
00:20:31.280 or it could potentially have more impact
00:20:33.280 than something like Sora because that’s
00:20:35.080 Sora is a a product that they’re putting
00:20:38.360 out on the market this is like a new way
00:20:41.520 a new Avenue toward even more
00:20:44.200 intelligent Ai and yeah you know and I
00:20:47.840 think that’s I think that’s what we need
00:20:49.360 I don’t think large language models are
00:20:52.159 are just going to get bigger and more
00:20:53.559 powerful and
00:20:54.720 eventually that’s the end I I think
00:20:57.080 there’s I think there are other
00:20:58.200 directions that the technology needs to
00:21:00.240 go first yeah um I’m not an AI
00:21:02.960 researcher but I mean this is sort of
00:21:04.679 like you can talk to I think you can
00:21:06.960 sort of talk to AI researchers and get a
00:21:08.799 sense of like the big picture and that’s
00:21:12.360 kind of like my my takeaway but you must
00:21:15.760 admit that you know tools like Sora will
00:21:19.919 or are already redefining how we think
00:21:22.799 of creativity and all the graphic
00:21:25.279 designers all the like movie directors
00:21:28.600 I don’t know if it will they will be on
00:21:30.679 par
00:21:31.679 but like people like us who don’t don’t
00:21:35.400 know how to create videos if they if we
00:21:39.679 have access to such tools and you know
00:21:42.679 we we can maybe create like a starting
00:21:46.440 point of of some some creative project
00:21:49.320 it’s funny because people people are
00:21:51.600 worried about this as far as like
00:21:54.520 disrupting Hollywood I I actually think
00:21:58.679 I mean people are worried about actors
00:22:00.159 being put out of business yeah I think
00:22:02.480 actually Studios should be worried like
00:22:05.600 I think Studios have somehow been able
00:22:08.600 to take the digital revolution in like
00:22:12.159 digital film and and all this stuff and
00:22:15.000 somehow like keep the power right and
00:22:17.760 and and the budgets the budgets of these
00:22:19.919 films just get bigger and bigger and I I
00:22:22.400 remember when the digital film re I’m
00:22:24.440 old enough I guess where like I remember
00:22:26.360 when it was like oh
00:22:28.520 imagine imagine when movies are
00:22:30.440 filmed on digital film film is so
00:22:32.600 expensive and it’s hard to edit you know
00:22:35.120 like but if you did it on digital film
00:22:37.520 it could democratize it and then anybody
00:22:39.320 could make a feature film and like of
00:22:40.919 course that didn’t happen right like you
00:22:42.760 I mean you have some independent films
00:22:44.440 and there’s Clerks but then look at
00:22:46.279 Everything Everywhere All at Once that
00:22:47.640 was a low-budget film yes that won best
00:22:50.360 picture so if anybody should be worried
00:22:53.480 it’s the studios because I think like
00:22:56.840 democratizing I think we’re not
00:22:58.480 there yet but democratizing special
00:23:00.480 effects is like that’s a studio issue
00:23:03.480 because
00:23:05.240 storytelling like that is the that is
00:23:07.799 actually what makes these things
00:23:09.039 valuable right like really great stories
00:23:11.120 I mean you I don’t know if you like
00:23:12.840 theater but I mean you can go to like a
00:23:14.919 you can go to the theater right and Off
00:23:17.039 Broadway where they don’t have any set
00:23:19.480 practically and it could just be five
00:23:21.640 people on a stage yeah and you can be
00:23:23.720 like riveted right because your
00:23:25.320 imagination does the work so I think
00:23:28.600 storytelling is just like you’re that’s
00:23:30.640 a human endeavor I don’t think AI is
00:23:32.760 going to replace storytelling I think
00:23:35.120 people are going to do that and all this
00:23:36.799 AI stuff is just is just the that’s the
00:23:40.039 set that’s the dressing around the story
00:23:42.400 so I actually see it as a positive I
00:23:45.440 guess in the long run for for for human
00:23:48.240 creativity yeah yeah I think we will see
00:23:50.960 more like more
00:23:53.559 individual uh projects right and I guess
00:23:56.200 it will spark further
00:23:58.559 creativity yeah for sure I mean like art
00:24:02.679 art drawing and all this stuff I mean
00:24:05.120 you know like there are probably great
00:24:06.679 photographers today who just could not
00:24:10.320 have been photographers in like the
00:24:12.080 1980s you know yeah like they just would
00:24:14.440 have never made it because they didn’t
00:24:16.400 have the resources you know and now like
00:24:19.039 anybody can just go if they can think if
00:24:20.840 they can dream it up and you know they
00:24:23.200 can probably afford to get the equipment
00:24:25.840 necessary to do it I I don’t know I
00:24:28.200 think it’s I think that’s a really good
00:24:30.200 thing like I can’t I I’m particularly
00:24:32.360 not great at like drawing things you
00:24:34.600 know but I have like ideas I have like
00:24:37.399 you know like I could I can now draw
00:24:40.600 things with the AI and come and and make
00:24:43.000 my ideas come to show it to somebody why
00:24:45.840 isn’t that good this is what I meant
00:24:47.880 yeah and I I get that like that then
00:24:50.880 people who are like good at that sort of
00:24:52.880 like physical act of putting pen to
00:24:56.000 paper and like translating
00:24:58.360 three dimensions into two dimensions and
00:25:01.039 that’s like an amazing
00:25:03.080 skill but I mean in the end like if we
00:25:05.919 can have machines do that part you know
00:25:08.520 like I think it just it just frees up
00:25:11.880 people with other skills to to try that
00:25:14.159 and that’s like the story of it’s the
00:25:16.440 story of uh Humanity I I remember in
00:25:19.799 high school taking a technical drawing
00:25:21.720 class I don’t know why I did but like
00:25:23.360 somehow I ended up taking this class and
00:25:25.360 I remember like the first half of the
00:25:26.840 class was like everything done by hand
00:25:29.120 and I was like terrible at it I’m like
00:25:30.640 how am I getting like a bad grade in
00:25:32.279 like technical drawing like this is
00:25:34.640 ridiculous and then and then like we
00:25:37.120 switched over to computers for the
00:25:39.159 second half of the class and we used
00:25:41.039 like CAD and then I got an A and it was
00:25:44.200 fine like because I didn’t have to worry
00:25:47.120 about drawing the straight line with my
00:25:49.039 hand and all this stuff because
00:25:50.240 computers could do that part and I could
00:25:51.640 just worry about whatever the math or
00:25:54.120 you know whatever the other so I think
00:25:56.440 you know that’s
00:25:58.399 you know I guess I see it as a like I
00:26:00.919 don’t know who’s to say
00:26:03.600 what’s fair or not fair like you know I
00:26:06.000 think it’s just the way think new tools
00:26:08.080 are created and that benefits like
00:26:09.760 different people with different types of
00:26:11.440 uh of creativity and brain power yeah
00:26:14.880 exactly and if there is such a huge
00:26:17.039 demand it means that you know it’s
00:26:19.399 needed like people people really uh want
00:26:22.600 to expand on on the cre on creativity on
00:26:26.600 on the creative part of
00:26:28.399 of the
00:26:29.320 thinking okay let’s go back to the
00:26:32.159 scandals since you you like to cover
00:26:35.080 that part um I hosted um as like two few
00:26:40.480 few few episodes um back um Professor um
00:26:45.480 Guido palaza Professor ofan he’s
00:26:48.679 a professor of business ethics and he
00:26:53.320 covers corporate scandals and Mafia he’s
00:26:57.000 Italian so uh that also helps um and he
00:27:01.760 says that ethics is a muscle and you
00:27:04.279 need to train it uh to remain strong and
00:27:08.000 obviously humans can be both evil and um
00:27:12.919 like they are capable of great and evil
00:27:15.720 things uh and it’s easy to cross the
00:27:18.240 line and since like right now what we
00:27:22.080 are seeing with the with the more and
00:27:24.240 more powerful uh tools and technology
00:27:28.279 do you think there will be always
00:27:32.679 balance between those forces I mean
00:27:36.159 like we’ve seen some scandals in in Tech
00:27:40.159 before it wouldn’t it wasn’t only uh
00:27:42.840 what what was the Cambridge Analytica
00:27:45.679 and Theranos and WeWork and all the um
00:27:50.080 larger and smaller scandals but what do
00:27:53.399 you think will like how can we maybe
00:27:56.519 different different question how can we
00:27:59.080 make sure that we are protected from the
00:28:01.200 evil
00:28:02.320 actors um because it’s so much easier
00:28:05.279 right now to to fake things Theranos and
00:28:08.559 all these scandals the mafia in in Italy
00:28:12.480 these are not failures of humanity right
00:28:16.399 there’s always going to be people who
00:28:18.440 are put into positions where they make
00:28:20.519 where they make unethical decisions
00:28:22.440 that’s that’s part of that’s part of
00:28:24.080 being human right like we’re all all of
00:28:25.799 us are always being tested in that regard
00:28:28.640 yes and some people are just going to
00:28:30.120 fail that test I think the those are
00:28:33.000 failures of
00:28:35.640 society it’s a failure of um you know of
00:28:39.880 our of of the US’s white collar crime
00:28:44.159 justice system if you look at the if you
00:28:46.440 look at the statistics in the Northern
00:28:48.799 District of California which is the US
00:28:51.480 attorney’s office that’s responsible for
00:28:54.320 prosecuting white collar crime I mean
00:28:57.000 they do do almost nothing really when it
00:28:59.360 comes to white collar crime they’re
00:29:01.080 they’re focused on you know drugs and
00:29:03.399 things like that immigration the vast
00:29:05.360 majority of what our of the US
00:29:07.000 Department of Justice does is
00:29:08.720 immigration and drugs everything else is
00:29:11.080 like almost nothing and if you look at
00:29:13.120 the number of white collar crime
00:29:15.039 prosecutions over the last you know 20
00:29:17.600 or 30 years I mean it has just declined
00:29:20.919 it is the opposite of hockey stick
00:29:23.480 growth it’s just plummeted so if if you
00:29:28.000 if you don’t fund you know regulation
00:29:31.559 and the enforcement of laws then you
00:29:34.080 will have more crime I mean it’s just as
00:29:36.320 simple as that so yeah I think we just H
00:29:39.320 that’s just these are just decisions
00:29:41.000 that we make as as a society do we want
00:29:44.559 do we want strong federal agencies and
00:29:47.799 law enforcement um to sort of keep
00:29:50.440 everything in Balance you know that’s
00:29:53.640 that’s we have to decide I don’t know
00:29:55.200 some people would say no I mean I think
00:29:56.799 if you ask like Republicans they would
00:29:58.640 say no like the the free market will
00:30:00.640 solve all the every problem and let
00:30:03.399 people roam free you know I tend to
00:30:06.399 think the evidence does not really
00:30:07.960 support that that that thesis um I but I
00:30:11.960 also think there’s a lack of
00:30:14.640 appreciation uh
00:30:16.760 for I think on I think we don’t we don’t
00:30:20.320 as a society also just appreciate that
00:30:22.320 there are different types of people like
00:30:24.440 you want you want people who are willing
00:30:26.440 to bend the rules and and drive and stop
00:30:29.240 at nothing to get to get what they want
00:30:31.880 because those people actually sometimes
00:30:33.919 accomplish great things right and but
00:30:36.279 you also want them you want them to be
00:30:38.240 kept in check right like the the point
00:30:40.200 is like you want to allow those people
00:30:43.000 to do what they do but within boundaries
00:30:45.600 like within certain boundaries and and
00:30:48.720 setting it’s like it’s like raising kids
00:30:50.720 I have like young kids right and like
00:30:53.279 you don’t want your kids to be little
00:30:55.080 robots and just do everything you
00:30:57.720 know never question authority and all
00:30:59.840 this stuff you but you you also need to
00:31:02.039 set boundaries and so it’s like this
00:31:04.440 kind of delicate balance I think I think
00:31:07.240 one of the big problems honestly is like
00:31:09.559 this this idea I don’t know if it’s as
00:31:11.760 big in Europe as it is here but like of
00:31:13.679 corporate ethics like there’s this
00:31:15.919 there’s idea that like companies need to
00:31:18.039 like behave ethically and you know I
00:31:22.440 just I just think that that is letting
00:31:25.080 regulators and law enforcement off the
00:31:26.880 hook
00:31:27.760 companies don’t act ethically
00:31:30.000 companies act in the interest of
00:31:31.880 shareholders and that’s actually a good
00:31:33.799 thing that’s the way it should be but if
00:31:37.039 you don’t enforce the laws and you say
00:31:39.679 companies are allowed to just do
00:31:41.000 whatever you know they’ll they’ll
00:31:42.760 they’ll regulate themselves they have
00:31:45.000 you know pressure from the public to be
00:31:47.080 ethical and protect the environment and
00:31:49.639 all this stuff well of course they’re
00:31:52.120 just going to do stuff that like looks
00:31:53.919 good you know that that makes good
00:31:55.960 marketing and they’re not really they’re
00:31:58.159 really ultimately looking out for
00:31:59.279 shareholders so I think it’s I think we
00:32:01.679 have a lot of just naive people who who
00:32:05.679 don’t who want to who want to believe
00:32:07.519 that everybody can somehow be kept you
00:32:10.919 know can be made good by just sort of
00:32:13.519 like pressuring them publicly or
00:32:16.159 something on social media or whatever
00:32:18.159 and that’s just not how it is that’s not
00:32:20.480 that’s not Humanity some people some
00:32:22.799 people are rule followers and do
00:32:26.519 everything they have a moral compass and
00:32:28.799 they follow that moral compass and some
00:32:30.399 people don’t some people care
00:32:32.440 abouts right some people are psychopaths
00:32:35.159 and some people care about only money
00:32:37.399 you know and yes getting ahead in their
00:32:39.360 careers and that’s just you know you
00:32:41.039 just have to we I think we just need to
00:32:42.960 accept that and design
00:32:46.880 around that as well design around it
00:32:49.240 exactly which we have we just I think
00:32:50.919 we’ve gotten away from it maybe that
00:32:53.000 ethics as a muscle is a really good
00:32:55.000 point right like we have to think about
00:32:57.360 all this stuff as a muscle that needs to
00:32:59.200 be
00:33:00.480 exercised yeah yeah yeah but the the
00:33:03.519 point with um designing like for example
00:33:07.639 in terms of AI right like you have this
00:33:10.200 AI ethics uh acts in in Europe and
00:33:14.120 like in us there was something in
00:33:15.760 Congress recently right like uh I don’t
00:33:18.480 know if there was anything signed or it
00:33:20.200 was just discussions but there is this
00:33:24.399 uh fear that those regulations are
00:33:27.639 actually uh serving the big guys because
00:33:30.840 they can afford to you know pay the
00:33:33.240 lawyers to to to to build and and use
00:33:37.960 maximum power they have within the legal
00:33:41.159 um um like power while the small guys
00:33:45.360 are not going to even try because they
00:33:47.559 cannot afford that yeah I mean I I I
00:33:51.200 think that’s a fair argument many people
00:33:52.919 have made that argument and that is that
00:33:54.399 is what tends to happen with with
00:33:57.000 regulation I’m not ultimately that
00:33:58.840 worried I think like I said before I
00:34:00.399 think that there will be a lot of
00:34:02.320 disruption and these these rules about
00:34:05.919 you know like you know AI it’s like
00:34:09.800 these executive orders I mean it’s not
00:34:11.520 these are not like hard and fast laws I
00:34:13.199 think I think that ultimately you know
00:34:16.719 that they’re not going to get in in the
00:34:18.239 way of innovation but I I do I mean I
00:34:20.800 think that you know there like I said
00:34:23.399 before there are existing laws that can
00:34:25.879 be used to enforce
00:34:27.639 pretty much any outcome any bad outcome
00:34:30.960 that comes from this AI so I think if
00:34:34.119 companies feel
00:34:35.440 like um if we do something bad like
00:34:38.879 we’re going to be held liable
00:34:40.520 potentially you know civil but
00:34:42.639 potentially even criminally liable if we
00:34:44.960 like really do something bad like that
00:34:47.399 is actually going to keep them in line
00:34:50.079 much more than like well you should
00:34:52.520 really think about safety and come talk
00:34:55.000 to us before you spend trillion dollar
00:34:57.800 on training your model which they’re
00:34:59.920 going to do anyway you know they’re
00:35:01.440 going to go talk to government anyway I
00:35:03.880 I just think that’s what like we we
00:35:05.960 misunderstand what people are afraid of
00:35:09.960 people are afraid of going to jail like
00:35:11.480 honestly like I think that’s you know
00:35:14.920 that’s a motivate like if if you’re like
00:35:17.000 worried you might go to jail then you’re
00:35:18.560 going to think twice about like doing
00:35:20.760 something horrible true but in I don’t
00:35:24.480 know in this in this moment in this type
00:35:28.280 of technology I think sometimes it’s
00:35:30.839 hard to predict um the outcome and the
00:35:35.480 the consequences of of you know
00:35:38.240 spreading I don’t know misinformation
00:35:41.800 and and then just
00:35:43.560 worrying um yeah worrying about
00:35:47.119 consequences well misinformation is not
00:35:49.560 actually a crime you know there’s not
00:35:52.319 actually anything against the law about
00:35:54.240 spreading misinformation at least not in
00:35:55.960 this country you know MH um you can do
00:35:59.480 that I mean I don’t know I think we I
00:36:02.680 think the misinformation thing is is
00:36:04.359 kind of overblown too I mean it like
00:36:08.160 ultimately you know if he there there’s
00:36:11.160 certain people who want to believe that
00:36:13.440 you know whatever Hillary Clinton is
00:36:15.240 running a child molestation ring out of
00:36:17.960 a pizza parlor you know and like those
00:36:21.079 people they were not like they were not
00:36:23.839 going to vote for you know they were not
00:36:27.680 like they didn’t like switch their vote
00:36:29.319 based on that right like they yeah yeah
00:36:31.440 you know they want to believe this crazy
00:36:33.079 stuff and like I think there’s you know
00:36:35.640 there’s just there’s always a certain
00:36:37.880 population that will just believe
00:36:39.119 whatever their tribe says and you know I
00:36:43.000 I ultimately think that’s not a good
00:36:45.760 thing and I wish that were not the case
00:36:48.560 but when we try to when we actually get
00:36:51.640 together and we say you know what we
00:36:52.839 really need to make sure that like gosh
00:36:55.440 we got to we got to protect people here
00:36:57.480 from all this bad information you know
00:37:00.599 it’s a slippery slope and where does it
00:37:02.520 end right like I think ultimately it
00:37:05.000 always ends you always end up crossing
00:37:07.200 the line and then backing it right like
00:37:09.240 we did cross the line I think during the
00:37:11.599 during the pandemic during the last
00:37:13.319 election like you know the Biden laptop
00:37:16.440 thing like censoring that censoring all
00:37:19.119 the covid you know um skepticism around
00:37:22.119 the vaccine and all this stuff
00:37:25.000 ultimately that had a had the opposite
00:37:27.640 of the intended effect right like all
00:37:30.119 like like trying to stop the covid
00:37:33.280 misinformation I think only only
00:37:36.599 emboldened the people who were against
00:37:42.240 exactly because then they could say oh
00:37:42.240 well now there’s this like conspiracy to
00:37:44.800 keep the information you know I think
00:37:47.280 the way you win the way you win is like
00:37:49.760 you just you say the true information
00:37:52.640 you go out and you and you tell the
00:37:54.400 truth and you try to make them better
00:37:56.680 more convincing argument and get people
00:37:58.880 onto your side like that’s that’s how
00:38:01.800 you win you don’t win by silencing
00:38:04.280 people who are spreading misinfo and
00:38:07.680 disinfo it just yeah it’s a it’s a
00:38:10.200 losing battle and ultimately ai ai just
00:38:14.440 it makes it a little bit easier to to do
00:38:16.920 that stuff but it doesn’t actually
00:38:18.520 change the game that much I mean people
00:38:21.240 people still have to spread that through
00:38:23.520 social media through those channels and
00:38:26.640 that’s ultimately the trick right it’s
00:38:28.319 getting those algorithms to get your
00:38:31.960 misinfo and disinfo to the right people
00:38:34.440 at the right time so that’s my view on
00:38:36.920 all that stuff sorry you are right but
00:38:40.520 then uh I guess
00:38:43.240 ultimately the person who has more money
00:38:46.400 to fund um this mis- like this type of
00:38:49.760 information wins right like it it it is
00:38:53.040 about how much and where who whom does
00:38:56.839 this information reach and there was so
00:38:58.960 many people undecided in even in terms
00:39:01.520 of voting um so you would want them to
00:39:05.079 see the both sides and right now with
00:39:08.359 any social media maybe I don’t know
00:39:10.640 about X right
00:39:13.000 now um but it’s it’s like we are in a
00:39:16.480 bubble right like we we just kept being
00:39:19.160 fed the same thing which already
00:39:21.560 confirms our bias yeah yeah I mean I
00:39:26.119 think sure all that’s true I just I just
00:39:29.480 think in the end I I don’t know I mean
00:39:32.400 yeah you might have more money and you
00:39:34.640 could spend that money on there’s
00:39:36.680 there’s probably just as much money on
00:39:38.400 both sides of these things right I think
00:39:40.839 I think ultimately it’s just making that
00:39:43.960 it’s making that convincing argument and
00:39:46.480 trying to and trying to get people onto
00:39:48.480 your side I mean that’s that’s
00:39:50.720 ultimately the best way to go about this
00:39:53.280 stuff I just I honestly I think it’s
00:39:55.400 like you know I mean look at X the the
00:39:59.680 whole thing a lot of that what’s going
00:40:01.400 on with X is like that is a direct
00:40:03.960 reaction to to I think that company
00:40:08.760 going totally overboard and the and you
00:40:11.560 know the pendulum swung it went it started
00:40:13.480 out as like totally free speech right
00:40:16.480 like where the like like Twitter was
00:40:18.480 like we’re not going to police speech at
00:40:20.880 all and then it like gradually becomes
00:40:23.240 like well maybe we need to like stop the
00:40:25.119 terrorists well maybe we need to stop
00:40:26.680 the Nazis and like and then you’re just
00:40:28.760 like you know getting more and more and
00:40:30.599 pretty soon you have like thousands of
00:40:32.480 people doing nothing but censoring
00:40:34.880 people on your platform and of course
00:40:37.119 it’s going to go overboard and you know
00:40:39.280 they’re kicking off like all these
00:40:41.319 conservatives or anybody who questions
00:40:43.520 the covid vaccine and all this stuff
00:40:45.599 they created like an army that
00:40:47.599 eventually just like took over and now
00:40:50.400 now now it’s gone completely off the
00:40:52.359 deep end in the other direction and you
00:40:54.720 know I don’t know like is that the
00:40:56.960 result that people wanted like I I think
00:40:59.200 we should just think about like what is
00:41:01.000 the result that you want like what is
00:41:02.760 ultimately the result that you want to
00:41:04.359 get to and like what’s the best way to
00:41:06.000 get there I don’t think I don’t think
00:41:08.319 like creating some gigantic censorship
00:41:11.200 apparatus is like is going to get people
00:41:13.720 to the result that they want to get to
00:41:15.599 no because there is no discussion there
00:41:17.520 right like it’s just all
00:41:19.680 siloed yeah and it just creates a
00:41:21.680 backlash I mean yeah you don’t win you
00:41:24.640 can’t win you’re never going to silence
00:41:26.079 people people into like you know
00:41:30.599 submission never works even if you’re
00:41:32.680 really well-meaning and even if you’re
00:41:34.359 right which I happen to think those you
00:41:37.119 know those people are like I’m not I was
00:41:38.760 not a vaccine like I was encouraging
00:41:40.680 people I got the vaccine you know like
00:41:42.319 I’m not yeah but but I you know I I wish
00:41:45.800 that the disinformation misinformation
00:41:48.040 did not exist but it does and it’s like
00:41:50.880 you know it is it is what it is yeah but
00:41:54.440 then how how can we help
00:41:57.599 people like get to the truth get get to
00:42:00.720 get to the
00:42:01.720 source people are not as dumb as
00:42:04.880 everybody thinks like I I think a lot of
00:42:07.560 times when people when people are you
00:42:11.200 know saying this stuff they’re not
00:42:14.359 like like saying they believe in these
00:42:17.040 in these conspiracy theories or whatever
00:42:18.880 I think a lot of times they’re just it’s
00:42:20.400 a backlash they’re lashing out like it’s not
00:42:23.200 like ultimately when people sit down and
00:42:25.240 they’re levelheaded they’re not angry
00:42:27.040 and they’re not you know and they just
00:42:28.359 sit down and they’re like I would like
00:42:30.319 to make a you know real effort to figure
00:42:33.400 out what the truth is they ultimately
00:42:35.599 know pretty much where they should go
00:42:37.920 like you go to trusted sources right
00:42:39.760 like mainstreams you don’t go to like
00:42:41.720 some random websites and like seek out
00:42:45.440 crazy I mean sure maybe you could find
00:42:48.200 cherry pick and find some examples of
00:42:50.200 where that might have happened but for
00:42:52.040 the most part I think people realize
00:42:54.280 like there are you know there are
00:42:57.079 trusted sources to go to and and like
00:43:02.760 probably aren’t aliens you know landing
00:43:06.319 on Earth and like whatever the the
00:43:09.200 QAnon thing is right like the lizard
00:43:11.079 people have taken over I mean it’s just
00:43:12.720 ridiculous like I I just don’t it’s it’s
00:43:16.720 to me it’s like laughable and like
00:43:18.280 taking it super seriously and like we
00:43:20.599 need to stop people from believing in
00:43:21.960 the lizard people it just makes you
00:43:23.440 sound you’re just lowering yourself to
00:43:25.319 like that level in a sense you know of
00:43:28.040 course but it it makes for a good story
00:43:30.280 right and clickbaits when I was when I
00:43:33.040 was a kid before like when when people I
00:43:35.920 grew up in a time when people still went
00:43:37.440 to the grocery store yeah and you would
00:43:39.839 like stand in line to buy your groceries
00:43:41.599 and you did that often because there was
00:43:44.040 no delivery there was no Instacart right
00:43:46.160 so you were in the grocery store all the
00:43:47.640 time and every time I went to the
00:43:49.640 grocery store there were magazines right
00:43:52.720 and there was always some Magazine with
00:43:54.640 a crazy UFO you know conspiracy theories
00:43:58.160 right and like I don’t know we didn’t
00:44:00.240 freak out and say like you know however
00:44:03.240 many millions of people who stand in
00:44:05.160 line in the grocery store twice a week
00:44:07.839 are are being exposed to these
00:44:10.319 conspiracy theories like we didn’t freak
00:44:12.280 out of it was just like it we laughed
00:44:13.920 about it like oh haha like the national
00:44:16.760 but maybe there were too many of them so
00:44:19.200 in the end like you become immune like
00:44:21.440 you just ignore it well maybe that’s
00:44:24.880 well maybe that is what happens here
00:44:27.040 right like maybe maybe as AI makes
00:44:30.880 disinformation misinformation just more
00:44:33.079 and more prevalent maybe people will
00:44:34.640 just tune it out right so maybe it’s a
00:44:36.319 good thing maybe you’re
00:44:39.240 right Tricky Tricky Tricky okay but what
00:44:43.520 okay but since you cover so much on on
00:44:46.640 on on this um what kind of tech or what
00:44:49.559 kind of uh tools do you see which are
00:44:52.119 really cool in in maybe curating new
00:44:56.000 news or apart from obviously your
00:44:58.559 newsletter um but yeah yeah we’re using
00:45:02.480 AI we’re actually we’re we’re doing we
00:45:04.920 have this thing called signals where you
00:45:06.800 sort of we sort of survey the the the
00:45:09.800 landscape and find you know find the
00:45:13.079 best articles to kind of highlight and
00:45:15.480 and give give readers just a sense of
00:45:17.240 like here’s how the world is looking at
00:45:19.520 this this top story today and we’re
00:45:22.240 actually using AI to go out and find
00:45:24.240 those sources but not just to find them
00:45:26.680 but also like translate things from
00:45:29.839 other languages so here’s like here’s
00:45:32.400 what the papers in Russia are saying
00:45:34.280 today about like Navalny you know and
00:45:36.960 that kind of stuff so I
00:45:39.440 think that’s a good use of that tool.
00:45:44.599 I don’t think it will replace writing or
00:45:47.000 journalists really it’s not good enough
00:45:52.280 to do that and I think people who do
00:45:53.720 that are totally misguided but
00:45:56.720 I think it can be
00:45:58.800 a really powerful tool and we haven’t
00:46:00.400 even really begun to scratch the surface
00:46:02.160 yet and on using it um I think bigger
00:46:05.680 companies are starting to figure out how
00:46:07.520 to use it like they’re basically the
00:46:10.599 main way is like RAG you know like
00:46:13.280 retrieval augmented generation so they
00:46:16.680 like they these companies that have
00:46:18.720 unstructured data that they’ve never
00:46:20.520 really been able to make use of are now
00:46:23.359 seeing if they can take AI models and
00:46:25.400 turn that into valuable information and
00:46:27.920 there’s there’s starting to be like a
00:46:29.079 whole cottage industry just around the
00:46:31.559 data aspect of that like figuring out
00:46:33.400 how to move data or not even move data
00:46:36.520 but process data and then move that
00:46:39.119 processed metadata to like to the cloud
00:46:42.720 to be analyzed by the AI
00:46:45.400 models.
00:46:48.280 enough to like have something crazy like
00:46:50.400 that right now but ultimately I think we
00:46:53.839 should and I think I think um news
00:46:56.960 organizations have a ton of unstructured
00:47:00.800 data that is very very valuable I mean
00:47:03.359 if you think about just like all the
00:47:04.880 notes and everything I mean clear
00:47:06.400 clearly there are issues there around
00:47:08.520 protecting your sources and all this
00:47:10.000 stuff but but I mean it figuring out how
00:47:13.680 to how to how to tap into that power is
00:47:17.040 I think I think really could
00:47:19.040 supercharge uh media organizations and
00:47:22.359 that’s what we need I mean there’s how
00:47:24.520 many newspapers are going out of
00:47:26.040 business every day it’s just it’s it’s a
00:47:28.599 blood bath and I think you know figuring
00:47:31.920 out how to more efficiently cover the
00:47:34.559 news in a in like a real quality way not
00:47:37.040 just like clickbait or you know yeah
00:47:39.640 sort of repeating stuff is valuable yeah
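Since RAG comes up a few times here, a minimal sketch of the pattern as it might apply to the newsroom archives he mentions: embed the unstructured notes, retrieve the few most similar to a question, and hand only those to the model as context. Everything below is an assumption for illustration (the model names, the in-memory store, the answer_from_notes helper), not a description of any publication's actual system, and as he says, source-protection questions would come before any of this in practice.

```python
# Minimal retrieval-augmented generation (RAG) sketch -- illustration only.
# Assumes the `openai` and `numpy` packages and an OPENAI_API_KEY env var;
# model names are placeholders.
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(texts: list[str]) -> np.ndarray:
    """Turn each text into a vector so similarity can be computed."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

def answer_from_notes(question: str, notes: list[str], k: int = 3) -> str:
    """Retrieve the k notes closest to the question, then ask the model to
    answer using only that retrieved context (the core RAG pattern)."""
    note_vecs = embed(notes)
    q_vec = embed([question])[0]
    # cosine similarity between the question and every note
    sims = note_vecs @ q_vec / (
        np.linalg.norm(note_vecs, axis=1) * np.linalg.norm(q_vec) + 1e-9
    )
    context = "\n---\n".join(notes[i] for i in np.argsort(sims)[::-1][:k])
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder
        messages=[{
            "role": "user",
            "content": f"Using only these notes:\n{context}\n\nAnswer: {question}",
        }],
    )
    return resp.choices[0].message.content
```

The "cottage industry" he mentions is largely about the step this sketch glosses over: doing the chunking and embedding where the sensitive raw material lives, and shipping only the processed vectors or metadata out to be queried.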
00:47:43.480 so I don’t know that’s how I view it how
00:47:44.640 do you view it no no no I I you are you
00:47:47.839 are the expert here um I can only to I’m
00:47:50.640 I’m working on AI in terms of um public
00:47:53.880 and private sector but like more more on
00:47:56.960 like customer service and decision
00:47:58.680 intelligence but um do you think is
00:48:02.559 there is it going to be like a hybrid
00:48:05.200 for for for those media companies to
00:48:09.280 monetize I think that you would be I
00:48:12.800 think if you start selling the data
00:48:14.960 somehow or becoming like a data broker
00:48:17.240 kind of thing I think that would be sort of bad
00:48:19.440 for an organization's reputation you know
00:48:23.280 even if it even if it wasn’t unethical
00:48:25.119 even if it was sort of you know you got
00:48:28.000 people to agree to the terms of service
00:48:29.640 or whatever I think it just
00:48:31.359 turns you into something
00:48:33.760 else I think it’s bad enough that like
00:48:36.119 that like you know the media websites
00:48:39.319 use all these trackers and everything
00:48:41.119 and at the same time they’re writing
00:48:42.599 about you know how bad that is you know
00:48:45.960 that’s happening but I think you know
00:48:49.119 but I I was thinking more like just in
00:48:52.400 the news Gathering production
00:48:56.200 um side it would be really good I mean I
00:48:59.359 I’m not an expert like I’m not on the
00:49:01.000 business side of of the I’ve never been
00:49:03.720 on the business side of a news
00:49:05.079 organization so um I’ve had maybe like a
00:49:08.319 closer look at it just being at the
00:49:09.920 Information and um and now Semafor
00:49:13.119 because you get at a at a startup you
00:49:15.079 get to see more of how that works but I
00:49:18.400 think um you
00:49:20.599 know to me like if you can if you can
00:49:24.359 pay if you can charge subscriptions
00:49:26.160 that’s like the ultimately like the best
00:49:28.599 thing because that’s the most pure form
00:49:30.760 of Journalism but then I think you run
00:49:32.599 into some issues around just impact
00:49:35.799 right because how many people can read
00:49:38.520 it and get behind the pay wall but um
00:49:41.839 but I think that we’re doing something
00:50:43.960 interesting which is around like quite you
00:49:47.359 know premium non-programmatic
00:49:49.839 advertising so it’s like much more I
00:49:53.200 think to me in my what I like about
00:49:55.520 about it is like it reminds me more of
00:49:58.240 the old um the original like model of
00:50:01.960 newspapers which is you have advertisers
00:50:04.680 who agree to buy a certain number of ads
00:50:07.559 in your publication over a certain
00:50:09.880 amount of time and they don’t they don’t
00:50:12.640 get to decide which article it goes next
00:50:14.920 to or you know like exactly which reader
00:50:19.119 eyeballs get on it and like you know
00:50:21.359 it’s more like I’m going to get this
00:50:23.200 General audience which I think is a
00:50:24.799 really good audience for me to read and
00:50:27.319 as long as they have a certain
00:50:28.720 circulation over you know a certain time
00:50:31.799 I know I’m going to generally reach
00:50:33.319 those people and then that I think is
00:50:36.240 the best of both worlds because it gives
00:50:38.520 you that that good advertising Revenue
00:50:42.040 but you’re not um incentivized to you’re
00:50:45.559 not getting paid per click right which I
00:50:47.480 think just creates that downward spiral
00:50:50.280 that we’ve seen in the in the news
00:50:51.720 industry that you know it just
00:50:54.079 ultimately doesn’t it doesn’t doesn’t
00:50:55.640 pan out the traffic pay-per-click
00:50:58.760 model it’s just
00:51:00.079 like I think I think it’s just not a
00:51:02.359 good way to fund
00:51:04.400 journalism it’s of course it’s not to
00:51:07.400 yeah it’s it’s at the end it becomes
00:51:10.359 very biased
00:51:12.040 but we since we everything is right now
00:51:15.160 digital I don’t I don’t see it coming
00:51:17.920 like going back to to the old model I
00:51:20.599 don’t know it’s it just maybe not and by
00:51:24.240 the way this is just my personal opinion I
00:51:26.119 mean I’m not like this is not the
00:51:28.319 opinion of any publication I work for or
00:51:31.079 have worked for it’s just it’s just my
00:51:33.720 view but um yeah I mean I think what I’m
00:51:38.480 saying is like it’s not going to go back
00:51:39.920 to the old way but I think there there
00:51:41.520 may be ways to find Happy mediums you
00:51:45.799 know to to to the old ways I think the
00:51:49.440 you know we’re doing events and things
00:51:50.880 like that too that which which bring in
00:51:53.400 revenue and I think you know you just
00:51:56.480 you find the revenue where you you know
00:51:58.440 I think as when you’re running a company
00:52:00.440 like you have to figure out where you
00:52:02.359 know where to get the revenue and to
00:52:04.480 stay afloat and all this stuff so it’s a
00:52:07.079 tough job I mean it’s not this is not
00:52:09.160 like you know you’re not in a in an
00:52:11.760 industry where you know you’re going to
00:52:14.599 see thousand X growth right it’s it’s
00:52:17.079 it’s a you’re creating a product that
00:52:19.440 does you know take resources to build
00:52:22.520 and create and um you know that’s uh
00:52:27.240 that’s part of the but that’s part of
00:52:28.640 the joy too you know it’s it’s a it’s a
00:52:31.240 valuable business but have you ever
00:52:33.319 considered going into like a tech uh
00:52:36.480 company instead of being in media but
00:52:39.359 covering tech oh like uh working
00:52:43.559 like working in in technology yeah no I
00:52:47.160 mean I think I’m probably like too old
00:52:48.960 for that now but I think you know like I
00:52:51.920 do admire that I mean I think you know
00:52:54.799 to to do I think it'd be really fun to do
00:52:57.079 a startup or something you know but I
00:53:00.079 also think I’ve read enough and talked
00:53:02.040 to enough Founders that I know it’s it
00:53:04.640 is just one of the hardest things it is
00:53:07.559 that is very rewarding if it works works
00:53:10.240 out and um you have to get to you have
00:53:14.000 to find the product Market fit you have
00:53:15.920 to understand the problem and um you
00:53:18.400 know that’s exactly what you did with
00:53:20.319 your with your book and with lots of
00:53:22.240 other projects so a book is kind of like
00:53:24.599 a startup
00:53:26.160 yes it is it is it is you have to fight
00:53:28.799 for funding you have to fight for
00:53:30.359 audience there are lots of things
00:53:32.160 which are very similar and we have
00:53:35.079 in common so I think I
00:53:38.160 really have a lot of respect for like
00:53:40.000 startup CEOs it’s just it is it is just
00:53:44.200 a
00:53:45.559 painful you know endurance sport that I
00:53:49.440 you know that I I think is it’s fun it’s
00:53:51.839 fun it’s more probably more fun to write
00:53:53.520 about than actually do
00:53:56.720 I would say which which which startups
00:53:59.040 or which which um Industries do you
00:54:01.480 particularly um like or
00:54:04.559 follow like like within like just within
00:54:07.200 Tech or or um yeah within Tech I don’t
00:54:10.559 know lately I’ve really kind of been um
00:54:13.720 I’ve I’ve had a lot of fun talking to
00:54:15.240 people who were in The biotech space uh
00:54:17.720 it’s not like my main area but um I love
00:54:21.160 this idea of just of um being able to
00:54:24.240 use AI to
00:54:26.119 manipulate biology to like digiti you
00:54:29.839 into um
00:54:31.799 longevity well I guess that’s part of I
00:54:34.440 guess that’s longevity is kind of like
00:54:37.040 part of that like I I I’ve interviewed
00:54:39.079 the Insilico um CEO and that's they're
00:54:43.160 doing that stuff but like really they’re
00:54:45.599 talking about longevity through you know
00:54:49.280 curing diseases right like I think I
00:54:52.119 like I think it’s really exciting if we
00:54:53.839 get to a place where
00:54:55.720 you can
00:54:56.960 actually like in instead of when you go
00:55:00.160 to the doctor instead of like trying to
00:55:02.559 figure out what’s wrong with you they
00:55:03.680 just like look at every piece of DNA
00:55:05.839 that’s in your body and they can see
00:55:07.599 like all the bacteria that you have and
00:55:09.839 every you know every virus you know
00:55:13.720 that’s swimming around every Cancer cell
00:55:15.760 and just be like okay like here’s where
00:55:18.240 the problem is we’re going to create a a
00:55:21.039 custom a custom-made drug for you that
00:55:23.960 just does exactly what it needs to do I mean
00:55:26.400 I think that’s like the promise of where
00:55:27.960 we’re headed obviously that’s science
00:55:29.520 fiction right now and I don’t think
00:55:30.720 that’s happening anytime soon SE so many
00:55:32.839 of it in in the movies right already I
00:55:35.799 mean I think it’s going to happen at
00:55:37.599 some point maybe not in our lifetime but
00:55:39.599 like I just think that that area is
00:55:41.559 really exciting and um I still I sort of
00:55:45.920 actually wonder if autonomous
00:55:48.640 cars at some point are going to happen I
00:55:51.720 think it’s it’s going to it’s there
00:55:54.640 needs to be another breakthrough you
00:55:56.599 know but I see things like the thing I
00:55:59.400 mentioned earlier with um V-JEPA
00:56:02.680 that Meta is doing and I'm like yeah oh
00:56:04.880 that’s interesting like you know what if
00:56:07.440 you what if that you take that method
00:56:10.359 and you follow it to its conclusion
00:56:12.280 you’re like okay well what you have is
00:56:13.920 like computer vision that’s just a lot
00:56:17.000 smarter um and probably takes less
00:56:20.559 compute power to to look at the world
00:56:24.359 and you know maybe that’s maybe that’s
00:56:27.359 actually the route to autonomous driving
00:56:30.760 you know I I live in San Francisco I
00:56:33.720 mean I live in the Bay area I’m in San
00:56:35.520 Francisco a lot have you seen these Waymo
00:56:38.720 cars driving around with no driver
00:56:41.799 uh I saw the the videos but I haven’t
00:56:44.400 seen them in person though I still I mean Waymo
00:56:49.520 there’s I well maybe I should be but no
00:56:52.760 I’m not really but Cruz was driving
00:56:55.319 around too now they’re off the road
00:56:56.920 because they had they had some issues
00:56:58.599 but but there were there was a time
00:57:00.760 where it was Cruise and Waymo and you
00:57:03.440 could end up at night usually there
00:57:05.559 there more of them at night so you’d end
00:57:07.119 up sometimes in like a neighborhood and
00:57:08.920 every other car just seemed to be a
00:57:11.000 driverless car with no one in it and I
00:57:13.799 still just like every time I see one I’m
00:57:15.440 just like I cannot believe that that we
00:57:18.400 that we’re here yet like I thought it
00:57:19.839 would take so much longer and I I know I
00:57:22.200 know it’s a bit of a there’s a parlor
00:57:25.319 aspect to it because they’ve done so
00:57:27.400 much human painstaking you know bespoke
00:57:31.520 labor to to get it to work in this
00:57:34.160 geographic area so like it’s not scaling
00:57:37.799 yet you can’t just like tell the car to
00:57:39.480 go to any City and drive but it’s still
00:57:42.839 amazing and I think that is like to me
00:57:46.599 still one of the great challenges that I
00:57:49.839 mean so many people die every you know
00:57:52.680 there are like a million people a year or
00:57:54.400 something around the world who die in
00:57:56.359 car accidents every year like that’s
00:57:58.160 just you know but I think we’ve like
00:58:00.920 forgotten about it because of all this
00:58:02.799 this AI stuff but I think it’s the large
00:58:05.400 language model stuff you have the waves
00:58:07.520 of hype right like and what was before
00:58:09.839 AI um was it VR
00:58:13.880 crypto well now we're back to VR right
00:58:16.280 we’re back you ask me are you gonna are
00:58:18.760 you going to get one of those headsets
00:58:20.319 the the from what I was reading and and
00:58:23.400 I obviously watched the uh Mark’s um
00:58:26.799 video slashing Apple I’m I’m thinking I
00:58:30.480 I will I will check them both are
00:58:33.480 you I no but really when I read all the
00:58:36.760 stories about people returning them I
00:58:38.960 was like oh I should have done that I
00:58:41.039 should have bought one I should have
00:58:42.119 bought one so I got it on day one and
00:58:43.839 then I could like blog about it and
00:58:45.960 write about it and then return it two
00:58:47.520 weeks later why didn’t I think of
00:58:50.240 that because they didn’t give me a
00:58:52.119 review unit but you know I have the I
00:58:56.079 have the Vision the this thing I have
00:58:58.400 this one a review unit of the Quest 3
00:59:02.480 it’s really cool I I’ve actually been a
00:59:04.680 huge fan of VR since day one I remember
00:59:07.119 when Facebook acquired Oculus I had like
00:59:10.760 one of the first things the cool things
00:59:12.200 I did when I moved to San Francisco is I
00:59:14.200 I was at a party and they had like an
00:59:15.920 Oculus this was before Facebook acquired
00:59:18.079 it and like just like walking around do
00:59:20.839 fighting zombies I’m like this is so
00:59:22.839 cool but ultimately like I think it’s I
00:59:25.680 think Mark Zuckerberg was right it’s
00:59:27.799 it’s much more of like a fun thing to do
00:59:31.079 rather than like a work tool like I
00:59:32.839 don’t think I don’t see I just I think
00:59:35.280 sitting looking at a computer screen is
00:59:37.039 bad enough putting a headset you know on
00:59:40.240 your face looking at the screen is like just
00:59:41.760 gonna get really old really fast yeah
00:59:44.760 yeah yeah you cannot you cannot escape I
00:59:47.480 don't know you cannot escape yeah I
00:59:49.640 don't I don't buy into this uh
00:59:53.920 for work purposes yeah for gaming sure
00:59:57.240 but a lightweight pair of glasses that
01:00:00.039 looks exactly like regular glasses that
01:00:03.280 you can have a screen in or transpose
01:00:06.160 things in the real world that would be
01:00:07.960 awesome I think but I think that’s it is
01:00:10.880 coming yeah but I don’t know I mean
01:00:13.799 right now you know the
01:00:16.400 physics that's needed right the
01:00:19.520 understanding like the breakthroughs
01:00:21.119 have not happened yet so who knows like
01:00:23.839 it’s impossible to predict when if and
01:00:27.039 when that will ever arrive I Heard lots
01:00:30.440 of um new development is happening in
01:00:32.839 Japan and Taiwan obviously for
01:00:35.920 semiconductors uh so we may have we may
01:00:39.200 hear of some new uh companies soon from
01:00:43.280 there okay I believe you that will be
01:00:46.520 tricky yeah I am I might know
01:00:49.440 someone okay so when I know when I
01:00:53.760 can I will let you
01:00:55.119 know um okay so VR anything else you are
01:01:00.039 excited about what’s what’s what’s been
01:01:02.359 what you’ve been writing about and I
01:01:05.599 still think you want to try I still
01:01:07.599 think blockchain is interesting you know
01:01:10.000 I think I think we’re seeing that sort
01:01:12.760 of quietly under the radar happen now
01:01:16.799 not not crypto but you know like in the
01:01:19.799 way that it in the way that it sort of
01:01:22.119 was originally meant to happen you know
01:01:24.760 it’s I think it’s still to me it’s still
01:01:27.079 a great concept this
01:01:30.240 decentralized way of of of keeping track
01:01:33.359 of everything where you know there no
01:01:35.720 one’s no one’s actually no one has like
01:01:37.599 that power and and it’s lower it’s much
01:01:40.720 lower cost you know you can get you can
01:01:42.680 do things much faster I think it’s it’s
01:01:45.599 actually it’s kind of starting to to
01:01:48.079 catch on I’m I’m interested to see you
01:01:50.640 know what happens I don’t care at all
01:01:52.319 about like who’s gonna like what the
01:01:55.520 price of Bitcoin is and you know who’s
01:01:58.279 going to get rich off of the latest NFT
01:02:00.599 and all that stuff I I care about the
01:02:03.640 fundamental underlying technology and
01:02:06.440 what that means for the future of of the
01:02:09.319 web and the internet and there’s some
01:02:10.680 interesting tie-ins with with AI as well
01:02:13.079 where there’s this there’s this idea
01:02:15.480 this concept that you could like run and
01:02:17.760 train AI models on
01:02:20.760 distributed compute across the world I
01:02:23.880 mean that’s it’s an old concept but
01:02:26.200 there are people trying to trying to do
01:02:28.520 like the really hard technical work to
01:02:30.400 make that possible and I think that’s
01:02:32.680 really kind of cool too it is it is and
01:02:35.920 if you think of it of that lots
01:02:40.200 of things which some people find re
01:02:43.960 revolutionary at this point like AI has
01:02:46.559 been around since the 60s right and it just
01:02:49.839 took time for for for people to catch up
01:02:52.720 to to G gain momentum and I guess enough
01:02:56.240 scientists enough of actually different
01:02:58.440 type of people
01:03:01.240 to yeah to to to Showcase their the true
01:03:05.160 potential yeah I mean we’re we’re now in
01:03:08.440 a cold war again you know with China and
01:03:12.279 it’s driven by technology so I mean if
01:03:15.520 you look at like all these advances
01:03:17.400 right like the internet all this stuff
01:03:18.760 it came out of it came out of the Cold
01:03:20.599 War and yeah you know it’s unfortunate
01:03:23.599 actually that that you know that war is
01:03:27.279 like the biggest driver of of human
01:03:29.920 Innovation um but you know I I mean I I
01:03:35.000 hopefully we look at it as the
01:03:36.319 prevention of War right um but but
01:03:40.599 that’s to me I mean I think just seeing
01:03:42.400 more and more um tax dollars go into
01:03:46.319 basic research is really exciting
01:03:48.680 because that’s where I think the real
01:03:51.160 the real gains are the real Innovation
01:03:53.599 happens and
01:03:55.039 at least the breakthroughs and you know
01:03:57.799 I think the not not knocking capitalism
01:04:00.640 and our whole system we have here I
01:04:03.640 think that I think Venture capitalists
01:04:06.119 and and startup Founders play a very
01:04:08.680 crucial role as well but like I think
01:04:10.880 the I think the public investment has
01:04:12.799 been like lacking in in recent years so
01:04:16.480 I think getting getting that you know up
01:04:20.559 is just the ROI in that is going to be
01:04:23.160 great for the economy and and Humanity
01:04:26.760 yeah so how how can we be more vocal
01:04:29.760 that it’s needed like some random John
01:04:34.240 Smith how can how can
01:04:37.839 we well yeah tell the government I think
01:04:41.079 you know I don’t know I mean I think to
01:04:43.520 me I try to I think we in the media at
01:04:47.799 least well that’s not your question it’s
01:04:50.000 like how’s the average Joe I don’t know
01:04:52.640 to be honest I I I don’t know I mean I
01:04:55.200 don’t know write your write your your
01:04:57.119 local Congressman or who whatever your
01:04:59.440 country you’re in write your local po
01:05:01.520 but um I think the word is getting out I
01:05:05.200 think the the the narrative around that
01:05:08.599 um has changed I think people like
01:05:11.400 Mariana Mazzucato um who who's written
01:05:14.720 about this for years now for like a
01:05:16.520 decade she has done a lot to kind of
01:05:19.599 like change the narrative and I think um
01:05:21.920 you’re even hearing that in Silicon
01:05:23.359 Valley like the story today I did and
01:05:26.200 which I know you haven’t read yet but
01:05:27.559 like I interviewed uh researchers who um
01:05:32.279 you know about about like how AI
01:05:34.799 research is kind of becoming more
01:05:36.720 secretive in in Silicon Valley because
01:05:38.839 it used to be before ChatGPT it was like
01:05:42.400 let’s publish all our research you know
01:05:44.720 and now it’s like oh well actually it’s
01:05:46.640 too valuable to publish and you know and
01:05:49.680 but a lot of the researchers I talked to
01:05:52.520 said yeah we need government like we
01:05:54.839 need we need uh Academia which is
01:05:56.799 Government funding essentially we need
01:05:58.799 Academia to to to have more resources
01:06:02.680 more compute resources so that they we
01:06:04.599 need that that competition it’s it’s
01:06:06.839 important so I think that to me tells me
01:06:09.559 that I think the narrative is different
01:06:11.839 than it was I I remember you know I
01:06:14.559 think
01:06:15.640 before when I got out here I got out to
01:06:18.640 cover Tech in end of 2013 I think the
01:06:22.520 the the you know think the narrative was
01:06:25.000 like what do we need government for like
01:06:27.319 this you know every the private sector
01:06:29.119 does all the Innovation now government
01:06:31.720 is completely useless you know they’re
01:06:34.920 slow and look at the website for the
01:06:38.440 health care thing it's terrible and all
01:06:40.440 this stuff and it’s like well actually
01:06:42.000 that’s not you know that that I think
01:06:44.880 talking about that you’re ignoring the
01:06:47.039 the basic research stuff I mean you
01:06:49.319 could people still say a lot of bad
01:06:51.200 things about government and all that
01:06:52.520 stuff but I think they at least now are
01:06:54.240 saying you know actually like we need
01:06:56.319 that funding for for basic research so
01:06:59.880 that’s good yeah it’s encouraging and
01:07:03.079 yes that’s promising okay and last uh
01:07:06.520 question since um you know the elections
01:07:09.839 are coming and it’s a it’s being called
01:07:12.599 I Know It’s Tricky Tricky Tricky subject
01:07:15.760 and but there are so many elections
01:07:17.440 happening all over the world um how do
01:07:20.440 you see how do you see it playing out um
01:07:24.480 yeah with with the whole Ai and what we
01:07:28.880 discussed already a little bit um deep
01:07:32.480 fakes
01:07:34.279 well and some government not playing the
01:07:37.200 the role of um actually showing the
01:07:41.279 truth surprise surprise I can’t believe
01:07:44.079 politicians don’t tell the truth um I I
01:07:47.920 uh no I think
01:07:50.119 that I mean AI will play a role but but
01:07:54.359 I think everybody wants AI to play a
01:07:57.200 bigger role than it actually is gonna
01:07:59.319 play they all like everybody wants
01:08:01.960 it to be the AI election I
01:08:04.200 mean I think it’s really I think it’s
01:08:06.520 terrible we should focus on we should
01:08:08.920 focus on the ideas you know the ideas
01:08:11.559 around the election I I hear so much
01:08:14.000 like so much of election coverage is
01:08:15.599 like it’s just the personalities of the
01:08:17.960 politicians and who wore what who made
01:08:21.198 what Gaff who’s too old who’s Too Young
01:08:23.640 or whatever and then it’s like now we
01:08:26.000 have this whole AI thing well the AI the
01:08:27.880 AI is going to like ruin the election
01:08:29.319 it’s like can we just talk about like
01:08:31.719 what what are the actual ideas like I
01:08:34.000 wish we could just get to the substance
01:08:35.759 of it because I don’t know I think a lot
01:08:38.080 of this stuff is just a distraction and
01:08:40.679 you know you freak out people freak out
01:08:43.198 when the wrong person is elected and
01:08:45.520 stuff instead of instead of thinking
01:08:47.359 about well why was the wrong person
01:08:48.799 elected like what you know what what
01:08:51.679 happened here I I I mean
01:08:54.359 is you can make AI the scapegoat but
01:08:56.880 really like do you really think that
01:08:58.719 that’s the that’s the problem I mean I
01:09:01.560 think people are people are feeling for
01:09:04.960 whatever reason they’re feeling fed up
01:09:07.158 and like willing to like blow up their
01:09:11.040 not literally blow up but you know just
01:09:12.679 like see their institutions destroyed
01:09:16.279 because you know they’re fed up with
01:09:18.600 with Society like that should be like a
01:09:21.238 that is not ai’s fault you know think
01:09:24.158 there’s something else going on there so
01:09:26.479 let’s keep that in mind yeah okay yeah
01:09:31.040 but uh maybe in uh countries like in
01:09:34.640 Europe there there are some some what
01:09:36.520 what is it Taiwan India
01:09:38.920 Russia um and and you guys so while in
01:09:43.759 in the US it may be such narrative but
01:09:46.880 um Russia may not be so lucky with um
01:09:51.759 getting enough information
01:09:54.360 well Russia is a totally different uh
01:09:56.840 different story obviously especially
01:09:59.679 today’s terrible news um you know and
01:10:02.760 you don’t know if it’s true or
01:10:04.520 not true yeah you don’t know whether
01:10:07.960 anything’s true I know well you know
01:10:09.840 hopefully I mean that’s the thing like
01:10:12.159 that is the that is like some people in
01:10:15.000 the US are like okay with well yeah like
01:10:17.320 maybe we should go that route too yeah
01:10:20.760 to me that’s just crazy like what yeah
01:10:24.120 people don’t appreciate what they have
01:10:25.800 you know until until it’s gone and I
01:10:27.440 think yes you know I mean we should be
01:10:30.600 really we should appreciate what we have
01:10:32.239 here and we should want to hold on to it
01:10:33.960 I mean what no matter what party you’re
01:10:35.840 in no matter what your you know personal
01:10:38.760 situation is and looking over there it’s
01:10:42.199 like a great it’s a great example
01:10:43.840 instead we have people like Tucker Carlson
01:10:45.800 going over there and trying to say I
01:10:48.159 don’t know what I don’t know what that
01:10:49.400 guy is talking about you know these days
01:10:52.560 but you know
01:10:54.480 I just it’s to me it’s just that this is
01:10:57.320 not a technology problem
01:10:59.440 though I mean that’s the bottom line
01:11:01.800 this is this is a problem that goes so
01:11:04.520 much deeper than than technology and if
01:11:07.400 any if anything technology is just is
01:11:09.400 just like making it more apparent you
01:11:11.239 know it’s shining a light on all these
01:11:13.640 problems but um you know it’s
01:11:18.520 a it’s a deeper issue I don’t have the
01:11:21.920 answer it’s people it’s people’s ISS
01:11:24.640 right it’s always it’s always our fault
01:11:26.840 it’s our it’s always
01:11:29.440 people okay I I guess we are running a
01:11:33.080 bit on on time so um I want to
01:11:37.639 also ask you about this really cool
01:11:40.520 project you you I don’t know if you
01:11:43.120 you’re still running those um the video
01:11:45.679 series um called the Olympics how hard
01:11:48.520 can it
01:11:49.480 [Laughter]
01:11:51.960 be so
01:11:54.520 so you tried ice hockey with figure
01:11:58.040 skating uh you you tried to learn this
01:12:02.000 that was okay so that was um yeah I
01:12:05.239 actually that was when I was no I’m not
01:12:07.239 doing that anymore um it was it was
01:12:09.280 really fun I was cover well I was
01:12:11.800 covering Sports again um and it was the
01:12:15.800 um I was covering the Vancouver Olympics
01:12:17.960 and I thought uhuh wouldn’t it be fun if
01:12:20.120 I got to just go try these Olympic
01:12:23.000 sports and what if I could find Olympic
01:12:25.719 athletes you know who would won medals
01:12:27.679 in the Olympics and get them to like
01:12:29.600 give me a lesson in their Sport and so I
01:12:32.080 came up with this title you know the
01:12:33.920 Olympics how hard can it be and I did
01:12:37.080 you know speed skating figure
01:12:40.560 skating uh moguls the
01:12:44.040 luge ice hockey
01:12:47.800 biathlon I don't know if I did anything
01:12:49.960 else but the luge was terrifying
01:12:53.480 that
01:12:54.639 was this guy named Gordy Sheer who'd
01:12:57.320 won a silver medal in the luge he like
01:12:59.520 pushed he just like pushed me down I was
01:13:01.120 like don't try to steer and shooting down
01:13:04.159 this luge track I can't see anything
01:13:06.600 because I’m hitting the wall like my
01:13:08.600 foot’s hitting the wall and there’s snow
01:13:10.080 hitting hitting my eyes so I’m like
01:13:11.679 basically blind going down the luge track
01:13:14.360 but you know I survived it was okay okay
01:13:17.080 so you’re not doing this anymore you’re
01:13:19.120 not doing it professionally you know
01:13:20.920 what if somebody would pay me to go do
01:13:22.520 that again I would totally do it I'd
01:13:23.920 probably just hurt myself this time but
01:13:25.800 you know I’d still I’d still do it it
01:13:27.880 will be a lot of fun it was great I love
01:13:30.600 the Olympics I think the Olympics is the
01:13:32.320 best is the best sporting event and it’s
01:13:35.280 really special so it is it is it shows
01:13:38.320 you what people are capable of in in the
01:13:42.040 good way yeah it is it’s
01:13:45.239 great well thank you thank you thank you
01:13:48.840 re as well and um hopefully see you in
01:13:51.520 Europe soon and um if need help with
01:13:54.639 finding ways to get this European
01:13:57.400 passport I I I can I can ask around okay
01:14:00.320 let me know let me know if you know
01:14:01.679 someone all right thank you okay take
01:14:05.040 care bye bye thanks bye