• My why
  • Are You Human
  • Understanding AI
  • Entrepreneurship Handbook
  • Skill up
  • Inspiration
Tech, business and everything in between
Are You Human
Suhair Khan: Today's Tech Led Us To Isolation And Loneliness

Duration: 1:06:30 | Recorded on 4th April 2024

Subscribe: Apple Podcasts | Spotify

In this episode of AYH I had the incredible opportunity to speak with Suhair Khan, the founder of open-ended and a creative force behind many of the UK’s cultural institutions and businesses, including the Design Museum, the British Library, Sadler’s Wells and many others.

We really dig into the importance of diversity and inclusivity in tech as a way to reduce bias in AI and data. Suhair doesn’t hold back on tough topics like the potential impact of AI on creative jobs and how artists can stay relevant amid all this disruption. We also touched on cultural decolonisation within technology, examining how inclusive design and innovation can break down existing power structures and inequalities.

Suhair has great insights on the role cultural institutions can play in driving tech innovation forward, and on how art and tech can team up to tackle major social and environmental issues. She is doing such vital work at the intersection of creativity, culture and emerging tech like AI, and I can’t wait for you to listen to our chat.

Transcript
00:00:00,040 –> 00:00:04,000
what we’ve seen now is the manifestation

2
00:00:02,040 –> 00:00:06,279
of what Silicon Valley creativity is

3
00:00:04,000 –> 00:00:08,080
South by Southwest is the best example

4
00:00:06,279 –> 00:00:10,840
you know Art Is Not Just Fine Art

5
00:00:08,080 –> 00:00:12,679
Museums but at the core of it that new

6
00:00:10,840 –> 00:00:14,519
generation of creativity that has come

7
00:00:12,679 –> 00:00:16,840
out of technology and out of Silicon

8
00:00:14,519 –> 00:00:19,600
Valley is at that Confluence of film

9
00:00:16,840 –> 00:00:23,080
media music and it is I think closest to

10
00:00:19,600 –> 00:00:23,080
what you would probably see at South by

11
00:00:25,199 –> 00:00:30,840
Southwest and I’m sure there’s lots of

12
00:00:27,720 –> 00:00:32,320
incredible now technology uh CEOs and

13
00:00:30,840 –> 00:00:33,800
people who’ made a lot of money who are

14
00:00:32,320 –> 00:00:35,360
acquiring art in the way that they would

15
00:00:33,800 –> 00:00:37,120
in New York or London but I just think

16
00:00:35,360 –> 00:00:40,960
it’s a very different ethos as to what

17
00:00:37,120 –> 00:00:40,960
creativity is and also what culture

18
00:00:45,520 –> 00:00:50,399
is there is a climate crisis that is

19
00:00:48,480 –> 00:00:52,879
happening that we’re part of that we are

20
00:00:50,399 –> 00:00:55,079
experiencing however we think about it

21
00:00:52,879 –> 00:00:57,719
and there is this kind of inequality

22
00:00:55,079 –> 00:00:59,160
shift which is probably a crisis you

23
00:00:57,719 –> 00:01:02,120
know although we’ve had inequality in

24
00:00:59,160 –> 00:01:03,719
the past and then at the core of it I

25
00:01:02,120 –> 00:01:05,799
think technology has led to its own

26
00:01:03,719 –> 00:01:07,520
crisis of isolation and loneliness it’s

27
00:01:05,799 –> 00:01:09,799
led to misinformation

28
00:01:07,520 –> 00:01:11,960
misunderstanding unkindness a lack of

29
00:01:09,799 –> 00:01:13,400
empathy I think if we added it up all

30
00:01:11,960 –> 00:01:14,960
over the world for all the billions of

31
00:01:13,400 –> 00:01:17,240
people who use the internet there’s

32
00:01:14,960 –> 00:01:20,880
probably a lot of hurt that has come

33
00:01:17,240 –> 00:01:20,880
with this experience of being connected

34
00:01:22,479 –> 00:01:28,840
ironically are you

35
00:01:25,799 –> 00:01:30,240
human hello kamla thank you for having

36
00:01:28,840 –> 00:01:32,320
me you thank you so much much for

37
00:01:30,240 –> 00:01:33,920
finding time I know it’s it’s very busy

38
00:01:32,320 –> 00:01:36,920
I’ve been waiting to have this

39
00:01:33,920 –> 00:01:39,640
conversation with you for so long I a

40
00:01:36,920 –> 00:01:42,840
circle of of friends and colleagues is

41
00:01:39,640 –> 00:01:45,719
more artistic than I guess than mine

42
00:01:42,840 –> 00:01:48,200
you’re working on the verge of Art and

43
00:01:45,719 –> 00:01:50,799
technology and I know that you’ve been

44
00:01:48,200 –> 00:01:55,159
working with British Library design

45
00:01:50,799 –> 00:01:58,159
Museum and then you finally decided to

46
00:01:55,159 –> 00:02:01,119
um you you felt the need to create a

47
00:01:58,159 –> 00:02:03,600
platform for all the those people who

48
00:02:01,119 –> 00:02:06,439
have ideas which should be heard by The

49
00:02:03,600 –> 00:02:08,679
Wider community and you start at the

50
00:02:06,439 –> 00:02:11,160
open-ended let’s start from the very

51
00:02:08,679 –> 00:02:14,519
beginning why did you get interested in

52
00:02:11,160 –> 00:02:16,480
this so I grew up uh mostly in Pakistan

53
00:02:14,519 –> 00:02:18,400
and uh my mom is from India my dad’s

54
00:02:16,480 –> 00:02:22,319
from Pakistan and then I ended up going

55
00:02:18,400 –> 00:02:25,400
to University in America my dad had also

56
00:02:22,319 –> 00:02:27,239
undergrad in the US and um it felt like

57
00:02:25,400 –> 00:02:30,480
you know pursuing a liberal arts degree

58
00:02:27,239 –> 00:02:33,239
was the right yes path at the time

59
00:02:30,480 –> 00:02:36,160
um and I went to America at quite a

60
00:02:33,239 –> 00:02:38,959
complicated time right just after 9911

61
00:02:36,160 –> 00:02:40,400
so as someone coming from out anywhere

62
00:02:38,959 –> 00:02:42,519
else actually at the time but

63
00:02:40,400 –> 00:02:45,280
particularly a Muslim country it did

64
00:02:42,519 –> 00:02:47,959
take me time I think to find the space

65
00:02:45,280 –> 00:02:50,080
and to figure out what I wanted to do um

66
00:02:47,959 –> 00:02:52,720
and my first job was in Investment

67
00:02:50,080 –> 00:02:54,599
Banking in New York very early days of

68
00:02:52,720 –> 00:02:58,280
my career but also very different time

69
00:02:54,599 –> 00:03:01,239
on Wall Street um before the financial

70
00:02:58,280 –> 00:03:03,200
crisis and uh I never really felt like

71
00:03:01,239 –> 00:03:04,879
it was for me I never really felt

72
00:03:03,200 –> 00:03:07,959
settled but of course what I loved about

73
00:03:04,879 –> 00:03:09,879
New York was uh the museums and the art

74
00:03:07,959 –> 00:03:12,000
scene and I had a lot of friends who

75
00:03:09,879 –> 00:03:14,680
worked in Galleries and then my sister

76
00:03:12,000 –> 00:03:17,000
did her undergrad at uh upen and after

77
00:03:14,680 –> 00:03:19,040
that joined a gallery in New York so

78
00:03:17,000 –> 00:03:21,480
that became very much my fun world but

79
00:03:19,040 –> 00:03:23,040
not my life uh even though I’d grown up

80
00:03:21,480 –> 00:03:25,200
you know my mom has an art gallery in

81
00:03:23,040 –> 00:03:28,840
Islamabad and my whole you know family

82
00:03:25,200 –> 00:03:31,680
are creative um so I ended up going to

83
00:03:28,840 –> 00:03:34,439
do my masters after a couple of years at

84
00:03:31,680 –> 00:03:36,040
the Kennedy School at Harvard studying

85
00:03:34,439 –> 00:03:38,360
International Development I was very

86
00:03:36,040 –> 00:03:40,120
interested in doing a PhD in economics

87
00:03:38,360 –> 00:03:42,599
at the time and of course growing up in

88
00:03:40,120 –> 00:03:45,599
a developing country I felt very drawn

89
00:03:42,599 –> 00:03:48,319
to understanding better what development

90
00:03:45,599 –> 00:03:50,680
quote unquote meant and what the Divide

91
00:03:48,319 –> 00:03:53,239
between the global North and the global

92
00:03:50,680 –> 00:03:55,400
South was and how I could help and work

93
00:03:53,239 –> 00:03:58,439
with organizations like the World Bank

94
00:03:55,400 –> 00:04:00,599
or look at at the time newly emerging I

95
00:03:58,439 –> 00:04:02,560
guess uh industry like social

96
00:04:00,599 –> 00:04:05,640
entrepreneurship as a path towards

97
00:04:02,560 –> 00:04:07,560
navigating my career and I think at the

98
00:04:05,640 –> 00:04:10,319
time I was very invested in this idea of

99
00:04:07,560 –> 00:04:12,000
how can you do good from an outside

100
00:04:10,319 –> 00:04:15,280
perspective and how can you help to

101
00:04:12,000 –> 00:04:18,440
incubate better economic situations um

102
00:04:15,280 –> 00:04:21,840
for people living outside of a western

103
00:04:18,440 –> 00:04:23,759
context so I did that and it became a

104
00:04:21,840 –> 00:04:25,639
lot more research driven in terms of the

105
00:04:23,759 –> 00:04:27,840
work that I was doing looking mostly at

106
00:04:25,639 –> 00:04:29,280
macroeconomics I published a I was lucky

107
00:04:27,840 –> 00:04:30,280
I got a couple of papers published

108
00:04:29,280 –> 00:04:32,479
actually

109
00:04:30,280 –> 00:04:34,240
at the end of it but I also found it to

110
00:04:32,479 –> 00:04:37,039
be very lonely in a way you know

111
00:04:34,240 –> 00:04:38,320
Academia is lonely I know teach it so

112
00:04:37,039 –> 00:04:41,520
you know I’m aware of that from a

113
00:04:38,320 –> 00:04:44,000
research perspective um and I felt like

114
00:04:41,520 –> 00:04:46,479
I wanted to be doing and being around

115
00:04:44,000 –> 00:04:48,800
people and so in a very longwinded way I

116
00:04:46,479 –> 00:04:51,280
ended up very luckily joining Google and

117
00:04:48,800 –> 00:04:53,120
moving to California I got a call from a

118
00:04:51,280 –> 00:04:55,720
recruiter asking me if I was interested

119
00:04:53,120 –> 00:04:59,919
in um strategy position and Mountain

120
00:04:55,720 –> 00:05:03,560
View and so I just went and um you know

121
00:04:59,919 –> 00:05:06,280
when in those days this is like 2010 I

122
00:05:03,560 –> 00:05:07,840
think um I mean even now it’s amazing

123
00:05:06,280 –> 00:05:10,880
you know if you go to the Google

124
00:05:07,840 –> 00:05:15,400
headquarters in California it’s like a

125
00:05:10,880 –> 00:05:17,960
Dreamland with you know everybody looks

126
00:05:15,400 –> 00:05:22,120
happyish everyone is a mission everyone

127
00:05:17,960 –> 00:05:24,639
is uh you know kind of seeking something

128
00:05:22,120 –> 00:05:26,560
and coming from you know much more kind

129
00:05:24,639 –> 00:05:29,319
of serious world of economics and before

130
00:05:26,560 –> 00:05:30,560
that W which was obviously serious but

131
00:05:29,319 –> 00:05:34,639
also

132
00:05:30,560 –> 00:05:36,240
um you know felt much more intense um I

133
00:05:34,639 –> 00:05:38,639
think of Wall Street as a very dark

134
00:05:36,240 –> 00:05:42,080
place in my head I think of like 2: am.

135
00:05:38,639 –> 00:05:43,919
and 3:00 a.m. and um you know the

136
00:05:42,080 –> 00:05:46,600
pressure of having her BlackBerry at the

137
00:05:43,919 –> 00:05:48,840
time you know kind of buzzing that that

138
00:05:46,600 –> 00:05:52,000
you know is all gone no that’s not

139
00:05:48,840 –> 00:05:53,560
that’s already gone right yeah so um so

140
00:05:52,000 –> 00:05:55,880
then anyway so it’s very boring but

141
00:05:53,560 –> 00:05:58,479
that’s how I ended up in in California

142
00:05:55,880 –> 00:06:00,600
and then from there I was very lucky to

143
00:05:58,479 –> 00:06:02,560
join Google at a time where even though

144
00:06:00,600 –> 00:06:04,639
it was still a very big company was

145
00:06:02,560 –> 00:06:07,800
smaller than it is now and there was a

146
00:06:04,639 –> 00:06:09,680
lot more flexibility in moving from you

147
00:06:07,800 –> 00:06:12,919
know an engineering team to a business

148
00:06:09,680 –> 00:06:15,440
team to a Partnerships team in really

149
00:06:12,919 –> 00:06:17,919
working with mentors and uh senior

150
00:06:15,440 –> 00:06:19,720
leadership to have a career that felt

151
00:06:17,919 –> 00:06:21,520
like it suited you know anyone

152
00:06:19,720 –> 00:06:23,599
individual’s journey and that’s what I

153
00:06:21,520 –> 00:06:27,160
was lucky enough to do it wasn’t always

154
00:06:23,599 –> 00:06:28,880
easy um and I think you know companies

155
00:06:27,160 –> 00:06:33,080
and big companies probably never really

156
00:06:28,880 –> 00:06:35,199
are but it was really um the beginning

157
00:06:33,080 –> 00:06:36,680
of my journey and so to answer your

158
00:06:35,199 –> 00:06:39,919
question is how did I end up in the art

159
00:06:36,680 –> 00:06:43,639
and Tech space is that um during my

160
00:06:39,919 –> 00:06:45,360
first couple of years in San Francisco I

161
00:06:43,639 –> 00:06:47,639
spent all my time in the office I was

162
00:06:45,360 –> 00:06:49,400
there all the time you had to commute I

163
00:06:47,639 –> 00:06:51,440
I lived in San Francisco so you got on

164
00:06:49,400 –> 00:06:53,840
the Google shuttle you ended up an hour

165
00:06:51,440 –> 00:06:55,039
or two hours later in Mountain View and

166
00:06:53,840 –> 00:06:57,759
then you never really left so your

167
00:06:55,039 –> 00:06:59,840
entire life became about working for

168
00:06:57,759 –> 00:07:02,759
Google and you know there was people

169
00:06:59,840 –> 00:07:04,440
working on projects with lucid dreaming

170
00:07:02,759 –> 00:07:06,840
and machine learning there was people

171
00:07:04,440 –> 00:07:10,120
working on politics there was people

172
00:07:06,840 –> 00:07:12,039
working on uh the internet and the truth

173
00:07:10,120 –> 00:07:13,800
and uh there was a group of people who

174
00:07:12,039 –> 00:07:16,639
were interested in art and culture and

175
00:07:13,800 –> 00:07:19,240
how to bring more art um onto the

176
00:07:16,639 –> 00:07:21,479
internet how to bring better quality

177
00:07:19,240 –> 00:07:24,680
images of paintings and more immersive

178
00:07:21,479 –> 00:07:28,400
experiences in museums online and so I

179
00:07:24,680 –> 00:07:30,160
became part of that early group and um

180
00:07:28,400 –> 00:07:32,240
kind of followed that thread

181
00:07:30,160 –> 00:07:34,520
as a part-time project for the next five

182
00:07:32,240 –> 00:07:36,479
six years while I worked on other stuff

183
00:07:34,520 –> 00:07:39,520
which we can talk about mostly produ

184
00:07:36,479 –> 00:07:41,560
stuff um and also at the time I was

185
00:07:39,520 –> 00:07:44,319
writing I wrote a bit for architectural

186
00:07:41,560 –> 00:07:47,919
digest I wrote a bit for cond Nas

187
00:07:44,319 –> 00:07:49,879
traveler I wrote for Vogue in India I

188
00:07:47,919 –> 00:07:53,520
don’t know why I I think because you

189
00:07:49,879 –> 00:07:56,080
know you sort of um my life just became

190
00:07:53,520 –> 00:07:57,639
much more contained in a way and so I

191
00:07:56,080 –> 00:08:00,120
probably was like way more productive

192
00:07:57,639 –> 00:08:02,120
than I’ll ever be and also was the

193
00:08:00,120 –> 00:08:04,759
energy of being there in California at

194
00:08:02,120 –> 00:08:07,599
that time there was an intensity of it

195
00:08:04,759 –> 00:08:10,479
and there was a sense of trying out

196
00:08:07,599 –> 00:08:11,800
everything all at once because um you

197
00:08:10,479 –> 00:08:12,879
know that’s what Google was doing and

198
00:08:11,800 –> 00:08:15,639
that’s what all of these technology

199
00:08:12,879 –> 00:08:18,520
companies were looking to do right but

200
00:08:15,639 –> 00:08:21,520
did you feel um did you feel more drawn

201
00:08:18,520 –> 00:08:25,120
to the way of uh working and and

202
00:08:21,520 –> 00:08:27,680
creating in Silicon Valley than what you

203
00:08:25,120 –> 00:08:29,800
what New York was offering you because

204
00:08:27,680 –> 00:08:32,360
you know as like you said new New York

205
00:08:29,800 –> 00:08:35,240
offers lots of uh cultural events

206
00:08:32,360 –> 00:08:37,120
there’s lots of Galleries and it seems

207
00:08:35,240 –> 00:08:39,640
at least uh in

208
00:08:37,120 –> 00:08:42,159
this stere

209
00:08:39,640 –> 00:08:45,800
stereotypical uh way of thinking about

210
00:08:42,159 –> 00:08:50,279
New York it’s all the artists all the

211
00:08:45,800 –> 00:08:51,839
Bohemia um American Bohemia is there so

212
00:08:50,279 –> 00:08:53,200
at the time I lived in New York was when

213
00:08:51,839 –> 00:08:55,680
everybody was getting pushed out of

214
00:08:53,200 –> 00:08:58,120
Manhattan so i’ say leading all the way

215
00:08:55,680 –> 00:09:00,200
up to co New York became or Manhattan

216
00:08:58,120 –> 00:09:02,760
became more and more and more expensive

217
00:09:00,200 –> 00:09:04,640
uh for anybody creative to live and so

218
00:09:02,760 –> 00:09:06,959
it was a time of the beginning of this

219
00:09:04,640 –> 00:09:09,360
kind of you know consumption of uh

220
00:09:06,959 –> 00:09:11,079
Damian hear sharks by people like Steve

221
00:09:09,360 –> 00:09:13,240
Cohen and you know massive like

222
00:09:11,079 –> 00:09:15,519
Acquisitions of art that were very much

223
00:09:13,240 –> 00:09:18,560
about showing uh that wealth and art

224
00:09:15,519 –> 00:09:20,279
were intimately connected um moving to

225
00:09:18,560 –> 00:09:22,800
California I found very difficult I

226
00:09:20,279 –> 00:09:25,040
found it to be isolating um everybody

227
00:09:22,800 –> 00:09:27,800
that I knew pretty much worked in

228
00:09:25,040 –> 00:09:30,480
technology and I found that to be quite

229
00:09:27,800 –> 00:09:32,959
hard and quite narrow and also you know

230
00:09:30,480 –> 00:09:34,839
it was for me I’m not from America you

231
00:09:32,959 –> 00:09:36,920
know as such was pushing away from the

232
00:09:34,839 –> 00:09:39,399
rest of the world because you know it is

233
00:09:36,920 –> 00:09:42,800
very insular in a lot of ways

234
00:09:39,399 –> 00:09:44,600
individualistic right like exactly and

235
00:09:42,800 –> 00:09:46,240
you know at the end of the day if you

236
00:09:44,600 –> 00:09:47,640
think about that period of time it was a

237
00:09:46,240 –> 00:09:49,279
period of America kind of turning

238
00:09:47,640 –> 00:09:52,920
inwards looking away from the rest of

239
00:09:49,279 –> 00:09:55,680
the world feeling very um vulnerable

240
00:09:52,920 –> 00:09:58,240
after 911 and the sense of the outsider

241
00:09:55,680 –> 00:09:59,680
was very palpable and became more so in

242
00:09:58,240 –> 00:10:01,959
the time that I was there which of

243
00:09:59,680 –> 00:10:03,920
course I didn’t really understand but I

244
00:10:01,959 –> 00:10:06,279
do think that from a creative and arts

245
00:10:03,920 –> 00:10:08,160
perspective what we’ve seen now is a

246
00:10:06,279 –> 00:10:10,320
manifestation of what Silicon Valley

247
00:10:08,160 –> 00:10:12,440
creativity is South by Southwest is the

248
00:10:10,320 –> 00:10:15,040
best example you know Art Is Not Just

249
00:10:12,440 –> 00:10:17,000
Fine Art Museums and that’s a lot of the

250
00:10:15,040 –> 00:10:18,920
work that I’ve done so of course San

251
00:10:17,000 –> 00:10:21,240
Francisco has beautiful museums the SF

252
00:10:18,920 –> 00:10:23,360
mom and you know you could go on and on

253
00:10:21,240 –> 00:10:26,000
the the Young Museum of course they have

254
00:10:23,360 –> 00:10:28,200
the ballet but at the core of it that

255
00:10:26,000 –> 00:10:29,760
new generation of creativity that has

256
00:10:28,200 –> 00:10:32,079
come out of Technology and out of

257
00:10:29,760 –> 00:10:34,680
Silicon Valley is at that Confluence of

258
00:10:32,079 –> 00:10:36,160
film media music and it is I think

259
00:10:34,680 –> 00:10:38,639
closest to what you would probably see

260
00:10:36,160 –> 00:10:40,839
at South by Southwest yeah and I’m sure

261
00:10:38,639 –> 00:10:43,200
there’s lots of incredible now

262
00:10:40,839 –> 00:10:44,720
technology uh CEOs and people who’ve

263
00:10:43,200 –> 00:10:46,040
made a lot of money who are acquiring

264
00:10:44,720 –> 00:10:47,600
art in the way that they would in New

265
00:10:46,040 –> 00:10:49,079
York or London but I just think it’s a

266
00:10:47,600 –> 00:10:51,720
very different ethos as to what

267
00:10:49,079 –> 00:10:53,120
creativity is and also what culture is

268
00:10:51,720 –> 00:10:56,480
and that’s something that I at the time

269
00:10:53,120 –> 00:10:59,000
I didn’t really have the space to engage

270
00:10:56,480 –> 00:11:01,200
with that wasn’t really my world um and

271
00:10:59,000 –> 00:11:04,560
also so I think that that has evolved

272
00:11:01,200 –> 00:11:07,720
quite a lot um over the last 15 or 20

273
00:11:04,560 –> 00:11:10,120
years as a space but for me personally I

274
00:11:07,720 –> 00:11:11,800
think I was quite lonely in a way and

275
00:11:10,120 –> 00:11:13,440
one of the most interesting experiences

276
00:11:11,800 –> 00:11:16,519
I have I don’t think I’ve probably

277
00:11:13,440 –> 00:11:19,480
really talked about it but um was I was

278
00:11:16,519 –> 00:11:22,079
part of one of the very early um groups

279
00:11:19,480 –> 00:11:23,720
that were curated by Cheryl Sandberg

280
00:11:22,079 –> 00:11:26,360
when she had just joined Facebook she

281
00:11:23,720 –> 00:11:28,079
was writing her book lean in which I I

282
00:11:26,360 –> 00:11:29,680
mean I don’t I haven’t read it for many

283
00:11:28,079 –> 00:11:31,880
years I don’t know how it feels about it

284
00:11:29,680 –> 00:11:33,519
now but at the time you know it was um

285
00:11:31,880 –> 00:11:36,160
great to have a woman leader writing

286
00:11:33,519 –> 00:11:38,519
about her career she worked at the World

287
00:11:36,160 –> 00:11:40,120
Bank she was in you know at Harvard and

288
00:11:38,519 –> 00:11:41,040
she ended up in Tech so I found that

289
00:11:40,120 –> 00:11:43,959
very

290
00:11:41,040 –> 00:11:45,800
inspiring and she would bring together

291
00:11:43,959 –> 00:11:48,120
these groups of women or she started

292
00:11:45,800 –> 00:11:51,040
this you know kind of Journey of it

293
00:11:48,120 –> 00:11:53,279
where um we met a c every couple of

294
00:11:51,040 –> 00:11:55,480
weeks at somebody’s house and it was

295
00:11:53,279 –> 00:11:59,000
quite structured in terms of being uh

296
00:11:55,480 –> 00:12:00,399
led by a mentor and you know she had

297
00:11:59,000 –> 00:12:01,839
chose chosen the people to be there and

298
00:12:00,399 –> 00:12:04,560
I was really lucky a friend of mine

299
00:12:01,839 –> 00:12:07,560
invited me um who was working at

300
00:12:04,560 –> 00:12:10,079
Facebook at the time and there was uh

301
00:12:07,560 –> 00:12:12,480
human rights lawyers doctors child

302
00:12:10,079 –> 00:12:14,720
psychologists who worked in war zones uh

303
00:12:12,480 –> 00:12:16,120
Venture capitalists you know incredible

304
00:12:14,720 –> 00:12:17,720
women that I would never have had the

305
00:12:16,120 –> 00:12:19,760
chance to meet if it wasn’t for that

306
00:12:17,720 –> 00:12:21,800
because I was in my you know even though

307
00:12:19,760 –> 00:12:24,839
Google was huge it was still my work

308
00:12:21,800 –> 00:12:26,880
world and that for me really opened up

309
00:12:24,839 –> 00:12:30,480
California uh from a completely

310
00:12:26,880 –> 00:12:32,519
different angle and um allowed me to you

311
00:12:30,480 –> 00:12:35,720
know meet a a wider discourse of people

312
00:12:32,519 –> 00:12:38,120
but art culture and creativity I always

313
00:12:35,720 –> 00:12:40,760
felt like I was in my experience of it

314
00:12:38,120 –> 00:12:43,760
was less and I I did find it difficult

315
00:12:40,760 –> 00:12:45,519
and I think the bigger issue probably

316
00:12:43,760 –> 00:12:47,199
was like I was still getting to know

317
00:12:45,519 –> 00:12:49,519
myself you know I was in a different

318
00:12:47,199 –> 00:12:50,600
country in a different CL working in a

319
00:12:49,519 –> 00:12:55,360
different

320
00:12:50,600 –> 00:12:57,760
organization um so that was that but I

321
00:12:55,360 –> 00:13:00,079
think one of the things you mentioned

322
00:12:57,760 –> 00:13:03,480
was you know being like Wall Street

323
00:13:00,079 –> 00:13:06,199
versus California but I actually think

324
00:13:03,480 –> 00:13:08,199
that and this comes down to creativity

325
00:13:06,199 –> 00:13:10,320
critical thinking and how I think of my

326
00:13:08,199 –> 00:13:12,920
own work because I’ve been in I’ve

327
00:13:10,320 –> 00:13:15,800
worked in many countries I’ve worked in

328
00:13:12,920 –> 00:13:18,040
um you all over Asia uh I was based in

329
00:13:15,800 –> 00:13:21,320
Singapore I’ve worked in India I’ve done

330
00:13:18,040 –> 00:13:23,880
projects in Africa Latin America uh and

331
00:13:21,320 –> 00:13:26,600
of course in the UK and Europe but for

332
00:13:23,880 –> 00:13:28,600
me I feel that I I might be wrong but I

333
00:13:26,600 –> 00:13:30,399
had an American education and then I

334
00:13:28,600 –> 00:13:32,639
worked for American

335
00:13:30,399 –> 00:13:36,079
companies and I think that actually

336
00:13:32,639 –> 00:13:37,760
shaped my way of seeing the world uh and

337
00:13:36,079 –> 00:13:40,600
how I worked and how I learn to work I

338
00:13:37,760 –> 00:13:42,959
don’t know if it’s right or good but I

339
00:13:40,600 –> 00:13:45,880
part of that like there’s a restlessness

340
00:13:42,959 –> 00:13:48,680
that comes with there’s a kind of um

341
00:13:45,880 –> 00:13:50,399
probably over excitement about you know

342
00:13:48,680 –> 00:13:53,680
getting things done and confidence and

343
00:13:50,399 –> 00:13:55,920
gaining confidence a confidence um which

344
00:13:53,680 –> 00:13:59,720
definitely does you know of course helps

345
00:13:55,920 –> 00:14:01,920
especially in the art confence yes of us

346
00:13:59,720 –> 00:14:03,279
ever but um but I think there’s an

347
00:14:01,920 –> 00:14:05,759
American Energy and I think that’s a

348
00:14:03,279 –> 00:14:07,720
very creative energy as you know it’s

349
00:14:05,759 –> 00:14:09,399
much of the world that we live in today

350
00:14:07,720 –> 00:14:11,880
and that’s something that I don’t know

351
00:14:09,399 –> 00:14:13,959
how to make into a formula I and I don’t

352
00:14:11,880 –> 00:14:15,800
know if it’s the right thing like I said

353
00:14:13,959 –> 00:14:17,480
but I feel that’s more what shaped my

354
00:14:15,800 –> 00:14:21,600
career than anything

355
00:14:17,480 –> 00:14:24,360
else definitely yeah and and you you had

356
00:14:21,600 –> 00:14:28,480
your voice heard and you were able to to

357
00:14:24,360 –> 00:14:31,800
create this this uh say to infect your

358
00:14:28,480 –> 00:14:33,759
energy and an enthusiasm about uh art

359
00:14:31,800 –> 00:14:37,480
and and the need for for more

360
00:14:33,759 –> 00:14:40,440
conversation um around your community so

361
00:14:37,480 –> 00:14:42,440
hence open-ended right hence open-ended

362
00:14:40,440 –> 00:14:45,560
yeah so I started open-ended a couple of

363
00:14:42,440 –> 00:14:47,680
years ago and um that was actually

364
00:14:45,560 –> 00:14:50,279
during the pandemic and I started off as

365
00:14:47,680 –> 00:14:52,800
you were doing as a series of podcasts

366
00:14:50,279 –> 00:14:54,600
and I was really interested in nurturing

367
00:14:52,800 –> 00:14:56,160
this space where I was lucky to sit

368
00:14:54,600 –> 00:14:58,040
through my work at Google which was

369
00:14:56,160 –> 00:15:00,480
where art and culture and Technology

370
00:14:58,040 –> 00:15:03,000
come together and I got more and more

371
00:15:00,480 –> 00:15:04,920
interested in the space of design um

372
00:15:03,000 –> 00:15:06,680
during my time working in London on the

373
00:15:04,920 –> 00:15:08,519
Google arts and culture team because I

374
00:15:06,680 –> 00:15:11,279
felt this like very strong connection of

375
00:15:08,519 –> 00:15:13,000
design as a practice to technology

376
00:15:11,279 –> 00:15:14,839
because at the core of it you know we

377
00:15:13,000 –> 00:15:16,839
talk about AI now all the time what is

378
00:15:14,839 –> 00:15:18,880
it it’s algorithms that are stacked and

379
00:15:16,839 –> 00:15:21,560
organized and designed and shaped and

380
00:15:18,880 –> 00:15:24,480
implemented and it’s not just how you

381
00:15:21,560 –> 00:15:25,880
know your iPhone looks like or how you

382
00:15:24,480 –> 00:15:29,079
know the experience of holding your

383
00:15:25,880 –> 00:15:30,839
laptop is so for me design and techn

384
00:15:29,079 –> 00:15:32,399
ology really felt like the there was

385
00:15:30,839 –> 00:15:34,079
like a creative connection between the

386
00:15:32,399 –> 00:15:36,959
two fields and I was always very

387
00:15:34,079 –> 00:15:38,399
inspired by people um working in design

388
00:15:36,959 –> 00:15:40,079
because they could be computational

389
00:15:38,399 –> 00:15:41,920
designers they could be Architects they

390
00:15:40,079 –> 00:15:44,639
could be artists who are doing work with

391
00:15:41,920 –> 00:15:47,519
digital um and of course it could be

392
00:15:44,639 –> 00:15:49,399
people like myself and um that whole

393
00:15:47,519 –> 00:15:51,800
Work World of creative technologist for

394
00:15:49,399 –> 00:15:55,319
me always sat within you know where

395
00:15:51,800 –> 00:15:57,279
design and Technology met and so I was

396
00:15:55,319 –> 00:15:59,360
interested in hosting conversations in

397
00:15:57,279 –> 00:16:02,319
that space with people that I knew who

398
00:15:59,360 –> 00:16:04,040
are of my age group or you know kind of

399
00:16:02,319 –> 00:16:06,800
working in and around me who had always

400
00:16:04,040 –> 00:16:09,399
inspired me I’ve always had mentorship

401
00:16:06,800 –> 00:16:12,120
from people who are probably at my level

402
00:16:09,399 –> 00:16:15,120
of you know existence but come from

403
00:16:12,120 –> 00:16:18,600
outside of my own way of working and so

404
00:16:15,120 –> 00:16:20,560
that’s how I began the the podcast and

405
00:16:18,600 –> 00:16:22,399
it it shifted a little bit you know I

406
00:16:20,560 –> 00:16:25,160
had started off wanting it to be like

407
00:16:22,399 –> 00:16:26,800
more about heroes and design heroes in

408
00:16:25,160 –> 00:16:28,160
tech people whove changed the world

409
00:16:26,800 –> 00:16:31,160
similar to you know when I went to

410
00:16:28,160 –> 00:16:34,160
Harvard I wanted to work on like saving

411
00:16:31,160 –> 00:16:35,880
things but you know it was a journey

412
00:16:34,160 –> 00:16:38,360
that I went on to really try and figure

413
00:16:35,880 –> 00:16:40,040
out like who who is your hero and who do

414
00:16:38,360 –> 00:16:42,480
you want to celebrate and whom you can

415
00:16:40,040 –> 00:16:44,600
relate to yeah and who do you want other

416
00:16:42,480 –> 00:16:46,360
people to hear from you know and where

417
00:16:44,600 –> 00:16:47,800
would you might learn the most from you

418
00:16:46,360 –> 00:16:51,680
know you never learn the most from like

419
00:16:47,800 –> 00:16:53,360
a press conference by a CEO um or you

420
00:16:51,680 –> 00:16:55,399
know if you when I do panel

421
00:16:53,360 –> 00:16:58,079
conversations I never invite like the

422
00:16:55,399 –> 00:17:00,639
head of policy or the head of marketing

423
00:16:58,079 –> 00:17:02,000
who cares it’s not relevant what they

424
00:17:00,639 –> 00:17:03,639
think what’s relevant is the people who

425
00:17:02,000 –> 00:17:05,160
are building who are making who it’s

426
00:17:03,639 –> 00:17:07,400
their opinion it’s not about what they

427
00:17:05,160 –> 00:17:09,640
represent and for me that’s a really big

428
00:17:07,400 –> 00:17:11,079
distinction um that we don’t talk about

429
00:17:09,640 –> 00:17:15,319
enough and that’s really what I wanted

430
00:17:11,079 –> 00:17:17,839
to create space for and um so I have

431
00:17:15,319 –> 00:17:19,640
people who spoke pretty much all of them

432
00:17:17,839 –> 00:17:22,880
I had already known some I had known

433
00:17:19,640 –> 00:17:25,439
much less well before like versus after

434
00:17:22,880 –> 00:17:26,720
but people who worked in human rights

435
00:17:25,439 –> 00:17:30,240
design

436
00:17:26,720 –> 00:17:32,960
architecture um you know technologists

437
00:17:30,240 –> 00:17:35,280
Engineers scientists and that’s sort of

438
00:17:32,960 –> 00:17:37,799
the scope of the podcast as it began and

439
00:17:35,280 –> 00:17:40,320
then it evolved really as I left Google

440
00:17:37,799 –> 00:17:43,080
as I started to teach at Central Saint

441
00:17:40,320 –> 00:17:45,600
Martins on the architecture program

442
00:17:43,080 –> 00:17:48,120
into much more of a space of incubating

443
00:17:45,600 –> 00:17:50,080
ideas and conversations at that

444
00:17:48,120 –> 00:17:52,440
intersection of Technology creativity

445
00:17:50,080 –> 00:17:56,480
and design and now we’re actually

446
00:17:52,440 –> 00:17:58,480
working on raising capital for a fund as

447
00:17:56,480 –> 00:17:59,480
we just realized that it’s a growing

448
00:17:58,480 –> 00:18:02,320
space

449
00:17:59,480 –> 00:18:06,880
where people are looking at R&D in you

450
00:18:02,320 –> 00:18:09,280
know technology R&D in AI R&D in really

451
00:18:06,880 –> 00:18:10,880
building for the future from a creative

452
00:18:09,280 –> 00:18:13,360
perspective you know it’s creative

453
00:18:10,880 –> 00:18:16,000
technology and where does it go and it

454
00:18:13,360 –> 00:18:18,039
doesn’t necessarily fit within what

455
00:18:16,000 –> 00:18:20,400
Venture Capital firms are investing in

456
00:18:18,039 –> 00:18:22,600
right now it’s not really gaming it’s

457
00:18:20,400 –> 00:18:25,200
not really NFTs or

458
00:18:22,600 –> 00:18:26,919
blockchain um and it’s not really going

459
00:18:25,200 –> 00:18:29,840
to necessarily make the same amount of

460
00:18:26,919 –> 00:18:34,960
money in a year’s time as um the next

461
00:18:29,840 –> 00:18:38,559
gen assistant um but these are you know

462
00:18:34,960 –> 00:18:41,200
a conversation yeah I think it’s needed

463
00:18:38,559 –> 00:18:42,080
yeah yeah how do you see how do you find

464
00:18:41,200 –> 00:18:44,840
a

465
00:18:42,080 –> 00:18:47,640
response well so the response has

466
00:18:44,840 –> 00:18:50,400
been in education because I’ve been

467
00:18:47,640 –> 00:18:52,799
educating myself as to how does the

468
00:18:50,400 –> 00:18:55,000
Venture Capital industry work and so I

469
00:18:52,799 –> 00:18:56,640
think I spent all of last year which you

470
00:18:55,000 –> 00:18:58,200
know some in some ways I’m like God I

471
00:18:56,640 –> 00:18:59,840
wasted the whole year trying to

472
00:18:58,200 –> 00:19:01,280
understand this world of investing and where

473
00:18:59,840 –> 00:19:03,440
does it work and where would I find the

474
00:19:01,280 –> 00:19:05,159
space for this work but part of it is

475
00:19:03,440 –> 00:19:07,360
been understanding that there’s a gap

476
00:19:05,159 –> 00:19:09,919
right now um you have long-term

477
00:19:07,360 –> 00:19:12,559
investment in R&D in AI happening in

478
00:19:09,919 –> 00:19:14,840
biotech and in deep Tech um the more

479
00:19:12,559 –> 00:19:18,360
creative incubation of products happens

480
00:19:14,840 –> 00:19:20,039
at Google and Microsoft um it happens in

481
00:19:18,360 –> 00:19:22,679
large technology companies that have the

482
00:19:20,039 –> 00:19:25,360
capacity to create those spaces or of

483
00:19:22,679 –> 00:19:28,000
course places like MIT media lab and so

484
00:19:25,360 –> 00:19:30,799
on but there isn’t really a space that

485
00:19:28,000 –> 00:19:33,240
says how do we do this in a way that is

486
00:19:30,799 –> 00:19:35,000
honoring the creative Spirit of the work

487
00:19:33,240 –> 00:19:36,440
that’s happening considering a long-term

488
00:19:35,000 –> 00:19:39,200
perspective but providing the support

489
00:19:36,440 –> 00:19:41,000
needed to um give the Technical

490
00:19:39,200 –> 00:19:42,960
Solutions to provide access to

491
00:19:41,000 –> 00:19:46,880
computational resources engineering

492
00:19:42,960 –> 00:19:49,000
talent and to build businesses so you

493
00:19:46,880 –> 00:19:50,840
know let’s see it’s an adventure

494
00:19:49,000 –> 00:19:52,679
it’s a journey but I do believe in it

495
00:19:50,840 –> 00:19:54,400
and I think that trying to do it in the

496
00:19:52,679 –> 00:19:56,840
right way has actually probably been

497
00:19:54,400 –> 00:19:58,360
better um because I think it allows us

498
00:19:56,840 –> 00:20:00,960
to honor the kind of work that’s coming

499
00:19:58,360 –> 00:20:04,640
to us and maybe create a space that

500
00:20:00,960 –> 00:20:07,600
ends up being part public part private

501
00:20:04,640 –> 00:20:10,880
um you know maybe more about um Venture

502
00:20:07,600 –> 00:20:12,360
philanthropy than venture capitalism but

503
00:20:10,880 –> 00:20:15,200
that’s something that hasn’t really been

504
00:20:12,360 –> 00:20:16,360
written before as a new language and you

505
00:20:15,200 –> 00:20:17,960
know those are the people you always

506
00:20:16,360 –> 00:20:21,159
want to be, the people who wrote a

507
00:20:17,960 –> 00:20:24,880
different language and maybe you know it

508
00:20:21,159 –> 00:20:27,799
and no one else does but as opposed to

509
00:20:24,880 –> 00:20:29,360
doing what’s already been done because

510
00:20:27,799 –> 00:20:30,720
sure you know the venture world has you

511
00:20:29,360 –> 00:20:33,200
know we have Google and Twitter and

512
00:20:30,720 –> 00:20:35,520
Facebook and whatever else um because of

513
00:20:33,200 –> 00:20:38,919
it um but there’s definitely other

514
00:20:35,520 –> 00:20:40,720
models you know the INDEX prize in

515
00:20:38,919 –> 00:20:42,960
Denmark was a great example of that they

516
00:20:40,720 –> 00:20:44,520
now have a tiny fund but they got a

517
00:20:42,960 –> 00:20:47,880
prize from the government every year and

518
00:20:44,520 –> 00:20:50,159
they funded um they gave the prize to

519
00:20:47,880 –> 00:20:52,000
Tesla to Duolingo to the iPhone but

520
00:20:50,159 –> 00:20:53,919
also to incredible Innovations and

521
00:20:52,000 –> 00:20:56,559
design that nobody would ever have heard

522
00:20:53,919 –> 00:20:58,400
of if there wasn’t a prize and so we

523
00:20:56,559 –> 00:21:00,559
have to find that balance particularly

524
00:20:58,400 –> 00:21:02,919
with Tech and particularly with AI

525
00:21:00,559 –> 00:21:04,559
because it’s a very brief moment in time

526
00:21:02,919 –> 00:21:06,120
in which we’re given the opportunity to

527
00:21:04,559 –> 00:21:08,840
shape these Technologies and they will

528
00:21:06,120 –> 00:21:11,200
have long-term impact and um if we’re

529
00:21:08,840 –> 00:21:14,320
not bringing in other voices from

530
00:21:11,200 –> 00:21:16,640
Philosophy from art from design um you

531
00:21:14,320 –> 00:21:18,760
know from outside of just computer

532
00:21:16,640 –> 00:21:21,159
science and engineering we’re not giving

533
00:21:18,760 –> 00:21:24,240
really ourselves a chance of Designing

534
00:21:21,159 –> 00:21:25,360
with whether it’s empathy or Humanity or

535
00:21:24,240 –> 00:21:27,640
you know all of the issues that you’re

536
00:21:25,360 –> 00:21:29,240
interested in yeah yeah you want to be

537
00:21:27,640 –> 00:21:31,919
addressing the whole population right

538
00:21:29,240 –> 00:21:34,840
because it’s so much more diverse than

539
00:21:31,919 –> 00:21:37,520
the bubble of the people who are

540
00:21:34,840 –> 00:21:42,200
designing at this at this moment as as

541
00:21:37,520 –> 00:21:46,039
you like you said also VCs tend to fund

542
00:21:42,200 –> 00:21:50,000
what they know or what they can

543
00:21:46,039 –> 00:21:53,400
quantify uh and art is so ephemeral it’s

544
00:21:50,000 –> 00:21:56,279
it’s just difficult to to calculate

545
00:21:53,400 –> 00:21:58,039
value but it’s definitely it’s needed

546
00:21:56,279 –> 00:21:59,080
because it’s created by People for

547
00:21:58,039 –> 00:22:02,000
People

548
00:21:59,080 –> 00:22:05,559
uh so there is a value in it it’s just

549
00:22:02,000 –> 00:22:09,039
how to t how to how to describe it how

550
00:22:05,559 –> 00:22:10,960
to sell it because you have to sell and

551
00:22:09,039 –> 00:22:14,760
you like you sell anything and

552
00:22:10,960 –> 00:22:17,440
everything um to the parties who can

553
00:22:14,760 –> 00:22:21,640
support you either with funding or with

554
00:22:17,440 –> 00:22:23,880
reach and I’m thinking yeah are you

555
00:22:21,640 –> 00:22:26,240
having conversations with government

556
00:22:23,880 –> 00:22:28,840
like are there any publicly

557
00:22:26,240 –> 00:22:31,760
funded grants or maybe there is

558
00:22:28,840 –> 00:22:34,279
some um I don’t know some kind of

559
00:22:31,760 –> 00:22:36,880
collaboration between you could merge

560
00:22:34,279 –> 00:22:40,760
something between like the commercial

561
00:22:36,880 –> 00:22:42,320
and nonprofit world yeah so if anyone is

562
00:22:40,760 –> 00:22:44,480
listening to this and wants to get

563
00:22:42,320 –> 00:22:46,000
involved I’d love to chat because I’m

564
00:22:44,480 –> 00:22:48,760
speaking to a couple of

565
00:22:46,000 –> 00:22:51,000
universities um and that’s where I see a

566
00:22:48,760 –> 00:22:53,600
space you know there is a space for R&D

567
00:22:51,000 –> 00:22:56,279
in AI and it is an academic endeavor

568
00:22:53,600 –> 00:22:58,080
but we have to honor the IP

569
00:22:56,279 –> 00:23:00,240
and the work and the energy of people

570
00:22:58,080 –> 00:23:01,799
who are whether they’re creative or Tech

571
00:23:00,240 –> 00:23:03,400
I think technologists are incredibly

572
00:23:01,799 –> 00:23:05,320
creative Engineers are creative it’s not

573
00:23:03,400 –> 00:23:07,200
to say that they’re not but whether it’s

574
00:23:05,320 –> 00:23:09,679
people come from from the more creative

575
00:23:07,200 –> 00:23:11,400
Industries or the tech industry um they

576
00:23:09,679 –> 00:23:13,120
can’t just be doing nice things for

577
00:23:11,400 –> 00:23:14,960
other people’s service and so where is

578
00:23:13,120 –> 00:23:16,520
the economic imperative that comes back

579
00:23:14,960 –> 00:23:18,440
to them that gives them back ownership

580
00:23:16,520 –> 00:23:20,080
of their work that’s a model I’m

581
00:23:18,440 –> 00:23:21,679
interested in creating so if there’s

582
00:23:20,080 –> 00:23:23,799
anybody out there who’s interested in

583
00:23:21,679 –> 00:23:25,360
that kind of space I’d love to speak to

584
00:23:23,799 –> 00:23:27,039
them and you know to your earlier point

585
00:23:25,360 –> 00:23:29,320
of what’s not quantifiable there’s an

586
00:23:27,039 –> 00:23:31,640
amazing book published last year by Ivy

587
00:23:29,320 –> 00:23:33,559
Ross who’s a head of design at Google

588
00:23:31,640 –> 00:23:35,520
with Susan Magsamen who’s at Johns

589
00:23:33,559 –> 00:23:37,200
Hopkins University she works in the

590
00:23:35,520 –> 00:23:39,080
Neuroscience department and they talk

591
00:23:37,200 –> 00:23:41,720
about what you know kind of that

592
00:23:39,080 –> 00:23:44,159
Confluence between art and feeling and

593
00:23:41,720 –> 00:23:45,840
how the brain is impacted by Art and one

594
00:23:44,159 –> 00:23:47,679
of the things that I took away from that

595
00:23:45,840 –> 00:23:48,919
that you know probably feels quite

596
00:23:47,679 –> 00:23:51,880
obvious but I hadn’t actually thought

597
00:23:48,919 –> 00:23:54,039
about before was how we measure feeling

598
00:23:51,880 –> 00:23:56,799
how do we measure pain you know if you

599
00:23:54,039 –> 00:23:58,159
or I were to uh get into a fight and

600
00:23:56,799 –> 00:24:00,240
like one of us scratch the other or

601
00:23:58,159 –> 00:24:04,279
whatever how would I feel pain versus

602
00:24:00,240 –> 00:24:06,320
you um it’s totally subjective and in

603
00:24:04,279 –> 00:24:09,320
this moment with technology we’re

604
00:24:06,320 –> 00:24:11,799
experiencing a very subjective world you

605
00:24:09,320 –> 00:24:14,400
know of course as you said these systems

606
00:24:11,799 –> 00:24:17,600
are designed by a very limited group of

607
00:24:14,400 –> 00:24:20,080
people um with you know very narrow data

608
00:24:17,600 –> 00:24:22,559
sets and outcomes uh and you know their

609
00:24:20,080 –> 00:24:24,279
own you know beliefs and value systems

610
00:24:22,559 –> 00:24:25,960
but even to say they I think does

611
00:24:24,279 –> 00:24:28,000
disservice to all the people who are

612
00:24:25,960 –> 00:24:30,080
sitting in different offices and have

613
00:24:28,000 –> 00:24:32,080
worked on you know building ImageNet or

614
00:24:30,080 –> 00:24:34,440
on computer vision or built a Transformer

615
00:24:32,080 –> 00:24:36,120
model it does them a disservice to

616
00:24:34,440 –> 00:24:37,640
remove the diversity of thinking and

617
00:24:36,120 –> 00:24:40,440
thought and experience that they bring

618
00:24:37,640 –> 00:24:43,399
to the table and so for me that kind of

619
00:24:40,440 –> 00:24:45,240
space of inclusivity and AI um there’s a

620
00:24:43,399 –> 00:24:46,760
lot of language around it as you know

621
00:24:45,240 –> 00:24:50,039
you know there’s governments talking

622
00:24:46,760 –> 00:24:53,200
about friendly AI responsible AI

623
00:24:50,039 –> 00:24:56,039
trustworthy Fair ethical we’re throwing

624
00:24:53,200 –> 00:24:58,559
around a lot of language um in mostly in

625
00:24:56,039 –> 00:25:01,080
English but like what does it mean to be

626
00:24:58,559 –> 00:25:03,720
inclusive and that’s a conversation that

627
00:25:01,080 –> 00:25:05,559
goes beyond what ChatGPT should build

628
00:25:03,720 –> 00:25:08,360
or what Gemini should look like it’s

629
00:25:05,559 –> 00:25:10,159
much deeper and we should hopefully come

630
00:25:08,360 –> 00:25:11,840
to a place where we’re not just naming a

631
00:25:10,159 –> 00:25:14,360
few big billionaires or

632
00:25:11,840 –> 00:25:17,039
multi-billionaires and big companies but

633
00:25:14,360 –> 00:25:18,960
we’re talking about the ideas what ideas

634
00:25:17,039 –> 00:25:21,720
are we building for and for whom and how

635
00:25:18,960 –> 00:25:23,559
and why what is inclusive is it to

636
00:25:21,720 –> 00:25:26,399
consider the so-called next billion

637
00:25:23,559 –> 00:25:29,360
users of the internet everybody’s online

638
00:25:26,399 –> 00:25:31,279
now you know to think about what’s on

639
00:25:29,360 –> 00:25:33,039
the edges you know the edges are

640
00:25:31,279 –> 00:25:35,559
indigenous language and culture which is

641
00:25:33,039 –> 00:25:37,399
being lost the edges are people who are

642
00:25:35,559 –> 00:25:39,960
differently abled who have special needs

643
00:25:37,399 –> 00:25:41,720
the edges are people who are extremely

644
00:25:39,960 –> 00:25:43,880
poor and didn’t have a home to go to

645
00:25:41,720 –> 00:25:45,320
during covid are we designing for them

646
00:25:43,880 –> 00:25:48,840
are we designing for what Google would

647
00:25:45,320 –> 00:25:50,600
not design for is that our tag phrase or

648
00:25:48,840 –> 00:25:52,520
are we designing for what works for our

649
00:25:50,600 –> 00:25:54,159
communities are we designing for our

650
00:25:52,520 –> 00:25:55,679
planet you know we can’t just solve

651
00:25:54,159 –> 00:25:58,000
everything by talking about what is

652
00:25:55,679 –> 00:26:00,799
inclusive in an instant and we can’t

653
00:25:58,000 –> 00:26:03,120
just be expecting that to be resolved um

654
00:26:00,799 –> 00:26:05,240
by large organizations that have a very

655
00:26:03,120 –> 00:26:07,720
a clear responsibility to their

656
00:26:05,240 –> 00:26:09,679
shareholders which is uh to make money

657
00:26:07,720 –> 00:26:12,600
and to make it in a very very rushed

658
00:26:09,679 –> 00:26:14,000
fashion in very short periods of time

659
00:26:12,600 –> 00:26:16,440
with what seems on the outside to be

660
00:26:14,000 –> 00:26:18,200
unlimited resources but when you’re on

661
00:26:16,440 –> 00:26:20,760
the inside you’re just part of this

662
00:26:18,200 –> 00:26:23,600
constant Panic around budget money and

663
00:26:20,760 –> 00:26:25,039
share prices and so I do think we live

664
00:26:23,600 –> 00:26:27,840
in a world of these multi-layered

665
00:26:25,039 –> 00:26:30,159
realities it is completely subjective

666
00:26:27,840 –> 00:26:31,960
and we have obviously more poverty and

667
00:26:30,159 –> 00:26:34,279
inequality than we ever had but we also

668
00:26:31,960 –> 00:26:36,720
have these incredible um ways of

669
00:26:34,279 –> 00:26:38,120
connecting and communicating and I think

670
00:26:36,720 –> 00:26:41,919
mostly people feel like they’re going

671
00:26:38,120 –> 00:26:45,200
crazy because there’s such a pressure to

672
00:26:41,919 –> 00:26:48,080
figure out how to keep up with progress

673
00:26:45,200 –> 00:26:49,760
um and also you know you sit and you you

674
00:26:48,080 –> 00:26:52,640
look at how these Technologies end up

675
00:26:49,760 –> 00:26:54,200
being so racist so mistaken the language

676
00:26:52,640 –> 00:26:55,520
is you know hallucination of course

677
00:26:54,200 –> 00:26:58,520
they’re not hallucinating they’re just

678
00:26:55,520 –> 00:27:01,000
wrong it’s broken or it’s overcorrected

679
00:26:58,520 –> 00:27:03,279
itself because it’s so worried but is it

680
00:27:01,000 –> 00:27:05,399
worried it’s a computer you know do we

681
00:27:03,279 –> 00:27:09,000
blame it do we sue it you know these are

682
00:27:05,399 –> 00:27:11,320
such questions it’s our own responsibility right yeah

683
00:27:09,000 –> 00:27:12,880
and I just think we’re not you know no

684
00:27:11,320 –> 00:27:14,760
one person can solve it no one

685
00:27:12,880 –> 00:27:16,600
conversation can come to an outcome and

686
00:27:14,760 –> 00:27:18,559
that’s why to come back to this

687
00:27:16,600 –> 00:27:20,120
incubator I’m just really interested in

688
00:27:18,559 –> 00:27:21,480
what creative people have to say what

689
00:27:20,120 –> 00:27:24,440
they care about what they do and how

690
00:27:21,480 –> 00:27:27,279
they build and to facilitate that yeah

691
00:27:24,440 –> 00:27:28,880
yeah I’m also worried about like you

692
00:27:27,279 –> 00:27:31,840
said like

693
00:27:28,880 –> 00:27:34,240
people who are designing or building

694
00:27:31,840 –> 00:27:38,240
things for the masses are usually

695
00:27:34,240 –> 00:27:41,039
the ones who have money are in power and

696
00:27:38,240 –> 00:27:45,559
those voices or those communities which

697
00:27:41,039 –> 00:27:48,320
are minorities or they are

698
00:27:45,559 –> 00:27:50,919
the special cases right they won’t

699
00:27:48,320 –> 00:27:53,760
be heard they won’t be designed for

700
00:27:50,919 –> 00:27:56,080
because they don’t bring profit like

701
00:27:53,760 –> 00:28:00,360
they don’t count you know in a grand

702
00:27:56,080 –> 00:28:04,000
sense of commerce like how would I say

703
00:28:00,360 –> 00:28:07,120
capitalism pure capitalism right so I

704
00:28:04,000 –> 00:28:11,640
wonder how can you solve it because

705
00:28:07,120 –> 00:28:13,760
companies don’t want to do nonprofit or

706
00:28:11,640 –> 00:28:15,960
Char they don’t they don’t want to work

707
00:28:13,760 –> 00:28:20,559
as charity unless it’s a tax write

708
00:28:15,960 –> 00:28:23,200
off um so who should fund it I

709
00:28:20,559 –> 00:28:25,640
think it’s a great question I think

710
00:28:23,200 –> 00:28:27,880
there is, we are at a crisis point right

711
00:28:25,640 –> 00:28:29,039
we talk about crisis and I hate it like

712
00:28:27,880 –> 00:28:30,799
I just you know of course you want to

713
00:28:29,039 –> 00:28:32,480
roll your eyes because everyone’s fine

714
00:28:30,799 –> 00:28:34,440
we’re all quite fancy we live in

715
00:28:32,480 –> 00:28:36,720
fancy cities and countries and I go to

716
00:28:34,440 –> 00:28:38,240
a lot of talks by like very smart people

717
00:28:36,720 –> 00:28:40,320
where they’re like I just want to tell

718
00:28:38,240 –> 00:28:42,039
you about the crisis that we live in

719
00:28:40,320 –> 00:28:44,000
like we’re not in a crisis people in

720
00:28:42,039 –> 00:28:45,760
Sudan are in a crisis right now people

721
00:28:44,000 –> 00:28:47,960
in Yemen are in a crisis right now

722
00:28:45,760 –> 00:28:50,440
people in Palestine are in a crisis

723
00:28:47,960 –> 00:28:52,320
right now we’re fine but there is a

724
00:28:50,440 –> 00:28:54,640
climate crisis that is happening that

725
00:28:52,320 –> 00:28:56,640
we’re part of that we are experiencing

726
00:28:54,640 –> 00:28:59,640
however we think about it and there is

727
00:28:56,640 –> 00:29:01,399
this kind of inequality shift which is

728
00:28:59,640 –> 00:29:03,679
probably a crisis you know although

729
00:29:01,399 –> 00:29:05,600
we’ve had inequality in the past and

730
00:29:03,679 –> 00:29:07,679
then at the core of it I think

731
00:29:05,600 –> 00:29:09,480
technology has led to its own crisis of

732
00:29:07,679 –> 00:29:10,840
isolation and loneliness it’s led to

733
00:29:09,480 –> 00:29:13,120
misinformation

734
00:29:10,840 –> 00:29:15,320
misunderstanding unkindness a lack of

735
00:29:13,120 –> 00:29:16,760
empathy I think if we added it up all

736
00:29:15,320 –> 00:29:18,320
over the world for all the billions of

737
00:29:16,760 –> 00:29:20,600
people who use the internet there’s

738
00:29:18,320 –> 00:29:22,640
probably a lot of hurt that has come

739
00:29:20,600 –> 00:29:25,600
with this experience of being connected

740
00:29:22,640 –> 00:29:27,519
ironically and so the reason I say all

741
00:29:25,600 –> 00:29:29,799
of that is I do think that technology

742
00:29:27,519 –> 00:29:31,279
the big Tech techology companies do feel

743
00:29:29,799 –> 00:29:32,919
in many ways that they’re losing a

744
00:29:31,279 –> 00:29:34,960
battle of thought leadership in the

745
00:29:32,919 –> 00:29:37,640
space of course they’re very focused on

746
00:29:34,960 –> 00:29:40,720
money and resources and as you say you

747
00:29:37,640 –> 00:29:43,000
know um you see them reorg their teams

748
00:29:40,720 –> 00:29:46,360
in order to make just better version of

749
00:29:43,000 –> 00:29:48,760
ChatGPT and from the outside losing

750
00:29:46,360 –> 00:29:50,440
track or losing perspective on the

751
00:29:48,760 –> 00:29:52,919
future of the world you know when I join

752
00:29:50,440 –> 00:29:54,880
Google that’s all at least that was my

753
00:29:52,919 –> 00:29:56,840
experience of it I just felt that all we

754
00:29:54,880 –> 00:29:59,519
did was think about amazing Futures I

755
00:29:56,840 –> 00:30:02,279
worked with people huh they

756
00:29:59,519 –> 00:30:05,120
changed things and also like where do we go next

757
00:30:02,279 –> 00:30:06,960
in hindsight I think about what were the

758
00:30:05,120 –> 00:30:08,720
questions we were asking ourselves I

759
00:30:06,960 –> 00:30:10,880
don’t remember asking myself any

760
00:30:08,720 –> 00:30:12,720
questions I learned to ask questions

761
00:30:10,880 –> 00:30:15,120
working as I was lucky to do with

762
00:30:12,720 –> 00:30:17,080
artists with museums with creatives uh

763
00:30:15,120 –> 00:30:19,640
with people in Emerging Markets with

764
00:30:17,080 –> 00:30:22,399
companies with Partners I was educated

765
00:30:19,640 –> 00:30:25,039
by Outsiders and then eventually as a

766
00:30:22,399 –> 00:30:28,039
teacher um Academia and critical

767
00:30:25,039 –> 00:30:30,279
thinking and Frameworks so I think the

768
00:30:28,039 –> 00:30:32,039
only solution is that you need more

769
00:30:30,279 –> 00:30:33,440
time you need to pause you need to think

770
00:30:32,039 –> 00:30:35,600
you need to breathe you need to not

771
00:30:33,440 –> 00:30:37,919
panic because you can’t just solve

772
00:30:35,600 –> 00:30:41,159
ethics by having an Ethics Committee

773
00:30:37,919 –> 00:30:42,720
having said that I think big technology

774
00:30:41,159 –> 00:30:44,760
companies are coming to a point where

775
00:30:42,720 –> 00:30:46,360
they’re going to have to question what

776
00:30:44,760 –> 00:30:50,200
does it mean to have a space of thought

777
00:30:46,360 –> 00:30:52,240
leadership because they are set up with

778
00:30:50,200 –> 00:30:54,399
value systems whether it was don’t be

779
00:30:52,240 –> 00:30:56,960
evil whether it is about Innovation

780
00:30:54,399 –> 00:31:00,360
experience it’s not ever been just about

781
00:30:56,960 –> 00:31:02,120
money and that’s a fact and is that all

782
00:31:00,360 –> 00:31:03,919
that it comes down to and is that what

783
00:31:02,120 –> 00:31:06,600
makes your life worth living or your

784
00:31:03,919 –> 00:31:10,000
company worth existing in a few years

785
00:31:06,600 –> 00:31:12,000
probably not so I think that of course

786
00:31:10,000 –> 00:31:13,679
money will not necessarily come from the

787
00:31:12,000 –> 00:31:15,880
company’s profiting the most off of this

788
00:31:13,679 –> 00:31:17,240
moment or the individuals doing so but

789
00:31:15,880 –> 00:31:19,279
there is a sense of collective

790
00:31:17,240 –> 00:31:22,320
responsibility to avert the crisis that

791
00:31:19,279 –> 00:31:25,080
we face and for that I think you’ll see

792
00:31:22,320 –> 00:31:27,639
probably more thoughtful engagement with

793
00:31:25,080 –> 00:31:30,360
new companies that come out of the blue

794
00:31:27,639 –> 00:31:32,760
with new research um with government

795
00:31:30,360 –> 00:31:35,080
policy as it’s figured out along the way

796
00:31:32,760 –> 00:31:37,559
and really fostering more thoughtful

797
00:31:35,080 –> 00:31:40,480
Innovation and that probably needs

798
00:31:37,559 –> 00:31:42,960
slower Innovation more time more space

799
00:31:40,480 –> 00:31:45,679
more energy given to just letting people

800
00:31:42,960 –> 00:31:47,159
have ideas and not panicking probably

801
00:31:45,679 –> 00:31:48,480
like they do in biotech where I know

802
00:31:47,159 –> 00:31:51,039
they give themselves a lot more time

803
00:31:48,480 –> 00:31:53,360
than they do in you know I think it’s it

804
00:31:51,039 –> 00:31:56,760
got like it’s sped

805
00:31:53,360 –> 00:31:58,200
up with all the yeah you would know but

806
00:31:56,760 –> 00:32:00,120
the other thing of course is

807
00:31:58,200 –> 00:32:01,840
reallocating resources who are you

808
00:32:00,120 –> 00:32:03,679
giving your resources to and who are you

809
00:32:01,840 –> 00:32:06,519
bringing to the table if you have an

810
00:32:03,679 –> 00:32:08,679
Ethics lab or an empathy lab who are you

811
00:32:06,519 –> 00:32:11,039
inviting and who’s deciding and really

812
00:32:08,679 –> 00:32:13,080
reconsidering in a moment of possible

813
00:32:11,039 –> 00:32:16,080
humility the fact that you haven’t got

814
00:32:13,080 –> 00:32:18,360
it right and um and that only comes when

815
00:32:16,080 –> 00:32:19,919
your products start failing when the

816
00:32:18,360 –> 00:32:21,360
thing you’re meant to do stops working

817
00:32:19,919 –> 00:32:22,880
you have to question yourself and if

818
00:32:21,360 –> 00:32:24,880
there can be a space for introspection

819
00:32:22,880 –> 00:32:27,559
there I think that’s very powerful the

820
00:32:24,880 –> 00:32:28,960
final layers are civil society and

821
00:32:27,559 –> 00:32:31,000
government

822
00:32:28,960 –> 00:32:33,120
um I do think you know at the end of the

823
00:32:31,000 –> 00:32:36,799
day AI regulation is going to come too

824
00:32:33,120 –> 00:32:39,279
late for AI but as regulation it’s going

825
00:32:36,799 –> 00:32:42,679
to make us consider what are we building

826
00:32:39,279 –> 00:32:45,679
for if AI represents a fundamental shift

827
00:32:42,679 –> 00:32:47,880
a paradigm shift in you know how we

828
00:32:45,679 –> 00:32:50,039
connect what is human intelligence

829
00:32:47,880 –> 00:32:51,919
versus machine intelligence if we are

830
00:32:50,039 –> 00:32:54,080
spending the time right now to consider

831
00:32:51,919 –> 00:32:56,679
the Futures we want to have with

832
00:32:54,080 –> 00:32:59,840
machines with machines and humans

833
00:32:56,679 –> 00:32:59,840
collaborating with

834
00:33:58,080 –> 00:34:01,240
you know I think 40% of the world’s

835
00:33:59,600 –> 00:34:04,320
young population will be in sub-Saharan

836
00:34:01,240 –> 00:34:07,679
Africa in a few years to India to East

837
00:34:04,320 –> 00:34:09,720
Asia and with that shift that might not

838
00:34:07,679 –> 00:34:12,200
necessarily be a crisis but it is a

839
00:34:09,720 –> 00:34:14,599
crisis for the Western world as we’ve

840
00:34:12,200 –> 00:34:17,119
known it in many ways and with that I

841
00:34:14,599 –> 00:34:19,359
hope for technology companies comes a

842
00:34:17,119 –> 00:34:22,159
moment of Reckoning and thinking about

843
00:34:19,359 –> 00:34:23,839
who are you building for and why are you

844
00:34:22,159 –> 00:34:26,760
building the products that you build if

845
00:34:23,839 –> 00:34:28,800
it’s not just to get a better version of

846
00:34:26,760 –> 00:34:31,520
what you doing versus everyone else it

847
00:34:28,800 –> 00:34:33,839
just feels at this moment I

848
00:34:31,520 –> 00:34:37,000
would love to experience what you are

849
00:34:33,839 –> 00:34:41,480
saying that you know um companies and

850
00:34:37,000 –> 00:34:46,480
government should make mental space for

851
00:34:41,480 –> 00:34:48,800
for people to think of like

852
00:34:46,480 –> 00:34:50,879
actually understanding whom should

853
00:34:48,800 –> 00:34:54,639
we design for and how should we design

854
00:34:50,879 –> 00:34:59,200
it to make it a long-lasting and

855
00:34:54,639 –> 00:35:02,200
like a beneficial um uh component but it

856
00:34:59,200 –> 00:35:06,800
feels like we are in a constant Rush

857
00:35:02,200 –> 00:35:11,480
right now even exponential Rush between

858
00:35:06,800 –> 00:35:14,599
Western or like US UK or Europe and

859
00:35:11,480 –> 00:35:17,599
Asia mainly China uh in terms of the

860
00:35:14,599 –> 00:35:20,960
whole design and development around

861
00:35:17,599 –> 00:35:24,920
AI and as much as I would love to

862
00:35:20,960 –> 00:35:27,839
there is not so much um news there is

863
00:35:24,920 –> 00:35:30,720
not so much at least I don’t hear it

864
00:35:27,839 –> 00:35:34,079
about uh Development coming from Africa

865
00:35:30,720 –> 00:35:36,560
for example it feels like they are

866
00:35:34,079 –> 00:35:39,599
left to

867
00:35:36,560 –> 00:35:41,320
themselves I think that um I agree you

868
00:35:39,599 –> 00:35:43,240
know I think it’s I mean this is like

869
00:35:41,320 –> 00:35:44,920
now like very much part of this rhetoric

870
00:35:43,240 –> 00:35:46,960
of like where is China why didn’t we

871
00:35:44,920 –> 00:35:50,040
invite them to this conversation you

872
00:35:46,960 –> 00:35:52,359
know why are we constantly like ignoring

873
00:35:50,040 –> 00:35:54,400
um large countries that

874
00:35:52,359 –> 00:35:57,040
don’t suit us and I think the core of

875
00:35:54,400 –> 00:35:59,359
this becomes um it’s not just about tech

876
00:35:57,040 –> 00:36:01,720
right it’s about geopolitics um

877
00:35:59,359 –> 00:36:03,800
technology is built on physical land you

878
00:36:01,720 –> 00:36:06,960
have data centers you have fiber optic

879
00:36:03,800 –> 00:36:09,599
cables you have resource extraction um

880
00:36:06,960 –> 00:36:11,160
you know a silicon chip uh a GPU chip

881
00:36:09,599 –> 00:36:13,040
requires rare earths and you know

882
00:36:11,160 –> 00:36:16,640
whatever else like and then of course

883
00:36:13,040 –> 00:36:19,160
you have um uh all of the politics of

884
00:36:16,640 –> 00:36:20,480
power which come with uh you know AI

885
00:36:19,160 –> 00:36:21,960
politics of power there’s a really good

886
00:36:20,480 –> 00:36:23,720
book on it I’ll remember the name and

887
00:36:21,960 –> 00:36:27,920
mention it later that was I recently

888
00:36:23,720 –> 00:36:30,119
read um so it’s not just about how good

889
00:36:27,920 –> 00:36:32,839
is the Internet it’s really about who’s

890
00:36:30,119 –> 00:36:34,480
building and designing and for whom and

891
00:36:32,839 –> 00:36:35,760
um of course we can blame

892
00:36:34,480 –> 00:36:37,119
ourselves you know if you come from the

893
00:36:35,760 –> 00:36:38,480
global South you can say well we should

894
00:36:37,119 –> 00:36:39,880
have had our act together but on the

895
00:36:38,480 –> 00:36:42,760
other hand you can think about leaning

896
00:36:39,880 –> 00:36:44,880
in and I think that’s a really good

897
00:36:42,760 –> 00:36:47,440
moment of reckoning for what comes next

898
00:36:44,880 –> 00:36:49,440
I met in um Davos this year the new

899
00:36:47,440 –> 00:36:51,359
minister of arts and culture from

900
00:36:49,440 –> 00:36:53,359
Nigeria and she’s an incredible very

901
00:36:51,359 –> 00:36:55,560
powerful woman and she’s working with

902
00:36:53,359 –> 00:36:57,920
the minister of Technology on really

903
00:36:55,560 –> 00:37:00,640
thinking about how best to merge

904
00:36:57,920 –> 00:37:04,079
the two industries and um you know there

905
00:37:00,640 –> 00:37:08,280
was somebody um in this lunch

906
00:37:04,079 –> 00:37:09,760
audience telling her about which

907
00:37:08,280 –> 00:37:11,640
various tech companies could help her in

908
00:37:09,760 –> 00:37:13,079
her work and help her to digitize her

909
00:37:11,640 –> 00:37:16,319
collection and whatever and whatever and

910
00:37:13,079 –> 00:37:18,599
she was really polite and I just said

911
00:37:16,319 –> 00:37:20,440
she’s going to be helping them what do

912
00:37:18,599 –> 00:37:23,560
we need to how do we create more

913
00:37:20,440 –> 00:37:26,280
inclusive Tech how do we include other

914
00:37:23,560 –> 00:37:30,000
cultures in AI how do we build better

915
00:37:26,280 –> 00:37:32,000
data sets and um understand meaning in

916
00:37:30,000 –> 00:37:33,599
ways that go beyond our narrow

917
00:37:32,000 –> 00:37:35,599
context is to work with people like

918
00:37:33,599 –> 00:37:37,720
herself who are looking after the

919
00:37:35,599 –> 00:37:40,839
culture of one of the world’s biggest

920
00:37:37,720 –> 00:37:42,400
countries and you know obviously those

921
00:37:40,839 –> 00:37:43,839
conversations you know the more you say

922
00:37:42,400 –> 00:37:46,920
that the more people get annoyed with

923
00:37:43,839 –> 00:37:49,839
you but it’s true we need her help and

924
00:37:46,920 –> 00:37:52,040
so those whom we look to now to advise

925
00:37:49,839 –> 00:37:53,560
us to design our future will have to

926
00:37:52,040 –> 00:37:55,440
come from other places and they should

927
00:37:53,560 –> 00:37:57,240
be feeling the confidence to do that

928
00:37:55,440 –> 00:37:59,520
themselves of course she you know she’s

929
00:37:57,240 –> 00:38:01,160
educating herself she’s trying to

930
00:37:59,520 –> 00:38:02,839
understand the space and then another

931
00:38:01,160 –> 00:38:05,280
issue with AI is that people get so

932
00:38:02,839 –> 00:38:06,640
intimidated they’re so nervous you know

933
00:38:05,280 –> 00:38:08,920
I met an investor the other day he’s

934
00:38:06,640 –> 00:38:10,880
invested in two big AI funds one’s a

935
00:38:08,920 –> 00:38:12,599
fund and one’s an amazing

936
00:38:10,880 –> 00:38:13,760
neuroscience startup and he didn’t understand

937
00:38:12,599 –> 00:38:15,119
the technology and how it was being

938
00:38:13,760 –> 00:38:17,200
built and what was happening and I said

939
00:38:15,119 –> 00:38:18,520
well ask him to share the research with

940
00:38:17,200 –> 00:38:20,400
you and walk you through it and he said

941
00:38:18,520 –> 00:38:22,160
I don’t know if I’ll understand it they

942
00:38:20,400 –> 00:38:24,319
said the generative model works and it’s

943
00:38:22,160 –> 00:38:26,440
better than the other one you know that

944
00:38:24,319 –> 00:38:28,520
wouldn’t happen in any other industry

945
00:38:26,440 –> 00:38:29,839
you would want to understand what’s

946
00:38:28,520 –> 00:38:31,880
happening and of course we don’t know

947
00:38:29,839 –> 00:38:33,280
what happens uh you know once it goes

948
00:38:31,880 –> 00:38:35,640
into the black box of the computer

949
00:38:33,280 –> 00:38:37,440
speaking to itself but if the person

950
00:38:35,640 –> 00:38:40,200
who’s built it can’t admit that and say

951
00:38:37,440 –> 00:38:41,520
that then we have an issue so to come

952
00:38:40,200 –> 00:38:43,640
back to the demographics and the

953
00:38:41,520 –> 00:38:45,400
geopolitics I think it’s really terrible

954
00:38:43,640 –> 00:38:47,440
that we don’t know what’s happening in

955
00:38:45,400 –> 00:38:50,160
China I think it’s you know we should be

956
00:38:47,440 –> 00:38:51,680
having a lot more of a conversation on

957
00:38:50,160 –> 00:38:54,160
um you know what does it mean that

958
00:38:51,680 –> 00:38:56,720
America has blocked out uh you know if

959
00:38:54,160 –> 00:38:58,520
this is the case um any kind of supply

960
00:38:56,720 –> 00:39:01,760
of microchips to China so they can’t

961
00:38:58,520 –> 00:39:04,280
develop um you know generative AI technology they

962
00:39:01,760 –> 00:39:06,359
don’t have access to Taiwanese chips um

963
00:39:04,280 –> 00:39:09,119
other countries are developing it you

964
00:39:06,359 –> 00:39:10,640
know it’s just so loaded right now I

965
00:39:09,119 –> 00:39:12,359
mean Taiwan is pretty much the only

966
00:39:10,640 –> 00:39:14,480
country in the world that’s developing

967
00:39:12,359 –> 00:39:15,960
these microchips and other countries and

968
00:39:14,480 –> 00:39:18,440
companies are working on it but if it’s

969
00:39:15,960 –> 00:39:22,160
based off finite resources we should

970
00:39:18,440 –> 00:39:24,240
understand um what the alternatives are

971
00:39:22,160 –> 00:39:26,839
and how this might play into you know

972
00:39:24,240 –> 00:39:28,640
global politics in the future and also

973
00:39:26,839 –> 00:39:30,920
how this empowers or disempowers

974
00:39:28,640 –> 00:39:33,040
populations of people who should be

975
00:39:30,920 –> 00:39:35,160
benefiting from technology and who

976
00:39:33,040 –> 00:39:37,319
should be gaining from the amazing

977
00:39:35,160 –> 00:39:41,040
opportunities with you know in this case

978
00:39:37,319 –> 00:39:43,880
AI specifically on health on science on

979
00:39:41,040 –> 00:39:46,520
um we were talking about quantum physics

980
00:39:43,880 –> 00:39:48,640
um on of course creativity on human

981
00:39:46,520 –> 00:39:50,839
connection that’s where we have a a

982
00:39:48,640 –> 00:39:54,560
wonderful we’ve seen already from

983
00:39:50,839 –> 00:39:57,400
AlphaFold to AlphaGo these very you know

984
00:39:54,560 –> 00:39:59,680
breakthrough moments of human

985
00:39:57,400 –> 00:40:01,800
innovation that have happened as a result

986
00:39:59,680 –> 00:40:03,200
of the support of these technologies and

987
00:40:01,800 –> 00:40:06,119
that’s where we should be channeling

988
00:40:03,200 –> 00:40:08,400
them and for me there’s such a gap in

989
00:40:06,119 –> 00:40:10,240
maybe what the media presents but also

990
00:40:08,400 –> 00:40:12,280
us really saying if we empowered every

991
00:40:10,240 –> 00:40:15,160
culture if we thought about Confucian

992
00:40:12,280 –> 00:40:17,040
values I don’t know or you know the fact

993
00:40:15,160 –> 00:40:18,880
that there’s two billion Muslims in the

994
00:40:17,040 –> 00:40:21,160
world and what are their values how do

995
00:40:18,880 –> 00:40:23,319
we believe in you know what they care

996
00:40:21,160 –> 00:40:24,880
about maybe that doesn’t always align

997
00:40:23,319 –> 00:40:26,359
with how an engineer thinks about an

998
00:40:24,880 –> 00:40:28,440
algorithm but that’s where I think

999
00:40:26,359 –> 00:40:30,680
governments should be playing a role of

1000
00:40:28,440 –> 00:40:32,760
thought leadership and about like bridge

1001
00:40:30,680 –> 00:40:35,359
building rather than like yes exactly

1002
00:40:32,760 –> 00:40:36,720
finding the common denominator yeah

1003
00:40:35,359 –> 00:40:39,440
being like my AI will be better than

1004
00:40:36,720 –> 00:40:41,119
yours like actually it won’t because

1005
00:40:39,440 –> 00:40:43,880
making more of the same right now is

1006
00:40:41,119 –> 00:40:45,880
destructive it will be evil um it will

1007
00:40:43,880 –> 00:40:47,880
pull out the worst of bias the worst of

1008
00:40:45,880 –> 00:40:51,119
racism there’s great books by Kate

1009
00:40:47,880 –> 00:40:53,720
Crawford and Meredith Broussard on this um

1010
00:40:51,119 –> 00:40:56,240
and you know how do we invest in um you

1011
00:40:53,720 –> 00:40:59,680
know opening ourselves up and every day

1012
00:40:56,240 –> 00:41:02,560
somebody sends me an amazing indigenous

1013
00:40:59,680 –> 00:41:04,760
uh language app or a new fund that’s

1014
00:41:02,560 –> 00:41:08,000
looking at thinking about how AI can be

1015
00:41:04,760 –> 00:41:10,319
applied to protecting indigenous culture

1016
00:41:08,000 –> 00:41:12,280
or to channeling ideas from indigenous

1017
00:41:10,319 –> 00:41:14,839
communities Refik Anadol the artist is

1018
00:41:12,280 –> 00:41:17,160
doing a project in the Amazon that’s so

1019
00:41:14,839 –> 00:41:19,000
amazing and meaningful but like it has

1020
00:41:17,160 –> 00:41:22,079
to come from the ground up it should be

1021
00:41:19,000 –> 00:41:24,760
coming out of Africa out of the Amazon

1022
00:41:22,079 –> 00:41:27,119
out of India out of tribes in

1023
00:41:24,760 –> 00:41:28,880
Nagaland without polluting their way of

1024
00:41:27,119 –> 00:41:30,400
life and it doesn’t mean I have an

1025
00:41:28,880 –> 00:41:31,560
answer for it but I think the way we

1026
00:41:30,400 –> 00:41:33,319
flip

1027
00:41:31,560 –> 00:41:35,200
perspective is something I don’t know

1028
00:41:33,319 –> 00:41:36,720
how to do I sit here giving you my

1029
00:41:35,200 –> 00:41:38,280
opinion as somebody who’s I told you

1030
00:41:36,720 –> 00:41:40,920
I’ve worked in America I’ve studied

1031
00:41:38,280 –> 00:41:43,480
there I have a very particular opinion

1032
00:41:40,920 –> 00:41:44,839
and if it hadn’t been for COVID and getting

1033
00:41:43,480 –> 00:41:46,800
kind of lost in the mountains of

1034
00:41:44,839 –> 00:41:49,280
Pakistan and you know working with my

1035
00:41:46,800 –> 00:41:51,400
students who are so traumatized after

1036
00:41:49,280 –> 00:41:53,560
two years of being locked up I don’t

1037
00:41:51,400 –> 00:41:55,800
think it would have actually triggered

1038
00:41:53,560 –> 00:41:58,359
in me the same kind of feeling of it’s

1039
00:41:55,800 –> 00:42:00,240
not about being mad at you know whoever

1040
00:41:58,359 –> 00:42:02,880
built where we’re at now because we

1041
00:42:00,240 –> 00:42:04,720
live in an amazing world but it’s about

1042
00:42:02,880 –> 00:42:08,160
recognizing the dangers of the moment in

1043
00:42:04,720 –> 00:42:09,839
which we find ourselves recognizing the

1044
00:42:08,160 –> 00:42:12,480
uncertainty which we have for the future

1045
00:42:09,839 –> 00:42:14,800
of our climate and also how we have

1046
00:42:12,480 –> 00:42:17,000
excluded most of the world’s population

1047
00:42:14,800 –> 00:42:18,599
from progress and how these Technologies

1048
00:42:17,000 –> 00:42:20,559
are actually only going to accelerate

1049
00:42:18,599 –> 00:42:22,800
that exclusion and what are we even

1050
00:42:20,559 –> 00:42:25,480
doing about it I don’t see

1051
00:42:22,800 –> 00:42:27,920
that future um right now and I don’t see

1052
00:42:25,480 –> 00:42:30,760
what governments are doing about it

1053
00:42:27,920 –> 00:42:33,640
yeah yeah it’s it’s a it’s a highlight

1054
00:42:30,760 –> 00:42:37,200
of our conversation I think it’s yeah

1055
00:42:33,640 –> 00:42:39,599
like I really don’t it just feels like

1056
00:42:37,200 –> 00:42:45,599
voices like yours are

1057
00:42:39,599 –> 00:42:49,000
so distributed but like

1058
00:42:45,599 –> 00:42:51,440
together they would be much stronger

1059
00:42:49,000 –> 00:42:54,240
they they could be heard right now you

1060
00:42:51,440 –> 00:42:57,119
can hear those voices every now and then

1061
00:42:54,240 –> 00:43:01,599
and in the end like they are so how to

1062
00:42:57,119 –> 00:43:03,240
say like diminishing with time and

1063
00:43:01,599 –> 00:43:07,760
and with so

1064
00:43:03,240 –> 00:43:10,839
much like you said so much anxiety

1065
00:43:07,760 –> 00:43:15,240
about what media is presenting what what

1066
00:43:10,839 –> 00:43:19,200
technology is um providing and people

1067
00:43:15,240 –> 00:43:21,680
feel trapped people feel lost in the

1068
00:43:19,200 –> 00:43:24,119
progress in a way right or it seems very

1069
00:43:21,680 –> 00:43:26,160
esoteric you know cuz I do a bit of

1070
00:43:24,119 –> 00:43:27,880
advisory work now and you know people

1071
00:43:26,160 –> 00:43:30,040
want to know how to get from A to B like

1072
00:43:27,880 –> 00:43:33,319
what tools should I use to you know how

1073
00:43:30,040 –> 00:43:35,680
do I solve it and I find that you know

1074
00:43:33,319 –> 00:43:39,359
there is a moment where we have to

1075
00:43:35,680 –> 00:43:41,440
be brave and tell the truth to

1076
00:43:39,359 –> 00:43:42,640
ourselves and that’s you know so if

1077
00:43:41,440 –> 00:43:44,280
there’s people out there who want to

1078
00:43:42,640 –> 00:43:46,599
come and be in a voice I’d love to hear

1079
00:43:44,280 –> 00:43:49,599
from them but also I’ve started to do as

1080
00:43:46,599 –> 00:43:53,559
of last month um as part of our work

1081
00:43:49,599 –> 00:43:55,640
with open-ended um gathering women in AI

1082
00:43:53,559 –> 00:43:58,240
um you know based in London to come in

1083
00:43:55,640 –> 00:44:00,559
to meet and of course happy to find them

1084
00:43:58,240 –> 00:44:03,520
online and it ended up being women

1085
00:44:00,559 –> 00:44:04,920
because we ended up having women but

1086
00:44:03,520 –> 00:44:06,559
it’s not you know it’s creative

1087
00:44:04,920 –> 00:44:08,839
technologists it’s natural language

1088
00:44:06,559 –> 00:44:11,160
processing researchers it’s professors

1089
00:44:08,839 –> 00:44:14,319
it’s engineers it’s executives from

1090
00:44:11,160 –> 00:44:16,280
technology companies who share an

1091
00:44:14,319 –> 00:44:19,640
interest but also a knowledge of what’s

1092
00:44:16,280 –> 00:44:23,440
happening in common and um are mostly

1093
00:44:19,640 –> 00:44:25,760
working on incredible um actual work and

1094
00:44:23,440 –> 00:44:29,119
millions of side projects and so that’s

1095
00:44:25,760 –> 00:44:29,119
something that I would welcome

1096
00:45:27,440 –> 00:45:32,200
um I’m doing a bunch of work with

1097
00:45:30,359 –> 00:45:34,640
a friend of mine who’s a philosopher

1098
00:45:32,200 –> 00:45:36,480
based in California his name is Tobias

1099
00:45:34,640 –> 00:45:40,359
Rees you should look him up he’s worked a lot

1100
00:45:36,480 –> 00:45:42,839
with artists and um he is a very

1101
00:45:40,359 –> 00:45:44,480
important philosopher but he comes from

1102
00:45:42,839 –> 00:45:47,640
the Western tradition of what is

1103
00:45:44,480 –> 00:45:50,559
philosophy and I’m not a philosopher but

1104
00:45:47,640 –> 00:45:53,960
we have so much of this kind of uh

1105
00:45:50,559 –> 00:45:56,480
friction around what is philosophy who

1106
00:45:53,960 –> 00:45:59,240
decides you know the idea of the human

1107
00:45:56,480 –> 00:46:01,079
actually or like Consciousness per se

1108
00:45:59,240 –> 00:46:02,800
was actually I think it’s Descartes

1109
00:46:01,079 –> 00:46:04,880
it’s not that old in like Western

1110
00:46:02,800 –> 00:46:07,079
philosophy but of course in eastern

1111
00:46:04,880 –> 00:46:10,119
philosophy the self the heart the whole

1112
00:46:07,079 –> 00:46:12,119
you know Consciousness there you know uh

1113
00:46:10,119 –> 00:46:14,920
we kind of passed through so many layers

1114
00:46:12,119 –> 00:46:17,359
of uh centuries and and thousands of

1115
00:46:14,920 –> 00:46:18,760
years to to come to a point of defining

1116
00:46:17,359 –> 00:46:21,680
even what Consciousness could be in the

1117
00:46:18,760 –> 00:46:23,960
Vedas for example um and then you have

1118
00:46:21,680 –> 00:46:26,440
science you know quantum entanglement

1119
00:46:23,960 –> 00:46:28,680
Quantum Computing which comes next that

1120
00:46:26,440 –> 00:46:30,559
will bring us to new ideas of what the

1121
00:46:28,680 –> 00:46:33,240
self really means and they hate the word

1122
00:46:30,559 –> 00:46:35,960
Consciousness they talk about reality

1123
00:46:33,240 –> 00:46:37,800
and so I’m not a philosopher at all but

1124
00:46:35,960 –> 00:46:40,960
I do know that this is a moment where we

1125
00:46:37,800 –> 00:46:44,000
have tools to allow us to consider where

1126
00:46:40,960 –> 00:46:47,359
we stand and you know we have let’s say

1127
00:46:44,000 –> 00:46:49,160
one life how do we leverage all of these

1128
00:46:47,359 –> 00:46:51,559
resources to create a better one for

1129
00:46:49,160 –> 00:46:53,000
ourselves or for others and so if

1130
00:46:51,559 –> 00:46:55,440
anything in beginning that questioning

1131
00:46:53,000 –> 00:46:58,040
of ourselves and stepping outside of

1132
00:46:55,440 –> 00:47:00,319
worries about efficiency or automation

1133
00:46:58,040 –> 00:47:02,440
or uh progress as defined in the

1134
00:47:00,319 –> 00:47:04,480
industrial revolution and just maybe

1135
00:47:02,440 –> 00:47:07,440
being a bit comfortable with

1136
00:47:04,480 –> 00:47:09,440
considering who am I it’s not very

1137
00:47:07,440 –> 00:47:12,280
esoteric you know everybody in tech from

1138
00:47:09,440 –> 00:47:14,240
Steve Jobs to um you know basically

1139
00:47:12,280 –> 00:47:18,160
everybody has had some kind of spiritual

1140
00:47:14,240 –> 00:47:21,119
guru uh Ram Dass was um the inspiration for

1141
00:47:18,160 –> 00:47:23,440
Google’s empathy lab uh and so on and so

1142
00:47:21,119 –> 00:47:27,200
forth you know California is the home of

1143
00:47:23,440 –> 00:47:29,400
Big Sur and uh Esalen so these are not like

1144
00:47:27,200 –> 00:47:31,200
conversations necessarily to have but

1145
00:47:29,400 –> 00:47:33,960
they’re not efficient conversations to

1146
00:47:31,200 –> 00:47:36,160
have because they waste time and right

1147
00:47:33,960 –> 00:47:37,720
now actually the more time we think

1148
00:47:36,160 –> 00:47:39,640
we’re wasting the less time we probably

1149
00:47:37,720 –> 00:47:42,599
have to come up with solutions because

1150
00:47:39,640 –> 00:47:45,319
if we are to understand the science of

1151
00:47:42,599 –> 00:47:47,559
climate change which again I’m not an

1152
00:47:45,319 –> 00:47:50,000
expert we have little time left to

1153
00:47:47,559 –> 00:47:51,440
resolve where we sit right now and I’m

1154
00:47:50,000 –> 00:47:53,400
really interested in the concept of

1155
00:47:51,440 –> 00:47:55,880
Cathedral thinking which is really like

1156
00:47:53,400 –> 00:47:57,559
pushing a thousand years into the future

1157
00:47:55,880 –> 00:48:00,119
where could we be

1158
00:47:57,559 –> 00:48:02,440
Mm and really how do you define

1159
00:48:00,119 –> 00:48:05,400
Cathedral thinking so Cathedral thinking

1160
00:48:02,440 –> 00:48:07,359
is actually an architectural term

1161
00:48:05,400 –> 00:48:08,680
based around Cathedrals like actual

1162
00:48:07,359 –> 00:48:12,359
cathedrals which were built over the last

1163
00:48:08,680 –> 00:48:14,040
4,000 years in whatever medieval Europe um

1164
00:48:12,359 –> 00:48:15,920
but a lot of designers and Architects

1165
00:48:14,040 –> 00:48:19,880
have considered it there’s a library in

1166
00:48:15,920 –> 00:48:22,920
Norway um which has got um a lot of

1167
00:48:19,880 –> 00:48:24,079
famous authors to write books um that

1168
00:48:22,920 –> 00:48:27,079
will only

1169
00:48:24,079 –> 00:48:28,800
be I guess unraveled in 100 years I

1170
00:48:27,079 –> 00:48:30,400
think Margaret Atwood is one of them

1171
00:48:28,800 –> 00:48:33,000
Elif Shafak and there’s something about

1172
00:48:30,400 –> 00:48:37,480
the paper like the tree also for the

1173
00:48:33,000 –> 00:48:39,000
paper is going to show up in 100 years um

1174
00:48:37,480 –> 00:48:41,920
it’s kind of an Indulgence that you can

1175
00:48:39,000 –> 00:48:43,000
probably do in Scandinavia and you don’t

1176
00:48:41,920 –> 00:48:46,400
have time to do that if you’re

1177
00:48:43,000 –> 00:48:48,640
deforesting mountains in uh India or

1178
00:48:46,400 –> 00:48:51,200
whatever else but the idea of like being

1179
00:48:48,640 –> 00:48:53,520
able to take the luxury of extending

1180
00:48:51,200 –> 00:48:55,839
time forward and to look at what happens

1181
00:48:53,520 –> 00:48:58,839
with glaciers as they explode in the

1182
00:48:55,839 –> 00:48:58,839
Himalayas

1183
00:49:00,400 –> 00:49:05,440
this is where your airports are usually

1184
00:49:02,200 –> 00:49:07,040
better um all right but yeah glacial

1185
00:49:05,440 –> 00:49:09,000
melts you know what happens with the

1186
00:49:07,040 –> 00:49:11,319
future of mountains how do we consider

1187
00:49:09,000 –> 00:49:14,960
we don’t have a thousand years for that

1188
00:49:11,319 –> 00:49:17,359
you know the ecosystem around um the

1189
00:49:14,960 –> 00:49:19,960
northern areas of Pakistan and India and

1190
00:49:17,359 –> 00:49:21,839
Tibet is shifting extremely rapidly uh

1191
00:49:19,960 –> 00:49:23,319
these are scientific facts it is because

1192
00:49:21,839 –> 00:49:25,480
of climate change it’s because of

1193
00:49:23,319 –> 00:49:27,920
pollution it’s because of a lot of uh

1194
00:49:25,480 –> 00:49:30,599
human designed intervention in our

1195
00:49:27,920 –> 00:49:33,359
environment and um we don’t have a

1196
00:49:30,599 –> 00:49:35,200
thousand years to solve for that at all

1197
00:49:33,359 –> 00:49:36,920
you know it’s more how we deal with the

1198
00:49:35,200 –> 00:49:38,520
implications of that the impact of it

1199
00:49:36,920 –> 00:49:40,119
what happens in the next major

1200
00:49:38,520 –> 00:49:42,000
flooding event in Pakistan what does

1201
00:49:40,119 –> 00:49:44,000
that mean for displaced populations and

1202
00:49:42,000 –> 00:49:45,799
people everything is not just going to

1203
00:49:44,000 –> 00:49:47,839
be about tech it’s going to be about

1204
00:49:45,799 –> 00:49:49,839
empathy it’s going to be about

1205
00:49:47,839 –> 00:49:51,680
conversation it’s going to be about the

1206
00:49:49,839 –> 00:49:54,440
application or allocation of resources

1207
00:49:51,680 –> 00:49:56,520
in an efficient way and a consideration

1208
00:49:54,440 –> 00:49:58,599
by the wealthy world or the global North

1209
00:49:56,520 –> 00:50:01,920
that we’re all in it together for better

1210
00:49:58,599 –> 00:50:03,400
or for worse um you know the

1211
00:50:01,920 –> 00:50:04,799
minister of climate of Pakistan had

1212
00:50:03,400 –> 00:50:06,599
actually said this in an interview a few

1213
00:50:04,799 –> 00:50:09,079
years ago she said what

1214
00:50:06,599 –> 00:50:12,400
happens in Pakistan doesn’t stay in

1215
00:50:09,079 –> 00:50:15,079
Pakistan a flood somewhere up

1216
00:50:12,400 –> 00:50:16,839
in the uh Karakoram valley where I

1217
00:50:15,079 –> 00:50:20,520
work on a bunch of conservation projects

1218
00:50:16,839 –> 00:50:22,240
there has an immediate physical um and

1219
00:50:20,520 –> 00:50:25,559
energetic impact on the rest of the

1220
00:50:22,240 –> 00:50:26,960
country which you know extends to the

1221
00:50:25,559 –> 00:50:28,920
rest of the world in so many different

1222
00:50:26,960 –> 00:50:31,480
layered ways and of course that’s going

1223
00:50:28,920 –> 00:50:35,160
to be the case all over with cyclones

1224
00:50:31,480 –> 00:50:38,040
with um weather patterns with migration

1225
00:50:35,160 –> 00:50:39,520
patterns um and that’s what you know

1226
00:50:38,040 –> 00:50:41,000
scientists are working on tons of people

1227
00:50:39,520 –> 00:50:42,520
are investing a lot of money in this

1228
00:50:41,000 –> 00:50:45,240
space it’s not that things are not being

1229
00:50:42,520 –> 00:50:47,040
done but I think what the internet or

1230
00:50:45,240 –> 00:50:49,240
technology or even this moment of AI

1231
00:50:47,040 –> 00:50:51,319
should remind us is that we’re not just

1232
00:50:49,240 –> 00:50:54,200
connected in these very ephemeral ways

1233
00:50:51,319 –> 00:50:56,079
by like you know bits and bytes we’re

1234
00:50:54,200 –> 00:50:57,280
connected by the physical world which we

1235
00:50:56,079 –> 00:50:58,760
inhabit

1236
00:50:57,280 –> 00:51:01,400
and if you can channel that as a

1237
00:50:58,760 –> 00:51:04,319
positive connection of love or empathy

1238
00:51:01,400 –> 00:51:06,119
or even desperation to survive you

1239
00:51:04,319 –> 00:51:08,599
know in this moment we should be

1240
00:51:06,119 –> 00:51:10,559
thinking of it in a very positive sense

1241
00:51:08,599 –> 00:51:12,799
and we should be using technology for

1242
00:51:10,559 –> 00:51:15,880
all of the efficiencies you know from

1243
00:51:12,799 –> 00:51:19,040
robotics to self-driving cars to travel

1244
00:51:15,880 –> 00:51:22,559
to space to science to health but at the

1245
00:51:19,040 –> 00:51:24,640
core of it why we’re building I think is

1246
00:51:22,559 –> 00:51:27,280
something we should be

1247
00:51:24,640 –> 00:51:31,200
reconsidering definitely love and and

1248
00:51:27,280 –> 00:51:34,000
positive emotions are moving people to

1249
00:51:31,200 –> 00:51:36,480
strive what’s the

1250
00:51:34,000 –> 00:51:39,079
theory right like you um

1251
00:51:36,480 –> 00:51:41,559
either try to avoid pain or seek

1252
00:51:39,079 –> 00:51:45,440
pleasure love and all the positive

1253
00:51:41,559 –> 00:51:47,839
emotions but then there is also this I I

1254
00:51:45,440 –> 00:51:51,400
and I’m just trying to understand what’s

1255
00:51:47,839 –> 00:51:55,079
more effective for people to take action

1256
00:51:51,400 –> 00:51:57,799
there is this um this thinking that you

1257
00:51:55,079 –> 00:52:00,920
should stage a crisis even exaggerate

1258
00:51:57,799 –> 00:52:05,599
something to uh for for people to take

1259
00:52:00,920 –> 00:52:08,240
action even from what um all the you

1260
00:52:05,599 –> 00:52:10,079
know this clock which says we are three

1261
00:52:08,240 –> 00:52:13,359
minutes to midnight or like two minutes

1262
00:52:10,079 –> 00:52:16,040
to midnight running out of time in terms

1263
00:52:13,359 –> 00:52:19,280
of um climate

1264
00:52:16,040 –> 00:52:20,040
change how you what do you think it’s is

1265
00:52:19,280 –> 00:52:22,640
the

1266
00:52:20,040 –> 00:52:26,160
best um way

1267
00:52:22,640 –> 00:52:28,160
to make people relate that this is you

1268
00:52:26,160 –> 00:52:30,520
know maybe right now they are sitting in

1269
00:52:28,160 –> 00:52:33,319
a comfortable chair at their home

1270
00:52:30,520 –> 00:52:36,079
it’s you know they have access to to

1271
00:52:33,319 –> 00:52:38,760
water to clean water to energy but

1272
00:52:36,079 –> 00:52:42,119
actually if they themselves don’t

1273
00:52:38,760 –> 00:52:43,680
do something about uh something positive

1274
00:52:42,119 –> 00:52:47,000
about

1275
00:52:43,680 –> 00:52:51,440
um preserving nature and

1276
00:52:47,000 –> 00:52:52,559
environment we are all to blame right

1277
00:52:51,440 –> 00:52:54,680
well I mean I don’t want to totally

1278
00:52:52,559 –> 00:52:56,839
contradict myself but like I think that

1279
00:52:54,680 –> 00:52:59,160
I there it’s very hard to know because

1280
00:52:56,839 –> 00:53:00,760
there’s such a there’s so many of us and

1281
00:52:59,160 –> 00:53:03,040
we are although connected we’re also

1282
00:53:00,760 –> 00:53:05,920
culturally quite separated is where you

1283
00:53:03,040 –> 00:53:08,119
start I don’t think Panic is an answer

1284
00:53:05,920 –> 00:53:10,040
obviously not I think about this from a

1285
00:53:08,119 –> 00:53:11,960
technology first perspective so the

1286
00:53:10,040 –> 00:53:15,079
thing that I’m interested in is like a

1287
00:53:11,960 –> 00:53:17,119
lot more of a technology focused lens is

1288
00:53:15,079 –> 00:53:19,480
really how we empower people who work

1289
00:53:17,119 –> 00:53:21,400
within technology companies those who no

1290
00:53:19,480 –> 00:53:23,960
longer need to work for just one

1291
00:53:21,400 –> 00:53:25,799
organization um creative technologists

1292
00:53:23,960 –> 00:53:27,839
and also policy makers involved with

1293
00:53:25,799 –> 00:53:29,640
tech to realize that there is a great

1294
00:53:27,839 –> 00:53:33,280
opportunity to come together to create

1295
00:53:29,640 –> 00:53:35,480
spaces of connection for coders

1296
00:53:33,280 –> 00:53:37,319
engineers computer scientists

1297
00:53:35,480 –> 00:53:39,160
researchers the space I’m really

1298
00:53:37,319 –> 00:53:40,960
interested in is where we can really

1299
00:53:39,160 –> 00:53:42,640
empower technologists people who are

1300
00:53:40,960 –> 00:53:43,599
coming technology first whether it’s

1301
00:53:42,640 –> 00:53:46,280
people working within large

1302
00:53:43,599 –> 00:53:48,799
organizations in tech people who are

1303
00:53:46,280 –> 00:53:51,559
engineers computer scientists

1304
00:53:48,799 –> 00:53:54,040
researchers um in the space of computer

1305
00:53:51,559 –> 00:53:57,119
science um creative

1306
00:53:54,040 –> 00:53:59,400
technologists uh computational designers

1307
00:53:57,119 –> 00:54:01,000
really anyone working at that very

1308
00:53:59,400 –> 00:54:03,079
strong intersection of technology and

1309
00:54:01,000 –> 00:54:04,839
any other practice is who I want to be

1310
00:54:03,079 –> 00:54:06,720
speaking to and of course policy makers

1311
00:54:04,839 –> 00:54:08,400
who are engaging in this space at the

1312
00:54:06,720 –> 00:54:09,880
end of it as we’ve discussed everything

1313
00:54:08,400 –> 00:54:12,079
is very holistic everything is

1314
00:54:09,880 –> 00:54:13,960
interconnected you can’t really separate

1315
00:54:12,079 –> 00:54:16,240
now what happens with AI or you know

1316
00:54:13,960 –> 00:54:18,520
tech in general from human progress or

1317
00:54:16,240 –> 00:54:20,119
disaster but at the core of it I think

1318
00:54:18,520 –> 00:54:22,480
there’s an opportunity for us to find

1319
00:54:20,119 –> 00:54:25,720
each other to find common voices to

1320
00:54:22,480 –> 00:54:27,960
invest um in work and ideas and projects

1321
00:54:25,720 –> 00:54:31,319
that feel meaningful and also to think

1322
00:54:27,960 –> 00:54:33,440
about how we can be reconsidering not

1323
00:54:31,319 –> 00:54:35,280
just our value systems because as I you

1324
00:54:33,440 –> 00:54:37,280
know I don’t think anybody really that

1325
00:54:35,280 –> 00:54:39,799
I’ve ever worked with has expressed

1326
00:54:37,280 –> 00:54:42,480
extreme racism and therefore manifested

1327
00:54:39,799 –> 00:54:44,280
uh a chat GPT version of it but to have

1328
00:54:42,480 –> 00:54:46,559
thoughtful spaces of collaboration where

1329
00:54:44,280 –> 00:54:50,160
we can afford the luxury of that slowing

1330
00:54:46,559 –> 00:54:52,280
down of creating incubators for ideas uh

1331
00:54:50,160 –> 00:54:53,720
for questioning and manifesting out of

1332
00:54:52,280 –> 00:54:56,040
it we can’t create that for the whole

1333
00:54:53,720 –> 00:54:58,520
world we can’t uh force that upon

1334
00:54:56,040 –> 00:55:00,440
government everywhere but I would urge

1335
00:54:58,520 –> 00:55:02,319
policy makers in the space Maybe to

1336
00:55:00,440 –> 00:55:05,079
create those spaces as you mentioned

1337
00:55:02,319 –> 00:55:06,920
government how can governments you know

1338
00:55:05,079 –> 00:55:09,559
outside of just the Western context but

1339
00:55:06,920 –> 00:55:12,720
even you know in this country uh create

1340
00:55:09,559 –> 00:55:14,720
more opportunities for investment in you

1341
00:55:12,720 –> 00:55:16,920
know you know we say slow fashion what

1342
00:55:14,720 –> 00:55:20,280
about slow AI you know what about slow

1343
00:55:16,920 –> 00:55:23,160
ideas in AI and R&D that is about cross

1344
00:55:20,280 –> 00:55:25,000
disciplinary insight and Innovation and

1345
00:55:23,160 –> 00:55:27,200
not just about a rush to beating the

1346
00:55:25,000 –> 00:55:28,520
next guy who made you know the same

1347
00:55:27,200 –> 00:55:30,599
thing in a different way and I think

1348
00:55:28,520 –> 00:55:32,240
that is very possible it’s not

1349
00:55:30,599 –> 00:55:33,839
unachievable and it isn’t as

1350
00:55:32,240 –> 00:55:36,960
overwhelming as thinking about the whole

1351
00:55:33,839 –> 00:55:38,559
world um not being solvable because we

1352
00:55:36,960 –> 00:55:40,400
you know we can’t afford to do that and

1353
00:55:38,559 –> 00:55:46,400
we don’t have time for that and what do

1354
00:55:40,400 –> 00:55:51,640
you think about the impact of AI in the

1355
00:55:46,400 –> 00:55:56,559
Arts music film industry I so the other

1356
00:55:51,640 –> 00:55:59,599
day um this guy I have this yes uh Tyler

1357
00:55:56,559 –> 00:56:02,559
Perry who is an American actor and a

1358
00:55:59,599 –> 00:56:06,400
filmmaker and he was creating this 800

1359
00:56:02,559 –> 00:56:09,200
million uh studio and he put it on hold

1360
00:56:06,400 –> 00:56:13,200
when he when they released Sora you know

1361
00:56:09,200 –> 00:56:15,960
the OpenAI video like text to video

1362
00:56:13,200 –> 00:56:19,559
um uh platform

1363
00:56:15,960 –> 00:56:22,160
algorithms and he this this guy Perry

1364
00:56:19,559 –> 00:56:25,000
said I no longer would have to travel

1365
00:56:22,160 –> 00:56:28,079
to locations if I wanted to be in

1366
00:56:25,000 –> 00:56:30,440
the snow in Colorado it’s text if I

1367
00:56:28,079 –> 00:56:34,760
wanted to write a scene on the moon it’s

1368
00:56:30,440 –> 00:56:40,640
text and this AI can uh generate it like

1369
00:56:34,760 –> 00:56:42,160
nothing you know I guess so many so many

1370
00:56:40,640 –> 00:56:44,839
and what what’s happening what was

1371
00:56:42,160 –> 00:56:46,960
happening with the uh with the protest

1372
00:56:44,839 –> 00:56:49,520
with Hollywood writer strike and actor

1373
00:56:46,960 –> 00:56:50,880
strike so I’ll tell you briefly on this

1374
00:56:49,520 –> 00:56:52,920
so I think there’s like a huge issue

1375
00:56:50,880 –> 00:56:56,400
with law this is going to be about IP

1376
00:56:52,920 –> 00:56:58,039
ownership trademarks law and really um

1377
00:56:56,400 –> 00:57:01,240
you know it’ll be the same as if you

1378
00:56:58,039 –> 00:57:03,920
know a self-driving car hits somebody is

1379
00:57:01,240 –> 00:57:04,960
you know who owns what who’s in charge

1380
00:57:03,920 –> 00:57:07,680
and I think this is going to be a

1381
00:57:04,960 –> 00:57:09,200
seminal shift for employment uh of

1382
00:57:07,680 –> 00:57:12,440
course for the creative Industries in

1383
00:57:09,200 –> 00:57:15,000
terms of what is new what is um accepted

1384
00:57:12,440 –> 00:57:16,760
as new and who owns not just the

1385
00:57:15,000 –> 00:57:18,440
economic rewards of doing something

1386
00:57:16,760 –> 00:57:20,400
amazing but obviously the life rewards

1387
00:57:18,440 –> 00:57:22,440
of being an important actor or writer or

1388
00:57:20,400 –> 00:57:24,839
producer and getting the benefit of

1389
00:57:22,440 –> 00:57:26,319
living a life where uh you experience

1390
00:57:24,839 –> 00:57:28,079
every high with every film you’ve

1391
00:57:26,319 –> 00:57:29,520
produced or been part of so I think

1392
00:57:28,079 –> 00:57:31,240
that’s a whole other podcast which I

1393
00:57:29,520 –> 00:57:33,039
have thoughts on in terms of what

1394
00:57:31,240 –> 00:57:35,119
happens with you know the future of film

1395
00:57:33,039 –> 00:57:36,640
media and content I think the idea of

1396
00:57:35,119 –> 00:57:40,520
text that you touched upon is actually

1397
00:57:36,640 –> 00:57:42,640
very Central Language you know how do we

1398
00:57:40,520 –> 00:57:44,480
decide that language that is where bias

1399
00:57:42,640 –> 00:57:46,520
comes in that’s way more interesting to

1400
00:57:44,480 –> 00:57:49,760
me than the conversation on racism

1401
00:57:46,520 –> 00:57:51,960
because it is so subtle it’s so nuanced

1402
00:57:49,760 –> 00:57:54,160
and it’s framing the visuals of how we

1403
00:57:51,960 –> 00:57:55,799
will experience the world why should

1404
00:57:54,160 –> 00:57:57,799
language decide why should only people

1405
00:57:55,799 –> 00:57:59,760
who are or who can speak or who can

1406
00:57:57,799 –> 00:58:01,960
communicate why should it only be in

1407
00:57:59,760 –> 00:58:04,000
English that it all ends up manifesting

1408
00:58:01,960 –> 00:58:05,880
in different forms of media and that’s

1409
00:58:04,000 –> 00:58:08,880
where I think we have a much bigger

1410
00:58:05,880 –> 00:58:10,240
conversation and I don’t know how to get

1411
00:58:08,880 –> 00:58:11,640
in touch with Tyler Perry I don’t know

1412
00:58:10,240 –> 00:58:13,480
much about his work but I would be

1413
00:58:11,640 –> 00:58:16,559
interested in that it’s like why should

1414
00:58:13,480 –> 00:58:18,319
it be words um from a particular

1415
00:58:16,559 –> 00:58:20,680
language that are

1416
00:58:18,319 –> 00:58:23,119
designing the world that we experience

1417
00:58:20,680 –> 00:58:25,000
and then this question becomes around

1418
00:58:23,119 –> 00:58:27,440
the images where they’re sourced what

1419
00:58:25,000 –> 00:58:32,559
does it mean to describe relaxing or

1420
00:58:27,440 –> 00:58:34,920
soft or intense uh what does that mean

1421
00:58:32,559 –> 00:58:36,680
for Sora versus what that means in a

1422
00:58:34,920 –> 00:58:38,960
totally different context and so I think

1423
00:58:36,680 –> 00:58:40,640
that’s really where the future of how or

1424
00:58:38,960 –> 00:58:44,319
the present I guess of how we unpack

1425
00:58:40,640 –> 00:58:47,319
bias racism perspective in AI is really

1426
00:58:44,319 –> 00:58:50,720
going to manifest and you know Sora is

1427
00:58:47,319 –> 00:58:53,680
interesting but it’s not that surprising

1428
00:58:50,720 –> 00:58:56,559
I’m not that shocked that it exists it’s

1429
00:58:53,680 –> 00:58:58,240
beautiful um but you know is it okay for

1430
00:58:56,559 –> 00:59:00,400
me to call it beautiful because you know

1431
00:58:58,240 –> 00:59:02,240
it’s going to destroy in many ways a lot

1432
00:59:00,400 –> 00:59:04,559
of creative Endeavor that could have

1433
00:59:02,240 –> 00:59:06,240
manifested in that way but what does that mean

1434
00:59:04,559 –> 00:59:10,359
for us at the end of it those are

1435
00:59:06,240 –> 00:59:14,520
answers that um are not very clear no of

1436
00:59:10,359 –> 00:59:18,599
course and in the way Sora is

1437
00:59:14,520 –> 00:59:21,920
generating the outputs are very much

1438
00:59:18,599 –> 00:59:24,960
Western and there are so many different

1439
00:59:21,920 –> 00:59:27,039
ways each each person can show through

1440
00:59:24,960 –> 00:59:30,440
through film through

1441
00:59:27,039 –> 00:59:32,079
music the same emotion in completely

1442
00:59:30,440 –> 00:59:35,359
different context in completely

1443
00:59:32,079 –> 00:59:39,559
different way and I’m not so sure AI

1444
00:59:35,359 –> 00:59:42,160
will ever be able to reflect what you

1445
00:59:39,559 –> 00:59:45,720
wanted to say what you wanted to

1446
00:59:42,160 –> 00:59:47,640
show to to to present like the way you

1447
00:59:45,720 –> 00:59:50,319
feel as a

1448
00:59:47,640 –> 00:59:52,520
producer I’m

1449
00:59:50,319 –> 00:59:55,880
wondering will it

1450
00:59:52,520 –> 00:59:56,720
be um will it be support like will

1451
00:59:55,880 –> 00:59:59,599
people

1452
00:59:56,720 –> 01:00:02,520
truly creative people will will they be

1453
00:59:59,599 –> 01:00:05,160
supportive of it and will they find

1454
01:00:02,520 –> 01:00:08,640
benefit in it or they will still try to

1455
01:00:05,160 –> 01:00:11,480
create those in the uh

1456
01:00:08,640 –> 01:00:13,760
alternative so the issue we have then is

1457
01:00:11,480 –> 01:00:16,000
like us knowing what we want and us

1458
01:00:13,760 –> 01:00:18,160
knowing that we can seek and have ideas

1459
01:00:16,000 –> 01:00:20,880
and have a spontaneous experience that

1460
01:00:18,160 –> 01:00:23,200
leads to the new chapter of a book or uh

1461
01:00:20,880 –> 01:00:25,599
a new image or artwork the spontaneity

1462
01:00:23,200 –> 01:00:26,880
of your own feelings and perspective and

1463
01:00:25,599 –> 01:00:29,000
time and the the weather and the

1464
01:00:26,880 –> 01:00:31,000
temperature and the smells around you

1465
01:00:29,000 –> 01:00:32,559
you know that is like all fact the

1466
01:00:31,000 –> 01:00:34,799
question is about you know the

1467
01:00:32,559 –> 01:00:36,799
confidence we have to continue to

1468
01:00:34,799 –> 01:00:41,000
believe in ourselves how do we nurture

1469
01:00:36,799 –> 01:00:43,240
culture creativity Artistry um these

1470
01:00:41,000 –> 01:00:45,359
that is something that we really stand

1471
01:00:43,240 –> 01:00:46,960
at risk to lose and stand at risk to

1472
01:00:45,359 –> 01:00:50,680
lose very quickly if we allow these

1473
01:00:46,960 –> 01:00:53,880
tools to take over um of course it could

1474
01:00:50,680 –> 01:00:56,920
happen but it also depends on how we um

1475
01:00:53,880 –> 01:01:00,960
what we value in society value people

1476
01:00:56,920 –> 01:01:01,920
who can spend their time uh considering

1477
01:01:00,960 –> 01:01:06,079
you

1478
01:01:01,920 –> 01:01:10,280
know the context of a particular artwork

1479
01:01:06,079 –> 01:01:13,559
uh The Melody of a particular um song a

1480
01:01:10,280 –> 01:01:15,480
beat you know what is the lived sum of

1481
01:01:13,559 –> 01:01:18,640
all of my experiences of music and how

1482
01:01:15,480 –> 01:01:20,520
that manifests versus um an AI where I put

1483
01:01:18,640 –> 01:01:22,079
in a few keywords it would sound

1484
01:01:20,520 –> 01:01:24,160
similar probably to what I would want it

1485
01:01:22,079 –> 01:01:25,880
to sound like but like you said if I

1486
01:01:24,160 –> 01:01:28,000
knew what I wanted it wouldn’t be good

1487
01:01:25,880 –> 01:01:30,160
enough and the issue is us losing

1488
01:01:28,000 –> 01:01:32,599
the ability to know what we want and

1489
01:01:30,160 –> 01:01:34,480
that’s a core I think where we where

1490
01:01:32,599 –> 01:01:37,599
we’ll have to

1491
01:01:34,480 –> 01:01:41,039
be navigating this I think it’s a long really

1492
01:01:37,599 –> 01:01:45,319
long conversation though yes and I think

1493
01:01:41,039 –> 01:01:48,760
also there is a meaning in finding what

1494
01:01:45,319 –> 01:01:51,920
you want through almost like pain try

1495
01:01:48,760 –> 01:01:55,559
trying and failing and retrying and that

1496
01:01:51,920 –> 01:01:57,799
takes time and with AI you can pretty

1497
01:01:55,559 –> 01:02:00,920
much generate lots of things in an

1498
01:01:57,799 –> 01:02:03,559
instant maybe that doesn’t bring

1499
01:02:00,920 –> 01:02:06,079
so much satisfaction well I mean I I’m

1500
01:02:03,559 –> 01:02:08,839
involved with a lot of Arts institutions

1501
01:02:06,079 –> 01:02:11,680
um two big dance institutions which are

1502
01:02:08,839 –> 01:02:14,920
all about the body showing up being

1503
01:02:11,680 –> 01:02:16,960
present connecting feeling remembering

1504
01:02:14,920 –> 01:02:19,520
thinking about intelligence outside of

1505
01:02:16,960 –> 01:02:23,640
just the cognitive I’m involved in

1506
01:02:19,520 –> 01:02:26,640
museums the Arts um I know writers I’m

1507
01:02:23,640 –> 01:02:28,960
writing a book right now and

1508
01:02:26,640 –> 01:02:33,039
I don’t believe that you’re going to be

1509
01:02:28,960 –> 01:02:36,799
able to Source or sort that kind of

1510
01:02:33,039 –> 01:02:39,000
creativity uh without the experience of

1511
01:02:36,799 –> 01:02:40,520
it without the years of feeling and

1512
01:02:39,000 –> 01:02:42,480
being and the stories you bring to the

1513
01:02:40,520 –> 01:02:44,720
table for so many reasons both

1514
01:02:42,480 –> 01:02:48,160
biological and scientific as well as a

1515
01:02:44,720 –> 01:02:51,160
lot more my own perspective so I don’t

1516
01:02:48,160 –> 01:02:53,480
think that will be where we go next but

1517
01:02:51,160 –> 01:02:56,680
I do think we’re at risk of losing all

1518
01:02:53,480 –> 01:02:58,839
of that because of the

1519
01:02:56,680 –> 01:02:59,960
expediency of technology and I think

1520
01:02:58,839 –> 01:03:01,240
with the cultural and the creative

1521
01:02:59,960 –> 01:03:02,640
sector we’re going to see a seminal

1522
01:03:01,240 –> 01:03:05,880
shift we’ll see a shift with film we’ll

1523
01:03:02,640 –> 01:03:08,200
see a shift with music um we can’t deny

1524
01:03:05,880 –> 01:03:09,920
the fact that music is being generated

1525
01:03:08,200 –> 01:03:12,400
by AI that a lot of it will be very

1526
01:03:09,920 –> 01:03:14,680
popular the same with art um we have

1527
01:03:12,400 –> 01:03:17,520
less probably to say about that with

1528
01:03:14,680 –> 01:03:20,119
performance art um with events with

1529
01:03:17,520 –> 01:03:21,720
venues with concerts um so I think

1530
01:03:20,119 –> 01:03:23,359
there’s going to be a recalibration in

1531
01:03:21,720 –> 01:03:25,279
the creative Industries in the

1532
01:03:23,359 –> 01:03:27,480
entertainment space and in the art space

1533
01:03:25,279 –> 01:03:29,200
and that is very very scary um it’s

1534
01:03:27,480 –> 01:03:31,599
going to impact creativity it’s going to

1535
01:03:29,200 –> 01:03:34,160
impact human creativity it will make us

1536
01:03:31,599 –> 01:03:38,559
consider what we value in terms of great

1537
01:03:34,160 –> 01:03:41,119
writing and poetry and music um but you

1538
01:03:38,559 –> 01:03:43,200
know the truth of it is that the

1539
01:03:41,119 –> 01:03:45,720
publishing industry has struggled a lot

1540
01:03:43,200 –> 01:03:48,880
in the last several years um it’s harder

1541
01:03:45,720 –> 01:03:51,160
to get your book published if um you’re

1542
01:03:48,880 –> 01:03:54,400
not a well-known um

1543
01:03:51,160 –> 01:03:55,920
individual so the shift is happening

1544
01:03:54,400 –> 01:03:57,480
it’s more the extent of it is difficult

1545
01:03:55,920 –> 01:04:00,279
to measure and map but the thing I agree

1546
01:03:57,480 –> 01:04:04,200
on is that you can’t of course go to the

1547
01:04:00,279 –> 01:04:05,920
concert last night I went to um see the

1548
01:04:04,200 –> 01:04:09,520
London Philharmonic at the South Bank

1549
01:04:05,920 –> 01:04:13,200
Center in London Wayne McGregor um had

1550
01:04:09,520 –> 01:04:16,279
done a beautiful choreography using um

1551
01:04:13,200 –> 01:04:18,279
his own dancers from his studio uh to a

1552
01:04:16,279 –> 01:04:20,760
piece of Polish orchestral music

1553
01:04:18,279 –> 01:04:22,119
actually and an artist Ben cullin

1554
01:04:20,760 –> 01:04:24,079
Williams who’s a very good friend of

1555
01:04:22,119 –> 01:04:25,599
mine had used machine learning to create

1556
01:04:24,079 –> 01:04:28,240
an artwork that was engaging with the

1557
01:04:25,599 –> 01:04:30,799
body of the dancers and the sound

1558
01:04:28,240 –> 01:04:32,720
beautiful and fluid and intense he used

1559
01:04:30,799 –> 01:04:34,559
technology but you have to be there to

1560
01:04:32,720 –> 01:04:36,079
see it and to experience it you wouldn’t

1561
01:04:34,559 –> 01:04:38,279
have known the experience of it in any

1562
01:04:36,079 –> 01:04:40,319
other way and that’s a privilege that I

1563
01:04:38,279 –> 01:04:43,079
had to be there and that’s where my

1564
01:04:40,319 –> 01:04:44,880
concern is like what becomes a privilege

1565
01:04:43,079 –> 01:04:47,799
um to create and to build and to

1566
01:04:44,880 –> 01:04:52,200
experience ART versus um what is

1567
01:04:47,799 –> 01:04:55,960
entertainment and last question um so

1568
01:04:52,200 –> 01:04:59,520
what would you say for us to how should

1569
01:04:55,960 –> 01:04:59,520
we cultivate our

1570
01:04:59,559 –> 01:05:04,319
creativity I mean I’m I’m maybe not the

1571
01:05:02,359 –> 01:05:07,720
person to ask that question but I would

1572
01:05:04,319 –> 01:05:10,839
say the things that we should

1573
01:05:07,720 –> 01:05:13,760
cultivate anyway self aware

1574
01:05:10,839 –> 01:05:17,480
self-awareness meditation

1575
01:05:13,760 –> 01:05:20,880
mindfulness um you know time in nature

1576
01:05:17,480 –> 01:05:22,559
time with art music family friends

1577
01:05:20,880 –> 01:05:24,640
things that give you joy and happiness

1578
01:05:22,559 –> 01:05:28,119
and make you feel whole things that give

1579
01:05:24,640 –> 01:05:32,160
you space to consider all of that you

1580
01:05:28,119 –> 01:05:35,799
know for me um we uh in the same way

1581
01:05:32,160 –> 01:05:38,720
that we become sort of roboticized with

1582
01:05:35,799 –> 01:05:39,920
efficiencies uh we do in our daily lives

1583
01:05:38,720 –> 01:05:42,079
you know where do you find your own

1584
01:05:39,920 –> 01:05:45,119
voice what do you believe in keep a

1585
01:05:42,079 –> 01:05:46,520
diary keep a journal all of that I think

1586
01:05:45,119 –> 01:05:48,000
and then most importantly for me is

1587
01:05:46,520 –> 01:05:49,400
surround yourself with people who

1588
01:05:48,000 –> 01:05:50,880
inspire you and figure out ways of

1589
01:05:49,400 –> 01:05:53,760
making that part of your

1590
01:05:50,880 –> 01:05:55,880
work whether you’re paying for it or not

1591
01:05:53,760 –> 01:05:59,680
exactly and and create right and and

1592
01:05:55,880 –> 01:06:01,760
show it um uh share it with with wider

1593
01:05:59,680 –> 01:06:04,520
public and I think this is this is great

1594
01:06:01,760 –> 01:06:08,200
because people can um connect through

1595
01:06:04,520 –> 01:06:10,880
through work through ART through

1596
01:06:08,200 –> 01:06:13,400
creativity it’s it’s been my pleasure

1597
01:06:10,880 –> 01:06:17,000
and I I hope you’ll have amazing uh

1598
01:06:13,400 –> 01:06:20,720
weekend and all the the best and lots of

1599
01:06:17,000 –> 01:06:24,720
um good good stuff to your parents um

1600
01:06:20,720 –> 01:06:27,119
and I hope I hope we will meet um in

1601
01:06:24,720 –> 01:06:28,920
person I’ll invite you for my next

1602
01:06:27,119 –> 01:06:31,480
dinner also definitely definitely I

1603
01:06:28,920 –> 01:06:31,480
would love to




Kamila Hankiewicz

Entrepreneur / Host

Creativity is born in chaos. No matter if it's software, podcast or a kitchen. I share what I learn while building untrite.com, oishya.com, and hosting brilliant people on my podcast Are You Human.