
In episode #16 of Are You Human, I have the pleasure of speaking with Joanna Bryson, Professor of Ethics and Technology at the Hertie School. Her research focuses on the impact of technology on human cooperation and on AI/ICT governance. We discuss a broad range of topics, drawing on her expertise in ethics and technology: her childhood interest in animal behaviour and how it shaped her path into AI and the behavioural sciences, the risks of anthropomorphising AI, and the potential dangers of granting AI citizens' rights, which could open our justice systems to corruption. She stresses the importance of ensuring that the benefits of AI are widely distributed rather than concentrated among a powerful few, and highlights the need to hold stakeholders accountable for AI's actions and their consequences. We also touch on the existential risks associated with AI, such as the concentration of power, the lack of responsibility, and the prevalence of surveillance. Joanna shares her views on the current state and future of AGI, emphasising the combinatorial power of human-AI collaboration. I hope you'll enjoy this episode as much as I did.
Transcript
1
00:00:00,319 –> 00:00:05,160
AGI it depends what you define how you
2
00:00:02,760 –> 00:00:06,759
mean it um when I first heard the term
3
00:00:05,160 –> 00:00:09,320
it was people that were saying oh you
4
00:00:06,759 –> 00:00:11,080
know nobody working in AI is actually
5
00:00:09,320 –> 00:00:12,240
trying to solve all the problems they’re
6
00:00:11,080 –> 00:00:13,759
you know they’re all just trying to do
7
00:00:12,240 –> 00:00:15,360
Machine Vision or something and that’s
8
00:00:13,759 –> 00:00:16,960
boring and we want to get back to more
9
00:00:15,360 –> 00:00:18,880
human you know really understand human
10
00:00:16,960 –> 00:00:21,279
intelligence and you know that was
11
00:00:18,880 –> 00:00:21,279
completely
12
00:00:26,039 –> 00:00:30,039
false there were very good reasons a lot
13
00:00:28,439 –> 00:00:32,279
of people were trying to solve the whole
14
00:00:30,039 –> 00:00:35,480
problem of intelligence and and and it
15
00:00:32,279 –> 00:00:37,040
was it was just this false narrative um
16
00:00:35,480 –> 00:00:40,039
and then people started talking about oh
17
00:00:37,040 –> 00:00:41,840
AGI there’s like this um there’s going
18
00:00:40,039 –> 00:00:43,600
to be this intelligence explosion when
19
00:00:41,840 –> 00:00:45,440
you have systems that can learn how to
20
00:00:43,600 –> 00:00:47,559
learn and then there’s going to be this
21
00:00:45,440 –> 00:00:50,079
exponential and exponential growth and
22
00:00:47,559 –> 00:00:54,359
we will be mere cockroaches after that
23
00:00:50,079 –> 00:00:56,160
well I think that is at least logical um
24
00:00:54,359 –> 00:00:58,320
but again it doesn’t make sense to talk
25
00:00:56,160 –> 00:01:01,160
about the system itself and if you
26
00:00:58,320 –> 00:01:02,879
actually look at the number of people on
27
00:01:01,160 –> 00:01:05,040
the planet that’s what’s happened since
28
00:01:02,879 –> 00:01:06,680
we’ve had writing so since we’ve been
29
00:01:05,040 –> 00:01:09,920
able to improve our own intelligence
30
00:01:06,680 –> 00:01:11,560
through our technology um we have been
31
00:01:09,920 –> 00:01:14,479
that explosion and that’s why we have a
32
00:01:11,560 –> 00:01:14,479
climate crisis right
33
00:01:15,320 –> 00:01:18,720
now are you
34
00:01:27,720 –> 00:01:32,560
human welcome to the Are You Human
35
00:01:30,280 –> 00:01:35,200
podcast on this episode you’ll hear a
36
00:01:32,560 –> 00:01:37,720
conversation I have with Joanna Bryson
37
00:01:35,200 –> 00:01:40,479
Joanna is a professor at the Hertie School
38
00:01:37,720 –> 00:01:43,040
in Berlin where she works on artificial
39
00:01:40,479 –> 00:01:46,920
intelligence ethics and collaborative
40
00:01:43,040 –> 00:01:50,240
cognition Joanna earned her PhD from MIT
41
00:01:46,920 –> 00:01:54,000
in 2001 and has since held positions in
42
00:01:50,240 –> 00:01:56,799
the University of Bath, Nottingham, and Oxford
43
00:01:54,000 –> 00:02:00,320
University uh a thought leader in AI
44
00:01:56,799 –> 00:02:02,759
ethics and human-AI relationships Joanna
45
00:02:00,320 –> 00:02:06,920
published the influential
46
00:02:02,759 –> 00:02:10,679
essay Robots Should Be Slaves and
47
00:02:06,920 –> 00:02:12,640
helped define the Principles of Robotics her
48
00:02:10,679 –> 00:02:15,120
research has appeared in Science
49
00:02:12,640 –> 00:02:18,680
magazine and she has consulted for organizations
50
00:02:15,120 –> 00:02:21,480
like the Red Cross uh on
51
00:02:18,680 –> 00:02:23,400
autonomous weapons Joanna’s
52
00:02:21,480 –> 00:02:25,640
contributions to the field have been
53
00:02:23,400 –> 00:02:28,640
recognized with an outstanding
54
00:02:25,640 –> 00:02:30,879
Achievement Award from CognitionX in
55
00:02:28,640 –> 00:02:34,480
2017
56
00:02:30,879 –> 00:02:36,879
I’m excited to have her joining us today
57
00:02:34,480 –> 00:02:39,519
uh to share her expertise on this very
58
00:02:36,879 –> 00:02:43,800
crucial topic
59
00:02:39,519 –> 00:02:45,560
enjoy Joanna so nice to finally meet you
60
00:02:43,800 –> 00:02:47,720
and thank you thank you so much for
61
00:02:45,560 –> 00:02:49,080
finding time to do this nice to meet you
62
00:02:47,720 –> 00:02:52,959
too
63
00:02:49,080 –> 00:02:55,280
Camela where do I even begin there is so
64
00:02:52,959 –> 00:02:57,840
much so many things you you've written
65
00:02:55,280 –> 00:03:01,319
about so many speeches I had to go
66
00:02:57,840 –> 00:03:04,000
through I wanted to um uh so definitely
67
00:03:01,319 –> 00:03:06,680
we will just cover a little bit of of
68
00:03:04,000 –> 00:03:09,560
what you’ve been working on and and
69
00:03:06,680 –> 00:03:13,040
doing but I guess the easiest would be
70
00:03:09,560 –> 00:03:15,480
to start from the beginning so your
71
00:03:13,040 –> 00:03:18,920
childhood uh on one of the
72
00:03:15,480 –> 00:03:24,280
interviews on one of the interviews you
73
00:03:18,920 –> 00:03:27,280
said that you became interested um in
74
00:03:24,280 –> 00:03:30,760
animal intelligence that’s what led you
75
00:03:27,280 –> 00:03:33,760
later to to studying behavioral um
76
00:03:30,760 –> 00:03:38,080
like Sciences U psychology and and
77
00:03:33,760 –> 00:03:41,000
eventually uh working on AI but what was
78
00:03:38,080 –> 00:03:42,439
the the spark what was the uh the
79
00:03:41,000 –> 00:03:45,040
trigger
80
00:03:42,439 –> 00:03:48,760
to like what what what made you
81
00:03:45,040 –> 00:03:51,439
interested in this you know I I it’s
82
00:03:48,760 –> 00:03:53,920
hard to say that I mean what I just find
83
00:03:51,439 –> 00:03:55,840
uh well life interesting but Behavior
84
00:03:53,920 –> 00:03:58,599
interesting I think difference is
85
00:03:55,840 –> 00:04:01,760
interesting and we I was fortunate to
86
00:03:58,599 –> 00:04:03,360
live uh my family moved
87
00:04:01,760 –> 00:04:05,239
into a place that backed up onto a
88
00:04:03,360 –> 00:04:08,040
little pond I mean we called
89
00:04:05,239 –> 00:04:10,319
it the Lagoon but it was really it it
90
00:04:08,040 –> 00:04:12,079
was storm it was uh water from the
91
00:04:10,319 –> 00:04:13,959
storms from the street you know that
92
00:04:12,079 –> 00:04:16,079
went into this little but there was
93
00:04:13,959 –> 00:04:19,239
crawdads and you know we had snakes and
94
00:04:16,079 –> 00:04:20,600
frogs and whatever and um yeah I was
95
00:04:19,239 –> 00:04:21,919
just fascinated by all these different
96
00:04:20,600 –> 00:04:23,520
things and I actually think it’s a real
97
00:04:21,919 –> 00:04:25,520
shame now that people don’t have the
98
00:04:23,520 –> 00:04:26,720
experience of other living intelligent
99
00:04:25,520 –> 00:04:28,120
beings you know it would have been
100
00:04:26,720 –> 00:04:29,720
totally different to grow up in a time
101
00:04:28,120 –> 00:04:31,960
with horses and dogs I mean we do have
102
00:04:29,720 –> 00:04:35,000
dogs kind of but they’re really obedient
103
00:04:31,960 –> 00:04:37,440
so I think I think we have
104
00:04:35,000 –> 00:04:40,440
uh uh we don’t see that many animals in
105
00:04:37,440 –> 00:04:42,440
their natural uh ecosystem anymore but I
106
00:04:40,440 –> 00:04:44,560
was also interested in people but I
107
00:04:42,440 –> 00:04:45,800
figured out that when I told my parents
108
00:04:44,560 –> 00:04:47,080
you know things that I thought about
109
00:04:45,800 –> 00:04:48,479
animals that they thought that was
110
00:04:47,080 –> 00:04:49,800
interesting and if I told them things I
111
00:04:48,479 –> 00:04:53,120
thought about people they said I was
112
00:04:49,800 –> 00:04:55,479
wrong and so I figured out people were
113
00:04:53,120 –> 00:04:57,840
defensive one thing that’s funny and
114
00:04:55,479 –> 00:04:59,400
relevant to AI that as we were talking
115
00:04:57,840 –> 00:05:01,560
about my childhood which most people
116
00:04:59,400 –> 00:05:04,120
don’t ask about I I was just thinking
117
00:05:01,560 –> 00:05:07,080
the other day and I don’t know why about
118
00:05:04,120 –> 00:05:08,800
um about how I used to worry about
119
00:05:07,080 –> 00:05:10,880
whether if my stuffed animals were alive
120
00:05:08,800 –> 00:05:13,280
they would like me or not because people
121
00:05:10,880 –> 00:05:15,080
real living people did not like me and
122
00:05:13,280 –> 00:05:17,400
and I just assumed all my animals would
123
00:05:15,080 –> 00:05:19,680
like me but I didn’t know if it was
124
00:05:17,400 –> 00:05:22,960
true yeah actually I I played very
125
00:05:19,680 –> 00:05:25,360
similar game where I was um letting my
126
00:05:22,960 –> 00:05:27,680
cousin who was more more or less my age
127
00:05:25,360 –> 00:05:30,080
uh believe that when she closes her
128
00:05:27,680 –> 00:05:33,240
eyes uh all the stuffed animals are actually
129
00:05:30,080 –> 00:05:36,680
alive and that she has to behave uh for
130
00:05:33,240 –> 00:05:38,280
them to to to like her yes I don’t know
131
00:05:36,680 –> 00:05:39,880
if she still believes
132
00:05:38,280 –> 00:05:43,600
it
133
00:05:39,880 –> 00:05:47,000
maybe um okay and so so then what led
134
00:05:43,600 –> 00:05:48,479
you to you know to study this um
135
00:05:47,000 –> 00:05:50,199
professionally I mean the first time I
136
00:05:48,479 –> 00:05:52,160
really got to do it I I would love to
137
00:05:50,199 –> 00:05:53,440
have done it at school but most you know
138
00:05:52,160 –> 00:05:56,240
High School grade school that you don’t
139
00:05:53,440 –> 00:05:59,639
get to do that except at science fair so
140
00:05:56,240 –> 00:06:02,560
I did study uh uh dog communication
141
00:05:59,639 –> 00:06:04,960
for science fair and uh I got a
142
00:06:02,560 –> 00:06:08,199
first at the Illinois state level but
143
00:06:04,960 –> 00:06:11,599
didn’t get to go to Nationals um and uh
144
00:06:08,199 –> 00:06:13,680
I also studied um snail intelligence oh
145
00:06:11,599 –> 00:06:16,199
wow what’s what’s what’s interesting
146
00:06:13,680 –> 00:06:17,720
about that well I they just I read
147
00:06:16,199 –> 00:06:19,759
somewhere that worms could solve a T-
148
00:06:17,720 –> 00:06:21,400
maze and I wondered if snails could so
149
00:06:19,759 –> 00:06:23,199
we we found out that they tended to
150
00:06:21,400 –> 00:06:26,880
crawl out of the maze it wasn’t very
151
00:06:23,199 –> 00:06:29,960
exciting anyway okay so then I went to a
152
00:06:26,880 –> 00:06:31,599
University of Chicago and uh and I
153
00:06:29,960 –> 00:06:33,240
actually my best teacher at high school
154
00:06:31,599 –> 00:06:35,919
had been physics so I actually started
155
00:06:33,240 –> 00:06:38,039
out in physics but I wasn’t as thrilled
156
00:06:35,919 –> 00:06:39,560
to do because I managed to have three
157
00:06:38,039 –> 00:06:41,960
years of physics at high school two
158
00:06:39,560 –> 00:06:44,039
years of chem-phys and one year physics um
159
00:06:41,960 –> 00:06:45,800
so I wasn’t that thrilled to do sort of
160
00:06:44,039 –> 00:06:48,479
all the same physics over again only
161
00:06:45,800 –> 00:06:49,919
with Calculus I found that uh tedious
162
00:06:48,479 –> 00:06:52,360
and then I remembered that I’d always
163
00:06:49,919 –> 00:06:54,000
wanted to do um behavior and and there
164
00:06:52,360 –> 00:06:56,800
was something called Behavioral Science
165
00:06:54,000 –> 00:06:58,599
there um which at that time it’s really
166
00:06:56,800 –> 00:07:00,759
annoying they’ve changed what it means
167
00:06:58,599 –> 00:07:02,599
but at that time it meant the non-clinical
168
00:07:00,759 –> 00:07:05,479
psychology and and of all its
169
00:07:02,599 –> 00:07:07,680
disciplines so I I particularly liked um
170
00:07:05,479 –> 00:07:09,720
you know behavioral ecology um I don’t
171
00:07:07,680 –> 00:07:10,840
think it was called that I anyway there
172
00:07:09,720 –> 00:07:13,759
was something about the evolution of
173
00:07:10,840 –> 00:07:15,160
behavior and there was also um a course
174
00:07:13,759 –> 00:07:18,039
uh called developmental neuro-
175
00:07:15,160 –> 00:07:20,120
psychology so really how how children
176
00:07:18,039 –> 00:07:22,840
start how how it works how children
177
00:07:20,120 –> 00:07:25,919
start working and you know various other
178
00:07:22,840 –> 00:07:28,160
uh neuroscience and and um ecological
179
00:07:25,919 –> 00:07:29,800
perspectives as well as you know one
180
00:07:28,160 –> 00:07:33,680
some social psychology some clinical
181
00:07:29,800 –> 00:07:35,680
psychology MH and you because I remember
182
00:07:33,680 –> 00:07:38,319
you mentioned in some of the interviews
183
00:07:35,680 –> 00:07:42,080
I saw um that you were also considering
184
00:07:38,319 –> 00:07:44,120
neuro uh science right um I never
185
00:07:42,080 –> 00:07:45,840
seriously considered neuroscience
186
00:07:44,120 –> 00:07:47,800
because it’s unfortunate but in the in
187
00:07:45,840 –> 00:07:51,520
the United States if you have anything
188
00:07:47,800 –> 00:07:55,840
like that um you wind up fighting with
189
00:07:51,520 –> 00:07:57,879
uh with pre-meds for grades and and it
190
00:07:55,840 –> 00:08:00,199
just so it just puts people off from
191
00:07:57,879 –> 00:08:02,080
studying even botany you you just can't
192
00:08:00,199 –> 00:08:04,240
get near the pre-meds without having a
193
00:08:02,080 –> 00:08:08,240
miserable life that they put on
194
00:08:04,240 –> 00:08:10,960
themselves um so so I don’t that I I did
195
00:08:08,240 –> 00:08:12,800
think about it when I was at MIT uh
196
00:08:10,960 –> 00:08:15,680
because I I attended all the
197
00:08:12,800 –> 00:08:17,800
Neuroscience talks um and I thought
198
00:08:15,680 –> 00:08:19,759
about trying to switch over to that but
199
00:08:17,800 –> 00:08:21,120
in the end of the day I I was barely
200
00:08:19,759 –> 00:08:22,520
making it in computer science which I
201
00:08:21,120 –> 00:08:25,080
knew something about so I thought I
202
00:08:22,520 –> 00:08:28,319
should better stay there um but I did I
203
00:08:25,080 –> 00:08:30,800
I have been in um uh technically I’ve
204
00:08:28,319 –> 00:08:33,320
been in supposedly behavioral neuroscience labs
205
00:08:30,800 –> 00:08:35,399
twice both both with non-human primates
206
00:08:33,320 –> 00:08:36,680
and both with nothing um invasive
207
00:08:35,399 –> 00:08:39,080
happening so I don’t know why they were
208
00:08:36,680 –> 00:08:42,519
called that um but both at Harvard and
209
00:08:39,080 –> 00:08:44,399
and uh and at MIT I went to a lot of
210
00:08:42,519 –> 00:08:47,720
Neuroscience talks if I could at Chicago I
211
00:08:44,399 –> 00:08:49,640
took the classes I could in it um and uh
212
00:08:47,720 –> 00:08:52,480
yeah they didn’t have anything like that
213
00:08:49,640 –> 00:08:54,480
at at uh Bath at all and of course the Hertie
214
00:08:52,480 –> 00:08:57,000
school doesn’t have any neural science
215
00:08:54,480 –> 00:08:58,839
um but um but you know whenever I I get
216
00:08:57,000 –> 00:09:00,440
an opportunity I I love going to that
217
00:08:58,839 –> 00:09:04,000
it’s the same with like microbiology
218
00:09:00,440 –> 00:09:06,519
that it’s the evolution of behavior at
219
00:09:04,000 –> 00:09:07,880
any level is connected yeah if you're
220
00:09:06,519 –> 00:09:09,360
trying to understand machine learning
221
00:09:07,880 –> 00:09:11,560
and you don’t think it’s relevant that
222
00:09:09,360 –> 00:09:13,760
bacteria will sacrifice themselves for
223
00:09:11,560 –> 00:09:17,160
each other then you’re missing something
224
00:09:13,760 –> 00:09:18,920
you know uh you know so I I really loved
225
00:09:17,160 –> 00:09:21,240
uh all the opportunities I had that was
226
00:09:18,920 –> 00:09:23,240
something Bath was good at it worked well because
227
00:09:21,240 –> 00:09:25,760
it’s relevant to antibiotic resistance
228
00:09:23,240 –> 00:09:27,480
and things so a lot of good work on
229
00:09:25,760 –> 00:09:30,399
altruism and Cooperative behavior and
230
00:09:27,480 –> 00:09:31,200
things uh not only in bacteria but also in
231
00:09:30,399 –> 00:09:34,760
fruit
232
00:09:31,200 –> 00:09:36,279
flies but at Bath but yeah people in in
233
00:09:34,760 –> 00:09:38,560
the UK at that time were it was very
234
00:09:36,279 –> 00:09:40,920
hard for them to study anything much
235
00:09:38,560 –> 00:09:44,959
bigger than fruit fly and and why did
236
00:09:40,920 –> 00:09:47,760
you um choose the academic uh Academia
237
00:09:44,959 –> 00:09:49,480
path rather than you know I always
238
00:09:47,760 –> 00:09:51,519
wanted that I mean I originally wanted
239
00:09:49,480 –> 00:09:54,560
to be a paleontologist but always always
240
00:09:51,519 –> 00:09:56,000
wanted to be a scientist um yeah so
241
00:09:54,560 –> 00:09:57,720
actually the the more interesting thing
242
00:09:56,000 –> 00:09:58,959
was that how I wound up in industry for
243
00:09:57,720 –> 00:10:00,640
a few years and that was because I
244
00:09:58,959 –> 00:10:02,920
couldn’t choose which discipline to go
245
00:10:00,640 –> 00:10:05,000
into so I thought I should pay off my
246
00:10:02,920 –> 00:10:07,360
debts and and see if I could make it in
247
00:10:05,000 –> 00:10:08,920
rock and roll and figure out which
248
00:10:07,360 –> 00:10:10,760
science I missed the most and then I
249
00:10:08,920 –> 00:10:12,760
still couldn’t figure out so I took AI
250
00:10:10,760 –> 00:10:14,079
is sort of halfway in between what I
251
00:10:12,760 –> 00:10:16,399
really cared about I still cared about
252
00:10:14,079 –> 00:10:17,600
in Behavior but what I was good at I
253
00:10:16,399 –> 00:10:19,160
knew I was an exceptionally good
254
00:10:17,600 –> 00:10:20,880
programmer and and I figured it would
255
00:10:19,160 –> 00:10:22,160
help me get into a good degree if I did
256
00:10:20,880 –> 00:10:23,839
what I was exceptionally good at and
257
00:10:22,160 –> 00:10:25,560
indeed I got into MIT so I couldn’t have
258
00:10:23,839 –> 00:10:29,079
been happier about
259
00:10:25,560 –> 00:10:31,640
that but you yeah but you end but you
260
00:10:29,079 –> 00:10:35,120
end ended up in in the right uh the
261
00:10:31,640 –> 00:10:37,920
prime time right now uh for AI so I’ve
262
00:10:35,120 –> 00:10:41,079
been in AI since uh well arguably since
263
00:10:37,920 –> 00:10:43,639
1986 I I tutored the first class that was
264
00:10:41,079 –> 00:10:46,160
taught by uh what was his name Chris hen
265
00:10:43,639 –> 00:10:48,600
or something like that that
266
00:10:46,160 –> 00:10:50,160
the University of Chicago offered in AI
267
00:10:48,600 –> 00:10:51,519
um so I’ve been in it for a very long
268
00:10:50,160 –> 00:10:52,959
time and people used to tell me I should
269
00:10:51,519 –> 00:10:55,920
take it off my CV because it made me
270
00:10:52,959 –> 00:10:57,760
look dumb you know so now now the time
271
00:10:55,920 –> 00:10:59,800
has come to me but it's not like I just now
272
00:10:57,760 –> 00:11:01,959
got involved
273
00:10:59,800 –> 00:11:03,760
I’m like unlike some people who who are
274
00:11:01,959 –> 00:11:06,519
are suddenly AI
275
00:11:03,760 –> 00:11:09,600
experts yeah it happens with all the you
276
00:11:06,519 –> 00:11:11,120
know big um topics right like we had
277
00:11:09,600 –> 00:11:12,600
interesting you know I was the kind like
278
00:11:11,120 –> 00:11:14,680
when I played soccer I was the kind of
279
00:11:12,600 –> 00:11:16,600
kid that that tried to go find a big
280
00:11:14,680 –> 00:11:18,240
open space in the field so that if the
281
00:11:16,600 –> 00:11:20,480
ball went anywhere I might get to do it
282
00:11:18,240 –> 00:11:22,560
I didn’t fight other kids for the ball
283
00:11:20,480 –> 00:11:26,200
and I was like that in in Academia for a
284
00:11:22,560 –> 00:11:28,600
long time too and then um s suddenly
285
00:11:26,200 –> 00:11:30,160
other stuff started coming at me and and
286
00:11:28,600 –> 00:11:31,880
towards me and it’s interesting because
287
00:11:30,160 –> 00:11:34,079
it’s not just the other academics it’s
288
00:11:31,880 –> 00:11:36,440
also like sudden there’s like almost
289
00:11:34,079 –> 00:11:38,959
like flies swarming around you the uh
290
00:11:36,440 –> 00:11:40,720
the conses and things so all these start
291
00:11:38,959 –> 00:11:42,079
these weird things start happening and
292
00:11:40,720 –> 00:11:43,760
trying to figure out what matters and
293
00:11:42,079 –> 00:11:45,760
what doesn’t but at some point I just
294
00:11:43,760 –> 00:11:47,880
realized it was it was time to start you
295
00:11:45,760 –> 00:11:50,040
know stop avoiding it that I I I had
296
00:11:47,880 –> 00:11:52,760
enough uh confidence and seniority that
297
00:11:50,040 –> 00:11:54,720
I could stick in something that was
298
00:11:52,760 –> 00:11:56,519
popular sometimes feel like maybe I’m
299
00:11:54,720 –> 00:11:58,040
not putting enough effort into you know
300
00:11:56,519 –> 00:12:00,560
like I see people self-promoting right
301
00:11:58,040 –> 00:12:02,560
over the top of my work and I think you
302
00:12:00,560 –> 00:12:03,959
know what should I be doing that but I’m
303
00:12:02,560 –> 00:12:06,200
just too interested I’ve always been
304
00:12:03,959 –> 00:12:09,399
interested and so I i’ I chased the
305
00:12:06,200 –> 00:12:12,839
science more no no it’s amazing because
306
00:12:09,399 –> 00:12:15,199
you right most of the scientists
307
00:12:12,839 –> 00:12:17,600
unfortunately you know turn to the other
308
00:12:15,199 –> 00:12:20,639
side so they they become advisors or
309
00:12:17,600 –> 00:12:22,440
they they just you know go on AAL that’s
310
00:12:20,639 –> 00:12:26,839
well maybe I mean the thing is there’s
311
00:12:22,440 –> 00:12:28,920
such a um that who you hear from is
312
00:12:26,839 –> 00:12:30,880
different it is certainly true that I do
313
00:12:28,920 –> 00:12:32,800
not um bring in funding the way an
314
00:12:30,880 –> 00:12:35,199
academic is supposed to and I was
315
00:12:32,800 –> 00:12:37,079
fortunate to find a school that didn’t
316
00:12:35,199 –> 00:12:38,519
care actually bath had sort of decided
317
00:12:37,079 –> 00:12:41,160
they didn’t care anymore either it
318
00:12:38,519 –> 00:12:43,079
became evident that I was you know again
319
00:12:41,160 –> 00:12:47,000
I was like the only AI person there for
320
00:12:43,079 –> 00:12:48,800
for like I don’t know 15 years and um
321
00:12:47,000 –> 00:12:50,480
and when they started to realize how
322
00:12:48,800 –> 00:12:52,760
important I was because other people
323
00:12:50,480 –> 00:12:54,360
started making my field important then I
324
00:12:52,760 –> 00:12:57,399
mean I’d always been well connected in
325
00:12:54,360 –> 00:12:59,399
the field um I got you know but but the
326
00:12:57,399 –> 00:13:03,000
um sorry I digressed too much might want
327
00:12:59,399 –> 00:13:04,959
to cut no no it’s fine it’s anyway so so
328
00:13:03,000 –> 00:13:06,480
yeah I’d I’d always been reasonably big
329
00:13:04,959 –> 00:13:08,279
in the field but bath didn’t have anyone
330
00:13:06,480 –> 00:13:10,000
there to recognize that and then when
331
00:13:08,279 –> 00:13:12,240
they started oh when I was on Newsnight
332
00:13:10,000 –> 00:13:13,920
that was the main thing so anyway they
333
00:13:12,240 –> 00:13:15,399
they once they started realizing that
334
00:13:13,920 –> 00:13:17,240
the field mattered and stuff then they
335
00:13:15,399 –> 00:13:19,399
were pretty tolerant of the fact that I
336
00:13:17,240 –> 00:13:21,160
wasn’t bringing in the money myself and
337
00:13:19,399 –> 00:13:22,839
both Bath and the Hertie School have managed to
338
00:13:21,160 –> 00:13:24,399
bring in a lot of money doing exactly
339
00:13:22,839 –> 00:13:26,720
what I do but it wasn’t me that brought
340
00:13:24,399 –> 00:13:28,279
it in and I keep trying to write grants
341
00:13:26,720 –> 00:13:29,920
actually more about the Behavioral
342
00:13:28,279 –> 00:13:31,920
Science and less about about the AI and
343
00:13:29,920 –> 00:13:34,000
I’m not as successful but but I’m happy
344
00:13:31,920 –> 00:13:36,680
to help my institutions right now since
345
00:13:34,000 –> 00:13:39,920
you are based in in Berlin do you feel
346
00:13:36,680 –> 00:13:43,199
there is more um attention and more
347
00:13:39,920 –> 00:13:45,440
resources put into um learning uh
348
00:13:43,199 –> 00:13:50,800
academic learning uh around
349
00:13:45,440 –> 00:13:52,360
AI um well not not uh originally no the
350
00:13:50,800 –> 00:13:55,240
the thing part of the reason I’m here is
351
00:13:52,360 –> 00:13:56,959
because of brexit because uh you know I
352
00:13:55,240 –> 00:13:58,759
I had not originally thought about you
353
00:13:56,959 –> 00:14:01,040
know governing AI it was not one of my
354
00:13:58,759 –> 00:14:04,320
conc concerns but I got sucked into it
355
00:14:01,040 –> 00:14:05,880
by people who who uh did notice and
356
00:14:04,320 –> 00:14:07,480
worry about those things and I’m glad
357
00:14:05,880 –> 00:14:10,440
they did you know people who were policy
358
00:14:07,480 –> 00:14:12,320
aware and so the Britain was a fantastic
359
00:14:10,440 –> 00:14:15,600
place to be involved in this I was very
360
00:14:12,320 –> 00:14:18,639
much uh mentored uh you know and and and
361
00:14:15,600 –> 00:14:20,839
and included and that was great um but
362
00:14:18,639 –> 00:14:22,880
uh the whole brexit situation is just
363
00:14:20,839 –> 00:14:24,600
insane and not just brexit itself but
364
00:14:22,880 –> 00:14:27,920
also what the conservative government is
365
00:14:24,600 –> 00:14:30,399
doing to Academia um and and and indeed
366
00:14:27,920 –> 00:14:32,600
to to digital Reg digital governance
367
00:14:30,399 –> 00:14:34,680
there was immense competence in the UK
368
00:14:32,600 –> 00:14:36,120
I’m just having some people I’m debating
369
00:14:34,680 –> 00:14:37,440
with some people in a think tank right
370
00:14:36,120 –> 00:14:39,480
now because they’re saying oh nobody in
371
00:14:37,440 –> 00:14:41,480
AI knows anything and you know there’s
372
00:14:39,480 –> 00:14:43,360
no connection and like no what are you
373
00:14:41,480 –> 00:14:44,720
talking about that you know the Royal
374
00:14:43,360 –> 00:14:46,360
Society has done a very good job of
375
00:14:44,720 –> 00:14:48,560
appointing really well-connected people
376
00:14:46,360 –> 00:14:51,639
in there’s been people doing policy for
377
00:14:48,560 –> 00:14:54,360
decades you know there was you know
378
00:14:51,639 –> 00:14:56,320
there was the the lead of the of the the
379
00:14:54,360 –> 00:14:58,519
appropriate Ministry uh I think it was
380
00:14:56,320 –> 00:15:01,000
called DCMS at the time you know Tom
381
00:14:58,519 –> 00:15:04,480
Robin very very you know everybody knew
382
00:15:01,000 –> 00:15:06,519
what was going on and yet um but it just
383
00:15:04,480 –> 00:15:08,399
it’s active destruction if they’re if
384
00:15:06,519 –> 00:15:11,440
they’re taking if you know sunak is
385
00:15:08,399 –> 00:15:13,199
taking advice from Elon Musk instead of
386
00:15:11,440 –> 00:15:15,000
you know these leading academics that’s
387
00:15:13,199 –> 00:15:16,920
his choice and if he’s disrupting there
388
00:15:15,000 –> 00:15:19,079
were like in fact there was maybe too
389
00:15:16,920 –> 00:15:21,040
many there was like four or five uh
390
00:15:19,079 –> 00:15:23,399
centers of excellence and that were
391
00:15:21,040 –> 00:15:25,920
working on the on the AI and digital
392
00:15:23,399 –> 00:15:27,680
governance within the UK government and
393
00:15:25,920 –> 00:15:29,040
and if they don’t have competent if they
394
00:15:27,680 –> 00:15:31,279
don’t have links to people who know
395
00:15:29,040 –> 00:15:32,959
what’s going on there then it’s because
396
00:15:31,279 –> 00:15:35,519
well honestly I think it’s corruption
397
00:15:32,959 –> 00:15:38,000
because because the the big Tech really
398
00:15:35,519 –> 00:15:40,279
does not want to be regulated even
399
00:15:38,000 –> 00:15:41,680
though the GDPR massively improved their
400
00:15:40,279 –> 00:15:43,079
business case I mean what are
401
00:15:41,680 –> 00:15:45,560
governments doing when they regulate
402
00:15:43,079 –> 00:15:49,040
they’re trying to strengthen economies
403
00:15:45,560 –> 00:15:51,600
right and the and the US uh West Coast
404
00:15:49,040 –> 00:15:53,959
uh tech companies are just assuming that
405
00:15:51,600 –> 00:15:56,399
that this is adversarial that that it’s
406
00:15:53,959 –> 00:15:59,519
that it’s uh zero sum for reasons that
407
00:15:56,399 –> 00:16:01,519
are not at all clear um but I think that
408
00:15:59,519 –> 00:16:04,399
actually the digital regulation that
409
00:16:01,519 –> 00:16:07,440
they are now subjecting themselves to in
410
00:16:04,399 –> 00:16:10,399
the EU will actually help them running
411
00:16:07,440 –> 00:16:12,720
their own businesses and and uh well
412
00:16:10,399 –> 00:16:15,160
besides expanding their access to the EU
413
00:16:12,720 –> 00:16:18,880
and growing our digital economy as did
414
00:16:15,160 –> 00:16:20,680
the GDPR so I think um I think it's insane
415
00:16:18,880 –> 00:16:24,800
that they fought so hard against us and
416
00:16:20,680 –> 00:16:27,759
I think that if the UK is uh backing the
417
00:16:24,800 –> 00:16:29,560
wrong horses on this it it’s not just
418
00:16:27,759 –> 00:16:31,720
you know like a tragedy from the British
419
00:16:29,560 –> 00:16:33,920
perspective it’s also a huge security
420
00:16:31,720 –> 00:16:37,160
Hazard I mean it’s that you know they’ve
421
00:16:33,920 –> 00:16:38,920
taken uh literally like half
422
00:16:37,160 –> 00:16:41,399
the military out of the EU right it’s
423
00:16:38,920 –> 00:16:43,319
basically France and and Britain
424
00:16:41,399 –> 00:16:44,880
were the only ones that really had the
425
00:16:43,319 –> 00:16:47,079
active you know militaries I mean
426
00:16:44,880 –> 00:16:48,720
Germany has some capacities but yeah and
427
00:16:47,079 –> 00:16:50,639
obviously Finland and every there’s a
428
00:16:48,720 –> 00:16:53,720
bunch of like you know things but the
429
00:16:50,639 –> 00:16:56,639
biggest things were Britain and uh
430
00:16:53,720 –> 00:16:58,440
France and and now and also Britain was
431
00:16:56,639 –> 00:17:00,360
really leading in a lot of areas of
432
00:16:58,440 –> 00:17:04,480
governance as well and the including the
433
00:17:00,360 –> 00:17:05,839
digital sphere so it’s it’s um yeah it’s
434
00:17:04,480 –> 00:17:08,679
just insane if people are saying they
435
00:17:05,839 –> 00:17:10,919
can’t find confidence in you came anyway
436
00:17:08,679 –> 00:17:14,000
no no no you’re completely right even
437
00:17:10,919 –> 00:17:16,480
even affects earlier stage meaning like
438
00:17:14,000 –> 00:17:20,160
Erasmus for example right it was so easy
439
00:17:16,480 –> 00:17:23,799
to to to trans transfer knowledge and
440
00:17:20,160 –> 00:17:27,480
and exchange and exchange talent and uh
441
00:17:23,799 –> 00:17:29,840
right now I’m I’m with you on this and
442
00:17:27,480 –> 00:17:33,480
and even really the I don’t mean just
443
00:17:29,840 –> 00:17:35,880
that Berlin is not a s it is also it is
444
00:17:33,480 –> 00:17:37,280
also actively really interesting so I
445
00:17:35,880 –> 00:17:40,320
think it’s really interesting why they
446
00:17:37,280 –> 00:17:44,120
had so little digital competence before
447
00:17:40,320 –> 00:17:46,679
um and I I I actually have a suspicion
448
00:17:44,120 –> 00:17:49,840
this is the conjecture let’s say that
449
00:17:46,679 –> 00:17:51,559
both Russia and Germany um they felt
450
00:17:49,840 –> 00:17:53,600
more secure about having an economy
451
00:17:51,559 –> 00:17:56,000
that’s grounded in the land so that
452
00:17:53,600 –> 00:17:57,440
people just can’t leave right so Putin
453
00:17:56,000 –> 00:17:59,159
you know if anything seems to be happy
454
00:17:57,440 –> 00:18:01,840
to have all the digital people chased
455
00:17:59,159 –> 00:18:04,240
out of out of uh Russia because they
456
00:18:01,840 –> 00:18:07,039
tend to be liberal and and I think
457
00:18:04,240 –> 00:18:09,280
Germany was either you know suspicious
458
00:18:07,039 –> 00:18:11,280
of or the government had a lot of you
459
00:18:09,280 –> 00:18:12,600
know hearing of you know the digital
460
00:18:11,280 –> 00:18:14,799
because again it’s very highly portable
461
00:18:12,600 –> 00:18:16,520
it’s new and they they had such a strong
462
00:18:14,799 –> 00:18:19,360
you know car and that kind of
463
00:18:16,520 –> 00:18:23,320
manufacturing petrochemical Industries
464
00:18:19,360 –> 00:18:27,080
so I think that um but but I think
465
00:18:23,320 –> 00:18:29,280
Germany realizes the value of of the
466
00:18:27,080 –> 00:18:32,000
digital economy but more than that
467
00:18:29,280 –> 00:18:34,120
sees the incredible importance of having
468
00:18:32,000 –> 00:18:35,360
a stable Europe and to have a stable
469
00:18:34,120 –> 00:18:38,400
Europe you have to have a well-governed
470
00:18:35,360 –> 00:18:40,960
digital sphere um and so they get that
471
00:18:38,400 –> 00:18:43,440
and so um so it’s been really just
472
00:18:40,960 –> 00:18:44,960
amazing and to come here obviously it’s
473
00:18:43,440 –> 00:18:47,440
a bigger city than bath you know I have
474
00:18:44,960 –> 00:18:49,400
a little more to do uh but but but
475
00:18:47,440 –> 00:18:53,600
mostly it’s just incredible to be
476
00:18:49,400 –> 00:18:56,360
talking to um so many Ministries and you
477
00:18:53,600 –> 00:18:58,280
know so many people so many embassies
478
00:18:56,360 –> 00:19:00,240
including the US Embassy I’m originally
479
00:18:58,280 –> 00:19:03,440
American I still hold both British and
480
00:19:00,240 –> 00:19:05,600
American passports and uh but I’ve never
481
00:19:03,440 –> 00:19:07,360
been had America ask me so many
482
00:19:05,600 –> 00:19:10,039
questions and you know bring me into so
483
00:19:07,360 –> 00:19:11,880
many events so they they also are paying
484
00:19:10,039 –> 00:19:13,320
more attention because I’m in Berlin I
485
00:19:11,880 –> 00:19:14,559
don’t know if I’d been in London if I
486
00:19:13,320 –> 00:19:15,919
would have been invited to more stuff
487
00:19:14,559 –> 00:19:17,760
you know because it’s just but you know
488
00:19:15,919 –> 00:19:20,280
Bath is only an hour and a half out of
489
00:19:17,760 –> 00:19:22,080
London so I do think it’s more you know
490
00:19:20,280 –> 00:19:23,679
partly AI becoming a bigger deal but
491
00:19:22,080 –> 00:19:25,400
more that like oh you know if you’re in
492
00:19:23,679 –> 00:19:27,440
the UK you’re not doing too much damage
493
00:19:25,400 –> 00:19:30,320
wait why are you in
494
00:19:27,440 –> 00:19:31,919
Germany yes and then you know France is
495
00:19:30,320 –> 00:19:33,840
trying to lead as well they’re they’re
496
00:19:31,919 –> 00:19:36,799
are quite having quite a big success
497
00:19:33,840 –> 00:19:38,240
successes I I actually really quite like
498
00:19:36,799 –> 00:19:40,000
oh it’s so interesting actually I really
499
00:19:38,240 –> 00:19:41,679
like the French policy although it’s
500
00:19:40,000 –> 00:19:43,280
quite Divergent and they’ve made some
501
00:19:41,679 –> 00:19:46,080
big mistakes too I mean everybody makes
502
00:19:43,280 –> 00:19:47,320
some mistakes um but in general it’s so
503
00:19:46,080 –> 00:19:48,520
interesting to get a better
504
00:19:47,320 –> 00:19:50,400
understanding of Germany and who it’s
505
00:19:48,520 –> 00:19:52,400
been in historically I mean we all know
506
00:19:50,400 –> 00:19:54,280
like two or three stories of the history
507
00:19:52,400 –> 00:19:55,600
but a lot of people forget that like
508
00:19:54,280 –> 00:19:58,120
actually the Germans invented the
509
00:19:55,600 –> 00:20:01,320
welfare state you know and and the fact
510
00:19:58,120 –> 00:20:03,679
that they have the um their labor unions
511
00:20:01,320 –> 00:20:05,720
at the C-suite is a consequence of
512
00:20:03,679 –> 00:20:07,880
1918 that they were worried about
513
00:20:05,720 –> 00:20:10,480
Revolution here and apparently the
514
00:20:07,880 –> 00:20:12,080
Marshall fund is partly because both
515
00:20:10,480 –> 00:20:14,080
Russia and America were afraid of
516
00:20:12,080 –> 00:20:15,720
another Revolution here and they really
517
00:20:14,080 –> 00:20:18,480
really so they just threw money at
518
00:20:15,720 –> 00:20:20,400
trying to there were a lot of people
519
00:20:18,480 –> 00:20:23,120
that were more democratically oriented
520
00:20:20,400 –> 00:20:25,280
than than the than the fascists um and
521
00:20:23,120 –> 00:20:28,919
so it wasn’t hard to find people to to
522
00:20:25,280 –> 00:20:29,919
back up you know um but anyway no it’s
523
00:20:28,919 –> 00:20:31,520
been really interesting because the
524
00:20:29,919 –> 00:20:33,520
French really do have a different
525
00:20:31,520 –> 00:20:35,400
perspective and I I do think it’s really
526
00:20:33,520 –> 00:20:36,559
important to have that diversity and so
527
00:20:35,400 –> 00:20:38,559
this is one of the things I’ve been
528
00:20:36,559 –> 00:20:40,600
telling America um when when they talk
529
00:20:38,559 –> 00:20:42,520
to me because they they they really want
530
00:20:40,600 –> 00:20:44,440
to see that they say hey look we are the
531
00:20:42,520 –> 00:20:46,240
most AI in the world and it’s true you
532
00:20:44,440 –> 00:20:50,120
know at the end of World War II the
533
00:20:46,240 –> 00:20:52,280
world uh the the US had um more than
534
00:20:50,120 –> 00:20:53,799
half the world’s GDP and that’s no
535
00:20:52,280 –> 00:20:59,240
longer true it’s more like
536
00:20:53,799 –> 00:21:01,640
24% but if you look at um patents uh
537
00:20:59,240 –> 00:21:04,760
uh and and also the market cap of the
538
00:21:01,640 –> 00:21:08,000
companies are holding those patents um
539
00:21:04,760 –> 00:21:09,480
they’re they’re at about 50% now um of
540
00:21:08,000 –> 00:21:10,880
the of the world so there’s they’re
541
00:21:09,480 –> 00:21:12,120
dominating like they used to dominate
542
00:21:10,880 –> 00:21:14,159
the whole world that’s how they dominate
543
00:21:12,120 –> 00:21:16,080
AI so I can see why they’re saying oh
544
00:21:14,159 –> 00:21:17,440
look it’s working out well for us but at
545
00:21:16,080 –> 00:21:19,559
the same time those two measures are
546
00:21:17,440 –> 00:21:21,039
also kind of biased by the fact that
547
00:21:19,559 –> 00:21:23,720
that you know there’s like a there’s a
548
00:21:21,039 –> 00:21:25,120
securist thing going there um and it’s
549
00:21:23,720 –> 00:21:26,480
not entirely working out you know look
550
00:21:25,120 –> 00:21:29,320
at what’s happening with the democracy
551
00:21:26,480 –> 00:21:31,960
and everything so so there are
552
00:21:29,320 –> 00:21:33,760
but it’s a global Trend in a way right
553
00:21:31,960 –> 00:21:35,039
you can see that and I don’t think it’d
554
00:21:33,760 –> 00:21:38,400
be a good idea to have the whole world
555
00:21:35,039 –> 00:21:42,400
just have one set of laws no of course
556
00:21:38,400 –> 00:21:45,600
extreme extremes never good idea um okay
557
00:21:42,400 –> 00:21:50,240
but let’s go back to to the subject um
558
00:21:45,600 –> 00:21:54,559
of of AI per se so um if you can yes and
559
00:21:50,240 –> 00:21:58,000
uh so I in all of your talks you are
560
00:21:54,559 –> 00:21:59,120
very vocal about the risks of um it’s
561
00:21:58,000 –> 00:22:01,480
it’s a difficult
562
00:21:59,120 –> 00:22:04,919
word to pronounce for me the said word
563
00:22:01,480 –> 00:22:08,080
antho anthropomorphism exactly that one
564
00:22:04,919 –> 00:22:11,360
making making AI feel or look like human
565
00:22:08,080 –> 00:22:15,760
um and so so yes attributing them
566
00:22:11,360 –> 00:22:19,320
humanlike um traits uh why do you think
567
00:22:15,760 –> 00:22:21,720
human are prone to doing that oh well
568
00:22:19,320 –> 00:22:24,000
it’s not it’s not it’s not just we’re
569
00:22:21,720 –> 00:22:25,360
prone to doing that um so I can talk
570
00:22:24,000 –> 00:22:26,919
about this but first of all I have to
571
00:22:25,360 –> 00:22:28,120
say that not all of my talks are about
572
00:22:26,919 –> 00:22:29,799
that a lot of my talks are actually
573
00:22:28,120 –> 00:22:32,919
about things like economics and public
574
00:22:29,799 –> 00:22:34,679
goods investment and cooperation and and
575
00:22:32,919 –> 00:22:36,840
Rec recently a lot about the market
576
00:22:34,679 –> 00:22:39,240
concentration and geopolitics and power
577
00:22:36,840 –> 00:22:42,320
also related to AI
578
00:22:39,240 –> 00:22:43,679
yes why so again this this matters with
579
00:22:42,320 –> 00:22:45,000
respect to what I was just talking about
580
00:22:43,679 –> 00:22:47,080
one of the big things that we worry
581
00:22:45,000 –> 00:22:48,360
about we’re a highly social species and
582
00:22:47,080 –> 00:22:49,840
so we really want to know who our
583
00:22:48,360 –> 00:22:53,039
friends are you know we want to know
584
00:22:49,840 –> 00:22:54,600
who’s in group who’s outgroup and um so
585
00:22:53,039 –> 00:22:56,279
artificial intelligence you know it’s it
586
00:22:54,600 –> 00:22:59,240
doesn’t just happen to be there it’s an
587
00:22:56,279 –> 00:23:01,159
artifact so so companies design it in
588
00:22:59,240 –> 00:23:04,039
order to make it feel like you really
589
00:23:01,159 –> 00:23:06,200
are important you know because so that
590
00:23:04,039 –> 00:23:08,760
you will feel good about buying and and
591
00:23:06,200 –> 00:23:10,960
using their their their products right
592
00:23:08,760 –> 00:23:12,720
so it’s not like a coincidence I I
593
00:23:10,960 –> 00:23:14,640
wanted man is super smart I mean I
594
00:23:12,720 –> 00:23:16,279
already knew him I I was at a conference
595
00:23:14,640 –> 00:23:18,240
some super super smart developmental
596
00:23:16,279 –> 00:23:22,080
psychologist he’s like you know I want
597
00:23:18,240 –> 00:23:23,840
to do a study why why uh why we love AI
598
00:23:22,080 –> 00:23:25,559
because you know you you you know in
599
00:23:23,840 –> 00:23:28,559
every movie you always wind up loving
600
00:23:25,559 –> 00:23:30,440
the robot and I’m just like that you
601
00:23:28,559 –> 00:23:32,200
people spend tens of Millions on writing
602
00:23:30,440 –> 00:23:34,200
the narratives for you to they choose
603
00:23:32,200 –> 00:23:36,400
who you’re going to
604
00:23:34,200 –> 00:23:40,880
love the same with social media right
605
00:23:36,400 –> 00:23:43,240
like and and um designing the systems um
606
00:23:40,880 –> 00:23:45,799
around how psychology works and H how
607
00:23:43,240 –> 00:23:47,520
addictions work yeah I don’t think the
608
00:23:45,799 –> 00:23:50,200
the systems were deliberately designed
609
00:23:47,520 –> 00:23:51,120
around the addiction um and and
610
00:23:50,200 –> 00:23:53,279
sometimes I think some of that’s
611
00:23:51,120 –> 00:23:54,840
exaggerated too but it’s certainly true
612
00:23:53,279 –> 00:23:57,159
that the successful ones were the ones
613
00:23:54,840 –> 00:23:58,320
that you know that that to some extent
614
00:23:57,159 –> 00:24:00,400
that that they were feeding into that a
615
00:23:58,320 –> 00:24:01,679
bit so it’s there’s there’s a certain
616
00:24:00,400 –> 00:24:03,240
level where it’s designed it’s a certain
617
00:24:01,679 –> 00:24:07,600
level where there’s selection into what
618
00:24:03,240 –> 00:24:09,720
works you know and um but anyway yeah so
619
00:24:07,600 –> 00:24:11,000
I have the feeling that we we we’ve gone
620
00:24:09,720 –> 00:24:13,919
slightly off the piste we were going to
621
00:24:11,000 –> 00:24:13,919
talk about
622
00:24:14,039 –> 00:24:18,919
um um about making oh about
623
00:24:17,240 –> 00:24:21,520
anthropomorphism yeah so the so the
624
00:24:18,919 –> 00:24:23,799
point is the part of it is that that the
625
00:24:21,520 –> 00:24:26,720
companies do what what keeps us
626
00:24:23,799 –> 00:24:28,520
believing you know engaged and a lot of
627
00:24:26,720 –> 00:24:31,559
people are engaged by people not every
628
00:24:28,520 –> 00:24:34,080
one and and uh so that’s that’s part of
629
00:24:31,559 –> 00:24:35,679
it um but I think it is also there’s a
630
00:24:34,080 –> 00:24:37,440
lot of projection people don’t want to
631
00:24:35,679 –> 00:24:39,159
die and they really want to believe that
632
00:24:37,440 –> 00:24:41,279
if they upload themselves in AI that
633
00:24:39,159 –> 00:24:43,440
they’ll live forever or other people
634
00:24:41,279 –> 00:24:45,840
feel more um maternal or fraternal they
635
00:24:43,440 –> 00:24:47,880
feel like they are going to die but they
636
00:24:45,840 –> 00:24:50,240
want to have eternal powerful you know
637
00:24:47,880 –> 00:24:52,480
friends or or children or something or
638
00:24:50,240 –> 00:24:54,080
lovers the number of people that that
639
00:24:52,480 –> 00:24:56,679
you know seriously think that they’ve
640
00:24:54,080 –> 00:24:58,640
married an avatar or whatever it you
641
00:24:56,679 –> 00:25:00,600
know it’s just a little weird and and
642
00:24:58,640 –> 00:25:02,120
you know again there’s a feminist
643
00:25:00,600 –> 00:25:03,440
perspective of this that you could think
644
00:25:02,120 –> 00:25:05,520
that something you could own that you
645
00:25:03,440 –> 00:25:07,559
could turn off and on um that you could
646
00:25:05,520 –> 00:25:09,919
clone that you could alter that that’s
647
00:25:07,559 –> 00:25:11,799
your best friend you know that that’s
648
00:25:09,919 –> 00:25:13,039
that’s you know well in a way it’s it’s
649
00:25:11,799 –> 00:25:17,320
better than having a woman that you’re
650
00:25:13,039 –> 00:25:21,480
trying to do that to you know but but uh
651
00:25:17,320 –> 00:25:23,559
yes yes yeah like um the the systems
652
00:25:21,480 –> 00:25:26,919
such as you I guess you know the Replika
653
00:25:23,559 –> 00:25:28,960
and what you’ve mentioned um around
654
00:25:26,919 –> 00:25:31,799
talking or
655
00:25:28,960 –> 00:25:34,600
like uploading and feeding the the the
656
00:25:31,799 –> 00:25:38,640
algorithms the your the the Persona of
657
00:25:34,600 –> 00:25:43,039
your deceased or past uh people from
658
00:25:38,640 –> 00:25:47,279
from your past it’s it I find it a bit
659
00:25:43,039 –> 00:25:50,000
scary and worrying that people prefer to
660
00:25:47,279 –> 00:25:53,159
many people prefer to get stuck in a
661
00:25:50,000 –> 00:25:55,919
loop of of their past instead of um you
662
00:25:53,159 –> 00:25:57,960
know creating new memories and I don’t
663
00:25:55,919 –> 00:26:02,200
know if it’s I don’t know if you see it
664
00:25:57,960 –> 00:26:05,520
in in a similar way but the it was one
665
00:26:02,200 –> 00:26:10,440
in one of the um Black Mirror episode
666
00:26:05,520 –> 00:26:13,520
and the whole recording of of your
667
00:26:10,440 –> 00:26:17,240
surroundings with you know Ray-Ban like uh
668
00:26:13,520 –> 00:26:20,640
Google’s where you store like you create
669
00:26:17,240 –> 00:26:23,760
those um recordings you create data of
670
00:26:20,640 –> 00:26:27,480
your um presence and then just you
671
00:26:23,760 –> 00:26:29,880
keep reliving it like and like I said
672
00:26:27,480 –> 00:26:32,440
without uh instead of you know just
673
00:26:29,880 –> 00:26:35,679
moving on and I I don’t know I just find
674
00:26:32,440 –> 00:26:37,440
it people have always asked about uh who
675
00:26:35,679 –> 00:26:39,159
who’s caught up in the past you know who
676
00:26:37,440 –> 00:26:41,480
tries to just keep things all the same
677
00:26:39,159 –> 00:26:44,039
who who uh reads their same books over
678
00:26:41,480 –> 00:26:46,720
and over and over um you know so I don’t
679
00:26:44,039 –> 00:26:48,159
I don’t think that’s changed that much
680
00:26:46,720 –> 00:26:50,399
um I do think when you’re going through
681
00:26:48,159 –> 00:26:51,960
a grieving process there may be a role
682
00:26:50,399 –> 00:26:54,240
as long as you have a good understanding
683
00:26:51,960 –> 00:26:56,000
that the AI is actually just like it’s
684
00:26:54,240 –> 00:26:57,600
another kind of recording you know
685
00:26:56,000 –> 00:26:58,840
people don’t have a great understanding
686
00:26:57,600 –> 00:27:00,440
that of right now you know they think
687
00:26:58,840 –> 00:27:02,320
that chat GPT is like some kind of
688
00:27:00,440 –> 00:27:04,039
entity rather than just like a weird
689
00:27:02,320 –> 00:27:06,080
Library interface which it is it’s a
690
00:27:04,039 –> 00:27:08,399
weird Library interface it’s it’s sort
691
00:27:06,080 –> 00:27:11,200
of predicts uh it predicts what you
692
00:27:08,399 –> 00:27:14,399
might say next based on some some
693
00:27:11,200 –> 00:27:16,880
culture has been trained on right so the
694
00:27:14,399 –> 00:27:18,559
uh so you can do that with you know with
695
00:27:16,880 –> 00:27:20,799
with people you miss and you can maybe
696
00:27:18,559 –> 00:27:22,760
even learn stuff I mean I was part of
697
00:27:20,799 –> 00:27:25,880
one of these experiments that Anna Strasser
698
00:27:22,760 –> 00:27:28,720
here in Berlin also actually she ran a
699
00:27:25,880 –> 00:27:30,919
um experiment where um they took a whole
700
00:27:28,720 –> 00:27:33,279
lot of uh Dan Dennett's writings and
701
00:27:30,919 –> 00:27:35,320
they trained up a version of Dennett and
702
00:27:33,279 –> 00:27:37,480
then they asked then they asked uh
703
00:27:35,320 –> 00:27:39,840
questions and they got like four answers
704
00:27:37,480 –> 00:27:41,919
from from this you know chat Dennett
705
00:27:39,840 –> 00:27:46,399
thing and then they asked Dennett himself
706
00:27:41,919 –> 00:27:48,960
he was still alive and uh and and people
707
00:27:46,399 –> 00:27:51,600
were absolutely at chance they they
708
00:27:48,960 –> 00:27:54,159
couldn’t um tell which one and I think
709
00:27:51,600 –> 00:27:55,480
that’s partly because he actually I I
710
00:27:54,159 –> 00:27:57,640
thought I was in an interview with him
711
00:27:55,480 –> 00:28:00,039
with a journalist about this and it was
712
00:27:57,640 –> 00:28:01,919
really interesting he he basically uh
713
00:28:00,039 –> 00:28:04,360
was accommodating to old age and he said
714
00:28:01,919 –> 00:28:06,600
look you you become famous for the ideas
715
00:28:04,360 –> 00:28:09,039
you had as a young man and not all your
716
00:28:06,600 –> 00:28:10,760
ideas just your best ideas and so then
717
00:28:09,039 –> 00:28:13,760
everyone who’s like read you and even
718
00:28:10,760 –> 00:28:15,720
your friends who are generous remember
719
00:28:13,760 –> 00:28:17,320
the best ideas you had and if they talk
720
00:28:15,720 –> 00:28:20,120
to the real you they’re gonna hear the
721
00:28:17,320 –> 00:28:21,720
random selection of bad ideas you have
722
00:28:20,120 –> 00:28:23,279
and that’s that was basically that he
723
00:28:21,720 –> 00:28:25,519
was flattered that people thought he was
724
00:28:23,279 –> 00:28:27,440
smarter than he really was yeah you know
725
00:28:25,519 –> 00:28:29,240
so he was as smart as his but Greatest
726
00:28:27,440 –> 00:28:32,600
Hits rather than as far as his normal
727
00:28:29,240 –> 00:28:37,960
self exactly yeah but it’s I guess it’s
728
00:28:32,600 –> 00:28:40,679
it’s with any any sort of living or um
729
00:28:37,960 –> 00:28:42,519
past uh artists right like you they have
730
00:28:40,679 –> 00:28:45,279
to create so much and only the
731
00:28:42,519 –> 00:28:47,720
percentage of their work is is it
732
00:28:45,279 –> 00:28:49,159
becomes famous and well I I was trying
733
00:28:47,720 –> 00:28:51,279
to
734
00:28:49,159 –> 00:28:53,799
um I would I would want you know I was
735
00:28:51,279 –> 00:28:55,559
one of the experimental subjects and and
736
00:28:53,799 –> 00:28:57,360
the main thing I was looking at was I’m
737
00:28:55,559 –> 00:28:58,600
going I don’t think he still believes
738
00:28:57,360 –> 00:28:59,840
that that must be something on his
739
00:28:58,600 –> 00:29:02,399
little text you know there’ be things
740
00:28:59,840 –> 00:29:03,679
where just the time didn’t make sense
741
00:29:02,399 –> 00:29:04,919
because I because i’ had been involved
742
00:29:03,679 –> 00:29:06,919
with some of the projects and I knew
743
00:29:04,919 –> 00:29:09,640
that he was you know but I still I was
744
00:29:06,919 –> 00:29:09,640
still at chance
745
00:29:10,279 –> 00:29:20,200
so but about that a few years ago you
746
00:29:15,559 –> 00:29:22,559
um wrote this uh article for Wired One
747
00:29:20,200 –> 00:29:27,320
Day AI Will Seem as Human as
748
00:29:22,559 –> 00:29:30,039
Anyone What Then it was 2020 I think um
749
00:29:27,320 –> 00:29:31,600
no no it was uh I think it was 22
750
00:29:30,039 –> 00:29:33,360
because it was the year it was right
751
00:29:31,600 –> 00:29:35,919
after the LaMDA thing happened so it
752
00:29:33,360 –> 00:29:38,519
time goes fast but it’s only two years
753
00:29:35,919 –> 00:29:41,799
it’s a it’s a feels like a long a long
754
00:29:38,519 –> 00:29:44,240
time to you know judging on what how how
755
00:29:41,799 –> 00:29:47,480
much of a progress uh we can see at at
756
00:29:44,240 –> 00:29:50,440
least in a commercial uh space in AI do
757
00:29:47,480 –> 00:29:53,760
you feel or like have you Revisited any
758
00:29:50,440 –> 00:29:56,640
of the beliefs you included at that time
759
00:29:53,760 –> 00:29:59,679
um around about you know the progress um
760
00:29:56,640 –> 00:30:02,120
or the way AI is is going to be
761
00:29:59,679 –> 00:30:05,919
constructed and used and you know where
762
00:30:02,120 –> 00:30:07,679
is it going or I I honestly haven't reread it
763
00:30:05,919 –> 00:30:09,360
uh in the last couple years so I can’t
764
00:30:07,679 –> 00:30:10,799
but I can’t think of anything that was
765
00:30:09,360 –> 00:30:12,480
wrong I haven’t seen something thought
766
00:30:10,799 –> 00:30:14,080
out that I was wrong in fact more often
767
00:30:12,480 –> 00:30:16,760
I think this is exactly what I said was
768
00:30:14,080 –> 00:30:19,240
going to happen and you know what what
769
00:30:16,760 –> 00:30:22,159
drives I mean I particularly was uh
770
00:30:19,240 –> 00:30:24,880
worried about the fact that people
771
00:30:22,159 –> 00:30:27,000
thought that um oh yeah I may have a
772
00:30:24,880 –> 00:30:28,159
regret I’ll come back to that okay so I
773
00:30:27,000 –> 00:30:29,320
was particularly worried at the time
774
00:30:28,159 –> 00:30:31,120
that people are saying oh don’t worry
775
00:30:29,320 –> 00:30:34,720
because humans will be more creative and
776
00:30:31,120 –> 00:30:36,279
whatever um and you know I and I think
777
00:30:34,720 –> 00:30:37,840
they were really undermining themselves
778
00:30:36,279 –> 00:30:39,480
they had kind of a religious faith that
779
00:30:37,840 –> 00:30:41,320
humans were more creative and yet any
780
00:30:39,480 –> 00:30:42,960
[ __ ] could look at the output of this
781
00:30:41,320 –> 00:30:44,919
stuff and see that was kind of creative
782
00:30:42,960 –> 00:30:46,480
right and maybe it’s not your favorite
783
00:30:44,919 –> 00:30:47,720
stuff for some people it is and you
784
00:30:46,480 –> 00:30:49,240
don’t want to undermine them and say oh
785
00:30:47,720 –> 00:30:50,919
you have no taste you think that the bot
786
00:30:49,240 –> 00:30:55,519
is better right you know I just told you
787
00:30:50,919 –> 00:30:57,639
that story about Dennett so so uh so I you
788
00:30:55,519 –> 00:30:59,120
know so I wanted to say look there’s
789
00:30:57,639 –> 00:31:00,240
something more fundamental and this is
790
00:30:59,120 –> 00:31:02,000
something I’ve been working on for a
791
00:31:00,240 –> 00:31:04,240
long time so I kind of had something to
792
00:31:02,000 –> 00:31:06,000
write about it but I just did not like
793
00:31:04,240 –> 00:31:08,159
the way that the public
794
00:31:06,000 –> 00:31:11,159
conversation was going so I I released
795
00:31:08,159 –> 00:31:12,919
basically some of my book um to into
796
00:31:11,159 –> 00:31:14,960
Wired — and I say released, I sent it to
797
00:31:12,919 –> 00:31:17,720
them they’re like nah I was like what no
798
00:31:14,960 –> 00:31:19,440
this is important you know and so there
799
00:31:17,720 –> 00:31:20,600
it got heavily edited and I think it’s
800
00:31:19,440 –> 00:31:22,279
pretty much okay there’s like one
801
00:31:20,600 –> 00:31:24,480
section I think isn’t coherent oh yeah
802
00:31:22,279 –> 00:31:28,519
the one thing that I I thought may have
803
00:31:24,480 –> 00:31:30,639
been wrong is that the I was taking the
804
00:31:28,519 –> 00:31:32,880
limit case that even in the case where
805
00:31:30,639 –> 00:31:35,679
you cannot tell it apart at all in any
806
00:31:32,880 –> 00:31:37,080
context — you know, the Turing test is
807
00:31:35,679 –> 00:31:39,760
completely
808
00:31:37,080 –> 00:31:42,039
passed — what does that mean to humans, right?
809
00:31:39,760 –> 00:31:43,559
What, and how? And basically I
810
00:31:42,039 –> 00:31:44,919
predicted that most
811
00:31:43,559 –> 00:31:48,519
people’s lives are not going to change
812
00:31:44,919 –> 00:31:50,919
that much um and because everything that
813
00:31:48,519 –> 00:31:53,720
we value is based on the basically the
814
00:31:50,919 –> 00:31:56,600
social problems of Apes right and and
815
00:31:53,720 –> 00:31:58,360
even if you look at the graphs
816
00:31:56,600 –> 00:31:59,880
that come out like they people say oh
817
00:31:58,360 –> 00:32:01,080
look at how much faster machine learning
818
00:31:59,880 –> 00:32:03,279
is going it’s like looks like it’s
819
00:32:01,080 –> 00:32:05,120
really zooming up but the point is that
820
00:32:03,279 –> 00:32:08,039
once it gets a little bit above human
821
00:32:05,120 –> 00:32:09,760
level it just plateaus why because any
822
00:32:08,039 –> 00:32:13,399
task we care about is something that we
823
00:32:09,760 –> 00:32:15,240
do right we it’s like you know nobody
824
00:32:13,399 –> 00:32:17,240
cares about the tasks at which we
825
00:32:15,240 –> 00:32:18,799
were already superhuman like you know
826
00:32:17,240 –> 00:32:20,559
like this building can contain more
827
00:32:18,799 –> 00:32:22,320
people than any person can contain right
828
00:32:20,559 –> 00:32:23,880
or or the you know calculator can do
829
00:32:22,320 –> 00:32:26,200
arithmetic faster than any person could do
830
00:32:23,880 –> 00:32:28,039
it and everybody stopped caring so all
831
00:32:26,200 –> 00:32:29,880
the stuff we think is intelligence is
832
00:32:28,039 –> 00:32:31,279
actually sort of human-like, and then
833
00:32:29,880 –> 00:32:34,039
there’s no way to be more human-like
834
00:32:31,279 –> 00:32:36,000
than a human so you you basically run
835
00:32:34,039 –> 00:32:40,279
out of you you run out of stuff you can
836
00:32:36,000 –> 00:32:42,720
do you know we have uh you know all all
837
00:32:40,279 –> 00:32:47,440
all all chess programs can play better
838
00:32:42,720 –> 00:32:49,200
than 99.99 percent of
839
00:32:47,440 –> 00:32:52,919
people but people still play chess with
840
00:32:49,200 –> 00:32:55,240
each other and you know and everybody um
841
00:32:52,919 –> 00:32:56,240
you know, people still have pub quizzes
842
00:32:55,240 –> 00:32:57,840
they don’t they don’t sit there and
843
00:32:56,240 –> 00:32:59,440
Google at the table — like, why would
844
00:32:57,840 –> 00:33:00,799
would they do that you know and you know
845
00:32:59,440 –> 00:33:02,639
maybe you know some teenagers did it
846
00:33:00,799 –> 00:33:05,000
once in a while or something but at the
847
00:33:02,639 –> 00:33:06,799
end of the day all these things are
848
00:33:05,000 –> 00:33:09,279
human relationships and jobs are human
849
00:33:06,799 –> 00:33:11,320
relationships and you know the idea that
850
00:33:09,279 –> 00:33:13,960
you could just replace everyone I mean
851
00:33:11,320 –> 00:33:15,799
there you can replace too many people in
852
00:33:13,960 –> 00:33:17,880
you know so if you aren’t thinking
853
00:33:15,799 –> 00:33:19,880
correctly about this then you can make
854
00:33:17,880 –> 00:33:21,880
big mistakes and and this is I think the
855
00:33:19,880 –> 00:33:23,240
biggest issue in AI ethics right now I
856
00:33:21,880 –> 00:33:24,519
mean a lot of people are disagreeing
857
00:33:23,240 –> 00:33:27,960
about this but I just said something
858
00:33:24,519 –> 00:33:30,120
else um on social media, but one of
859
00:33:27,960 –> 00:33:32,279
the biggest things that happened was uh
860
00:33:30,120 –> 00:33:35,960
on October 7th of
861
00:33:32,279 –> 00:33:39,039
2023 that there there was um there
862
00:33:35,960 –> 00:33:42,600
weren’t enough humans watching the
863
00:33:39,039 –> 00:33:44,279
perimeter of the Gaza Israel divide and
864
00:33:42,600 –> 00:33:47,960
because they thought that having a few
865
00:33:44,279 –> 00:33:50,159
cameras was enough and I’m not sure who
866
00:33:47,960 –> 00:33:52,799
thought that having a few cameras was
867
00:33:50,159 –> 00:33:54,360
enough and I don’t even know you know
868
00:33:52,799 –> 00:33:56,320
what what might have happened around
869
00:33:54,360 –> 00:33:58,480
those cameras but I had identified and I
870
00:33:56,320 –> 00:34:01,159
managed to find the the video
871
00:33:58,480 –> 00:34:03,480
of me saying in 2018 to Eric Schmidt
872
00:34:01,159 –> 00:34:06,039
this is a bad idea you do not want to
873
00:34:03,480 –> 00:34:08,040
have no people and a few cameras because
874
00:34:06,039 –> 00:34:09,839
first of all you know that’s like one
875
00:34:08,040 –> 00:34:11,639
point of failure one algorithm that you
876
00:34:09,839 –> 00:34:13,839
could work around and and you can break
877
00:34:11,639 –> 00:34:16,359
security and so don’t you know don’t
878
00:34:13,839 –> 00:34:19,240
don’t give people such a simple Target
879
00:34:16,359 –> 00:34:21,879
but secondly um the corruption potential
880
00:34:19,240 –> 00:34:24,320
for corruption once you do that then if
881
00:34:21,879 –> 00:34:26,000
somebody does want something to happen
882
00:34:24,320 –> 00:34:28,320
then it’s relatively easy for them to
883
00:34:26,000 –> 00:34:30,599
mastermind that. You want whistleblowers, you
884
00:34:28,320 –> 00:34:32,200
want soldiers you you you know it’s not
885
00:34:30,599 –> 00:34:33,520
that we're going to eliminate My Lais,
886
00:34:32,200 –> 00:34:36,359
it’s that we are going to eliminate
887
00:34:33,520 –> 00:34:37,560
people stopping and reporting My Lais,
888
00:34:36,359 –> 00:34:39,520
and that was the other thing I said
889
00:34:37,560 –> 00:34:43,280
earlier was that what we’re doing to
890
00:34:39,520 –> 00:34:45,800
journalism um uh by by just taking all
891
00:34:43,280 –> 00:34:47,079
the money away from newspapers right I
892
00:34:45,800 –> 00:34:49,720
don’t know again this is like I don’t
893
00:34:47,079 –> 00:34:50,839
think that was the original plan like to
894
00:34:49,720 –> 00:34:52,480
eliminate journalism I think the
895
00:34:50,839 –> 00:34:55,320
original plan was like gez how do we pay
896
00:34:52,480 –> 00:34:56,639
for our magic search machine right but
897
00:34:55,320 –> 00:34:58,160
but basically all that money now is
898
00:34:56,639 –> 00:35:00,000
going to Google and Facebook
899
00:34:58,160 –> 00:35:03,000
and and it’s very very hard to keep the
900
00:35:00,000 –> 00:35:05,800
journalist employed and um and now
901
00:35:03,000 –> 00:35:07,359
people are stealing their their text and
902
00:35:05,800 –> 00:35:09,440
and making fake people and pretending
903
00:35:07,359 –> 00:35:11,280
that they wrote the text and this is the
904
00:35:09,440 –> 00:35:12,560
kind of problem. I think it's solvable — I
905
00:35:11,280 –> 00:35:14,839
don’t think it’s like oh my God it’s the
906
00:35:12,560 –> 00:35:17,040
end of the world, but we have to realize
907
00:35:14,839 –> 00:35:18,400
it’s important and choose to solve it
908
00:35:17,040 –> 00:35:21,079
right you know just like you know
909
00:35:18,400 –> 00:35:23,839
smoking causing cancer and and nuclear
910
00:35:21,079 –> 00:35:26,000
and biological weapons, you know. This
911
00:35:23,839 –> 00:35:28,079
is something that requires investment
912
00:35:26,000 –> 00:35:29,760
from the government um and from
913
00:35:28,079 –> 00:35:31,520
governments which means investment from
914
00:35:29,760 –> 00:35:34,560
taxpayers we all have to say this is
915
00:35:31,520 –> 00:35:39,280
something that matters.
916
00:35:34,560 –> 00:35:41,640
Paywalls solve this
917
00:35:39,280 –> 00:35:44,000
issue only in a very very small
918
00:35:41,640 –> 00:35:48,520
percentage; usually people got used
919
00:35:44,000 –> 00:35:51,240
to not paying uh and and being okay with
920
00:35:48,520 –> 00:35:54,240
being advertised um even though most of
921
00:35:51,240 –> 00:35:58,760
the things were you know
922
00:35:54,240 –> 00:36:01,200
misleading, to put it lightly.
923
00:35:58,760 –> 00:36:05,040
so what do you see as a potential
924
00:36:01,200 –> 00:36:08,640
solution you know for for journalism to
925
00:36:05,040 –> 00:36:10,280
survive right now? Well, I
926
00:36:08,640 –> 00:36:13,400
actually think journalism is a lot like
927
00:36:10,280 –> 00:36:15,560
higher education um they’re both and you
928
00:36:13,400 –> 00:36:16,920
know uh autocratic governments find them
929
00:36:15,560 –> 00:36:19,440
threatening because they’re basically
930
00:36:16,920 –> 00:36:21,200
information seeking and disseminating um
931
00:36:19,440 –> 00:36:23,119
and they’re they’re absolutely essential
932
00:36:21,200 –> 00:36:26,640
to a democracy in fact not just
933
00:36:23,119 –> 00:36:28,319
democracy — the Cultural Revolution
934
00:36:26,640 –> 00:36:30,280
was stopped by China because they
935
00:36:28,319 –> 00:36:32,640
realized they were falling behind Russia
936
00:36:30,280 –> 00:36:34,839
technologically you know so it it isn’t
937
00:36:32,640 –> 00:36:36,920
only you know so that you can vote it’s
938
00:36:34,839 –> 00:36:38,839
also so you can like you know have a
939
00:36:36,920 –> 00:36:43,119
working uh economy and all kinds of
940
00:36:38,839 –> 00:36:46,640
things you know but um so I think that
941
00:36:43,119 –> 00:36:48,400
uh one solution would be to um think a
942
00:36:46,640 –> 00:36:50,720
little harder about how to fund both and
943
00:36:48,400 –> 00:36:53,040
to bring them together um and maybe
944
00:36:50,720 –> 00:36:55,560
house them together um but there I do
945
00:36:53,040 –> 00:36:56,880
think that there's — well, it doesn't make
946
00:36:55,560 –> 00:36:58,319
sense. So the British went
947
00:36:56,880 –> 00:37:00,240
through this situation while I was there
948
00:36:58,319 –> 00:37:01,520
where they they sort of decided that
949
00:37:00,240 –> 00:37:02,760
maybe half of people should go to
950
00:37:01,520 –> 00:37:04,480
undergraduate now I’m not sure that
951
00:37:02,760 –> 00:37:05,839
number is true some people said that you
952
00:37:04,480 –> 00:37:07,480
know it got it was through a
953
00:37:05,839 –> 00:37:09,319
miscommunication with Tony Blair or
954
00:37:07,480 –> 00:37:10,599
something but once the government had
955
00:37:09,319 –> 00:37:13,760
announced the target they couldn’t go
956
00:37:10,599 –> 00:37:15,359
back so they they uh but anyway so so
957
00:37:13,760 –> 00:37:17,640
let’s just take the simple case half of
958
00:37:15,359 –> 00:37:20,119
people are getting higher education who
959
00:37:17,640 –> 00:37:22,319
pays for it so historically in Britain
960
00:37:20,119 –> 00:37:25,400
the taxpayers have paid for 5% of people
961
00:37:22,319 –> 00:37:28,599
to go to university um it doesn’t really
962
00:37:25,400 –> 00:37:31,640
make sense for everyone
963
00:37:28,599 –> 00:37:33,680
to uh pay the same amount and half the
964
00:37:31,640 –> 00:37:34,880
people to get more benefit out of it now
965
00:37:33,680 –> 00:37:37,160
if you can say that doing an
966
00:37:34,880 –> 00:37:39,040
undergraduate degree is work and it is
967
00:37:37,160 –> 00:37:41,480
you know you’re taking uh three or four
968
00:37:39,040 –> 00:37:44,160
years out of out of when you could have
969
00:37:41,480 –> 00:37:47,400
been getting paid you know it’s it’s a
970
00:37:44,160 –> 00:37:49,359
bunch of effort um that you know that
971
00:37:47,400 –> 00:37:51,920
then then you could say well you know
972
00:37:49,359 –> 00:37:54,839
maybe it equals equalizes out but in
973
00:37:51,920 –> 00:37:56,440
general um the reason people do it all
974
00:37:54,839 –> 00:37:57,839
that effort is because they expect over
975
00:37:56,440 –> 00:38:00,319
the period of their lifetime to get a
976
00:37:57,839 –> 00:38:01,960
higher return and so I think you have to
977
00:38:00,319 –> 00:38:03,880
do some kind of balance between like
978
00:38:01,960 –> 00:38:06,400
this is a public good it does help us
979
00:38:03,880 –> 00:38:07,960
all to have these educated people um
980
00:38:06,400 –> 00:38:09,599
again to some level and and we should
981
00:38:07,960 –> 00:38:12,680
come up with a better metric of exactly
982
00:38:09,599 –> 00:38:16,800
how many people need what education
983
00:38:12,680 –> 00:38:18,400
um but then you also need um then you
984
00:38:16,800 –> 00:38:21,240
also need to figure out how much do they
985
00:38:18,400 –> 00:38:23,000
contribute so I think actually the the
986
00:38:21,240 –> 00:38:24,560
solution that the Liberal Democrats came up
987
00:38:23,000 –> 00:38:26,920
with when they were sort of semi in
988
00:38:24,560 –> 00:38:29,680
power was sensible: they said
989
00:38:26,920 –> 00:38:32,280
oh well you you you pay some proportion
990
00:38:29,680 –> 00:38:34,160
of your salary over some threshold for
991
00:38:32,280 –> 00:38:36,400
some number of years like I think it was
992
00:38:34,160 –> 00:38:38,119
30 or something, and then it's over. So if
993
00:38:36,400 –> 00:38:40,480
somebody does decide to go into you know
994
00:38:38,119 –> 00:38:42,200
the Arts or or volunteerism or something
995
00:38:40,480 –> 00:38:44,640
then they never pay it back, and so
996
00:38:42,200 –> 00:38:46,920
it's not like a regular debt. So you
997
00:38:44,640 –> 00:38:50,280
only pay back out of the extra benefit that you
998
00:38:46,920 –> 00:38:51,920
got — basically, you
999
00:38:50,280 –> 00:38:53,960
have to
1000
00:38:51,920 –> 00:38:56,319
make more than average to have your
1001
00:38:53,960 –> 00:38:57,599
overall tuition costs be higher than
1002
00:38:56,319 –> 00:38:59,720
they would have been before the this
1003
00:38:57,599 –> 00:39:01,079
whole thing started and and if you’re
1004
00:38:59,720 –> 00:39:03,440
making more than average well then
1005
00:39:01,079 –> 00:39:04,920
presumably your education’s helping you
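(A minimal sketch of the income-contingent repayment scheme described here, in Python. The threshold, rate, and 30-year write-off below are hypothetical placeholders, not the actual UK parameters.)

```python
# Income-contingent repayment: pay a proportion of income above a threshold,
# with the balance written off after a fixed number of years.
def yearly_repayment(salary, threshold=27_000, rate=0.09):
    """Repay a fixed share of income above the threshold; nothing below it."""
    return max(0.0, salary - threshold) * rate

def total_repaid(salaries, write_off_years=30, threshold=27_000, rate=0.09):
    """Sum repayments over a career; anything left after the write-off period is forgiven."""
    return sum(yearly_repayment(s, threshold, rate) for s in salaries[:write_off_years])

# Someone earning below the threshold (e.g. in the arts or volunteering) repays nothing;
# a higher earner repays in proportion to the extra benefit of the degree.
print(total_repaid([25_000] * 30))  # 0.0
print(total_repaid([60_000] * 30))  # 89100.0  (0.09 * 33_000 * 30)
```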
1006
00:39:03,440 –> 00:39:07,000
um I mean which is of course a big
1007
00:39:04,920 –> 00:39:08,400
presumption but it’s some kind of it’s
1008
00:39:07,000 –> 00:39:09,720
none of this stuff is ever going to be
1009
00:39:08,400 –> 00:39:11,839
perfect it’s all about choosing
1010
00:39:09,720 –> 00:39:14,359
something that makes incentives right so
1011
00:39:11,839 –> 00:39:16,520
Society can improve itself uh as much as
1012
00:39:14,359 –> 00:39:17,960
possible that’s that’s the point so
1013
00:39:16,520 –> 00:39:20,119
anyway I thought that was pretty good
1014
00:39:17,960 –> 00:39:21,760
but it was pretty complicated even the
1015
00:39:20,119 –> 00:39:24,160
Liberal Democrats didn't try to explain it
1016
00:39:21,760 –> 00:39:27,880
at the next election of course they got
1017
00:39:24,160 –> 00:39:30,520
slaughtered. I remember when — so I
1018
00:39:27,880 –> 00:39:32,720
started my studies — you know, I'm
1019
00:39:30,520 –> 00:39:35,800
Polish, I started university in Poland
1020
00:39:32,720 –> 00:39:38,280
and I moved to to UK to London to study
1021
00:39:35,800 –> 00:39:42,880
to continue studies and I remember you
1022
00:39:38,280 –> 00:39:47,319
know Poland is um has a lower income per
1023
00:39:42,880 –> 00:39:51,359
capita than the UK, and I remember it
1024
00:39:47,319 –> 00:39:53,400
was so much easier for for us to to get
1025
00:39:51,359 –> 00:39:57,760
uh I think it was grants but also some
1026
00:39:53,400 –> 00:40:00,200
some kind of credits uh to study
1027
00:39:57,760 –> 00:40:03,880
and since brexit I think that that was
1028
00:40:00,200 –> 00:40:05,720
taken away as well wow yeah no I think
1029
00:40:03,880 –> 00:40:07,839
uh yeah no I do think the British did a
1030
00:40:05,720 –> 00:40:08,880
lot of things to try to make it work um
1031
00:40:07,839 –> 00:40:10,160
they but they were just trying to figure
1032
00:40:08,880 –> 00:40:12,200
out how to fund and they had a pretty
1033
00:40:10,160 –> 00:40:14,720
good University system and I think they
1034
00:40:12,200 –> 00:40:16,760
still have one um by Legacy but but they
1035
00:40:14,720 –> 00:40:18,800
are losing — they're closing a lot of
1036
00:40:16,760 –> 00:40:20,560
departments and things it’s it’s it’s
1037
00:40:18,800 –> 00:40:23,800
very
1038
00:40:20,560 –> 00:40:27,480
sad. Yes, I agree with that. You
1039
00:40:23,800 –> 00:40:30,760
mentioned the transparency um issue and
1040
00:40:27,480 –> 00:40:35,880
the accountability uh of those of
1041
00:40:30,760 –> 00:40:40,800
Designing uh systems which can bend
1042
00:40:35,880 –> 00:40:42,839
um the public uh opinion and and
1043
00:40:40,800 –> 00:40:46,400
eventually how they vote and how how
1044
00:40:42,839 –> 00:40:49,160
they make important choices.
1045
00:40:46,400 –> 00:40:52,480
So how, or who, do you think
1046
00:40:49,160 –> 00:40:54,960
should be held responsible for
1047
00:40:52,480 –> 00:40:58,440
designing those systems, for misuse,
1048
00:40:54,960 –> 00:41:03,000
and um
1049
00:40:58,440 –> 00:41:05,440
for creating those models
1050
00:41:03,000 –> 00:41:08,280
so that they are not as biased as they are
1051
00:41:05,440 –> 00:41:10,800
now? Okay, all right, let's leave bias out
1052
00:41:08,280 –> 00:41:12,599
of this for a second and go just back to
1053
00:41:10,800 –> 00:41:16,400
the accountability question so the so
1054
00:41:12,599 –> 00:41:18,400
the basic point is that AI is no
1055
00:41:16,400 –> 00:41:19,880
different from any other Advanced
1056
00:41:18,400 –> 00:41:23,520
complicated technology you have to deal
1057
00:41:19,880 –> 00:41:26,040
with so just like a car or or you know
1058
00:41:23,520 –> 00:41:27,800
just like a car that um if there’s
1059
00:41:26,040 –> 00:41:29,839
something that the designer
1060
00:41:27,800 –> 00:41:31,240
or developer has done wrong so the
1061
00:41:29,839 –> 00:41:32,560
brakes are failing or something then
1062
00:41:31,240 –> 00:41:34,599
they have then they’re liable and they
1063
00:41:32,560 –> 00:41:37,040
have to do a recall if it’s the person
1064
00:41:34,599 –> 00:41:38,920
driving it that got drunk or whatever or
1065
00:41:37,040 –> 00:41:40,200
or just was not bothering to pay
1066
00:41:38,920 –> 00:41:42,599
attention or something playing with
1067
00:41:40,200 –> 00:41:44,280
their phone then it’s their fault and so
1068
00:41:42,599 –> 00:41:46,720
those are the only two kinds of people
1069
00:41:44,280 –> 00:41:48,119
that can be at fault basically the other
1070
00:41:46,720 –> 00:41:49,680
thing with any digital system whether
1071
00:41:48,119 –> 00:41:52,800
it’s AI or not of course is that you
1072
00:41:49,680 –> 00:41:55,359
could also have hackers um but then
1073
00:41:52,800 –> 00:41:57,480
again if you have hackers then that’s a
1074
00:41:55,359 –> 00:41:59,440
problem either of the Developers who
1075
00:41:57,480 –> 00:42:01,040
didn’t provide you enough cyber security
1076
00:41:59,440 –> 00:42:05,359
or the users that didn’t keep their
1077
00:42:01,040 –> 00:42:07,400
cybersecurity updated. So
1078
00:42:05,359 –> 00:42:10,079
either way, basically, there are only
1079
00:42:07,400 –> 00:42:12,400
the owners, the operators, and the
1080
00:42:10,079 –> 00:42:16,000
people that sold them the system.
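(A schematic of the liability reasoning just laid out — not a legal standard, only the decision structure described, in which fault lands either on the developer/vendor or on the owner/operator; the function and its argument names are made up for illustration.)

```python
# Who carries liability, under the framework described above?
def who_is_liable(product_defect: bool, user_misuse: bool,
                  hacked: bool, security_kept_updated: bool = True) -> str:
    if product_defect:
        return "developer/vendor (e.g. faulty brakes -> recall)"
    if user_misuse:
        return "owner/operator (e.g. drunk or distracted driver)"
    if hacked:
        # A hack folds back onto the same two parties:
        if security_kept_updated:
            return "developer/vendor (shipped inadequate cybersecurity)"
        return "owner/operator (failed to keep security updated)"
    return "no fault identified among these parties"

print(who_is_liable(product_defect=False, user_misuse=False,
                    hacked=True, security_kept_updated=False))
```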
1081
00:42:12,400 –> 00:42:18,359
Now, it's so funny — AI
1082
00:42:16,000 –> 00:42:20,359
people are like oh but you know but we
1083
00:42:18,359 –> 00:42:22,119
have all these software libraries and
1084
00:42:20,359 –> 00:42:23,880
you know data and everything it’s so
1085
00:42:22,119 –> 00:42:25,559
complicated you know it’s exactly the
1086
00:42:23,880 –> 00:42:26,880
same for building a fighter jet or a
1087
00:42:25,559 –> 00:42:29,079
hospital or something you know there’s
1088
00:42:26,880 –> 00:42:30,839
you have to Source all the products you
1089
00:42:29,079 –> 00:42:32,720
have to worry about your your supply
1090
00:42:30,839 –> 00:42:34,960
chain you have to you have to vet all
1091
00:42:32,720 –> 00:42:37,319
those people — it's normal. It's just that
1092
00:42:34,960 –> 00:42:41,040
software is becoming normal. Only — well, I
1093
00:42:37,319 –> 00:42:43,640
mean it’s normal for for um for
1094
00:42:41,040 –> 00:42:45,319
dangerous stuff you know like building a
1095
00:42:43,640 –> 00:42:47,319
big structure that can fall on people
1096
00:42:45,319 –> 00:42:49,720
then you have to get licensed you know
1097
00:42:47,319 –> 00:42:52,240
It used to be that you
1098
00:42:49,720 –> 00:42:53,800
had to get licenses to be an architect
1099
00:42:52,240 –> 00:42:56,200
or to build your own house or whatever
1100
00:42:53,800 –> 00:42:57,880
but now you have to all the Architects
1101
00:42:56,200 –> 00:42:59,480
get licensed at school
1102
00:42:57,880 –> 00:43:00,960
and then um the building has to get
1103
00:42:59,480 –> 00:43:03,200
planning permission and the building has
1104
00:43:00,960 –> 00:43:04,880
to get inspected you know you don’t just
1105
00:43:03,200 –> 00:43:07,280
build it used to be any rich person
1106
00:43:04,880 –> 00:43:09,440
could build things anywhere but then as
1107
00:43:07,280 –> 00:43:12,200
ordinary people’s lives literally became
1108
00:43:09,440 –> 00:43:14,359
more valuable they were able to sue if
1109
00:43:12,200 –> 00:43:16,400
if a bunch of them got crushed right and
1110
00:43:14,359 –> 00:43:17,920
you see still to this day around
1111
00:43:16,400 –> 00:43:19,280
different parts of the world that
1112
00:43:17,920 –> 00:43:21,680
different people worry about this to
1113
00:43:19,280 –> 00:43:23,640
different extents you know it was really
1114
00:43:21,680 –> 00:43:25,839
tragic that in Turkey you know that you
1115
00:43:23,640 –> 00:43:28,200
saw that only The Institute of
1116
00:43:25,839 –> 00:43:30,920
Engineering had complied
1117
00:43:28,200 –> 00:43:32,400
with the with the laws the regulations
1118
00:43:30,920 –> 00:43:33,960
about how to build and their building
1119
00:43:32,400 –> 00:43:36,040
was perfectly still standing when
1120
00:43:33,960 –> 00:43:39,720
everything around it had collapsed. You
1121
00:43:36,040 –> 00:43:41,280
know it was awful um but but it showed
1122
00:43:39,720 –> 00:43:43,200
that that it was corruption and not
1123
00:43:41,280 –> 00:43:45,280
following the law and not enforcing the
1124
00:43:43,200 –> 00:43:48,559
law that had led to all those hundreds
1125
00:43:45,280 –> 00:43:50,280
of thousands of deaths right so uh I
1126
00:43:48,559 –> 00:43:51,640
it’s the same thing and and we have to
1127
00:43:50,280 –> 00:43:53,319
deal with the fact that we make
1128
00:43:51,640 –> 00:43:54,839
systems that are so essential to
1129
00:43:53,319 –> 00:43:56,720
people’s lives now we’re not just
1130
00:43:54,839 –> 00:43:58,280
building toys anymore we’re not even
1131
00:43:56,720 –> 00:43:59,760
just building spreadsheets and stuff
1132
00:43:58,280 –> 00:44:02,359
that helps with an individual business
1133
00:43:59,760 –> 00:44:04,880
now we build things that entire complexes
1134
00:44:02,359 –> 00:44:07,680
depend on and so we have to expect that
1135
00:44:04,880 –> 00:44:11,079
we have to do a better job so basically
1136
00:44:07,680 –> 00:44:12,720
um you you are responsible if you bought
1137
00:44:11,079 –> 00:44:15,720
it and you’ve used it wrong you’re
1138
00:44:12,720 –> 00:44:17,559
responsible um and you know even that if
1139
00:44:15,720 –> 00:44:20,000
if if it was too easy to use wrong you
1140
00:44:17,559 –> 00:44:21,359
might consider the company responsible
1141
00:44:20,000 –> 00:44:23,160
if the company finds out that it was
1142
00:44:21,359 –> 00:44:25,119
somewhere in the further back like so
1143
00:44:23,160 –> 00:44:27,319
there was some company that you know
1144
00:44:25,119 –> 00:44:29,079
sourced something from somewhere else
1145
00:44:27,319 –> 00:44:30,800
that’s their problem and so they they
1146
00:44:29,079 –> 00:44:32,160
can worry about that with respect
1147
00:44:30,800 –> 00:44:34,160
to the contracts that they have with
1148
00:44:32,160 –> 00:44:35,960
their suppliers and vendors but you as
1149
00:44:34,160 –> 00:44:37,359
the user don’t need to worry about that
1150
00:44:35,960 –> 00:44:38,839
just like if you went into a bank you
1151
00:44:37,359 –> 00:44:40,319
don’t care which employee was the one
1152
00:44:38,839 –> 00:44:41,880
who stole your money you just need your
1153
00:44:40,319 –> 00:44:43,240
money back and and that’s what the
1154
00:44:41,880 –> 00:44:45,000
government is going to force the bank to
1155
00:44:43,240 –> 00:44:48,880
do for you hopefully you’ve got that
1156
00:44:45,000 –> 00:44:50,400
right lawyers and things um so uh so
1157
00:44:48,880 –> 00:44:53,240
yeah that’s the situation that we are
1158
00:44:50,400 –> 00:44:55,319
now in that’s what the AI act says and
1159
00:44:53,240 –> 00:44:57,119
it was just such a big deal I mean I
1160
00:44:55,319 –> 00:44:59,800
just can’t believe how much trouble
1161
00:44:57,119 –> 00:45:02,800
there was around such a simple idea you
1162
00:44:59,800 –> 00:45:04,400
know that basically um software is is
1163
00:45:02,800 –> 00:45:06,280
actually a product it’s not not a
1164
00:45:04,400 –> 00:45:08,680
service I mean it’s sort of it can
1165
00:45:06,280 –> 00:45:10,720
provide a service, but the
1166
00:45:08,680 –> 00:45:12,720
software itself is a product and you you
1167
00:45:10,720 –> 00:45:15,160
have all the obligations including being
1168
00:45:12,720 –> 00:45:17,920
able to prove that it was your user who
1169
00:45:15,160 –> 00:45:20,559
misused it so that is on the developer
1170
00:45:17,920 –> 00:45:22,119
if they can't
1171
00:45:20,559 –> 00:45:24,480
show that the user is the one who messed
1172
00:45:22,119 –> 00:45:26,160
up then it’s on them they’re the ones
1173
00:45:24,480 –> 00:45:30,240
liable to society if there's damage
1174
00:45:26,160 –> 00:45:32,720
caused. Mhm. Is there any aspect you don't
1175
00:45:30,240 –> 00:45:36,599
agree or you think it’s missing from the
1176
00:45:32,720 –> 00:45:38,400
Act? Um, I think, to be honest, I read it a
1177
00:45:36,599 –> 00:45:39,800
bunch of times before the final one and
1178
00:45:38,400 –> 00:45:42,720
now that it’s over I kind of haven’t
1179
00:45:39,800 –> 00:45:44,520
read the final version um but I
1180
00:45:42,720 –> 00:45:46,440
understand that it looked like
1181
00:45:44,520 –> 00:45:48,800
a lot of stuff got shoehorned into it
1182
00:45:46,440 –> 00:45:52,160
towards the end, and that there's too
1183
00:45:48,800 –> 00:45:54,760
wide of holes open to interpretation and
1184
00:45:52,160 –> 00:45:56,359
interpretation means massive lobbying
1185
00:45:54,760 –> 00:45:58,720
from people with more money than they
1186
00:45:56,359 –> 00:46:00,880
should be allowed to have. You know,
1187
00:45:58,720 –> 00:46:02,680
it’s literally that that the US has
1188
00:46:00,880 –> 00:46:04,599
stopped enforcing antitrust is a big
1189
00:46:02,680 –> 00:46:06,920
part of the problem inequality is
1190
00:46:04,599 –> 00:46:10,359
spiraling and then you have you have
1191
00:46:06,920 –> 00:46:12,839
these just insane processes. But by
1192
00:46:10,359 –> 00:46:16,599
and large, no, I think it's great. I
1193
00:46:12,839 –> 00:46:19,119
guess the entire thing about um about uh
1194
00:46:16,599 –> 00:46:20,720
generative AI that or Foundation models
1195
00:46:19,119 –> 00:46:22,880
however you want to call it I don’t
1196
00:46:20,720 –> 00:46:25,240
think we needed it at all I think that
1197
00:46:22,880 –> 00:46:27,280
the framework we had was
1198
00:46:25,240 –> 00:46:28,880
totally adequate, but unfortunately a
1199
00:46:27,280 –> 00:46:30,800
whole lot of legislators were were
1200
00:46:28,880 –> 00:46:32,960
fooled into believing that oh this is
1201
00:46:30,800 –> 00:46:34,319
beyond what we thought of and and the
1202
00:46:32,960 –> 00:46:35,880
only good thing that came out of that
1203
00:46:34,319 –> 00:46:38,359
was that it helped them think more about
1204
00:46:35,880 –> 00:46:41,240
IP and the IP behind the data so I think
1205
00:46:38,359 –> 00:46:43,520
there was a little bit of of uh stuff
1206
00:46:41,240 –> 00:46:46,480
there but anyway as far as I understand
1207
00:46:43,520 –> 00:46:49,280
the final version of the law, what
1208
00:46:46,480 –> 00:46:51,559
was going on was that uh that a couple
1209
00:46:49,280 –> 00:46:53,839
of big tech companies had put you know a
1210
00:46:51,559 –> 00:46:56,960
couple like four or five have put
1211
00:46:53,839 –> 00:46:58,200
billions — I mean billions and billions, the
1212
00:46:56,960 –> 00:47:00,960
Carl Sagan thing, but this year we're
1213
00:46:58,200 –> 00:47:04,160
talking about dollars not stars, sadly —
1214
00:47:00,960 –> 00:47:05,839
so billions and billions of dollars into
1215
00:47:04,160 –> 00:47:08,000
these products that, it's not clear,
1216
00:47:05,839 –> 00:47:09,839
have any real use I mean they they do
1217
00:47:08,000 –> 00:47:12,480
make some people more productive in some
1218
00:47:09,839 –> 00:47:15,000
contexts but it’s not clear whether they
1219
00:47:12,480 –> 00:47:17,400
make them enough more productive to pay
1220
00:47:15,000 –> 00:47:21,040
for the costs the overhead the Energy
1221
00:47:17,400 –> 00:47:22,280
overhead um and so while those people
1222
00:47:21,040 –> 00:47:23,319
are trying to sort that out the same
1223
00:47:22,280 –> 00:47:24,839
thing was true with Google at the
1224
00:47:23,319 –> 00:47:26,040
beginning incidentally this is going
1225
00:47:24,839 –> 00:47:28,359
back to when they finally figured out
1226
00:47:26,040 –> 00:47:29,800
the advertising model right but so at
1227
00:47:28,359 –> 00:47:31,160
the beginning Google just said forget
1228
00:47:29,800 –> 00:47:32,559
that we’re just we’re just creating a
1229
00:47:31,160 –> 00:47:35,720
great product and we’ll worry about you
1230
00:47:32,559 –> 00:47:37,440
know how to pay for it later um but what
1231
00:47:35,720 –> 00:47:39,079
they’re worried about is that in fact
1232
00:47:37,440 –> 00:47:40,720
you don’t need that much data this is
1233
00:47:39,079 –> 00:47:42,520
like when I was talking about Chess
1234
00:47:40,720 –> 00:47:44,359
right. Every chess program, even ones
1235
00:47:42,520 –> 00:47:46,160
people like hack together in a garage
1236
00:47:44,359 –> 00:47:48,400
and sell to kids on like little plastic
1237
00:47:46,160 –> 00:47:50,520
toys, is playing better than humans,
1238
00:47:48,400 –> 00:47:52,640
right. You don't need to be, you know, Deep
1239
00:47:50,520 –> 00:47:55,119
Blue anymore — you probably
1240
00:47:52,640 –> 00:47:57,160
still need to be that to beat Kasparov, but
1241
00:47:55,119 –> 00:48:01,280
you don’t you don’t need that much power
1242
00:47:57,160 –> 00:48:03,720
to to beat 99% of people and so so that
1243
00:48:01,280 –> 00:48:05,319
was what — right before
1244
00:48:03,720 –> 00:48:07,440
all this stuff happened there was this
1245
00:48:05,319 –> 00:48:09,119
leak from Google saying we
1246
00:48:07,440 –> 00:48:11,520
don't have a moat and neither does Open
1247
00:48:09,119 –> 00:48:13,359
AI, we're putting all this investment in
1248
00:48:11,520 –> 00:48:15,599
and and you can find these cheap
1249
00:48:13,359 –> 00:48:18,000
products that um you don’t have to pay
1250
00:48:15,599 –> 00:48:20,079
much for that that do basically the same
1251
00:48:18,000 –> 00:48:21,559
thing and why you know why is anyone
1252
00:48:20,079 –> 00:48:23,599
going to pay for the amount the amount
1253
00:48:21,559 –> 00:48:25,319
of money that we’re we’re spending and
1254
00:48:23,599 –> 00:48:27,680
so basically what they really really
1255
00:48:25,319 –> 00:48:29,400
wanted was that you know ban anybody
1256
00:48:27,680 –> 00:48:31,960
else working on that but them make it
1257
00:48:29,400 –> 00:48:34,400
just prohibitively expensive and instead
1258
00:48:31,960 –> 00:48:36,160
what the AI Act actually does is say
1259
00:48:34,400 –> 00:48:37,839
this only regulates people over a
1260
00:48:36,160 –> 00:48:39,440
certain size which as far as I
1261
00:48:37,839 –> 00:48:41,720
understand there are only two models big
1262
00:48:39,440 –> 00:48:43,680
enough to be covered. So the only people that
1263
00:48:41,720 –> 00:48:45,200
have extra stuff in the AI act are
1264
00:48:43,680 –> 00:48:48,000
exactly the people that caused all this
1265
00:48:45,200 –> 00:48:50,319
trouble and cost the EU so much effort
1266
00:48:48,000 –> 00:48:52,440
um and time and and and loss of sleep
1267
00:48:50,319 –> 00:48:54,079
and everything else so I think that’s
1268
00:48:52,440 –> 00:48:58,040
kind of — if it's true, that's kind of wicked.
1269
00:48:54,079 –> 00:49:01,680
I love it. It would otherwise
1270
00:48:58,040 –> 00:49:03,160
be like catastrophic to to small
1271
00:49:01,680 –> 00:49:06,240
startups and and companies who are
1272
00:49:03,160 –> 00:49:08,799
trying to you know um
1273
00:49:06,240 –> 00:49:10,760
challenge — a big word — but yeah, you know,
1274
00:49:08,799 –> 00:49:12,359
but I mean that was the
1275
00:49:10,760 –> 00:49:14,160
thing: Sam Altman was going to the
1276
00:49:12,359 –> 00:49:16,160
Congress saying oh please please I’m a
1277
00:49:14,160 –> 00:49:17,480
good guy you know bind my hands and
1278
00:49:16,160 –> 00:49:19,920
anyone else that looks like me because
1279
00:49:17,480 –> 00:49:21,559
hey, I'm leading — but
1280
00:49:19,920 –> 00:49:23,640
don't touch any other kind of AI because
1281
00:49:21,559 –> 00:49:26,319
that would hurt the — and it's like, oh, you
1282
00:49:23,640 –> 00:49:28,960
mean the SMEs? Like, say, Microsoft, who is
1283
00:49:26,319 –> 00:49:32,480
funding you, actually paying your billions —
1284
00:49:28,960 –> 00:49:34,480
your 14 billion electricity bills and
1285
00:49:32,480 –> 00:49:36,400
and uh you know basically all the stuff
1286
00:49:34,480 –> 00:49:39,280
he said. That other kind of AI — like,
1287
00:49:36,400 –> 00:49:43,119
you know, search is AI, right? Like,
1288
00:49:39,280 –> 00:49:44,599
that's like AI 101 in 1993,
1289
00:49:43,119 –> 00:49:46,960
that you know AI is basically just
1290
00:49:44,599 –> 00:49:48,559
search, right — search is AI, AI is search,
1291
00:49:46,960 –> 00:49:51,440
you know so you’re always trying to find
1292
00:49:48,559 –> 00:49:53,559
the next thing to do, right.
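(A minimal sketch of the "AI is search" point — a greedy best-first search that keeps picking whichever next step a heuristic scores best until a goal test passes. The toy graph, heuristic, and goal are stand-ins, not any particular system.)

```python
import heapq

def best_first_search(start, goal, neighbors, heuristic):
    """Expand the most promising state first; return a path to the goal, if any."""
    frontier = [(heuristic(start), start, [start])]
    seen = set()
    while frontier:
        _, state, path = heapq.heappop(frontier)
        if state == goal:
            return path
        if state in seen:
            continue
        seen.add(state)
        for nxt in neighbors(state):
            heapq.heappush(frontier, (heuristic(nxt), nxt, path + [nxt]))
    return None

# Toy problem: walk the number line from 0 to 7, guided by distance to the goal.
print(best_first_search(0, 7, lambda s: [s - 1, s + 1], lambda s: abs(7 - s)))
# -> [0, 1, 2, 3, 4, 5, 6, 7]
```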
1293
00:49:51,440 –> 00:49:54,839
So it's integral — like all these software
1294
00:49:53,559 –> 00:49:56,359
products that these companies are
1295
00:49:54,839 –> 00:49:59,200
producing they don’t want those to be
1296
00:49:56,359 –> 00:50:00,839
regulated, which is nuts, you know. It
1297
00:49:59,200 –> 00:50:03,200
it’s just it’s just nuts because like I
1298
00:50:00,839 –> 00:50:07,040
said, it's not that onerous — they just
1299
00:50:03,200 –> 00:50:09,240
need to you know to grow up and and deal
1300
00:50:07,040 –> 00:50:11,760
with you know work together as a sector
1301
00:50:09,240 –> 00:50:13,280
to protect the people on the planet so
1302
00:50:11,760 –> 00:50:15,240
that they can still have a company in
1303
00:50:13,280 –> 00:50:16,640
another decade right it doesn’t make
1304
00:50:15,240 –> 00:50:20,160
sense that they’re that they’re that
1305
00:50:16,640 –> 00:50:22,160
they’re intervening so much um but I do
1306
00:50:20,160 –> 00:50:24,520
think I do think some of them sincerely
1307
00:50:22,160 –> 00:50:25,680
believe that I mean but but what they’re
1308
00:50:24,520 –> 00:50:27,720
sincerely believing is just
1309
00:50:25,680 –> 00:50:30,240
undereducated under informed and they
1310
00:50:27,720 –> 00:50:31,960
they really ought to get you know bring
1311
00:50:30,240 –> 00:50:33,640
in people that actually understand you
1312
00:50:31,960 –> 00:50:35,720
know political science that actually
1313
00:50:33,640 –> 00:50:37,839
understand uh you know various other
1314
00:50:35,720 –> 00:50:41,839
things you know that product product
1315
00:50:37,839 –> 00:50:43,839
liability and uh you yeah they do but
1316
00:50:41,839 –> 00:50:47,520
you know judging from what happened
1317
00:50:43,839 –> 00:50:51,520
recently with the safety
1318
00:50:47,520 –> 00:50:53,960
board of OpenAI, you know,
1319
00:50:51,520 –> 00:50:56,079
removing them and creating a new
1320
00:50:53,960 –> 00:50:57,319
one — there are so many
1321
00:50:56,079 –> 00:51:00,359
problems. Like I said, there's
1322
00:50:57,319 –> 00:51:01,640
just so much money around and um there’s
1323
00:51:00,359 –> 00:51:04,200
some real questions about what’s going
1324
00:51:01,640 –> 00:51:07,680
on with open AI I mean on the one hand
1325
00:51:04,200 –> 00:51:11,040
you know, safety that is, like, long-
1326
00:51:07,680 –> 00:51:12,640
termism is not really safety um and so
1327
00:51:11,040 –> 00:51:14,240
but on the other hand yeah there was
1328
00:51:12,640 –> 00:51:16,040
this whole hope that they actually did
1329
00:51:14,240 –> 00:51:18,839
have some kind of board independence — you
1330
00:51:16,040 –> 00:51:20,319
want that and so again I it’s it’s just
1331
00:51:18,839 –> 00:51:23,400
like a big soap opera and I don't have
1332
00:51:20,319 –> 00:51:24,960
time for soap operas, you know. If
1333
00:51:23,400 –> 00:51:26,640
somebody wants to pay me I’ll go study
1334
00:51:24,960 –> 00:51:28,000
it but right now I’m more interested in
1335
00:51:26,640 –> 00:51:29,359
some other problems I’m I’m mostly
1336
00:51:28,000 –> 00:51:31,319
working on political polarization
1337
00:51:29,359 –> 00:51:35,760
actually right yeah yeah yeah yeah I can
1338
00:51:31,319 –> 00:51:39,799
see okay and the bias aspect which you
1339
00:51:35,760 –> 00:51:42,680
oh right okay so yeah so so too much
1340
00:51:39,799 –> 00:51:44,240
bias so this is the whole thing it’s not
1341
00:51:42,680 –> 00:51:46,079
only the AI act there’s also something
1342
00:51:44,240 –> 00:51:47,480
called the Digital Services Act and the
1343
00:51:46,079 –> 00:51:49,799
digital markets act but we won’t talk
1344
00:51:47,480 –> 00:51:51,240
about that right now but these are three
1345
00:51:49,799 –> 00:51:52,960
really important pieces of of
1346
00:51:51,240 –> 00:51:54,040
legislation and the general data
1347
00:51:52,960 –> 00:51:55,880
protection regulation which is
1348
00:51:54,040 –> 00:51:57,880
underlying it all — that's where the human
1349
00:51:55,880 –> 00:51:59,799
rights come in, incidentally — all the
1350
00:51:57,880 –> 00:52:02,240
human rights are, like, fundamental to EU
1351
00:51:59,799 –> 00:52:05,119
law, and GDPR is basically the one that
1352
00:52:02,240 –> 00:52:06,640
protects our human rights right okay but
1353
00:52:05,119 –> 00:52:09,119
then one up from that the Digital
1354
00:52:06,640 –> 00:52:11,520
Services Act and the AI act both say we
1355
00:52:09,119 –> 00:52:13,440
need to know whether you built your
1356
00:52:11,520 –> 00:52:16,079
software well right what are you doing
1357
00:52:13,440 –> 00:52:19,559
with it we need enough transparency so
1358
00:52:16,079 –> 00:52:21,040
the thing is about bias is that you know
1359
00:52:19,559 –> 00:52:23,520
we used to have to teach people again
1360
00:52:21,040 –> 00:52:26,280
back to AI 101 or machine learning 101
1361
00:52:23,520 –> 00:52:28,480
in this case bias is just regularities
1362
00:52:26,280 –> 00:52:29,960
right you expect a model to find out
1363
00:52:28,480 –> 00:52:32,559
that there’s certain things that happen
1364
00:52:29,960 –> 00:52:35,440
all the time so if you read the paper I
1365
00:52:32,559 –> 00:52:37,520
wrote on bias in 2017 that
1366
00:52:35,440 –> 00:52:40,079
showed you know it was the big paper
1367
00:52:37,520 –> 00:52:42,200
that showed that um that the implicit
1368
00:52:40,079 –> 00:52:43,720
biases that people have are also
1369
00:52:42,200 –> 00:52:47,240
reflected in the software that’s trained
1370
00:52:43,720 –> 00:52:49,119
on the outputs of those people, right. But it
1371
00:52:47,240 –> 00:52:51,240
also at the same time showed that those
1372
00:52:49,119 –> 00:52:53,559
implicit biases were reflecting lived
1373
00:52:51,240 –> 00:52:55,559
reality so like if you know you have
1374
00:52:53,559 –> 00:52:58,359
this sexist belief that you know nurses
1375
00:52:55,559 –> 00:53:00,119
are female and and programmers are male
1376
00:52:58,359 –> 00:53:01,799
but then if you looked at the US
1377
00:53:00,119 –> 00:53:03,760
Labor Statistics that’s what was
1378
00:53:01,799 –> 00:53:05,359
happening. And it wasn't
1379
00:53:03,760 –> 00:53:08,319
just like you know in general it was
1380
00:53:05,359 –> 00:53:11,559
like it was actually a pretty uh a very
1381
00:53:08,319 –> 00:53:13,400
very strong correlation.
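(A toy illustration of the kind of association test behind that 2017 result — a WEAT-style measure of how strongly a target word leans towards one attribute set versus another. The three-dimensional vectors are invented for illustration; the real test uses word embeddings trained on large text corpora.)

```python
import numpy as np

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def association(word_vec, attrs_a, attrs_b):
    """Mean similarity to attribute set A minus mean similarity to attribute set B."""
    return np.mean([cosine(word_vec, a) for a in attrs_a]) - \
           np.mean([cosine(word_vec, b) for b in attrs_b])

# Hypothetical embeddings for two occupation words and two gendered attribute words.
vec = {
    "nurse":      np.array([0.1, 0.9, 0.2]),
    "programmer": np.array([0.9, 0.1, 0.2]),
    "she":        np.array([0.2, 1.0, 0.1]),
    "he":         np.array([1.0, 0.2, 0.1]),
}
female, male = [vec["she"]], [vec["he"]]
print(association(vec["nurse"], female, male))       # positive: leans "female"
print(association(vec["programmer"], female, male))  # negative: leans "male"
```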
1382
00:53:11,559 –> 00:53:15,240
So in a way our implicit biases — and we
1383
00:53:13,400 –> 00:53:16,720
didn’t know this before like you know in
1384
00:53:15,240 –> 00:53:19,799
fact there was big arguments about
1385
00:53:16,720 –> 00:53:21,480
whether — you know, the
1386
00:53:19,799 –> 00:53:23,160
conservatives were
1387
00:53:21,480 –> 00:53:25,920
saying you guys are trying to change
1388
00:53:23,160 –> 00:53:28,440
reality and the and the um and the
1389
00:53:25,920 –> 00:53:30,280
Liberals were saying oh no you know like
1390
00:53:28,440 –> 00:53:32,280
the the world is actually Fair it was
1391
00:53:30,280 –> 00:53:34,160
created to be fair and and you guys are
1392
00:53:32,280 –> 00:53:35,520
the ones who are just messing it up well
1393
00:53:34,160 –> 00:53:38,079
it turns out the conservatives were
1394
00:53:35,520 –> 00:53:40,520
right — you know, the world isn't fair.
1395
00:53:38,079 –> 00:53:43,160
but I I still think that’s okay that
1396
00:53:40,520 –> 00:53:45,640
we choose to make the
1397
00:53:43,160 –> 00:53:48,160
world more fair than it was before right
1398
00:53:45,640 –> 00:53:50,079
I’m I’m I’m totally behind that but but
1399
00:53:48,160 –> 00:53:53,400
the but the path is different between
1400
00:53:50,079 –> 00:53:56,480
those two so anyway my point is
1401
00:53:53,400 –> 00:53:57,760
that no amount of
1402
00:53:56,480 –> 00:54:00,000
transparency is going to make the world
1403
00:53:57,760 –> 00:54:02,559
more fair than it is right that that
1404
00:54:00,000 –> 00:54:05,599
takes policy decisions and real effort
1405
00:54:02,559 –> 00:54:07,240
you know and and and some things there
1406
00:54:05,599 –> 00:54:09,960
there is no changing the fact that, you
1407
00:54:07,240 –> 00:54:11,960
know the women when they’re pregnant
1408
00:54:09,960 –> 00:54:14,079
that there’s nine months you know like
1409
00:54:11,960 –> 00:54:16,040
that's women and it's not men, right —
1410
00:54:14,079 –> 00:54:18,160
that maybe some biologist will change
1411
00:54:16,040 –> 00:54:20,079
that but basically that’s that’s that’s
1412
00:54:18,160 –> 00:54:22,240
the way so some things are fundamentally
1413
00:54:20,079 –> 00:54:24,480
unfair and it’s not that’s not the only
1414
00:54:22,240 –> 00:54:27,319
thing there’s also um like there’s more
1415
00:54:24,480 –> 00:54:29,280
genetic risks taken with you know like
1416
00:54:27,319 –> 00:54:30,680
like this is not this is not a human
1417
00:54:29,280 –> 00:54:32,240
decision this is true like in you know
1418
00:54:30,680 –> 00:54:34,000
fruit flies and things too right you
1419
00:54:32,240 –> 00:54:36,200
know it’s just that the mixing the
1420
00:54:34,000 –> 00:54:37,760
mixing of the genomes level you know so
1421
00:54:36,200 –> 00:54:39,760
so there’s certain things that are just
1422
00:54:37,760 –> 00:54:41,400
not the same at least I don’t know how
1423
00:54:39,760 –> 00:54:43,920
you want to Define but they’re certainly
1424
00:54:41,400 –> 00:54:45,440
not the same. But what we have chosen —
1425
00:54:43,920 –> 00:54:47,720
what we try to do with our system of
1426
00:54:45,440 –> 00:54:49,200
justice — is to say that, when
1427
00:54:47,720 –> 00:54:51,480
you’re born that every child should have
1428
00:54:49,200 –> 00:54:54,040
at least a certain amount of of of
1429
00:54:51,480 –> 00:54:55,559
chance of doing well no matter who their
1430
00:54:54,040 –> 00:54:57,359
parents were no matter what genes they
1431
00:54:55,559 –> 00:54:59,040
happen to have and again there’s some
1432
00:54:57,359 –> 00:55:00,839
things we can’t overcome you know like
1433
00:54:59,040 –> 00:55:02,160
if you’re if you’re born with no brain
1434
00:55:00,839 –> 00:55:04,400
that happens right and there’s nothing
1435
00:55:02,160 –> 00:55:06,880
we can do about that. To the best of our
1436
00:55:04,400 –> 00:55:08,839
abilities we try to help everyone uh be
1437
00:55:06,880 –> 00:55:11,680
a productive member of society and have
1438
00:55:08,839 –> 00:55:13,520
possibilities and beliefs, right. So
1439
00:55:11,680 –> 00:55:16,640
um that’s not about you know that’s not
1440
00:55:13,520 –> 00:55:19,119
something you can fix by by better data
1441
00:55:16,640 –> 00:55:21,480
right but you but but nevertheless
1442
00:55:19,119 –> 00:55:24,319
having said all that absolutely it is
1443
00:55:21,480 –> 00:55:27,160
possible to build an unfair AI system
1444
00:55:24,319 –> 00:55:28,799
you can deliberately set it up to hurt
1445
00:55:27,160 –> 00:55:30,039
people that are of A different race or a
1446
00:55:28,799 –> 00:55:32,359
different ethnicity or a different
1447
00:55:30,039 –> 00:55:35,000
religion or whatever or gender or
1448
00:55:32,359 –> 00:55:37,319
whatever you know you you can easily go
1449
00:55:35,000 –> 00:55:38,880
into the system and say it’s you’re
1450
00:55:37,319 –> 00:55:40,720
going to be less likely to get into
1451
00:55:38,880 –> 00:55:43,839
college in this context or right
1452
00:55:40,720 –> 00:55:45,400
whatever, or you're not going to get a loan.
1453
00:55:43,839 –> 00:55:48,039
you know in China right now you’re not
1454
00:55:45,400 –> 00:55:49,960
going to be able to fly if you if you
1455
00:55:48,039 –> 00:55:52,280
used to study Uyghurs, let
1456
00:55:49,960 –> 00:55:53,440
alone if you are Uyghur, right, you know. And
1457
00:55:52,280 –> 00:55:55,400
they’re going to say it’s a social
1458
00:55:53,440 –> 00:55:56,920
credit system but it’s actually you know
1459
00:55:55,400 –> 00:55:59,039
it’s not about like bad things you’ve
1460
00:55:56,920 –> 00:56:01,319
done in terms of like smoking on the
1461
00:55:59,039 –> 00:56:03,640
train it’s bad things that you’ve done
1462
00:56:01,319 –> 00:56:06,079
as in pursuing former policy which has
1463
00:56:03,640 –> 00:56:08,440
now been changed and updated right and
1464
00:56:06,079 –> 00:56:10,559
so that is why we need transparency we
1465
00:56:08,440 –> 00:56:13,359
need to be able to see whether there’s
1466
00:56:10,559 –> 00:56:15,280
been extra bias put in and the other
1467
00:56:13,359 –> 00:56:17,039
thing is that when people do figure out
1468
00:56:15,280 –> 00:56:18,559
ways to make the system more fair and
1469
00:56:17,039 –> 00:56:21,200
you’ll never make it perfectly fair but
1470
00:56:18,559 –> 00:56:23,480
we have found ways to make it more fair
1471
00:56:21,200 –> 00:56:24,960
then that becomes best practice and
1472
00:56:23,480 –> 00:56:26,720
particularly large companies have an
1473
00:56:24,960 –> 00:56:28,039
obligation to follow the best practice
1474
00:56:26,720 –> 00:56:30,119
they have to show they’ve done due
1475
00:56:28,039 –> 00:56:32,200
diligence and so that’s what we try to
1476
00:56:30,119 –> 00:56:33,799
do and and so that’s why that’s sitting
1477
00:56:32,200 –> 00:56:35,920
on top of what I was talking about
1478
00:56:33,799 –> 00:56:38,119
before bias is just one thing though
1479
00:56:35,920 –> 00:56:40,319
there’s so many ways you can hurt people
1480
00:56:38,119 –> 00:56:42,039
and you can and if you want to you can
1481
00:56:40,319 –> 00:56:43,839
pretend that it was you know the AI
1482
00:56:42,039 –> 00:56:46,839
signed off on it or something and it’s
1483
00:56:43,839 –> 00:56:49,000
just rubbish you know and and it’s a way
1484
00:56:46,839 –> 00:56:50,799
to get out of trying to get out of
1485
00:56:49,000 –> 00:56:53,880
responsibility but we’re we’re hopefully
1486
00:56:50,799 –> 00:56:55,960
educating enough judges that
1487
00:56:53,880 –> 00:56:57,079
that decision doesn't wash, that excuse
1488
00:56:55,960 –> 00:57:01,160
does not
1489
00:56:57,079 –> 00:57:04,960
wash. How can we incentivize and support
1490
00:57:01,160 –> 00:57:06,240
the AI-for-good efforts? Well, I mean,
1491
00:57:04,960 –> 00:57:08,400
there’s people that put tons of money
1492
00:57:06,240 –> 00:57:09,559
into AI for good and again some of those
1493
00:57:08,400 –> 00:57:11,280
people are the people who don’t want to
1494
00:57:09,559 –> 00:57:13,799
be regulated so I’m a little worried
1495
00:57:11,280 –> 00:57:15,960
about such things I I think that what we
1496
00:57:13,799 –> 00:57:19,680
have done with the European regulation
1497
00:57:15,960 –> 00:57:22,440
is fantastic and it's just making it, to
1498
00:57:19,680 –> 00:57:24,960
the best of our abilities,
1499
00:57:22,440 –> 00:57:27,520
a fair market where
1500
00:57:24,960 –> 00:57:28,960
people can pursue uh complaints you know
1501
00:57:27,520 –> 00:57:30,960
where we can check and see whether or
1502
00:57:28,960 –> 00:57:32,240
not people were doing the right thing
1503
00:57:30,960 –> 00:57:34,839
that’s going to solve a lot of the
1504
00:57:32,240 –> 00:57:36,280
problems there are other things that are
1505
00:57:34,839 –> 00:57:37,839
that — I said I wasn't going
1506
00:57:36,280 –> 00:57:40,440
to mention the Digital Markets Act side, but there
1507
00:57:37,839 –> 00:57:42,480
are other problems which are that some
1508
00:57:40,440 –> 00:57:44,520
things are just by Nature very large and
1509
00:57:42,480 –> 00:57:45,960
then the market can’t help take care of
1510
00:57:44,520 –> 00:57:48,319
it and so that means even more
1511
00:57:45,960 –> 00:57:50,160
governance but again what does that mean
1512
00:57:48,319 –> 00:57:51,640
it’s like so like electricity bills and
1513
00:57:50,160 –> 00:57:53,720
things like that you get citizens to
1514
00:57:51,640 –> 00:57:55,640
work together and they decide what would
1515
00:57:53,720 –> 00:57:56,920
be fair or what you know is practicable
1516
00:57:55,640 –> 00:57:58,400
and things and then they work with the
1517
00:57:56,920 –> 00:57:59,920
government and they enforce that on the
1518
00:57:58,400 –> 00:58:02,680
people who provide electricity or
1519
00:57:59,920 –> 00:58:07,280
provide trains or whatever provide post
1520
00:58:02,680 –> 00:58:10,640
offices um so so I think there will be
1521
00:58:07,280 –> 00:58:13,400
some it’s not AI itself but you know AI
1522
00:58:10,640 –> 00:58:14,960
is just a suite of software uh uh
1523
00:58:13,400 –> 00:58:16,520
techniques which can be applied to all
1524
00:58:14,960 –> 00:58:18,480
kinds of things but stuff like search
1525
00:58:16,520 –> 00:58:21,000
stuff like Social Media stuff like
1526
00:58:18,480 –> 00:58:23,200
generative AI that might very well turn
1527
00:58:21,000 –> 00:58:25,680
out to be more like electricity and
1528
00:58:23,200 –> 00:58:27,160
plumbing and less like you know I don’t
1529
00:58:25,680 –> 00:58:32,720
know, cars.
1530
00:58:27,160 –> 00:58:34,920
so so in that case then um then then we
1531
00:58:32,720 –> 00:58:37,520
will need to have even more regulation
1532
00:58:34,920 –> 00:58:39,200
but still the basic principles of real
1533
00:58:37,520 –> 00:58:40,839
transparency as to how it's made, and making
1534
00:58:39,200 –> 00:58:42,200
sure people are following good practice
1535
00:58:40,839 –> 00:58:45,400
and they’re doing things for the right
1536
00:58:42,200 –> 00:58:47,640
reasons uh that that’s the key how do
1537
00:58:45,400 –> 00:58:51,319
you see this like where do you see it
1538
00:58:47,640 –> 00:58:54,359
going in terms of human and AI um
1539
00:58:51,319 –> 00:58:56,520
collaboration in terms of you know maybe
1540
00:58:54,359 –> 00:58:59,000
arriving at
1541
00:58:56,520 –> 00:59:00,720
AGI? So first of all, my most recent
1542
00:58:59,000 –> 00:59:03,119
philosophy paper in fact the only one
1543
00:59:00,720 –> 00:59:05,000
I’ve written with actual philosophers is
1544
00:59:03,119 –> 00:59:07,079
called 'Do We Collaborate With What We
1545
00:59:05,000 –> 00:59:09,280
Design?', and the answer is no — that's the
1546
00:59:07,079 –> 00:59:10,559
wrong metaphor the only way that we can
1547
00:59:09,280 –> 00:59:12,680
do the kinds of improvement we’re
1548
00:59:10,559 –> 00:59:14,880
talking about is to hold the people
1549
00:59:12,680 –> 00:59:17,240
responsible for the systems they develop
1550
00:59:14,880 –> 00:59:18,799
and they own and that they operate they
1551
00:59:17,240 –> 00:59:21,119
are the ones we have to hold to account
1552
00:59:18,799 –> 00:59:23,400
there is no way to hold a designed system
1553
00:59:21,119 –> 00:59:26,480
to account it doesn’t make sense so it
1554
00:59:23,400 –> 00:59:29,000
is not helpful to think about um about
1555
00:59:26,480 –> 00:59:30,119
AI itself as a peer or whatever — it's
1556
00:59:29,000 –> 00:59:32,799
it’s like what we were talking about
1557
00:59:30,119 –> 00:59:35,319
before with Replika — like, you're just
1558
00:59:32,799 –> 00:59:36,960
you you’re pretending that something is
1559
00:59:35,319 –> 00:59:39,440
held to account but actually you’re just
1560
00:59:36,960 –> 00:59:42,200
letting the people who misused some
1561
00:59:39,440 –> 00:59:44,920
technology off the hook all right so so
1562
00:59:42,200 –> 00:59:47,359
collaboration's out the window. Now, okay —
1563
00:59:44,920 –> 00:59:49,760
AGI it depends what you define how you
1564
00:59:47,359 –> 00:59:51,359
mean it um when I first heard the term
1565
00:59:49,760 –> 00:59:53,920
it was people that were saying oh you
1566
00:59:51,359 –> 00:59:55,640
know nobody working in AI is actually
1567
00:59:53,920 –> 00:59:56,839
trying to solve all the problems they
1568
00:59:55,640 –> 00:59:58,359
you know they’re all just trying to do
1569
00:59:56,839 –> 01:00:00,000
Machine Vision or something and that’s
1570
00:59:58,359 –> 01:00:01,559
boring and we want to get back to more
1571
01:00:00,000 –> 01:00:03,440
human you know really understand human
1572
01:00:01,559 –> 01:00:05,240
intelligence and you know that was
1573
01:00:03,440 –> 01:00:06,880
completely false there were very good
1574
01:00:05,240 –> 01:00:08,799
reasons a lot of people were trying to
1575
01:00:06,880 –> 01:00:10,720
solve the whole problem of intelligence
1576
01:00:08,799 –> 01:00:13,640
and and and it was it was just this
1577
01:00:10,720 –> 01:00:15,799
false narrative um and then people
1578
01:00:13,640 –> 01:00:18,119
started talking about oh AGI there’s
1579
01:00:15,799 –> 01:00:19,799
like this um there’s going to be this
1580
01:00:18,119 –> 01:00:22,119
intelligence explosion when you have
1581
01:00:19,799 –> 01:00:23,039
systems that can learn how to learn and
1582
01:00:22,119 –> 01:00:25,119
then there’s going to be this
1583
01:00:23,039 –> 01:00:29,480
exponential and exponential growth and
1584
01:00:25,119 –> 01:00:32,799
we will be mere cockroaches after that well I think
1585
01:00:29,480 –> 01:00:34,000
that is at least logical um but again it
1586
01:00:32,799 –> 01:00:36,559
doesn’t make sense to talk about the
1587
01:00:34,000 –> 01:00:39,480
system itself and if you actually look
1588
01:00:36,559 –> 01:00:40,760
at the number of people on the planet
1589
01:00:39,480 –> 01:00:42,960
that’s what’s happened since we’ve had
1590
01:00:40,760 –> 01:00:45,079
writing so since we’ve been able to
1591
01:00:42,960 –> 01:00:47,680
improve our own intelligence through our
1592
01:00:45,079 –> 01:00:49,079
technology um we have been that
1593
01:00:47,680 –> 01:00:52,280
explosion and that’s why we have a
1594
01:00:49,079 –> 01:00:54,240
climate crisis right now right so so if
1595
01:00:52,280 –> 01:00:56,520
that’s what you mean by AGI then you’re
1596
01:00:54,240 –> 01:00:58,280
you know it it happened 10,000 years ago
1597
01:00:56,520 –> 01:01:00,720
and we need to start dealing with it a
1598
01:00:58,280 –> 01:01:03,119
little better than we have been um and
1599
01:01:00,720 –> 01:01:05,480
then finally uh there’s like this thing
1600
01:01:03,119 –> 01:01:06,720
about well you know AI is going to be
1601
01:01:05,480 –> 01:01:09,039
controlling everything and we’re just
1602
01:01:06,720 –> 01:01:11,640
getting these magic machines that
1603
01:01:09,039 –> 01:01:13,280
that take over the world and um then
1604
01:01:11,640 –> 01:01:15,200
what do we do to you know ensure that
1605
01:01:13,280 –> 01:01:17,559
there’s employment or whatever and these
1606
01:01:15,200 –> 01:01:19,440
guys basically wind up you know they
1607
01:01:17,559 –> 01:01:20,799
they’ve stuck in this extra piece which
1608
01:01:19,440 –> 01:01:22,760
as I said doesn’t make sense it’s just
1609
01:01:20,799 –> 01:01:25,359
an extension an extension of human
1610
01:01:22,760 –> 01:01:26,880
agency but they’ve hypothesized this
1611
01:01:25,359 –> 01:01:28,799
extra piece and then they start trying
1612
01:01:26,880 –> 01:01:30,920
to solve as I said the basic problems of
1613
01:01:28,799 –> 01:01:32,920
political science you know and economics
1614
01:01:30,920 –> 01:01:34,520
and whatever and and they just they just
1615
01:01:32,920 –> 01:01:36,640
come back to exactly where we already
1616
01:01:34,520 –> 01:01:38,920
are and the best thing those guys can do
1617
01:01:36,640 –> 01:01:40,799
is take their enormous brains and and
1618
01:01:38,920 –> 01:01:42,319
and get another degree and help us work
1619
01:01:40,799 –> 01:01:44,640
on those problems because there are
1620
01:01:42,319 –> 01:01:46,880
serious problems governance is hard the
1621
01:01:44,640 –> 01:01:49,280
economy has to change security has to
1622
01:01:46,880 –> 01:01:51,079
change because we’re at a situation
1623
01:01:49,280 –> 01:01:53,480
where we can no longer have exponential
1624
01:01:51,079 –> 01:01:55,319
growth of the number
1625
01:01:53,480 –> 01:01:58,200
of people so we need to think of a
1626
01:01:55,319 –> 01:02:00,359
better way to stabilize our
1627
01:01:58,200 –> 01:02:03,039
biosphere um and we need to do it in a
1628
01:02:00,359 –> 01:02:05,000
way that presumably causes as
1629
01:02:03,039 –> 01:02:08,400
little suffering as possible as well as
1630
01:02:05,000 –> 01:02:12,359
as few further
1631
01:02:08,400 –> 01:02:13,480
extinctions so um yeah that that’s why I
1632
01:02:12,359 –> 01:02:18,319
think about
1633
01:02:13,480 –> 01:02:22,039
AI okay just to close it how so do you
1634
01:02:18,319 –> 01:02:26,279
see um more benefits or
1635
01:02:22,039 –> 01:02:31,200
more like positives uh helping us to
1636
01:02:26,279 –> 01:02:34,160
combat such um existential crises
1637
01:02:31,200 –> 01:02:36,559
oh yeah absolutely I don’t I mean like
1638
01:02:34,160 –> 01:02:38,319
it’s almost it’s Inseparable from who it
1639
01:02:36,559 –> 01:02:41,160
are and what it is to be human that we
1640
01:02:38,319 –> 01:02:43,119
have we are these apes that have
1641
01:02:41,160 –> 01:02:46,319
wielded technology to be in this
1642
01:02:43,119 –> 01:02:48,039
situation but what I see now although um
1643
01:02:46,319 –> 01:02:49,720
we’re using a lot of this capacity to
1644
01:02:48,039 –> 01:02:52,640
kind of slow down change especially the
1645
01:02:49,720 –> 01:02:55,440
powerful when we really have to change
1646
01:02:52,640 –> 01:02:57,160
um we can do it very quickly and so we
1647
01:02:55,440 –> 01:02:59,520
have seen some really you know
1648
01:02:57,160 –> 01:03:01,920
incredible you know if you look at
1649
01:02:59,520 –> 01:03:04,400
compare COVID to the Spanish flu compare you
1650
01:03:01,920 –> 01:03:08,319
know 2008 to to
1651
01:03:04,400 –> 01:03:11,400
1929 um you know if you look at Germany
1652
01:03:08,319 –> 01:03:14,400
we changed our entire source of
1653
01:03:11,400 –> 01:03:17,160
fuel from basically Russian oil and gas
1654
01:03:14,400 –> 01:03:19,520
to liquefied natural gas in six months six
1655
01:03:17,160 –> 01:03:21,880
months you know every business then I
1656
01:03:19,520 –> 01:03:25,359
told you this like most of the industry
1657
01:03:21,880 –> 01:03:27,240
um we reduced uh demand by 20% and
1658
01:03:25,359 –> 01:03:28,960
that’s after Decades of being total you
1659
01:03:27,240 –> 01:03:31,760
know save the save the polar bears and
1660
01:03:28,960 –> 01:03:34,760
everything you know so we are
1661
01:03:31,760 –> 01:03:36,520
capable of great change we are already
1662
01:03:34,760 –> 01:03:37,960
too slow at it there are already people
1663
01:03:36,520 –> 01:03:40,559
that are drowning and dying of heat
1664
01:03:37,960 –> 01:03:41,920
strokes and whatever um but but we
1665
01:03:40,559 –> 01:03:43,559
aren’t going to be so slow that there’s
1666
01:03:41,920 –> 01:03:46,559
really going to be the end of humanity I
1667
01:03:43,559 –> 01:03:48,119
don’t think um it it seems very unlikely
1668
01:03:46,559 –> 01:03:49,680
scientists always say that don't say oh she
1669
01:03:48,119 –> 01:03:51,520
wasn’t sure no no no scientists always
1670
01:03:49,680 –> 01:03:55,559
say that so very very very unlikely that
1671
01:03:51,520 –> 01:03:56,960
we would do that it you know the the um
1672
01:03:55,559 –> 01:03:58,720
but we could kill a lot of people if we
1673
01:03:56,960 –> 01:04:01,480
have a nuclear war a lot of people would
1674
01:03:58,720 –> 01:04:03,279
die um and and and if we don’t figure
1675
01:04:01,480 –> 01:04:05,079
out how to help the people in India get
1676
01:04:03,279 –> 01:04:07,160
to parts of the world that are a little
1677
01:04:05,079 –> 01:04:08,640
less hot than where they are right now
1678
01:04:07,160 –> 01:04:10,279
they’re going to die right so there’s
1679
01:04:08,640 –> 01:04:12,799
like a lot of there’s a lot of
1680
01:04:10,279 –> 01:04:15,640
situations I’m sorry Pakistan
1681
01:04:12,799 –> 01:04:17,400
too so uh you know there’s a lot of
1682
01:04:15,640 –> 01:04:19,000
situations right now we have to figure
1683
01:04:17,400 –> 01:04:20,880
out how to stop uh people from
1684
01:04:19,000 –> 01:04:24,039
flattening cities but it’s not clear to
1685
01:04:20,880 –> 01:04:26,240
how to do that um either you know it’s
1686
01:04:24,039 –> 01:04:27,839
not it’s not easy but there are there’s
1687
01:04:26,240 –> 01:04:29,119
only four or five countries that would
1688
01:04:27,839 –> 01:04:31,839
that are doing that right now but we
1689
01:04:29,119 –> 01:04:33,359
need to stop those guys too so I think
1690
01:04:31,839 –> 01:04:35,400
that we have a lot of real immediate
1691
01:04:33,359 –> 01:04:38,000
problems but we also are coming up with
1692
01:04:35,400 –> 01:04:39,760
solutions at an incredible rate and I
1693
01:04:38,000 –> 01:04:42,520
think this is
1694
01:04:39,760 –> 01:04:44,760
life it’s constant Innovation and and
1695
01:04:42,520 –> 01:04:46,920
constant attempt to basically keep
1696
01:04:44,760 –> 01:04:49,160
projecting yourself into the future
1697
01:04:46,920 –> 01:04:51,599
Joanna it’s it’s been pleasure thank you
1698
01:04:49,160 –> 01:04:54,079
thank you so much for for sharing all
1699
01:04:51,599 –> 01:04:56,559
this uh I will have to rewatch it
1700
01:04:54,079 –> 01:04:59,240
because so many aspects you’ve um
1701
01:04:56,559 –> 01:05:02,359
mentioned I haven’t uh you know heard or
1702
01:04:59,240 –> 01:05:04,680
read about um but they are very very
1703
01:05:02,359 –> 01:05:07,279
crucial to understanding
1704
01:05:04,680 –> 01:05:10,599
the threats and
1705
01:05:07,279 –> 01:05:12,160
opportunities of what we are creating
1706
01:05:10,599 –> 01:05:14,799
well you’re very welcome it was nice to
1707
01:05:12,160 –> 01:05:17,559
meet you nice to meet you J thank you
1708
01:05:14,799 –> 01:05:17,559
bye bye