“Everyone loves change, if it doesn’t affect them.”
– This insightful quote from Vikram Somaya, former Chief Data & Analytics Officer at PepsiCo, sets the stage for the conversation now live on the Are You Human podcast.
AI is transforming our world at an incredible pace, reshaping everything from consumer behaviour to business operations. That’s why I believe Vikram’s hard-won wisdom on driving transformation through data analytics has never been more relevant.
Having led data strategies across media giants like ESPN and Nielsen, as well as an iconic CPG brand like PepsiCo, Vikram offers a rare cross-industry perspective that I’m so grateful he shares with my dear audience – you.
More importantly, Vikram doesn’t hold back on the existential challenges AI poses for businesses and society. We candidly discuss the risks of bias in automated decision-making and the ethical guardrails companies must implement.
Perhaps most eye-opening are his views on AI’s potential to disrupt consumer preferences and brand loyalty. Don’t miss out.
Transcript:
1
00:00:00,240 –> 00:00:03,879
how do we take advantage of the amazing
2
00:00:02,040 –> 00:00:06,560
things that are happening in in
3
00:00:03,879 –> 00:00:07,759
Enterprise data technology in all the
4
00:00:06,560 –> 00:00:10,639
amazing things that we’re now hearing
5
00:00:07,759 –> 00:00:14,840
about with AI to actually make business
6
00:00:10,639 –> 00:00:14,840
better to make people have better
7
00:00:16,640 –> 00:00:20,439
experiences and sometimes that thread
8
00:00:18,760 –> 00:00:22,199
gets lost a little bit we get so caught
9
00:00:20,439 –> 00:00:24,519
up in the technology we forget why are
10
00:00:22,199 –> 00:00:26,119
we actually doing this but sometimes
11
00:00:24,519 –> 00:00:29,599
they’re they’re afraid to say I don’t
12
00:00:26,119 –> 00:00:31,080
know right and and in some cases I
13
00:00:29,599 –> 00:00:33,280
don’t blame them it’s hard right you
14
00:00:31,080 –> 00:00:34,719
have to get up and say look there’s this
15
00:00:33,280 –> 00:00:36,920
completely new way of doing something
16
00:00:34,719 –> 00:00:39,719
that I’ve been doing for 10 20 in some
17
00:00:36,920 –> 00:00:41,879
cases at PepsiCo Disney 40 years right
18
00:00:39,719 –> 00:00:43,920
and it’s different and everything I
19
00:00:41,879 –> 00:00:45,440
thought I knew about how to do my job
20
00:00:43,920 –> 00:00:48,000
and how many people it would require and
21
00:00:45,440 –> 00:00:51,120
how much investment I have to put in is
22
00:00:48,000 –> 00:00:53,879
changing and everyone loves change if it
23
00:00:51,120 –> 00:00:55,440
doesn’t affect them right right I mean
24
00:00:53,879 –> 00:00:58,160
and that’s that that’s the nature of
25
00:00:55,440 –> 00:01:02,239
that process so it’s it can be difficult
26
00:00:58,160 –> 00:01:04,600
and it can be hard and it can be um uh
27
00:01:02,239 –> 00:01:06,439
it can be humbling right and I think if
28
00:01:04,600 –> 00:01:08,240
you go in there with the attitude that
29
00:01:06,439 –> 00:01:10,560
this is about learning this is about
30
00:01:08,240 –> 00:01:12,560
experience and we have to be part of how
31
00:01:10,560 –> 00:01:13,880
these things change and that means your
32
00:01:12,560 –> 00:01:15,320
path might eventually lead somewhere
33
00:01:13,880 –> 00:01:18,200
else and that’s that’s what happened
34
00:01:15,320 –> 00:01:20,240
with me and others um we are part of a
35
00:01:18,200 –> 00:01:21,920
story we are part of a chapter in a book
36
00:01:20,240 –> 00:01:23,520
right and that book you have one that is
37
00:01:21,920 –> 00:01:25,280
your own life your own book Your Own
38
00:01:23,520 –> 00:01:27,040
Story your own experience and the other
39
00:01:25,280 –> 00:01:28,200
is the life of these companies and
40
00:01:27,040 –> 00:01:30,079
sometimes you’ll be part of them
41
00:01:28,200 –> 00:01:31,880
sometimes you won’t but I think if you
42
00:01:30,079 –> 00:01:34,840
can find again going back to the things
43
00:01:31,880 –> 00:01:37,759
that matter find culture find Innovation
44
00:01:34,840 –> 00:01:41,079
find curiosity and use those as your North
45
00:01:37,759 –> 00:01:43,079
Stars for change um and and be confident
46
00:01:41,079 –> 00:01:44,719
in the fact that learning is the most
47
00:01:43,079 –> 00:01:46,640
powerful tool you can possess I think
48
00:01:44,719 –> 00:01:48,360
those are all powerful qualities and and
49
00:01:46,640 –> 00:01:49,920
at well-run companies like the ones I
50
00:01:48,360 –> 00:01:52,399
worked for I think you know we’ve seen
51
00:01:49,920 –> 00:01:52,399
Success With
52
00:01:52,880 –> 00:01:56,159
It are you
53
00:01:54,820 –> 00:02:01,600
[Music]
54
00:01:56,159 –> 00:02:05,159
human hi Vikram and our friend Sol Rashidi
55
00:02:01,600 –> 00:02:08,399
told me so so many great things about
56
00:02:05,159 –> 00:02:10,840
you I’m really really curious to hear
57
00:02:08,399 –> 00:02:14,800
all the things you’ve been um doing
58
00:02:10,840 –> 00:02:16,760
since you left PepsiCo right yeah thanks
59
00:02:14,800 –> 00:02:18,480
Cam I first of all thank you for having
60
00:02:16,760 –> 00:02:20,440
me it’s always it’s always nice to talk
61
00:02:18,480 –> 00:02:23,440
to people who have something interesting
62
00:02:20,440 –> 00:02:26,840
to say and interesting questions to ask
63
00:02:23,440 –> 00:02:29,319
yeah I um uh I left PepsiCo it was in late
64
00:02:26,840 –> 00:02:30,720
January and since then you know there
65
00:02:29,319 –> 00:02:32,400
are very few times in life when you
66
00:02:30,720 –> 00:02:34,120
really get an opportunity to explore
67
00:02:32,400 –> 00:02:36,239
what’s going on outside the world that
68
00:02:34,120 –> 00:02:38,319
you’re very used to um and you know
69
00:02:36,239 –> 00:02:39,599
large corporate companies you know they
70
00:02:38,319 –> 00:02:41,040
that becomes your Universe to some
71
00:02:39,599 –> 00:02:43,280
degree especially when it is as large as
72
00:02:41,040 –> 00:02:45,560
PepsiCo you know 185 markets we had a lot
73
00:02:43,280 –> 00:02:48,000
of work to do I was running a team of
74
00:02:45,560 –> 00:02:50,680
about 1100 people all in it’s a that’s a
75
00:02:48,000 –> 00:02:52,280
full-time very consuming job um and so
76
00:02:50,680 –> 00:02:54,400
I’ve had the opportunity now to spend a
77
00:02:52,280 –> 00:02:57,000
lot of time with everyone from Young
78
00:02:54,400 –> 00:02:58,599
startups who are working in all forms of
79
00:02:57,000 –> 00:03:01,120
AI and what’s happening in analytics in
80
00:02:58,599 –> 00:03:03,000
the world today and and then also medium
81
00:03:01,120 –> 00:03:04,280
to large corporations so companies that
82
00:03:03,000 –> 00:03:07,200
are really thinking through some of the
83
00:03:04,280 –> 00:03:09,360
same problems that PepsiCo was or ESPN
84
00:03:07,200 –> 00:03:11,680
was or or Nielsen was some of the
85
00:03:09,360 –> 00:03:13,840
companies I’ve worked at in the past um
86
00:03:11,680 –> 00:03:15,560
and that freedom has been very enjoyable
87
00:03:13,840 –> 00:03:18,760
I’ve uh I’ve really really had a good
88
00:03:15,560 –> 00:03:22,040
time was it deliberate that you chose
89
00:03:18,760 –> 00:03:24,760
those um Industries media and then uh
90
00:03:22,040 –> 00:03:26,799
retail or you just yeah it’s a good
91
00:03:24,760 –> 00:03:29,159
question I I don’t think anything in my
92
00:03:26,799 –> 00:03:31,640
life has been wildly deliberate I uh you
93
00:03:29,159 –> 00:03:33,280
know I’ve been this way my whole life um it is
94
00:03:31,640 –> 00:03:35,400
part of how I function and you know as
95
00:03:33,280 –> 00:03:38,239
with all things neuro there are pros and
96
00:03:35,400 –> 00:03:40,560
cons to how my brain works I like a lot
97
00:03:38,239 –> 00:03:43,280
of stimulus I really enjoy people I need
98
00:03:40,560 –> 00:03:45,680
a lot of data to process things um but
99
00:03:43,280 –> 00:03:47,519
then when I’m ready to I move on and
100
00:03:45,680 –> 00:03:49,560
that has informed my career in a lot of
101
00:03:47,519 –> 00:03:51,640
ways when I came out of college and I
102
00:03:49,560 –> 00:03:53,920
you know I originally went to to Yale
103
00:03:51,640 –> 00:03:55,280
for molecular biology and biochemistry
104
00:03:53,920 –> 00:03:57,599
and graduated with a degree in the
105
00:03:55,280 –> 00:03:59,159
history of architecture that’s not so
106
00:03:57,599 –> 00:04:00,519
wildly different because my mother and
107
00:03:59,159 –> 00:04:02,439
my sister are both Architects and it’s
108
00:04:00,519 –> 00:04:04,480
something I grew up with but when I
109
00:04:02,439 –> 00:04:06,360
graduated I went to work initially for a
110
00:04:04,480 –> 00:04:08,319
creative advertising agency called Cliff
111
00:04:06,360 –> 00:04:10,680
Freeman and Partners and you know the
112
00:04:08,319 –> 00:04:13,680
notion of what Cliff did because he
113
00:04:10,680 –> 00:04:16,239
focused on using people to create
114
00:04:13,680 –> 00:04:17,840
emotional messaging and that story then
115
00:04:16,239 –> 00:04:19,759
stayed with me in a variety of different
116
00:04:17,840 –> 00:04:22,440
ways over time because the next phase I
117
00:04:19,759 –> 00:04:24,880
went to was into the startup world and I
118
00:04:22,440 –> 00:04:26,600
spent uh you know something like 12 years at
119
00:04:24,880 –> 00:04:28,520
four different startups all in the early
120
00:04:26,600 –> 00:04:30,199
consumer data space you know the early
121
00:04:28,520 –> 00:04:32,080
days of web Analytics companies like
122
00:04:30,199 –> 00:04:34,919
Omniture that originated really some of
123
00:04:32,080 –> 00:04:36,680
how analytics was used uh using sort of
124
00:04:34,919 –> 00:04:38,320
internet technology and then companies
125
00:04:36,680 –> 00:04:40,720
like BlueKai which became the heart of
126
00:04:38,320 –> 00:04:42,880
the Oracle data cloud and was one of the
127
00:04:40,720 –> 00:04:44,919
first creators of the DMP the data
128
00:04:42,880 –> 00:04:46,199
management platforms used by large
129
00:04:44,919 –> 00:04:48,520
corporations that have now sort of
130
00:04:46,199 –> 00:04:51,120
morphed into consumer data platforms and
131
00:04:48,520 –> 00:04:52,759
various other ways to understand people
132
00:04:51,120 –> 00:04:54,639
but I think the thread that linked all
133
00:04:52,759 –> 00:04:56,960
those things was I was interested in how
134
00:04:54,639 –> 00:04:59,320
people behaved and I was interested in
135
00:04:56,960 –> 00:05:00,919
people period and that then translated
136
00:04:59,320 –> 00:05:04,039
into the next half of my career which
137
00:05:00,919 –> 00:05:05,600
became media so I spent time at Reuters
138
00:05:04,039 –> 00:05:07,199
I spent time at The Weather Channel
139
00:05:05,600 –> 00:05:10,120
which is going through an amazing
140
00:05:07,199 –> 00:05:11,600
transformation um and then finally uh uh
141
00:05:10,120 –> 00:05:15,160
originated the role of global data
142
00:05:11,600 –> 00:05:17,120
officer at ESPN and Disney media and
143
00:05:15,160 –> 00:05:19,240
then uh finally spent some time at
144
00:05:17,120 –> 00:05:22,000
Nielsen before I spent the last just
145
00:05:19,240 –> 00:05:23,720
over four years at PepsiCo as its first
146
00:05:22,000 –> 00:05:27,680
Chief data analytics
147
00:05:23,720 –> 00:05:29,880
officer yeah because I I can I I I
148
00:05:27,680 –> 00:05:32,039
understand that this type of role and
149
00:05:29,880 –> 00:05:35,560
together with Chief digital officers
150
00:05:32,039 –> 00:05:36,600
it’s quite new it is yeah was it was it
151
00:05:35,560 –> 00:05:40,240
when you
152
00:05:36,600 –> 00:05:42,479
joined was it part of your job to Define
153
00:05:40,240 –> 00:05:44,560
what you are going to do
154
00:05:42,479 –> 00:05:46,840
absolutely um and I you know I know you
155
00:05:44,560 –> 00:05:48,319
know Sol Rashidi introduced me and she and I
156
00:05:46,840 –> 00:05:50,639
share a lot of common thoughts around
157
00:05:48,319 –> 00:05:52,919
sort of the changing nature of this job
158
00:05:50,639 –> 00:05:54,960
and I don’t think it’s just the CDAO job
159
00:05:52,919 –> 00:05:58,400
that’s changing I think it’s all the
160
00:05:54,960 –> 00:06:00,039
jobs around technology and consumer um
161
00:05:58,400 –> 00:06:02,479
that are sort of some of them have been
162
00:06:00,039 –> 00:06:05,319
Legacy jobs for a long time the CTO the
163
00:06:02,479 –> 00:06:07,400
CIO the CMO and some of them are much
164
00:06:05,319 –> 00:06:09,240
newer the chief analytics officer the
165
00:06:07,400 –> 00:06:10,960
chief data officer in my case Chief data
166
00:06:09,240 –> 00:06:12,960
analytics officer you know the
167
00:06:10,960 –> 00:06:15,199
origination now of what they’re calling
168
00:06:12,960 –> 00:06:16,720
Chief AI officer which is is nonsensical
169
00:06:15,199 –> 00:06:18,759
because AI has been around for a long
170
00:06:16,720 –> 00:06:20,400
time and has been very much part and the
171
00:06:18,759 –> 00:06:22,080
Heart In some cases of what analytics
172
00:06:20,400 –> 00:06:23,960
has been for a lot of these companies
173
00:06:22,080 –> 00:06:25,960
those jobs are all beginning to merge
174
00:06:23,960 –> 00:06:29,080
right and depending on which company you
175
00:06:25,960 –> 00:06:31,000
look at what stage of their data and
176
00:06:29,080 –> 00:06:33,240
Analytics Journey they’re on what size
177
00:06:31,000 –> 00:06:35,400
they are what companies they operate in
178
00:06:33,240 –> 00:06:37,240
those roles can be widely different
179
00:06:35,400 –> 00:06:39,720
right depending on how they have chosen
180
00:06:37,240 –> 00:06:42,639
to operate and who they have chosen in
181
00:06:39,720 –> 00:06:44,280
many cases to lead those functions so if
182
00:06:42,639 –> 00:06:46,120
we go back to sort of the heart of the
183
00:06:44,280 –> 00:06:48,160
chief data analytics officer role it
184
00:06:46,120 –> 00:06:50,280
really has been how do we use data and
185
00:06:48,160 –> 00:06:52,120
information better at the company how do
186
00:06:50,280 –> 00:06:54,479
we take advantage of the amazing things
187
00:06:52,120 –> 00:06:56,879
that are happening in in Enterprise data
188
00:06:54,479 –> 00:06:58,960
technology in all the amazing things
189
00:06:56,879 –> 00:07:01,800
that we’re now hearing about with AI to
190
00:06:58,960 –> 00:07:04,120
actually make business better to make
191
00:07:01,800 –> 00:07:05,720
people have better experiences and
192
00:07:04,120 –> 00:07:06,919
sometimes that thread gets lost a little
193
00:07:05,720 –> 00:07:09,000
bit we get so caught up in the
194
00:07:06,919 –> 00:07:11,160
technology we forget why we are actually
195
00:07:09,000 –> 00:07:12,639
doing this and I think the human
196
00:07:11,160 –> 00:07:14,960
connection is what links those two
197
00:07:12,639 –> 00:07:16,759
things together both the employees the
198
00:07:14,960 –> 00:07:18,160
consumers and guests and associates they
199
00:07:16,759 –> 00:07:19,520
serve I love all the words that people
200
00:07:18,160 –> 00:07:21,800
use to describe the people that they
201
00:07:19,520 –> 00:07:23,800
Service uh and then finally obviously
202
00:07:21,800 –> 00:07:25,919
what we as Leaders find interesting and
203
00:07:23,800 –> 00:07:29,960
useful and how we create culture around
204
00:07:25,919 –> 00:07:33,720
it h and you’ve been going through
205
00:07:29,960 –> 00:07:37,240
through like you you were in it when
206
00:07:33,720 –> 00:07:40,639
the pandemic happened and the whole consumer
207
00:07:37,240 –> 00:07:43,520
Behavior shifted right like I guess now
208
00:07:40,639 –> 00:07:45,720
it’s in some aspects it’s coming back to
209
00:07:43,520 –> 00:07:49,120
what it was and in some aspects it
210
00:07:45,720 –> 00:07:50,879
stayed but how how did you see it from
211
00:07:49,120 –> 00:07:52,919
from the inside you know I had a
212
00:07:50,879 –> 00:07:56,520
fascinating experience so I joined
213
00:07:52,919 –> 00:07:59,680
PepsiCo in I think it was fall of
214
00:07:56,520 –> 00:08:01,479
2019 um and I had a wonderful boss uh
215
00:07:59,680 –> 00:08:03,000
Hugh Johnston who was the CFO of the
216
00:08:01,479 –> 00:08:05,240
company at the time he’s then
217
00:08:03,000 –> 00:08:08,400
subsequently moved on to Disney and
218
00:08:05,240 –> 00:08:10,199
oddly and and what Hugh gave me time to
219
00:08:08,400 –> 00:08:12,520
do was to go and listen to the
220
00:08:10,199 –> 00:08:14,440
company um and there were obviously some
221
00:08:12,520 –> 00:08:16,120
specific needs that were required but it
222
00:08:14,440 –> 00:08:18,599
was really about understanding what the
223
00:08:16,120 –> 00:08:20,680
company required and so I had the time I
224
00:08:18,599 –> 00:08:23,599
spent time between then and oddly March
225
00:08:20,680 –> 00:08:24,759
of 2020 um to really go out and visit
226
00:08:23,599 –> 00:08:26,840
the company and again this is a
227
00:08:24,759 –> 00:08:29,680
far-flung company many countries many
228
00:08:26,840 –> 00:08:31,440
continents many offices many people um
229
00:08:29,680 –> 00:08:33,159
and it was an amazing experience because
230
00:08:31,440 –> 00:08:35,240
I presented the first time to the
231
00:08:33,159 –> 00:08:37,760
executive committee in March of 2020 the
232
00:08:35,240 –> 00:08:39,399
first week one week after that the
233
00:08:37,760 –> 00:08:42,159
company shut down as did the rest of the
234
00:08:39,399 –> 00:08:44,519
world and I then had the
235
00:08:42,159 –> 00:08:47,360
opportunity to build a team but within
236
00:08:44,519 –> 00:08:49,200
the confines of COVID um and it was an
237
00:08:47,360 –> 00:08:52,279
astonishing experience when I got there
238
00:08:49,200 –> 00:08:56,240
it was a team of one me um and when I
239
00:08:52,279 –> 00:08:59,120
left in January of 2024 this January we
240
00:08:56,240 –> 00:09:00,920
were at 1100 people um and one of the
241
00:08:59,120 –> 00:09:02,680
things that I focused on was I
242
00:09:00,920 –> 00:09:04,680
recognized that there were a lot of
243
00:09:02,680 –> 00:09:06,760
things that needed to be done but in
244
00:09:04,680 –> 00:09:08,880
order to preserve the culture and Legacy
245
00:09:06,760 –> 00:09:11,440
of what the company had built and also
246
00:09:08,880 –> 00:09:14,040
to bring in new blood we ensured that
247
00:09:11,440 –> 00:09:16,600
our teams were always 50/50 right my
248
00:09:14,040 –> 00:09:19,560
team had to be 50% New Blood new
249
00:09:16,600 –> 00:09:20,600
technologists data scientists AI experts
250
00:09:19,560 –> 00:09:22,480
folks who were thinking about the
251
00:09:20,600 –> 00:09:24,880
ethical nature of of what we were doing
252
00:09:22,480 –> 00:09:26,600
with analytics and also people who truly
253
00:09:24,880 –> 00:09:28,560
understood how these large Enterprises
254
00:09:26,600 –> 00:09:30,000
worked right and PepsiCo has no shortage
255
00:09:28,560 –> 00:09:31,440
of amazing talent
256
00:09:30,000 –> 00:09:33,440
there were amazing people to go out and
257
00:09:31,440 –> 00:09:35,360
pick from and some jobs required that
258
00:09:33,440 –> 00:09:37,200
I’ll give you an example data governance
259
00:09:35,360 –> 00:09:38,640
which is this you know bandied-about
260
00:09:37,200 –> 00:09:40,320
word it’s a hard word right because it
261
00:09:38,640 –> 00:09:42,560
sounds very bureaucratic it sounds like
262
00:09:40,320 –> 00:09:44,040
you’re slowing down Enterprise progress
263
00:09:42,560 –> 00:09:46,360
but it’s really understanding how to
264
00:09:44,040 –> 00:09:47,760
organize the information you have and
265
00:09:46,360 –> 00:09:49,480
those folks we had to find from within
266
00:09:47,760 –> 00:09:52,240
the company you know and that was that
267
00:09:49,480 –> 00:09:54,480
was during covid that was difficult
268
00:09:52,240 –> 00:09:56,600
right because how how these companies
269
00:09:54,480 –> 00:09:59,120
are built on relationships and when you
270
00:09:56,600 –> 00:10:02,160
can’t go out and meet them it was uh it
271
00:09:59,120 –> 00:10:04,440
was it was tough so what did you offer
272
00:10:02,160 –> 00:10:06,279
them how did you make it how did you
273
00:10:04,440 –> 00:10:07,760
make it attractive yeah I think this is
274
00:10:06,279 –> 00:10:11,600
where culture plays really important
275
00:10:07,760 –> 00:10:14,760
role um I we wanted to build a team that
276
00:10:11,600 –> 00:10:17,200
felt like they had a purpose and to be
277
00:10:14,760 –> 00:10:19,640
to be clear we also had an incredible
278
00:10:17,200 –> 00:10:21,320
amount of support one of the advantages
279
00:10:19,640 –> 00:10:23,200
one of the very few advantages that COVID
280
00:10:21,320 –> 00:10:26,360
gave us was some of our Executives had
281
00:10:23,200 –> 00:10:29,079
more time in fact our CEO Ramon Laguarta
282
00:10:26,360 –> 00:10:30,320
who is a wonderful CEO had more time and
283
00:10:29,079 –> 00:10:32,560
so I was able to spend some time
284
00:10:30,320 –> 00:10:35,160
directly with him uh on at one point a
285
00:10:32,560 –> 00:10:37,000
weekly basis to try and understand what
286
00:10:35,160 –> 00:10:39,560
the ideas he had for the future of the
287
00:10:37,000 –> 00:10:41,920
company and it is astonishing when you
288
00:10:39,560 –> 00:10:44,240
have a CEO who is willing to learn
289
00:10:41,920 –> 00:10:46,200
willing to listen and then makes the
290
00:10:44,240 –> 00:10:47,399
decisions that he or she needs to make
291
00:10:46,200 –> 00:10:49,320
to make sure that the company is
292
00:10:47,399 –> 00:10:52,360
thinking about the future and Ramon is
293
00:10:49,320 –> 00:10:54,200
very much one of those CEOs um and and
294
00:10:52,360 –> 00:10:56,440
he gave us Direction and purpose and
295
00:10:54,200 –> 00:10:59,600
guidance and really allowed us to build
296
00:10:56,440 –> 00:11:01,959
a team that would use the best of what was
297
00:10:59,600 –> 00:11:04,440
in the company but allow us to bring in
298
00:11:01,959 –> 00:11:06,360
what we needed to bring in to transform
299
00:11:04,440 –> 00:11:08,480
not just all the shiny metal objects of
300
00:11:06,360 –> 00:11:10,760
AI and analytics and all the cool
301
00:11:08,480 –> 00:11:12,360
applications we could build but actually
302
00:11:10,760 –> 00:11:14,040
fixing the infrastructure right
303
00:11:12,360 –> 00:11:16,120
fundamentally understanding that any
304
00:11:14,040 –> 00:11:18,279
large Enterprise organization today has
305
00:11:16,120 –> 00:11:20,399
too much data right they have more data
306
00:11:18,279 –> 00:11:21,959
than they need typically and yeah the
307
00:11:20,399 –> 00:11:23,880
problem is how to get that organized
308
00:11:21,959 –> 00:11:25,720
especially when you’ve been used to
309
00:11:23,880 –> 00:11:27,639
operating in a in a variety of different
310
00:11:25,720 –> 00:11:30,360
ways and that problem by the way I saw
311
00:11:27,639 –> 00:11:33,000
it at ESPN we saw it
312
00:11:30,360 –> 00:11:35,320
and anyway you see today everybody is on
313
00:11:33,000 –> 00:11:37,880
a journey right and what makes it tricky
314
00:11:35,320 –> 00:11:39,600
is technology is Shifting so quickly now
315
00:11:37,880 –> 00:11:41,399
the question is when you’re making these
316
00:11:39,600 –> 00:11:43,720
big Investments and you know tens of
317
00:11:41,399 –> 00:11:46,560
millions aren’t they already
318
00:11:43,720 –> 00:11:49,440
outdated they get outdated yeah I think
319
00:11:46,560 –> 00:11:51,040
that’s that’s that’s part of the magic
320
00:11:49,440 –> 00:11:53,839
is trying to understand where are we
321
00:11:51,040 –> 00:11:55,800
today where do we need to go and then
322
00:11:53,839 –> 00:11:57,360
it’s not a straight line it’s kind of a
323
00:11:55,800 –> 00:12:00,320
you know it’s like navigating by boat
324
00:11:57,360 –> 00:12:02,079
you have to pick a path we’re going to go here
325
00:12:00,320 –> 00:12:03,519
and maybe the Seas will be rough today
326
00:12:02,079 –> 00:12:05,240
and maybe the winds will take us that
327
00:12:03,519 –> 00:12:06,200
way but you kind of you finally make it
328
00:12:05,240 –> 00:12:07,920
there and then you make it to the next
329
00:12:06,200 –> 00:12:09,800
one and you make it to the next one but
330
00:12:07,920 –> 00:12:13,000
culture I think is what keeps the boat
331
00:12:09,800 –> 00:12:15,199
sort of afloat right and so I focused
332
00:12:13,000 –> 00:12:17,360
very hard on creating a sense of
333
00:12:15,199 –> 00:12:19,440
belonging for this company and for these
334
00:12:17,360 –> 00:12:21,040
people because they needed to believe
335
00:12:19,440 –> 00:12:23,600
that they could actually impact
336
00:12:21,040 –> 00:12:25,560
transformation at a company like PepsiCo
337
00:12:23,600 –> 00:12:28,079
or a company like Nielsen or a company
338
00:12:25,560 –> 00:12:30,160
like ESPN and without that sort of
339
00:12:28,079 –> 00:12:31,480
guiding North Star I think it’s very
340
00:12:30,160 –> 00:12:32,839
difficult to succeed and we had a lot of
341
00:12:31,480 –> 00:12:36,480
support in doing
342
00:12:32,839 –> 00:12:38,959
that yeah so this is I guess the most
343
00:12:36,480 –> 00:12:41,760
important um Factor right like having
344
00:12:38,959 –> 00:12:43,560
people um supporting each other
345
00:12:41,760 –> 00:12:46,839
communicating and
346
00:12:43,560 –> 00:12:49,519
brainstorming um because what other
347
00:12:46,839 –> 00:12:51,639
factors do you think are setting apart
348
00:12:49,519 –> 00:12:54,079
those companies which are winning the
349
00:12:51,639 –> 00:12:56,160
race and and and being competitive and
350
00:12:54,079 –> 00:12:58,800
being I think the human differentiator
351
00:12:56,160 –> 00:12:59,839
is the greatest right and at PepsiCo we had
352
00:12:58,800 –> 00:13:02,399
a
353
00:12:59,839 –> 00:13:04,560
a manager named Athina Kanioura and Athina
354
00:13:02,399 –> 00:13:07,440
is a Dr Athina Kanioura she’s a remarkable
355
00:13:04,560 –> 00:13:09,480
woman she actually ran a huge portion of
356
00:13:07,440 –> 00:13:11,399
the Accenture analytics world and when she
357
00:13:09,480 –> 00:13:13,040
came in she had a PhD in sort of the
358
00:13:11,399 –> 00:13:15,480
early days of statistics and data
359
00:13:13,040 –> 00:13:17,079
science so uh she ran all of strategy
360
00:13:15,480 –> 00:13:19,120
and transformation at the time that I
361
00:13:17,079 –> 00:13:20,760
was there and to have somebody who
362
00:13:19,120 –> 00:13:23,440
fundamentally understands the language
363
00:13:20,760 –> 00:13:25,079
in which you speak has strong ideas um
364
00:13:23,440 –> 00:13:26,959
and and can really speak to everyone on
365
00:13:25,079 –> 00:13:29,519
your team from a data scientist up to a
366
00:13:26,959 –> 00:13:31,560
chief strategist it makes it much easier
367
00:13:29,519 –> 00:13:34,320
right and and and then the
368
00:13:31,560 –> 00:13:37,079
creating of joint goals for the
369
00:13:34,320 –> 00:13:38,440
leadership team of the company becomes
370
00:13:37,079 –> 00:13:40,320
easier because you have somebody who
371
00:13:38,440 –> 00:13:41,880
supports you who is thinking about what
372
00:13:40,320 –> 00:13:43,240
you’re thinking about obviously has a
373
00:13:41,880 –> 00:13:44,839
broader remit and other things that she
374
00:13:43,240 –> 00:13:46,600
was thinking about for the broader
375
00:13:44,839 –> 00:13:48,360
digital and and corporate transformation
376
00:13:46,600 –> 00:13:49,959
of the company but you don’t have to
377
00:13:48,360 –> 00:13:52,639
explain everything to her because she
378
00:13:49,959 –> 00:13:55,360
knows right and so to have that support
379
00:13:52,639 –> 00:13:58,120
to have Ramon to have Hugh in the
380
00:13:55,360 –> 00:14:01,120
initial phase of my job um that was an
381
00:13:58,120 –> 00:14:02,680
amazing thing right and and then as we
382
00:14:01,120 –> 00:14:05,079
built out the company as we built out
383
00:14:02,680 –> 00:14:07,199
the the function I’m afraid uh I’m sorry
384
00:14:05,079 –> 00:14:08,800
um we found people that really had the
385
00:14:07,199 –> 00:14:10,959
same passion and energy for what it is
386
00:14:08,800 –> 00:14:12,880
we did and the cool thing was there were
387
00:14:10,959 –> 00:14:14,600
plenty of them to be found and during
388
00:14:12,880 –> 00:14:17,560
covid you know it was an interesting
389
00:14:14,600 –> 00:14:19,160
reaction that everybody had right we we
390
00:14:17,560 –> 00:14:21,720
there was some guilt about not being in
391
00:14:19,160 –> 00:14:24,279
the office and so we worked extremely
392
00:14:21,720 –> 00:14:27,160
hard I mean I remember long phases I
393
00:14:24,279 –> 00:14:29,720
mean quarters of you know 15 18 hour
394
00:14:27,160 –> 00:14:31,560
days where and you remember everybody
395
00:14:29,720 –> 00:14:34,279
remembers right we were on Zoom meeting
396
00:14:31,560 –> 00:14:38,160
after Zoom meeting or Teams or yeah that
397
00:14:34,279 –> 00:14:40,040
wasn’t necessarily healthy oh no it was
398
00:14:38,160 –> 00:14:42,920
as you mentioned that pendulum has swung
399
00:14:40,040 –> 00:14:44,320
a little bit right so we have we had a
400
00:14:42,920 –> 00:14:47,120
lot of people who got used to working
401
00:14:44,320 –> 00:14:48,600
from home or moved to other places my
402
00:14:47,120 –> 00:14:50,440
best friend got up from New York City
403
00:14:48,600 –> 00:14:52,079
and moved to the mountains in Montana
404
00:14:50,440 –> 00:14:54,199
and you know loves it there and has
405
00:14:52,079 –> 00:14:55,519
never come back and I think as you
406
00:14:54,199 –> 00:14:57,320
pointed out some of that pendulum has
407
00:14:55,519 –> 00:14:58,600
swung back you know companies are
408
00:14:57,320 –> 00:14:59,800
requiring people to be back in the
409
00:14:58,600 –> 00:15:01,839
office
410
00:14:59,800 –> 00:15:03,399
before I left PepsiCo you know I’d said we
411
00:15:01,839 –> 00:15:04,880
need you to be in the office for 3 days
412
00:15:03,399 –> 00:15:06,839
but there was some flexibility around
413
00:15:04,880 –> 00:15:08,320
how you do that I spoke to a lot of my
414
00:15:06,839 –> 00:15:09,720
peers around what they were doing and
415
00:15:08,320 –> 00:15:11,519
similar things you know two three days
416
00:15:09,720 –> 00:15:13,600
in the office and some flexibility
417
00:15:11,519 –> 00:15:15,160
around that for large corporations other
418
00:15:13,600 –> 00:15:16,360
companies have gone completely remote
419
00:15:15,160 –> 00:15:18,399
you know especially a lot of smaller
420
00:15:16,360 –> 00:15:20,759
companies and I now have a business
421
00:15:18,399 –> 00:15:23,240
where I advise young startups in the AI
422
00:15:20,759 –> 00:15:25,360
space um and a lot of them went remote
423
00:15:23,240 –> 00:15:27,199
forever right because they realized that
424
00:15:25,360 –> 00:15:29,199
if they wanted to as I talked about
425
00:15:27,199 –> 00:15:31,639
earlier the the if the defining Factor
426
00:15:29,199 –> 00:15:33,639
is your human talent to allow technology
427
00:15:31,639 –> 00:15:36,199
to really Thrive you have to go where
428
00:15:33,639 –> 00:15:38,040
the fish are and that means as the world
429
00:15:36,199 –> 00:15:40,000
has changed you find people where you
430
00:15:38,040 –> 00:15:41,600
can right and I think with a lot of them
431
00:15:40,000 –> 00:15:43,079
they had a tremendous amount of success
432
00:15:41,600 –> 00:15:45,880
I do think it can be harder with large
433
00:15:43,079 –> 00:15:47,519
corporations because in order to create
434
00:15:45,880 –> 00:15:49,759
a sense of solidarity you do need to
435
00:15:47,519 –> 00:15:51,680
spend time with the business and one of
436
00:15:49,759 –> 00:15:53,720
the things that became very clear to us
437
00:15:51,680 –> 00:15:55,920
was in order for our analytics products
438
00:15:53,720 –> 00:15:58,000
to succeed we needed the business to
439
00:15:55,920 –> 00:16:00,160
fundamentally understand what we did and
440
00:15:58,000 –> 00:16:02,199
so we spent a lot of time on education
441
00:16:00,160 –> 00:16:04,720
and a lot of time on visiting right as
442
00:16:02,199 –> 00:16:06,319
soon as it became possible to do that we
443
00:16:04,720 –> 00:16:08,240
we you know we we did a huge program
444
00:16:06,319 –> 00:16:09,600
which we’ve talked about uh before this
445
00:16:08,240 –> 00:16:11,639
I can I can I can spend some time
446
00:16:09,600 –> 00:16:13,240
talking about it where um we had the
447
00:16:11,639 –> 00:16:16,240
first education program was for the top
448
00:16:13,240 –> 00:16:17,639
250 people at the company right because
449
00:16:16,240 –> 00:16:19,160
in a conversation between Ramon and me he
450
00:16:17,639 –> 00:16:20,800
said listen we’ve got to educate the
451
00:16:19,160 –> 00:16:22,079
people making the investment decisions
452
00:16:20,800 –> 00:16:24,040
not just the folks who are using the
453
00:16:22,079 –> 00:16:26,680
tools and again you know that kind of
454
00:16:24,040 –> 00:16:29,000
foresight from a CEO is is so helpful
455
00:16:26,680 –> 00:16:30,680
right because it allows you to then
456
00:16:29,000 –> 00:16:32,079
Focus your activities on the things that
457
00:16:30,680 –> 00:16:34,240
matter and I will tell you that
458
00:16:32,079 –> 00:16:35,199
education program served us for the next
459
00:16:34,240 –> 00:16:38,319
four
460
00:16:35,199 –> 00:16:40,759
years how did it work so what what what
461
00:16:38,319 –> 00:16:45,399
was the main yeah what was the main uh
462
00:16:40,759 –> 00:16:49,079
objective and how like what struggle and
463
00:16:45,399 –> 00:16:51,279
and what was the um stimulus what was the
464
00:16:49,079 –> 00:16:54,680
reason why you decided
465
00:16:51,279 –> 00:16:56,920
to I mean I think in order to at a
466
00:16:54,680 –> 00:17:00,519
company like PepsiCo or a company like
467
00:16:56,920 –> 00:17:02,519
Disney in order to actually roll out
468
00:17:00,519 –> 00:17:05,000
powerful new technology products that
469
00:17:02,519 –> 00:17:07,079
will change jobs change organizational
470
00:17:05,000 –> 00:17:08,919
structures change the way people are
471
00:17:07,079 –> 00:17:10,640
used to operating sometimes very
472
00:17:08,919 –> 00:17:12,120
successfully right these are these are
473
00:17:10,640 –> 00:17:13,319
not companies that were not having a
474
00:17:12,120 –> 00:17:15,559
problem these were companies that have
475
00:17:13,319 –> 00:17:17,000
done very well and sometimes it’s harder
476
00:17:15,559 –> 00:17:18,319
to change a company that’s done very
477
00:17:17,000 –> 00:17:20,720
well than to change a company that
478
00:17:18,319 –> 00:17:22,520
hasn’t because the impetus for change as
479
00:17:20,720 –> 00:17:24,319
you’ve pointed out might be less that’s
480
00:17:22,520 –> 00:17:26,480
where you need leadership right where
481
00:17:24,319 –> 00:17:30,240
you need people like a Bob Iger at
482
00:17:26,480 –> 00:17:32,720
Disney or Ramon at at at at PepsiCo um
483
00:17:30,240 –> 00:17:35,919
to say we are setting a North Star because
484
00:17:32,720 –> 00:17:37,240
we are anticipating change down the road
485
00:17:35,919 –> 00:17:39,280
and we want to make sure no matter
486
00:17:37,240 –> 00:17:41,559
what’s going on now we want to create an
487
00:17:39,280 –> 00:17:43,400
environment where we are encouraging
488
00:17:41,559 –> 00:17:45,120
change not just for the sake of change
489
00:17:43,400 –> 00:17:47,120
but because we know the company will
490
00:17:45,120 –> 00:17:49,280
require it right and that’s why
491
00:17:47,120 –> 00:17:51,919
education is not just the first step
492
00:17:49,280 –> 00:17:54,080
education is an evergreen process and
493
00:17:51,919 –> 00:17:56,480
you know I do wish every company would
494
00:17:54,080 –> 00:17:58,200
do education more often because we get
495
00:17:56,480 –> 00:18:00,919
very used to the idea that we have these
496
00:17:58,200 –> 00:18:02,640
very smart capable people and you know
497
00:18:00,919 –> 00:18:04,080
they we often we brought them in or we
498
00:18:02,640 –> 00:18:06,240
we’ve educated them based on what we
499
00:18:04,080 –> 00:18:09,039
knew they knew but sometimes they’re
500
00:18:06,240 –> 00:18:11,200
they’re afraid to say I don’t know right
501
00:18:09,039 –> 00:18:12,720
and and and in some cases I don’t
502
00:18:11,200 –> 00:18:14,640
blame them it’s hard right you have to
503
00:18:12,720 –> 00:18:16,080
get up and say look there’s this
504
00:18:14,640 –> 00:18:18,320
completely new way of doing something
505
00:18:16,080 –> 00:18:21,120
that I’ve been doing for 10 20 in some
506
00:18:18,320 –> 00:18:23,280
cases at PepsiCo Disney 40 years right
507
00:18:21,120 –> 00:18:25,320
and it’s different and everything I
508
00:18:23,280 –> 00:18:26,840
thought I knew about how to do my job
509
00:18:25,320 –> 00:18:28,760
and how many people it would require and
510
00:18:26,840 –> 00:18:32,360
how much investment I have to put in is
511
00:18:28,760 –> 00:18:35,120
is changing and everyone loves change if
512
00:18:32,360 –> 00:18:37,039
it doesn’t affect them right right I
513
00:18:35,120 –> 00:18:39,640
mean that’s that’s the nature of that
514
00:18:37,039 –> 00:18:43,840
process so it’s it can be difficult and
515
00:18:39,640 –> 00:18:46,159
it can be hard and it can be um uh it
516
00:18:43,840 –> 00:18:47,919
can be humbling right and I think if you
517
00:18:46,159 –> 00:18:49,600
go in there with the attitude that this
518
00:18:47,919 –> 00:18:51,919
is about learning this is about
519
00:18:49,600 –> 00:18:53,919
experience and we have to be part of how
520
00:18:51,919 –> 00:18:55,280
these things change and that means your
521
00:18:53,919 –> 00:18:56,720
path might eventually lead somewhere
522
00:18:55,280 –> 00:18:59,559
else and that’s that’s what happened
523
00:18:56,720 –> 00:19:01,600
with me and others um we are part of a
524
00:18:59,559 –> 00:19:03,280
story we are part of a chapter in a book
525
00:19:01,600 –> 00:19:04,919
right and that book you have one that is
526
00:19:03,280 –> 00:19:06,600
your own life your own book Your Own
527
00:19:04,919 –> 00:19:08,400
Story your own experience and the other
528
00:19:06,600 –> 00:19:09,559
that is the life of these companies and
529
00:19:08,400 –> 00:19:11,400
sometimes you’ll be part of them
530
00:19:09,559 –> 00:19:13,240
sometimes you won’t but I think if you
531
00:19:11,400 –> 00:19:16,200
can find again going back to the things
532
00:19:13,240 –> 00:19:19,159
that matter find culture find Innovation
533
00:19:16,200 –> 00:19:22,400
find curiosity and use those as your North
534
00:19:19,159 –> 00:19:24,480
Stars for change um and and be confident
535
00:19:22,400 –> 00:19:26,120
in the fact that learning is the most
536
00:19:24,480 –> 00:19:28,000
powerful tool you can possess I think
537
00:19:26,120 –> 00:19:29,720
those are all powerful qualities and and
538
00:19:28,000 –> 00:19:31,280
at well-run companies like the ones I
539
00:19:29,720 –> 00:19:34,400
worked for I think you know we’ve seen
540
00:19:31,280 –> 00:19:37,799
success with it this is what will save
541
00:19:34,400 –> 00:19:41,000
people in in this whole um
542
00:19:37,799 –> 00:19:46,240
transformational exponential exponential
543
00:19:41,000 –> 00:19:50,520
uh time of where you know using tools uh
544
00:19:46,240 –> 00:19:53,000
embedded with like with AI um it’s it’s
545
00:19:50,520 –> 00:19:55,280
becoming like the normal type of work
546
00:19:53,000 –> 00:19:58,919
you are doing it’s a commodity so you
547
00:19:55,280 –> 00:20:01,679
have to put focus into
548
00:19:58,919 –> 00:20:04,400
like self-learning and and being more
549
00:20:01,679 –> 00:20:06,960
creative than what another person can
550
00:20:04,400 –> 00:20:09,480
can do with no look it’s absolutely true
551
00:20:06,960 –> 00:20:12,000
you know it was interesting I was at a
552
00:20:09,480 –> 00:20:14,640
um when I think about that notion of how
553
00:20:12,000 –> 00:20:16,120
you reskill yourself and and retrain
554
00:20:14,640 –> 00:20:17,280
people always ask me how do I get you
555
00:20:16,120 –> 00:20:18,679
know especially Business Leaders are
556
00:20:17,280 –> 00:20:20,880
like how do I get you know how do I get
557
00:20:18,679 –> 00:20:22,799
smarter about what’s possible yeah what
558
00:20:20,880 –> 00:20:25,440
I always tell them is with AI especially
559
00:20:22,799 –> 00:20:28,120
now technology is changing so fast if
560
00:20:25,440 –> 00:20:31,120
you spend just the extra time necessary
561
00:20:28,120 –> 00:20:33,400
to understand for your particular space
562
00:20:31,120 –> 00:20:35,440
whether it’s vertical or horizontal what
563
00:20:33,400 –> 00:20:37,039
is most interesting in AI there are so
564
00:20:35,440 –> 00:20:38,799
many opportunities to do it there’s
565
00:20:37,039 –> 00:20:40,760
stuff online there’s people there’s
566
00:20:38,799 –> 00:20:42,240
social media with people who genuinely
567
00:20:40,760 –> 00:20:44,760
know what they’re talking about there’s
568
00:20:42,240 –> 00:20:46,400
educational stuff on YouTube there are
569
00:20:44,760 –> 00:20:48,360
professional programs there are graduate
570
00:20:46,400 –> 00:20:50,360
degrees there’s a million ways you can
571
00:20:48,360 –> 00:20:51,440
become an expert and you can be the
572
00:20:50,360 –> 00:20:53,080
greatest expert in the world if you
573
00:20:51,440 –> 00:20:54,400
focus on it for the next six months
574
00:20:53,080 –> 00:20:56,640
right because there’s just so much
575
00:20:54,400 –> 00:20:59,760
specialization happening that nobody can
576
00:20:56,640 –> 00:21:02,000
be everything right um but it takes
577
00:20:59,760 –> 00:21:03,880
effort and it takes leaning in and it takes
578
00:21:02,000 –> 00:21:06,240
again that humility of saying I don’t
579
00:21:03,880 –> 00:21:08,880
know what the future holds for what it
580
00:21:06,240 –> 00:21:11,279
is that I do I can learn about it I can
581
00:21:08,880 –> 00:21:13,240
become an expert on it I can I have I
582
00:21:11,279 –> 00:21:15,200
have the aptitude and the skill but if I
583
00:21:13,240 –> 00:21:18,240
sit back and just hope that the storm
584
00:21:15,200 –> 00:21:20,240
blows me by then I think that you are
585
00:21:18,240 –> 00:21:22,039
putting a lot of things at risk right
586
00:21:20,240 –> 00:21:24,520
and especially if you are a manager if
587
00:21:22,039 –> 00:21:25,400
you’re a you know a significant manager
588
00:21:24,520 –> 00:21:27,159
and some of the you know some of these
589
00:21:25,400 –> 00:21:29,520
people these large corporations run
590
00:21:27,159 –> 00:21:31,000
thousands tens of thousands of people people
591
00:21:29,520 –> 00:21:32,720
um you’re doing a disservice to all of
592
00:21:31,000 –> 00:21:34,960
the people who work for you as well
593
00:21:32,720 –> 00:21:36,640
because change will impact them right
594
00:21:34,960 –> 00:21:38,679
whether they like it or not and it might
595
00:21:36,640 –> 00:21:40,120
come slowly it might come in a measured
596
00:21:38,679 –> 00:21:42,240
way where you are actually sort of
597
00:21:40,120 –> 00:21:43,720
controlling that change or it will come
598
00:21:42,240 –> 00:21:46,919
in a way that you don’t like and you
599
00:21:43,720 –> 00:21:48,720
don’t expect right and um so pick pick
600
00:21:46,919 –> 00:21:51,120
pick a poison a little bit right take
601
00:21:48,720 –> 00:21:53,799
the extra time make the effort lean in
602
00:21:51,120 –> 00:21:55,320
educate upskill talk to people Network
603
00:21:53,799 –> 00:21:56,640
connect with some of the thinking around
604
00:21:55,320 –> 00:21:57,760
there I mean there are so many great
605
00:21:56,640 –> 00:22:00,480
opportunities like listening to
606
00:21:57,760 –> 00:22:01,600
something like this going to conferences
607
00:22:00,480 –> 00:22:03,320
uh I’m part of a couple of different
608
00:22:01,600 –> 00:22:04,760
organizations like World 50 where you
609
00:22:03,320 –> 00:22:06,679
can connect with some of your peers
610
00:22:04,760 –> 00:22:08,679
which for me has been incredibly
611
00:22:06,679 –> 00:22:10,320
educational and I owe a lot to some of
612
00:22:08,679 –> 00:22:12,799
the conversations I’ve had with with
613
00:22:10,320 –> 00:22:14,559
with my peers who are you know again
614
00:22:12,799 –> 00:22:16,400
Chief data analytics officers are kind
615
00:22:14,559 –> 00:22:18,320
of a a right brain left brain
616
00:22:16,400 –> 00:22:19,799
combination of yeah smart interesting
617
00:22:18,320 –> 00:22:21,880
people who understand the business who
618
00:22:19,799 –> 00:22:24,200
also understand technology uh and
619
00:22:21,880 –> 00:22:26,039
they’re usually very generous right um
620
00:22:24,200 –> 00:22:27,240
and I’ve always said to young people
621
00:22:26,039 –> 00:22:28,799
reach out to the people in your
622
00:22:27,240 –> 00:22:31,080
organization no matter how senior they
623
00:22:28,799 –> 00:22:33,919
are and ask for time the worst they can
624
00:22:31,080 –> 00:22:36,000
say is no right and at that point even
625
00:22:33,919 –> 00:22:38,840
the acknowledgement that you tried is
626
00:22:36,000 –> 00:22:40,960
something um and I think that that that
627
00:22:38,840 –> 00:22:43,480
Spirit of curiosity and exploration and
628
00:22:40,960 –> 00:22:44,880
Innovation will now become a keystone to
629
00:22:43,480 –> 00:22:46,679
how we operate in the world today
630
00:22:44,880 –> 00:22:48,440
because it won’t just be about what you
631
00:22:46,679 –> 00:22:50,679
know or what knowledge is resident in your
632
00:22:48,440 –> 00:22:52,679
head all knowledge is now available the
633
00:22:50,679 –> 00:22:55,679
question is how do you utilize that how
634
00:22:52,679 –> 00:22:58,679
do you make it real um and for that you
635
00:22:55,679 –> 00:23:01,640
have to go out into the world yeah which
636
00:22:58,679 –> 00:23:04,400
brings me to another um aspect of you
637
00:23:01,640 –> 00:23:07,200
have a lot of knowledge but some of it
638
00:23:04,400 –> 00:23:09,760
is not real some of it is fake some some
639
00:23:07,200 –> 00:23:14,400
of the of the things you you’re reading
640
00:23:09,760 –> 00:23:19,880
you are like people just um share spread
641
00:23:14,400 –> 00:23:22,000
um are not good for you um so how in
642
00:23:19,880 –> 00:23:25,679
your opinion how do
643
00:23:22,000 –> 00:23:27,679
you how can you verify what’s what’s the
644
00:23:25,679 –> 00:23:29,640
right how you build your mental filters
645
00:23:27,679 –> 00:23:31,200
on this stuff is everything right as as
646
00:23:29,640 –> 00:23:33,480
you pointed out there is infinite
647
00:23:31,200 –> 00:23:35,400
knowledge available now we have more
648
00:23:33,480 –> 00:23:39,000
power in our phones than they did going
649
00:23:35,400 –> 00:23:41,760
to the Moon 50 years ago 60 years ago um
650
00:23:39,000 –> 00:23:43,480
we’ve got more than that now um and so
651
00:23:41,760 –> 00:23:45,440
beginning to understand how you
652
00:23:43,480 –> 00:23:48,440
differentiate between what is real and
653
00:23:45,440 –> 00:23:50,360
what is not will be a key skill right
654
00:23:48,440 –> 00:23:51,919
and I think the best way to do that is
655
00:23:50,360 –> 00:23:54,480
through the people filter not through
656
00:23:51,919 –> 00:23:56,320
the technology filter right very often
657
00:23:54,480 –> 00:23:58,120
you have to pick your sources of
658
00:23:56,320 –> 00:23:59,679
information and I’m going to stay away
659
00:23:58,120 –> 00:24:01,039
from any anything political but you know
660
00:23:59,679 –> 00:24:05,360
we live in a world today that’s highly
661
00:24:01,039 –> 00:24:07,120
polarized right and media content
662
00:24:05,360 –> 00:24:08,960
messaging even what my face looks like
663
00:24:07,120 –> 00:24:11,840
can be replicated can be changed can be
664
00:24:08,960 –> 00:24:14,520
altered can be transformed um can be
665
00:24:11,840 –> 00:24:18,200
used for purposes that are nefarious or
666
00:24:14,520 –> 00:24:22,520
or even um sort of malicious right and
667
00:24:18,200 –> 00:24:24,240
so figuring out how you trust and I I I
668
00:24:22,520 –> 00:24:27,840
I use that word casually but that word
669
00:24:24,240 –> 00:24:29,600
trust is unbelievably important right I
670
00:24:27,840 –> 00:24:30,960
think about it a lot and one of the
671
00:24:29,600 –> 00:24:32,960
things I talked to my teams about at
672
00:24:30,960 –> 00:24:35,120
PepsiCo specifically was trust is our
673
00:24:32,960 –> 00:24:37,480
most important word the fact that the
674
00:24:35,120 –> 00:24:39,320
company needs to trust us that we are
675
00:24:37,480 –> 00:24:40,960
coming in with the right intent and the
676
00:24:39,320 –> 00:24:43,120
right tools and the right technology and
677
00:24:40,960 –> 00:24:45,399
the right opportunities the fact that
678
00:24:43,120 –> 00:24:47,880
they have trust in the data itself that
679
00:24:45,399 –> 00:24:50,399
the data is correct that it is you know
680
00:24:47,880 –> 00:24:52,159
data can be very dirty in all its forms
681
00:24:50,399 –> 00:24:54,120
it depends who put it in depends how it was
682
00:24:52,159 –> 00:24:56,240
created depends how it was structured or
683
00:24:54,120 –> 00:24:59,159
unstructured how it was analyzed what
684
00:24:56,240 –> 00:25:00,760
models were used to decrypt its secrets
685
00:24:59,159 –> 00:25:02,279
and then finally what were the outputs
686
00:25:00,760 –> 00:25:04,240
of these products and applications that
687
00:25:02,279 –> 00:25:05,679
we created right are they better than
688
00:25:04,240 –> 00:25:08,080
what they’ve been using before are they
689
00:25:05,679 –> 00:25:09,240
not and so both that personal trust in
690
00:25:08,080 –> 00:25:11,000
the fact that we are doing the right
691
00:25:09,240 –> 00:25:12,360
thing and that professional trust in the
692
00:25:11,000 –> 00:25:14,120
fact that the data and applications are
693
00:25:12,360 –> 00:25:15,559
right is very important and the last
694
00:25:14,120 –> 00:25:18,159
piece is what you just described which
695
00:25:15,559 –> 00:25:20,520
is we also operate in a world where we
696
00:25:18,159 –> 00:25:23,159
need external data signals right
697
00:25:20,520 –> 00:25:26,600
whatever they might be and some of those
698
00:25:23,159 –> 00:25:27,840
now are beginning to be less trusted
699
00:25:26,600 –> 00:25:29,640
than they were back when there were very
700
00:25:27,840 –> 00:25:32,200
few
701
00:25:29,640 –> 00:25:34,559
regulated or managed you know I mean now
702
00:25:32,200 –> 00:25:36,840
the world is global we are we are linked
703
00:25:34,559 –> 00:25:39,279
to everyone in in you know in one way or
704
00:25:36,840 –> 00:25:43,000
the other there are very many ways for
705
00:25:39,279 –> 00:25:44,919
people to sort of not do the right thing
706
00:25:43,000 –> 00:25:46,399
and again that’s why we need governance
707
00:25:44,919 –> 00:25:48,200
we need the world to think about this
708
00:25:46,399 –> 00:25:50,320
stuff um I know we were talking earlier
709
00:25:48,200 –> 00:25:51,600
about the fact that I’ve spent time with
710
00:25:50,320 –> 00:25:53,159
sort of the World Economic Forum and
711
00:25:51,600 –> 00:25:56,000
some of the work they’ve been doing with
712
00:25:53,159 –> 00:25:58,360
a lot of my peers with regulatory bodies
713
00:25:56,000 –> 00:26:01,080
with governmental and legislative bodies
714
00:25:58,360 –> 00:26:03,320
and the amount of information is widely
715
00:26:01,080 –> 00:26:05,240
variable in terms of what people know
716
00:26:03,320 –> 00:26:06,679
right I mean we have legislation
717
00:26:05,240 –> 00:26:08,399
sometimes being created by very smart
718
00:26:06,679 –> 00:26:10,200
young aides who are you know spending
719
00:26:08,399 –> 00:26:11,720
three six months even a year trying to
720
00:26:10,200 –> 00:26:13,760
understand all of AI it’s a difficult
721
00:26:11,720 –> 00:26:15,799
job right and and and then you’re trying
722
00:26:13,760 –> 00:26:17,120
to create legislation for the future
723
00:26:15,799 –> 00:26:18,880
when the technology is changing every
724
00:26:17,120 –> 00:26:22,159
six months every three months every
725
00:26:18,880 –> 00:26:23,440
month yeah um I was at a as I mentioned
726
00:26:22,159 –> 00:26:25,399
I was at a World Economic Forum
727
00:26:23,440 –> 00:26:27,679
conference late last year where it was
728
00:26:25,399 –> 00:26:29,120
their first Forum on data governance and
729
00:26:27,679 –> 00:26:30,200
really thinking about sort of you know
730
00:26:29,120 –> 00:26:32,760
what’s going to change and they had
731
00:26:30,200 –> 00:26:35,960
legislators and technologists and and executives
732
00:26:32,760 –> 00:26:38,399
and operational leaders like myself and
733
00:26:35,960 –> 00:26:41,679
sort of one of the things that came up a
734
00:26:38,399 –> 00:26:44,080
ton was the speed at which this
735
00:26:41,679 –> 00:26:45,200
particular Revolution is taking place
736
00:26:44,080 –> 00:26:46,799
right when you think about even the
737
00:26:45,200 –> 00:26:49,080
internet Revolution or the mobile
738
00:26:46,799 –> 00:26:50,480
Revolution um the technology Revolution
739
00:26:49,080 –> 00:26:53,360
even the Industrial Revolution you
740
00:26:50,480 –> 00:26:56,360
had some time to figure some stuff out
741
00:26:53,360 –> 00:26:58,240
with AI what we do today in six months
742
00:26:56,360 –> 00:26:59,320
could be completely redundant and I
743
00:26:58,240 –> 00:27:00,640
don’t mean a little bit redundant I
744
00:26:59,320 –> 00:27:02,320
don’t mean you can keep using that old
745
00:27:00,640 –> 00:27:04,919
one for three years and it’ll kind of be
746
00:27:02,320 –> 00:27:08,559
all right it’s not going to work right
747
00:27:04,919 –> 00:27:12,880
not the same way so I think that that
748
00:27:08,559 –> 00:27:15,760
pace of change is new um and the amount
749
00:27:12,880 –> 00:27:17,640
of impact it’s having mainly because now
750
00:27:15,760 –> 00:27:19,919
consumers have become very aware of it
751
00:27:17,640 –> 00:27:22,320
since the end of 2022 when ChatGPT
752
00:27:19,919 –> 00:27:23,919
became real um and suddenly they saw in
753
00:27:22,320 –> 00:27:25,159
all of the consumer applications that
754
00:27:23,919 –> 00:27:27,360
they were used to dealing with that
755
00:27:25,159 –> 00:27:29,480
there was this revolutionary new way to
756
00:27:27,360 –> 00:27:32,039
think about how content is generated and
757
00:27:29,480 –> 00:27:33,679
how you can speak to a computer if
758
00:27:32,039 –> 00:27:35,159
you will and how it can create things
759
00:27:33,679 –> 00:27:37,360
for you that were just not possible
760
00:27:35,159 –> 00:27:39,840
before this suddenly AI which had been
761
00:27:37,360 –> 00:27:42,279
around for decades came to the
762
00:27:39,840 –> 00:27:43,760
forefront and we see again the same hype
763
00:27:42,279 –> 00:27:45,480
cycle you know some of the stuff that
764
00:27:43,760 –> 00:27:49,399
people were expecting to work is not
765
00:27:45,480 –> 00:27:52,200
working um I also I always laugh
766
00:27:49,399 –> 00:27:54,760
about this when I presented I used the
767
00:27:52,200 –> 00:27:56,600
analogy of an iceberg right it actually
768
00:27:54,760 –> 00:27:58,679
became the logo of our team subsequent
769
00:27:56,600 –> 00:28:00,120
to that because the top of the iceberg
770
00:27:58,679 –> 00:28:01,600
is you know the 10% that you see and
771
00:28:00,120 –> 00:28:03,679
it’s the AI and the analytics and the
772
00:28:01,600 –> 00:28:07,120
cool stuff but the bottom of the iceberg
773
00:28:03,679 –> 00:28:09,000
is the data right and and data goes from
774
00:28:07,120 –> 00:28:10,720
sexy to non-sexy to sexy to non-sexy and
775
00:28:09,000 –> 00:28:12,200
it’s kind of in between right now but
776
00:28:10,720 –> 00:28:13,960
and it’s painful right nobody wants to
777
00:28:12,200 –> 00:28:16,080
hear about how complicated it is to get
778
00:28:13,960 –> 00:28:18,840
your data organized but I think what’s
779
00:28:16,080 –> 00:28:21,760
changing is historically when analytics
780
00:28:18,840 –> 00:28:23,799
were presented it was primarily a report it would
781
00:28:21,760 –> 00:28:25,320
be numbers and often there was a fair
782
00:28:23,799 –> 00:28:27,080
amount of healthy skepticism you know these
783
00:28:25,320 –> 00:28:29,279
are not my reports these are not my
784
00:28:27,080 –> 00:28:30,279
numbers where do you get them from yeah
785
00:28:29,279 –> 00:28:32,360
with some of the stuff that’s being
786
00:28:30,279 –> 00:28:34,360
generated now it could read like a
787
00:28:32,360 –> 00:28:36,960
report written by a human analyst who took
788
00:28:34,360 –> 00:28:40,480
a week to report it except it took one
789
00:28:36,960 –> 00:28:42,120
second to generate by an AI and you are
790
00:28:40,480 –> 00:28:43,960
more likely to believe something when
791
00:28:42,120 –> 00:28:45,919
you believe that it came from what felt
792
00:28:43,960 –> 00:28:47,640
like a human right and that could be a
793
00:28:45,919 –> 00:28:49,559
written report and there’s lots of
794
00:28:47,640 –> 00:28:51,880
talk now about how a lot of startups are
795
00:28:49,559 –> 00:28:54,440
working with the creation of human face
796
00:28:51,880 –> 00:28:56,080
and voice and emotion um you know there
797
00:28:54,440 –> 00:28:57,760
is a lot happening with some of
798
00:28:56,080 –> 00:28:59,760
these things now not
799
00:28:57,760 –> 00:29:03,279
just analysis of those from real people
800
00:28:59,760 –> 00:29:04,960
but the creation of you know fake people
801
00:29:03,279 –> 00:29:06,480
essentially talking about things in an
802
00:29:04,960 –> 00:29:08,000
expressive way that you and I would
803
00:29:06,480 –> 00:29:09,840
consider the domain of a human before
804
00:29:08,000 –> 00:29:11,159
that you’re more likely to believe that
805
00:29:09,840 –> 00:29:13,600
that’s just how we’re biologically
806
00:29:11,159 –> 00:29:17,240
hardwired do you because there is this
807
00:29:13,600 –> 00:29:20,679
concept of um the uncanny valley right
808
00:29:17,240 –> 00:29:23,600
where we humans can feel that something
809
00:29:20,679 –> 00:29:26,279
is off and it makes us uncomfortable
810
00:29:23,600 –> 00:29:28,320
yes look with graphics we
811
00:29:26,279 –> 00:29:29,840
crossed that barrier a long time ago
812
00:29:28,320 –> 00:29:32,440
right I mean if you remember the early
813
00:29:29,840 –> 00:29:33,840
days of 3D effects in movies you
814
00:29:32,440 –> 00:29:35,840
would have you know that sort of
815
00:29:33,840 –> 00:29:37,960
weirdness and as you mentioned that
816
00:29:35,840 –> 00:29:40,480
uncanny valley notion that you can tell
817
00:29:37,960 –> 00:29:42,960
that they’re not quite there deep fakes
818
00:29:40,480 –> 00:29:44,640
now are unbelievably good right I mean
819
00:29:42,960 –> 00:29:46,399
quite frankly in most cases you will not
820
00:29:44,640 –> 00:29:48,360
be able to tell the difference often
821
00:29:46,399 –> 00:29:50,600
some of these AIs are beating Turing
822
00:29:48,360 –> 00:29:52,559
tests which is the proof that you are in
823
00:29:50,600 –> 00:29:54,840
fact human right maybe these tests need
824
00:29:52,559 –> 00:29:56,640
to be changed and reevaluated and I
825
00:29:54,840 –> 00:29:58,679
think the line between what is created
826
00:29:56,640 –> 00:30:01,440
and what is real as you pointed out
827
00:29:58,679 –> 00:30:03,360
is a lot more gray and quite frankly
828
00:30:01,440 –> 00:30:05,360
most of us are not equipped anymore to
829
00:30:03,360 –> 00:30:07,720
tell the difference of truly advanced
830
00:30:05,360 –> 00:30:08,640
technology um and was it Arthur C Clarke
831
00:30:07,720 –> 00:30:11,039
who said you know
832
00:30:08,640 –> 00:30:13,159
magic is just science
833
00:30:11,039 –> 00:30:14,720
that we haven’t understood yet yeah
834
00:30:13,159 –> 00:30:16,600
right and so a lot of the things that
835
00:30:14,720 –> 00:30:18,240
we’re seeing now feel magical because
836
00:30:16,600 –> 00:30:19,840
for a lot of us we’re not fully
837
00:30:18,240 –> 00:30:21,559
conversant with how and why they were
838
00:30:19,840 –> 00:30:23,240
created and you think about you know
839
00:30:21,559 –> 00:30:26,279
some of the darker side of this stuff
840
00:30:23,240 –> 00:30:28,440
fraud and uh you know earlier it used to
841
00:30:26,279 –> 00:30:29,559
be phishing emails or a text or something
842
00:30:28,440 –> 00:30:32,320
from somebody you thought you knew now it could
843
00:30:29,559 –> 00:30:34,080
be a voicemail or a video message
844
00:30:32,320 –> 00:30:36,279
from your mother saying I’m in trouble
845
00:30:34,080 –> 00:30:38,519
and I need money and so and so and you
846
00:30:36,279 –> 00:30:41,279
know lots of people have fallen prey to
847
00:30:38,519 –> 00:30:43,000
scamming already lots of companies have
848
00:30:41,279 –> 00:30:44,600
been hacked ransomware all sorts of
849
00:30:43,000 –> 00:30:46,159
things are happening in cyber and
850
00:30:44,600 –> 00:30:47,720
there’s no shortage of scary things that
851
00:30:46,159 –> 00:30:50,120
can be done with these Technologies but
852
00:30:47,720 –> 00:30:52,960
again you know when gen AI first
853
00:30:50,120 –> 00:30:54,480
came out every board every CEO
854
00:30:52,960 –> 00:30:56,840
every senior leader wanted to know how
855
00:30:54,480 –> 00:30:58,200
they could be more relevant in this and
856
00:30:56,840 –> 00:31:01,240
one of the first things we did was to
857
00:30:58,200 –> 00:31:02,960
say hold on right large language models
858
00:31:01,240 –> 00:31:04,360
are processing data in some cases
859
00:31:02,960 –> 00:31:07,000
from as much of the internet as they can
860
00:31:04,360 –> 00:31:09,399
see some of this data is not copyrighted
861
00:31:07,000 –> 00:31:11,880
some of this data we cannot use if you
862
00:31:09,399 –> 00:31:14,360
actually ask a question of a large
863
00:31:11,880 –> 00:31:15,880
language model you are educating it are
864
00:31:14,360 –> 00:31:17,480
you educating them on things that you
865
00:31:15,880 –> 00:31:18,919
should not be educating them on and
866
00:31:17,480 –> 00:31:20,440
obviously we worked with our large
867
00:31:18,919 –> 00:31:22,399
technology Partners to figure out how to
868
00:31:20,440 –> 00:31:23,559
privatize those Notions and you know
869
00:31:22,399 –> 00:31:26,200
every CDAO will tell you they’re
870
00:31:23,559 –> 00:31:28,039
figuring out how to essentially create a
871
00:31:26,200 –> 00:31:29,639
shared but private space right where
872
00:31:28,039 –> 00:31:30,639
they can converse with the AI without
873
00:31:29,639 –> 00:31:32,720
losing all of their corporate
874
00:31:30,639 –> 00:31:35,279
information but that warning was important
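To make that guardrail concrete, here is a minimal sketch, assuming a hypothetical company-hosted endpoint and made-up redaction patterns, of the kind of screening layer a data team might put between employees and a large language model:

```python
import re

# Hypothetical illustration only -- not any specific vendor's API: redact
# sensitive corporate data before a prompt ever leaves the company, in the
# spirit of the "shared but private space" described above.
SENSITIVE_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),     # ID-like numbers
    re.compile(r"(?i)\bproject\s+\w+\b"),     # internal code names
]

def screen_prompt(prompt: str) -> str:
    """Redact anything matching a sensitive pattern."""
    for pattern in SENSITIVE_PATTERNS:
        prompt = pattern.sub("[REDACTED]", prompt)
    return prompt

def call_internal_endpoint(prompt: str) -> str:
    """Stub standing in for a company-hosted model that does not retain
    or train on submitted data."""
    return f"(model response to: {prompt})"

def ask_private_llm(prompt: str) -> str:
    return call_internal_endpoint(screen_prompt(prompt))

print(ask_private_llm("Summarize Q3 numbers for Project Falcon, badge 123-45-6789"))
```

The design point is simply that the screening happens before the prompt crosses the corporate boundary, so nothing sensitive can "educate" an external model.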
875
00:31:32,720 –> 00:31:37,799
you know we’re at the stage
876
00:31:35,279 –> 00:31:39,720
with AI a little like Prometheus with
877
00:31:37,799 –> 00:31:41,240
fire right we know that it’s this
878
00:31:39,720 –> 00:31:42,880
unbelievably powerful thing in fact I used a
879
00:31:41,240 –> 00:31:44,320
stronger analogy when I was talking to
880
00:31:42,880 –> 00:31:47,279
some of my leaders at PepsiCo I said
881
00:31:44,320 –> 00:31:48,960
this is like nuclear technology right it
882
00:31:47,279 –> 00:31:51,600
can be used for good it can be used for
883
00:31:48,960 –> 00:31:53,279
evil the technology itself doesn’t care
884
00:31:51,600 –> 00:31:55,919
it’s our application of them as human
885
00:31:53,279 –> 00:31:58,080
beings that is going to matter and so we
886
00:31:55,919 –> 00:31:59,440
have to be cautious as we think through
887
00:31:58,080 –> 00:32:01,919
this because again like I said AI is
888
00:31:59,440 –> 00:32:05,000
not new but some of these things can
889
00:32:01,919 –> 00:32:06,600
change how people live and sort of work
890
00:32:05,000 –> 00:32:08,480
in the world today and we have to be
891
00:32:06,600 –> 00:32:10,639
cautious about that especially for a
892
00:32:08,480 –> 00:32:13,039
large corporation there’s an
893
00:32:10,639 –> 00:32:16,600
effort needed to think about
894
00:32:13,039 –> 00:32:21,799
potential risks such as you know uh the
895
00:32:16,600 –> 00:32:25,559
AGI concept and how it can
896
00:32:21,799 –> 00:32:29,399
transform like the way we collaborate
897
00:32:25,559 –> 00:32:32,880
the way we live and whom we fall in love
898
00:32:29,399 –> 00:32:35,720
with like whom we decide to love
899
00:32:32,880 –> 00:32:39,639
like that kind of strange Concepts which
900
00:32:35,720 –> 00:32:40,799
we previously saw in um sci-fi movies
901
00:32:39,639 –> 00:32:44,559
yeah and look I’ve been a huge
902
00:32:40,799 –> 00:32:46,600
sci-fi buff since I was a kid um
903
00:32:44,559 –> 00:32:48,080
I firmly believe that most of our ideas
904
00:32:46,600 –> 00:32:50,120
are not new they’ve been written about
905
00:32:48,080 –> 00:32:52,120
by somebody right it’s a question of
906
00:32:50,120 –> 00:32:55,519
again how we manifest the magic into
907
00:32:52,120 –> 00:32:56,720
reality through science um and as you’ve
908
00:32:55,519 –> 00:32:58,360
pointed out the technology is
909
00:32:56,720 –> 00:32:59,960
progressing so fast
910
00:32:58,360 –> 00:33:02,360
that sometimes the mechanisms that
911
00:32:59,960 –> 00:33:03,840
manage them are struggling to keep up
912
00:33:02,360 –> 00:33:06,159
right whether that’s legislation or
913
00:33:03,840 –> 00:33:08,679
regulation or even industry
914
00:33:06,159 –> 00:33:09,960
conversation um there are fundamentally
915
00:33:08,679 –> 00:33:11,279
different ways that governments are
916
00:33:09,960 –> 00:33:12,919
approaching it you know when you think
917
00:33:11,279 –> 00:33:15,080
about how Singapore or China is
918
00:33:12,919 –> 00:33:16,919
approaching it versus the us there are
919
00:33:15,080 –> 00:33:18,320
fundamental differences with sort of how
920
00:33:16,919 –> 00:33:21,320
that’s being done how that’s being
921
00:33:18,320 –> 00:33:22,760
regulated um and you know we
922
00:33:21,320 –> 00:33:24,320
don’t know right now which one is
923
00:33:22,760 –> 00:33:25,159
going to win or which one is the best we
924
00:33:24,320 –> 00:33:27,279
just know that there are different
925
00:33:25,159 –> 00:33:29,679
approaches to this thing and because we
926
00:33:27,279 –> 00:33:31,840
live in a global village it means that
927
00:33:29,679 –> 00:33:33,240
companies like the ones I’ve worked at
928
00:33:31,840 –> 00:33:35,360
and that a lot of your viewers are
929
00:33:33,240 –> 00:33:36,880
working at we are operating in a variety
930
00:33:35,360 –> 00:33:39,399
of different spaces right whether that’s
931
00:33:36,880 –> 00:33:42,039
geographical spaces or technology
932
00:33:39,399 –> 00:33:43,320
spaces or even how much technical debt
933
00:33:42,039 –> 00:33:45,559
they have so can they go and quickly
934
00:33:43,320 –> 00:33:47,480
build something fast right now without
935
00:33:45,559 –> 00:33:49,480
having to worry about fixing all their
936
00:33:47,480 –> 00:33:51,799
you know in some cases billions of dollars in
937
00:33:49,480 –> 00:33:53,000
systems that they’ve already built or do
938
00:33:51,799 –> 00:33:56,360
they have to really think about what
939
00:33:53,000 –> 00:33:58,240
that technical debt is and so each um
940
00:33:56,360 –> 00:34:00,480
engagement as I spend more time in
941
00:33:58,240 –> 00:34:02,760
Consulting is particularly
942
00:34:00,480 –> 00:34:05,360
different right there is no Silver
943
00:34:02,760 –> 00:34:07,240
Bullet for everyone there is no common
944
00:34:05,360 –> 00:34:09,440
answer for have we sorted out
945
00:34:07,240 –> 00:34:11,679
bias have we thought about a regulatory
946
00:34:09,440 –> 00:34:13,520
framework have we thought about an
947
00:34:11,679 –> 00:34:15,760
ethical framework for how we think about
948
00:34:13,520 –> 00:34:17,240
the utility of this analytics and simple
949
00:34:15,760 –> 00:34:19,440
examples like when we think about
950
00:34:17,240 –> 00:34:21,240
marketing if the AI can go off and
951
00:34:19,440 –> 00:34:24,960
learn in a way that is fundamentally
952
00:34:21,240 –> 00:34:27,040
unhuman because it’s not human right and
953
00:34:24,960 –> 00:34:28,280
even the word ethics is so interesting
954
00:34:27,040 –> 00:34:32,320
right whose ethics
955
00:34:28,280 –> 00:34:35,599
mine yours or in this company in this
956
00:34:32,320 –> 00:34:37,520
country in this part of the world ethics
957
00:34:35,599 –> 00:34:39,879
are a fungible thing and we are quick to
958
00:34:37,520 –> 00:34:42,440
forgive humans for having different
959
00:34:39,879 –> 00:34:44,240
ethical values we will not forgive
960
00:34:42,440 –> 00:34:47,159
technology for having different ethical
961
00:34:44,240 –> 00:34:49,639
values and so that’s a
962
00:34:47,159 –> 00:34:51,159
philosophical question right we’re now
963
00:34:49,639 –> 00:34:52,520
approaching because we’re getting so
964
00:34:51,159 –> 00:34:55,000
close to the domain of what was
965
00:34:52,520 –> 00:34:57,160
previously only human business to
966
00:34:55,000 –> 00:34:59,400
determine we are now beginning to give
967
00:34:57,160 –> 00:35:01,000
some of those things to the technology
968
00:34:59,400 –> 00:35:02,680
right whether that’s predicting
969
00:35:01,000 –> 00:35:05,960
financial analysis all the way to
970
00:35:02,680 –> 00:35:07,960
deciding who do we Market to right this
971
00:35:05,960 –> 00:35:09,599
notion of human in the loop is not going
972
00:35:07,960 –> 00:35:13,160
to become less important it’s going to
973
00:35:09,599 –> 00:35:16,560
become more important it definitely will
974
00:35:13,160 –> 00:35:20,160
because maybe it does not have
975
00:35:16,560 –> 00:35:22,440
ethics but the repercussions the
976
00:35:20,160 –> 00:35:25,520
consequences of
977
00:35:22,440 –> 00:35:29,280
the algorithm I don’t know giving
978
00:35:25,520 –> 00:35:32,599
that kind of score um instead of another
979
00:35:29,280 –> 00:35:35,200
may have fundamental um you know
980
00:35:32,599 –> 00:35:36,800
consequences for a person of course
981
00:35:35,200 –> 00:35:39,040
and you know I mean the
982
00:35:36,800 –> 00:35:40,640
the standard use case is
983
00:35:39,040 –> 00:35:43,800
you’re teaching an autonomous car what
984
00:35:40,640 –> 00:35:45,640
to do in every scenario and it’s
985
00:35:43,800 –> 00:35:48,400
about to have an accident and
986
00:35:45,640 –> 00:35:49,720
there’s one group of
987
00:35:48,400 –> 00:35:52,760
people and there’s a young child and
988
00:35:49,720 –> 00:35:54,640
there’s one group of older people and how
989
00:35:52,760 –> 00:35:57,040
does it pick right and this goes to
990
00:35:54,640 –> 00:35:58,680
the heart of what is an ethical human
991
00:35:57,040 –> 00:36:01,880
determination
992
00:35:58,680 –> 00:36:04,599
at some point you have to allow the systems
993
00:36:01,880 –> 00:36:08,079
and the models to do something that only
994
00:36:04,599 –> 00:36:09,640
a human historically has decided um and
995
00:36:08,079 –> 00:36:11,839
I think that will be very hard for us
996
00:36:09,640 –> 00:36:12,800
all to deal with right especially when
997
00:36:11,839 –> 00:36:14,599
you think about the history of
998
00:36:12,800 –> 00:36:16,160
analytics it was primarily sort of
999
00:36:14,599 –> 00:36:17,359
descriptive and diagnostic right
1000
00:36:16,160 –> 00:36:19,920
essentially was looking in a rearview
1001
00:36:17,359 –> 00:36:21,800
mirror showing us what has happened so
1002
00:36:19,920 –> 00:36:23,760
we as humans can make a determination as
1003
00:36:21,800 –> 00:36:25,240
to what to do in the future now it’s
1004
00:36:23,760 –> 00:36:26,560
becoming predictive right so it’s
1005
00:36:25,240 –> 00:36:28,160
beginning to tell us hey here’s what I
1006
00:36:26,560 –> 00:36:30,319
think will happen in the future
1007
00:36:28,160 –> 00:36:33,119
or it’s even prescriptive which is here
1008
00:36:30,319 –> 00:36:35,240
is what you should do I as a Systems
1009
00:36:33,119 –> 00:36:38,119
Technology I’m going to tell you as a
1010
00:36:35,240 –> 00:36:40,640
human being what to do and that is a
1011
00:36:38,119 –> 00:36:43,359
very very different relationship right
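To pin down the distinction, here is a toy sketch with invented numbers of the three analytics modes just described — descriptive, predictive, and prescriptive:

```python
# Made-up weekly sales figures, purely for illustration.
sales = [100, 110, 125, 138, 150]

# Descriptive / diagnostic: the rearview mirror -- what happened and why.
average = sum(sales) / len(sales)
growth = [(b - a) / a for a, b in zip(sales, sales[1:])]

# Predictive: a naive trend extrapolation of what might happen next.
avg_growth = sum(growth) / len(growth)
forecast = sales[-1] * (1 + avg_growth)

# Prescriptive: the system itself recommends an action to a human.
action = "increase inventory" if forecast > sales[-1] else "hold inventory"
print(f"average={average:.1f} forecast={forecast:.1f} -> {action}")
```

The last line is the "very different relationship": the output is no longer a number to interpret but an instruction to accept or reject.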
1012
00:36:40,640 –> 00:36:44,760
um so this notion of what is technology
1013
00:36:43,359 –> 00:36:45,839
and what is human and what is human
1014
00:36:44,760 –> 00:36:48,359
determinant what is technology
1015
00:36:45,839 –> 00:36:50,280
determinant is becoming you know all
1016
00:36:48,359 –> 00:36:51,560
intertwined and I think it’s going to make
1017
00:36:50,280 –> 00:36:53,640
a lot of people it has made a lot of
1018
00:36:51,560 –> 00:36:55,599
people very uncomfortable right and you
1019
00:36:53,640 –> 00:36:58,440
know there are questions ranging from oh
1020
00:36:55,599 –> 00:37:01,800
will my job be gone will my industry be
1021
00:36:58,440 –> 00:37:04,200
gone you know am I teaching my
1022
00:37:01,800 –> 00:37:05,440
children what
1023
00:37:04,200 –> 00:37:06,839
their world is going to be like because
1024
00:37:05,440 –> 00:37:09,079
it’s going to be so fundamentally
1025
00:37:06,839 –> 00:37:11,839
different from ours in terms of how they
1026
00:37:09,079 –> 00:37:13,760
learn what they learn why they learn
1027
00:37:11,839 –> 00:37:16,359
whether you are teaching them or
1028
00:37:13,760 –> 00:37:18,720
they are teaching you and ideally it
1029
00:37:16,359 –> 00:37:20,760
should be both right because I have a
1030
00:37:18,720 –> 00:37:22,560
12-year-old and a 15-year-old and I’m learning
1031
00:37:20,760 –> 00:37:24,880
from them constantly right because they
1032
00:37:22,560 –> 00:37:26,560
fundamentally operate differently um
1033
00:37:24,880 –> 00:37:27,720
because they are so used to having this
1034
00:37:26,560 –> 00:37:29,119
infinite amount of knowledge remember
1035
00:37:27,720 –> 00:37:31,040
when I was growing up maybe this dates
1036
00:37:29,119 –> 00:37:33,240
me but we had to go to an encyclopedia
1037
00:37:31,040 –> 00:37:35,000
or a library to gain knowledge right it
1038
00:37:33,240 –> 00:37:36,240
was the early days of the internet and
1039
00:37:35,000 –> 00:37:38,440
it was so exciting when you could
1040
00:37:36,240 –> 00:37:41,440
actually access something um through the
1041
00:37:38,440 –> 00:37:43,040
web this next generation even the
1042
00:37:41,440 –> 00:37:45,000
one that’s sort of you know in the
1043
00:37:43,040 –> 00:37:48,520
earliest stage of their career they are
1044
00:37:45,000 –> 00:37:51,760
used to having Limitless access right
1045
00:37:48,520 –> 00:37:54,160
and so for them it’s instant
1046
00:37:51,760 –> 00:37:55,560
limitless so it’s the analytics
1047
00:37:54,160 –> 00:37:57,640
that is becoming everything right
1048
00:37:55,560 –> 00:38:00,000
because Gathering the data itself when
1049
00:37:57,640 –> 00:38:01,119
you think about jobs from 25 years ago
1050
00:38:00,000 –> 00:38:03,760
and you were coming out of college you
1051
00:38:01,119 –> 00:38:06,160
would do analyst jobs right you know
1052
00:38:03,760 –> 00:38:07,560
especially if you were you know a smart white
1053
00:38:06,160 –> 00:38:10,440
collar worker interested in doing
1054
00:38:07,560 –> 00:38:12,200
things in technology or finance or CPG
1055
00:38:10,440 –> 00:38:13,480
you would go and be an analyst and the
1056
00:38:12,200 –> 00:38:14,720
cool thing about being an analyst was
1057
00:38:13,480 –> 00:38:15,560
not only did you learn how to put
1058
00:38:14,720 –> 00:38:16,720
together this information but you
1059
00:38:15,560 –> 00:38:19,079
learned about the companies and the
1060
00:38:16,720 –> 00:38:21,040
industries while you were doing it now
1061
00:38:19,079 –> 00:38:23,520
we don’t need people necessarily to do a
1062
00:38:21,040 –> 00:38:25,560
lot of that right I mean that data can
1063
00:38:23,520 –> 00:38:27,920
be gathered and analyzed by a well
1064
00:38:25,560 –> 00:38:29,280
constructed system but how it is
1065
00:38:27,920 –> 00:38:30,800
utilized at a company how those
1066
00:38:29,280 –> 00:38:33,000
decisions are made have to be human
1067
00:38:30,800 –> 00:38:35,800
determinant and again if a human is not
1068
00:38:33,000 –> 00:38:38,119
in the loop during the creation of that
1069
00:38:35,800 –> 00:38:40,160
data during the management of that data
1070
00:38:38,119 –> 00:38:42,040
during the creation of the analytics and
1071
00:38:40,160 –> 00:38:44,440
applications and tools that are deciding
1072
00:38:42,040 –> 00:38:46,079
the outputs then something very strange
1073
00:38:44,440 –> 00:38:47,359
is going to start to happen right
1074
00:38:46,079 –> 00:38:50,000
the system is going to make determinants
1075
00:38:47,359 –> 00:38:53,040
of its own and we don’t want that
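One minimal sketch of keeping a human in the loop (hypothetical names, not a production framework): the model may propose, but a high-stakes decision is routed to a person before anything is acted on.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    subject: str
    score: float        # a model output, e.g. a risk or credit score
    high_stakes: bool   # does this materially affect a person?

def act(decision: Decision, human_review) -> str:
    if decision.high_stakes:
        # Route to a person; the system never decides alone here.
        return human_review(decision)
    return f"auto-approved {decision.subject} (score={decision.score:.2f})"

# A reviewer callback stands in for a real review queue.
print(act(Decision("loan application", 0.41, high_stakes=True),
          human_review=lambda d: f"escalated {d.subject} to an analyst"))
```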
1076
00:38:50,000 –> 00:38:56,640
yeah and yet so many people start
1077
00:38:53,040 –> 00:38:59,520
relying on those systems uh for advice
1078
00:38:56,640 –> 00:39:02,160
sometimes um you know which can
1079
00:38:59,520 –> 00:39:03,960
have dire consequences yeah no I mean
1080
00:39:02,160 –> 00:39:06,040
look that’s not new right I mean
1081
00:39:03,960 –> 00:39:07,680
when you’re in a plane today and a pilot
1082
00:39:06,040 –> 00:39:10,160
is relying on systems that are highly
1083
00:39:07,680 –> 00:39:11,720
electronic to keep you alive and safe
1084
00:39:10,160 –> 00:39:15,280
and on your way to you know what would
1085
00:39:11,720 –> 00:39:16,520
be miraculous 50 years ago we have
1086
00:39:15,280 –> 00:39:19,640
gotten pretty used to doing that right
1087
00:39:16,520 –> 00:39:21,400
we do that every day how often do
1088
00:39:19,640 –> 00:39:24,079
we touch our phones right every few
1089
00:39:21,400 –> 00:39:26,440
seconds um we’re constantly relying on
1090
00:39:24,079 –> 00:39:27,960
technology derived information to
1091
00:39:26,440 –> 00:39:29,800
determine where we’re going how we’re
1092
00:39:27,960 –> 00:39:31,599
going what to tell the car what music to
1093
00:39:29,800 –> 00:39:33,040
play what to tell our children how to
1094
00:39:31,599 –> 00:39:34,720
order food like we’re doing it
1095
00:39:33,040 –> 00:39:37,119
constantly and it’s only going to get
1096
00:39:34,720 –> 00:39:38,720
more and more intense right and granted
1097
00:39:37,119 –> 00:39:40,440
that’s not for everyone I grew up in
1098
00:39:38,720 –> 00:39:41,880
India and you know we still have a lot
1099
00:39:40,440 –> 00:39:44,440
of people living under the poverty line
1100
00:39:41,880 –> 00:39:46,480
but most of them have a cell phone
1101
00:39:44,440 –> 00:39:48,920
because that is access to knowledge
1102
00:39:46,480 –> 00:39:51,240
right fundamental access to knowledge so
1103
00:39:48,920 –> 00:39:53,000
it spans the gamut of the human
1104
00:39:51,240 –> 00:39:55,160
population at this point in terms of how
1105
00:39:53,000 –> 00:39:56,599
they access information it is beginning to
1106
00:39:55,160 –> 00:39:58,800
become uniform in the sense that
1107
00:39:56,599 –> 00:40:00,480
it’s not limited to the rich or the
1108
00:39:58,800 –> 00:40:02,839
powerful or people in a particular
1109
00:40:00,480 –> 00:40:04,240
country a lot of people have access to a
1110
00:40:02,839 –> 00:40:06,359
lot of information as you pointed out
1111
00:40:04,240 –> 00:40:08,280
nearly Limitless right the question now
1112
00:40:06,359 –> 00:40:10,560
becomes how do we still give them
1113
00:40:08,280 –> 00:40:13,359
things to do that matter right and what
1114
00:40:10,560 –> 00:40:16,359
are those things that are truly human
1115
00:40:13,359 –> 00:40:17,960
only versus what are the things and even
1116
00:40:16,359 –> 00:40:20,280
when I say human only it’s always
1117
00:40:17,960 –> 00:40:22,040
enabled by technology right but what are
1118
00:40:20,280 –> 00:40:24,680
the decisions that specifically should
1119
00:40:22,040 –> 00:40:26,280
not be made by machines and that again
1120
00:40:24,680 –> 00:40:28,520
begins to become there are some that are
1121
00:40:26,280 –> 00:40:29,680
very clear very black and white very easy to
1122
00:40:28,520 –> 00:40:31,640
understand some that are much more
1123
00:40:29,680 –> 00:40:33,480
amorphous much more gray much more
1124
00:40:31,640 –> 00:40:35,720
philosophical and that’s where as you
1125
00:40:33,480 –> 00:40:38,359
pointed out governments are talking you
1126
00:40:35,720 –> 00:40:39,599
know NGOs are talking uh private
1127
00:40:38,359 –> 00:40:42,680
corporations are talking public
1128
00:40:39,599 –> 00:40:45,200
corporations are talking um watchdog
1129
00:40:42,680 –> 00:40:46,880
organizations are keeping an eye out to
1130
00:40:45,200 –> 00:40:48,480
make sure they’re thinking about how
1131
00:40:46,880 –> 00:40:51,400
legislation is created how we’re
1132
00:40:48,480 –> 00:40:54,720
enforcing it but it’s a whole new
1133
00:40:51,400 –> 00:40:57,079
world of uh governance and again I
1134
00:40:54,720 –> 00:40:59,319
think we have to be very careful to use
1135
00:40:57,079 –> 00:41:02,240
that word in a positive sense right
1136
00:40:59,319 –> 00:41:04,720
because it requires good people to work
1137
00:41:02,240 –> 00:41:06,319
hard on things that matter uh for us to
1138
00:41:04,720 –> 00:41:08,160
be a better society but I
1139
00:41:06,319 –> 00:41:10,200
fundamentally believe technology can do
1140
00:41:08,160 –> 00:41:13,040
that if we all sort of do the right
1141
00:41:10,200 –> 00:41:16,359
thing yeah it’s just a tool we’ve
1142
00:41:13,040 –> 00:41:18,800
created and it’s supposed to serve us but
1143
00:41:16,359 –> 00:41:22,839
what worries me is
1144
00:41:18,800 –> 00:41:26,400
that in a way those technologists those
1145
00:41:22,839 –> 00:41:29,920
creators um are trying to design systems
1146
00:41:26,400 –> 00:41:34,640
which will do the thinking for us
1147
00:41:29,920 –> 00:41:37,640
to too wide an extent meaning we
1148
00:41:34,640 –> 00:41:40,839
will become lazy we can
1149
00:41:37,640 –> 00:41:43,280
become less creative because yeah all
1150
00:41:40,839 –> 00:41:46,960
the choices will be limited and just
1151
00:41:43,280 –> 00:41:49,040
given to us uh to consume
1152
00:41:46,960 –> 00:41:50,520
um you know there are these two great
1153
00:41:49,040 –> 00:41:52,359
movies that illustrate this point there
1154
00:41:50,520 –> 00:41:53,880
was one made about 15 years ago now
1155
00:41:52,359 –> 00:41:55,240
called Idiocracy where someone comes
1156
00:41:53,880 –> 00:41:56,680
from the past into the future and
1157
00:41:55,240 –> 00:41:59,560
everyone’s kind of an idiot because they’re
1158
00:41:56,680 –> 00:42:01,040
now relying on the world and Technology
1159
00:41:59,560 –> 00:42:02,520
giving them what they need the other
1160
00:42:01,040 –> 00:42:05,359
movie which I think is fascinating for
1161
00:42:02,520 –> 00:42:06,880
this is WALL-E the story about the robot
1162
00:42:05,359 –> 00:42:09,359
all the humans are sitting on like
1163
00:42:06,880 –> 00:42:11,200
floating couches and just watching sort
1164
00:42:09,359 –> 00:42:13,480
of media and have no real
1165
00:42:11,200 –> 00:42:15,240
curiosity anymore and look
1166
00:42:13,480 –> 00:42:17,440
there’s lots of dystopian Futures
1167
00:42:15,240 –> 00:42:18,720
possible right where technology takes
1168
00:42:17,440 –> 00:42:21,079
over and does all sorts of strange
1169
00:42:18,720 –> 00:42:22,440
things I mean look in the 50s and 60s we
1170
00:42:21,079 –> 00:42:25,240
were worried that nuclear technology would
1171
00:42:22,440 –> 00:42:27,240
destroy the world right um so I think
1172
00:42:25,240 –> 00:42:28,480
all revolutions are difficult right we
1173
00:42:27,240 –> 00:42:29,920
have to go through a significant period
1174
00:42:28,480 –> 00:42:31,960
of change and understand what this
1175
00:42:29,920 –> 00:42:33,720
change means and that’s why I don’t
1176
00:42:31,960 –> 00:42:35,640
think it’s enough for us to sit back and
1177
00:42:33,720 –> 00:42:38,000
say oh it’ll figure itself out right
1178
00:42:35,640 –> 00:42:39,760
this is where people um who have
1179
00:42:38,000 –> 00:42:41,720
curiosity and Innovation and care about
1180
00:42:39,760 –> 00:42:43,280
the world have to be sort of aggressive
1181
00:42:41,720 –> 00:42:45,040
about saying we are interested in
1182
00:42:43,280 –> 00:42:47,319
determining what our future will look
1183
00:42:45,040 –> 00:42:48,880
like and I will tell you again you know
1184
00:42:47,319 –> 00:42:50,480
my parents’ generation our generation
1185
00:42:48,880 –> 00:42:53,359
we’ve had it pretty good right since the
1186
00:42:50,480 –> 00:42:55,240
second world war through till now while
1187
00:42:53,359 –> 00:42:57,480
the world has gone through a lot of pain
1188
00:42:55,240 –> 00:43:00,319
less global pain than the
1189
00:42:57,480 –> 00:43:01,960
generations that preceded it and now we
1190
00:43:00,319 –> 00:43:04,160
have scenarios where you know climate’s
1191
00:43:01,960 –> 00:43:05,839
a very real concern and should be a
1192
00:43:04,160 –> 00:43:07,440
highly real concern for us I do some
1193
00:43:05,839 –> 00:43:10,440
work in that space too which I think is
1194
00:43:07,440 –> 00:43:12,400
very very important um and then
1195
00:43:10,440 –> 00:43:14,559
there’s this technology Revolution that
1196
00:43:12,400 –> 00:43:17,160
is changing everything about how the
1197
00:43:14,559 –> 00:43:18,760
haves and the have-nots live right and
1198
00:43:17,160 –> 00:43:19,880
again there’s lots of social aspects
1199
00:43:18,760 –> 00:43:21,280
you know you could spend an hour just
1200
00:43:19,880 –> 00:43:24,000
talking about the social sort of
1201
00:43:21,280 –> 00:43:25,920
consequences of what this does in terms
1202
00:43:24,000 –> 00:43:28,640
of the creation of wealth in certain
1203
00:43:25,920 –> 00:43:30,680
demographics versus others and
1204
00:43:28,640 –> 00:43:31,880
sort of the geopolitical
1205
00:43:30,680 –> 00:43:33,680
implications of what it is that we’re
1206
00:43:31,880 –> 00:43:34,800
going through but I think in the end we
1207
00:43:33,680 –> 00:43:37,440
have to think about what we can do
1208
00:43:34,800 –> 00:43:39,599
personally right and how we can create
1209
00:43:37,440 –> 00:43:42,000
culture in our professional lives in our
1210
00:43:39,599 –> 00:43:44,040
personal lives how we can educate those
1211
00:43:42,000 –> 00:43:46,400
that need education how we can uplift
1212
00:43:44,040 –> 00:43:49,040
people and I fundamentally believe we
1213
00:43:46,400 –> 00:43:51,119
can uplift everyone right I mean we have
1214
00:43:49,040 –> 00:43:52,640
to believe that and it’s not what
1215
00:43:51,119 –> 00:43:55,040
everyone thinks and some people are very
1216
00:43:52,640 –> 00:43:56,839
direct about that you know that they are
1217
00:43:55,040 –> 00:43:58,079
more mercenary about saying you know
1218
00:43:56,839 –> 00:44:00,119
these people can be educated these
1219
00:43:58,079 –> 00:44:01,599
people cannot I don’t believe
1220
00:44:00,119 –> 00:44:02,960
that I think everyone can be educated in
1221
00:44:01,599 –> 00:44:04,440
certain ways now that doesn’t mean I
1222
00:44:02,960 –> 00:44:06,480
don’t think teams have to be changed or
1223
00:44:04,440 –> 00:44:07,760
reorganized I think they do but I think
1224
00:44:06,480 –> 00:44:09,480
we can do it in a way that is
1225
00:44:07,760 –> 00:44:11,240
fundamentally human especially when we
1226
00:44:09,480 –> 00:44:12,839
are dealing with situations that are
1227
00:44:11,240 –> 00:44:16,200
very very technology driven to your
1228
00:44:12,839 –> 00:44:18,119
point we have to create a safe space for
1229
00:44:16,200 –> 00:44:20,160
that sort of the human permeating the
1230
00:44:18,119 –> 00:44:22,000
technology to allow the technology to be
1231
00:44:20,160 –> 00:44:23,640
utilized to its full potential in a way
1232
00:44:22,000 –> 00:44:26,599
that is safe for us as a
1233
00:44:23,640 –> 00:44:29,599
people and we can do it finally at scale
1234
00:44:26,599 –> 00:44:31,599
because it can be personalized of course
1235
00:44:29,599 –> 00:44:34,559
this whole yeah I mean the notion
1236
00:44:31,599 –> 00:44:36,920
of this many-to-many interface
1237
00:44:34,559 –> 00:44:39,319
in the world today exists right I
1238
00:44:36,920 –> 00:44:40,920
mean the internet exists and um it’s not
1239
00:44:39,319 –> 00:44:42,760
just when I say the internet I’m
1240
00:44:40,920 –> 00:44:45,400
using a broad term for it it’s all the
1241
00:44:42,760 –> 00:44:48,319
ways that we sort of relate to
1242
00:44:45,400 –> 00:44:50,160
technology right I remember you know
1243
00:44:48,319 –> 00:44:51,960
reading early books or watching early
1244
00:44:50,160 –> 00:44:54,119
movies about the notion of the AI in
1245
00:44:51,960 –> 00:44:56,040
your ear right that one day that’s all
1246
00:44:54,119 –> 00:44:57,880
you will need you’ll be talking
1247
00:44:56,040 –> 00:44:58,680
to something and
1248
00:44:57,880 –> 00:45:02,319
um
1249
00:44:58,680 –> 00:45:04,440
and the system the many
1250
00:45:02,319 –> 00:45:05,720
words for it Avatar Etc is going to tell
1251
00:45:04,440 –> 00:45:07,040
you what you need to know right you
1252
00:45:05,720 –> 00:45:08,559
don’t have to access anything you don’t
1253
00:45:07,040 –> 00:45:10,680
have to type anything you will just have
1254
00:45:08,559 –> 00:45:13,200
to ask or think here’s what I need to
1255
00:45:10,680 –> 00:45:14,839
know and it’s available to me so then
1256
00:45:13,200 –> 00:45:17,640
what are we doing sort of what are we
1257
00:45:14,839 –> 00:45:19,920
responsible for what are we and
1258
00:45:17,640 –> 00:45:21,599
again philosophical and social
1259
00:45:19,920 –> 00:45:23,319
questions aside about who controls the
1260
00:45:21,599 –> 00:45:25,000
machines right I mean those are very
1261
00:45:23,319 –> 00:45:27,160
real questions is it going to be the
1262
00:45:25,000 –> 00:45:30,480
technologists who become sort of you
1263
00:45:27,160 –> 00:45:33,079
know not to use a poorly used
1264
00:45:30,480 –> 00:45:34,960
word in Indian history but sort of
1265
00:45:33,079 –> 00:45:37,160
a caste of technologists who will control
1266
00:45:34,960 –> 00:45:38,800
the future to some degree that’s
1267
00:45:37,160 –> 00:45:40,520
happened right I mean when you think
1268
00:45:38,800 –> 00:45:41,920
about companies that are changing the
1269
00:45:40,520 –> 00:45:43,200
world today it is primarily
1270
00:45:41,920 –> 00:45:46,079
technology companies that are creating
1271
00:45:43,200 –> 00:45:48,000
the most impact right yes um and they are
1272
00:45:46,079 –> 00:45:49,400
impacting every industry and you know
1273
00:45:48,000 –> 00:45:51,480
I remember when I first came into
1274
00:45:49,400 –> 00:45:54,440
my first set of CDAO roles you know it
1275
00:45:51,480 –> 00:45:57,800
used to be digital media and advertising
1276
00:45:54,440 –> 00:46:00,640
and marketing and um technology and
1277
00:45:57,800 –> 00:46:02,359
uh and even CPG retail to some degree
1278
00:46:00,640 –> 00:46:04,040
that were kind of you know doing this
1279
00:46:02,359 –> 00:46:05,119
now it’s everyone right it doesn’t
1280
00:46:04,040 –> 00:46:07,440
matter whether you’re manufacturing
1281
00:46:05,119 –> 00:46:09,440
something or mining iron ore you are
1282
00:46:07,440 –> 00:46:10,800
using technology in a fundamentally
1283
00:46:09,440 –> 00:46:13,760
different way than you used five years
1284
00:46:10,800 –> 00:46:15,119
ago or 10 years ago it is becoming I
1285
00:46:13,760 –> 00:46:16,359
hesitate to say that every company is a
1286
00:46:15,119 –> 00:46:18,559
technology company every company is not
1287
00:46:16,359 –> 00:46:20,680
a technology company but every company
1288
00:46:18,559 –> 00:46:22,720
utilizes technology and digital
1289
00:46:20,680 –> 00:46:24,079
technology or they should be otherwise
1290
00:46:22,720 –> 00:46:26,400
they’re going to fall behind their
1291
00:46:24,079 –> 00:46:27,880
competitors in the space so um and
1292
00:46:26,400 –> 00:46:29,119
governments are doing the same thing
1293
00:46:27,880 –> 00:46:32,720
educational institutions are doing the
1294
00:46:29,119 –> 00:46:34,559
same thing and they have to right I
1295
00:46:32,720 –> 00:46:35,960
think a lot about obviously as any
1296
00:46:34,559 –> 00:46:37,440
parent does about how do you
1297
00:46:35,960 –> 00:46:41,720
educate your children what do you tell
1298
00:46:37,440 –> 00:46:43,480
them to do and for me it’s that I
1299
00:46:41,720 –> 00:46:46,200
have strongly focused with my children
1300
00:46:43,480 –> 00:46:48,040
on how to understand how people operate
1301
00:46:46,200 –> 00:46:49,680
right like what does that look like your
1302
00:46:48,040 –> 00:46:51,720
friends your organizations the things
1303
00:46:49,680 –> 00:46:53,440
you compete in your Athletics it’s about
1304
00:46:51,720 –> 00:46:55,319
people so think about that and the
1305
00:46:53,440 –> 00:46:57,839
second part is not just how do you
1306
00:46:55,319 –> 00:47:00,200
access knowledge anymore but how are you
1307
00:46:57,839 –> 00:47:02,640
analyzing information right for some
1308
00:47:00,200 –> 00:47:03,920
utility because you know I
1309
00:47:02,640 –> 00:47:05,920
remember I grew up in India where
1310
00:47:03,920 –> 00:47:07,760
in school we had to learn 100,000
1311
00:47:05,920 –> 00:47:09,240
formulas for physics and 100,000 formulas
1312
00:47:07,760 –> 00:47:10,800
for math you were not allowed to use a
1313
00:47:09,240 –> 00:47:12,720
calculator and I’ll tell you it gave you
1314
00:47:10,800 –> 00:47:14,319
a great basis for being a good scientist
1315
00:47:12,720 –> 00:47:16,160
or a good engineer but in a lot of ways
1316
00:47:14,319 –> 00:47:17,960
it didn’t teach you how to survive in an
1317
00:47:16,160 –> 00:47:18,960
environment where the human connectivity
1318
00:47:17,960 –> 00:47:21,079
is going to become increasingly
1319
00:47:18,960 –> 00:47:22,440
important yeah and as you grow in your
1320
00:47:21,079 –> 00:47:25,160
career you will find that you’ll be
1321
00:47:22,440 –> 00:47:26,920
spending more time with people not less
1322
00:47:25,160 –> 00:47:28,480
right maybe fewer people and maybe
1323
00:47:26,920 –> 00:47:31,119
people who make more decisions but in
1324
00:47:28,480 –> 00:47:33,440
the end that human interaction becomes
1325
00:47:31,119 –> 00:47:35,800
as much part of your job as it was
1326
00:47:33,440 –> 00:47:38,079
earlier in your career just a different
1327
00:47:35,800 –> 00:47:41,839
kind
1328
00:47:38,079 –> 00:47:44,920
I read the other day I don’t know if
1329
00:47:41,839 –> 00:47:48,760
it is already
1330
00:47:44,920 –> 00:47:50,280
legislation or just a project but
1331
00:47:48,760 –> 00:47:55,599
um there was
1332
00:47:50,280 –> 00:47:58,200
an idea to limit um
1333
00:47:55,599 –> 00:48:03,160
consumption of social media and
1334
00:47:58,200 –> 00:48:06,480
protect youth um yeah because even I
1335
00:48:03,160 –> 00:48:09,520
remember my childhood without a phone
1336
00:48:06,480 –> 00:48:12,160
without being constantly connected it is
1337
00:48:09,520 –> 00:48:14,680
an addiction um and what you see now I
1338
00:48:12,160 –> 00:48:18,440
don’t know how you do it with your
1339
00:48:14,680 –> 00:48:21,040
children but um it feels like they are
1340
00:48:18,440 –> 00:48:24,520
being sucked into it and they are
1341
00:48:21,040 –> 00:48:27,599
too young to understand um how to fight
1342
00:48:24,520 –> 00:48:29,280
with it how to use it responsibly
1343
00:48:27,599 –> 00:48:33,319
yeah look I think this is an impossible
1344
00:48:29,280 –> 00:48:35,800
problem to solve right it is um it is a
1345
00:48:33,319 –> 00:48:37,559
moment of dramatic change it is a moment
1346
00:48:35,800 –> 00:48:39,160
when you’re absolutely right I think
1347
00:48:37,559 –> 00:48:41,640
social media has overwhelmed a lot of
1348
00:48:39,160 –> 00:48:44,000
children and um I’ve certainly seen it
1349
00:48:41,640 –> 00:48:46,640
in my own children um and we’ve tried as
1350
00:48:44,000 –> 00:48:48,200
parents to make the best trade-offs
1351
00:48:46,640 –> 00:48:50,240
possible but we’ve made mistakes along
1352
00:48:48,200 –> 00:48:52,040
the way I mean I think we all do and
1353
00:48:50,240 –> 00:48:54,799
we’ve had to optimize and I think that
1354
00:48:52,040 –> 00:48:57,079
notion of experimenting and optimizing
1355
00:48:54,799 –> 00:48:59,160
is going to be the way right and there
1356
00:48:57,079 –> 00:49:02,760
will be experiments in States in
1357
00:48:59,160 –> 00:49:04,680
countries in continents um around how
1358
00:49:02,760 –> 00:49:05,799
to manage this right I mean China does
1359
00:49:04,680 –> 00:49:07,440
it very different from Europe Europe
1360
00:49:05,799 –> 00:49:09,040
does it very different from the US even
1361
00:49:07,440 –> 00:49:10,799
in the US there are states that do very
1362
00:49:09,040 –> 00:49:12,520
fundamentally different from each other
1363
00:49:10,799 –> 00:49:14,000
because of you know political views or
1364
00:49:12,520 –> 00:49:16,880
personal views or the views of the
1365
00:49:14,000 –> 00:49:18,400
legislators of that state and we’re
1366
00:49:16,880 –> 00:49:19,799
going to figure it out one way or the
1367
00:49:18,400 –> 00:49:20,960
other and there will be some mistakes
1368
00:49:19,799 –> 00:49:22,119
along the way there’ll probably be a lot
1369
00:49:20,960 –> 00:49:24,559
of mistakes along the way but that’s
1370
00:49:22,119 –> 00:49:25,799
true with any new technology right um
1371
00:49:24,559 –> 00:49:28,079
but as you pointed out this is a
1372
00:49:25,799 –> 00:49:30,440
technology that’s fundamentally
1373
00:49:28,079 –> 00:49:33,480
reshaping sort of the human world in a
1374
00:49:30,440 –> 00:49:35,280
very very real way and uh as with
1375
00:49:33,480 –> 00:49:37,079
anything like that I think we have to be
1376
00:49:35,280 –> 00:49:39,119
cautious and we have to be careful and
1377
00:49:37,079 –> 00:49:41,200
we have to do the work the
1378
00:49:39,119 –> 00:49:42,880
problem is in this particular scenario
1379
00:49:41,200 –> 00:49:45,880
if we wait too long to do that it’s
1380
00:49:42,880 –> 00:49:47,240
already evolved to the next stage um and
1381
00:49:45,880 –> 00:49:50,480
you know our children have a lot of
1382
00:49:47,240 –> 00:49:52,920
access um in the US there’s a
1383
00:49:50,480 –> 00:49:54,319
ton of privilege um and
1384
00:49:52,920 –> 00:49:55,760
you’re beginning to see the differential
1385
00:49:54,319 –> 00:49:57,720
between folks who have access to more
1386
00:49:55,760 –> 00:50:00,079
knowledge earlier on and folks who do
1387
00:49:57,720 –> 00:50:01,920
not and you know as a technologist
1388
00:50:00,079 –> 00:50:03,400
myself I had to really struggle early on
1389
00:50:01,920 –> 00:50:05,760
with not allowing my children to have
1390
00:50:03,400 –> 00:50:06,880
access to this technology and you know I
1391
00:50:05,760 –> 00:50:08,839
made some mistakes along the way gave
1392
00:50:06,880 –> 00:50:12,200
them access too early pulled it back
1393
00:50:08,839 –> 00:50:15,040
right and then we had COVID and for two
1394
00:50:12,200 –> 00:50:17,520
years all the rules we built around not
1395
00:50:15,040 –> 00:50:19,440
giving our children access were
1396
00:50:17,520 –> 00:50:20,440
eliminated because it was the only way
1397
00:50:19,440 –> 00:50:22,640
they could communicate with their
1398
00:50:20,440 –> 00:50:24,520
friends and their family and we had no
1399
00:50:22,640 –> 00:50:27,319
choice right and children came out of
1400
00:50:24,520 –> 00:50:28,720
that changed I saw it we all saw it yeah
1401
00:50:27,319 –> 00:50:30,440
um and one of the nice things was
1402
00:50:28,720 –> 00:50:32,040
they came out sort of hungry for
1403
00:50:30,440 –> 00:50:35,079
human connection and contact and that
1404
00:50:32,040 –> 00:50:36,599
was great but also more dependent on
1405
00:50:35,079 –> 00:50:38,000
digital and Technology than they
1406
00:50:36,599 –> 00:50:41,079
probably would have been if we didn’t go
1407
00:50:38,000 –> 00:50:43,040
through that terrible event right so um
1408
00:50:41,079 –> 00:50:44,359
I think that accelerated many things as
1409
00:50:43,040 –> 00:50:46,280
we talked about at the beginning of this
1410
00:50:44,359 –> 00:50:48,240
conversation and it changed the way
1411
00:50:46,280 –> 00:50:50,440
people behave some of those behaviors
1412
00:50:48,240 –> 00:50:52,040
are going back some of them will not but
1413
00:50:50,440 –> 00:50:54,000
you know this is like Pandora’s Box
1414
00:50:52,040 –> 00:50:55,760
there’s no putting this back in it’s a
1415
00:50:54,000 –> 00:50:57,920
question of how do we manage it now that
1416
00:50:55,760 –> 00:50:59,920
it’s out in a way that is best for
1417
00:50:57,920 –> 00:51:01,559
society again I’ll go back to that
1418
00:50:59,920 –> 00:51:03,319
best for society thing because I think
1419
00:51:01,559 –> 00:51:06,200
technologists absolutely have a
1420
00:51:03,319 –> 00:51:09,079
responsibility to be thinking about
1421
00:51:06,200 –> 00:51:13,680
that yeah and right now with the
1422
00:51:09,079 –> 00:51:17,079
whole personalization and the real time
1423
00:51:13,680 –> 00:51:20,359
um you know context and information
1424
00:51:17,079 –> 00:51:22,559
um Gathering and insights it’s even like
1425
00:51:20,359 –> 00:51:27,160
in terms of if we are thinking about the
1426
00:51:22,559 –> 00:51:30,319
aspect of education um it’s going
1427
00:51:27,160 –> 00:51:35,160
to be
1428
00:51:30,319 –> 00:51:39,280
more um present in not only children’s
1429
00:51:35,160 –> 00:51:41,000
but particularly children’s um
1430
00:51:39,280 –> 00:51:43,839
environment
1431
00:51:41,000 –> 00:51:48,640
so I guess
1432
00:51:43,839 –> 00:51:52,760
yes like we need human um
1433
00:51:48,640 –> 00:51:54,160
relationships and being
1434
00:51:52,760 –> 00:51:58,440
with others
1435
00:51:54,160 –> 00:52:00,640
but it will just be much easier to
1436
00:51:58,440 –> 00:52:04,040
learn
1437
00:52:00,640 –> 00:52:06,480
and discover new things I guess through
1438
00:52:04,040 –> 00:52:09,440
asking you know what I’m trying to get at
1439
00:52:06,480 –> 00:52:11,319
is assistants AI assistants right like
1440
00:52:09,440 –> 00:52:13,079
the whole new thing I mean if I knew the
1441
00:52:11,319 –> 00:52:15,280
answer to that I’d probably be a lot
1442
00:52:13,079 –> 00:52:16,760
richer than I am today right it’s
1443
00:52:15,280 –> 00:52:18,000
one of those things where I
1444
00:52:16,760 –> 00:52:21,240
don’t know that it’s been figured out
1445
00:52:18,000 –> 00:52:22,760
yet there’s a lot of you know
1446
00:52:21,240 –> 00:52:24,839
but like the Facebooks and the
1447
00:52:22,760 –> 00:52:26,760
Googles and the metas and the Instagrams
1448
00:52:24,839 –> 00:52:29,040
and the WhatsApps there will be new variants
1449
00:52:26,760 –> 00:52:31,799
of these new tools right and maybe one
1450
00:52:29,040 –> 00:52:34,119
of them one day will be sort of the you
1451
00:52:31,799 –> 00:52:35,640
know the information god in your
1452
00:52:34,119 –> 00:52:37,720
head that tells you everything you need
1453
00:52:35,640 –> 00:52:39,799
to know um I mean look people have been
1454
00:52:37,720 –> 00:52:42,079
trying to do this for decades sort of be
1455
00:52:39,799 –> 00:52:44,400
that one source of information and
1456
00:52:42,079 –> 00:52:45,839
it’s a very powerful thing right and how
1457
00:52:44,400 –> 00:52:47,680
governments will actually react to that
1458
00:52:45,839 –> 00:52:49,160
is going to be interesting to watch and
1459
00:52:47,680 –> 00:52:50,760
there will be more tension I
1460
00:52:49,160 –> 00:52:53,480
fundamentally believe between private
1461
00:52:50,760 –> 00:52:54,400
corporations and governments because um
1462
00:52:53,480 –> 00:52:56,160
you know the incentives are
1463
00:52:54,400 –> 00:52:57,720
fundamentally different in some cases
1464
00:52:56,160 –> 00:52:59,240
of course I mean you’re
1465
00:52:57,720 –> 00:53:01,319
beholden to your shareholders versus
1466
00:52:59,240 –> 00:53:02,799
you’re beholden to your citizenry and
1467
00:53:01,319 –> 00:53:04,680
then are you a citizen of the world how
1468
00:53:02,799 –> 00:53:06,559
do you think about that but in
1469
00:53:04,680 –> 00:53:07,920
the end I’m going to care most about my
1470
00:53:06,559 –> 00:53:09,680
family and the people I love and the
1471
00:53:07,920 –> 00:53:11,960
ones who depend on me so how do you
1472
00:53:09,680 –> 00:53:13,480
manage all those boundaries and look
1473
00:53:11,960 –> 00:53:15,240
that’s the Human Condition right we’ve
1474
00:53:13,480 –> 00:53:17,400
been doing this ever since we were sort
1475
00:53:15,240 –> 00:53:20,160
of you know using rocks to kill animals
1476
00:53:17,400 –> 00:53:22,040
tens of thousands of years ago so um how
1477
00:53:20,160 –> 00:53:23,400
we continue to evolve as Society I think
1478
00:53:22,040 –> 00:53:25,240
will be as you pointed out heavily
1479
00:53:23,400 –> 00:53:28,400
influenced by how technology enables us
1480
00:53:25,240 –> 00:53:31,359
to be but in the end you know if a child
1481
00:53:28,400 –> 00:53:32,640
behaves badly it’s partially the child
1482
00:53:31,359 –> 00:53:34,200
but it’s also the parents and the
1483
00:53:32,640 –> 00:53:36,280
community and the village in which they
1484
00:53:34,200 –> 00:53:37,720
were raised and so again it’s
1485
00:53:36,280 –> 00:53:40,599
about creating as we’ve talked about
1486
00:53:37,720 –> 00:53:42,720
earlier the right sort of
1487
00:53:40,599 –> 00:53:44,119
universally agreed upon ethical
1488
00:53:42,720 –> 00:53:45,640
principles as to how we use these
1489
00:53:44,119 –> 00:53:47,119
Technologies now there will always be
1490
00:53:45,640 –> 00:53:50,280
bad actors there will always be people
1491
00:53:47,119 –> 00:53:52,400
who try to you know gain profit or
1492
00:53:50,280 –> 00:53:53,720
do things in a malicious way
1493
00:53:52,400 –> 00:53:56,440
because it gives them an advantage in
1494
00:53:53,720 –> 00:53:58,559
some ways but I fundamentally am an
1495
00:53:56,440 –> 00:54:00,680
Optimist I believe in the power of The
1496
00:53:58,559 –> 00:54:02,559
Human Condition and I believe that there
1497
00:54:00,680 –> 00:54:04,960
are more people focused on doing good
1498
00:54:02,559 –> 00:54:07,040
for society than not and so it’s
1499
00:54:04,960 –> 00:54:08,240
even more dependent on us now
1500
00:54:07,040 –> 00:54:10,280
especially if we live in one of these
1501
00:54:08,240 –> 00:54:12,680
worlds to be thinking about Frameworks
1502
00:54:10,280 –> 00:54:14,680
and ways and education policies to allow
1503
00:54:12,680 –> 00:54:17,720
the stuff to happen the right way and it
1504
00:54:14,680 –> 00:54:19,440
matters who you vote for it matters um
1505
00:54:17,720 –> 00:54:21,200
how you give feedback to the people
1506
00:54:19,440 –> 00:54:24,079
that you work for it matters how you
1507
00:54:21,200 –> 00:54:25,440
treat the people who work for you um and
1508
00:54:24,079 –> 00:54:27,400
again that’s just things that you teach
1509
00:54:25,440 –> 00:54:29,839
your children every day or should
1510
00:54:27,400 –> 00:54:32,400
right about how to behave in
1511
00:54:29,839 –> 00:54:34,680
society it’s the same thing um I
1512
00:54:32,400 –> 00:54:36,640
don’t think the fundamental rules change
1513
00:54:34,680 –> 00:54:39,359
I think the tools we have and the access
1514
00:54:36,640 –> 00:54:41,559
we have changes but again it comes back
1515
00:54:39,359 –> 00:54:44,280
to what you believe your values and
1516
00:54:41,559 –> 00:54:46,040
principles are right and are we
1517
00:54:44,280 –> 00:54:48,440
more ready to agree on those
1518
00:54:46,040 –> 00:54:50,359
things than disagree and then we can
1519
00:54:48,440 –> 00:54:52,799
find ways to apply them to
1520
00:54:50,359 –> 00:54:56,880
technology and you said that you are
1521
00:54:52,799 –> 00:54:58,920
an optimist like I also do feel it’s
1522
00:54:56,880 –> 00:55:01,440
like mainly good what’s
1523
00:54:58,920 –> 00:55:04,880
happening and the things we are
1524
00:55:01,440 –> 00:55:08,280
creating um but what are the
1525
00:55:04,880 –> 00:55:11,599
particular aspects of technology and
1526
00:55:08,280 –> 00:55:17,160
areas um which you probably advise or
1527
00:55:11,599 –> 00:55:19,920
work on uh which you would love um
1528
00:55:17,160 –> 00:55:21,680
you know for other people to put more
1529
00:55:19,920 –> 00:55:24,319
attention
1530
00:55:21,680 –> 00:55:25,599
to luckily for me luckily or unluckily
1531
00:55:24,319 –> 00:55:26,880
the things that I operate in are getting
1532
00:55:25,599 –> 00:55:28,480
a lot of attention in fact they’re
1533
00:55:26,880 –> 00:55:31,480
getting too much attention some might
1534
00:55:28,480 –> 00:55:33,440
argue right um I mean if you’re in
1535
00:55:31,480 –> 00:55:35,160
the AI world if you’re in the analytics
1536
00:55:33,440 –> 00:55:36,760
world if you’re in the technology world
1537
00:55:35,160 –> 00:55:38,720
it’s all anyone’s talking about today
1538
00:55:36,760 –> 00:55:40,119
right because either in fear because
1539
00:55:38,720 –> 00:55:41,280
they’re worried you know they do
1540
00:55:40,119 –> 00:55:42,720
something very manual and they’re
1541
00:55:41,280 –> 00:55:43,880
worried that that thing might go away
1542
00:55:42,720 –> 00:55:45,720
something that they learned for the last
1543
00:55:43,880 –> 00:55:48,240
30 years 20 years 10 years five years is
1544
00:55:45,720 –> 00:55:49,440
not applicable anymore um and so they’re
1545
00:55:48,240 –> 00:55:51,599
trying to understand what that looks
1546
00:55:49,440 –> 00:55:53,440
like or they are part of this milieu who
1547
00:55:51,599 –> 00:55:55,920
is setting principles and standards and
1548
00:55:53,440 –> 00:55:58,520
stages for companies or governments
1549
00:55:55,920 –> 00:55:59,880
or people or organizations and you
1550
00:55:58,520 –> 00:56:01,880
know we’re getting to make a lot of
1551
00:55:59,880 –> 00:56:03,880
decisions with long-lasting
1552
00:56:01,880 –> 00:56:05,599
consequences but again I think it’s
1553
00:56:03,880 –> 00:56:07,359
very hard for one person to do it
1554
00:56:05,599 –> 00:56:10,240
anymore right it has to be part of an
1555
00:56:07,359 –> 00:56:12,680
ecosystem as to how that is managed um
1556
00:56:10,240 –> 00:56:14,599
and sort of making sure that we can
1557
00:56:12,680 –> 00:56:17,599
continue to build that in the right way
1558
00:56:14,599 –> 00:56:19,640
is important I think Technologies like
1559
00:56:17,599 –> 00:56:20,920
sort of the interface between human and
1560
00:56:19,640 –> 00:56:23,119
analytics is going to become very
1561
00:56:20,920 –> 00:56:24,400
important right uh so not just the
1562
00:56:23,119 –> 00:56:27,280
technology that builds the analytics
1563
00:56:24,400 –> 00:56:29,559
tools but the translation layer between
1564
00:56:27,280 –> 00:56:31,720
that and us right and whether that’s
1565
00:56:29,559 –> 00:56:34,079
again the bug in your ear or reports or
1566
00:56:31,720 –> 00:56:35,920
visuals or sort of new applications that
1567
00:56:34,079 –> 00:56:38,319
we’re now using to communicate with the
1568
00:56:35,920 –> 00:56:40,119
world and how do we express ourselves
1569
00:56:38,319 –> 00:56:41,640
that layer is going to become very very
1570
00:56:40,119 –> 00:56:43,319
important because to your point that
1571
00:56:41,640 –> 00:56:46,119
layer is going to become reality to some degree
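As one illustrative sketch (the metric names are invented), that translation layer can be as simple as a function that turns computed numbers into a sentence a decision-maker can act on:

```python
def summarize(metrics: dict) -> str:
    """Render machine-computed metrics as a plain-language report line."""
    direction = "up" if metrics["wow_change"] > 0 else "down"
    return (f"{metrics['kpi']} is {direction} "
            f"{abs(metrics['wow_change']):.1%} week over week; "
            f"largest driver: {metrics['top_driver']}.")

print(summarize({"kpi": "Revenue", "wow_change": 0.034,
                 "top_driver": "new-customer orders"}))
# -> Revenue is up 3.4% week over week; largest driver: new-customer orders.
```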
1572
00:56:43,319 –> 00:56:47,760
right um one of my favorite books
1573
00:56:46,119 –> 00:56:49,680
ever is a book called Snow Crash written
1574
00:56:47,760 –> 00:56:50,920
by a guy named Neal Stephenson and it was
1575
00:56:49,680 –> 00:56:52,520
written a long time ago it was very
1576
00:56:50,920 –> 00:56:55,359
early it talked about the early days of
1577
00:56:52,520 –> 00:56:57,119
cyber and you know it
1578
00:56:55,359 –> 00:56:58,640
originated a lot of the terminology and it
1579
00:56:57,119 –> 00:57:00,680
talked a lot about sort of the
1580
00:56:58,640 –> 00:57:02,920
duality of the world that will exist
1581
00:57:00,680 –> 00:57:05,319
when a digital twin of the world is
1582
00:57:02,920 –> 00:57:07,200
in place right yeah and what is language
1583
00:57:05,319 –> 00:57:08,480
and what is communication and suddenly
1584
00:57:07,200 –> 00:57:10,039
it becomes all these high-flying ideas
1585
00:57:08,480 –> 00:57:11,960
almost too much to contain in one
1586
00:57:10,039 –> 00:57:14,440
brain but I think you got to
1587
00:57:11,960 –> 00:57:17,160
pick a piece that you know you can have
1588
00:57:14,440 –> 00:57:18,960
impact on right and you know
1589
00:57:17,160 –> 00:57:20,680
and usually you have impact on many
1590
00:57:18,960 –> 00:57:22,000
pieces your family the people who work
1591
00:57:20,680 –> 00:57:24,400
with you the people who work for you the
1592
00:57:22,000 –> 00:57:25,760
people who work for the other organizations
1593
00:57:24,400 –> 00:57:27,960
you spend your time in the hobbies that
1594
00:57:25,760 –> 00:57:30,039
you spend your time in I’m a passionate
1595
00:57:27,960 –> 00:57:31,440
car guy and racing guy and we have a
1596
00:57:30,039 –> 00:57:33,920
really rich community here in the
1597
00:57:31,440 –> 00:57:35,240
Northeast US and you know sort of how
1598
00:57:33,920 –> 00:57:36,480
you go out there and present yourself
1599
00:57:35,240 –> 00:57:38,400
and talk about the things that matter to
1600
00:57:36,480 –> 00:57:40,799
you and the evolution of that world is
1601
00:57:38,400 –> 00:57:42,520
important and so finding time finding
1602
00:57:40,799 –> 00:57:44,240
balance for all those things I think is
1603
00:57:42,520 –> 00:57:45,200
going to become very very important and
1604
00:57:44,240 –> 00:57:47,359
you know some people have different
1605
00:57:45,200 –> 00:57:49,440
ideas of what balance looks like I like
1606
00:57:47,359 –> 00:57:51,480
true balance um you know balance between
1607
00:57:49,440 –> 00:57:53,359
family and time for yourself and
1608
00:57:51,480 –> 00:57:55,160
then deep professional satisfaction from
1609
00:57:53,359 –> 00:57:56,480
the things that you do and I hope most
1610
00:57:55,160 –> 00:57:59,520
people have that opportunity I think
1611
00:57:56,480 –> 00:58:01,200
I’ve been very lucky was it some
1612
00:57:59,520 –> 00:58:05,240
like actually that was my next question
1613
00:58:01,200 –> 00:58:09,240
so perfectly you slipped into it so was it
1614
00:58:05,240 –> 00:58:12,359
something you’ve seen among your family or
1615
00:58:09,240 –> 00:58:15,000
something which you had to um mature
1616
00:58:12,359 –> 00:58:19,240
and understand that you know family and
1617
00:58:15,000 –> 00:58:21,079
time off of work um matter and not
1618
00:58:19,240 –> 00:58:23,920
everything I think it’s always both I
1619
00:58:21,079 –> 00:58:26,079
mean I have an amazing family very
1620
00:58:23,920 –> 00:58:27,480
educated they believe that
1621
00:58:26,079 –> 00:58:29,119
education is the most important thing
1622
00:58:27,480 –> 00:58:32,000
that you can do and that you can support
1623
00:58:29,119 –> 00:58:34,000
as a family um I’m very close
1624
00:58:32,000 –> 00:58:35,839
to them you know my mother’s an
1625
00:58:34,000 –> 00:58:38,200
architect my father is a cardiothoracic
1626
00:58:35,839 –> 00:58:40,039
surgeon my grandfather was a general in
1627
00:58:38,200 –> 00:58:41,839
the Indian Army and also a doctor my
1628
00:58:40,039 –> 00:58:43,200
other grandfather was um an
1629
00:58:41,839 –> 00:58:44,799
industrialist who worked for a group
1630
00:58:43,200 –> 00:58:46,520
called the Tatas for many many years and
1631
00:58:44,799 –> 00:58:49,000
taught me everything I ever wanted to
1632
00:58:46,520 –> 00:58:50,720
know about integrity so I had an amazing
1633
00:58:49,000 –> 00:58:52,640
amount of support growing up I went to a
1634
00:58:50,720 –> 00:58:54,319
great school I went to a great college I
1635
00:58:52,640 –> 00:58:56,680
was very blessed in that not everybody
1636
00:58:54,319 –> 00:58:58,480
has those opportunities right and I
1637
00:58:56,680 –> 00:59:00,160
recognize that that creates
1638
00:58:58,480 –> 00:59:01,680
differentials and you know I think my
1639
00:59:00,160 –> 00:59:03,480
children have a lot of those privileges
1640
00:59:01,680 –> 00:59:06,079
and opportunities too right they go to
1641
00:59:03,480 –> 00:59:07,960
great schools they get a lot of support
1642
00:59:06,079 –> 00:59:10,039
um they have a strong family to
1643
00:59:07,960 –> 00:59:11,680
support them that doesn’t always happen
1644
00:59:10,039 –> 00:59:12,559
and look it doesn’t mean that these
1645
00:59:11,680 –> 00:59:14,920
children are going to be the most
1646
00:59:12,559 –> 00:59:17,920
successful right sometimes hunger is
1647
00:59:14,920 –> 00:59:21,200
what gives you success right not
1648
00:59:17,920 –> 00:59:23,559
having is what gives you success but I
1649
00:59:21,200 –> 00:59:26,599
and so I don’t know that there’s a
1650
00:59:23,559 –> 00:59:29,079
formula for happiness right but I
1651
00:59:26,599 –> 00:59:30,799
think a lot of it is to your point how
1652
00:59:29,079 –> 00:59:33,119
you are raised and who raises you and
1653
00:59:30,799 –> 00:59:34,960
then the other part is sort of what does
1654
00:59:33,119 –> 00:59:37,119
that first part do to you and allow you
1655
00:59:34,960 –> 00:59:38,839
to think about what you need right and
1656
00:59:37,119 –> 00:59:41,079
so if you don’t have that do you go out
1657
00:59:38,839 –> 00:59:43,240
there and find it and some of that is
1658
00:59:41,079 –> 00:59:44,680
implicit in who you are and personality
1659
00:59:43,240 –> 00:59:46,920
and I remember I have two boys and
1660
00:59:44,680 –> 00:59:48,160
they’re very different kids and I was
1661
00:59:46,920 –> 00:59:49,599
like how can both these children
1662
00:59:48,160 –> 00:59:51,960
have come from the same two parents but
1663
00:59:49,599 –> 00:59:53,880
they did right I mean and that’s the
1664
00:59:51,960 –> 00:59:56,400
magic of the human brain is it’s
1665
00:59:53,880 –> 00:59:59,400
this unbelievably complex computer the
1666
00:59:56,400 –> 01:00:01,839
society around like the friends they
1667
00:59:59,400 –> 01:00:03,880
choose yeah friends I mean and there’s so
1668
01:00:01,839 –> 01:00:06,000
many small factors so many variables
1669
01:00:03,880 –> 01:00:07,920
right you know what sport did you play
1670
01:00:06,000 –> 01:00:09,760
what did you were you I mean I was
1671
01:00:07,920 –> 01:00:11,640
a theater kid who also played sport so I
1672
01:00:09,760 –> 01:00:13,520
got you know I always loved having
1673
01:00:11,640 –> 01:00:15,039
multiple communities and again probably
1674
01:00:13,520 –> 01:00:17,400
speaks to my ADD that I needed the
1675
01:00:15,039 –> 01:00:19,359
theater people and the basketball people
1676
01:00:17,400 –> 01:00:20,920
and the rugby people and you know these
1677
01:00:19,359 –> 01:00:23,319
people and the technology people and for
1678
01:00:20,920 –> 01:00:24,599
me like a little bit of all of it was
1679
01:00:23,319 –> 01:00:26,240
the most interesting part rather than
1680
01:00:24,599 –> 01:00:28,599
becoming very verticalized I have
1681
01:00:26,240 –> 01:00:31,280
friends all they care about is one thing
1682
01:00:28,599 –> 01:00:32,880
right and again like I said
1683
01:00:31,280 –> 01:00:35,559
success is not dependent on having one
1684
01:00:32,880 –> 01:00:37,960
or the other way of operating right
1685
01:00:35,559 –> 01:00:39,720
often so many different formulas yes
1686
01:00:37,960 –> 01:00:42,119
yeah I mean and I think those variables
1687
01:00:39,720 –> 01:00:43,599
are hard right I mean
1688
01:00:42,119 –> 01:00:46,520
if everyone knew the formula for success
1689
01:00:43,599 –> 01:00:48,760
we’d all be using it so and the
1690
01:00:46,520 –> 01:00:50,680
other thing I think again success for
1691
01:00:48,760 –> 01:00:52,920
me is sometimes kind of a tough word too
1692
01:00:50,680 –> 01:00:55,480
right I believe I am constantly
1693
01:00:52,920 –> 01:00:56,559
successful because I love my life right
1694
01:00:55,480 –> 01:00:58,039
I enjoy
1695
01:00:56,559 –> 01:00:59,319
the life that I have and it doesn’t mean bad
1696
01:00:58,039 –> 01:01:01,200
things don’t happen of course they do
1697
01:00:59,319 –> 01:01:04,000
people get sick people die people you
1698
01:01:01,200 –> 01:01:05,839
know jobs come jobs go bad things happen
1699
01:01:04,000 –> 01:01:08,760
to people that you care about but I
1700
01:01:05,839 –> 01:01:11,960
think it’s how you respond to them
1701
01:01:08,760 –> 01:01:13,119
that is important right and um I’m simply
1702
01:01:11,960 –> 01:01:15,880
not stating that that’s the only way to
1703
01:01:13,119 –> 01:01:17,400
be like that’s for me everyone
1704
01:01:15,880 –> 01:01:18,559
talks about grit and resilience and all
1705
01:01:17,400 –> 01:01:20,680
these things but if there’s anything I
1706
01:01:18,559 –> 01:01:22,640
wanted my children to learn it’s not
1707
01:01:20,680 –> 01:01:24,760
just how to live in a golden cocoon
1708
01:01:22,640 –> 01:01:26,359
right it is to know that there are
1709
01:01:24,760 –> 01:01:28,319
difficult things in the world and
1710
01:01:26,359 –> 01:01:30,799
some of them they will inevitably have
1711
01:01:28,319 –> 01:01:33,319
to face and give them the strength to
1712
01:01:30,799 –> 01:01:35,200
understand that they can right um that
1713
01:01:33,319 –> 01:01:36,400
they will and look
1714
01:01:35,200 –> 01:01:38,359
sometimes those things are chemically
1715
01:01:36,400 –> 01:01:40,000
taken away from people you
1716
01:01:38,359 –> 01:01:42,400
know your brain chemistry might not
1717
01:01:40,000 –> 01:01:44,280
allow for that and so we have to be
1718
01:01:42,400 –> 01:01:47,720
empathetic to the world around us trying
1719
01:01:44,280 –> 01:01:50,079
to understand that there are no easy
1720
01:01:47,720 –> 01:01:52,079
buckets of neurobehavior there is a wide
1721
01:01:50,079 –> 01:01:53,880
spectrum of how people behave and I
1722
01:01:52,079 –> 01:01:55,440
typically tend to find when you are not
1723
01:01:53,880 –> 01:01:56,839
given one thing you’re typically given
1724
01:01:55,440 –> 01:02:00,119
another you have to figure out what that
1725
01:01:56,839 –> 01:02:03,240
thing is right so for me you know ADD
1726
01:02:00,119 –> 01:02:05,279
comes with certain things and
1727
01:02:03,240 –> 01:02:07,200
corporate ADD which is a manifestation of
1728
01:02:05,279 –> 01:02:08,920
it also comes with certain things and
1729
01:02:07,200 –> 01:02:11,279
you have to find your own path through
1730
01:02:08,920 –> 01:02:13,359
that right and figure out sort
1731
01:02:11,279 –> 01:02:14,680
of how you perceive the world how the
1732
01:02:13,359 –> 01:02:16,279
world perceives you and how you can
1733
01:02:14,680 –> 01:02:18,559
operate through it in a way using
1734
01:02:16,279 –> 01:02:21,599
technology using your relationships
1735
01:02:18,559 –> 01:02:24,400
using your experience um and asking for
1736
01:02:21,599 –> 01:02:27,119
guidance asking for wisdom asking for
1737
01:02:24,400 –> 01:02:28,359
experience I love talking to
1738
01:02:27,119 –> 01:02:30,720
what used to be my grandparents’
1739
01:02:28,359 –> 01:02:32,880
generation because I do believe there is
1740
01:02:30,720 –> 01:02:35,640
fundamental knowledge in experience and
1741
01:02:32,880 –> 01:02:37,400
age and I think sometimes you know
1742
01:02:35,640 –> 01:02:40,039
some families don’t
1743
01:02:37,400 –> 01:02:42,000
allow for that and one of the things
1744
01:02:40,039 –> 01:02:43,640
that I love is when my children get to
1745
01:02:42,000 –> 01:02:45,160
hang out with their grandparents or
1746
01:02:43,640 –> 01:02:48,279
their grandparents’ friends and just
1747
01:02:45,160 –> 01:02:50,319
hear about stories of doing things
1748
01:02:48,279 –> 01:02:51,920
and being through stuff and good things
1749
01:02:50,319 –> 01:02:54,039
and bad things and hard things and easy
1750
01:02:51,920 –> 01:02:55,440
things and I think technology will
1751
01:02:54,039 –> 01:02:58,000
hopefully enable some of these stories
1752
01:02:55,440 –> 01:03:00,720
to be told better too you know um
1753
01:02:58,000 –> 01:03:04,000
we’ll see I think that’s uniquely human
1754
01:03:00,720 –> 01:03:06,599
right um learning and growing through
1755
01:03:04,000 –> 01:03:08,960
through stories and if we can
1756
01:03:06,599 –> 01:03:12,279
teach our AI how to learn and grow the
1757
01:03:08,960 –> 01:03:15,079
same way my hope is that more will be
1758
01:03:12,279 –> 01:03:16,680
beneficial than not right and if we can
1759
01:03:15,079 –> 01:03:17,680
allow that to happen with people we can
1760
01:03:16,680 –> 01:03:20,319
allow that to happen with the
1761
01:03:17,680 –> 01:03:20,319
technologies we
1762
01:03:20,480 –> 01:03:26,440
create I think this is a perfect
1763
01:03:23,359 –> 01:03:29,559
ending to our chat
1764
01:03:26,440 –> 01:03:34,160
I think you’ve summed it up really
1765
01:03:29,559 –> 01:03:37,960
really nicely okay um so just the last
1766
01:03:34,160 –> 01:03:41,319
question um yeah so reflecting
1767
01:03:37,960 –> 01:03:43,799
on your career which accomplishments are
1768
01:03:41,319 –> 01:03:46,960
you most proud of it’s always the teams
1769
01:03:43,799 –> 01:03:48,839
I’ve built always the teams I’ve built
1770
01:03:46,960 –> 01:03:49,960
um you know a lot of people who worked
1771
01:03:48,839 –> 01:03:52,119
for me have now worked for me at
1772
01:03:49,960 –> 01:03:55,119
multiple companies in one case four or
1773
01:03:52,119 –> 01:03:57,319
five companies um and the cultures I
1774
01:03:55,119 –> 01:03:59,079
built the things that we did together
1775
01:03:57,319 –> 01:04:01,319
were important the value that we drove
1776
01:03:59,079 –> 01:04:03,480
for shareholders was important but I
1777
01:04:01,319 –> 01:04:05,559
think it was the communities that we
1778
01:04:03,480 –> 01:04:06,760
created and it was not limited to one
1779
01:04:05,559 –> 01:04:08,720
company it was not limited to one
1780
01:04:06,760 –> 01:04:12,079
industry it was not limited to one
1781
01:04:08,720 –> 01:04:14,160
geography um you know I think we created
1782
01:04:12,079 –> 01:04:17,039
a wonderful ecosystem and we continue to
1783
01:04:14,160 –> 01:04:18,799
rely on each other and uplift each other
1784
01:04:17,039 –> 01:04:21,319
and talk to each other and learn from
1785
01:04:18,799 –> 01:04:22,400
each other and that is what I am by far
1786
01:04:21,319 –> 01:04:25,920
the proudest
1787
01:04:22,400 –> 01:04:28,799
of sounds like you’re a great boss
1788
01:04:25,920 –> 01:04:32,480
I’ll do some research to see
1789
01:04:28,799 –> 01:04:35,440
if it’s the same on the other
1790
01:04:32,480 –> 01:04:39,760
side no I’m sure I’m sure okay great
1791
01:04:35,440 –> 01:04:39,760
thank you thank you