
Paul Jones is a Vice President at SoftBank Group, and together with his incredible team, they have built some of the most advanced tech in Computer Vision and AI with SoftBank at Japan Computer Vision. Paul Jones is an incredibly smart, kind and humble person you just want to learn more from. We both share two passions: an incredible love for Japan and its people's dedication to the finest craftsmanship, and innovative technology – the kind that changes the world for the better. That’s why I’m even more grateful to Paul for finding time to sit down with me to record a conversation for the Are You Human podcast. We talk about Japan’s way of innovating and approaching technology, Computer Vision, the future of AI (Paul believes it will be all interactive), and the risks and opportunities around AI and the Metaverse. We also discuss what the West can learn from the Japanese way of living, working and celebrating life. I hope you’ll enjoy this episode 🙂
Transcript
1
00:00:00,410 –> 00:00:03,172
It’s like any new business opportunity.
2
00:00:03,173 –> 00:00:06,826
There’s sort of no cost entry at this stage,
3
00:00:06,827 –> 00:00:10,964
but as people want to get more information, I
4
00:00:10,965 –> 00:00:12,708
think they want to control the information, or at
5
00:00:12,709 –> 00:00:15,636
least be feeling as though they have the opportunity
6
00:00:15,637 –> 00:00:19,040
to, as you said, be more engaged in it.
7
00:00:19,570 –> 00:00:22,676
I think just regurgitating a lot of search
8
00:00:22,677 –> 00:00:24,548
and commands that have been sort of put
9
00:00:24,549 –> 00:00:28,050
in there through machine learning is not necessarily
10
00:00:28,051 –> 00:00:30,572
what people’s perception of AI is.
11
00:00:30,573 –> 00:00:31,868
AI, is there more of an
12
00:00:31,869 –> 00:00:35,750
assistant rather than a decision maker?
13
00:00:37,050 –> 00:00:39,552
Hi, this is your host, Camila Hankiewicz, and
14
00:00:39,553 –> 00:00:42,320
together with my guests, we discuss how tech
15
00:00:42,321 –> 00:00:44,780
changes the way we live and work.
16
00:00:45,870 –> 00:00:47,020
Are You Human?
17
00:00:49,150 –> 00:00:49,686
Paul?
18
00:00:49,687 –> 00:00:52,576
I’ve been waiting for this for how many months?
19
00:00:52,577 –> 00:00:53,312
How many months?
20
00:00:53,313 –> 00:00:55,970
We were trying to schedule this.
21
00:00:55,971 –> 00:00:57,946
I think we started, we stopped,
22
00:00:57,947 –> 00:00:59,780
I think somewhere around six months.
23
00:00:59,781 –> 00:01:02,042
So thank you very much for your patience.
24
00:01:02,043 –> 00:01:03,700
So what caused it?
25
00:01:04,470 –> 00:01:05,688
What were you doing?
26
00:01:05,689 –> 00:01:09,110
If you can share a little bit of mystery.
27
00:01:09,111 –> 00:01:11,528
Yeah, there’s obviously a lot going on the
28
00:01:11,529 –> 00:01:16,264
last couple of years with COVID and the
29
00:01:16,265 –> 00:01:18,380
impact it’s had on lots of things.
30
00:01:18,381 –> 00:01:21,772
So we spent really, the last two years out
31
00:01:21,773 –> 00:01:25,420
of COVID just reforming things and cleaning things up.
32
00:01:25,421 –> 00:01:26,508
And I think we’re in a really
33
00:01:26,509 –> 00:01:28,940
healthy position as we start this year.
34
00:01:29,950 –> 00:01:33,712
Not so many people outside of your circles know
35
00:01:33,713 –> 00:01:37,312
what you do, so maybe you could just give
36
00:01:37,313 –> 00:01:41,380
a little bit of overview of where you are
37
00:01:41,381 –> 00:01:43,680
with your company and where you’re going.
38
00:01:44,450 –> 00:01:45,200
Sure.
39
00:01:46,370 –> 00:01:48,394
So I’m with Softbank.
40
00:01:48,395 –> 00:01:50,388
In fact, I’m with JCV, which is
41
00:01:50,389 –> 00:01:55,134
a subsidiary, 100% subsidiary, of Softbank, KK.
42
00:01:55,135 –> 00:01:57,752
KK in Japanese is company.
43
00:01:57,753 –> 00:02:00,824
So SoftBank KK, we’ll call it SoftBank, the company
44
00:02:00,825 –> 00:02:03,678
which is the legacy DNA, sort of telco.
45
00:02:03,679 –> 00:02:07,800
Very successful here in Japan for over 20 years now.
46
00:02:08,490 –> 00:02:12,572
About seven years ago, SoftBank decided to create what
47
00:02:12,573 –> 00:02:15,708
many would know as the Vision Fund, which is
48
00:02:15,709 –> 00:02:17,880
effectively what we refer to as SoftBank Group.
49
00:02:18,890 –> 00:02:21,622
And when opportunities present themselves from an investment
50
00:02:21,623 –> 00:02:23,648
point of view, they also present themselves to
51
00:02:23,649 –> 00:02:25,846
create joint ventures here in Japan.
52
00:02:25,847 –> 00:02:28,038
So I’m just part of one of the joint
53
00:02:28,039 –> 00:02:32,176
ventures called Japan Computer Vision, which is focused on
54
00:02:32,177 –> 00:02:35,526
computer vision, face recognition, image recognition, et cetera.
55
00:02:35,527 –> 00:02:36,906
And we were very fortunate.
56
00:02:36,907 –> 00:02:39,028
We were placed as an investment, I
57
00:02:39,029 –> 00:02:41,546
think, in 17, incepted in 19.
58
00:02:41,547 –> 00:02:43,570
So it’s been five years now.
59
00:02:43,571 –> 00:02:47,176
And because Covid came at a unique time,
60
00:02:47,177 –> 00:02:50,008
it allowed us to sort of develop as
61
00:02:50,009 –> 00:02:51,672
a company and develop our technology.
62
00:02:51,673 –> 00:02:52,888
So, yeah, that’s what I do.
63
00:02:52,889 –> 00:02:56,184
I’m based here in Tokyo and travelling a
64
00:02:56,185 –> 00:02:59,640
little bit the last 12 to 18 months, but really
65
00:03:00,890 –> 00:03:03,324
focused here in Japan at the moment.
66
00:03:03,325 –> 00:03:05,308
You were not always in technology, right?
67
00:03:05,309 –> 00:03:06,940
Like, how did your career start?
68
00:03:06,941 –> 00:03:08,816
Yeah, no, I was not in technology.
69
00:03:08,817 –> 00:03:13,488
However, the opportunity to come and work under Softbank as
70
00:03:13,489 –> 00:03:17,344
part of something was very exciting to me, not just
71
00:03:17,345 –> 00:03:21,010
a few years ago, but probably goes way back.
72
00:03:21,011 –> 00:03:24,880
I don’t profess to being a technology type person.
73
00:03:26,050 –> 00:03:28,458
Very intrigued by the role
74
00:03:28,459 –> 00:03:29,892
that technology plays for us.
75
00:03:29,893 –> 00:03:35,110
So I came to Japan in 2002.
76
00:03:35,111 –> 00:03:39,048
I was fortunate to be with Goldman Sachs for
77
00:03:39,049 –> 00:03:42,238
almost five years, during which time I did predominantly
78
00:03:42,239 –> 00:03:46,636
asset management, non-performing loans type of business.
79
00:03:46,637 –> 00:03:47,560
Very interesting.
80
00:03:49,050 –> 00:03:52,760
And then left GS and was fortunate to
81
00:03:53,290 –> 00:03:59,292
raise some external capital from Soros and went
82
00:03:59,293 –> 00:04:01,640
about building a couple of things there.
83
00:04:02,250 –> 00:04:04,082
Got a little bit, sort of stopped.
84
00:04:04,083 –> 00:04:06,668
Obviously with subprime, it was a good learning
85
00:04:06,669 –> 00:04:09,692
experience, built a few agencies the last few
86
00:04:09,693 –> 00:04:14,308
years, was fortunate to be involved in the
87
00:04:14,309 –> 00:04:16,190
process and obviously the exit.
88
00:04:17,010 –> 00:04:18,480
And here I am today.
89
00:04:20,050 –> 00:04:23,620
It’s been a fun journey so far.
90
00:04:24,470 –> 00:04:27,678
I’ve really enjoyed the last few years with Softbank.
91
00:04:27,679 –> 00:04:30,008
I mean, Softbank, what Masayoshi Son has
92
00:04:30,009 –> 00:04:33,490
been able to achieve is incredible.
93
00:04:34,230 –> 00:04:36,338
This certainly has come with some criticisms,
94
00:04:36,339 –> 00:04:39,510
but those criticisms I think are valid.
95
00:04:40,490 –> 00:04:44,044
And fortunately, last year, as you may have seen, we
96
00:04:44,045 –> 00:04:47,690
did the IPO for Arm, which was very successful.
97
00:04:47,691 –> 00:04:50,076
That’s why I couldn’t get hold
98
00:04:50,077 –> 00:04:52,128
of you for those months.
99
00:04:52,129 –> 00:04:55,216
It’s been a busy time for lots of people
100
00:04:55,217 –> 00:04:58,726
and we don’t necessarily work with Arm directly.
101
00:04:58,727 –> 00:05:02,048
However, what I’m excited about in the future for
102
00:05:02,049 –> 00:05:04,468
the new entity is hopefully we will get to
103
00:05:04,469 –> 00:05:06,228
work with the likes of Arm and so many
104
00:05:06,229 –> 00:05:09,470
other great companies that are within the investment portfolio.
105
00:05:10,530 –> 00:05:13,170
So it’s very exciting.
106
00:05:13,171 –> 00:05:15,192
It hasn’t been easy, obviously, the last couple
107
00:05:15,193 –> 00:05:16,888
of years we’ve sort of corrected ourselves out
108
00:05:16,889 –> 00:05:21,288
of COVID and here we are starting 2024.
109
00:05:21,289 –> 00:05:24,808
And I think everyone would agree, being
110
00:05:24,809 –> 00:05:27,148
in a better position to embrace how
111
00:05:27,149 –> 00:05:29,800
technology plays a part in our lives.
112
00:05:30,410 –> 00:05:32,882
So, yeah, it’s exciting.
113
00:05:32,883 –> 00:05:34,710
You are playing it so humble.
114
00:05:35,290 –> 00:05:38,192
You can tell a little bit about what you are
115
00:05:38,193 –> 00:05:43,340
going to do next and where you are positioned.
116
00:05:44,750 –> 00:05:48,640
I think SoftBank is an amazing company, amazing
117
00:05:48,641 –> 00:05:52,030
leadership, not only here in Japan, but globally.
118
00:05:52,690 –> 00:05:55,028
There’s been lots of opportunities that
119
00:05:55,029 –> 00:05:56,960
present themselves from time to time.
120
00:05:57,730 –> 00:06:00,586
But about six months or so ago, we decided
121
00:06:00,587 –> 00:06:06,136
to put a bit more focus into what’s happening
122
00:06:06,137 –> 00:06:10,494
with OpenAI and ChatGPT, et cetera.
123
00:06:10,495 –> 00:06:13,352
And I think there’s a great opportunity there to
124
00:06:13,353 –> 00:06:19,596
try and parlay something that’s hardware related, maybe into
125
00:06:19,597 –> 00:06:23,148
software, and then obviously base it on the platform.
126
00:06:23,149 –> 00:06:26,876
So can’t go too much into detail, but definitely
127
00:06:26,877 –> 00:06:30,438
going to be creating something, I think, rather exciting
128
00:06:30,439 –> 00:06:35,152
this year, something that allows us to, again, take
129
00:06:35,153 –> 00:06:38,342
all the learnings, all the experiences, all the engagements
130
00:06:38,343 –> 00:06:40,532
that we’ve had for the last three to five
131
00:06:40,533 –> 00:06:43,412
years and hopefully make something really special.
132
00:06:43,413 –> 00:06:47,652
You are betting this will be the year
133
00:06:47,653 –> 00:06:52,186
of even bigger expansion of the AI development
134
00:06:52,187 –> 00:06:55,566
and the multimodal, and maybe even in robotics.
135
00:06:55,567 –> 00:06:58,568
Right, because you’ve been dealing with
136
00:06:58,569 –> 00:07:01,990
a bit of both worlds.
137
00:07:01,991 –> 00:07:03,608
Yeah, I think you’re right.
138
00:07:03,609 –> 00:07:07,842
I mean, OpenAI, whether it be generative, the predictive
139
00:07:07,843 –> 00:07:11,058
element, probably the next future, I think is interactive
140
00:07:11,059 –> 00:07:18,972
in terms of how we use AI to probably
141
00:07:18,973 –> 00:07:20,300
have a little bit more control.
142
00:07:21,710 –> 00:07:23,408
I think where we are at the moment is very
143
00:07:23,409 –> 00:07:27,078
much generative AI, but I definitely think the next stage
144
00:07:27,079 –> 00:07:32,554
is interactive AI, where you, me, whoever has the ability
145
00:07:32,555 –> 00:07:35,748
to influence the AI in terms of what you want
146
00:07:35,749 –> 00:07:39,594
it to do for you, rather than at this stage.
147
00:07:39,595 –> 00:07:42,968
The model is based around more of a sort
148
00:07:42,969 –> 00:07:46,610
of expediting a search or inquiries, et cetera.
149
00:07:48,790 –> 00:07:51,780
And it’s not so intelligent yet. Right.
150
00:07:52,390 –> 00:07:55,788
You have to specifically give it
151
00:07:55,789 –> 00:07:58,300
instructions. Rather, what you are saying:
152
00:07:58,301 –> 00:08:01,692
You would want it to maybe suggest to you
153
00:08:01,693 –> 00:08:05,804
some more optimal ways of doing things, correct?
154
00:08:05,805 –> 00:08:10,300
Yeah, I think that’s definitely what people want.
155
00:08:11,790 –> 00:08:13,744
At the end of the day, the model is a
156
00:08:13,745 –> 00:08:20,272
subscription model, so it’s like any new business opportunity.
157
00:08:20,273 –> 00:08:23,946
There’s sort of no cost entry at this stage.
158
00:08:23,947 –> 00:08:28,068
But as people want to get more information, I
159
00:08:28,069 –> 00:08:29,764
think they want to control the information, or at
160
00:08:29,765 –> 00:08:32,692
least be feeling as though they have the opportunity
161
00:08:32,693 –> 00:08:36,100
to, as you said, be more engaged in it.
162
00:08:36,710 –> 00:08:39,736
I think just regurgitating a lot of search
163
00:08:39,737 –> 00:08:41,672
and commands that have been sort of put
164
00:08:41,673 –> 00:08:45,166
in there through machine learning is not necessarily
165
00:08:45,167 –> 00:08:47,692
what people’s perception of AI is.
166
00:08:47,693 –> 00:08:48,988
AI, is there more of an
167
00:08:48,989 –> 00:08:53,210
assistant rather than a decision maker?
168
00:08:53,211 –> 00:08:55,298
What do you find the most exciting
169
00:08:55,299 –> 00:08:58,810
for the next year to even progress?
170
00:08:58,811 –> 00:09:00,448
And what do you use in
171
00:09:00,449 –> 00:09:03,830
terms of intelligent smart machines?
172
00:09:03,831 –> 00:09:05,568
Yeah, I think there are just so
173
00:09:05,569 –> 00:09:07,530
many tools that have been created.
174
00:09:08,590 –> 00:09:11,332
So I think we’re living in an age where
175
00:09:11,333 –> 00:09:15,812
now the modern-day toolbox is technology.
176
00:09:15,813 –> 00:09:19,908
So if you look at it from 50, 60 years ago,
177
00:09:19,909 –> 00:09:23,002
it was the tradesperson who had tools, from a hammer
178
00:09:23,003 –> 00:09:26,536
to a measure to a shovel, or whatever it might be,
179
00:09:26,537 –> 00:09:29,454
but the modern day tools are AI tools.
180
00:09:29,455 –> 00:09:33,422
So I think as long as we can create those tools
181
00:09:33,423 –> 00:09:39,298
that help us to uncover or discover aspects of our daily
182
00:09:39,299 –> 00:09:41,532
lives that make it easier, and so we can focus more
183
00:09:41,533 –> 00:09:45,068
on perhaps the strategy aspect or the planning aspect or even
184
00:09:45,069 –> 00:09:48,688
the creative aspect, I’m excited about that.
185
00:09:48,689 –> 00:09:52,848
As I said, I think really what we see, or what
186
00:09:52,849 –> 00:09:57,046
we’re seeing more is just this demand for interactive AI versus,
187
00:09:57,047 –> 00:10:02,326
as I said earlier, predictive, which was kind of first stage,
188
00:10:02,327 –> 00:10:05,010
generative, which is where we are right now.
189
00:10:05,011 –> 00:10:06,564
Not to say that we won’t continue to
190
00:10:06,565 –> 00:10:09,028
have predictive and generative AI, but there will
191
00:10:09,029 –> 00:10:12,266
be definitely a demand for more interactive AI.
192
00:10:12,267 –> 00:10:14,408
I know that you’ve been experimenting and
193
00:10:14,409 –> 00:10:17,694
you’ve been building lots of different technologies
194
00:10:17,695 –> 00:10:21,518
and offering in different areas.
195
00:10:21,519 –> 00:10:26,070
I know that you’ve been also interested in metaverse,
196
00:10:28,250 –> 00:10:33,590
but your big breakthrough was Japan Computer Vision.
197
00:10:35,610 –> 00:10:39,536
How did you narrow your focus there and why?
198
00:10:39,537 –> 00:10:40,650
Computer vision?
199
00:10:41,550 –> 00:10:43,420
Yeah, it’s a great question.
200
00:10:44,430 –> 00:10:47,680
I think computer vision was really.
201
00:10:47,681 –> 00:10:49,632
I was very interested in how
202
00:10:49,633 –> 00:10:50,928
we have cameras in everything.
203
00:10:50,929 –> 00:10:52,404
We have cameras in our phones, we have
204
00:10:52,405 –> 00:10:56,290
cameras in our houses, security cameras are everywhere.
205
00:10:56,291 –> 00:10:58,612
And I was fascinated by how you could take
206
00:10:58,613 –> 00:11:03,406
the average camera and make it more functional, particularly
207
00:11:03,407 –> 00:11:06,168
through building a piece of software and algorithm that
208
00:11:06,169 –> 00:11:10,024
effectively allows the average camera, if there’s such a
209
00:11:10,025 –> 00:11:13,090
thing, the average camera, to be more functional.
210
00:11:14,230 –> 00:11:17,628
And then in terms of the focus, it’s a really
211
00:11:17,629 –> 00:11:22,738
good question because it’s really about the consumer insights.
212
00:11:22,739 –> 00:11:27,532
It’s all about engaging with our customer base, listening, being
213
00:11:27,533 –> 00:11:34,736
aware, pivoting, actioning as you need to when you hear
214
00:11:34,737 –> 00:11:38,048
things that maybe not necessarily what you want to hear,
215
00:11:38,049 –> 00:11:41,908
but definitely, I think JCV, together with some of our
216
00:11:41,909 –> 00:11:46,596
partners, I think we were very fortunate to pivot and
217
00:11:46,597 –> 00:11:50,730
listen to where the market was heading.
218
00:11:50,731 –> 00:11:52,692
And that allowed us the opportunity to
219
00:11:52,693 –> 00:11:56,938
create today some pretty special solutions.
220
00:11:56,939 –> 00:12:01,790
So your main focus was always Japan and Asia,
221
00:12:01,791 –> 00:12:07,692
or you were first going after America or some
222
00:12:07,693 –> 00:12:11,916
English-speaking countries, and then Japan, because I guess
223
00:12:11,917 –> 00:12:15,404
the perception and the needs are quite different.
224
00:12:15,405 –> 00:12:16,994
Yeah, I think you’re referring
225
00:12:16,995 –> 00:12:18,560
to computer vision, is that.
226
00:12:18,561 –> 00:12:19,088
Yes.
227
00:12:19,089 –> 00:12:19,740
Yeah.
228
00:12:20,670 –> 00:12:24,190
I think when you look at the number of
229
00:12:24,191 –> 00:12:28,688
cameras in use in macro markets, Japan, the US,
230
00:12:28,689 –> 00:12:32,000
China, those, of course, are the big three.
231
00:12:32,001 –> 00:12:35,130
So it made sense to try and evolve
232
00:12:35,131 –> 00:12:38,650
and create an experience here in Japan.
233
00:12:38,651 –> 00:12:41,498
I think we were very fortunate also.
234
00:12:41,499 –> 00:12:42,532
And respect, this should always be mentioned,
235
00:12:42,533 –> 00:12:44,130
respect to your competitors.
236
00:12:45,670 –> 00:12:50,312
NEC, Panasonic, two amazing companies here in Japan have
237
00:12:50,313 –> 00:12:53,460
been in computer vision for a very long time.
238
00:12:54,470 –> 00:12:57,778
So there were a lot of learnings and experiences
239
00:12:57,779 –> 00:13:01,804
that they had that we were able to take
240
00:13:01,805 –> 00:13:05,532
and adapt and ultimately be in a position where
241
00:13:05,533 –> 00:13:08,918
I think we’ve caught up to NEC particularly.
242
00:13:08,919 –> 00:13:11,390
NEC is a fantastic company.
243
00:13:11,391 –> 00:13:13,328
I was recently in Australia presenting with
244
00:13:13,329 –> 00:13:16,886
them and the people they’re just incredible.
245
00:13:16,887 –> 00:13:23,780
They’re very outgoing, very receptive, very sharing, obviously
246
00:13:23,781 –> 00:13:26,980
within reason, but it gives us an amazing
247
00:13:26,981 –> 00:13:31,108
opportunity to work with some great people.
248
00:13:31,109 –> 00:13:34,632
What other companies are you looking up to?
249
00:13:34,633 –> 00:13:36,168
Yeah, so in terms of the companies that
250
00:13:36,169 –> 00:13:38,878
we respect, I mean, obviously it’s the usual,
251
00:13:38,879 –> 00:13:41,438
you know, the global platforms, the Microsofts, your Googles,
252
00:13:41,439 –> 00:13:44,792
your Metas and so on.
253
00:13:44,793 –> 00:13:46,494
At the same time, even here in Japan,
254
00:13:46,495 –> 00:13:51,132
like I mentioned earlier, Panasonic, NEC, there’s an
255
00:13:51,133 –> 00:13:54,674
amazing array of Japanese legacy companies, the automakers
256
00:13:54,675 –> 00:13:56,470
such as Toyotas and Nissans.
257
00:13:57,450 –> 00:13:59,122
I was with Epson recently.
258
00:13:59,123 –> 00:14:01,500
Epson’s a big printing company, but
259
00:14:02,750 –> 00:14:04,940
incredibly vast in what they do.
260
00:14:05,870 –> 00:14:10,336
So, yeah, we probably look at both domestic companies
261
00:14:10,337 –> 00:14:14,500
and global companies in terms of the one that
262
00:14:14,501 –> 00:14:16,932
I enjoy following the most at the moment.
263
00:14:16,933 –> 00:14:20,436
It’s the big two in my sort of focus
264
00:14:20,437 –> 00:14:23,280
at the moment, which is Microsoft and Google.
265
00:14:24,370 –> 00:14:25,752
Both of them are very.
266
00:14:25,753 –> 00:14:28,078
Well, I think we’re all fans of Apple.
267
00:14:28,079 –> 00:14:30,472
I think what Apple has done in terms of
268
00:14:30,473 –> 00:14:34,696
taking a product and just making it better and
269
00:14:34,697 –> 00:14:38,410
design and the functionality, et cetera, et cetera.
270
00:14:38,411 –> 00:14:40,540
What I’d love to see, to be honest,
271
00:14:40,541 –> 00:14:43,030
I’d love to see Japanese companies be successful.
272
00:14:43,850 –> 00:14:45,308
They are very successful here
273
00:14:45,309 –> 00:14:47,410
in Japan, very successful.
274
00:14:47,411 –> 00:14:51,542
But I would love to see them ultimately
275
00:14:51,543 –> 00:14:53,850
be more competitive on the global stage.
276
00:14:54,750 –> 00:14:57,792
And we definitely are seeing that the last
277
00:14:57,793 –> 00:15:00,948
twelve months, with the yen weakening and the
278
00:15:00,949 –> 00:15:05,002
US dollar being so strong, Japan’s
279
00:15:05,003 –> 00:15:07,194
price parity has become competitive.
280
00:15:07,195 –> 00:15:08,948
But apart from that, what do you think
281
00:15:08,949 –> 00:15:12,720
is the reason why Japanese companies are so,
282
00:15:13,330 –> 00:15:15,086
like, they are successful in Japan?
283
00:15:15,087 –> 00:15:19,380
But don’t you feel it’s a bit of.
284
00:15:21,670 –> 00:15:27,756
Not a clash, but it’s difficult for both worlds to
285
00:15:27,757 –> 00:15:31,436
understand each other and the way they conduct business.
286
00:15:31,437 –> 00:15:34,348
And lots of startups and lots of new companies which
287
00:15:34,349 –> 00:15:38,752
are trying to expand to countries like Japan fail because
288
00:15:38,753 –> 00:15:43,200
they don’t understand how the business is conducted, how trust
289
00:15:43,201 –> 00:15:48,752
is important and all the things which you know, because
290
00:15:48,753 –> 00:15:52,720
you’ve been there for 22, 23 years now.
291
00:15:54,210 –> 00:15:55,316
Yeah, 22.
292
00:15:55,317 –> 00:15:58,314
But lots of companies don’t
293
00:15:58,315 –> 00:16:01,920
realise how important it is.
294
00:16:02,870 –> 00:16:03,768
Yeah, definitely.
295
00:16:03,769 –> 00:16:12,930
I think it’s not as difficult as it may appear.
296
00:16:13,910 –> 00:16:16,680
I think Japan puts quality before everything.
297
00:16:18,570 –> 00:16:22,434
So as long as you have a quality mindset,
298
00:16:22,435 –> 00:16:24,840
I think you can always be successful here.
299
00:16:26,570 –> 00:16:30,102
Obviously it’s had to change, or it is changing
300
00:16:30,103 –> 00:16:32,736
due to the competitive nature of other markets close
301
00:16:32,737 –> 00:16:35,888
by, such as sourcing in Southeast Asia, or might
302
00:16:35,889 –> 00:16:39,222
be mainland China or even closer.
303
00:16:39,223 –> 00:16:41,846
Even Korea is very competitive, or was very competitive
304
00:16:41,847 –> 00:16:44,612
until five or six years ago, seven years ago.
305
00:16:44,613 –> 00:16:48,548
Excuse me, so I think as long as you have a
306
00:16:48,549 –> 00:16:52,000
quality mindset, then Japan is the right market for you.
307
00:16:52,950 –> 00:16:56,600
However, if you’re fixated on creating a product
308
00:16:56,601 –> 00:17:05,198
that is more high consumer tangible, then I’m
309
00:17:05,199 –> 00:17:07,260
not sure that Japan is right for you.
310
00:17:07,261 –> 00:17:10,738
With that said, probably one of the most successful
311
00:17:10,739 –> 00:17:13,270
brands here in Japan is a brand called Uniqlo.
312
00:17:14,410 –> 00:17:19,410
And Uniqlo is effectively producing, it’s Fast Retailing, it’s
313
00:17:19,411 –> 00:17:24,134
producing garments at low price, constantly evolving, multiple seasons
314
00:17:24,135 –> 00:17:27,541
in a year, multiple SKUs, again at a competitive
315
00:17:27,542 –> 00:17:32,030
price, and they’ve done it, but they’ve also kept
316
00:17:32,031 –> 00:17:34,160
very close eyes on their quality.
317
00:17:34,161 –> 00:17:38,480
So you get a fantastic price product and very good.
318
00:17:39,490 –> 00:17:42,052
You know, I think the reason that
319
00:17:42,053 –> 00:17:45,496
Japan has probably struggled a little bit
320
00:17:45,497 –> 00:17:49,624
is that they don’t necessarily sacrifice quality.
321
00:17:49,625 –> 00:17:51,620
Quality is everything to them.
322
00:17:52,710 –> 00:17:54,216
And again, if you look at
323
00:17:54,217 –> 00:17:57,026
Lexus, for example, it’s effectively Toyota.
324
00:17:57,027 –> 00:18:01,052
But it was Toyota wanting to create a
325
00:18:01,053 –> 00:18:05,190
more quality product and it’s been very successful.
326
00:18:06,170 –> 00:18:09,888
So, yeah, hopefully that answers the question.
327
00:18:09,889 –> 00:18:12,582
As long as you come with a quality mindset,
328
00:18:12,583 –> 00:18:16,448
then I think it’s easy to then be on
329
00:18:16,449 –> 00:18:20,090
the same level of communication with Japan.
330
00:18:21,170 –> 00:18:25,210
And do you see Japanese companies being successful
331
00:18:25,211 –> 00:18:28,244
apart from those who you mentioned on a
332
00:18:28,245 –> 00:18:32,930
global market at the same level? Let’s.
333
00:18:32,931 –> 00:18:37,220
I mean, I think there’s been periods of time, you know,
334
00:18:37,221 –> 00:18:39,918
it could be the Americas, it could be in Europe.
335
00:18:39,919 –> 00:18:43,774
It’s definitely come back again to Asia Pacific.
336
00:18:43,775 –> 00:18:46,552
I think most japanese companies now are
337
00:18:46,553 –> 00:18:49,934
very focused on making Asia successful.
338
00:18:49,935 –> 00:18:51,720
I mentioned Uniqlo before.
339
00:18:52,570 –> 00:18:53,964
You can talk about the Dentsus of
340
00:18:53,965 –> 00:18:55,436
the world, such as the advertising companies,
341
00:18:55,437 –> 00:18:58,170
although they’ve become very global recently.
342
00:18:58,171 –> 00:18:59,798
So there are a lot of good examples.
343
00:18:59,799 –> 00:19:01,424
Toyota is a great example, Nissan’s a
344
00:19:01,425 –> 00:19:02,672
great example, Honda’s a great example.
345
00:19:02,673 –> 00:19:04,380
The automotive sector is very strong.
346
00:19:05,870 –> 00:19:09,046
And then there’s a lot of health medical
347
00:19:09,047 –> 00:19:12,208
related solutions here in Japan, which, again, a
348
00:19:12,209 –> 00:19:14,708
lot of the world doesn’t know about, as
349
00:19:14,709 –> 00:19:16,298
you’ll know from your experience coming to Japan.
350
00:19:16,299 –> 00:19:18,548
I mean, the culinary experience, the
351
00:19:18,549 –> 00:19:21,250
food experience here is phenomenal.
352
00:19:21,251 –> 00:19:23,742
The way that they process food, the way they prepare
353
00:19:23,743 –> 00:19:27,140
food, the way they grow food, agriculture is amazing.
354
00:19:28,070 –> 00:19:30,632
So there are lots of things and present it right.
355
00:19:30,633 –> 00:19:34,680
Like presentation is also very important to them.
356
00:19:34,681 –> 00:19:35,352
Exactly.
357
00:19:35,353 –> 00:19:39,640
So I think there are definitely time and a place.
358
00:19:40,650 –> 00:19:42,642
Again, as we said earlier, the yen
359
00:19:42,643 –> 00:19:44,412
is at a very unique time.
360
00:19:44,413 –> 00:19:45,516
It’s very weak at the moment.
361
00:19:45,517 –> 00:19:46,802
So that makes it very attractive
362
00:19:46,803 –> 00:19:50,550
for internationals to invest into Japan.
363
00:19:50,551 –> 00:19:52,118
And we’re definitely seeing an influx
364
00:19:52,119 –> 00:19:55,558
of new investment into Japan.
365
00:19:55,559 –> 00:19:58,096
Wouldn’t say a lot more M&A.
366
00:19:58,097 –> 00:19:59,862
But definitely a lot more joint
367
00:19:59,863 –> 00:20:03,242
ventures and placements and real estate
368
00:20:03,243 –> 00:20:06,452
sector, automotive sector, health sector, technology.
369
00:20:06,453 –> 00:20:08,240
Of course. Yeah.
370
00:20:08,850 –> 00:20:11,412
And would you say Japan is big
371
00:20:11,413 –> 00:20:13,780
on innovating in the areas of.
372
00:20:17,030 –> 00:20:21,208
You know, I recently posed this question at a speech that
373
00:20:21,209 –> 00:20:27,116
I made down in Australia: if you had to give
374
00:20:27,117 –> 00:20:31,590
one word to explain or define Australians versus Japanese.
375
00:20:32,650 –> 00:20:35,110
And the word for the Australians was inventive,
376
00:20:35,930 –> 00:20:38,310
and the word for Japanese was innovative.
377
00:20:39,850 –> 00:20:43,148
So when you take that away, you kind of
378
00:20:43,149 –> 00:20:46,402
think, okay, well, if a product could be developed
379
00:20:46,403 –> 00:20:49,602
somewhere offshore, but then you bring it to Japan,
380
00:20:49,603 –> 00:20:51,760
then Japan would just make it better.
381
00:20:53,010 –> 00:20:55,440
And in fact. Exactly.
382
00:20:57,570 –> 00:21:00,308
And it also reminds me a lot of the
383
00:21:00,309 –> 00:21:03,524
early days and success that Apple’s now enjoyed under.
384
00:21:03,525 –> 00:21:05,608
You know, Jobs made it very clear that one
385
00:21:05,609 –> 00:21:08,424
of the places he loved to come to in
386
00:21:08,425 –> 00:21:11,214
his early days of evolving Apple was Japan.
387
00:21:11,215 –> 00:21:13,890
So he would find that innovation.
388
00:21:14,970 –> 00:21:17,692
I guess you invent something, you innovate something.
389
00:21:17,693 –> 00:21:18,732
What’s the next thing?
390
00:21:18,733 –> 00:21:20,002
I guess it’s Apple.
391
00:21:20,003 –> 00:21:24,760
It’s taking and making it just that little bit better.
392
00:21:25,690 –> 00:21:28,528
Often people would make comments that a lot
393
00:21:28,529 –> 00:21:33,878
of what Apple created during Jobs’ success was attributed
394
00:21:33,879 –> 00:21:35,888
to a lot of the feedback that he
395
00:21:35,889 –> 00:21:39,070
got from the Japanese market, including Softbank.
396
00:21:39,071 –> 00:21:40,804
Softbank was very much engaged in
397
00:21:40,805 –> 00:21:42,874
the iPhone in its early stages.
398
00:21:42,875 –> 00:21:45,120
Models like four, five, six.
399
00:21:48,850 –> 00:21:49,572
Lots of.
400
00:21:49,573 –> 00:21:51,360
Lots of processes. Right.
401
00:21:52,470 –> 00:21:56,920
Like Kaizen, for example, have been invented in
402
00:21:56,921 –> 00:22:01,896
Japan, but then were adapted in different
403
00:22:01,897 –> 00:22:07,900
markets and iterated in the US, for example.
404
00:22:07,901 –> 00:22:08,172
Right.
405
00:22:08,173 –> 00:22:11,910
But the foundation was always in Japan.
406
00:22:12,650 –> 00:22:12,988
Yeah.
407
00:22:12,989 –> 00:22:17,430
I mean, I love Japan.
408
00:22:17,950 –> 00:22:18,832
I wouldn’t guess.
409
00:22:18,833 –> 00:22:20,670
I wouldn’t have guessed.
410
00:22:20,671 –> 00:22:23,664
But there are times where you
411
00:22:23,665 –> 00:22:27,210
feel that they miss some opportunities.
412
00:22:29,070 –> 00:22:30,788
You’ve been here, so you’ll appreciate it.
413
00:22:30,789 –> 00:22:33,434
But there are so many unique processes
414
00:22:33,435 –> 00:22:36,068
and procedures that if they could be
415
00:22:36,069 –> 00:22:39,920
adopted externally, people would be amazed at.
416
00:22:41,650 –> 00:22:42,244
Yeah.
417
00:22:42,245 –> 00:22:44,664
You know, going back to one of the
418
00:22:44,665 –> 00:22:46,488
original questions you asked, what I would
419
00:22:46,489 –> 00:22:49,810
love to see is Japan become global?
420
00:22:51,670 –> 00:22:55,288
I’m not sure exactly what it is, and I’m sure it
421
00:22:55,289 –> 00:22:58,812
will take more than just my focus and many other great
422
00:22:58,813 –> 00:23:01,932
people that I work with each day, but it would be
423
00:23:01,933 –> 00:23:05,842
great to just take something that is Japanese.
424
00:23:05,843 –> 00:23:08,060
And I think we’ve had this discussion before
425
00:23:08,061 –> 00:23:10,992
in our personal conversations that I think you
426
00:23:10,993 –> 00:23:14,006
have a bit of a passion for knives.
427
00:23:14,007 –> 00:23:16,416
Knives related. Right.
428
00:23:16,417 –> 00:23:20,740
So when you look at the amount of
429
00:23:20,741 –> 00:23:24,010
skill that goes into creating the blades, knives.
430
00:23:24,011 –> 00:23:27,108
And I just recently created a
431
00:23:27,109 –> 00:23:29,684
sake brand as a side interest.
432
00:23:29,685 –> 00:23:34,408
And for something that is made up of four
433
00:23:34,409 –> 00:23:37,528
or five ingredients, the amount of detail that the
434
00:23:37,529 –> 00:23:42,430
Japanese put into creating sake is just outstanding.
435
00:23:42,431 –> 00:23:48,290
So taking rice, water, some yeast and some mould
436
00:23:48,291 –> 00:23:51,244
and creating something that is purely bliss with a
437
00:23:51,245 –> 00:23:54,280
little bit of fruit additives is pretty.
438
00:23:55,930 –> 00:23:59,648
Hopefully we’ll be able to find something that we
439
00:23:59,649 –> 00:24:02,048
can take to the world and give people a
440
00:24:02,049 –> 00:24:05,210
great experience that is made in Japan.
441
00:24:05,870 –> 00:24:10,416
It’s so difficult with simple products.
442
00:24:10,417 –> 00:24:15,600
It’s the most difficult to make them extraordinary because
443
00:24:17,730 –> 00:24:24,168
you have to have huge focus on perfecting every
444
00:24:24,169 –> 00:24:27,768
little step of the way of the thing you
445
00:24:27,769 –> 00:24:30,968
are creating, like Italians did with pasta as well.
446
00:24:30,969 –> 00:24:34,980
A few ingredients or bread, you can have
447
00:24:35,850 –> 00:24:38,284
amazing and you can have a very bad
448
00:24:38,285 –> 00:24:41,340
bread, but the core ingredients are the same.
449
00:24:41,341 –> 00:24:45,240
It’s always the skill of the craftsman as well.
450
00:24:47,050 –> 00:24:51,164
It’s well said and I think hopefully we talked
451
00:24:51,165 –> 00:24:52,982
a lot about technology and the role that it plays.
452
00:24:52,983 –> 00:24:56,230
Hopefully we can use the technology, particularly generative
453
00:24:56,231 –> 00:25:00,516
and in the future, interactive AI, to help
454
00:25:00,517 –> 00:25:03,194
maintain some of those processes and procedures.
455
00:25:03,195 –> 00:25:07,010
Helping future generations, hopefully.
456
00:25:07,011 –> 00:25:08,084
What is this?
457
00:25:08,085 –> 00:25:11,002
And you get this very detailed explanation
458
00:25:11,003 –> 00:25:12,564
of the process and the procedure and
459
00:25:12,565 –> 00:25:14,180
the craftsmanship, as you said.
460
00:25:15,830 –> 00:25:18,856
But yeah, it’s an exciting time and
461
00:25:18,857 –> 00:25:22,158
hopefully Japan will be able to benefit.
462
00:25:22,159 –> 00:25:24,398
I mean, we’re definitely seeing the automotive industry
463
00:25:24,399 –> 00:25:25,676
is doing very well at the moment.
464
00:25:25,677 –> 00:25:27,884
As I mentioned earlier, health industry is
465
00:25:27,885 –> 00:25:29,628
doing well and I definitely think the
466
00:25:29,629 –> 00:25:31,132
technology industry will do well.
467
00:25:31,133 –> 00:25:37,648
We were having conversations with Lenovo about three, four months
468
00:25:37,649 –> 00:25:41,040
ago and I asked the question of, in terms of
469
00:25:41,041 –> 00:25:47,712
software creation for wearables like VR and AR type
470
00:25:47,713 –> 00:25:50,950
glasses, where are the best software engineers?
471
00:25:50,951 –> 00:25:52,502
Are they in India? Are they in China?
472
00:25:52,503 –> 00:25:53,178
Are they in America?
473
00:25:53,179 –> 00:25:54,170
Are they in Australia?
474
00:25:54,171 –> 00:25:55,642
And they said, no, they’re here in Japan.
475
00:25:55,643 –> 00:25:57,600
They’ve been in Japan for 20 years.
476
00:25:59,170 –> 00:26:01,674
We recently were with Dell.
477
00:26:01,675 –> 00:26:04,104
Dell did their Asia Pacific conference here
478
00:26:04,105 –> 00:26:07,190
in Yokohama, just outside of Tokyo.
479
00:26:07,191 –> 00:26:11,758
And again, everyone was just acknowledging Japan’s
480
00:26:11,759 –> 00:26:14,568
attention to detail, particularly from, say, Dell and
481
00:26:14,569 –> 00:26:17,218
Lenovo are very similar, but from Dell’s perspective
482
00:26:17,219 –> 00:26:24,402
about creating hardware and creating experiences that ultimately,
483
00:26:24,403 –> 00:26:26,332
when it comes to finishing it and making
484
00:26:26,333 –> 00:26:29,136
it just what it needs to be, those
485
00:26:29,137 –> 00:26:31,980
subtle nuances, Japan does very well.
486
00:26:34,670 –> 00:26:35,520
Yeah.
487
00:26:35,521 –> 00:26:40,374
Do you think technology can help those industries
488
00:26:40,375 –> 00:26:45,466
and areas of Japanese businesses which are unfortunately
489
00:26:45,467 –> 00:26:49,594
prone to a decreasing labour force?
490
00:26:49,595 –> 00:26:54,152
I’m talking about those traditional businesses, family run
491
00:26:54,153 –> 00:26:59,460
businesses, which I’m sure you’re aware that many
492
00:27:01,670 –> 00:27:08,090
founders struggle to expand because they don’t have.
493
00:27:08,091 –> 00:27:11,468
Maybe their sons or their family doesn’t want to
494
00:27:11,469 –> 00:27:15,116
take over and they don’t have anyone who has
495
00:27:15,117 –> 00:27:19,470
the right amount of skill to keep the legacy.
496
00:27:19,471 –> 00:27:20,320
Yeah, definitely.
497
00:27:20,321 –> 00:27:23,568
I think SoftBank has a legacy aspect, not
498
00:27:23,569 –> 00:27:27,450
only with mobile phones, but also with robotics.
499
00:27:28,670 –> 00:27:32,452
And definitely using technology to create some
500
00:27:32,453 –> 00:27:34,548
kind of robotic aspect or element to
501
00:27:34,549 –> 00:27:37,972
the process, I think is important.
502
00:27:37,973 –> 00:27:41,418
I think in the hospitality sector, Japanese have an incredible
503
00:27:41,419 –> 00:27:46,552
hospitality culturally in their DNA and obviously with
504
00:27:46,553 –> 00:27:49,380
COVID now hopefully well behind us.
505
00:27:50,150 –> 00:27:52,872
Tourism is a big part of key
506
00:27:52,873 –> 00:27:55,192
markets, including Japan, and we definitely have
507
00:27:55,193 –> 00:27:57,708
seen those numbers, but same thing.
508
00:27:57,709 –> 00:27:59,922
They cannot maintain the service levels
509
00:27:59,923 –> 00:28:02,140
for, I think it’s 30 million,
510
00:28:02,141 –> 00:28:04,050
approximately, tourists that come to Japan.
511
00:28:04,051 –> 00:28:05,628
It did in 19, actually, and I think
512
00:28:05,629 –> 00:28:08,652
already Japan looks like in 24 this year,
513
00:28:08,653 –> 00:28:10,240
they’ll probably go past 30 million.
514
00:28:10,241 –> 00:28:13,488
But there are challenges and demands that
515
00:28:13,489 –> 00:28:15,222
are placed on the service industry.
516
00:28:15,223 –> 00:28:17,936
And unfortunately, if someone doesn’t get the same
517
00:28:17,937 –> 00:28:20,896
experience, then it can drop away pretty quickly.
518
00:28:20,897 –> 00:28:23,044
So that’s where I think technology can help.
519
00:28:23,045 –> 00:28:25,716
If you look at, for example, check in, check
520
00:28:25,717 –> 00:28:29,044
out, if we talk hospitality, there’s certain automation you
521
00:28:29,045 –> 00:28:31,870
can do in some of the food preparation, perhaps.
522
00:28:34,290 –> 00:28:35,688
There’s lots of ways to do it.
523
00:28:35,689 –> 00:28:38,488
I think the reality is that a lot of people
524
00:28:38,489 –> 00:28:42,872
are still very scared about technology and the role that
525
00:28:42,873 –> 00:28:46,956
it should be playing, or meant to be playing, rather
526
00:28:46,957 –> 00:28:52,332
than how it can infringe on people’s livelihoods in terms
527
00:28:52,333 –> 00:28:56,508
of what someone does now, which technology might be able
528
00:28:56,509 –> 00:28:58,760
to do in the future.
529
00:28:59,770 –> 00:29:01,910
And we’ve definitely seen that with computer vision.
530
00:29:01,911 –> 00:29:04,368
Computer vision, obviously you can use it from a
531
00:29:04,369 –> 00:29:07,142
surveillance point of view, security point of view, identity
532
00:29:07,143 –> 00:29:09,392
point of view, access point of view.
533
00:29:09,393 –> 00:29:12,128
And in each case, those four that I just mentioned, they
534
00:29:12,129 –> 00:29:15,680
do have a huge or significant human element to them.
535
00:29:16,530 –> 00:29:19,562
But you can use computer vision, new technology, to effectively
536
00:29:19,563 –> 00:29:22,164
allow those people that are currently working in any of
537
00:29:22,165 –> 00:29:26,808
those industries to then focus on something else where the
538
00:29:26,809 –> 00:29:29,380
service level then gets put back up again.
539
00:29:29,910 –> 00:29:31,192
It’s just a case of whether
540
00:29:31,193 –> 00:29:34,382
people want to be service related.
541
00:29:34,383 –> 00:29:39,450
I think Japan, they are, as I said, built in their DNA.
542
00:29:39,451 –> 00:29:43,852
How do you educate those who are afraid of technology?
543
00:29:43,853 –> 00:29:50,110
How do you show them the opportunities it presents?
544
00:29:50,111 –> 00:29:53,328
Yeah, that’s like the best question so far.
545
00:29:53,329 –> 00:29:54,620
That’s the best one.
546
00:29:55,630 –> 00:29:57,702
It’s just demo and trial.
547
00:29:57,703 –> 00:30:00,060
I think once you get people to experience technology,
548
00:30:01,310 –> 00:30:03,588
it’s so important, and we stress this so much,
549
00:30:03,589 –> 00:30:06,388
how important it is to do the right demo
550
00:30:06,389 –> 00:30:08,770
experience and the right trial experience.
551
00:30:08,771 –> 00:30:11,684
However, if you don’t do it, then technology can
552
00:30:11,685 –> 00:30:14,602
be so overwhelming, it can be so intimidating.
553
00:30:14,603 –> 00:30:17,992
So I think as long as you can put technology into
554
00:30:17,993 –> 00:30:20,728
some form of a demo or trial, and it doesn’t have
555
00:30:20,729 –> 00:30:23,182
to be excessive, it can be three, four, five seconds.
556
00:30:23,183 –> 00:30:24,968
It doesn’t have to be three or four,
557
00:30:24,969 –> 00:30:26,792
five minutes, or 30 or 40 or 50 minutes.
558
00:30:26,793 –> 00:30:29,932
So I think when people then get to trial it
559
00:30:29,933 –> 00:30:32,908
and test it themselves a little bit, like we talked
560
00:30:32,909 –> 00:30:35,930
about with AI before, get more of an interactive.
561
00:30:35,931 –> 00:30:38,032
I think if you can create that in your
562
00:30:38,033 –> 00:30:42,976
experience, then all the barriers go down pretty quickly.
563
00:30:42,977 –> 00:30:46,512
But don’t you feel like not all technology
564
00:30:46,513 –> 00:30:52,970
which is now developed should be developed?
565
00:30:52,971 –> 00:30:59,130
I mean, there are some aspects to it that people worry
566
00:30:59,131 –> 00:31:06,440
that it will alienate us even further or make people.
567
00:31:06,441 –> 00:31:12,888
I’m talking mainly about metaverse and computer vision in
568
00:31:12,889 –> 00:31:18,092
a way that people will start living in those
569
00:31:18,093 –> 00:31:22,188
different worlds instead of focusing what they have here.
570
00:31:22,189 –> 00:31:23,340
Yeah, definitely.
571
00:31:23,341 –> 00:31:26,236
Again, a great question and very relevant point.
572
00:31:26,237 –> 00:31:30,512
I think with COVID effectively for two years, we all
573
00:31:30,513 –> 00:31:35,872
became kind of very isolated, very detached, and we came
574
00:31:35,873 –> 00:31:41,424
out of that period then thinking, well, how do we
575
00:31:41,425 –> 00:31:45,250
continue to use, how do we continue to be people?
576
00:31:45,251 –> 00:31:47,082
So, yeah, I think there is a danger.
577
00:31:47,083 –> 00:31:48,938
I think it has to be monitored.
578
00:31:48,939 –> 00:31:52,058
I think we at SoftBank particularly are very conscious
579
00:31:52,059 –> 00:31:55,678
of the human element, first and foremost, how it’s
580
00:31:55,679 –> 00:31:58,930
used, as long as it’s not having detriment.
581
00:32:00,310 –> 00:32:03,288
When we talk about generative AI, for example, you
582
00:32:03,289 –> 00:32:05,096
can ask any question, it gives you an answer.
583
00:32:05,097 –> 00:32:06,268
Now, whether that answer is right or
584
00:32:06,269 –> 00:32:07,836
wrong, it gives you an answer.
585
00:32:07,837 –> 00:32:10,604
So suddenly the thinking is taken away.
586
00:32:10,605 –> 00:32:15,468
So I do believe that all technology needs to have
587
00:32:15,469 –> 00:32:19,232
regulations and it needs to be governed to a certain
588
00:32:19,233 –> 00:32:22,832
degree, because otherwise, if the technology gets in the wrong
589
00:32:22,833 –> 00:32:26,860
hands, it can be used in a negative way.
590
00:32:28,110 –> 00:32:30,032
But again, from experience the last few
591
00:32:30,033 –> 00:32:33,962
years, I think every major technology player
592
00:32:33,963 –> 00:32:36,228
on the planet is very conscious of
593
00:32:36,229 –> 00:32:39,470
their social elements and social responsibility.
594
00:32:41,490 –> 00:32:46,062
And as a result, I think we slowed down in Covid.
595
00:32:46,063 –> 00:32:48,382
We haven’t actually gone faster.
596
00:32:48,383 –> 00:32:52,248
Even if you look at what’s happening with the
597
00:32:52,249 –> 00:32:54,878
current generative AI phase, there’s a lot of buzz,
598
00:32:54,879 –> 00:32:56,460
there’s a lot of interest around it.
599
00:32:56,461 –> 00:32:59,586
But if you did not have generative
600
00:32:59,587 –> 00:33:02,124
AI, what other technology is there?
601
00:33:02,125 –> 00:33:04,764
We haven’t really developed anything new.
602
00:33:04,765 –> 00:33:10,738
There’s nothing that is groundbreaking, if that’s
603
00:33:10,739 –> 00:33:12,000
the right word to be using.
604
00:33:12,001 –> 00:33:14,368
So I think there’s a lot of noise at the
605
00:33:14,369 –> 00:33:18,768
moment around AI in the sense that to go back
606
00:33:18,769 –> 00:33:20,992
to your question, I’m taking a few minutes to get
607
00:33:20,993 –> 00:33:23,900
there, but it really comes down to the education.
608
00:33:24,830 –> 00:33:27,588
And I think again, like we talked earlier with demo and
609
00:33:27,589 –> 00:33:30,548
trial, if you have the ability to educate people in the
610
00:33:30,549 –> 00:33:33,390
right way, I think you will have success with AI.
611
00:33:34,690 –> 00:33:36,984
But if you don’t then, as you said, it
612
00:33:36,985 –> 00:33:38,888
can be seen as very intimidating, it can be
613
00:33:38,889 –> 00:33:42,190
distracting, it can be seen as alienating.
614
00:33:42,191 –> 00:33:43,700
I think you use that word.
615
00:33:44,890 –> 00:33:47,948
So, yeah, it needs to be.
616
00:33:47,949 –> 00:33:49,644
But there’s an education process.
617
00:33:49,645 –> 00:33:54,108
And the hard thing is that the people that
618
00:33:54,109 –> 00:33:57,538
make decisions, the people that spend on technology versus
619
00:33:57,539 –> 00:33:59,760
the people that use it, people that need it,
620
00:33:59,761 –> 00:34:04,460
whether elderly or younger, it’s still very fragmented.
621
00:34:06,350 –> 00:34:10,292
But definitely technology used
622
00:34:10,293 –> 00:34:12,522
correctly is very exciting.
623
00:34:12,523 –> 00:34:17,108
So you don’t believe those headlines and
624
00:34:17,109 –> 00:34:19,898
some scientists screaming that we should slow
625
00:34:19,899 –> 00:34:23,902
down the development, which is exponential.
626
00:34:23,903 –> 00:34:27,112
If you think about how many more advancements and
627
00:34:27,113 –> 00:34:30,952
how many more releases of new versions are happening
628
00:34:30,953 –> 00:34:34,717
in a shorter and shorter period of time, don’t
629
00:34:34,718 –> 00:34:39,068
you feel like we are creating things in a
630
00:34:39,069 –> 00:34:45,371
faster pace than what we can even conceive and
631
00:34:45,372 –> 00:34:49,340
understand what kind of impacts it can have?
632
00:34:49,949 –> 00:34:51,333
Yeah, totally.
633
00:34:51,334 –> 00:34:52,976
I think we need to slow down.
634
00:34:52,977 –> 00:34:56,949
I think there are obvious social impacts
635
00:34:56,950 –> 00:35:01,820
in terms of what effect technology can have.
636
00:35:02,510 –> 00:35:04,596
For example, it can take people’s jobs away
637
00:35:04,597 –> 00:35:06,052
and then what does that person do?
638
00:35:06,053 –> 00:35:07,236
What’s their purpose in life?
639
00:35:07,237 –> 00:35:09,188
I think we have to be
640
00:35:09,189 –> 00:35:11,040
very conscious of the social issues.
641
00:35:11,650 –> 00:35:15,034
But as I said earlier, I think a lot of it’s
642
00:35:15,035 –> 00:35:18,078
in the explanation and a lot of it’s in the branding.
643
00:35:18,079 –> 00:35:21,048
If we look at AI, we had the
644
00:35:21,049 –> 00:35:24,168
term predictive at the start, then generative, and
645
00:35:24,169 –> 00:35:26,050
as I mentioned, now we have interactive.
646
00:35:26,950 –> 00:35:28,344
But if you were to choose any one of
647
00:35:28,345 –> 00:35:30,012
those three words, which is the one that’s probably
648
00:35:30,013 –> 00:35:33,164
most appealing, it’s probably interactive because I can be
649
00:35:33,165 –> 00:35:34,956
a part of it, I can shape it.
650
00:35:34,957 –> 00:35:36,652
Whereas generative was more about,
651
00:35:36,653 –> 00:35:37,936
okay, where’s this coming from?
652
00:35:37,937 –> 00:35:41,470
And predictive was kind of, that’s too fast.
653
00:35:41,471 –> 00:35:45,900
So I think it’s an interesting time to be where we are.
654
00:35:47,070 –> 00:35:49,584
I think to answer your question a bit more,
655
00:35:49,585 –> 00:35:51,168
I do think we need to regulate it.
656
00:35:51,169 –> 00:35:53,572
I think we have to slow down a little bit.
657
00:35:53,573 –> 00:35:55,680
I think it’s going a little bit too fast.
658
00:35:57,010 –> 00:35:59,380
I think I worry more about young people.
659
00:35:59,381 –> 00:36:00,484
I have a daughter, a ten year
660
00:36:00,485 –> 00:36:04,660
old daughter, who’s just amazing, but she
661
00:36:05,350 –> 00:36:07,540
comes in contact with so much technology.
662
00:36:08,150 –> 00:36:09,270
Exactly.
663
00:36:09,271 –> 00:36:12,926
And sometimes it’s just a personal remark,
664
00:36:12,927 –> 00:36:19,052
but sometimes that technology creates in her
665
00:36:19,053 –> 00:36:22,570
world the sense that everything should happen now.
666
00:36:22,571 –> 00:36:26,002
There’s no fact finding, there’s no discovery, there’s
667
00:36:26,003 –> 00:36:30,810
no learning, there’s no positives and negatives.
668
00:36:32,030 –> 00:36:34,030
It must happen now.
669
00:36:34,031 –> 00:36:40,432
So, effectively, she is growing too fast and
670
00:36:40,433 –> 00:36:44,948
not experiencing things. Don’t get me wrong,
671
00:36:44,949 –> 00:36:46,772
in hindsight, if you could have made something
672
00:36:46,773 –> 00:36:48,340
go faster 20 years ago, fine.
673
00:36:48,341 –> 00:36:50,308
But yeah, I do believe that we have
674
00:36:50,309 –> 00:36:53,672
to slow things down just a little bit.
675
00:36:53,673 –> 00:36:58,270
Isn’t it always the same for each generation?
676
00:36:58,271 –> 00:37:01,998
Each generation feels like the new generation.
677
00:37:01,999 –> 00:37:06,568
Times are just speeding up and you can’t keep up.
678
00:37:06,569 –> 00:37:10,812
And then you have those stories where you have
679
00:37:10,813 –> 00:37:14,268
parents who are not able to send you a
680
00:37:14,269 –> 00:37:15,986
text and they send you an MMS.
681
00:37:15,987 –> 00:37:18,448
I don’t know, like some funny-to-you
682
00:37:18,449 –> 00:37:21,370
stories, but for them it’s something normal.
683
00:37:22,430 –> 00:37:23,872
It’s just too much for them.
684
00:37:23,873 –> 00:37:26,170
It’s just too complex.
685
00:37:27,230 –> 00:37:30,960
They haven’t grown with it.
686
00:37:31,650 –> 00:37:35,962
I’m the generation where I was born
687
00:37:35,963 –> 00:37:38,618
before the Internet, okay, the Internet existed,
688
00:37:38,619 –> 00:37:40,004
but not for the masses, right?
689
00:37:40,005 –> 00:37:42,884
I had my first mobile phone when I
690
00:37:42,885 –> 00:37:45,096
was 14 and that was a big deal.
691
00:37:45,097 –> 00:37:48,580
I was just giving my friends in school
692
00:37:49,110 –> 00:37:52,136
the snake, Nokia Snake game to play and
693
00:37:52,137 –> 00:37:55,140
I was the coolest girl in the school.
694
00:37:56,090 –> 00:38:03,050
But we somehow could experience the progression of it and
695
00:38:03,051 –> 00:38:08,966
generation of your daughter, they have all those stimuli instantly
696
00:38:08,967 –> 00:38:12,940
attacking them and it’s so difficult for them, I think,
697
00:38:15,470 –> 00:38:19,568
to know the different type of living.
698
00:38:19,569 –> 00:38:23,710
Like experiencing, learning, not knowing.
699
00:38:24,450 –> 00:38:26,676
Yeah, that’s a really good point.
700
00:38:26,677 –> 00:38:29,498
Not knowing, experiencing, discovering.
701
00:38:29,499 –> 00:38:31,360
Maybe discovery is the good word.
702
00:38:35,810 –> 00:38:38,206
I think whether it be leaders,
703
00:38:38,207 –> 00:38:40,456
whether it be parents, same thing.
704
00:38:40,457 –> 00:38:45,858
Basically it’s our responsibility to try and maintain
705
00:38:45,859 –> 00:38:48,108
that environment where you can still discover and
706
00:38:48,109 –> 00:38:50,710
learn and have your ups and downs.
707
00:38:52,090 –> 00:38:56,348
I think the challenge that you have
708
00:38:56,349 –> 00:38:59,392
is there’s just so much going on.
709
00:38:59,393 –> 00:39:02,960
There’s so many devices, there’s so many different
710
00:39:02,961 –> 00:39:07,792
ways to get access to the information, whether
711
00:39:07,793 –> 00:39:11,028
that was web one, two, three, the future,
712
00:39:11,029 –> 00:39:12,682
four, five, six, et cetera.
713
00:39:12,683 –> 00:39:15,012
So I think each time that we
714
00:39:15,013 –> 00:39:17,412
go through these little, well, not little,
715
00:39:17,413 –> 00:39:22,430
they’re significant technology evolutions or adjustments.
716
00:39:22,950 –> 00:39:25,490
People have to keep moving and adjusting.
717
00:39:26,150 –> 00:39:27,480
You make a really good point.
718
00:39:27,481 –> 00:39:31,512
I think if you’re not aware of
719
00:39:31,513 –> 00:39:35,692
how to handle technology, then effectively technology
720
00:39:35,693 –> 00:39:38,332
almost, be careful what I say here.
721
00:39:38,333 –> 00:39:43,196
But technology almost becomes a detriment in the sense
722
00:39:43,197 –> 00:39:49,320
that people question, what is it that I do?
723
00:39:50,410 –> 00:39:51,750
Which is really sad.
724
00:39:52,430 –> 00:39:53,500
Yeah, exactly.
725
00:39:54,430 –> 00:39:57,216
You’re put on this earth to live for
726
00:39:57,217 –> 00:40:00,050
60, 70, 80, 90 years, et cetera.
727
00:40:00,051 –> 00:40:03,170
And during that time you leave a mark.
728
00:40:03,171 –> 00:40:04,868
Whereas now you’re leaving a
729
00:40:04,869 –> 00:40:08,160
code or you’re leaving data.
730
00:40:08,930 –> 00:40:10,560
You’re leaving data. Exactly.
731
00:40:11,890 –> 00:40:15,128
So when you look at, for example, like what
732
00:40:15,129 –> 00:40:18,478
we can do now with 3D renderings, and it’s
733
00:40:18,479 –> 00:40:21,620
incredible, there’s so many things that are going on,
734
00:40:22,550 –> 00:40:26,200
but I do believe we have the ability to.
735
00:40:27,130 –> 00:40:28,748
I guess the crux of this part of the
736
00:40:28,749 –> 00:40:34,418
conversation is to slow down a little bit, perhaps
737
00:40:34,419 –> 00:40:40,368
a lot, actually, and try and then use technology
738
00:40:40,369 –> 00:40:42,460
and how it can make our lives better.
739
00:40:44,270 –> 00:40:45,232
We talked about before,
740
00:40:45,233 –> 00:40:46,768
like manufacturing, for example.
741
00:40:46,769 –> 00:40:48,144
Nobody wants to be sitting in a
742
00:40:48,145 –> 00:40:51,498
factory and putting themselves into stressful situations
743
00:40:51,499 –> 00:40:54,506
on their bodies, for example, lifting items,
744
00:40:54,507 –> 00:40:56,762
changing machinery, changing parts.
745
00:40:56,763 –> 00:40:59,828
So again, I think using technology in
746
00:40:59,829 –> 00:41:03,786
the right way can be really exciting.
747
00:41:03,787 –> 00:41:05,940
But I had the experience the other day.
748
00:41:06,470 –> 00:41:09,528
I drove somewhere, and I’m listening the whole way to
749
00:41:09,529 –> 00:41:12,568
Google Maps, or I’m watching my GPS, and if
750
00:41:12,569 –> 00:41:15,464
you told me now, no maps get home, I’d be
751
00:41:15,465 –> 00:41:17,060
like, I don’t know how to get home.
752
00:41:17,610 –> 00:41:20,508
You cannot read a normal map on paper.
753
00:41:20,509 –> 00:41:22,188
No, you know what I mean?
754
00:41:22,189 –> 00:41:24,012
It wasn’t that long ago that you
755
00:41:24,013 –> 00:41:25,480
had a book in your car.
756
00:41:26,890 –> 00:41:30,758
My daughter said to me the other day, and it reminded
757
00:41:30,759 –> 00:41:34,208
me of when I was a kid, if I had a
758
00:41:34,209 –> 00:41:37,168
question, there were a set of encyclopaedias on the wall.
759
00:41:37,169 –> 00:41:38,190
Yes.
760
00:41:38,191 –> 00:41:41,652
And you would go to there, or my
761
00:41:41,653 –> 00:41:43,956
father would help me find the answer.
762
00:41:43,957 –> 00:41:46,612
Whereas now it’s okay, Google,
763
00:41:46,613 –> 00:41:48,320
what’s the answer to this?
764
00:41:51,090 –> 00:41:53,274
Which you can hear in the background?
765
00:41:53,275 –> 00:41:55,576
I have Google set up through our house.
766
00:41:55,577 –> 00:41:57,480
So maybe I’m the problem.
767
00:41:57,481 –> 00:41:59,368
Maybe I’m the one that has created an
768
00:41:59,369 –> 00:42:01,992
environment where my daughter doesn’t have to ask
769
00:42:01,993 –> 00:42:03,890
those questions of a papa anymore.
770
00:42:05,610 –> 00:42:08,236
No, but that’s what we were saying,
771
00:42:08,237 –> 00:42:12,834
that we want to feel useful.
772
00:42:12,835 –> 00:42:16,332
And through learning, through our
773
00:42:16,333 –> 00:42:18,860
wisdom, we give this value. Right.
774
00:42:19,870 –> 00:42:26,688
If technology takes it away, where is our
775
00:42:26,689 –> 00:42:33,090
place going to be, where we will excel,
776
00:42:33,091 –> 00:42:35,200
where we will be better than technology?
777
00:42:35,890 –> 00:42:37,204
Excuse me.
778
00:42:37,205 –> 00:42:40,570
Yeah, it’s a fantastic consideration.
779
00:42:40,571 –> 00:42:44,440
I think in my family, I have a
780
00:42:44,441 –> 00:42:49,182
beautiful sister who’s an amazing designer and spends
781
00:42:49,183 –> 00:42:52,286
time thinking about something in all the elements
782
00:42:52,287 –> 00:42:55,038
and colorings and structure, et cetera.
783
00:42:55,039 –> 00:42:57,612
Whereas in the last couple of years, we’ve seen
784
00:42:57,613 –> 00:43:04,658
AI tools emerge so quickly where effectively a designer,
785
00:43:04,659 –> 00:43:09,400
a creator, a marketer, can effectively be replaced by
786
00:43:10,350 –> 00:43:12,490
the ability to use the tools.
787
00:43:13,630 –> 00:43:17,328
But even in that situation, I think that’s what we have
788
00:43:17,329 –> 00:43:19,488
to look at, is these are tools, and if we can
789
00:43:19,489 –> 00:43:22,266
use those tools, we can become more efficient, more effective.
790
00:43:22,267 –> 00:43:23,840
That’s the most important thing.
791
00:43:24,610 –> 00:43:29,812
The longer that you take the stance, I guess, to
792
00:43:29,813 –> 00:43:35,810
not use the tools, then that’s probably where that anxiety,
793
00:43:36,390 –> 00:43:39,080
that feeling of, wow, what do I do?
794
00:43:39,081 –> 00:43:40,580
Comes up very quickly.
795
00:43:43,350 –> 00:43:47,996
But still, we still, I think,
796
00:43:47,997 –> 00:43:53,042
place value on conversation and interaction.
797
00:43:53,043 –> 00:43:54,092
That’s why I keep coming
798
00:43:54,093 –> 00:43:55,682
back to this interactive element.
799
00:43:55,683 –> 00:44:00,822
I think if we can create technology that still requires
800
00:44:00,823 –> 00:44:04,368
the human element to be commanding it and interacting with
801
00:44:04,369 –> 00:44:07,254
it, then I think that can be really exciting.
802
00:44:07,255 –> 00:44:09,744
Yeah, but that’s what you said.
803
00:44:09,745 –> 00:44:10,922
We are excelling.
804
00:44:10,923 –> 00:44:12,132
We are good at.
805
00:44:12,133 –> 00:44:12,308
Right.
806
00:44:12,309 –> 00:44:15,268
Like communication and relationships with other people.
807
00:44:15,269 –> 00:44:21,010
But what if we teach machines to do this for us?
808
00:44:21,011 –> 00:44:25,534
Don’t we end up with machines conversing
809
00:44:25,535 –> 00:44:28,382
and interacting with other machines and leaving
810
00:44:28,383 –> 00:44:32,366
us somewhere as a robot?
811
00:44:32,367 –> 00:44:35,228
Actually, as a recipient of the
812
00:44:35,229 –> 00:44:37,522
result of thinking and conversing?
813
00:44:37,523 –> 00:44:41,346
Yeah, it’s such an interesting conversation.
814
00:44:41,347 –> 00:44:46,010
And again, not to digress too far, but we should.
815
00:44:46,011 –> 00:44:51,136
In my family, on my wife’s side, I still have two
816
00:44:51,137 –> 00:44:54,368
beautiful in laws, mother in law and father in law, who
817
00:44:54,369 –> 00:44:57,660
are still alive, and they’re now well into their eighties.
818
00:44:58,510 –> 00:44:59,970
A good example.
819
00:44:59,971 –> 00:45:02,906
Recently, my mother in law had an accident
820
00:45:02,907 –> 00:45:06,930
which required some reconstruction of her hips.
821
00:45:06,931 –> 00:45:09,076
But now she walks better, but
822
00:45:09,077 –> 00:45:12,266
she’s effectively become like a robot.
823
00:45:12,267 –> 00:45:15,710
Basically, parts of her body have been replaced.
824
00:45:15,711 –> 00:45:19,592
And she’s the first to say that the
825
00:45:19,593 –> 00:45:22,216
feeling and the discomfort that she had before
826
00:45:22,217 –> 00:45:25,210
the accident to now is better.
827
00:45:25,211 –> 00:45:31,450
So again, it’s quite scary to think that
828
00:45:31,451 –> 00:45:37,612
we actually become the robots which we’re already
829
00:45:37,613 –> 00:45:40,656
seeing, that in different phases of our lives,
830
00:45:40,657 –> 00:45:44,016
people are living longer, we’re using technology to
831
00:45:44,017 –> 00:45:47,232
procure food differently, we’re moving more
832
00:45:47,233 –> 00:45:51,680
to plant-based production versus animal-based.
833
00:45:52,930 –> 00:45:54,602
If I was an animal, I’d be worried.
834
00:45:54,603 –> 00:45:56,596
Okay, well, I know it wasn’t a great.
835
00:45:56,597 –> 00:45:57,680
What’s my value?
836
00:46:01,010 –> 00:46:03,752
It’s probably not just people asking the same question.
837
00:46:03,753 –> 00:46:06,488
I think there’s lots of things that are asking the
838
00:46:06,489 –> 00:46:11,918
same question, but the overall question that you’re posing.
839
00:46:11,919 –> 00:46:14,280
Yeah, I think we definitely need to be
840
00:46:14,281 –> 00:46:18,338
conscious and aware of technology and the role
841
00:46:18,339 –> 00:46:20,508
that it plays in our lives, that it
842
00:46:20,509 –> 00:46:23,714
doesn’t make us feel like we’re not relevant.
843
00:46:23,715 –> 00:46:28,048
Yeah, but what kind of world would you
844
00:46:28,049 –> 00:46:30,380
imagine for your daughter to live in?
845
00:46:32,350 –> 00:46:34,460
Hopefully a happy and safe one.
846
00:46:35,550 –> 00:46:40,676
One that she can make decisions based on
847
00:46:40,677 –> 00:46:45,012
her own thinking and influences that people like
848
00:46:45,013 –> 00:46:50,510
myself, my wife, her family, friends have shaped.
849
00:46:51,330 –> 00:46:54,312
I think if it’s purely just command based,
850
00:46:54,313 –> 00:46:59,112
then that certainly makes living a lot more
851
00:46:59,113 –> 00:47:01,410
sedentary, and that’s definitely not healthy.
852
00:47:02,390 –> 00:47:04,088
Again, if we think about how we can
853
00:47:04,089 –> 00:47:07,740
use technology for her to help her become
854
00:47:07,741 –> 00:47:10,722
hopefully more active, not become again sedentary.
855
00:47:10,723 –> 00:47:13,004
Watching games and content.
856
00:47:13,005 –> 00:47:14,924
And more content. And more content.
857
00:47:14,925 –> 00:47:16,120
And more content.
858
00:47:17,850 –> 00:47:22,768
So the thing I worry for younger people
859
00:47:22,769 –> 00:47:25,526
is just, you said it earlier, the speed.
860
00:47:25,527 –> 00:47:29,638
I mean, there’s just so much going on, and they’re
861
00:47:29,639 –> 00:47:34,548
influenced so much by what they hear, what they see,
862
00:47:34,549 –> 00:47:38,596
what they think they should be doing to a point
863
00:47:38,597 –> 00:47:42,068
where when you pause just for a moment and ask
864
00:47:42,069 –> 00:47:45,288
a simple question, what do you want to do?
865
00:47:45,289 –> 00:47:48,420
The response, unfortunately, is, I don’t know.
866
00:47:48,950 –> 00:47:51,480
Whereas you don’t explore it.
867
00:47:51,481 –> 00:47:53,880
Yeah, it wasn’t that long ago where.
868
00:47:53,881 –> 00:47:56,744
Well, if you don’t know, go and find out.
869
00:47:56,745 –> 00:47:59,612
Go and try and find it. And. Okay.
870
00:47:59,613 –> 00:48:01,452
Yeah, you might get hurt a little bit, or you
871
00:48:01,453 –> 00:48:05,964
might encounter a challenge, or you find an incredible success.
872
00:48:05,965 –> 00:48:12,672
But definitely technology could take
873
00:48:12,673 –> 00:48:14,060
away a lot of that.
874
00:48:15,070 –> 00:48:19,456
So what’s the antidote, apart from slowing down?
875
00:48:19,457 –> 00:48:23,412
Because I think, in a way, maybe you feel so, too.
876
00:48:23,413 –> 00:48:26,980
It’s wishful thinking, because if you think of
877
00:48:26,981 –> 00:48:33,406
other countries, like China, Europe in general, or the UK,
878
00:48:33,407 –> 00:48:37,590
the US, they all compete against each other.
879
00:48:37,591 –> 00:48:41,460
They all want to be... Nice sound.
880
00:48:42,550 –> 00:48:44,088
I can hear it.
881
00:48:44,089 –> 00:48:45,512
Do you want to take it?
882
00:48:45,513 –> 00:48:46,680
No, it’s okay.
883
00:48:47,930 –> 00:48:48,508
There you go.
884
00:48:48,509 –> 00:48:53,868
There’s a piece of technology that’s the home experience
885
00:48:53,869 –> 00:48:56,514
here where it’s heating up my bathtub and telling
886
00:48:56,515 –> 00:48:58,252
me that it’s ready to jump into.
887
00:48:58,253 –> 00:48:59,702
Oh, I love those Japanese.
888
00:48:59,703 –> 00:49:02,342
I love those Japanese inventions.
889
00:49:02,343 –> 00:49:05,440
Although it’s been there for years, I remember
890
00:49:05,441 –> 00:49:08,832
when I was there last year, in
891
00:49:08,833 –> 00:49:12,980
Nagoya, and I rented this apartment and I could
892
00:49:12,981 –> 00:49:17,908
control from the kitchen the water temperature and all
893
00:49:17,909 –> 00:49:19,546
this stuff from my bathroom.
894
00:49:19,547 –> 00:49:21,946
And it was telling me, like, yeah, in Japanese.
895
00:49:21,947 –> 00:49:25,990
But I kind of tried to translate, your water is ready.
896
00:49:25,991 –> 00:49:26,740
Exactly.
897
00:49:27,350 –> 00:49:28,340
So cool.
898
00:49:30,070 –> 00:49:30,856
Again.
899
00:49:30,857 –> 00:49:33,048
There’s a good example of
900
00:49:33,049 –> 00:49:35,208
where technology can be really...
901
00:49:35,209 –> 00:49:37,106
Yeah, but it’s not invasive.
902
00:49:37,107 –> 00:49:38,652
You can turn it off.
903
00:49:38,653 –> 00:49:39,400
Correct.
904
00:49:41,130 –> 00:49:44,940
To go back to your question, I think anyone
905
00:49:44,941 –> 00:49:48,144
who’s involved in technology, yourself, myself, lots of the
906
00:49:48,145 –> 00:49:52,016
people that we interact with every day, and maybe
907
00:49:52,017 –> 00:49:55,216
even people who watch this in the future, is
908
00:49:55,217 –> 00:49:57,050
that we have a responsibility.
909
00:49:57,710 –> 00:50:01,066
It’s up to us to put certain parameters
910
00:50:01,067 –> 00:50:06,580
in place where we create technology that definitely
911
00:50:06,581 –> 00:50:10,922
helps and hopefully makes lives easier, et cetera.
912
00:50:10,923 –> 00:50:13,200
But we also still have to sort of.
913
00:50:15,410 –> 00:50:17,128
I don’t like to use the word, but we
914
00:50:17,129 –> 00:50:19,220
definitely need to put the controls in place.
915
00:50:22,150 –> 00:50:23,368
I don’t think we have the right to
916
00:50:23,369 –> 00:50:24,888
control people or what they want to do.
917
00:50:24,889 –> 00:50:26,748
If people want more technology, they
918
00:50:26,749 –> 00:50:27,916
have the right to do that.
919
00:50:27,917 –> 00:50:29,788
But I think we also have
920
00:50:29,789 –> 00:50:32,920
the responsibility to slow things down.
921
00:50:34,010 –> 00:50:37,388
And I definitely think there’s a movement around that.
922
00:50:37,389 –> 00:50:39,808
I think Covid, most people would say
923
00:50:39,809 –> 00:50:41,820
they felt like they really slowed down.
924
00:50:42,910 –> 00:50:46,250
But if you actually look at it, maybe Covid,
925
00:50:47,230 –> 00:50:49,744
for all the negatives, the positive was that it
926
00:50:49,745 –> 00:50:52,220
did force us to slow down a little bit.
927
00:50:53,310 –> 00:50:54,698
It made us stay remote.
928
00:50:54,699 –> 00:50:58,724
It kept us in different positions, but then we’ve come out
929
00:50:58,725 –> 00:51:01,338
of it and now we’re going fast, so it feels faster.
930
00:51:01,339 –> 00:51:04,824
But really the evolution is on track as
931
00:51:04,825 –> 00:51:06,408
to what it probably would have been in
932
00:51:06,409 –> 00:51:10,020
any other industrial era or something like that.
933
00:51:11,270 –> 00:51:13,208
But, yeah, the point is, I think we
934
00:51:13,209 –> 00:51:17,308
need to be aware of making sure that
935
00:51:17,309 –> 00:51:20,040
technology doesn’t just phase everybody out.
936
00:51:22,010 –> 00:51:24,920
That’s the trickiest bit, I guess.
937
00:51:25,930 –> 00:51:27,000
Yeah, it is.
938
00:51:27,870 –> 00:51:32,384
Particularly something to think about is that if that
939
00:51:32,385 –> 00:51:37,630
moral stance is such a strong part of us
940
00:51:37,631 –> 00:51:41,748
as people involved in technology, that perhaps we have
941
00:51:41,749 –> 00:51:44,372
the option to not be involved in technology and
942
00:51:44,373 –> 00:51:48,180
be involved in more wholesome type activities, for example,
943
00:51:48,181 –> 00:51:53,570
agriculture or manufacturing, et cetera.
944
00:51:54,630 –> 00:51:57,608
But again, I think in terms of the
945
00:51:57,609 –> 00:52:01,112
role that technology is playing, and I keep
946
00:52:01,113 –> 00:52:04,744
coming back to this, AI, particularly as a
947
00:52:04,745 –> 00:52:08,690
subject matter, is predictive, generative and interactive.
948
00:52:08,691 –> 00:52:13,052
I think if we can be more focused on the
949
00:52:13,053 –> 00:52:17,240
interactive and allow people the opportunity to shape the technology,
950
00:52:20,490 –> 00:52:23,408
put the parameters in place, put the governance in place,
951
00:52:23,409 –> 00:52:24,860
then I think it will be fine.
952
00:52:26,110 –> 00:52:28,288
And I definitely think we saw that recently, as you know,
953
00:52:28,289 –> 00:52:31,318
we saw that with Sam Altman and what that slight
954
00:52:31,319 –> 00:52:37,492
challenge that he had was communicated both ways, whether
955
00:52:37,493 –> 00:52:40,378
Microsoft wanted to go in a certain direction or Sam
956
00:52:40,379 –> 00:52:41,754
wanted to go in a certain direction.
957
00:52:41,755 –> 00:52:45,272
But they were fortunate that they worked it out.
958
00:52:45,273 –> 00:52:47,086
They probably had different agendas.
959
00:52:47,087 –> 00:52:50,776
But credit, where credit’s due to both Sam and
960
00:52:50,777 –> 00:52:52,728
Microsoft, that they were able to sort of come
961
00:52:52,729 –> 00:52:56,760
back to the table and put parameters in place.
962
00:52:59,050 –> 00:53:01,370
The lawyers did it for them.
963
00:53:01,371 –> 00:53:02,570
Exactly.
964
00:53:02,571 –> 00:53:04,338
It could be that it takes the lawyers.
965
00:53:04,339 –> 00:53:06,572
That’s why we create the legal system to
966
00:53:06,573 –> 00:53:09,560
effectively put governance, et cetera, in place.
967
00:53:10,090 –> 00:53:15,766
I think even with Zuckerberg, particularly, you know, Facebook
968
00:53:15,767 –> 00:53:18,704
was 'face', which was a very sensitive word to
969
00:53:18,705 –> 00:53:21,366
be used, and they’ve evolved to become Meta.
970
00:53:21,367 –> 00:53:25,268
And then even with the use of data and a lot
971
00:53:25,269 –> 00:53:29,908
of the philanthropy that not only Zuckerberg has done, Gates and
972
00:53:29,909 –> 00:53:34,552
lots of other philanthropists around the world, that they
973
00:53:34,553 –> 00:53:39,970
give back a lot of what they create fiscally.
974
00:53:41,270 –> 00:53:42,824
A lot of it’s media, too.
975
00:53:42,825 –> 00:53:44,984
We don’t get to hear a lot of these really
976
00:53:44,985 –> 00:53:48,908
amazing stories of philanthropy and what people like Gates have
977
00:53:48,909 –> 00:53:53,308
done for years, and even Musk is doing and all
978
00:53:53,309 –> 00:53:57,800
of these amazing people in the world.
979
00:53:59,150 –> 00:54:00,528
One thing that comes to mind, as I
980
00:54:00,529 –> 00:54:04,752
had this conversation with you, is I do
981
00:54:04,753 –> 00:54:06,890
think we need more of a female perspective.
982
00:54:09,630 –> 00:54:12,576
Not only that; minorities in general, right?
983
00:54:12,577 –> 00:54:14,666
Like people with different perspectives
984
00:54:14,667 –> 00:54:16,020
on things and problems.
985
00:54:16,021 –> 00:54:16,484
Yeah.
986
00:54:16,485 –> 00:54:18,196
But if you look at, like I just mentioned
987
00:54:18,197 –> 00:54:21,620
four or five names, they’re all men, of course.
988
00:54:21,621 –> 00:54:22,964
Why do you think is that?
989
00:54:22,965 –> 00:54:24,744
I don’t know the answer to the question.
990
00:54:24,745 –> 00:54:27,752
Maybe at the time and place the opportunities were
991
00:54:27,753 –> 00:54:30,792
different from what they are today.
992
00:54:30,793 –> 00:54:32,488
It’s definitely the case.
993
00:54:32,489 –> 00:54:34,792
I’m excited at the same time for my
994
00:54:34,793 –> 00:54:39,468
daughter to be equal, to not have to
995
00:54:39,469 –> 00:54:43,590
worry about male prejudice or chauvinistic attitudes.
996
00:54:44,410 –> 00:54:46,380
But I still believe it would be great
997
00:54:46,381 –> 00:54:51,050
for us to have more female perspectives.
998
00:54:51,790 –> 00:54:53,152
I genuinely mean that.
999
00:54:53,153 –> 00:54:58,590
I think we are.
1000
00:54:58,591 –> 00:55:01,904
I don’t know the exact number, but it’s like 50-50.
1001
00:55:01,905 –> 00:55:04,960
It’s not like it’s 80% men and 20% women.
1002
00:55:05,810 –> 00:55:08,538
This is obviously a very delicate
1003
00:55:08,539 –> 00:55:09,892
subject for some people to talk about.
1004
00:55:09,893 –> 00:55:12,728
But for me personally, I would just
1005
00:55:12,729 –> 00:55:14,958
love to see more female perspectives.
1006
00:55:14,959 –> 00:55:16,980
I appreciate you saying that.
1007
00:55:17,590 –> 00:55:19,620
No, it is.
1008
00:55:20,470 –> 00:55:22,610
It’s not just about cosmetics.
1009
00:55:23,350 –> 00:55:24,952
When I say cosmetics, I’m referring to,
1010
00:55:24,953 –> 00:55:29,100
like, the appearance because it’s everything from
1011
00:55:29,101 –> 00:55:32,550
the presentation to the nuances.
1012
00:55:33,690 –> 00:55:37,068
And when I think about my last few years with
1013
00:55:37,069 –> 00:55:41,950
Softbank, and it’s been an incredible journey so far, is that
1014
00:55:41,951 –> 00:55:46,510
a lot of the successful products really took into account
1015
00:55:46,511 –> 00:55:50,330
both sides, not just a male perspective.
1016
00:55:51,090 –> 00:55:52,756
And then, of course, it took into account age
1017
00:55:52,757 –> 00:55:55,706
perspectives, younger, older, et cetera, et cetera.
1018
00:55:55,707 –> 00:55:58,612
So I think, again, when you put technology
1019
00:55:58,613 –> 00:56:03,990
with all of this global gender equality,
1020
00:56:03,991 –> 00:56:06,152
it’s a lot to take in.
1021
00:56:06,153 –> 00:56:11,112
Of course, it’s a really unique time.
1022
00:56:11,113 –> 00:56:14,900
So perhaps that’s also the other thing.
1023
00:56:16,010 –> 00:56:17,836
Men and women traditionally have
1024
00:56:17,837 –> 00:56:19,762
different value sets, et cetera.
1025
00:56:19,763 –> 00:56:22,860
And men who have shaped technology
1026
00:56:22,861 –> 00:56:25,160
feel they have to create something.
1027
00:56:26,090 –> 00:56:27,800
That’s how men prove themselves.
1028
00:56:28,510 –> 00:56:28,976
Yeah.
1029
00:56:28,977 –> 00:56:31,280
Whereas if we were to bring in more of a
1030
00:56:31,281 –> 00:56:35,712
female, a woman’s perspective, then perhaps it would be, as
1031
00:56:35,713 –> 00:56:40,832
you’re expressing today: why don’t we slow down?
1032
00:56:40,833 –> 00:56:42,752
And again, I have to be careful, of
1033
00:56:42,753 –> 00:56:45,172
course, because this gets put out there.
1034
00:56:45,173 –> 00:56:47,908
But my understanding is, and again, I don’t think
1035
00:56:47,909 –> 00:56:50,308
I’m speaking out of turn here, but when you
1036
00:56:50,309 –> 00:56:53,898
look at this recent altercation that existed with Microsoft
1037
00:56:53,899 –> 00:56:57,608
and OpenAI, it also came down to this.
1038
00:56:57,609 –> 00:57:00,846
There was a strong undertone of gender
1039
00:57:00,847 –> 00:57:04,062
in terms of male sentiment versus female
1040
00:57:04,063 –> 00:57:05,874
sentiment, et cetera, et cetera.
1041
00:57:05,875 –> 00:57:07,804
But again, credit where credit’s due, they
1042
00:57:07,805 –> 00:57:11,640
were able to find the sense.
1043
00:57:13,210 –> 00:57:15,468
Hopefully I’m making some sense here.
1044
00:57:15,469 –> 00:57:16,204
It does.
1045
00:57:16,205 –> 00:57:22,896
I had this thought that maybe the way Japan has
1046
00:57:22,897 –> 00:57:29,472
managed to have those two different worlds coexist in a
1047
00:57:29,473 –> 00:57:36,916
perfect harmony like the old traditional world where craftsmanship and
1048
00:57:36,917 –> 00:57:42,506
this wabi-sabi and the concept of slow life exist
1049
00:57:42,507 –> 00:57:48,872
and is celebrated versus the futuristic world.
1050
00:57:48,873 –> 00:57:51,496
Maybe we should look up to how
1051
00:57:51,497 –> 00:57:53,944
Japan has managed to do that.
1052
00:57:53,945 –> 00:57:56,268
Of course, like we discussed, there are
1053
00:57:56,269 –> 00:57:59,436
many aspects which could be improved and
1054
00:57:59,437 –> 00:58:03,954
could be more open to renegotiation.
1055
00:58:03,955 –> 00:58:08,096
But when we think about what’s going to happen and how
1056
00:58:08,097 –> 00:58:13,008
we are trying to find our value and our place in
1057
00:58:13,009 –> 00:58:18,608
the world in this fast paced and ever changing technology, I
1058
00:58:18,609 –> 00:58:21,828
guess maybe Japan has some learnings for us.
1059
00:58:21,829 –> 00:58:23,284
Yeah, it does.
1060
00:58:23,285 –> 00:58:27,250
And again, for anyone that watches this after we’ve
1061
00:58:27,251 –> 00:58:33,048
finished, I think there’s approximately 200 countries on
1062
00:58:33,049 –> 00:58:36,638
the planet, but there’s five or ten that dominate.
1063
00:58:36,639 –> 00:58:42,184
Japan definitely brings an easiness, a sense of
1064
00:58:42,185 –> 00:58:45,928
cultural aspect into everything that they do.
1065
00:58:45,929 –> 00:58:48,364
There’s very few products that are created here
1066
00:58:48,365 –> 00:58:50,498
that are not really conscious of the consumer
1067
00:58:50,499 –> 00:58:53,468
in terms of functionality and features and benefits.
1068
00:58:53,469 –> 00:58:55,612
And it’s really everything is
1069
00:58:55,613 –> 00:58:56,946
created around the consumer.
1070
00:58:56,947 –> 00:58:59,290
Everything is about functionality.
1071
00:59:00,750 –> 00:59:05,520
At the same time, there are certain things that probably
1072
00:59:05,521 –> 00:59:07,850
need to be created a bit more for the masses,
1073
00:59:10,190 –> 00:59:13,172
and Japan could probably take some learnings from that.
1074
00:59:13,173 –> 00:59:17,274
We said it earlier, I think there are different phases.
1075
00:59:17,275 –> 00:59:17,828
Right?
1076
00:59:17,829 –> 00:59:20,650
There’s the inventive phase, the innovative phase,
1077
00:59:20,651 –> 00:59:23,752
I guess, what’s the other word?
1078
00:59:23,753 –> 00:59:25,890
Basically the satisfaction phase.
1079
00:59:28,550 –> 00:59:34,408
But maybe it’s the intuitive phase where we
1080
00:59:34,409 –> 00:59:37,458
take common sense, not machine learning, not AI.
1081
00:59:37,459 –> 00:59:39,500
We take common sense to
1082
00:59:39,501 –> 00:59:42,810
effectively define the final outcome.
1083
00:59:42,811 –> 00:59:46,108
I think we’re kind of having the same conversation here,
1084
00:59:46,109 –> 00:59:52,670
but it really is that we have to be responsible for
1085
00:59:52,671 –> 00:59:57,020
how technology plays a part in our lives.
1086
00:59:58,990 –> 01:00:04,490
Which is not something that America
1087
01:00:04,491 –> 01:00:07,236
and Silicon Valley usually advocate for, right?
1088
01:00:07,237 –> 01:00:11,434
Like the 'move fast and break things' kind of movement.
1089
01:00:11,435 –> 01:00:11,812
Yeah.
1090
01:00:11,813 –> 01:00:13,652
Well, there’s, I think the most common
1091
01:00:13,653 –> 01:00:15,700
expression at the moment is fail fast.
1092
01:00:16,470 –> 01:00:17,768
Fail fast.
1093
01:00:17,769 –> 01:00:19,608
That’s a common expression that I hear when
1094
01:00:19,609 –> 01:00:23,560
I have the opportunity to travel or meet
1095
01:00:23,561 –> 01:00:26,010
particularly people from the United States.
1096
01:00:26,011 –> 01:00:29,404
I guess there is so much at stake right now, right.
1097
01:00:29,405 –> 01:00:32,572
Those new technologies being created are much
1098
01:00:32,573 –> 01:00:36,514
more powerful than what we were dealing
1099
01:00:36,515 –> 01:00:39,808
with earlier. Or wouldn’t you say so?
1100
01:00:39,809 –> 01:00:41,690
I think it’s like mobile phones.
1101
01:00:43,390 –> 01:00:44,592
Most people, if you ask
1102
01:00:44,593 –> 01:00:45,808
the question, what is technology?
1103
01:00:45,809 –> 01:00:47,850
They would say, oh, software, hardware.
1104
01:00:48,430 –> 01:00:51,200
That’s generally the two responses, right.
1105
01:00:51,730 –> 01:00:53,348
We’ve now become very platform
1106
01:00:53,349 –> 01:00:56,122
based: operating systems, phones.
1107
01:00:56,123 –> 01:00:58,690
Now we’ve got operating systems for AI.
1108
01:00:58,691 –> 01:01:01,492
So the whole platform model has
1109
01:01:01,493 –> 01:01:05,160
become very much to the front.
1110
01:01:05,161 –> 01:01:08,008
And again, if you look at, for example, in
1111
01:01:08,009 –> 01:01:10,718
the gaming world, like Pokemon, you’re probably familiar with Pokemon.
1112
01:01:10,719 –> 01:01:14,168
It’s a massive export out of Japan, you know.
1113
01:01:14,169 –> 01:01:16,690
It operates out of a company called Niantic.
1114
01:01:16,691 –> 01:01:18,600
But Niantic is purely a platform.
1115
01:01:19,530 –> 01:01:22,908
It’s an environment that was created so
1116
01:01:22,909 –> 01:01:25,516
that developers effectively could go in there
1117
01:01:25,517 –> 01:01:29,080
and produce and create a gaming experience.
1118
01:01:30,730 –> 01:01:33,408
I don’t have the answer, but if we, again, we
1119
01:01:33,409 –> 01:01:36,278
use AI as the talking point, which is really exciting.
1120
01:01:36,279 –> 01:01:38,230
We’ve gone from predictive to generative.
1121
01:01:38,231 –> 01:01:39,366
They still will exist.
1122
01:01:39,367 –> 01:01:42,964
But if you can create a range of products that
1123
01:01:42,965 –> 01:01:47,466
is interactive, then I think you then have the ability
1124
01:01:47,467 –> 01:01:52,030
for people to switch off or change the speed.
1125
01:01:53,670 –> 01:01:57,080
As long as you keep those options
1126
01:01:57,081 –> 01:01:59,438
that governance within, then I think we’re
1127
01:01:59,439 –> 01:02:00,020
okay.
1128
01:02:00,790 –> 01:02:03,288
Yeah, this is something which I
1129
01:02:03,289 –> 01:02:04,660
don’t know if you’ve seen.
1130
01:02:05,450 –> 01:02:08,866
There was an AI safety summit in the UK
1131
01:02:08,867 –> 01:02:13,308
beginning of December and Elon Musk was invited as
1132
01:02:13,309 –> 01:02:18,432
well, and he was speaking with the Prime Minister about
1133
01:02:18,433 –> 01:02:24,528
that, about putting emphasis on making sure that you
1134
01:02:24,529 –> 01:02:30,430
can switch off machines, software, especially robotics, when they
1135
01:02:30,431 –> 01:02:35,150
come into contact with humans physically.
1136
01:02:35,890 –> 01:02:38,980
You want to make sure that when something
1137
01:02:38,981 –> 01:02:41,650
goes wrong, you have the red button.
1138
01:02:41,651 –> 01:02:43,240
Yeah, no, definitely.
1139
01:02:43,241 –> 01:02:47,512
I remember seeing, it’s only recent, but I did
1140
01:02:47,513 –> 01:02:51,640
see a summary of that particular
1141
01:02:51,641 –> 01:02:55,294
event, which I thought was really exciting because effectively
1142
01:02:55,295 –> 01:02:59,500
it’s the EU, Europe, greater Europe, et cetera, that
1143
01:02:59,501 –> 01:03:01,240
staged the first one of those.
1144
01:03:02,170 –> 01:03:04,572
So I think the sentiment that you have, which
1145
01:03:04,573 –> 01:03:08,108
I share with you as well, is exciting because
1146
01:03:08,109 –> 01:03:10,810
it shows that it actually is coming from Europe,
1147
01:03:12,110 –> 01:03:15,638
in that the first one wasn’t staged in Asia Pacific
1148
01:03:15,639 –> 01:03:17,542
or in the Americas, it was staged in Europe.
1149
01:03:17,543 –> 01:03:19,696
So there’s probably a movement there.
1150
01:03:19,697 –> 01:03:24,582
And I think some of the unfortunate geopolitical
1151
01:03:24,583 –> 01:03:27,748
type situations that you’ve had, recent issues in
1152
01:03:27,749 –> 01:03:31,524
Israel and recent issues in Ukraine, I think
1153
01:03:31,525 –> 01:03:34,552
Europeans traditionally are probably more aware of trying
1154
01:03:34,553 –> 01:03:40,446
to keep harmony in an otherwise challenging environment.
1155
01:03:40,447 –> 01:03:42,168
So I think the event that you
1156
01:03:42,169 –> 01:03:44,520
mentioned earlier was great because it was.
1157
01:03:44,521 –> 01:03:47,406
Yeah, we do need to put rules and regulations
1158
01:03:47,407 –> 01:03:52,156
and governance in place so that it gives people
1159
01:03:52,157 –> 01:03:55,196
the opportunity to, as we would use with computer
1160
01:03:55,197 –> 01:03:57,000
vision, opt in or opt out.
1161
01:03:58,250 –> 01:04:02,722
Yeah, but it’s very tricky to balance
1162
01:04:02,723 –> 01:04:05,690
this line, because you don’t want to over-regulate.
1163
01:04:05,691 –> 01:04:09,388
Because usually the sentiment is that when
1164
01:04:09,389 –> 01:04:13,620
you regulate, it will better serve
1165
01:04:13,621 –> 01:04:17,172
those big guys, the big tech companies, who
1166
01:04:17,173 –> 01:04:24,382
already have enough legal, let’s say, assistance to navigate
1167
01:04:24,383 –> 01:04:32,968
through those challenging regulatory laws while preventing startups and
1168
01:04:32,969 –> 01:04:35,800
small companies from doing the same.
1169
01:04:36,810 –> 01:04:39,800
Yeah, it’s a very good point.
1170
01:04:41,210 –> 01:04:45,548
I think there’s different views and stances on that.
1171
01:04:45,549 –> 01:04:49,376
I think depending on the market, big is better.
1172
01:04:49,377 –> 01:04:50,992
That’s not always the case, of course.
1173
01:04:50,993 –> 01:04:56,502
I think in different markets, including Japan, where there’s
1174
01:04:56,503 –> 01:04:59,552
more emphasis on the startup, I mean, we often
1175
01:04:59,553 –> 01:05:01,882
hear that Japan doesn’t have a startup culture
1176
01:05:01,883 –> 01:05:04,132
or a system, which is actually not true.
1177
01:05:04,133 –> 01:05:06,240
It’s incredible how many.
1178
01:05:08,210 –> 01:05:08,708
Go on.
1179
01:05:08,709 –> 01:05:13,368
Sorry, no, I just remember when I was like, we
1180
01:05:13,369 –> 01:05:17,784
were meeting as part of the 500 Global programme, we
1181
01:05:17,785 –> 01:05:21,758
were meeting with lots of different representatives, like from Toshiba
1182
01:05:21,759 –> 01:05:28,108
and others, and they were claiming that a startup for
1183
01:05:28,109 –> 01:05:32,188
them is a company which has raised at least ten to
1184
01:05:32,189 –> 01:05:36,252
50 million, which is not something which you would see
1185
01:05:36,253 –> 01:05:40,144
in Europe. At least from
1186
01:05:40,145 –> 01:05:43,340
the people whom we talked to or heard from,
1187
01:05:43,870 –> 01:05:48,620
the concept of pre-seed or seed doesn’t exist for them.
1188
01:05:49,810 –> 01:05:53,120
It’s not a company, it’s not a thing.
1189
01:05:54,370 –> 01:05:57,412
Yeah, I think obviously we talk about
1190
01:05:57,413 –> 01:06:01,562
bootstrapping companies, for example, and different language.
1191
01:06:01,563 –> 01:06:05,806
Of course I think it does exist.
1192
01:06:05,807 –> 01:06:09,358
It’s just that in terms of relevance to big corporates,
1193
01:06:09,359 –> 01:06:14,230
as you said, funding of circa ten to 15 million dollars.
1194
01:06:14,231 –> 01:06:16,620
But there are definitely many, many companies, many
1195
01:06:16,621 –> 01:06:20,348
different types of businesses, small businesses, startups that
1196
01:06:20,349 –> 01:06:24,194
start up with hundreds of dollars, maybe tens
1197
01:06:24,195 –> 01:06:27,320
of thousands of dollars versus millions of dollars.
1198
01:06:28,570 –> 01:06:29,628
I think you’re right though.
1199
01:06:29,629 –> 01:06:32,432
When you do talk to the big corporates, they probably will
1200
01:06:32,433 –> 01:06:35,712
reply like that, because at the end of the day, if
1201
01:06:35,713 –> 01:06:39,184
it’s from their investment side of their P&L, it’s still the
1202
01:06:39,185 –> 01:06:41,600
same amount of work and effort for them.
1203
01:06:43,090 –> 01:06:47,556
And I think as a result, the word startup or
1204
01:06:47,557 –> 01:06:53,978
new business, et cetera, is probably not as relevant.
1205
01:06:53,979 –> 01:06:55,070
Not relevant.
1206
01:06:57,590 –> 01:06:58,808
People are aware of it.
1207
01:06:58,809 –> 01:07:00,798
But yeah, there’s definitely a lot, there’s
1208
01:07:00,799 –> 01:07:05,352
lots of different small businesses, startups, they
1209
01:07:05,353 –> 01:07:08,412
just are not tagged as that. I think it’s probably just
1210
01:07:08,413 –> 01:07:09,852
a different language actually.
1211
01:07:09,853 –> 01:07:10,300
Right?
1212
01:07:10,301 –> 01:07:11,516
Could be, yeah.
1213
01:07:11,517 –> 01:07:16,988
And I was so excited to see how much
1214
01:07:16,989 –> 01:07:21,408
excitement there is in Tokyo, especially how
1215
01:07:21,409 –> 01:07:25,334
many people are trying to change, to improve.
1216
01:07:25,335 –> 01:07:28,512
And yeah, I think there is still
1217
01:07:28,513 –> 01:07:32,106
not enough spotlight on those smaller players
1218
01:07:32,107 –> 01:07:35,092
because they can shape our future.
1219
01:07:35,093 –> 01:07:35,780
Yeah, definitely.
1220
01:07:35,781 –> 01:07:37,466
I mean, it’s really exciting.
1221
01:07:37,467 –> 01:07:39,604
I mean, as I said earlier, the last
1222
01:07:39,605 –> 01:07:42,670
four years, the first Covid years were tough.
1223
01:07:43,430 –> 01:07:44,648
Last two years was more
1224
01:07:44,649 –> 01:07:46,450
about correction and consolidation.
1225
01:07:48,710 –> 01:07:50,900
I was travelling just the last few days.
1226
01:07:51,510 –> 01:07:53,368
You can feel the energy.
1227
01:07:53,369 –> 01:07:54,872
There’s a lot of interest.
1228
01:07:54,873 –> 01:07:57,362
A lot of people are very curious about Japan
1229
01:07:57,363 –> 01:07:58,988
because I think we all got to see each
1230
01:07:58,989 –> 01:08:01,868
other again in different ways, how we handled Covid, how
1231
01:08:01,869 –> 01:08:05,030
we didn’t handle Covid, et cetera, et cetera.
1232
01:08:06,250 –> 01:08:08,704
But again, we’ve got in Japan at the
1233
01:08:08,705 –> 01:08:11,632
moment an administration, a government that is very
1234
01:08:11,633 –> 01:08:14,890
much about encouraging Japan to go global.
1235
01:08:15,550 –> 01:08:18,560
I think that goes back to a few
1236
01:08:18,561 –> 01:08:22,394
previous administrations, including Abe-san and Koizumi.
1237
01:08:22,395 –> 01:08:25,652
These are names, but it’s not just been
1238
01:08:25,653 –> 01:08:27,252
happening for the last three, four years.
1239
01:08:27,253 –> 01:08:28,852
It’s been happening for the last,
1240
01:08:28,853 –> 01:08:31,279
really last 10, 15, 20 years.
1241
01:08:33,590 –> 01:08:36,808
I think the success of Japan in the future, though,
1242
01:08:36,809 –> 01:08:40,413
will be probably a bit more related to joint ventures.
1243
01:08:40,414 –> 01:08:44,600
I think those like yourselves, or people even
1244
01:08:44,601 –> 01:08:46,124
like myself, even though I’ve been here a
1245
01:08:46,125 –> 01:08:49,595
long time, if you can find the right
1246
01:08:49,596 –> 01:08:54,229
Japanese partner, then the world is your oyster.
1247
01:08:54,890 –> 01:08:57,116
There’s ways to export Japan out
1248
01:08:57,117 –> 01:08:58,416
and then bring things in.
1249
01:08:58,417 –> 01:09:04,448
And finding those right partners is probably
1250
01:09:04,449 –> 01:09:06,112
the missing piece at the moment.
1251
01:09:06,113 –> 01:09:09,167
Yeah, I think it applies to everything, right.
1252
01:09:09,168 –> 01:09:11,810
To your personal life as well.
1253
01:09:11,811 –> 01:09:13,587
If you get married to the right
1254
01:09:13,588 –> 01:09:16,804
partner, you can achieve much more.
1255
01:09:16,805 –> 01:09:17,524
Yeah, definitely.
1256
01:09:17,525 –> 01:09:19,332
I mean, there’s definitely more you can
1257
01:09:19,333 –> 01:09:22,319
do in partnership than on your own. Right.
1258
01:09:23,510 –> 01:09:24,728
The one thing that I think the
1259
01:09:24,729 –> 01:09:27,134
Japanese are very good at is collaborating.
1260
01:09:27,135 –> 01:09:28,952
They work well together in
1261
01:09:28,953 –> 01:09:32,609
teams and in collaborative environments.
1262
01:09:34,550 –> 01:09:36,950
A lot of it’s set up to be successful.
1263
01:09:37,930 –> 01:09:40,091
Obviously, like many things, communication can
1264
01:09:40,092 –> 01:09:41,265
sometimes be a challenge.
1265
01:09:41,266 –> 01:09:45,778
But, yeah, I’ve always been very bullish on Japan,
1266
01:09:45,779 –> 01:09:51,328
but particularly at the moment, I think as the
1267
01:09:51,329 –> 01:09:56,016
world goes faster, like we talked about, people still
1268
01:09:56,017 –> 01:10:00,944
want quality, and as long as you’re looking for
1269
01:10:00,945 –> 01:10:02,916
quality, then Japan is your place.
1270
01:10:02,917 –> 01:10:04,320
That’s where you need to come.
1271
01:10:05,010 –> 01:10:08,468
That’s where you are with your sake and your
1272
01:10:08,469 –> 01:10:12,800
new venture, which hopefully I will find out soon.
1273
01:10:13,490 –> 01:10:15,832
Yeah, hopefully in the next few months,
1274
01:10:15,833 –> 01:10:17,192
we can talk more about it.
1275
01:10:17,193 –> 01:10:20,744
Happy to chat again and share with you, probably.
1276
01:10:20,745 –> 01:10:22,680
I’d love to keep the conversation going
1277
01:10:22,681 –> 01:10:24,968
about how to not only bridge the
1278
01:10:24,969 –> 01:10:27,122
cultural aspects between, say, you in Europe.
1279
01:10:27,123 –> 01:10:30,012
Here I am in Japan, but more
1280
01:10:30,013 –> 01:10:32,040
about the whole pace thing you mentioned.
1281
01:10:33,370 –> 01:10:35,164
How do you take the speed and make it
1282
01:10:35,165 –> 01:10:41,660
a positive rather than being seen as potentially a
1283
01:10:42,190 –> 01:10:49,340
scary thing, detrimental, intimidating, daunting, all these negative words,
1284
01:10:50,270 –> 01:10:52,932
if you can change the narrative around it.
1285
01:10:52,933 –> 01:10:56,388
So I guess that’s where probably a question for
1286
01:10:56,389 –> 01:10:59,258
you, maybe, is what do you see as emerging
1287
01:10:59,259 –> 01:11:02,170
as technology in Europe versus, say, Asia?
1288
01:11:02,171 –> 01:11:07,624
And we will leave this for our next episode because,
1289
01:11:07,625 –> 01:11:12,392
yeah, it’s a longer question, but thank you so much
1290
01:11:12,393 –> 01:11:17,910
for this, and we will definitely explore more in depth
1291
01:11:18,810 –> 01:11:23,554
your current project in a few months when time allows.
1292
01:11:23,555 –> 01:11:26,892
I mean, my passing comment for you, or to
1293
01:11:26,893 –> 01:11:29,264
catch up again, is thank you for the opportunity.
1294
01:11:29,265 –> 01:11:31,392
I appreciate your passion for...
1295
01:11:31,393 –> 01:11:32,352
Thank you.
1296
01:11:32,353 –> 01:11:34,630
This part of the world and the ability
1297
01:11:34,631 –> 01:11:38,256
to really just communicate, it’s such a great
1298
01:11:38,257 –> 01:11:40,060
skill and talent that you have.
1299
01:11:40,750 –> 01:11:42,592
And for anyone that is listening to this,
1300
01:11:42,593 –> 01:11:44,212
if you do need anything, just reach out.
1301
01:11:44,213 –> 01:11:45,572
And I’m always happy to try
1302
01:11:45,573 –> 01:11:48,990
and share insights and experiences.
1303
01:11:49,730 –> 01:11:52,320
Again, I think 2024 will be a great year.
1304
01:11:53,410 –> 01:11:54,836
I’ve said it a few times already.
1305
01:11:54,837 –> 01:11:55,844
Let’s just think of the
1306
01:11:55,845 –> 01:11:57,208
interactive in everything we think of.
1307
01:11:57,209 –> 01:11:58,968
Let’s think of how to interact and how to
1308
01:11:58,969 –> 01:12:03,140
get people shaping and being a part of technology.
1309
01:12:03,910 –> 01:12:09,564
And if you want to reveal a bit more about your
1310
01:12:09,565 –> 01:12:15,244
sake, how is the process, how is the production going?
1311
01:12:15,245 –> 01:12:17,148
Yeah, so we literally got the
1312
01:12:17,149 –> 01:12:18,640
team in from the United States.
1313
01:12:18,641 –> 01:12:20,288
So the product launches in the
1314
01:12:20,289 –> 01:12:23,456
United States on April 1.
1315
01:12:23,457 –> 01:12:25,290
Production has been completed.
1316
01:12:26,270 –> 01:12:27,710
We are now.
1317
01:12:27,711 –> 01:12:29,110
How many bottles?
1318
01:12:29,111 –> 01:12:30,916
We did a few. We did a few.
1319
01:12:30,917 –> 01:12:33,172
It’s been received very well.
1320
01:12:33,173 –> 01:12:34,020
Okay.
1321
01:12:34,021 –> 01:12:36,740
And we’re now actually planning for 2025
1322
01:12:36,741 –> 01:12:44,580
and 2026, so it’s going really well.
1323
01:12:45,270 –> 01:12:48,590
The team in the United States are just fantastic.
1324
01:12:48,591 –> 01:12:52,472
They’re incredibly aware, very in
1325
01:12:52,473 –> 01:12:54,638
tune with what is required.
1326
01:12:54,639 –> 01:12:57,540
And the taste is good.
1327
01:12:58,470 –> 01:13:00,188
You know, the whole branding, the
1328
01:13:00,189 –> 01:13:03,580
whole exercise, the naming. Yeah.
1329
01:13:03,581 –> 01:13:05,196
It’s really perfect.
1330
01:13:05,197 –> 01:13:07,116
They’ve done an incredible job and
1331
01:13:07,117 –> 01:13:08,514
we’re lucky to be involved.
1332
01:13:08,515 –> 01:13:10,492
Okay, save one bottle for me,
1333
01:13:10,493 –> 01:13:12,650
please, so we can do a campaign.
1334
01:13:12,651 –> 01:13:14,330
I will definitely.
1335
01:13:14,331 –> 01:13:19,468
Well, thank you so much for today, and again, look
1336
01:13:19,469 –> 01:13:21,988
forward to us catching up again in the future.
1337
01:13:21,989 –> 01:13:23,658
Likewise. Likewise, Paul.
1338
01:13:23,659 –> 01:13:25,044
And thank you so much.
1339
01:13:25,045 –> 01:13:26,380
Thank you. Have a great day.