
In this episode, Kamila welcomes Eirik Norman Hansen, a dedicated technology optimist. They discuss the potential of technology in addressing global challenges, emphasising the importance of hyperadoption. Eirik’s journey to becoming a technology advocate is highlighted, along with his belief in the need for technology to provide real value. We explore the impact of digitalisation on traditional business models and global citizenship. Eirik stresses the importance of understanding technology basics and shares his optimism about microbots in healthcare. The conversation touches on Neuralink, data privacy, and the metaverse. We also address inspiring young minds to embrace technology and staying informed in the tech world.
Connect with Eirik: https://www.linkedin.com/in/eirik
Connect with Kamila: https://www.linkedin.com/in/hankka/
Transcript:
1
00:00:00.090 –> 00:00:02.628
It’s your responsibility to make sure that you kind
2
00:00:02.628 –> 00:00:05.652
of try to figure out, okay, what does it
3
00:00:05.652 –> 00:00:08.932
mean for me instead of just pushing it away?
4
00:00:08.932 –> 00:00:12.442
I oftentimes use this example when I start my talks.
5
00:00:12.442 –> 00:00:13.972
I did a talk a couple of years ago
6
00:00:13.972 –> 00:00:18.068
at a big conference for the construction industry, and
7
00:00:18.068 –> 00:00:20.052
I was asked to talk a little bit about
8
00:00:20.052 –> 00:00:23.482
the coming of artificial intelligence and robotics.
9
00:00:23.482 –> 00:00:25.028
And this was a few years ago.
10
00:00:25.028 –> 00:00:31.148
So when the Boston Dynamics dog came around and it was
11
00:00:31.148 –> 00:00:35.068
the same thing, then after my talk, a guy came up
12
00:00:35.068 –> 00:00:37.068
to me and I guess he was in the beginning of
13
00:00:37.068 –> 00:00:40.822
his 60s, a CEO of a company here in Norway.
14
00:00:40.822 –> 00:00:41.968
And he said that, yeah, this was
15
00:00:41.968 –> 00:00:43.728
really exciting, but we need to be
16
00:00:43.728 –> 00:00:46.650
careful with artificial intelligence and robots.
17
00:00:47.390 –> 00:00:48.940
He told you that in.
18
00:00:49.630 –> 00:00:50.700
Yeah, yeah.
19
00:00:51.470 –> 00:00:52.698
And he was dead serious.
20
00:00:52.698 –> 00:00:56.148
And I said, why? What do you mean? What do you mean?
21
00:00:56.148 –> 00:01:00.110
And he said, remember how it went in Terminator?
22
00:01:01.010 –> 00:01:04.072
He was dead serious because his reference, when we
23
00:01:04.072 –> 00:01:07.432
talked about creating software that was smart in some
24
00:01:07.432 –> 00:01:09.790
way or the other, putting it into a robot,
25
00:01:09.790 –> 00:01:12.718
what came to mind was Terminator.
26
00:01:12.718 –> 00:01:17.032
And we know that that’s not the way it’s going to work.
27
00:01:17.032 –> 00:01:18.188
So for him to kind of.
28
00:01:18.188 –> 00:01:21.052
Yeah, but maybe now there’s a hard time
29
00:01:21.052 –> 00:01:23.868
getting enough resources to be on your will.
30
00:01:23.868 –> 00:01:24.440
It.
31
00:01:26.970 –> 00:01:29.004
Maybe you are talking to an android right now.
32
00:01:29.004 –> 00:01:29.760
You don’t know that.
33
00:01:29.760 –> 00:01:30.480
Exactly.
34
00:01:30.480 –> 00:01:31.888
You don’t know either.
35
00:01:31.888 –> 00:01:33.366
Remember, I have four chips.
36
00:01:33.366 –> 00:01:38.134
This is just a hologram, and I don’t need chips.
37
00:01:38.134 –> 00:01:40.640
They created the better version of me.
38
00:01:43.330 –> 00:01:44.308
Hello.
39
00:01:44.308 –> 00:01:46.282
This is your host, Kamila Hankiewicz.
40
00:01:46.282 –> 00:01:49.316
And together with my guests, we discuss how tech
41
00:01:49.316 –> 00:01:52.404
is changing the way we live and work.
42
00:01:52.404 –> 00:01:55.166
Are you ready? Hi, Eirik.
43
00:01:55.166 –> 00:01:56.130
It’s a pleasure.
44
00:01:57.350 –> 00:02:00.280
Hi, it’s a pleasure to be here. Thank you.
45
00:02:00.280 –> 00:02:02.872
How is Oslo right now?
46
00:02:02.872 –> 00:02:05.288
Oslo is quite nice, actually.
47
00:02:05.288 –> 00:02:08.482
Nice, sunny, early autumn.
48
00:02:08.482 –> 00:02:09.548
So it’s starting.
49
00:02:09.548 –> 00:02:12.882
The trees are starting to turn red and yellow.
50
00:02:12.882 –> 00:02:14.866
So it’s pretty wonderful.
51
00:02:14.866 –> 00:02:16.188
Beautiful outside, I must say.
52
00:02:16.188 –> 00:02:17.560
So Oslo is great.
53
00:02:18.270 –> 00:02:21.248
It’s a perfect ambience to think
54
00:02:21.248 –> 00:02:23.580
and just think about the future.
55
00:02:24.430 –> 00:02:28.990
I can’t unfortunately understand your language yet.
56
00:02:28.990 –> 00:02:31.844
Maybe with help of AI, I will at some point.
57
00:02:31.844 –> 00:02:37.530
But I watched the TEDx talk which you gave in English.
58
00:02:37.530 –> 00:02:38.770
Thank you.
59
00:02:38.770 –> 00:02:43.768
And I saw in all your presentations, all your
60
00:02:43.768 –> 00:02:48.792
materials you share, you publish, you come across as
61
00:02:48.792 –> 00:02:54.504
a very future-optimistic, tech-optimistic person.
62
00:02:54.504 –> 00:02:58.380
You believe that lots of current
63
00:02:58.380 –> 00:03:01.292
problems will be solved with technology, but as
64
00:03:01.292 –> 00:03:06.330
you know, there is an unlimited number of challenges.
65
00:03:06.330 –> 00:03:11.232
Do you think there are any particular problems or
66
00:03:11.232 –> 00:03:14.982
issues which we should work on as a priority?
67
00:03:14.982 –> 00:03:16.860
That’s a long intro. Sorry.
68
00:03:18.670 –> 00:03:22.004
And it’s a big question because there are
69
00:03:22.004 –> 00:03:25.908
so many things I think that technology or
70
00:03:25.908 –> 00:03:28.510
I know that technology can help us solve.
71
00:03:29.810 –> 00:03:33.348
Focusing on sustainability, which is kind
72
00:03:33.348 –> 00:03:36.136
of a decent place to start.
73
00:03:36.136 –> 00:03:38.622
You can divide that up into different sections.
74
00:03:38.622 –> 00:03:41.790
And on one hand, we have the economic
75
00:03:41.790 –> 00:03:44.142
sustainability and we need to be more efficient.
76
00:03:44.142 –> 00:03:46.680
We need to use our resources in a better way.
77
00:03:47.290 –> 00:03:50.658
We know that we will have a lack of resources
78
00:03:50.658 –> 00:03:53.932
on the human side, as there will be more people
79
00:03:53.932 –> 00:03:57.480
like me turning gray and old and less.
80
00:04:01.470 –> 00:04:02.816
That’s one big area.
81
00:04:02.816 –> 00:04:06.854
We can use technology just to kind of narrow that gap,
82
00:04:06.854 –> 00:04:09.872
to be able to kind of keep the wheels turning and
83
00:04:09.872 –> 00:04:13.732
make sure that we are still in economic health.
84
00:04:13.732 –> 00:04:18.356
The other part is, of course, environmental issues, on how
85
00:04:18.356 –> 00:04:23.742
to make sure that we use, for instance, artificial intelligence
86
00:04:23.742 –> 00:04:28.312
and analytics to predict how much stuff we actually need
87
00:04:28.312 –> 00:04:31.838
to produce, that we don’t produce any more than necessary,
88
00:04:31.838 –> 00:04:35.288
so we don’t throw away, reduce waste and all that
89
00:04:35.288 –> 00:04:38.908
stuff, and also balance the amount of energy that we
90
00:04:38.908 –> 00:04:42.476
are consuming, both to produce stuff, but also to just
91
00:04:42.476 –> 00:04:44.330
go about our lives.
92
00:04:44.330 –> 00:04:45.916
And then the third thing I think is
93
00:04:45.916 –> 00:04:49.040
very important is we need to educate people.
94
00:04:49.040 –> 00:04:51.488
We need more people to be up to
95
00:04:51.488 –> 00:04:53.712
speed on what’s going on around them.
96
00:04:53.712 –> 00:04:56.976
And for that, we can also use technology like
97
00:04:56.976 –> 00:05:02.788
artificial intelligence, like extended realities, like analytics and stuff
98
00:05:02.788 –> 00:05:06.090
like that, to make sure that we are upgrading
99
00:05:06.090 –> 00:05:10.052
the people we have, but also reeducating people who
100
00:05:10.052 –> 00:05:14.150
need to change their way of work.
101
00:05:14.150 –> 00:05:17.448
I think there’s a lot of opportunities, and
102
00:05:17.448 –> 00:05:19.192
in the middle of all this, we also
103
00:05:19.192 –> 00:05:21.650
need to kind of create new solutions.
104
00:05:22.390 –> 00:05:27.036
Technology has gone from being something we use to just
105
00:05:27.036 –> 00:05:31.276
be efficient, to become a natural part not only of
106
00:05:31.276 –> 00:05:34.796
our lives for fun and games, but to be able
107
00:05:34.796 –> 00:05:37.826
to actually move forward in the right direction.
108
00:05:37.826 –> 00:05:41.200
So it’s a huge area, a huge question.
109
00:05:41.200 –> 00:05:43.024
And there are so many things that we could
110
00:05:43.024 –> 00:05:48.190
do with technology, from going to space to solving
111
00:05:48.190 –> 00:05:51.412
very concrete and simple issues that we need to
112
00:05:51.412 –> 00:05:53.310
get hold on here on Earth.
113
00:05:54.690 –> 00:05:58.970
Okay, so about the space, I just saw the pictures
114
00:05:58.970 –> 00:06:03.490
of your speech where you wore a NASA suit.
115
00:06:04.070 –> 00:06:05.460
What was that about?
116
00:06:06.550 –> 00:06:08.020
Where did you get it from?
117
00:06:10.230 –> 00:06:12.072
That was quite funny, actually, because
118
00:06:12.072 –> 00:06:13.368
I was asked to do a.
119
00:06:13.368 –> 00:06:15.086
It was a huge conference
120
00:06:15.086 –> 00:06:17.850
on transportation and logistics.
121
00:06:17.850 –> 00:06:20.988
And the backstory here was that there
122
00:06:20.988 –> 00:06:22.962
are two big players in Norway.
123
00:06:22.962 –> 00:06:25.628
The one I was working for, they’re blue, their
124
00:06:25.628 –> 00:06:28.412
corporate color is blue, and the others are green.
125
00:06:28.412 –> 00:06:31.312
And this conference was about the green shift, but
126
00:06:31.312 –> 00:06:34.032
they couldn’t be green since their competitors are green.
127
00:06:34.032 –> 00:06:35.504
So they zoomed out.
128
00:06:35.504 –> 00:06:38.278
So they said, okay, let’s look at the blue planet.
129
00:06:38.278 –> 00:06:41.268
And then by coincidence, I know a
130
00:06:41.268 –> 00:06:43.722
few people working in the space industry.
131
00:06:43.722 –> 00:06:45.748
And I said, just for fun, that I think
132
00:06:45.748 –> 00:06:48.480
I can borrow a spacesuit so I could come
133
00:06:51.010 –> 00:06:53.770
down from the ceiling into the conference.
134
00:06:53.770 –> 00:06:55.412
Didn’t think much more about it
135
00:06:55.412 –> 00:06:57.496
until a couple of weeks later.
136
00:06:57.496 –> 00:06:58.744
Then suddenly they called me
137
00:06:58.744 –> 00:07:00.718
and said, about that spacesuit.
138
00:07:00.718 –> 00:07:02.632
Is that possible? So that was why.
139
00:07:02.632 –> 00:07:05.550
So I borrowed it from the Norwegian.
140
00:07:05.550 –> 00:07:08.514
I think it’s called the Norwegian Space Agency.
141
00:07:08.514 –> 00:07:13.378
It’s not NASA, actually, and it’s just a demo suit.
142
00:07:13.378 –> 00:07:14.476
But it was quite cool.
143
00:07:14.476 –> 00:07:17.666
So I was lowered from the ceiling and landed on the stage
144
00:07:17.666 –> 00:07:21.184
and did my talk in a space suit, which was more.
145
00:07:21.184 –> 00:07:24.432
It was harder than I thought, but it was fun.
146
00:07:24.432 –> 00:07:26.160
I guess you were sweating like
147
00:07:26.160 –> 00:07:28.096
crazy because it was a demo.
148
00:07:28.096 –> 00:07:31.360
So it didn’t have all those mechanics inside, right?
149
00:07:31.360 –> 00:07:33.974
No, it was just a cover suit.
150
00:07:33.974 –> 00:07:35.520
But, yeah, it was definitely hot.
151
00:07:35.520 –> 00:07:35.936
Right.
152
00:07:35.936 –> 00:07:37.386
But it made an impression.
153
00:07:37.386 –> 00:07:38.480
It was quite cool.
154
00:07:39.090 –> 00:07:41.108
And I love space, by the way.
155
00:07:41.108 –> 00:07:43.736
I mean, it’s so fascinating what we can do
156
00:07:43.736 –> 00:07:46.232
and what people are actually doing out there.
157
00:07:46.232 –> 00:07:49.768
There is this British entrepreneur, I
158
00:07:49.768 –> 00:07:52.808
think he’s called Richard Browning.
159
00:07:52.808 –> 00:07:55.230
Have you heard of Gravity Industries?
160
00:07:55.230 –> 00:07:59.228
So he builds all those, like, you have four, I
161
00:07:59.228 –> 00:08:03.100
think, like, 18 total engines, and he can fly, basically.
162
00:08:03.100 –> 00:08:07.772
He’s the Iron Man, the modern Iron Man.
163
00:08:07.772 –> 00:08:08.928
I love it.
164
00:08:08.928 –> 00:08:10.700
Yeah, I would love to try that.
165
00:08:11.310 –> 00:08:14.966
Yeah, it looks heavy.
166
00:08:14.966 –> 00:08:18.720
You need to know how to fly and how to position yourself,
167
00:08:18.720 –> 00:08:23.284
not to just fall back on the ground on your back.
168
00:08:23.284 –> 00:08:25.440
But, yeah, amazing things are happening.
169
00:08:26.690 –> 00:08:28.580
And I know that you are talking
170
00:08:28.580 –> 00:08:34.870
about exponential development and hyperadoption.
171
00:08:34.870 –> 00:08:36.520
What does it mean for you?
172
00:08:36.520 –> 00:08:37.590
Yeah.
173
00:08:37.590 –> 00:08:41.510
One thing that I’m very into is that,
174
00:08:41.510 –> 00:08:43.832
as we all, most people at least know
175
00:08:43.832 –> 00:08:47.612
now, technology, or most emerging technologies, are
176
00:08:47.612 –> 00:08:50.466
developing at an exponential speed.
177
00:08:50.466 –> 00:08:52.252
But what does it really mean?
178
00:08:52.252 –> 00:08:53.692
And that’s the hard thing, because
179
00:08:53.692 –> 00:08:56.242
it’s almost impossible to imagine.
180
00:08:56.242 –> 00:08:57.404
What does it mean?
181
00:08:57.404 –> 00:08:59.568
And right now, we’re living in what could
182
00:08:59.568 –> 00:09:03.040
be called an exponential revolution, so to speak.
183
00:09:03.040 –> 00:09:06.112
So I try to kind of get people aware of that.
184
00:09:06.112 –> 00:09:07.888
There is a huge difference.
185
00:09:07.888 –> 00:09:12.048
When I studied business and economics back in the
186
00:09:12.048 –> 00:09:16.202
days, in the mid 90s, we talked about development
187
00:09:16.202 –> 00:09:18.996
and prediction based on very linear thinking.
188
00:09:18.996 –> 00:09:20.788
And that’s a very common thing to do,
189
00:09:20.788 –> 00:09:23.380
and that’s a thing that humans can understand.
190
00:09:23.910 –> 00:09:28.824
But to compare linear development and exponential development, if you
191
00:09:28.824 –> 00:09:32.446
imagine that I were to walk 30 steps, 1 meter at a
192
00:09:32.446 –> 00:09:35.960
time, in one direction, and every step I took
193
00:09:35.960 –> 00:09:39.148
was 1 meter, I don’t know how much that is
194
00:09:39.148 –> 00:09:41.420
in feet, but you get the gist, and then.
195
00:09:41.420 –> 00:09:43.586
Don’t worry, I operate in meters.
196
00:09:43.586 –> 00:09:45.160
That’s good. Yeah.
197
00:09:46.410 –> 00:09:49.004
One plus one plus one plus one, 30 times.
198
00:09:49.004 –> 00:09:52.246
We all understand that I will end up going 30 meters.
199
00:09:52.246 –> 00:09:54.640
That’s not that hard to understand.
200
00:09:54.640 –> 00:09:57.072
But if I were to take, or if I could
201
00:09:57.072 –> 00:10:01.008
take exponential steps, which would be very, very cool, the
202
00:10:01.008 –> 00:10:03.200
first step will then be 1 meter, the second would
203
00:10:03.200 –> 00:10:05.748
be 2 meters, the third will be 4 meters, then
204
00:10:05.748 –> 00:10:08.634
it doubles every time for 30 times, and then you’ll
205
00:10:08.634 –> 00:10:12.356
end up going 26 times around the globe instead.
206
00:10:12.356 –> 00:10:13.720
And that’s the difference.
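A quick sanity check on that claim, as a minimal Python sketch (assuming 1-metre steps and an Earth circumference of roughly 40,075 km, so the exact “26 times” figure depends on those assumptions):

# Linear vs. exponential: 30 one-metre steps versus 30 doubling steps.
EARTH_CIRCUMFERENCE_M = 40_075_000  # approximate equatorial circumference in metres

linear_total_m = sum(1 for _ in range(30))          # 1 m per step -> 30 m
exponential_total_m = sum(2**i for i in range(30))  # 1, 2, 4, ... doubling for 30 steps

print(f"Linear walk:      {linear_total_m} m")
print(f"Exponential walk: {exponential_total_m:,} m, "
      f"about {exponential_total_m / EARTH_CIRCUMFERENCE_M:.1f} times around the globe")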
207
00:10:13.720 –> 00:10:18.232
So when planning now for going forward and thinking about
208
00:10:18.232 –> 00:10:21.224
what is actually possible to do, we need to understand
209
00:10:21.224 –> 00:10:25.256
that we can take quite big steps, big leaps, and
210
00:10:25.256 –> 00:10:28.972
you might have to be quite crazy to understand that
211
00:10:28.972 –> 00:10:33.708
there’s a possibility you could set out to solve a
212
00:10:33.708 –> 00:10:36.028
problem that we don’t have the technology for because we
213
00:10:36.028 –> 00:10:37.964
are going to invent it anyway.
214
00:10:37.964 –> 00:10:40.438
That could actually be the spirit.
215
00:10:40.438 –> 00:10:41.712
So that’s one thing.
216
00:10:41.712 –> 00:10:44.384
Technology is developing extremely fast.
217
00:10:44.384 –> 00:10:47.008
And right now with ChatGPT and all that
218
00:10:47.008 –> 00:10:52.352
stuff suddenly emerging, we saw that from the free
219
00:10:52.352 –> 00:10:55.402
version, the first version that they launched last autumn.
220
00:10:55.402 –> 00:10:57.946
It’s only a year ago that ChatGPT
221
00:10:57.946 –> 00:10:59.760
was kind of given out for everyone.
222
00:11:00.290 –> 00:11:07.374
It had 165, I think, billion or trillion parameters.
223
00:11:07.374 –> 00:11:10.798
And then when the fourth version came, it’s
224
00:11:10.798 –> 00:11:13.848
1000 trillion parameters or something like that.
225
00:11:13.848 –> 00:11:15.400
It’s a huge amount more.
226
00:11:15.400 –> 00:11:17.202
And that’s because of the exponential
227
00:11:17.202 –> 00:11:19.740
development that actually makes it possible.
228
00:11:19.740 –> 00:11:22.620
The next version will probably be twice as much as that.
229
00:11:22.620 –> 00:11:25.560
And that’s, you know, an enormous difference.
230
00:11:26.170 –> 00:11:27.532
So that’s one thing.
231
00:11:27.532 –> 00:11:30.358
The hyperadoption terminology I learned
232
00:11:30.358 –> 00:11:32.134
from a guy called James McQuivey.
233
00:11:32.134 –> 00:11:36.678
He’s a Forrester researcher and principal and brilliant guy, wrote
234
00:11:36.678 –> 00:11:38.528
a book about this a couple of years ago.
235
00:11:38.528 –> 00:11:40.960
And it’s about how we people are
236
00:11:40.960 –> 00:11:44.004
changing the way we adopt technology.
237
00:11:44.004 –> 00:11:46.756
Because back in the days, ten years ago, 15-20
238
00:11:46.756 –> 00:11:50.964
years ago, we oftentimes bought new technology because it
239
00:11:50.964 –> 00:11:53.076
was cool, it was new, there was a phone
240
00:11:53.076 –> 00:11:54.904
with a camera on it, never seen it before.
241
00:11:54.904 –> 00:11:59.608
But now we adopt technology because we at least think we
242
00:11:59.608 –> 00:12:02.328
have some kind of understanding about what it could do for
243
00:12:02.328 –> 00:12:07.628
us, how it could increase the value of my life or
244
00:12:07.628 –> 00:12:10.604
the way I do my job or something like that.
245
00:12:10.604 –> 00:12:12.578
And then based on that, we adopt
246
00:12:12.578 –> 00:12:16.034
it almost in an exponential speed.
247
00:12:16.034 –> 00:12:18.688
So like when ChatGPT, to use that as
248
00:12:18.688 –> 00:12:23.600
an example, when they launched last year, 1 million
249
00:12:23.600 –> 00:12:28.816
new subscribers signed up the first 24 hours.
250
00:12:28.816 –> 00:12:32.612
And there’s never been any service adopting that
251
00:12:32.612 –> 00:12:35.658
many new users in such a short period
252
00:12:35.658 –> 00:12:39.322
of time, much faster than Snapchat or TikTok
253
00:12:39.322 –> 00:12:42.634
or anything else, and that’s hyperadoption.
254
00:12:42.634 –> 00:12:46.542
So we are both developing technology at a space rocket
255
00:12:46.542 –> 00:12:51.496
speed, almost unbelievable; it’s hard to understand.
256
00:12:51.496 –> 00:12:53.048
And then people are starting to
257
00:12:53.048 –> 00:12:55.400
adopt technology quite quickly as well.
258
00:12:55.400 –> 00:12:59.116
And those two forces are really pushing us forward.
259
00:12:59.116 –> 00:13:02.188
And that’s what maybe helped most of us
260
00:13:02.188 –> 00:13:04.802
through the pandemic, for instance, that we realized
261
00:13:04.802 –> 00:13:07.000
that, yeah, cool, there’s stuff we can do.
262
00:13:08.270 –> 00:13:09.360
That’s true.
263
00:13:09.360 –> 00:13:12.096
But then there are lots of factors which
264
00:13:12.096 –> 00:13:17.550
played a role in adopting such technologies.
265
00:13:17.550 –> 00:13:20.038
Computational power is cheaper.
266
00:13:20.038 –> 00:13:21.268
We have access.
267
00:13:21.268 –> 00:13:23.962
More people have access to the Internet.
268
00:13:23.962 –> 00:13:29.124
And then also such tech, like AI, like
269
00:13:29.124 –> 00:13:33.656
LLMs, are utilizing recursive learning, right?
270
00:13:33.656 –> 00:13:37.054
So the more people use it, the better it becomes.
271
00:13:37.054 –> 00:13:39.102
The better it becomes, the more valuable
272
00:13:39.102 –> 00:13:43.256
it becomes, and the effect goes on.
273
00:13:43.256 –> 00:13:46.892
But I completely agree about the exponential value.
274
00:13:46.892 –> 00:13:50.850
In a way, it’s such a universal
275
00:13:50.850 –> 00:13:54.070
truth how things work, even in finance.
276
00:13:55.370 –> 00:13:58.918
What’s the compound interest or compound returns?
277
00:13:58.918 –> 00:13:59.968
It’s the same thing.
278
00:13:59.968 –> 00:14:03.824
If you save a little bit each day, or if you
279
00:14:03.824 –> 00:14:09.168
make a little bit bigger progress each day, the compound result
280
00:14:09.168 –> 00:14:14.788
is much bigger than what you imagine as a linear one.
281
00:14:14.788 –> 00:14:15.520
Exactly.
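To make the compounding point concrete, here is a minimal sketch with made-up numbers (a 1% daily improvement over a year), purely for illustration:

# Linear accumulation vs. compounding: improve by 1% a day for 365 days.
daily_rate = 0.01
days = 365

linear_result = 1 + daily_rate * days        # add 1% of the starting value each day -> 4.65x
compound_result = (1 + daily_rate) ** days   # grow 1% on the running total each day -> ~37.8x

print(f"Linear after a year:   {linear_result:.2f}x")
print(f"Compound after a year: {compound_result:.2f}x")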
282
00:14:16.690 –> 00:14:21.268
But what made you so interested, what
283
00:14:21.268 –> 00:14:23.412
made you so passionate about technology?
284
00:14:23.412 –> 00:14:27.288
Like, was there any one moment in
285
00:14:27.288 –> 00:14:32.568
your childhood, maybe, which sparked this interest,
286
00:14:32.568 –> 00:14:35.790
or was it just like a natural progression?
287
00:14:35.790 –> 00:14:41.324
No, actually, back in the days, I thought a lot about
288
00:14:41.324 –> 00:14:45.116
why on earth did I end up being like this?
289
00:14:45.116 –> 00:14:47.548
And I think one thing is kind of the person I
290
00:14:47.548 –> 00:14:52.060
am, I’m very curious, and I love figuring out stuff.
291
00:14:52.590 –> 00:14:59.376
But the thing was that in 1978, I was seven years old.
292
00:14:59.376 –> 00:15:02.004
So now you can figure out how old I am.
293
00:15:02.004 –> 00:15:03.012
I was seven years old.
294
00:15:03.012 –> 00:15:07.988
And my uncle, he came home from a trip to the
295
00:15:07.988 –> 00:15:13.316
United States, and he brought home a little computer game.
296
00:15:13.316 –> 00:15:14.820
And it was actually a little computer.
297
00:15:14.820 –> 00:15:17.464
I still have it in my office, and it still works.
298
00:15:17.464 –> 00:15:20.222
It was called Merlin, and you could play tic tac toe.
299
00:15:20.222 –> 00:15:23.510
You can program it to play really simple music.
300
00:15:23.510 –> 00:15:27.032
And I was just blown away about this thing.
301
00:15:27.032 –> 00:15:29.708
I just put in some batteries, and the
302
00:15:29.708 –> 00:15:31.692
most technical stuff I had ever seen before that
303
00:15:31.692 –> 00:15:33.980
was, I don’t know, a flashlight, maybe.
304
00:15:33.980 –> 00:15:37.068
And then this thing came along, and on
305
00:15:37.068 –> 00:15:40.270
top of that, Star Wars came out.
306
00:15:40.270 –> 00:15:44.182
We started to see a lot of science fiction movies
307
00:15:44.182 –> 00:15:46.928
and books and comics, and I just loved it.
308
00:15:46.928 –> 00:15:53.200
I was totally absorbed about this thing going on.
309
00:15:54.290 –> 00:15:57.652
We started to kind of dream about how the future would
310
00:15:57.652 –> 00:16:02.400
be, like with robots and computers and all that stuff.
311
00:16:04.470 –> 00:16:09.496
A few years later, I got a Commodore 64,
312
00:16:09.496 –> 00:16:13.220
one of the first home computers, like you.
313
00:16:14.710 –> 00:16:15.992
And when everyone else.
314
00:16:15.992 –> 00:16:19.166
It was always a big thing with getting baptized.
315
00:16:19.166 –> 00:16:22.390
When you get baptized, not baptized, confirmation.
316
00:16:24.410 –> 00:16:29.488
All my friends, they wanted scooters and stuff.
317
00:16:29.488 –> 00:16:33.696
My biggest dream was a floppy drive.
318
00:16:33.696 –> 00:16:35.104
I had the computer, I wanted a
319
00:16:35.104 –> 00:16:39.072
floppy drive and equipment for my computer.
320
00:16:39.072 –> 00:16:42.816
And I also realized that programming isn’t my thing.
321
00:16:42.816 –> 00:16:44.708
What I’m into is to try to
322
00:16:44.708 –> 00:16:46.548
understand what I can use it for.
323
00:16:46.548 –> 00:16:49.972
I was this dreamer about living.
324
00:16:49.972 –> 00:16:52.980
A lot of the stories that we heard back then, in the early
325
00:16:52.980 –> 00:16:58.142
80s, were about the year 2000, and we saw these futuristic
326
00:16:58.142 –> 00:17:00.664
cities that we were supposed to grow up in.
327
00:17:00.664 –> 00:17:03.464
There were supposed to be flying cars, this
328
00:17:03.464 –> 00:17:06.574
super high tech, going to other planets.
329
00:17:06.574 –> 00:17:09.051
And that was a future that I kind
330
00:17:09.051 –> 00:17:12.652
of was very, what do you call it?
331
00:17:12.652 –> 00:17:13.723
Drawn into.
332
00:17:13.723 –> 00:17:16.268
I thought it was so fascinating to see how
333
00:17:16.268 –> 00:17:18.848
we can kind of live our lives and how
334
00:17:18.848 –> 00:17:22.000
we can use all these things to help us
335
00:17:22.000 –> 00:17:25.718
become, I don’t know, better versions of ourselves.
336
00:17:25.718 –> 00:17:28.175
So that was kind of the beginning, I think. And then I.
337
00:17:28.175 –> 00:17:28.780
Yeah.
338
00:17:29.790 –> 00:17:32.394
Was it pre-millennium?
339
00:17:32.394 –> 00:17:35.188
It was pre 2000, because I don’t know if you had
340
00:17:35.188 –> 00:17:38.644
the same, but there was supposed to be a crash, computer
341
00:17:38.644 –> 00:17:43.892
crash, when the computers were supposed to not be able.
342
00:17:43.892 –> 00:17:51.976
We were so disappointed because I started
343
00:17:51.976 –> 00:17:54.792
my first proper job, then I was.
344
00:17:54.792 –> 00:17:57.528
Done studying, I studied, and you
345
00:17:57.528 –> 00:17:59.080
wanted to have a free day.
346
00:18:00.090 –> 00:18:01.580
And then we were sitting there.
347
00:18:01.580 –> 00:18:03.292
We were sitting there and monitoring all
348
00:18:03.292 –> 00:18:05.650
our IT systems, and we were following.
349
00:18:05.650 –> 00:18:07.852
I remember there was a web page where
350
00:18:07.852 –> 00:18:11.238
people started to report in New Zealand.
351
00:18:11.238 –> 00:18:15.120
When the year 2000 came, nothing happened.
352
00:18:15.120 –> 00:18:16.486
It was so disappointing.
353
00:18:16.486 –> 00:18:18.576
Disappointing as far as I remember, yeah.
354
00:18:18.576 –> 00:18:20.608
In a way, we were
355
00:18:20.608 –> 00:18:24.160
expecting this chaos because of technology.
356
00:18:24.690 –> 00:18:26.916
And then nothing really happened.
357
00:18:26.916 –> 00:18:27.962
It was quite robust.
358
00:18:27.962 –> 00:18:29.188
I remember, I think it was
359
00:18:29.188 –> 00:18:31.002
one elevator somewhere that stopped.
360
00:18:31.002 –> 00:18:32.628
They thought it was because of that,
361
00:18:32.628 –> 00:18:35.256
but it turned out afterward, it’s happening.
362
00:18:35.256 –> 00:18:36.100
It’s happening.
363
00:18:37.110 –> 00:18:39.912
It was kind of disappointing, but also another thing
364
00:18:39.912 –> 00:18:42.990
that I thought of afterwards, not then, but afterwards,
365
00:18:42.990 –> 00:18:45.752
the contrast between the world we thought we would
366
00:18:45.752 –> 00:18:50.348
have in the year 2000, back in the 80s, and
367
00:18:50.348 –> 00:18:52.120
what we actually ended up having,
368
00:18:52.730 –> 00:18:55.388
it was that nothing really happened.
369
00:18:55.388 –> 00:18:57.676
My city still looks the same.
370
00:18:57.676 –> 00:18:58.780
There’s not a big deal.
371
00:18:58.780 –> 00:19:01.856
But going back to the 80s, no flying cars, right?
372
00:19:01.856 –> 00:19:03.472
On Sesame Street, they were saying,
373
00:19:03.472 –> 00:19:04.742
like, there will be flying cars.
374
00:19:04.742 –> 00:19:07.104
Where are they actually?
375
00:19:07.104 –> 00:19:08.540
There are some now in.
376
00:19:09.310 –> 00:19:10.208
I know, I know.
377
00:19:10.208 –> 00:19:11.316
And I tried one, actually.
378
00:19:11.316 –> 00:19:16.468
I went to the car exhibition in Munich here the
379
00:19:16.468 –> 00:19:20.276
other day, a couple of weeks ago, and the first
380
00:19:20.276 –> 00:19:25.898
few flying drone taxi cars, ish, are starting to emerge.
381
00:19:25.898 –> 00:19:29.192
But my point was that going back to the 80s, no
382
00:19:29.192 –> 00:19:33.582
one knew that in the year 2000 we would have the Internet.
383
00:19:33.582 –> 00:19:38.834
And that’s basically a much bigger invention
384
00:19:38.834 –> 00:19:41.618
and a much cooler thing than flying
385
00:19:41.618 –> 00:19:43.340
cars and all the physical stuff.
386
00:19:43.340 –> 00:19:48.332
And it also shows that our imagination we are.
387
00:19:48.332 –> 00:19:52.160
So back to the linear thinking we are just
388
00:19:52.160 –> 00:19:55.328
predicting, or what do you say, projecting from where we are
389
00:19:55.328 –> 00:19:57.654
now, so we make a little cooler houses.
390
00:19:57.654 –> 00:19:59.340
And that’s basically it.
391
00:20:01.090 –> 00:20:03.748
That’s the fascinating thing about all this, because we
392
00:20:03.748 –> 00:20:08.596
can almost not imagine the possibilities that we will
393
00:20:08.596 –> 00:20:11.556
have going into the future, which I think is
394
00:20:11.556 –> 00:20:13.796
very, very, and we are getting better and better
395
00:20:13.796 –> 00:20:17.192
at understanding how to do that, how to make
396
00:20:17.192 –> 00:20:19.800
the best out of the opportunities we have.
397
00:20:19.800 –> 00:20:23.592
I usually start my talks by saying that we’re living
398
00:20:23.592 –> 00:20:26.472
in fascinating times and we have never ever been able
399
00:20:26.472 –> 00:20:29.148
to do so many things as we can right now.
400
00:20:29.148 –> 00:20:32.492
And the point is, technology on one hand,
401
00:20:32.492 –> 00:20:34.780
and science on the other hand, makes us
402
00:20:34.780 –> 00:20:37.820
able to do just amazing stuff right now.
403
00:20:37.820 –> 00:20:40.752
And if we focus on that and figure out,
404
00:20:40.752 –> 00:20:42.544
okay, how can we do this even better?
405
00:20:42.544 –> 00:20:46.592
How can we use this technology to change the way we
406
00:20:46.592 –> 00:20:51.492
live, the way we produce stuff, the way we all that,
407
00:20:51.492 –> 00:20:54.852
that’s the way we should go, don’t care that much about.
408
00:20:54.852 –> 00:20:56.948
Do we really need or want
409
00:20:56.948 –> 00:21:00.938
to live in a Jetsons-ish society?
410
00:21:00.938 –> 00:21:02.000
Probably not.
411
00:21:02.790 –> 00:21:03.512
Yeah.
412
00:21:03.512 –> 00:21:07.112
Although it’s funny because you have to
413
00:21:07.112 –> 00:21:11.662
imagine things when you read Sci-Fi books.
414
00:21:11.662 –> 00:21:14.776
I don’t know if you are a fan. I guess you are.
415
00:21:14.776 –> 00:21:17.788
You are, because you are thinking about the future so
416
00:21:17.788 –> 00:21:21.724
much, but lots of things which have been imagined only
417
00:21:21.724 –> 00:21:27.090
in authors’ books at some point became reality.
418
00:21:27.090 –> 00:21:30.592
So there is a space for that. Right.
419
00:21:30.592 –> 00:21:34.528
But I really believe the same as you
420
00:21:34.528 –> 00:21:39.232
do, that Internet has been such a big
421
00:21:39.232 –> 00:21:45.280
thing because it allowed for collaborative thinking and
422
00:21:45.810 –> 00:21:51.924
collaboration at scale and speed, previously unheard of.
423
00:21:51.924 –> 00:21:56.088
And right now, the next stage, the next step is
424
00:21:56.088 –> 00:22:00.470
the AI, which takes it to a whole other level.
425
00:22:00.470 –> 00:22:01.704
Most definitely.
426
00:22:01.704 –> 00:22:05.064
And then again, and that’s my point, because I
427
00:22:05.064 –> 00:22:10.108
totally agree, and it’s also impressive how much certain,
428
00:22:10.108 –> 00:22:14.748
at least science fiction authors are able to kind
429
00:22:14.748 –> 00:22:18.140
of, maybe not spot on, but in the right
430
00:22:18.140 –> 00:22:20.642
direction of where we’re going oftentimes.
431
00:22:20.642 –> 00:22:22.566
But they’re also lacking.
432
00:22:22.566 –> 00:22:25.382
Like back to the beginning of the 80s when we watched
433
00:22:25.382 –> 00:22:28.928
Star Trek and Star Wars, they could push a button on
434
00:22:28.928 –> 00:22:31.456
the wall and talk to a person in another room.
435
00:22:31.456 –> 00:22:33.492
And that was mind blowing back then.
436
00:22:33.492 –> 00:22:36.042
That’s called an intercom, and it’s a pretty boring
437
00:22:36.042 –> 00:22:39.348
invention. To imagine that you could press one
438
00:22:39.348 –> 00:22:41.780
button and talk to the entire world.
439
00:22:41.780 –> 00:22:43.672
No one talked about that because you could
440
00:22:43.672 –> 00:22:45.320
not really wrap your head around it.
441
00:22:45.320 –> 00:22:47.304
But lots of other.
442
00:22:47.304 –> 00:22:48.760
We figured it out.
443
00:22:48.760 –> 00:22:49.700
Yeah, exactly.
444
00:22:51.270 –> 00:22:52.740
But that’s the thing.
445
00:22:53.350 –> 00:22:59.260
How can we kind of exploit all the opportunities we have
446
00:22:59.260 –> 00:23:04.204
to create a better way for ourselves in the future?
447
00:23:04.204 –> 00:23:07.276
I think that’s kind of my main purpose, my main
448
00:23:07.276 –> 00:23:12.448
goal, and also I feel a lot of people I
449
00:23:12.448 –> 00:23:17.254
meet when I am out doing my talks are worrying
450
00:23:17.254 –> 00:23:20.384
too much about stuff they shouldn’t worry about.
451
00:23:20.384 –> 00:23:26.852
And it’s almost always because they don’t know, they
452
00:23:26.852 –> 00:23:30.132
don’t have the insights they need to understand.
453
00:23:30.132 –> 00:23:32.856
It’s in human nature to be afraid of
454
00:23:32.856 –> 00:23:36.366
things which we don’t know, we haven’t experienced.
455
00:23:36.366 –> 00:23:37.096
Exactly.
456
00:23:37.096 –> 00:23:41.592
And then you saw, because now everyone is talking
457
00:23:41.592 –> 00:23:44.680
about AI, and everyone is talking about a very
458
00:23:44.680 –> 00:23:48.200
narrow part of AI, because AI has been around
459
00:23:48.730 –> 00:23:53.164
since the 50s, but now generative AI that we
460
00:23:53.164 –> 00:23:56.898
are seeing today is just a class of algorithms
461
00:23:56.898 –> 00:24:00.208
underneath the big umbrella of artificial intelligence that has
462
00:24:00.208 –> 00:24:03.808
the capability of creating new stuff based on the
463
00:24:03.808 –> 00:24:06.192
data it’s trained on, which is great.
464
00:24:06.192 –> 00:24:09.260
And it’s not an oracle, it’s not a search engine.
465
00:24:10.510 –> 00:24:12.618
But we thought that to begin with, old schools
466
00:24:12.618 –> 00:24:15.908
started to say, we can’t have this, students will.
467
00:24:15.908 –> 00:24:17.556
And then it turns out maybe
468
00:24:17.556 –> 00:24:19.156
it’s not that dangerous after all.
469
00:24:19.156 –> 00:24:23.214
And if you look a little beyond your first impression,
470
00:24:23.214 –> 00:24:27.352
beyond the obvious, as I usually say, like I read
471
00:24:27.352 –> 00:24:32.552
an article here the other day, it’s the, oh, two
472
00:24:32.552 –> 00:24:35.582
universities, at least I can figure that out afterwards.
473
00:24:35.582 –> 00:24:38.812
But two universities in the states, they have come up with
474
00:24:38.812 –> 00:24:43.404
a solution together with a woman who had a stroke a
475
00:24:43.404 –> 00:24:46.460
couple of years ago who lost the ability to speak.
476
00:24:46.460 –> 00:24:49.216
Just imagine not being able to speak.
477
00:24:49.216 –> 00:24:53.558
And now they can connect her using a bunch of sensors.
478
00:24:53.558 –> 00:24:54.112
I saw this.
479
00:24:54.112 –> 00:24:56.032
A bunch of different technologies. Yeah.
480
00:24:56.032 –> 00:24:59.552
And now they can actually figure out what she
481
00:24:59.552 –> 00:25:03.498
is saying and she can get back her voice
482
00:25:03.498 –> 00:25:07.460
and they can use, then also, amongst other things,
483
00:25:07.460 –> 00:25:14.494
they use generative AI to generate sentences, to translate.
484
00:25:14.494 –> 00:25:16.318
Yes, translate.
485
00:25:16.318 –> 00:25:19.352
And that’s pretty amazing.
486
00:25:19.352 –> 00:25:21.512
No, I get it.
487
00:25:21.512 –> 00:25:22.760
I get the good stuff.
488
00:25:22.760 –> 00:25:26.690
But aren’t you afraid of the risks?
489
00:25:26.690 –> 00:25:30.508
Like, to what extent should tech companies
490
00:25:30.508 –> 00:25:33.218
interfere in our lives and bodies?
491
00:25:33.218 –> 00:25:36.410
I know that you implanted yourself a
492
00:25:36.410 –> 00:25:39.820
chip in your, was it pinky? No, it was here. Right?
493
00:25:40.590 –> 00:25:44.080
I know it’s a very simple technology.
494
00:25:44.080 –> 00:25:47.072
So you are not really sending a lot
495
00:25:47.072 –> 00:25:49.728
of data, storing a lot of data.
496
00:25:49.728 –> 00:25:51.988
You can just probably pay and I
497
00:25:51.988 –> 00:25:53.060
don’t know if you still have it.
498
00:25:53.060 –> 00:25:54.228
Do you still have it?
499
00:25:54.228 –> 00:25:56.180
Yeah, I have four of them now,
500
00:25:56.180 –> 00:25:58.110
it’s kind of turned into a habit.
501
00:26:01.330 –> 00:26:04.536
But I agree most definitely.
502
00:26:04.536 –> 00:26:08.328
We need to be very aware of what we’re doing.
503
00:26:08.328 –> 00:26:14.382
We need to have very clear laws and regulations, and that’s
504
00:26:14.382 –> 00:26:17.756
coming, and I think we can take care of that.
505
00:26:17.756 –> 00:26:19.836
We have a tendency to kind of.
506
00:26:19.836 –> 00:26:24.732
There will always be people who want to use whatever
507
00:26:24.732 –> 00:26:28.784
for bad, and now technology, and that’s a huge problem.
508
00:26:28.784 –> 00:26:30.990
But we need to fix that.
509
00:26:30.990 –> 00:26:35.020
We can’t lock ourselves down and we can’t get
510
00:26:36.750 –> 00:26:42.358
rid of or we can’t use the positive upside.
511
00:26:42.358 –> 00:26:44.212
I think the positive upside, it’s much,
512
00:26:44.212 –> 00:26:46.756
much bigger than the negative side.
513
00:26:46.756 –> 00:26:48.708
And the negative side, that’s the thing that we
514
00:26:48.708 –> 00:26:51.860
need to kind of fight, as we have always
515
00:26:51.860 –> 00:26:54.648
done, and then we need to continue that.
516
00:26:54.648 –> 00:26:59.220
But definitely be very much aware of security
517
00:26:59.990 –> 00:27:03.700
tracking, taking care of your personal information.
518
00:27:04.710 –> 00:27:08.284
And also there will be a huge, what do you
519
00:27:08.284 –> 00:27:14.332
call that liability responsibility for companies to make sure that
520
00:27:14.332 –> 00:27:18.288
when they use, for instance, artificial intelligence in some way
521
00:27:18.288 –> 00:27:21.216
or form, it’s their responsibility to make sure that the
522
00:27:21.216 –> 00:27:24.250
data is consistent, it’s not biased.
523
00:27:26.190 –> 00:27:30.450
That’s most definitely an important issue.
524
00:27:30.450 –> 00:27:32.954
Yes, but that’s theory.
525
00:27:32.954 –> 00:27:38.986
But in reality, governments and regulators usually lag
526
00:27:38.986 –> 00:27:43.096
behind what’s happening in commercial space, let’s say.
527
00:27:43.096 –> 00:27:47.832
And lots of people, like you said, you have
528
00:27:47.832 –> 00:27:53.560
to allow for safe experimentation and failure, and this
529
00:27:53.560 –> 00:27:56.550
needs to happen for us to progress.
530
00:27:57.290 –> 00:27:59.378
But I think lots of people don’t
531
00:27:59.378 –> 00:28:04.040
realize how bad it can get.
532
00:28:05.210 –> 00:28:14.534
For example, I used this DNA gene analysis software
533
00:28:14.534 –> 00:28:18.150
just to understand if I’m prone to any diseases.
534
00:28:18.150 –> 00:28:20.070
What are my ancestors?
535
00:28:20.070 –> 00:28:21.092
Where did they come from?
536
00:28:21.092 –> 00:28:23.070
Unfortunately, Europe.
537
00:28:24.210 –> 00:28:27.796
I hoped for Italy, but 0.1% still.
538
00:28:27.796 –> 00:28:28.720
Me too.
539
00:28:30.930 –> 00:28:31.652
Okay.
540
00:28:31.652 –> 00:28:32.750
We are cousins.
541
00:28:33.510 –> 00:28:35.880
Yeah, me too. I hope for that.
542
00:28:35.880 –> 00:28:38.690
And I was Welsh and Scandinavian.
543
00:28:41.270 –> 00:28:44.216
But the thing is that in the end,
544
00:28:44.216 –> 00:28:47.400
companies, private companies, want to make money.
545
00:28:48.010 –> 00:28:49.554
It’s for the good, but it’s
546
00:28:49.554 –> 00:28:52.470
also for the monetary purposes.
547
00:28:54.170 –> 00:28:57.564
In this case, my friends advised me not to
548
00:28:57.564 –> 00:29:02.032
use my real data, my real ID, just in
549
00:29:02.032 –> 00:29:07.728
case they decide to sell my details to insurance
550
00:29:07.728 –> 00:29:12.500
companies who may charge me a premium if the platform shows
551
00:29:12.500 –> 00:29:15.594
that I am prone to some long-term disease.
552
00:29:15.594 –> 00:29:20.154
How can we make sure that we are in charge?
553
00:29:20.154 –> 00:29:24.488
We are protecting our data, but we are giving the
554
00:29:24.488 –> 00:29:29.208
data which will help us to personalize the services or
555
00:29:29.208 –> 00:29:33.304
the data, or the services which will give us more
556
00:29:33.304 –> 00:29:37.362
benefit than it actually can cause harm.
557
00:29:37.362 –> 00:29:41.020
Yeah, and that’s also a very complicated question.
558
00:29:41.020 –> 00:29:44.280
And unfortunately, in Europe, what did you expect?
559
00:29:47.130 –> 00:29:49.808
But in Europe, we have a different kind of
560
00:29:49.808 –> 00:29:55.600
approach to privacy than, for instance, the US and
561
00:29:55.600 –> 00:30:01.082
also big parts of Asia, China, for instance.
562
00:30:01.082 –> 00:30:03.890
Yeah, but I think that’s a good question because
563
00:30:03.890 –> 00:30:08.804
in China we know that the government will use
564
00:30:08.804 –> 00:30:14.260
information to protect themselves, and that’s bad.
565
00:30:15.270 –> 00:30:17.720
In the US, they use it for.
566
00:30:17.720 –> 00:30:19.518
And the excuse is always that it’s
567
00:30:19.518 –> 00:30:23.342
purely a free country and for commercial reasons.
568
00:30:23.342 –> 00:30:26.482
But I promise you that if the Pentagon
569
00:30:26.482 –> 00:30:29.452
wants to figure out what’s going on with
570
00:30:29.452 –> 00:30:32.492
you, they will get hold of that data.
571
00:30:32.492 –> 00:30:33.532
Most definitely.
572
00:30:33.532 –> 00:30:36.924
And sometimes that could be a very good thing,
573
00:30:36.924 –> 00:30:39.936
because if you are a serial killer that we
574
00:30:39.936 –> 00:30:42.720
would like to get hold of, great.
575
00:30:42.720 –> 00:30:46.768
But if it’s for other reasons, not so great.
576
00:30:46.768 –> 00:30:50.090
And these are some of the fundamentally
577
00:30:52.130 –> 00:30:55.520
extremely hard questions to get hold of.
578
00:30:57.090 –> 00:31:00.960
In Europe, we have a tendency to point to the way
579
00:31:02.130 –> 00:31:06.776
the GDPR and our privacy regulations and all that stuff.
580
00:31:06.776 –> 00:31:10.520
But even in Europe, there are bad guys
581
00:31:10.520 –> 00:31:15.374
who try to misuse information and insights.
582
00:31:15.374 –> 00:31:18.330
But that’s what’s about to happen even
583
00:31:18.330 –> 00:31:21.276
now between Europe and the US.
584
00:31:21.276 –> 00:31:24.972
There was some kind of regulatory agreement this year.
585
00:31:24.972 –> 00:31:28.896
So we can now trust that if I want to
586
00:31:28.896 –> 00:31:33.408
store my sensitive data for my clients and customers, for
587
00:31:33.408 –> 00:31:37.088
instance, on a server from Google, they have said that
588
00:31:37.088 –> 00:31:40.870
they are now following up on the european regulations.
589
00:31:40.870 –> 00:31:42.272
So that should be okay.
590
00:31:42.272 –> 00:31:45.284
And then it’s all about trust at some level.
591
00:31:45.284 –> 00:31:46.692
But there will always be bad
592
00:31:46.692 –> 00:31:49.120
guys and there will always be.
593
00:31:49.730 –> 00:31:54.286
I think that we’re moving into an area or era.
594
00:31:54.286 –> 00:31:56.632
Era, that’s a very hard word for a Norwegian to say.
595
00:31:56.632 –> 00:32:03.608
Not an area. An era where you as a person also need to
596
00:32:03.608 –> 00:32:07.100
be much more aware of what you’re doing with your data.
597
00:32:07.100 –> 00:32:10.268
Because in some regulations, like, for instance, the chip, and
598
00:32:10.268 –> 00:32:12.828
this was a very good kind of curve back to
599
00:32:12.828 –> 00:32:17.030
my chips where I collect data for my temperature.
600
00:32:17.630 –> 00:32:20.928
That data is stored on my device and
601
00:32:20.928 –> 00:32:24.220
not shared with anyone unless I wanted to.
602
00:32:26.030 –> 00:32:29.094
And it’s on blockchain and you have your private keys.
603
00:32:29.094 –> 00:32:30.048
Exactly.
604
00:32:30.048 –> 00:32:31.040
But that’s the way.
605
00:32:31.040 –> 00:32:33.306
But now we’re kind of in a little limbo
606
00:32:33.306 –> 00:32:36.212
area here where the technology, as I said, is
607
00:32:36.212 –> 00:32:39.812
developing so fast, the regulations can’t keep track.
608
00:32:39.812 –> 00:32:41.988
They’re trying to get back on it.
609
00:32:41.988 –> 00:32:44.312
We will have this back and forth thing going on,
610
00:32:44.312 –> 00:32:47.656
and we have had before and we’re still there.
611
00:32:47.656 –> 00:32:50.872
The issue right now is that the consequences will be
612
00:32:50.872 –> 00:32:55.080
so much bigger because of the reach and all that.
613
00:32:55.080 –> 00:32:59.964
But these are complicated questions because we need to,
614
00:32:59.964 –> 00:33:02.316
as users and people, we should think about it.
615
00:33:02.316 –> 00:33:03.996
But we also need to understand that.
616
00:33:03.996 –> 00:33:05.788
I heard some guy, he said that
617
00:33:05.788 –> 00:33:08.992
the time for privacy is gone.
618
00:33:08.992 –> 00:33:14.064
So at some extent, unless you live under
619
00:33:14.064 –> 00:33:18.656
a rock out in the woods, there will
620
00:33:18.656 –> 00:33:22.260
always be some information about you.
621
00:33:22.260 –> 00:33:24.164
And then we also have a tendency to kind
622
00:33:24.164 –> 00:33:29.450
of overrate the meaning of the last added opportunities
623
00:33:29.450 –> 00:33:33.736
because up until now, using a cell phone, using
624
00:33:33.736 –> 00:33:36.712
your payment cards, going to the doctor, all that
625
00:33:36.712 –> 00:33:39.608
stuff, there’s a bunch of data already.
626
00:33:39.608 –> 00:33:41.090
Some are regulated.
627
00:33:42.470 –> 00:33:45.160
We are going to have mechanisms based
628
00:33:45.160 –> 00:33:48.268
on just trust, first of all.
629
00:33:48.268 –> 00:33:51.212
Then we use technology like blockchain or whatever to
630
00:33:51.212 –> 00:33:54.796
make sure that my data is taken care of
631
00:33:54.796 –> 00:33:57.212
and that I have control over my data.
632
00:33:57.212 –> 00:33:58.860
But it’s still hard.
633
00:33:58.860 –> 00:34:00.236
It’s very hard.
634
00:34:00.236 –> 00:34:01.666
Most people don’t.
635
00:34:01.666 –> 00:34:04.192
You and I probably don’t understand it
636
00:34:04.192 –> 00:34:06.352
either, and we don’t think about it.
637
00:34:06.352 –> 00:34:08.560
But it’s also kind of a give and take
638
00:34:08.560 –> 00:34:14.121
relationship, because when GDPR came around, Facebook, for instance,
639
00:34:14.121 –> 00:34:19.348
they closed several services or user experience kind of
640
00:34:19.348 –> 00:34:23.428
algorithms for European users because they didn’t want to
641
00:34:23.428 –> 00:34:28.007
kind of endanger themselves by breaking any of the
642
00:34:28.007 –> 00:34:32.520
regulations, leading to a lot of European users being
643
00:34:32.520 –> 00:34:38.684
quite unsatisfied with the service because suddenly my experience
644
00:34:38.684 –> 00:34:43.212
was less good because they didn’t use my.
645
00:34:43.212 –> 00:34:45.628
So if I search for you and we
646
00:34:45.628 –> 00:34:48.886
were connected, you suddenly just popped up randomly
647
00:34:48.886 –> 00:34:51.420
amongst 1000 others with the same name.
648
00:34:53.070 –> 00:34:54.911
We are kind of strange people.
649
00:34:54.911 –> 00:34:57.094
So this is very complicated.
650
00:34:57.094 –> 00:35:00.256
I understand that I may not make any sense
651
00:35:00.256 –> 00:35:03.908
for anyone right now, but I still think that
652
00:35:03.908 –> 00:35:06.644
we need to make sure that we move forward
653
00:35:06.644 –> 00:35:08.772
and we need to find mechanisms for this.
654
00:35:08.772 –> 00:35:10.740
And we do find mechanisms for this.
655
00:35:10.740 –> 00:35:13.822
Within healthcare, for instance, there are strict
656
00:35:13.822 –> 00:35:16.408
regulations in Europe for what doctors and
657
00:35:16.408 –> 00:35:18.472
hospitals can share with others.
658
00:35:18.472 –> 00:35:21.928
But we are also, fortunately, finding ways to kind of
659
00:35:21.928 –> 00:35:29.260
separate your personal data, the data identifying you as a person, and
660
00:35:29.260 –> 00:35:33.080
the data we could use for research.
661
00:35:34.490 –> 00:35:39.580
The results from your tests, aggregated, anonymized data.
662
00:35:40.270 –> 00:35:41.020
Exactly.
663
00:35:42.190 –> 00:35:43.340
We’re getting that.
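The separation Eirik describes can be sketched in a few lines of Python; the records, field names and values below are entirely made up, and real de-identification involves much more (k-anonymity, access controls, legal review), but it shows the idea of stripping identifiers and sharing only aggregates:

from statistics import mean

# Hypothetical test results: identifiers stored alongside measurements.
records = [
    {"name": "Alice", "national_id": "010170-12345", "age": 52, "cholesterol": 6.1},
    {"name": "Bob",   "national_id": "020280-67890", "age": 47, "cholesterol": 5.4},
    {"name": "Carol", "national_id": "030390-24680", "age": 55, "cholesterol": 6.8},
]

# Step 1: drop direct identifiers, keep only what the research needs.
IDENTIFIERS = {"name", "national_id"}
de_identified = [{k: v for k, v in r.items() if k not in IDENTIFIERS} for r in records]

# Step 2: share aggregates only, never individual rows.
aggregate = {
    "n": len(de_identified),
    "mean_age": round(mean(r["age"] for r in de_identified), 1),
    "mean_cholesterol": round(mean(r["cholesterol"] for r in de_identified), 2),
}
print(aggregate)  # {'n': 3, 'mean_age': 51.3, 'mean_cholesterol': 6.1}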
664
00:35:44.990 –> 00:35:45.808
Yeah, I get that.
665
00:35:45.808 –> 00:35:50.868
But we are in technology, we understand to
666
00:35:50.868 –> 00:35:55.812
a certain extent risks and potential opportunities and
667
00:35:55.812 –> 00:35:58.720
benefits, but most people don’t care.
668
00:35:59.250 –> 00:36:01.396
Most people don’t know, don’t care.
669
00:36:01.396 –> 00:36:03.528
They have a lot on their plate and
670
00:36:03.528 –> 00:36:06.376
they think about other things than this.
671
00:36:06.376 –> 00:36:15.468
So definitely there should be people, institutions, which do
672
00:36:15.468 –> 00:36:19.858
the thinking for the masses, let’s say, but include
673
00:36:19.858 –> 00:36:23.990
them in the conversation, in the discussion.
674
00:36:25.370 –> 00:36:29.616
Yeah, I think that’s where we’re going. We do that.
675
00:36:29.616 –> 00:36:32.096
Like when GDPR came around, that was also
676
00:36:32.096 –> 00:36:35.904
very, for most people, impossible to understand.
677
00:36:35.904 –> 00:36:38.950
Now I just have to trust that my bank
678
00:36:38.950 –> 00:36:42.902
keeps and follows the rules, and if they don’t,
679
00:36:42.902 –> 00:36:45.428
someone will make sure that they get punished for
680
00:36:45.428 –> 00:36:48.228
that and the same thing will happen with the
681
00:36:48.228 –> 00:36:50.180
things that we’re talking about right now.
682
00:36:50.180 –> 00:36:52.362
And I think that companies, that’s
683
00:36:52.362 –> 00:36:53.262
going to be a challenge.
684
00:36:53.262 –> 00:36:58.312
Also, companies will have a much bigger responsibility making sure
685
00:36:58.312 –> 00:37:01.998
that when we use technology, like AI, for instance, it’s
686
00:37:01.998 –> 00:37:05.164
my responsibility as a company to make sure that I
687
00:37:05.164 –> 00:37:10.124
do it in an ethical and secure way and all that.
688
00:37:10.124 –> 00:37:13.532
We talked also about interfering with our
689
00:37:13.532 –> 00:37:17.516
bodies and sharing the data like you
690
00:37:17.516 –> 00:37:22.070
do privately for yourself, for your purposes.
691
00:37:22.070 –> 00:37:26.278
I see lots of positive applications.
692
00:37:26.278 –> 00:37:31.396
For example, the microbots, when you can give
693
00:37:31.396 –> 00:37:40.132
the doses remotely, it’s an incredible opportunity for
694
00:37:40.132 –> 00:37:44.632
people, for patients who previously wouldn’t even be
695
00:37:44.632 –> 00:37:46.936
able to receive any special care.
696
00:37:46.936 –> 00:37:53.800
Right, but where is this line between letting technology
697
00:37:53.800 –> 00:37:58.588
interfere with ourselves, like with our bodies, and being
698
00:37:58.588 –> 00:38:02.716
in control of where it can go?
699
00:38:02.716 –> 00:38:05.770
I don’t know if I make sense.
700
00:38:05.770 –> 00:38:07.036
Yeah, I get what you mean.
701
00:38:07.036 –> 00:38:09.740
And one kind of solution here, a simple
702
00:38:09.740 –> 00:38:12.810
solution, would be what we call edge computing.
703
00:38:13.950 –> 00:38:17.424
Let me use my chips as an example, because I said I
704
00:38:17.424 –> 00:38:20.192
have four of them, and the reason I have four is that
705
00:38:20.192 –> 00:38:23.812
I’m too much of a sissy to take them out again.
706
00:38:23.812 –> 00:38:25.108
I don’t need all four of them.
707
00:38:25.108 –> 00:38:28.292
But I was part of a project where a friend
708
00:38:28.292 –> 00:38:32.148
of mine started a company where they, together with the
709
00:38:32.148 –> 00:38:36.776
university’s hospital of Stockholm, looked at the opportunity to use
710
00:38:36.776 –> 00:38:40.766
implants to monitor your body for health reasons.
711
00:38:40.766 –> 00:38:43.416
And the first one I got, probably the video
712
00:38:43.416 –> 00:38:46.898
you’ve seen, I use that oftentimes in my presentations,
713
00:38:46.898 –> 00:38:51.564
they injected it here in my hand, and it
714
00:38:51.564 –> 00:38:53.442
was quite painful, I must admit.
715
00:38:53.442 –> 00:38:56.578
But as I said, I don’t like needles.
716
00:38:56.578 –> 00:38:59.248
But anyways, that was just to see if it worked.
717
00:38:59.248 –> 00:39:01.942
And the other one I got was a chip
718
00:39:01.942 –> 00:39:06.128
where they put a little LED light on it.
719
00:39:06.128 –> 00:39:10.608
And the point here is that the NFC technology
720
00:39:10.608 –> 00:39:13.472
is a passive technology, so there’s no battery in
721
00:39:13.472 –> 00:39:15.520
the chips that I have in my body.
722
00:39:16.290 –> 00:39:17.428
Same thing as you have in
723
00:39:17.428 –> 00:39:19.732
your access card to your office.
724
00:39:19.732 –> 00:39:22.452
So when you enter your office, you take your access card
725
00:39:22.452 –> 00:39:25.176
out and you put it into the reader on the wall.
726
00:39:25.176 –> 00:39:28.728
Then it makes this induction effect, same thing as
727
00:39:28.728 –> 00:39:33.048
when you recharge your phone without a cord, so it
728
00:39:33.048 –> 00:39:35.688
generates enough energy in the chip to make it
729
00:39:35.688 –> 00:39:39.058
readable or to light up the light bulb.
730
00:39:39.058 –> 00:39:41.468
That’s what they wanted to check, because if they
731
00:39:41.468 –> 00:39:43.868
could do that, then it, in theory, would be
732
00:39:43.868 –> 00:39:48.194
possible to apply sensors on the chip.
733
00:39:48.194 –> 00:39:51.024
So the third one I got is here,
734
00:39:51.024 –> 00:39:52.848
just underneath my, what do you call that?
735
00:39:52.848 –> 00:39:56.368
Collarbone. With a temperature sensor in it,
736
00:39:56.368 –> 00:39:59.046
so I can take my temperature with this chip,
737
00:39:59.046 –> 00:40:05.290
and to read the temperature it’s encrypted.
738
00:40:05.290 –> 00:40:07.828
I had to download a specific
739
00:40:07.828 –> 00:40:10.122
app that’s connected with my chip.
740
00:40:10.122 –> 00:40:13.396
So when I tap that with my phone, it will
741
00:40:13.396 –> 00:40:16.206
activate the chip and it will take a measure.
742
00:40:16.206 –> 00:40:19.256
And the temperature measurement is stored in my phone
743
00:40:19.256 –> 00:40:22.180
and my kind of account and nowhere else.
744
00:40:24.150 –> 00:40:26.472
That’s kind of the idea with it.
745
00:40:26.472 –> 00:40:28.890
And then it knows my baseline.
746
00:40:28.890 –> 00:40:31.346
So the interesting thing is to see the variations.
747
00:40:31.346 –> 00:40:34.274
And then in theory, you could put on more sensors.
748
00:40:34.274 –> 00:40:37.148
And then the cool thing about that is that
749
00:40:37.148 –> 00:40:40.396
this is a very cheap technology to create.
750
00:40:40.396 –> 00:40:43.744
And if we could do that, you could provide simple,
751
00:40:43.744 –> 00:40:46.486
basic health care that you and I take for granted
752
00:40:46.486 –> 00:40:48.672
to places in the world where they do not have
753
00:40:48.672 –> 00:40:53.476
doctors accessible like we are used to. The next step.
754
00:40:53.476 –> 00:40:56.468
Another kind of interesting use case here is if you
755
00:40:56.468 –> 00:40:59.610
have really old parents living on a little island.
756
00:40:59.610 –> 00:41:02.042
That’s what I wanted to mention. Japan.
757
00:41:02.042 –> 00:41:03.636
Yeah, exactly.
758
00:41:03.636 –> 00:41:06.376
And then usually when they get to a certain
759
00:41:06.376 –> 00:41:08.552
age and the risk is too high, they are
760
00:41:08.552 –> 00:41:12.216
forced to move closer to health care.
761
00:41:12.216 –> 00:41:15.080
They don’t necessarily have to be ill,
762
00:41:15.080 –> 00:41:17.096
but that’s what we do now.
763
00:41:17.096 –> 00:41:19.324
We could maybe let them live there for
764
00:41:19.324 –> 00:41:20.892
a couple of more years because we can
765
00:41:20.892 –> 00:41:25.314
monitor them, their health development, remotely.
766
00:41:25.314 –> 00:41:26.700
And this is just a start.
767
00:41:26.700 –> 00:41:30.342
And I don’t think that we are going to implant
768
00:41:30.342 –> 00:41:36.688
people, inject people with microchips, but I think it will
769
00:41:36.688 –> 00:41:40.740
be more common than not, on a voluntary basis, because
770
00:41:40.740 –> 00:41:42.340
it could be a good thing as well.
771
00:41:42.340 –> 00:41:45.396
And then you move into next like nanobots and
772
00:41:45.396 –> 00:41:51.092
customized surgery, or not surgery, but medicines and stuff.
773
00:41:51.092 –> 00:41:52.800
Diagnosis, which is amazing.
774
00:41:54.210 –> 00:41:56.526
How is it better in terms of your chips?
775
00:41:56.526 –> 00:41:59.870
How is it better than non-invasive IoT
776
00:41:59.870 –> 00:42:04.488
wearables, wearable devices like rings? They can
777
00:42:04.488 –> 00:42:07.106
still measure your temperature, they can still measure
778
00:42:07.106 –> 00:42:10.950
your pressure and all the vital signals.
779
00:42:11.530 –> 00:42:15.948
Do you really need to put something inside your body?
780
00:42:15.948 –> 00:42:19.968
No, you don’t really, because you can use your watch.
781
00:42:19.968 –> 00:42:21.568
I have a smartwatch and I
782
00:42:21.568 –> 00:42:24.176
have an Oura ring with twelve.
783
00:42:24.176 –> 00:42:26.784
I think it’s 20 sensors or something like that.
784
00:42:26.784 –> 00:42:31.088
But the point is that it should, in
785
00:42:31.088 –> 00:42:33.828
theory be more accurate because it’s inside your
786
00:42:33.828 –> 00:42:35.508
body and not outside your body.
787
00:42:35.508 –> 00:42:36.698
It could also be easier.
788
00:42:36.698 –> 00:42:38.850
You don’t have to recharge it.
789
00:42:38.850 –> 00:42:41.970
So it’s kind of a different use case.
790
00:42:41.970 –> 00:42:45.828
And then in addition to that, the side effect, which is
791
00:42:45.828 –> 00:42:47.688
cool, is that I can open the door and I can
792
00:42:47.688 –> 00:42:50.200
pay in the canteen at the office building where I’m at.
793
00:42:50.200 –> 00:42:51.304
Come on.
794
00:42:51.304 –> 00:42:52.420
Once in a while.
795
00:42:53.110 –> 00:42:54.574
But that’s not the purpose.
796
00:42:54.574 –> 00:42:56.782
Yeah, but you know that you can open the door
797
00:42:56.782 –> 00:42:59.560
with your phone or with your voice as well.
798
00:43:00.090 –> 00:43:02.040
Yeah, but they don’t have that system.
799
00:43:02.570 –> 00:43:05.530
I don’t even know where my access card is.
800
00:43:05.530 –> 00:43:07.030
So that’s just a gimmick.
801
00:43:08.570 –> 00:43:10.668
I also have my business card here, so when
802
00:43:10.668 –> 00:43:12.768
we meet in person, you can just scan me
803
00:43:12.768 –> 00:43:15.104
with your phone and get my business card.
804
00:43:15.104 –> 00:43:17.200
So that’s just a funny, but I think it’s interesting
805
00:43:17.200 –> 00:43:20.080
to see how technology can be used in a different.
806
00:43:20.080 –> 00:43:22.432
And this is just the same kind of technology that
807
00:43:22.432 –> 00:43:25.300
we’ve been using on cats and dogs for 40 years.
808
00:43:25.300 –> 00:43:27.924
I’m okay with cat and dog because they
809
00:43:27.924 –> 00:43:30.148
walk around and you don’t know where they
810
00:43:30.148 –> 00:43:32.212
are going, and they cannot call you.
811
00:43:32.212 –> 00:43:34.680
But with this, I’m not so sure.
812
00:43:34.680 –> 00:43:35.928
But I see that you are
813
00:43:35.928 –> 00:43:38.472
a perfect candidate for technology.
814
00:43:38.472 –> 00:43:40.190
Like a Neuralink.
815
00:43:40.190 –> 00:43:41.300
Bring it on.
816
00:43:44.230 –> 00:43:48.172
So, would you want to have
817
00:43:48.172 –> 00:43:51.362
a Neuralink implant-like connection?
818
00:43:51.362 –> 00:43:53.116
No, I think that would be.
819
00:43:53.116 –> 00:43:56.700
I mean, again, when I said yes to this
820
00:43:56.700 –> 00:44:00.172
implant project, it was because I was told and
821
00:44:00.172 –> 00:44:02.800
described and shown, why are we doing this?
822
00:44:02.800 –> 00:44:05.184
And it made sense to me, and I want to
823
00:44:05.184 –> 00:44:08.176
be part of that and contribute because I thought or
824
00:44:08.176 –> 00:44:09.984
think that this is a good thing to do.
825
00:44:09.984 –> 00:44:12.512
And the same thing about neuralink, I guess,
826
00:44:12.512 –> 00:44:19.252
that if I can contribute to creating or
827
00:44:19.252 –> 00:44:22.420
being able to create a better world, sure.
828
00:44:22.420 –> 00:44:24.324
And I trust in technology.
829
00:44:24.324 –> 00:44:27.416
Maybe not that much in Musk, in that case.
830
00:44:27.416 –> 00:44:29.358
But that’s also one of the reasons
831
00:44:29.358 –> 00:44:32.232
that he has moved the world forward.
832
00:44:32.232 –> 00:44:34.900
I think you can like or dislike him.
833
00:44:36.090 –> 00:44:37.692
You see a lot of people who
834
00:44:37.692 –> 00:44:42.770
are really being the true disruptors.
835
00:44:42.770 –> 00:44:44.274
They are kind of strange.
836
00:44:44.274 –> 00:44:47.244
They are kind of crazy. But, yeah.
837
00:44:47.244 –> 00:44:48.908
I wouldn’t be negative to it.
838
00:44:48.908 –> 00:44:51.718
I wouldn’t do it in a back room somewhere.
839
00:44:51.718 –> 00:44:54.208
But, no, I think that’s a good idea.
840
00:44:54.208 –> 00:44:54.656
Yeah.
841
00:44:54.656 –> 00:44:57.888
We say in Polish, there is a saying that.
842
00:44:57.888 –> 00:45:01.226
I think it’s international genius
843
00:45:01.226 –> 00:45:03.120
and insanity go together.
844
00:45:04.850 –> 00:45:05.508
Yeah.
845
00:45:05.508 –> 00:45:07.760
And if they need a. Here.
846
00:45:10.450 –> 00:45:11.236
Okay.
847
00:45:11.236 –> 00:45:16.320
So, have you heard of Bryan Johnson and his.
848
00:45:17.730 –> 00:45:18.954
He’s.
849
00:45:18.954 –> 00:45:20.766
He used to be an entrepreneur.
850
00:45:20.766 –> 00:45:25.128
He sold his company for hundreds of millions, and now
851
00:45:25.128 –> 00:45:29.596
he’s trying to turn himself into an 18-year-old.
852
00:45:29.596 –> 00:45:32.524
He’s, like, 48, I think, at this point.
853
00:45:32.524 –> 00:45:38.418
So he’s rigorously following all the procedures, and he’s
854
00:45:38.418 –> 00:45:41.376
taking, like, 200 pills a day just to.
855
00:45:41.376 –> 00:45:42.640
And he tracks everything.
856
00:45:42.640 –> 00:45:45.648
So he’s trying to reverse aging, let’s say,
857
00:45:45.648 –> 00:45:48.848
the Benjamin Button of modern ages, although he
858
00:45:48.848 –> 00:45:50.678
doesn’t look so good as Brad Pitt.
859
00:45:50.678 –> 00:45:53.108
But anyway, would you want to do the same?
860
00:45:53.108 –> 00:45:56.708
No, I don’t know, because I think there’s a
861
00:45:56.708 –> 00:46:00.148
difference between I would like to be, if I
862
00:46:00.148 –> 00:46:03.448
could be 150 or 200 years, love it.
863
00:46:03.448 –> 00:46:05.896
And of course it would be a good thing to not
864
00:46:05.896 –> 00:46:11.790
end up being this old, can’t move around, everything is hurting.
865
00:46:11.790 –> 00:46:18.530
But I have no desire to reverse aging or be 18
866
00:46:18.530 –> 00:46:26.780
again. For that, the reason needs to be right.
867
00:46:26.780 –> 00:46:29.184
I think it would be fun to live for a
868
00:46:29.184 –> 00:46:34.750
long time, at least for now, but not going to.
869
00:46:34.750 –> 00:46:36.848
I feel that he’s doing this for.
870
00:46:36.848 –> 00:46:39.152
The wrong reasons, but someone has to.
871
00:46:39.152 –> 00:46:41.670
So you said that we need guinea pigs.
872
00:46:41.670 –> 00:46:42.928
Yeah, sure.
873
00:46:42.928 –> 00:46:44.820
And then go ahead, I don’t care.
874
00:46:44.820 –> 00:46:48.080
I don’t mind people do it, but I wouldn’t do it.
875
00:46:48.610 –> 00:46:51.220
I feel quite comfortable being like I am right now.
876
00:46:51.220 –> 00:46:53.614
No, but obviously, because I’ve said, I’ve
877
00:46:53.614 –> 00:46:56.392
been asked that before, could you see
878
00:46:56.392 –> 00:47:00.104
yourself being 200 years old and. Yeah, I could.
879
00:47:00.104 –> 00:47:03.512
I think that would be amazing, obviously.
880
00:47:03.512 –> 00:47:06.652
Or it wouldn’t be a good thing if I ended up
881
00:47:06.652 –> 00:47:12.652
being an old, worn out body, couldn’t move around all that
882
00:47:12.652 –> 00:47:19.564
stuff, then you need to slow down the aging process.
883
00:47:19.564 –> 00:47:22.576
But I feel that he is trying to go backwards and
884
00:47:22.576 –> 00:47:26.300
I’m not sure if that, for me, that’s not the reason.
885
00:47:26.990 –> 00:47:27.760
Good enough.
886
00:47:27.760 –> 00:47:32.650
Maybe he’s just buying himself time to wait or to arrive
887
00:47:32.650 –> 00:47:39.908
at the point where scientists can help us move to the
888
00:47:39.908 –> 00:47:43.870
cloud, become immortal, at least in a digital version.
889
00:47:43.870 –> 00:47:46.040
And then you can live 200 years.
890
00:47:46.040 –> 00:47:48.792
But you would experience your
891
00:47:48.792 –> 00:47:51.620
senses in a different way.
892
00:47:52.230 –> 00:47:55.518
Would you want to the whole metaverse?
893
00:47:55.518 –> 00:47:57.400
And what’s next?
894
00:47:58.090 –> 00:48:01.400
Downloading your consciousness into the cloud?
895
00:48:02.490 –> 00:48:07.532
Not sure, because for me that’s a cool idea. Maybe.
896
00:48:07.532 –> 00:48:14.070
And some minds might be good to save some minds.
897
00:48:14.070 –> 00:48:15.168
I don’t know.
898
00:48:15.168 –> 00:48:16.608
There are already.
899
00:48:16.608 –> 00:48:19.456
You can make a replica of yourself, there are services that do that.
900
00:48:19.456 –> 00:48:21.988
Yes, I know, but it’s quite creepy, right?
901
00:48:21.988 –> 00:48:25.730
Like you can talk with your exes at
902
00:48:25.730 –> 00:48:29.940
the best case scenario, the exes from the times before
903
00:48:29.940 –> 00:48:34.130
it started going wrong, or your deceased ones.
904
00:48:36.070 –> 00:48:38.120
I mean, at some point.
905
00:48:38.120 –> 00:48:40.222
I also saw an example from South
906
00:48:40.222 –> 00:48:41.496
Korea a couple of years ago.
907
00:48:41.496 –> 00:48:45.848
And there’s a tragic story about a mom or a
908
00:48:45.848 –> 00:48:49.778
family losing their daughter, six year old to cancer.
909
00:48:49.778 –> 00:48:50.498
And she died.
910
00:48:50.498 –> 00:48:52.972
And they had the opportunity and I don’t know why
911
00:48:52.972 –> 00:48:56.972
and how that came about, but they ended up having
912
00:48:56.972 –> 00:49:03.568
a company working with VR and the metaverse, and
913
00:49:03.568 –> 00:49:08.182
they created an avatar of their daughter and the mother.
914
00:49:08.182 –> 00:49:09.776
The movie that you can see is
915
00:49:09.776 –> 00:49:11.930
where the mother actually meets her daughter.
916
00:49:11.930 –> 00:49:16.388
They use some kind of generative AI model.
917
00:49:16.388 –> 00:49:19.402
This was before GPT was launched.
918
00:49:19.402 –> 00:49:22.964
But something in that direction trained on
919
00:49:22.964 –> 00:49:25.208
all the memories they had about her.
920
00:49:25.208 –> 00:49:27.192
Her voice was sampled so she could
921
00:49:27.192 –> 00:49:29.294
actually see and talk to her daughter.
922
00:49:29.294 –> 00:49:30.808
She even had gloves on.
923
00:49:30.808 –> 00:49:32.926
She could touch her daughter.
924
00:49:32.926 –> 00:49:36.520
I’m not sure if that’s a good thing or not.
925
00:49:38.330 –> 00:49:42.972
And I talked to a psychologist about this, and she
926
00:49:42.972 –> 00:49:47.836
told me that in some cases, it could be if
927
00:49:47.836 –> 00:49:52.112
something happens, like an accident, then being able to kind of take the
928
00:49:52.112 –> 00:49:55.264
last farewell could be a good thing.
929
00:49:55.264 –> 00:49:57.792
But it’s also something that at some
930
00:49:57.792 –> 00:49:59.248
point, you need to move on.
931
00:49:59.248 –> 00:50:02.452
And this might hold you back in the past.
932
00:50:02.452 –> 00:50:04.452
Yeah, and. I don’t know.
933
00:50:04.452 –> 00:50:05.600
I have no idea.
934
00:50:06.370 –> 00:50:10.436
But then you have this other service, as you tried,
935
00:50:10.436 –> 00:50:12.692
or I tried as well, where you can kind of
936
00:50:12.692 –> 00:50:18.030
create your memory based on your voice for your relatives.
937
00:50:18.030 –> 00:50:20.728
So my great-great-grandchildren could talk to
938
00:50:20.728 –> 00:50:22.792
me even though I’ve been dead for years.
939
00:50:22.792 –> 00:50:26.332
And then, based on what I’ve trained it on,
940
00:50:26.332 –> 00:50:31.932
that could be more fun, I think, than anything else.
941
00:50:31.932 –> 00:50:34.140
So we have to kind of distinguish between
942
00:50:34.140 –> 00:50:38.002
what’s necessary, what’s not necessary, what is entertainment,
943
00:50:38.002 –> 00:50:40.700
and what is maybe something for.
944
00:50:42.110 –> 00:50:43.184
What do you call that?
945
00:50:43.184 –> 00:50:44.830
Therapeutic.
946
00:50:44.830 –> 00:50:45.580
Yeah.
947
00:50:47.870 –> 00:50:50.368
That’s different as well, because in the middle of
948
00:50:50.368 –> 00:50:54.020
this, and that’s interesting, I learned that we men,
949
00:50:54.020 –> 00:50:56.836
we are not as good at opening ourselves up
950
00:50:56.836 –> 00:51:00.948
for psychologists, for instance, but we are quite good
951
00:51:00.948 –> 00:51:04.548
at opening up when talking to a computer.
952
00:51:04.548 –> 00:51:06.750
And if that helps, and strangers.
953
00:51:07.410 –> 00:51:09.908
Strangers and computers, and quite strange as
954
00:51:09.908 –> 00:51:14.960
well, but if that helps people, great.
955
00:51:15.530 –> 00:51:20.796
But, yeah, this is a landscape where it’s hard
956
00:51:20.796 –> 00:51:23.480
to kind of say what’s right and what’s wrong.
957
00:51:24.170 –> 00:51:24.732
It is.
958
00:51:24.732 –> 00:51:27.804
And that’s why you need diverse people,
959
00:51:27.804 –> 00:51:32.918
diverse specialists to help design those algorithms.
960
00:51:32.918 –> 00:51:37.936
Psychologists really need to have their say on
961
00:51:37.936 –> 00:51:42.160
what potential risks it can bring.
962
00:51:42.690 –> 00:51:43.348
Definitely.
963
00:51:43.348 –> 00:51:46.372
You said a really interesting thing: why
964
00:51:46.372 –> 00:51:49.908
should we keep reliving the past while there
965
00:51:49.908 –> 00:51:54.696
are whole new opportunities and new things?
966
00:51:54.696 –> 00:51:57.860
There are new memories we can create.
967
00:51:58.550 –> 00:52:01.438
And I don’t know if you’ve
968
00:52:01.438 –> 00:52:04.390
seen this black mirror episode.
969
00:52:04.390 –> 00:52:08.812
I think it has been like a similar case in
970
00:52:08.812 –> 00:52:13.452
many different Sci-Fi movies, but it’s actually happening at this
971
00:52:13.452 –> 00:52:19.376
point with Meta relaunching the Ray-Ban glasses, which allow
972
00:52:19.376 –> 00:52:24.656
you, in real time and in first-person view, to record
973
00:52:24.656 –> 00:52:27.950
all there is, like all that you are doing.
974
00:52:27.950 –> 00:52:31.780
And in this black mirror episode, there is this
975
00:52:31.780 –> 00:52:37.146
family, I think, and they just keep watching memories.
976
00:52:37.146 –> 00:52:39.268
And while you are doing it, you are
977
00:52:39.268 –> 00:52:41.876
not living, you are not creating new ones.
978
00:52:41.876 –> 00:52:45.570
And where is the balance?
979
00:52:45.570 –> 00:52:48.862
Yeah, and it depends.
980
00:52:48.862 –> 00:52:51.838
Like I said, I talk to psychologists because I think it’s
981
00:52:51.838 –> 00:52:57.196
very easy for me to say that you shouldn’t do that.
982
00:52:57.196 –> 00:52:59.480
But I haven’t lost a child.
983
00:53:00.810 –> 00:53:02.780
That must be one of the most
984
00:53:02.780 –> 00:53:06.250
terrible things a parent can experience.
985
00:53:06.250 –> 00:53:09.552
And then in that trauma, if that kind of
986
00:53:09.552 –> 00:53:12.566
solution could help you to kind of overcome.
987
00:53:12.566 –> 00:53:15.952
But like my psychologist, he’s not
988
00:53:15.952 –> 00:53:17.936
mine or she’s not mine. There’s another one.
989
00:53:17.936 –> 00:53:21.748
But she said that that’s the case.
990
00:53:21.748 –> 00:53:24.852
If it’s therapeutically wise to
991
00:53:24.852 –> 00:53:26.756
do that, then it’s okay.
992
00:53:26.756 –> 00:53:29.120
If it’s not just kind of a service.
993
00:53:30.290 –> 00:53:34.472
We need experts, like you said, diverse experts to
994
00:53:34.472 –> 00:53:39.688
help us understand when to use what for what.
995
00:53:39.688 –> 00:53:42.360
Then I forgot to ask you about another
996
00:53:42.360 –> 00:53:46.028
subject of yours, which is also very interesting.
997
00:53:46.028 –> 00:53:47.690
The drones.
998
00:53:47.690 –> 00:53:51.228
I know that you love drones, and I saw on
999
00:53:51.228 –> 00:53:57.904
one of your speeches you bought so many, those, I
1000
00:53:57.904 –> 00:54:01.472
think disposable, hopefully recycled plastic, I would say.
1001
00:54:01.472 –> 00:54:03.894
I hope so. Drones.
1002
00:54:03.894 –> 00:54:07.392
So what do you find so fascinating about it?
1003
00:54:07.392 –> 00:54:12.138
I know there are good use cases.
1004
00:54:12.138 –> 00:54:16.292
I think you’ve heard of Zipline, this American
1005
00:54:16.292 –> 00:54:20.042
or African company which uses drones to deliver
1006
00:54:20.042 –> 00:54:23.650
blood to remote places and remote hospitals.
1007
00:54:24.310 –> 00:54:26.808
But there are other uses of
1008
00:54:26.808 –> 00:54:29.854
drones like surveillance and military.
1009
00:54:29.854 –> 00:54:32.550
What fascinates you in them?
1010
00:54:32.550 –> 00:54:33.592
Yeah.
1011
00:54:33.592 –> 00:54:36.876
First of all, I think my fascination for drones started
1012
00:54:36.876 –> 00:54:41.350
years and years ago just because it was fun flying.
1013
00:54:43.770 –> 00:54:49.746
You know, drones have turned into a huge industry.
1014
00:54:49.746 –> 00:54:52.208
And I’ve been fortunate enough to be part of
1015
00:54:52.208 –> 00:54:54.992
a big conference here in Norway and also went
1016
00:54:54.992 –> 00:54:58.240
to the states to see some of the proper
1017
00:54:58.240 –> 00:55:02.356
use of drones, not only for military purposes, but
1018
00:55:02.356 –> 00:55:05.450
you know, most of all for civilian purposes.
1019
00:55:05.450 –> 00:55:07.140
And one thing is, like you said,
1020
00:55:07.140 –> 00:55:11.890
transportation of blood samples or even organs.
1021
00:55:11.890 –> 00:55:14.552
I know that there’s a project going on here in
1022
00:55:14.552 –> 00:55:21.650
Oslo between the two big hospitals, because time is crucial.
1023
00:55:22.230 –> 00:55:26.482
We could also see drones transporting goods
1024
00:55:26.482 –> 00:55:29.532
and us people for that matter, but
1025
00:55:29.532 –> 00:55:37.510
also for rescue workers, for inspecting buildings.
1026
00:55:38.170 –> 00:55:40.332
A friend of mine showed me, he sent
1027
00:55:40.332 –> 00:55:42.192
me a video clip here the other day.
1028
00:55:42.192 –> 00:55:45.456
He lives in Bergen, in an apartment building,
1029
00:55:45.456 –> 00:55:48.832
and the apartment building is on the dock, so.
1030
00:55:48.832 –> 00:55:50.528
And they wanted to wash it.
1031
00:55:50.528 –> 00:55:53.012
Sometimes you need to kind of wash your building.
1032
00:55:53.012 –> 00:55:55.652
And it was so expensive because putting up
1033
00:55:55.652 –> 00:55:59.230
a kind of construction thing in the sea,
1034
00:56:00.290 –> 00:56:03.546
since it’s on the dockside, was extremely expensive.
1035
00:56:03.546 –> 00:56:06.104
But then they came across a company on the
1036
00:56:06.104 –> 00:56:09.966
west coast of Norway doing building cleaning with drones.
1037
00:56:09.966 –> 00:56:12.328
And it was one fourth of the
1038
00:56:12.328 –> 00:56:14.536
price and took one third of the time.
1039
00:56:14.536 –> 00:56:16.204
Two guys came with a car and a
1040
00:56:16.204 –> 00:56:19.240
compressor and they flew a drone washing the building, you
1041
00:56:20.650 –> 00:56:22.844
know. It was like, wow, is that possible?
1042
00:56:22.844 –> 00:56:23.916
And, yeah, it is.
1043
00:56:23.916 –> 00:56:28.560
And you could do it and then, just to show
1044
00:56:28.560 –> 00:56:32.358
how simple it is, I bought the new DJI drone.
1045
00:56:32.358 –> 00:56:34.590
DJI being one of the biggest
1046
00:56:34.590 –> 00:56:36.496
drone producers in the world.
1047
00:56:36.496 –> 00:56:40.448
It’s a Chinese company, but they have the
1048
00:56:40.448 –> 00:56:43.332
new drone, the first-person view drone.
1049
00:56:43.332 –> 00:56:46.506
You have goggles on, and it’s like you’re
1050
00:56:46.506 –> 00:56:48.880
sitting inside the drone and flying it.
1051
00:56:49.650 –> 00:56:52.148
Five years ago, ten years ago, that was pretty hard.
1052
00:56:52.148 –> 00:56:55.386
Now you just use a little controller that you tilt
1053
00:56:55.386 –> 00:56:57.928
with your hand, and then the drone turns around and
1054
00:56:57.928 –> 00:57:00.734
you have a dot that I see from the camera,
1055
00:57:00.734 –> 00:57:02.632
and I can just follow that around.
1056
00:57:02.632 –> 00:57:04.648
And then a friend of mine at my
1057
00:57:04.648 –> 00:57:06.562
office, we decided to try to fly.
1058
00:57:06.562 –> 00:57:11.260
He stood like this, and I flew over his head.
1059
00:57:11.260 –> 00:57:12.200
Oh, no.
1060
00:57:12.730 –> 00:57:14.810
Does he still have hair?
1061
00:57:14.810 –> 00:57:15.340
Yeah.
1062
00:57:15.340 –> 00:57:16.860
And arms and everything.
1063
00:57:16.860 –> 00:57:18.156
Because it’s not hard.
1064
00:57:18.156 –> 00:57:19.228
It’s so super simple.
1065
00:57:19.228 –> 00:57:22.928
I wouldn’t recommend people to do that or encourage people to
1066
00:57:22.928 –> 00:57:27.216
do that, but it was not a problem at all.
1067
00:57:27.216 –> 00:57:29.568
And that’s because the drone now, with all
1068
00:57:29.568 –> 00:57:33.366
the technology inside, and this is a commercial
1069
00:57:33.366 –> 00:57:35.824
drone you can buy in the store, it
1070
00:57:35.824 –> 00:57:38.996
costs, like, I don’t know, €500 or something.
1071
00:57:38.996 –> 00:57:40.586
It’s not that expensive.
1072
00:57:40.586 –> 00:57:42.750
And it’s filled up with sensors.
1073
00:57:43.490 –> 00:57:45.978
It’s filled up with artificial intelligence
1074
00:57:45.978 –> 00:57:47.588
and connections and all that stuff.
1075
00:57:47.588 –> 00:57:49.496
So it makes it possible to do that.
1076
00:57:49.496 –> 00:57:51.368
And then just imagine if you’re a
1077
00:57:51.368 –> 00:57:54.020
little trained, you can do amazing stuff.
1078
00:57:55.210 –> 00:57:59.660
I think that the combination of: we have huge
1079
00:57:59.660 –> 00:58:03.762
drones that can transport heavy stuff, we have small drones
1080
00:58:03.762 –> 00:58:06.950
that can transport small stuff, and we can enter and surveil.
1081
00:58:08.490 –> 00:58:11.720
If it’s accidents or whatever.
1082
00:58:13.210 –> 00:58:16.190
The opportunities are tremendous.
1083
00:58:16.190 –> 00:58:17.456
That’s one kind.
1084
00:58:17.456 –> 00:58:20.490
And now there’s a guy flying a drone on Mars.
1085
00:58:21.810 –> 00:58:23.360
How crazy is that?
1086
00:58:24.130 –> 00:58:25.440
It is crazy.
1087
00:58:26.130 –> 00:58:28.532
It is crazy. Yeah.
1088
00:58:28.532 –> 00:58:31.178
But things are changing so rapidly.
1089
00:58:31.178 –> 00:58:34.952
There are so many stimuli, so much
1090
00:58:34.952 –> 00:58:37.534
information all around us, and I don’t
1091
00:58:37.534 –> 00:58:41.110
know if you sometimes get overwhelmed. I do.
1092
00:58:41.110 –> 00:58:42.840
I’m in tech. I love tech.
1093
00:58:42.840 –> 00:58:47.660
But it’s just so difficult to keep up with
1094
00:58:47.660 –> 00:58:52.440
all that is happening, even in your chosen field.
1095
00:58:53.130 –> 00:58:54.760
How can normal people.
1096
00:58:55.370 –> 00:58:56.610
We are not normal.
1097
00:58:56.610 –> 00:58:58.288
Okay, I’m talking to myself.
1098
00:58:58.288 –> 00:58:59.654
I think I’m not normal.
1099
00:58:59.654 –> 00:59:02.940
But how can other people.
1100
00:59:04.990 –> 00:59:08.272
How can they keep up with the pace of technology?
1101
00:59:08.272 –> 00:59:11.870
How can they not miss on opportunities?
1102
00:59:12.610 –> 00:59:14.990
Is there any way, any advice?
1103
00:59:15.890 –> 00:59:18.708
Listen to your podcast, of course.
1104
00:59:18.708 –> 00:59:20.506
It’s almost impossible.
1105
00:59:20.506 –> 00:59:21.546
It’s almost impossible.
1106
00:59:21.546 –> 00:59:23.272
You have to kind of choose, is what I
1107
00:59:23.272 –> 00:59:25.192
say. At least now I try to.
1108
00:59:25.192 –> 00:59:29.672
And that’s maybe part of my job, I
1109
00:59:29.672 –> 00:59:33.300
feel, is to make or help people
1110
00:59:33.990 –> 00:59:37.884
be able to kind of understand what’s going on
1111
00:59:37.884 –> 00:59:41.836
that’s relevant for your or our world right now.
1112
00:59:41.836 –> 00:59:43.388
There’s a bunch, I mean, we can talk
1113
00:59:43.388 –> 00:59:46.172
about quantum computers, but for most of us,
1114
00:59:46.172 –> 00:59:48.780
that’s just out of this world.
1115
00:59:51.630 –> 00:59:53.216
I’d say that right now.
1116
00:59:53.216 –> 00:59:56.326
I think that it’s your responsibility
1117
00:59:56.326 –> 00:59:57.392
in one way or the other.
1118
00:59:57.392 –> 01:00:00.432
And we can’t be experts, all of us, that’s not
1119
01:00:00.432 –> 01:00:04.086
possible and definitely not within all these different fields.
1120
01:00:04.086 –> 01:00:06.308
But you could choose, like understand a little
1121
01:00:06.308 –> 01:00:09.098
bit more about data and data analysis.
1122
01:00:09.098 –> 01:00:09.956
What does it mean?
1123
01:00:09.956 –> 01:00:11.220
What can we use it for?
1124
01:00:11.220 –> 01:00:16.070
How does it apply to my job or my industry?
1125
01:00:16.070 –> 01:00:17.534
Read an article.
1126
01:00:17.534 –> 01:00:19.848
There’s a bunch of places where
1127
01:00:19.848 –> 01:00:21.576
you can find information about that.
1128
01:00:21.576 –> 01:00:23.336
I’ll come back to a few, and then
1129
01:00:23.336 –> 01:00:30.860
I think that artificial intelligence is an important area
1130
01:00:30.860 –> 01:00:34.040
of knowledge where we need to understand the difference.
1131
01:00:34.810 –> 01:00:35.836
What is it?
1132
01:00:35.836 –> 01:00:37.080
What is it not?
1133
01:00:38.490 –> 01:00:41.360
And try to kind of experiment a little bit,
1134
01:00:41.360 –> 01:00:44.582
try ChatGPT, try to create a few pictures
1135
01:00:44.582 –> 01:00:47.392
on DALL-E or whatever service you like and just
1136
01:00:47.392 –> 01:00:49.740
see what happens if you change out the word.
1137
01:00:51.010 –> 01:00:55.012
That way you can learn a little bit how it works.
1138
01:00:55.012 –> 01:01:00.164
Then I think that extended realities, both VR and
1139
01:01:00.164 –> 01:01:05.608
AR, it’s something that’s going to be more
1140
01:01:05.608 –> 01:01:08.776
and more important in our future going forward.
1141
01:01:08.776 –> 01:01:11.288
Not that we are moving into the metaverse, but
1142
01:01:11.288 –> 01:01:14.632
it’s tools that we could use to collaborate, or
1143
01:01:14.632 –> 01:01:19.084
there’s a bunch of exciting examples around that.
1144
01:01:19.084 –> 01:01:22.828
And also robotics and automation, and then
1145
01:01:22.828 –> 01:01:25.762
not only robots that are physical robots,
1146
01:01:25.762 –> 01:01:27.804
but software robots, even more important.
1147
01:01:27.804 –> 01:01:30.896
So watch a TED talk, read a little
1148
01:01:30.896 –> 01:01:36.150
article. I use MIT Technology Review, a wonderful magazine.
1149
01:01:36.150 –> 01:01:38.830
You don’t have to be a super tech
1150
01:01:38.830 –> 01:01:41.508
developer to understand what’s going on there.
1151
01:01:41.508 –> 01:01:47.146
Wired magazine, there’s a lot of different areas.
1152
01:01:47.146 –> 01:01:49.040
The Verge online.
1153
01:01:50.770 –> 01:01:53.252
Try to kind of get hold of some of your
1154
01:01:53.252 –> 01:01:58.120
interests and what it means for what you’re doing.
1155
01:01:58.120 –> 01:02:01.672
And another thing that I think it’s important that I try
1156
01:02:01.672 –> 01:02:05.704
to kind of encourage people to do is both not to
1157
01:02:05.704 –> 01:02:08.680
be afraid of losing your job, because you’re not.
1158
01:02:09.210 –> 01:02:12.236
My claim is that never before has it been so
1159
01:02:12.236 –> 01:02:14.812
important to be a human being as right now.
1160
01:02:14.812 –> 01:02:17.628
And looking at your job, if you look at your
1161
01:02:17.628 –> 01:02:20.982
job, your job today is a bunch of different tasks.
1162
01:02:20.982 –> 01:02:23.408
Some of those tasks will change and they
1163
01:02:23.408 –> 01:02:25.840
have changed and they will continue to change.
1164
01:02:25.840 –> 01:02:29.248
Look at which of these tasks could some of these
1165
01:02:29.248 –> 01:02:34.522
technologies or services or whatever help me do better, reduce
1166
01:02:34.522 –> 01:02:38.452
risk, be more efficient, free up time so I can
1167
01:02:38.452 –> 01:02:44.782
use my human capabilities like intuition and experience and creativity
1168
01:02:44.782 –> 01:02:46.980
and all that to solve other problems.
1169
01:02:47.990 –> 01:02:53.976
Don’t be afraid of it. Try to kind of pick your life
1170
01:02:53.976 –> 01:02:59.292
a little apart and see what can be used for what?
1171
01:02:59.292 –> 01:03:02.840
To help me live a better life.
1172
01:03:04.170 –> 01:03:06.546
I still feel like we are living in a bubble.
1173
01:03:06.546 –> 01:03:13.696
I get surprised when I attend conferences which are maybe
1174
01:03:13.696 –> 01:03:18.064
not so tech focused, but some industry focused and people
1175
01:03:18.064 –> 01:03:22.080
haven’t heard of ChatGPT or DALL-E or.
1176
01:03:23.090 –> 01:03:27.610
Okay, these are maybe more specific, but like a normal
1177
01:03:27.610 –> 01:03:33.784
person from a small or medium sized town who, I
1178
01:03:33.784 –> 01:03:37.976
don’t know, works as a teacher, they don’t go to,
1179
01:03:37.976 –> 01:03:43.912
they don’t know those articles, those media outlets, and they
1180
01:03:43.912 –> 01:03:46.040
will not even know how to start.
1181
01:03:46.810 –> 01:03:50.364
But you are right that you have to maybe
1182
01:03:50.364 –> 01:03:55.900
try to have curiosity, to try to understand which
1183
01:03:55.900 –> 01:04:01.200
parts of your job are boring, are mundane, and
1184
01:04:01.200 –> 01:04:04.528
they are logic based in a way.
1185
01:04:04.528 –> 01:04:09.820
And just know that potentially technology can help you
1186
01:04:10.450 –> 01:04:13.876
to either eliminate it or optimize your time.
1187
01:04:13.876 –> 01:04:16.772
So like you said, you can be more
1188
01:04:16.772 –> 01:04:19.280
human in whatever work you are doing.
1189
01:04:19.810 –> 01:04:22.756
Yeah, and that’s a very important point.
1190
01:04:22.756 –> 01:04:26.888
But I’m betting you that if you start to kind of
1191
01:04:26.888 –> 01:04:30.008
look in your network of peers within your, if you’re a
1192
01:04:30.008 –> 01:04:32.260
teacher, for instance, I think that’s a good example.
1193
01:04:33.990 –> 01:04:37.256
You will find someone who is interested in
1194
01:04:37.256 –> 01:04:38.972
both the good side and the bad side
1195
01:04:38.972 –> 01:04:40.802
and talk about that on LinkedIn, for instance.
1196
01:04:40.802 –> 01:04:43.980
In Norway, we have a wonderful teacher who
1197
01:04:43.980 –> 01:04:46.348
really has gone all in on trying to
1198
01:04:46.348 –> 01:04:50.130
understand how can we use technology and AI.
1199
01:04:50.130 –> 01:04:54.342
You know, he’s becoming almost this phenomenon.
1200
01:04:54.342 –> 01:04:56.550
He has his own podcast. He’s a teacher.
1201
01:04:56.550 –> 01:04:58.752
He’s teaching kids at school.
1202
01:04:58.752 –> 01:05:01.488
And then slowly it turned out that he was like,
1203
01:05:01.488 –> 01:05:03.876
you know what, these guys, they know so much.
1204
01:05:03.876 –> 01:05:06.560
They can help me, we can do these things together.
1205
01:05:07.170 –> 01:05:11.652
And that understanding that he kind of builds and
1206
01:05:11.652 –> 01:05:14.312
it’s much better coming from him than from some
1207
01:05:14.312 –> 01:05:20.610
kind of AI guru somewhere, creates understanding.
1208
01:05:21.190 –> 01:05:25.112
And then again back to, and I really emphasize that because
1209
01:05:25.112 –> 01:05:29.548
we cannot be experts, all of us, and we should not
1210
01:05:29.548 –> 01:05:33.276
be experts, all of us in this technology field.
1211
01:05:33.276 –> 01:05:34.674
But we need to be curious.
1212
01:05:34.674 –> 01:05:36.972
We can’t just say that, well, I don’t like
1213
01:05:36.972 –> 01:05:40.890
it because that’s the way the world moves.
1214
01:05:42.030 –> 01:05:46.384
It’s kind of your obligation, what do you call that?
1215
01:05:46.384 –> 01:05:48.150
You have to take that responsibility.
1216
01:05:48.150 –> 01:05:48.970
Responsibility.
1217
01:05:50.430 –> 01:05:53.268
It’s your responsibility to make sure that you kind
1218
01:05:53.268 –> 01:05:56.292
of try to figure out, okay, what does it
1219
01:05:56.292 –> 01:05:59.556
mean for me instead of just pushing it away?
1220
01:05:59.556 –> 01:06:03.002
I oftentimes use this example when I start my talks.
1221
01:06:03.002 –> 01:06:04.596
I did a talk a couple of years ago
1222
01:06:04.596 –> 01:06:08.712
at a big conference for the construction industry, and
1223
01:06:08.712 –> 01:06:10.632
I was asked to talk a little bit about
1224
01:06:10.632 –> 01:06:14.062
the coming of artificial intelligence and robotics.
1225
01:06:14.062 –> 01:06:16.790
And this was a few years ago,
1226
01:06:16.790 –> 01:06:19.540
when the Boston Dynamics dog came around
1227
01:06:21.110 –> 01:06:22.872
and it was the same thing.
1228
01:06:22.872 –> 01:06:25.948
And after my talk, a guy came up to me
1229
01:06:25.948 –> 01:06:27.788
and I guess it was in the beginning of his
1230
01:06:27.788 –> 01:06:31.382
60s, CEO of a company here in Norway.
1231
01:06:31.382 –> 01:06:32.608
And he said that, yeah, this was
1232
01:06:32.608 –> 01:06:34.288
really exciting, but we need to be
1233
01:06:34.288 –> 01:06:37.290
careful with artificial intelligence and robots.
1234
01:06:37.950 –> 01:06:39.930
He told you that in silence.
1235
01:06:42.050 –> 01:06:43.242
And he was dead serious.
1236
01:06:43.242 –> 01:06:46.708
And I said, why? What do you mean? What do you mean?
1237
01:06:46.708 –> 01:06:50.670
And he said, remember how it went in Terminator?
1238
01:06:51.810 –> 01:06:54.872
He was dead serious because his reference, when we
1239
01:06:54.872 –> 01:06:58.168
talked about creating software that was smart in some
1240
01:06:58.168 –> 01:07:00.590
way or the other, putting it into a robot,
1241
01:07:00.590 –> 01:07:03.438
what came to mind was Terminator.
1242
01:07:03.438 –> 01:07:07.832
And we know that that’s not the way it’s going to work.
1243
01:07:07.832 –> 01:07:09.240
So for him to kind of.
1244
01:07:09.240 –> 01:07:12.220
But maybe now there’s a hard time getting
1245
01:07:12.220 –> 01:07:15.240
enough resources to be on your will it.
1246
01:07:17.770 –> 01:07:19.808
Maybe you are talking to an android right now.
1247
01:07:19.808 –> 01:07:20.560
You don’t know that.
1248
01:07:20.560 –> 01:07:21.216
Exactly.
1249
01:07:21.216 –> 01:07:22.672
You don’t know either.
1250
01:07:22.672 –> 01:07:24.102
Remember, I have four chips.
1251
01:07:24.102 –> 01:07:28.934
This is just a hologram, and I don’t need chips.
1252
01:07:28.934 –> 01:07:31.360
They created the better version of me.
1253
01:07:35.570 –> 01:07:38.596
Do you actually remember the
1254
01:07:38.596 –> 01:07:42.240
pre-Internet, pre-mobile times?
1255
01:07:42.790 –> 01:07:45.448
Do you have any good memories around that?
1256
01:07:45.448 –> 01:07:47.182
I was like, boring.
1257
01:07:47.182 –> 01:07:51.510
I have to go out to play with my friends.
1258
01:07:51.510 –> 01:07:54.130
And right now you’re playing Quake.
1259
01:07:54.890 –> 01:07:56.172
Yeah, I do remember.
1260
01:07:56.172 –> 01:08:03.842
I was born in 1970, but I never played soccer.
1261
01:08:03.842 –> 01:08:05.452
I never played. That was me.
1262
01:08:05.452 –> 01:08:08.582
But I was very fond of being outside in the woods.
1263
01:08:08.582 –> 01:08:09.232
I still am.
1264
01:08:09.232 –> 01:08:12.032
That’s my kind of, you know, disconnecting.
1265
01:08:12.032 –> 01:08:14.288
Yeah, Norway is a great place for.
1266
01:08:14.288 –> 01:08:14.800
Yeah.
1267
01:08:14.800 –> 01:08:20.140
So I’ve always done that, and I really, really remember
1268
01:08:21.149 –> 01:08:25.247
the first time I went abroad with some friends of
1269
01:08:25.247 –> 01:08:29.988
mine. Just imagine the freedom, because we didn’t have
1270
01:08:29.988 –> 01:08:31.578
a cell phone, there was no Internet.
1271
01:08:31.578 –> 01:08:34.359
So when we left, we were on our own.
1272
01:08:34.359 –> 01:08:35.453
No parents.
1273
01:08:35.453 –> 01:08:36.935
They couldn’t call us.
1274
01:08:36.935 –> 01:08:38.456
We couldn’t call them either.
1275
01:08:38.456 –> 01:08:39.859
And we didn’t really care.
1276
01:08:42.790 –> 01:08:45.756
I think that’s why when I’m trying
1277
01:08:45.756 –> 01:08:50.076
to switch off, that’s what I do.
1278
01:08:50.076 –> 01:08:54.066
Go back to somewhere where there’s no connection,
1279
01:08:54.066 –> 01:08:56.428
there’s no Internet, there’s nothing.
1280
01:08:56.428 –> 01:08:58.109
I enjoy that.
1281
01:08:58.109 –> 01:09:03.014
I think it’s also a matter of choosing
1282
01:09:03.014 –> 01:09:08.048
whether you are online or you’re not.
1283
01:09:08.048 –> 01:09:11.344
And because of all the opportunities we have,
1284
01:09:11.344 –> 01:09:12.720
we have a tendency to stay online.
1285
01:09:12.720 –> 01:09:14.196
And I know I’m the worst one.
1286
01:09:14.196 –> 01:09:17.044
I’m always working, I’m always thinking, but sometimes
1287
01:09:17.044 –> 01:09:19.108
I really need to just shut off.
1288
01:09:19.108 –> 01:09:21.924
Then I find a cabin way up in
1289
01:09:21.924 –> 01:09:24.798
the mountains where there are no coverage whatsoever.
1290
01:09:24.798 –> 01:09:26.776
So I could just leave my phone in the car
1291
01:09:26.776 –> 01:09:30.420
and I go away for a couple of days.
1292
01:09:30.950 –> 01:09:36.630
And that’s an extremely good feeling.
1293
01:09:38.649 –> 01:09:39.778
It is good feeling.
1294
01:09:39.778 –> 01:09:45.964
But then I’m just scared of what’s happening,
1295
01:09:45.964 –> 01:09:51.055
what’s coming with Neuralink and being connected constantly to
1296
01:09:51.055 –> 01:09:54.512
tech. Maybe we will arrive at the point where
1297
01:09:54.512 –> 01:09:58.240
we will have to maybe time our switch off.
1298
01:09:58.240 –> 01:10:00.928
It’s like right now, between this and
1299
01:10:00.928 –> 01:10:03.924
this hour, no Internet, no technology.
1300
01:10:03.924 –> 01:10:06.420
Yeah, but that’s the point.
1301
01:10:06.420 –> 01:10:08.868
I read an article last year, I would think
1302
01:10:08.868 –> 01:10:11.252
it was Forrester Research, who did a survey and
1303
01:10:11.252 –> 01:10:15.326
they found out that 80%, over 80% of ordinary
1304
01:10:15.326 –> 01:10:18.130
people feel that the world is now digital.
1305
01:10:18.870 –> 01:10:24.312
And then one very wise person said
1306
01:10:24.312 –> 01:10:27.084
that the switch has now gone from:
1307
01:10:27.084 –> 01:10:31.560
Before you had to log on, now you need to log off.
1308
01:10:32.490 –> 01:10:37.068
So before you were not online all the time, but you
1309
01:10:37.068 –> 01:10:39.228
logged on, and then you could be as much as you
1310
01:10:39.228 –> 01:10:42.640
wanted to, then you logged off, but now you are online
1311
01:10:42.640 –> 01:10:45.020
all the time, you need to make sure that you log off.
1312
01:10:45.710 –> 01:10:48.896
You know, it’s hard for some people,
1313
01:10:48.896 –> 01:10:50.768
it’s harder than for others.
1314
01:10:50.768 –> 01:10:52.772
But at least you have the opportunity.
1315
01:10:52.772 –> 01:10:54.916
And I think that would be going forward.
1316
01:10:54.916 –> 01:10:56.618
We see that at least in Norway, there’s
1317
01:10:56.618 –> 01:10:59.818
a change in how young people, or younger
1318
01:10:59.818 –> 01:11:03.620
people are moving away from social media.
1319
01:11:03.620 –> 01:11:06.116
They’re moving away from being connected all the time.
1320
01:11:06.116 –> 01:11:09.960
They’re actually cherishing their time off.
1321
01:11:09.960 –> 01:11:12.808
Not to avoid being monitored by
1322
01:11:12.808 –> 01:11:14.820
the government, but just to be.
1323
01:11:15.350 –> 01:11:17.528
Now it’s just you and I.
1324
01:11:17.528 –> 01:11:20.188
But on that note, just imagine if that was
1325
01:11:20.188 –> 01:11:23.290
the case, you and I couldn’t have had this
1326
01:11:23.290 –> 01:11:25.986
conversation right now, and we take this for granted.
1327
01:11:25.986 –> 01:11:28.348
And I think that’s pretty amazing as well, because it
1328
01:11:28.348 –> 01:11:32.060
gives you the opportunity to do things like this.
1329
01:11:33.710 –> 01:11:36.342
Yes, but maybe we could meet in the woods,
1330
01:11:36.342 –> 01:11:38.800
put a camera, there’s this big, big camera, and
1331
01:11:38.800 –> 01:11:41.840
record it on, what was the name?
1332
01:11:41.840 –> 01:11:43.142
Fausw.
1333
01:11:43.142 –> 01:11:45.410
No, VHS.
1334
01:11:45.410 –> 01:11:47.840
And send it to whoever wanted to.
1335
01:11:49.410 –> 01:11:54.744
I remember we have so many memories post tech,
1336
01:11:54.744 –> 01:11:58.824
but I still remember when the Internet started happening.
1337
01:11:58.824 –> 01:12:00.142
Actually, no, it was pre Internet.
1338
01:12:00.142 –> 01:12:01.198
Yes, it was Internet.
1339
01:12:01.198 –> 01:12:03.806
Because I had Napster downloading,
1340
01:12:03.806 –> 01:12:07.538
maybe not so legally songs.
1341
01:12:07.538 –> 01:12:10.508
And I remember with our friends, we were
1342
01:12:10.508 –> 01:12:13.548
buying those floppy disks, and because the one
1343
01:12:13.548 –> 01:12:18.320
floppy disk could keep 1.38 megabytes, I think.
1344
01:12:18.320 –> 01:12:24.102
So the song was three to four megabytes.
1345
01:12:24.102 –> 01:12:26.512
So we were dividing those songs and then
1346
01:12:26.512 –> 01:12:31.258
just recording pieces on each floppy disk.
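
As an aside, the floppy-disk trick described here boils down to splitting a file into chunks no larger than one disk and concatenating them back later. A minimal, hypothetical Python sketch of that idea; the file names and the exact 1.38 MB figure are illustrative only, matching the sizes mentioned above.

# Rough sketch of the old trick: split a file into floppy-sized chunks,
# then stitch them back together later.
from pathlib import Path

CHUNK = int(1.38 * 1024 * 1024)              # roughly one 3.5" floppy per piece

def split(song: Path, out_dir: Path) -> list[Path]:
    out_dir.mkdir(parents=True, exist_ok=True)
    data = song.read_bytes()
    parts = []
    for i in range(0, len(data), CHUNK):
        part = out_dir / f"{song.name}.part{i // CHUNK:02d}"
        part.write_bytes(data[i:i + CHUNK])  # one piece per floppy
        parts.append(part)
    return parts

def join(parts: list[Path], restored: Path) -> None:
    restored.write_bytes(b"".join(p.read_bytes() for p in sorted(parts)))

# e.g. split(Path("song.mp3"), Path("floppies")) -> about 3 parts for a ~4 MB file
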
1347
01:12:31.258 –> 01:12:34.900
And I remember when one file was
1348
01:12:34.900 –> 01:12:38.810
corrupted, it was like a terror.
1349
01:12:38.810 –> 01:12:40.030
It was despair.
1350
01:12:41.090 –> 01:12:42.106
Just imagine.
1351
01:12:42.106 –> 01:12:43.796
And do you remember I have a box
1352
01:12:43.796 –> 01:12:46.938
at my office with old Maxell.
1353
01:12:46.938 –> 01:12:48.558
I think it was Maxell.
1354
01:12:48.558 –> 01:12:53.330
Floppy disks, 1.42 megabytes.
1355
01:12:55.210 –> 01:12:57.260
The small ones or the big ones?
1356
01:12:57.260 –> 01:12:58.316
No, the small ones.
1357
01:12:58.316 –> 01:12:59.404
The plastic ones.
1358
01:12:59.404 –> 01:13:00.040
Yes.
1359
01:13:02.170 –> 01:13:05.388
And their tagline was more space than you’ll ever
1360
01:13:05.388 –> 01:13:09.820
need, which is quite nothing, as you said.
1361
01:13:11.310 –> 01:13:12.060
Yeah.
1362
01:13:13.150 –> 01:13:17.088
Now, we are so excited about how many photos we
1363
01:13:17.088 –> 01:13:20.452
can have on our phones, but probably in a few years’
1364
01:13:20.452 –> 01:13:24.320
time, those people will be laughing at us.
1365
01:13:25.170 –> 01:13:26.084
Yeah.
1366
01:13:26.084 –> 01:13:28.996
And just imagine the one chip that I have here
1367
01:13:28.996 –> 01:13:36.276
in my hand now has a memory of 128K, which is
1368
01:13:36.276 –> 01:13:39.224
nothing, but it’s twice as much as I had on
1369
01:13:39.224 –> 01:13:42.180
the Commodore 64, which only had 64K.
1370
01:13:42.790 –> 01:13:44.558
And I have that on a little microchip
1371
01:13:44.558 –> 01:13:47.590
in my hand, which is strange.
1372
01:13:48.970 –> 01:13:50.722
I didn’t have a Commodore.
1373
01:13:50.722 –> 01:13:53.340
I had Pegasus, which was the
1374
01:13:53.340 –> 01:13:55.884
cheap version of Atari, I think.
1375
01:13:55.884 –> 01:13:58.918
And I was playing Contra and Mario Bros.
1376
01:13:58.918 –> 01:14:02.640
I don’t know if you had those funny times. Funny times?
1377
01:14:02.640 –> 01:14:03.776
Yeah, it was funny times.
1378
01:14:03.776 –> 01:14:04.860
It was fun times.
1379
01:14:06.990 –> 01:14:09.760
But we are looking into the future.
1380
01:14:09.760 –> 01:14:12.074
We are looking optimistically.
1381
01:14:12.074 –> 01:14:14.708
And I can see that the
1382
01:14:14.708 –> 01:14:17.490
world needs more people like you.
1383
01:14:17.490 –> 01:14:18.400
Thank you.
1384
01:14:18.930 –> 01:14:19.716
Yes.
1385
01:14:19.716 –> 01:14:24.388
Who inspire others to see the goodness, obviously, to
1386
01:14:24.388 –> 01:14:29.322
think about the potential downside, but not be afraid
1387
01:14:29.322 –> 01:14:32.530
of trying it, and being constantly curious.
1388
01:14:32.530 –> 01:14:36.170
So, for that, I thank you very much.
1389
01:14:36.170 –> 01:14:37.532
Thank you so much.
1390
01:14:37.532 –> 01:14:39.240
It was very nice of you.
1391
01:14:40.170 –> 01:14:41.458
Likewise.
1392
01:14:41.458 –> 01:14:43.452
And let’s keep in touch.
1393
01:14:43.452 –> 01:14:47.090
Yeah, let’s do for sure. Bye.
1394
01:14:47.090 –> 01:14:47.650
Thanks, Eirik.