After 17 years at Google, earlier this year Yariv left the company to start an investment fund specialising in early-stage AI startups, Future Fund. We talked about – what else 😃 – AI in all its flavours, shapes and applications; from using it to create and promote art, to logistics, to improving our diets and the way we cook. We also discussed social media and addiction, loneliness, AI voice assistants and much more. Have a listen.
Transcript:
00:00:00.080 ...but I think there are so many big inefficiencies in the system of food that you can actually address with AI. I think it starts from the beginning. One is growing food. I've been travelling in emerging markets; I recently volunteered in a friend's NGO in Tanzania. In some of these places you could grow anything, because they're on the equator and so on, but they're actually growing low-nutrient, very low-margin crops, just because that's the only thing they know how to grow. These are things that with AI are very easy to build, and I think they can have a huge impact on our daily diet. Again, you don't need to do the impossible and the crazy, just a little bit better: more varied, more balanced. A system that actually knows you, remembers what you ate, knows what's good and bad for you, knows some good cooking tips, and that will actually make you eat healthier, better food. So I am a huge, huge fan of technological progress, and I think that, sure, we need to do it in a responsible way, but the control should be on the application, not on the technology.
00:01:37.640 Welcome to the Are You Human podcast. On this episode you'll hear a conversation I had with Yariv Adan. Yariv began his career as a software engineer in Israel in the late '90s. He worked as both an engineer and an engineering manager at several startups until 2007, when he transitioned to product management at Google. There he was part of the founding teams for multiple strategic areas, such as Google Assistant and Google Lens. He spent seven years working on cutting-edge technology, mainly AI consumer applications, before moving to Google Cloud, where he led product for conversational AI and, starting in 2022, applied gen AI. After 17 years at Google, Yariv recently left the company to start his investment fund specialising in early-stage AI startups, the Future Fund. He's also actively involved in projects related to AI art and AI-augmented food. We talk all about human-AI collaboration, threats, opportunities, his early-stage VC fund, and so much more. I really enjoyed our conversation and had been really looking forward to it. So now you'll hear it, and I hope you'll enjoy it.
00:03:13.360 Thanks. Are you human?

00:03:16.680 Hello, hello Yariv, thank you so much for finding time to do this.

Hi, thank you for inviting me.

It's going to be a pleasure, I'm sure. And I'm sure you'll have a lot of things to share about your new venture, Future Fund. But for people who have just found out about you, it would be great to understand how you arrived at starting and managing an AI-focused investment fund. I know that you started your career as a software engineer in Israel, and then you moved on to work on some big things at Google, and now you've gone back to the drawing board, taken all the risk, and started your own thing. How did it all happen?
00:04:11.239 The path, yeah. I think I always liked starting new things. I actually had a startup in my third year of university, and then I worked with a bunch of startups. When I moved to Google I found myself again and again either starting or joining very early things. I started with Privacy in 2007 or '08, joined the very early, small Emerging Markets team in 2009 (always a little bit too early), and then in 2015 I joined a small team that back then started what became Google Assistant, and we also started what became Google Lens. I think that's what got me excited about AI; actually, I joined because I got excited about AI. I had a really good run at Google and learned a ton, and for the last three years I joined Google Cloud conversational AI to lead that, and then we basically started applied gen AI about two years ago.

And yeah, after 17 years at Google, and also turning 50 this year, I thought, you know, it's a long time, and I was thinking about what to do. I had a bunch of AI-related initiatives in the art space and the food space, but sometime last year I met with some old friends that I knew from Google from the early days, and we said: wouldn't it be great to start a fund together that invests in early-stage AI? We had actually talked about starting a fund multiple times over the last decade; we thought of it as a natural next step of scaling ourselves. At Google I led product and strategy for teams, and gradually those teams and scopes grow. Here it's kind of the same thing, except you invest and you support, you don't actually manage and deal with the day-to-day, although we do see ourselves as hands-on investors and we are trying to be involved. So it was kind of a natural step, and a very fun one. I'm doing it with people I have known for a very long time, and we decided to do it because all of us have a lot of experience with AI in the real world, and suddenly there seems to be this very supportive tailwind, a huge wave that we're basically riding. And we are a bit of a unique creature in the investment space, especially in Europe, because of our backgrounds; the other partners also have extensive backgrounds in building companies and doing AI in academia. So yeah, it's fun. I don't see it as risky in any way.
00:25:26.039 Have you done investing before, or is this a jump into deep water?

So the four, or maybe soon five, of us have been doing angel investing, and sometimes investing as twos or threes between ourselves. I think we have about 30 investments that we did as angel investors, mostly in the AI space. But investing as a team is something we started last year.
00:08:02.599 And where do you source capital from? Is it Europe, or the US, or a mix?

We started raising a fund last year, and then we said: OK, before we actually raise, let's do it from our own capital and from our close network. We wanted to create some track record from that, and also, after a long time of not working together, just to see that we actually enjoy working together.

Exactly, yeah.

So we started around Q4, and it's actually going super well. We're having a lot of fun; I think we're on the seventh or eighth investment, and nine and ten are already in the final stages, so we will hopefully raise capital this year. We started out thinking our focus should be Europe, just because we thought that's where the biggest gap is for early-stage money that has deep understanding of the domain and can support the founders. But we actually found that our profile also appeals to companies outside it. We have companies coming from Israel, and we actually have quite a few that came through the Google network, mostly from the Valley in California. And having worked 17 years at Google, most of them with California, managing teams globally, I don't see a real reason to limit ourselves to a specific region. So at the moment we're global.
00:09:45.200 But we are definitely looking at how we can make Europe a strength, an advantage and a differentiator, as opposed to a kind of limitation, because clearly a lot of stuff is happening in San Francisco. And there are definitely advantages to some regional focus. First, there is a ton of great talent here, especially in the top academic institutions, and getting to them early from the States is not easy, and these ecosystems are often not yet mature everywhere. One of our partners is a professor at ETH, so we're naturally connected to these universities. We're also working with some of the government organizations that are supporting the ecosystem; I'm sure you know Poland is investing a ton in its ecosystem, France is investing...

France, I would think, is leading, right? Like the recent investments from Microsoft.

Exactly. So having a fund that is based in Europe (four of us are in Switzerland, one is in Sweden) is appealing to them. And also, being a little bit outside the very noisy and sometimes hype-driven ecosystem of San Francisco allows us a bit of a fresh look at things. So we're trying to take what's good about being located in Europe, but not limit ourselves explicitly.
00:11:21.880 Yeah, I completely agree with you. We have lots of talent, but I guess it's partly about the way we believe in ourselves, the support we get from society, and initiatives; and there are not so many role models compared to what San Francisco and Silicon Valley have. I watched Scott Galloway's keynote the other day (I watched the OpenAI one as well, but we'll talk about that in a second), and he showed this graph where, I don't remember if it was five or ten times, less investment goes to Europe compared to just the US. So I'm wondering what needs to change for people to start more companies, to believe in themselves, to keep going and have more optimism in doing things.
00:12:28.199 Yeah, I think you touched on it right: role-model people and companies play a key role. I saw it in Israel, back in the '90s. I think the big moment for Israel was the ICQ acquisition, for nearly half a billion; everyone said, wow, this is possible, and that's what actually kicked off the big boom. I also think you go through a maturity phase, where at first, because it's so much easier to work with companies in your own market, it seems scary to go to the US. But you must think big; you must not think locally. Israel went through that stage too: at first it was like that, and then very quickly, in the late '90s or early 2000s, the moment a company opened, engineering was usually in Israel and product and sales were in the US. Later product moved to Israel too, but the business was always US-first. That's also an angle we're trying to bring to our companies: move as quickly as possible to the main market. And you can see what Mistral did in France; it completely changed things, including the government's view. The moment there is a role model saying, wow, you can actually go big, entrepreneurs follow. So I do think there is some kind of culture thing and a role-model thing. But definitely, we're seeing a vibrant and growing ecosystem of startups in Europe, so I'm very excited.
00:14:21.800 Yeah, I guess if you crack the code, if you can find the right factors (the talent, great product, execution), there's such a big pool in Europe that you can tap into, and you don't have so much competition from the US.

Yeah, and by the way, there are some really good other VCs in Europe that we're partnering with. So I think there is a ton of opportunity now, especially with AI, and I'm sure we'll see the growth and development of that ecosystem.
00:14:59.959 So what do you specialize in? Any particular applications within AI, or is there anything that excites you particularly?

Yes. So initially we didn't limit ourselves, but we are definitely trying to focus on what you'd call AI deep tech. I think the barrier for AI applications is really high now, because the models, and the horizontal orchestration and other API layers on top of them, are setting such a high, commoditized bar that capturing sustainable value in the application layer is not trivial, especially if you're looking two, three or four years out. At the moment there are still a lot of challenges (you need to work hard to make things work), but I don't think that's inherent to the technology; I think it's more a symptom of raw technology at a very early stage. Hence a lot of the hacks in RAG and whatnot. But I think that in two to three years it will work out of the box.
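A quick aside on the "hacks in RAG" mentioned here: retrieval-augmented generation retrieves documents relevant to a question and feeds them to the model alongside the prompt, so the answer is grounded in them. A minimal sketch of the pattern, with a toy bag-of-words retriever standing in for a real embedding model and a stubbed generate() in place of an actual LLM call (both stand-ins are illustrative, not anything named in the conversation):

```python
from collections import Counter
import math
import re

# Toy corpus standing in for a company knowledge base.
DOCS = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support hours are Monday to Friday, 9am to 5pm CET.",
    "Premium plans include priority email and phone support.",
]

def bow(text: str) -> Counter:
    """Bag-of-words vector; a real system would use learned embeddings."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 2) -> list[str]:
    """Rank documents by similarity to the query and keep the top k."""
    q = bow(query)
    return sorted(DOCS, key=lambda d: cosine(q, bow(d)), reverse=True)[:k]

def generate(prompt: str) -> str:
    """Stub for an LLM call; swap in any chat-completion API here."""
    return f"[model answer grounded in a prompt of {len(prompt)} chars]"

def rag_answer(question: str) -> str:
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return generate(prompt)

print(rag_answer("What is your refund policy?"))
```

The "hacks" live around this loop (chunking, re-ranking, query rewriting); the claim above is that this plumbing will increasingly come out of the box.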
00:16:23.720 So we usually look at deep tech, and that exists in different places. I think there is a place for foundational models. I wouldn't invest in a large language model now, or maybe even in the main modalities, but music, for example, is still an interesting modality, and there are more and more startups creating foundational models for specific domains, whether that's engineering or medical or others. We talk about language, but it's really about representing that space. That's one interesting area. The security side of AI is another interesting one; there are some interesting tools there, but some of these areas are crowded.

Two spaces excite me, and I would love us to specialize and develop our muscle in them even more. One is AI for science. Biology and protein folding is the one that's most in the news, with AlphaFold 3, but think about AI for science more broadly: how can you use gen AI to do better science, in chemistry, in biology, in materials and elsewhere? I think the defensibility and barriers there are a bit clearer, it's a bit less crowded than the mainstream of where gen AI is, and it does require a certain set of scientific expertise, which I think we have a good baseline to build on. I think it's a fascinating space. Robotics is also very interesting. That space is also becoming crowded, but it's so big and has such potential, and we keep finding companies working on super interesting problems: how do you solve issues related to the expected water shortage, how does it relate to climate, and then there's a bunch of derivatives. So we're trying to go off the beaten path of gen AI for the enterprise, which, you know, is actually my expertise, and we're definitely looking at it, and more towards places with defensibility and barriers to entry.

00:18:59.280 I read somewhere that you're also interested in, was it, AI-augmented food? What does that mean?

Yes, yes.
00:19:12.159 So I think there are multiple very hard problems: climate, water, and of course food is also a problem. It manifests in that a lot of people don't have enough food, or are eating not the best food (maybe that will change with Ozempic in a small way). And unlike others, I actually think this is a relatively easier problem to solve, because there are so many big inefficiencies in the system of food that you can address with AI. I think it starts from the beginning. One is growing food. I've been travelling in emerging markets; I recently volunteered in a friend's NGO in Tanzania. In some of these places you could grow anything, because they're on the equator and so on, but they're actually growing low-nutrient, very low-margin crops, just because that's the only thing they know how to grow. Some of these NGOs are trying to teach them: actually, instead of growing maize, grow vanilla or something, it can earn you many times the money, so everything else can improve; or grow nuts that are much more nutritious than sugary carbs. But they don't have the knowledge, and with technology you can actually transfer that knowledge and expertise. That's just one example.

The second one is transportation and distribution. Even at Google we did some work with some of the largest consumer retailers, the Migros and Coops of the world. They have huge waste, because they don't want you to go to the competitor on Friday or Saturday, but then they throw stuff away on Sunday. So how can you optimize and plan so that you won't lose consumers, but you also won't waste food? With some AI products that Google Cloud did, I think they got to something like 30% savings. That's huge.
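The stocking trade-off described here, where under-ordering sends customers to a competitor and over-ordering means throwing food away on Sunday, is essentially the classic newsvendor problem. A back-of-the-envelope sketch with invented numbers (the costs and demand samples are illustrative assumptions, and this is not a description of the Google Cloud products mentioned):

```python
import statistics

# Illustrative daily demand samples for one perishable item (units).
demand_history = [120, 135, 98, 150, 110, 142, 125, 130, 118, 140]

# Assumed economics: lost margin per unit of unmet demand vs. cost of a wasted unit.
underage_cost = 2.0  # profit lost when a customer finds the shelf empty
overage_cost = 1.0   # cost of a unit thrown away on Sunday

# Newsvendor critical fractile: stock to cover this share of demand outcomes.
service_level = underage_cost / (underage_cost + overage_cost)

mu = statistics.mean(demand_history)
sigma = statistics.stdev(demand_history)

# Normal-approximation order quantity: mean + z * std, with z from the fractile.
z = statistics.NormalDist().inv_cdf(service_level)
order_qty = mu + z * sigma

print(f"service level: {service_level:.0%}, order: {order_qty:.0f} units")
```

Pushing up the cost assigned to an empty shelf raises the order quantity, which is exactly the pressure that creates the Sunday waste; better demand forecasts shrink the spread of the samples and reduce waste and stockouts at the same time.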
00:21:37.200 And then at the end, which I think is also super easy: on the consumer level, we are so unsophisticated in the way we eat and prepare food. Pretty much all of the people I know can somehow cook about five dishes, and they just keep circulating them; they make them the one way they know. And it's so easy to improve, even on those same dishes, with small fixes to some of the ingredients or how you make them. If you took a chef (you don't even need a three-star chef) and put them next to someone for a week, and they just gave a few simple tips: instead of this, buy that; here's how you can vary it; here's how you can cook it a little bit better; here's how you can personalize it. These are things that with AI are very easy to build, and I think they can have a huge impact on our daily diet. Again, you don't need to do the impossible and the crazy, just a little bit better: more varied, more balanced. A system that actually knows you, remembers what you ate, knows what's good and bad for you, knows some good cooking tips, and that will actually make you eat healthier, better food.
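That "system that knows you" reduces, at its simplest, to a scoring loop over candidate meals that rewards balance and penalizes repetition. A toy sketch (the recipes, nutrition tags and weights are all invented for illustration):

```python
# Toy meal suggester: remembers what you ate, nudges toward variety and balance.
RECIPES = {
    "spaghetti bolognese": {"veg": 1, "protein": 2, "carbs": 3},
    "lentil curry":        {"veg": 2, "protein": 2, "carbs": 2},
    "grilled salmon":      {"veg": 1, "protein": 3, "carbs": 0},
    "chicken stir-fry":    {"veg": 3, "protein": 2, "carbs": 1},
}

history = ["spaghetti bolognese", "spaghetti bolognese", "chicken stir-fry"]

def score(recipe: str, past: list[str]) -> float:
    """Higher is better: reward nutrient balance, penalize recent repeats."""
    tags = RECIPES[recipe]
    # Balance: prefer meals whose nutrients are spread out, not all carbs.
    balance = min(tags.values()) - 0.5 * (max(tags.values()) - min(tags.values()))
    # Novelty: each recent occurrence of the same dish costs a point.
    repetition_penalty = past[-5:].count(recipe)
    return balance - repetition_penalty

suggestion = max(RECIPES, key=lambda r: score(r, history))
print("Tonight, try:", suggestion)
```

A real product would layer preference learning and an LLM for the cooking tips on top; the point is how little machinery the core "remembers what you ate" loop needs.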
00:22:56.480 So that was exciting, and then I found a super cool partner; she actually came up with the name. Her name is Erica, and she's a super smart designer. A few years ago she was, like me, excited about AI, but she asked: why isn't AI working on all the senses? Why isn't it working on taste, for example; could we digitally print chocolate and other food? So she started a little bit of that. What we're trying to do here is a research center, a physical research center, that will bring in industry, academia, entrepreneurs and others to push the envelope on some of these ideas and deliver prototypes and research. Because it seems like such an important problem, and yet we approach it in such a basic way: people's cooking hasn't changed much in the last 20 or 30 years compared to a lot of other things. So that was a very long speech, but I think it's such a...
00:23:57.200 No, no, it's fascinating. And I also feel... my other venture, actually, apart from working in AI and tech, is a retail e-commerce store; we sell Japanese knives, so obviously I love to cook as well. And I find it fascinating how, I guess, our work-life balance doesn't really work, and there are lots of companies that take advantage of people's lack of time and offer those prepackaged things. Some of the companies are doing it in a really cool way; I don't know if you've heard of Gousto, they're huge, it's fresh food...

Yeah, yes.

So I saw the other day how they manage to offer, I don't know how many now, but 100-plus different combinations of food that you can select each week, and everything is robotized, everything is automated. And people probably don't even think; they just randomly select: OK, today I'll eat this, I just don't like chicken, or whatever. So it doesn't demand that people think about those choices. But in a way I find it sad, because this is your time; it's meditative, in a way, and it's time to reconnect with your close ones, when you cook at home and you know what you're eating. And it's nice; it's like a survival kind of thing: if you're lost in the woods, at least you know what you can combine to survive and eat.
00:25:49.440 Yeah. I love gardening and food, and I believe it allows us to connect to ourselves, to understand and think about it. I know a lot of people are saying the next generation will not cook, which I find sad, actually. So I think it's great that there are these companies, but I do think cooking is actually a great skill and a great activity; to your point, it's a social activity. And it's something I would love to use AI to augment and improve people at, rather than replace them.
00:26:28.799 Go on.

No, go on, sorry.

I'm saying, other than food, I generally love these adjacent uses of AI. The other thing I do is AI and art, which I'm also very passionate about.

What do you do in this area? Don't tell me: like Bitcoin, and blockchain, and all the NFTs?
00:26:52.760 No! I actually got excited about it a decade ago, because I always loved art. My sister is an artist, one of my daughters is studying art, and both of my daughters are very good at it; I always loved creating. When AI came out, I felt: wow, finally I have a kind of brush that I know how to use; I can actually create with it. I always felt that programming is a creative activity, and AI blew my mind. And I have a very good friend who is a very big media-art collector, especially video art. I was talking to him (this was long before Midjourney and all of this, ten years ago) and asked him: how can we make AI art a thing, a mainstream thing? And he told me that back in the day the biggest challenge with video art, for example, was that if you want really good people to do a certain kind of art, you need to make sure the big collectors are collecting it, the big galleries are selling it, and the big museums are displaying it; otherwise they don't have a market. Video art struggled with that, so he said: let's start with having that conversation and introducing them, so that creates the motivation. Funny enough, I tried to talk to some people in the contemporary art scene ('contemporary' is a misleading word, in my opinion, based on my experience), and immediately they told me: no, no, no, art is for humans, by humans; no machines, nothing. So I told them: OK, I get it, forget that AI is 'artificial intelligence'. Imagine that aliens came from another planet and said: hey, we do art too, we'd love to do art with you, we'd love to understand your art and explain ours to you. Would you tell them no, art is for humans, by humans? I don't have aliens, but AI is as close to aliens as we have today. It's a form of intelligence, it's different from ours, and it has its own limitations, mostly because of our own limitations. So I actually bought, a few years ago, the alien-intelligence domain names, because everything starts these days with a domain name, and that actually opened people up a little bit more.
00:29:21.240 And then I kind of gave up. I did a few AI-art projects myself, but last year I felt the time was ripe again. Really, now AI has become a household name; it's no longer in the corridor of the experts, everybody knows AI. And art seems to be kind of left behind, because if you ask people what AI art is, they'll say NFTs or Midjourney, which I think is completely diminishing. In the art world it's hardly ever there, and most of the works I have seen, I didn't think were great. There are some interesting artists, definitely, but there is a lot of superficial work. So what we wanted to do, really the goal, is to deepen the dialogue, and especially to connect the two worlds of AI and art. I feel that in the AI world there are amazing, creative scientists and technical people; often when I read a paper and see some of the examples, I say: wow, if that person just had an artistic intent, this would be a really good artwork, something that stops you and makes you think. And on the other hand, I do feel there are artists who are interested in AI but lack the technical depth to do something really interesting; they just want to focus on creating work, and then it all diminishes to very aesthetic pieces, nothing deeply interesting. So we created this association, and we actually managed to get quite a few leading museums and others into it. And what we are doing is a competition. We thought: OK, let's start with a competition, open for people to submit AI artworks, where the AI can be a subject, a tool and collaborator, a creator, or a performer.
00:31:28.519 So yeah, we're very soon to launch that, and the plan is then to take these works and present them in the partner galleries, and to organize workshops and events where we have artists and AI experts talking, because I think there are valid, hard questions, but you need an informed dialogue to progress anything. So that's very exciting. And again, you asked: is it NFTs? No, there is a lot of interesting work. People are using AI to create music; people are using AI as a choreographer, basically asking it to create choreography for a topic or a piece of music and then actually going and performing it; people are using robots to generate things. I myself have used a bunch of different tools. So it's actually very broad, and I think it could be very interesting, and it's a shame that humanity is missing out on it because of, you know, kind of conservative contemporaries.
00:32:36.880 trying by myself to uh learn um electric
00:32:40.159 guitar guitar and I find it very I’m I’m
00:32:43.880 very hopeful that um there will be tools
00:32:46.519 which will you know guide you through um
00:32:49.720 those those things if you for example
00:32:51.679 are living in a more remote place and
00:32:54.360 you can’t really uh have oneone sessions
00:32:57.679 with with a teacher I think this is
00:32:59.240 great to have at least to to put some
00:33:01.519 foundations for for people to get
00:33:03.519 immersed with with art with music and
00:33:06.039 and and you know just create from that
00:33:09.200 Yeah, 100%. To teach people things you need to scale expertise, and that expertise needs to be patient and clear and trustworthy. We still need to fix the trustworthy part a little, but we're getting there, and there are a lot of tricks to do it. And to your point, it scales expertise to all of us: it's kind of taking the idea of a world-class expert teaching you. But also, and maybe more importantly, you can take expert scientists, doctors and agriculture experts to countries where that expertise is lacking.
00:33:54.919 Two points on that. One of them is that I definitely know whom to connect you with; I don't know if you've heard of her, or if you're connected already...

Of course, I know her from Google.

OK, so yes. I interviewed her, and I'm meant to meet with her, because she's also in London.

She's awesome.

Yes, she's great; she's hosting those cool events, crazy in the best way. We had a chat, I think, about a month ago.

So you know... great minds attract each other.

Yeah, she does the same thing: she wants to facilitate those conversations, and I think it's so needed.
00:34:43.440 needed yeah yeah and the other thing is
00:34:47.359 um since you’ve been working for many
00:34:51.520 many years when you started with Google
00:34:53.599 uh on those assistance how do you find
00:34:56.320 it right now you know it what open AI
00:34:59.680 recently released with the 4 40 4.0 or
00:35:04.599 40 um and the whole Assistance or
00:35:07.599 tutoring and like I guess you saw lots
00:35:10.160 of De demos they they made so so many
00:35:13.680 collaborations with the one of the like
00:35:16.240 biggest um like
00:35:19.000 celebrities um and how do you find this
00:35:23.440 is it more helpful do you see any risks
00:35:27.079 of you know how people are going to use
00:35:30.800 um like those kind of assistants are
00:35:33.160 they going to go you know like maybe
00:35:37.680 using only shortcuts and not really
00:35:40.440 learning and and testing their own
00:35:44.680 capabilities so it was a very cool demo
00:35:46.839 It was a very cool demo from OpenAI. I saw it and said: oh, finally it works! We started this, yeah, ten years ago or so, and it was a little bit too early from the technology perspective, but a lot of the ideas are actually still valid. I'm a huge optimist when it comes to technology. Fire is still very dangerous and can be used in very dangerous ways; wheels are super dangerous, and most of the bad stuff actually happens because of the invention of the wheel; electricity, the jet engine, pencils, books, TV: all of them can be used, and are used, for bad goals too, and if you have a really bad idea, they give you the means to scale it. But I think their positive contribution to society far, far outweighs those potential harms. When I look at some of the biggest risks and challenges we have as a society (we touched on food, we mentioned water, we mentioned climate), none of them can actually be solved unless we make technological advances. And we touched on smaller things too: better diet, health, scaling expertise. So I am a huge, huge fan of technological progress, and I think that, sure, we need to do it in a responsible way, but the control should be on the application, not on the technology.
00:37:42.359 Specifically with the assistant, what excited me back then still excites me now, and it's multiple things. First, natural language is the most robust and the most natural interface. I can express very complex desires, ideas and points with it, and everyone knows how to do it; you don't need to be techy. From when you're two years old until you die, no matter where you live, everybody on this planet knows how to express complex ideas and desires in language. So when you create a technology that can actually understand and assist you through that interface, you are leveling the field in a huge way. And of course it's not just natural language now; it's also pointing. Image recognition and video recognition have finally reached that kind of state too, and I think the combination is a killer use case, because certain things are super good with speech and other things are super good visually. The way people like to explain that point: when you go to a restaurant and choose from a menu, when the waiter comes and tells you about the dishes of the day, after three dishes you lose him, whereas on a menu you easily go through 50 or 100 dishes. So you want that rich, visual ability. But those restaurants that have tablets are actually very annoying, because when I know what I want, or I have a question, language is so precise and robust. So you want the precision and robustness of language, and on the other hand you want the richness of visual recognition. So I think that's one aspect.
00:39:50.200 But the other aspect that super excites me is when you think about the mobile revolution. First we had the internet and the digital revolution: we had computers and could connect to the internet, and the world changed in a big way. Suddenly you didn't need to go to a physical location; you could consume information and services from home. Then mobile came, and when you think about mobile, it's actually not such a big difference. What did they do? You take the big computer, cut the cables, make it slightly smaller so it fits in your pocket and you can go around with it, and you add very simple things: a camera, a microphone, a gyroscope so it knows its position, an accelerometer, and a GPS. That's it; those are the controls you have on a mobile. And that completely changed the world: it became the most addictive thing, which everyone uses 24/7. It wasn't the tech; it was what the tech enabled. Suddenly you had all these verbs we didn't use before: to message, to post, to check in, to navigate, to email; all these verbs for things we now do all the time that weren't possible before.

And now I'm saying: wow, now that we can take intelligence and connect it to any device, you don't need to take it out of a pocket. Everything around you, in theory, can understand you and talk back to you, and you can connect it to any device: to a car, to your wearables, to medical devices, to toys, to scientific devices. So I think the opportunity and richness of services, new verbs and new ways of doing things is going to be, in my opinion, orders of magnitude more than on this crude device with a small screen. So think about the types of new classes of use cases that are now finally possible. This is kind of a historic day, because what OpenAI did is not just the understanding, it's the latency. Finally you have the latency for voice and video, for everyone. There is a whole chain there, but it's sub-200 milliseconds, which is kind of a comfortable latency for us. I think it just opens up that modality, and I think soon we'll see glasses and other wearables and things.
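For a sense of why the sub-200 ms figure matters: a voice reply's delay is the sum of a chain of stages, and the whole chain has to fit the budget. A rough illustration with assumed stage latencies (the numbers are invented for the example, not OpenAI's published figures):

```python
# Illustrative voice-assistant latency budget (all numbers are assumptions).
BUDGET_MS = 200  # roughly where a reply starts to feel conversational

stages_ms = {
    "audio capture + network uplink": 30,
    "speech recognition (streaming)": 40,
    "model time-to-first-token":      80,
    "speech synthesis (first chunk)": 30,
    "network downlink + playback":    20,
}

total = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"{stage:34s} {ms:4d} ms")
verdict = "within" if total <= BUDGET_MS else "over"
print(f"{'total':34s} {total:4d} ms  ({verdict} {BUDGET_MS} ms budget)")
```

Earlier voice pipelines tended to run these stages one after another without streaming, so the same chain easily added up to a second or more; overlapping and streaming the stages is what pulls the total under the conversational threshold.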
00:42:45.000 But the reviews and opinions about goggles and that kind of wearable, especially Apple's, are not so positive. People are saying it's not really comfortable, it makes you dizzy, and it prevents you from socializing with people.
00:43:13.079 I have a ton of respect for people who take the early steps and draw the fire. Having done it multiple times in my career, I know that the step that finally nails it succeeds thanks to the people who walked the desert before. So I don't think that's the issue, and I think a lot of the ideas are valid. But to take a valid idea and create a great product out of it, you need maturity of the technology and maturity of understanding. The example all of us understand: mobile phones started to be quite common in 1995, but it didn't really happen until the iPhone came, in 2007 or whenever that was. So, a decade until someone figured out the form factor that would actually be great, and I think it's like that this time too. I actually am a big believer in some kind of glasses, something you don't feel, on the eye or on the bone of the ear, and something that senses so you don't need to speak out loud; you can sort of think what you're saying, and that technology already exists and works. And I'm pretty sure... you know, when I grew up, TVs were 21-inch, black and white, and giant, and you had to go find the antenna on the roof and fix it. People at the time said that because of TV the whole social order would collapse. I think it was the opposite: one person in the village had a TV, and everybody was coming over.

Yeah, you actually still see that in some developing countries.

Right, we had a stick as a remote control!
00:45:14.040 I'm pretty sure that if we time-travelled, people would be shocked: how can you really know your people if you don't light fires with them, or hunt with them, or cross the ocean on a boat with them, or ride horses with them? I'm pretty sure every generation has its own preconceived notion of what is the must, the right way of socializing, and you move on. Maybe that's sad or whatnot, but it's very interesting. Another good friend of mine is the chief product officer of Roblox, and he had this great insight. He said that when he asks his son what he's doing when the son is playing video games, the son doesn't say 'I'm playing a video game'; he says 'I'm hanging out with my friends'. And I see it with my kids as well. So I think there is an evolution here, and I agree that there is something in the physical world; I don't think it's going to disappear. But there is this awesome website, I think it's called the Pessimists Archive or something, that shows newspaper articles from the past: everything new was so dangerous; women would become horrible because of bicycles. So we laugh at it now.
00:46:38.760 But you can't deny the data showing that the biggest epidemic now is actually loneliness, especially among young boys, and partly it's due to losing those social skills, because they're staying at home. And I think it was also in Scott Galloway's keynote that the demand for porn, and for websites like Character.AI with those virtual girlfriends, is spiking.

Yeah. I wonder what's the cause and what's the phenomenon here. Is it because of the technology, or does it actually show some deeper truth about us? Because no one is forcing you to use the technology.
00:47:38.680 technology I agree you just said that I
00:47:42.400 I think again right like when we need
00:47:43.839 also to to be careful how we use
00:47:45.280 addictive right like you know somebody
00:47:46.880 addictive kind of you know addictive
00:47:48.359 with parenthesis and some of it is
00:47:49.839 really addictive I do I I do strongly
00:47:52.200 believe right like you know there was
00:47:53.559 this um Tristian PM from Google that
00:47:56.680 said said hey you know that like
00:47:58.040 actually that the technology companies
00:48:00.680 should stop optimizing on more time on
00:48:03.240 their apps and actually you know for
00:48:04.599 well being and I think we’re seeing you
00:48:07.079 know starts of do so I 100% believe that
00:48:09.800 again the same way the fact that we can
00:48:12.359 generate you know more food doesn’t mean
00:48:15.000 we need to be obese
00:48:17.680 In the same way, the fact that we can create super fun, useful, helpful technology doesn't mean we should sit all day in a closed room using it. As a society we need to think about that, and in my opinion part of it is educating parents and educating children so that we are intelligent, informed consumers of all this. I think that's the best approach; I don't believe in laws and the like. In the end it's about education: parents, teachers and the other role models who shape behavior help children learn the impact and importance of physical activity, of eating correctly, of social activity, and the risks and value in everything. There is always value and risk in anything we do, right? Physical activity and sports carry risk too if you overdo them. Everything needs to come at the right dose and in the right way, and technology is like that too; it's just moving so fast that education and educators are often left behind. But I strongly believe the key thing is making the education system much more relevant to modern days. It's so important that people understand the body and how it works, the brain and how it works, the world and how it works. Once you understand that, you also understand how technology interferes and interacts with the world and with your body.
00:50:05.720 And, you know, this is a little bit of my own rant: I wish kids learned much more advanced physics and biology, really understanding yourself as a machine, the machines and technology around you, and the world, which is a machine in a sense. I think that would contribute the most to a better world, rather than leaving it to a few people or trying to force things on an uninformed audience.
00:50:32.000 No, of course, but as history shows, people sometimes don't know their limits; they indulge in certain things, and especially kids. It takes years to train them, and obviously it should start from a very early age, but there are so many different factors and so much social influence; it's not only your kid, it's the whole environment.
00:51:05.359 So I have a theory about that: things need to be measurable and simple, and then they work. I used to have this rule of the seat belt: a seat belt saves your life, and you get a $1,000 fine if you don't put it on, and yet it took years and years to educate people to wear it. So I generally say that something needs to be as simple as a seat belt if you want any chance of changing behavior.
00:51:38.280 But that's regulated by law, right?
00:51:40.160 Yeah, but most of us don't care much about regulation outside Switzerland, right? Most people don't even know the laws. Some examples I like to take: diet. Watching your weight used to be a crazy thing that no one did, until Weight Watchers came along with their very simple points, and then you actually saw it take off, because people don't need to understand everything; they understood that one thing and could actually watch their weight. And only "crazy" people used to go to gyms until, I think the first one was, the 10,000-step challenge by Fitbit. Then, once you have more and more things that measure and show you green or red and whatnot...
00:52:26.760 But gamification, again, it's addictive in a way, no?
00:52:30.480 Yes, but my point is exactly that well-being is not there yet.
00:52:37.440 I don't think anyone is yet measuring well-being in a good way and reporting on it, so you don't have a feedback loop. Take diabetes: the moment you had continuous glucose measurement, and I saw this because I was actually working in the medical space, people finally understood how their behavior impacts it. People are very good with feedback loops. And thanks to AI that can now watch you and listen to you, it can tell you about yourself, or about your kids, your partner, your teammates: hey, you're talking less or more, you're shouting less or more, you're using more negative or more positive words, you're actually spending time in different rooms. Very simple observations like these can give a picture not only of how good your well-being is; I strongly believe they can predict things like teenage depression, divorces, getting fired. As I was saying before about some of these apps, I believe applications that help us monitor our well-being and improve it in a very simple way will be a result of this technology. And once that's there, then just as there is a clock showing you the time, there will be a clock showing you how you are doing in your relationship, or how your children are doing. That gives you these quick feedback loops: hey, we notice you're gaming too much; we notice you're actually talking less after you play. It will be much simpler, data-driven, intuitive solutions, rather than regulation or something very theoretical. So again, I'm a very big believer that this technology, which understands how people interact, lets us understand people better, including how many times miscommunication in your personal or professional life screwed everything up. These kinds of tools will help with that.
00:54:47.799 Yeah, definitely. Maybe reminding you to send flowers to your wife for the anniversary. Okay.
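To make those "simple observations" concrete, here is a minimal sketch of such a feedback loop in Python. It only illustrates the idea discussed above: the word lists, thresholds, and function names are invented for the example, a toy heuristic rather than a validated well-being model.

```python
# Sketch of a "well-being clock": reduce each day's conversation to a few
# measurable signals and flag a trend. All lists/thresholds are illustrative.
from collections import Counter
from dataclasses import dataclass

POSITIVE = {"thanks", "great", "love", "happy", "fun"}    # toy word lists, not
NEGATIVE = {"hate", "angry", "tired", "bored", "alone"}   # a real sentiment model

@dataclass
class DailySignal:
    words_spoken: int        # proxy for "talking more or less"
    shouted_utterances: int  # proxy for "shouting more or less"
    sentiment: float         # (positive - negative) share of words

def summarize_day(utterances: list[str]) -> DailySignal:
    """Reduce one day's transcribed utterances to three simple signals."""
    words = [w.lower().strip(".,!?") for u in utterances for w in u.split()]
    counts = Counter(words)
    pos = sum(counts[w] for w in POSITIVE)
    neg = sum(counts[w] for w in NEGATIVE)
    shouted = sum(1 for u in utterances if u.isupper())  # all-caps as a crude proxy
    total = max(len(words), 1)
    return DailySignal(len(words), shouted, (pos - neg) / total)

def weekly_alert(history: list[DailySignal], window: int = 7) -> str | None:
    """Compare the last week to the week before it and flag a big drop."""
    if len(history) < 2 * window:
        return None  # not enough data for a trend yet
    recent = sum(d.words_spoken for d in history[-window:]) / window
    before = sum(d.words_spoken for d in history[-2 * window:-window]) / window
    if recent < 0.5 * before:
        return "You have been talking much less than usual this week."
    return None
```

As with the glucose monitor example, the value here is not diagnostic precision but giving a person one simple number they can watch and react to.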
00:54:56.920 So what do you think about those longevity, not freaks, but let's say pseudo-scientists, and those who are using themselves as guinea pigs?
00:55:10.760 On pseudo-science: I'm not a fan of any pseudo-science; science needs to be science.
00:55:16.760 No, I only called it that because there's a lot of criticism around it, obviously, since lots of things are being done purely at the marketing level.
00:55:26.839 So again, a charlatan is a charlatan, especially a charlatan who knows he is a charlatan, and I think we should call that out. Usually they are just taking advantage of people who are uneducated and misinformed, and selling them dreams; there has always been a business of selling dreams to people. And then there are the early pioneers doing real work who, to your point, are willing to put themselves forward as guinea pigs in order to progress science. We've always seen that, for example in the early days of radiation, and we know the consequences. I think we should all be grateful to those who sacrificed themselves to support that progress.
00:56:21.920 And maybe these are some of the places where robotics, or again AI applied to the sciences, can help, so you can actually simulate or analyze this very complex system. Because that is the problem: the human body is so complex. Each cell is complex because of the many chemical interactions, then there are the interactions between the cells themselves, and then the control of the whole system. If you want to check how something has an impact, it's really, really hard to do in a controlled experiment. But if you can create a simulated human, along the lines of what Nvidia has been talking about, or do more and more complex analysis of the chemistry and so on, say, how you slow the aging of cells via chemical interactions, that's great. Whether you want longevity or not is a personal choice; some people say it's hard enough to plan their life for 70 years. But I think everyone at least wants a better quality of life.
00:57:27.240 of Life funny enough when you talk with
00:57:29.480 a lot of these longevity are saying you
00:57:31.440 know hey the top factors are one you
00:57:34.960 need to have a a a more muscle mass
00:57:38.599 right like you know that’s actually the
00:57:40.000 the biggest contribution to longevity
00:57:41.680 and the second one is cardio right so I
00:57:44.280 think like you know my recommendation
00:57:46.280 again is like actually know you know go
00:57:47.839 to the gym and that actually until
00:57:50.960 longevity is is ready you’re already
00:57:53.720 getting you know most of what it gives
00:57:56.480 um but yeah I think it’s a you know
00:57:59.319 definitely there is a market for it and
00:58:02.119 I think it’s like for itself it’s a good
00:58:04.559 thing I think like everything again you
00:58:06.680 know it needs to be done in the right
00:58:08.160 way and definitely again you know
00:58:09.720 anything that is science needs to be in
00:58:11.760 a scientific way as a consumer you
00:58:13.559 should not buy something that is
00:58:14.920 supposed to be science from a
00:58:16.760 non-scientist right like you know you go
00:58:18.319 to a doctor that is a doctor and you you
00:58:20.880 you buy a a computer from computer
00:58:23.240 company right like like I’m surprised
00:58:25.839 sometimes that this is
00:58:27.760 not common sense the biggest problem of
00:58:31.440 The biggest problem of society right now is obesity, right? And that's why, as I mentioned before, Ozempic is crazily popular right now, especially in the US. I just fear that people, humanity, like to strive for shortcuts without minding the long-term consequences, or without learning the lesson.
00:59:01.079 I do not believe in delegating that to governments. Yes, governments should play their role, but I don't believe in delegating it completely. Again, I think the solution is that people should actually understand. It's funny: most people don't understand what happens when they eat, why cells get old, why cardio is important and what it does, what role cholesterol plays in the body, or sugars and other carbs. You don't need to get to PhD level, but I actually think that with a little bit of effort you can get there, and then you can start asking questions: okay, you're saying this solves it; can you explain to me how exactly it interacts with the system in a way that actually helps me?
00:59:53.280 think I’m and people are spending it’s
00:59:55.880 not not like you know we don’t go to
00:59:58.280 school right like we spend a ton of
01:00:00.480 money on school and a ton of time on
01:00:02.200 school the two most kind of important
01:00:04.599 resources that we have you know anywhere
01:00:06.200 between 10 to 15 years right that’s kind
01:00:08.359 of maybe it’s taught in a way that it’s
01:00:11.000 not interesting but like yes exactly but
01:00:13.599 I think like we’re like like I think
01:00:15.480 like the only thing I would come and say
01:00:17.079 like people need to be much more
01:00:19.280 opinionated on the quality and the type
01:00:22.920 of schooling that the they and their
01:00:24.920 kids are getting I think that’s the
01:00:27.200 biggest issue and that’s why I would put
01:00:28.760 like you know mo most of the energy of
01:00:33.599 society’s runting I I think if we fix
01:00:37.400 that you know the the money and time is
01:00:39.599 there just like fix the content and and
01:00:42.880 I think that will solve a lot again you
01:00:44.599 know he got again he’s like yeah sure
01:00:46.559 uneducated people just don’t know what
01:00:48.319 to ask and then they kind of they tend
01:00:50.200 to believe and if you’re convincing
01:00:52.760 enough you win them but it’s very tricky
01:00:55.680 it’s very hard to trick people when they
01:00:57.440 know the domain yes but it’s also the
01:01:00.160 question of um access to education yes
01:01:04.559 Yes, but I strongly believe that if you were podcasting me 200 years ago, or 100 years ago in some countries, or even today in some countries, and I told you that everyone should know how to read and write, or that everyone should know basic mathematics, you would tell me I'm crazy: what do you mean, women will know mathematics? In some countries that's still the reaction today. And yet basic education is now orders of magnitude beyond what it was before. So I actually don't think it's impossible, or hard, or anything like that.
01:01:42.839 Yeah, it's a matter of doing it.
01:01:47.920 Yeah, and I actually think technology will help. I see how my kids are learning compared with how I was taught, with Brilliant and all these other apps; it's so easy. And some of the examples we saw yesterday from OpenAI and Google are about exactly this kind of personalized teaching: you don't need pre-written textbooks any more, you can just ask the assistant. Explain this to me; teach me sines with a basketball example; show me a picture; I don't understand this part. So looking at this technology, I think it's actually solving the problem of how we can do personalized education for everyone, because scaling teachers is very, very hard, but scaling teaching assistants isn't. So, going back to the same pattern: AI and technology have the potential to solve their own problem of educating people on how to use them correctly.
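As an illustration of the "scaling teaching assistants" point, here is a minimal Python sketch of the prompting pattern described above. The function names and prompt wording are invented for the example, and the model call is left as a placeholder rather than any specific vendor's API.

```python
# Hypothetical sketch of a personalized lesson request: instead of a fixed
# textbook page, the prompt is built around the learner's own interests.
def build_lesson_prompt(topic: str, interest: str, confusion: str | None = None) -> str:
    """Compose a tutoring request like 'teach me sines with a basketball example'."""
    prompt = (
        f"Teach me {topic} using an example from {interest}. "
        "Go step by step, include a picture if you can, "
        "and finish with one question to check my understanding."
    )
    if confusion:
        # The learner can point at the exact part they did not get.
        prompt += f" I did not understand this part: {confusion}. Re-explain it differently."
    return prompt

def ask_assistant(prompt: str) -> str:
    # Placeholder: wire up whichever model API you actually use.
    raise NotImplementedError("Plug in the model API of your choice here.")

# Example matching the conversation:
# ask_assistant(build_lesson_prompt("sines", "basketball"))
```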
01:02:47.319 Yeah, at this stage I guess maybe this is the best we can do for ourselves. But I don't know; thinking in futuristic terms, do you think we will ever get to the stage where we will be uploading our knowledge?
01:03:04.279 I think yes. There is all this discussion of augmented humans, but we're already augmented: I'm augmented by my car, since I'm not walking to places; I'm augmented by my phone, by my pen and pencil, by my glasses, by my earphones, by my computer. Will we bring the augmentation closer and closer, and will we use it as memory? I think yes, and in a sense we already do. You have a mobile phone with access to the whole internet; ask yourself what the delta is between that and you knowing it yourself. Does it matter whether the phone is in your hand or inside your hand? Honestly, I don't think it's a huge difference. It just feels more secure that you can turn the phone off, but you could turn the other off as well, right? So yes, I think we are actually much closer to that than it seems. Just think about what it took you as a child to know things and to find things out, versus now.
01:04:14.039 No, but as a human you learn by doing, by experimenting, by failing and just retrying, and I can't comprehend how that would make a human feel. Would we lose motivation? Would we find more enjoyment and time to do other things? I don't know. No, it's just too far out; let's focus on what present technology can do.
01:04:44.799 Okay, so let's end on a very optimistic note. Our whole conversation has felt very optimistic, but let's add to it: what do you wish for Future Fund to achieve in the next few years?
01:04:59.680 The reason I started it is that I really wanted to contribute to this kind of technology for good. I really want to be part of supporting good new tech, to help direct it, and to help great founders build great companies that will impact our future. That was my biggest motivation: I believe in the potential, and I can help it at least in some way. So I would love it if in, say, three years we have been key contributors to a bunch of really good technology and products coming out of Europe and other places, where we can actually say: these are changing people's lives for the better. And I would love for us to be successful enough to do some pro bono work, investing ourselves in things that support organizations and others with technology, so that we use technology even more proactively to build a better future. And I just want to say: sometimes we look at technology as something foreign or artificial or whatnot, but when I look at a bird building a nest or a beaver building a dam, I don't think of that as artificial or outside nature. The beaver is part of nature, and what it builds is also part of nature. We are part of nature too, and so is the technology we build; it's made of materials that are elements in nature. So all we are doing is basically adding more complexity to nature, but it's still nature, and in a sense we're improving it, by taking sand and turning it into GPUs. So yeah, I hope we take much more sand and make more out of it in a way that creates a better place. The more silicon, the better.
01:07:09.880 Okay. I really wish that for you and your partners, and hopefully in a few years' time we will get back together and you will tell me all about those successes.
01:07:27.920 Thank you, thank you, Yariv, and have a lovely, lovely day.
01:07:29.880 You too, thank you. I just loved it; it was super fun. Thank you. Bye-bye.