The wait is finally over, so get ready for an electrifying episode: from the high-stakes world of AI in politics (and the 2024 “Year of Elections” angle) to the groundbreaking innovations in healthcare and community engagement, we uncovered it all. Kevin shared some shocking ways AI is reshaping elections, manipulating voter data, and even influencing public opinion through social media. We also touched on data privacy, cybersecurity, and the fight for underrepresented voices like the Latino community. Kevin is a huge advocate, and having been raised in Pico Rivera, a Latino area of Los Angeles, he’s seen and heard it all.
Transcript
00:00:00.080 a lot of it was just kind of like I
00:00:01.719 mentioned the the things that I
00:00:03.280 experienced and and not all of it was
00:00:05.960 directly me but it was just the lack of
00:00:08.200 investment in the community it was the
00:00:11.120 poverty that I saw people uh uh facing
00:00:14.320 on a daily basis it was you know we we
00:00:16.920 needed we survived on food
00:00:19.199 stamps you know so that we could eat
00:00:21.760 regularly I mean there was it was a
00:00:24.039 constant you know I would watch my mom
00:00:25.960 be just stressed out about can I pay
00:00:28.560 this bill can I pay this bill can this
00:00:30.679 and you know there just wasn’t that same
00:00:33.680 opportunity it didn’t feel in our
00:00:35.520 community nobody around us was a doctor
00:00:38.360 or a lawyer or a business leader or
00:00:40.800 anything like that it was everybody was
00:00:44.280 very very blue collar very very working
00:00:47.000 class and struggling to get by and you
00:00:49.520 know at that time in in the 80s and
00:00:51.760 early 90s in in California you could
00:00:54.719 afford to live there and and have a a
00:00:58.680 very like low paying job or a very
00:01:01.359 working class job but you know that’s
00:01:03.440 that’s kind of a a different time now
00:01:05.560 it’s a bygone era unfortunately but um
00:01:09.600 that that’s really what I think drove me
00:01:12.200 was was this idea of like we can do
00:01:14.159 better we are better I I see the people
00:01:16.520 that I go to school with who have no
00:01:19.680 resources and are just brilliant and
00:01:22.119 it’s it’s not it always felt like there
00:01:24.400 was a racial element to you know black
00:01:26.240 and brown communities of that we’re just
00:01:28.439 not as smart as our our white
00:01:31.240 counterparts and that didn’t feel right
00:01:33.960 you know as I was as I was growing up
00:01:35.600 and the things that I saw it was no no
00:01:37.240 no we are just as smart we’re just
00:01:38.640 underresourced we’re we’re purposely
00:01:40.560 disinvested in and that that really
00:01:43.119 drove me hi this is your host Kamila
00:01:46.200 Hankiewicz and together with my guests we
00:01:48.840 discuss how Tech changes the way we live
00:01:51.439 and
00:01:52.520 work are you
00:01:55.719 human Kevin it’s a
00:01:58.560 pleasure it’s wonderful to meet you too hi Cam
00:02:02.240 initially I found out about your
00:02:05.439 work in a CNBC article uh you gave some
00:02:10.440 insights about the dangers of uh of the
00:02:15.440 new technology and
00:02:17.440 voting and I know that you’ve been
00:02:19.760 working extensively on uh campaigning
00:02:23.160 like political campaigns what made you
00:02:26.440 like so interested in this yeah I mean
00:02:30.040 in terms of being interested one thank
00:02:31.640 you for having me here I’m very excited
00:02:33.760 to uh to meet you and talk to you and
00:02:35.959 I’ve
00:02:36.879 watched oh thank you oh uh yeah I’m very
00:02:40.640 excited uh I hope I can live up to those
00:02:42.760 other guests uh so in terms of how I got
00:02:45.879 into it you know I grew up in in East LA
00:02:48.640 um in California East Los Angeles you
00:02:50.760 know I was in a a low-income Community
00:02:52.840 low-income household you know we had our
00:02:54.760 challenges you know Latino area um and I
00:02:59.120 saw politics it was something that I was
00:03:01.319 just always interested in it was it was
00:03:03.920 I would watch you know speeches I would
00:03:06.480 watch news coverage of what was
00:03:08.000 happening at the national level but as I
00:03:09.799 started really understanding it as I got
00:03:11.440 a little bit older into Junior High in
00:03:13.200 high school I realized that that was the
00:03:16.040 way to affect change you can do things
00:03:18.360 at a business level you could do things
00:03:20.640 elsewhere you know working in the
00:03:22.239 nonprofit World which is where I’m at
00:03:23.959 right now but at the same time you know
00:03:26.720 being able to affect policy whether
00:03:29.280 that’s at the local level the state
00:03:30.840 level uh the federal level that is where
00:03:34.319 real change happens and that’s where you
00:03:35.799 can really improve people’s lives or
00:03:37.840 conversely if you don’t want to improve
00:03:39.720 people’s lives where you can really
00:03:41.120 destroy communities um I think the
00:03:43.120 United States has a a history of doing a
00:03:45.120 little bit of both lots of yeah lots of
00:03:47.560 cases yeah yeah and and and so you know
00:03:51.680 that was really my what what piqued my
00:03:54.079 interest in getting involved in that
00:03:55.920 World um you know when I went to college
00:03:58.640 you know I’m a first generation college
00:04:00.239 student um or college graduate you know
00:04:03.239 when when I went to college it was
00:04:04.480 trying to figure out okay what is my
00:04:06.840 path here in the political world I was I
00:04:09.040 majored in political science that’s what
00:04:10.560 I studied but I was trying to figure out
00:04:13.200 how can I impact it am I a policy person
00:04:16.680 am I an operations or a finance campaign
00:04:19.639 person what do I do and I’ve always been
00:04:22.120 a good writer I’ve always been
00:04:23.680 interested in public speaking um I’ve
00:04:26.440 never been shy in front of a camera uh
00:04:29.120 or in front of a crowd of people and so
00:04:31.800 as I as I kind of tried to figure out
00:04:34.240 what worked for me I realized
00:04:36.080 Communications and public relations and
00:04:39.080 that type of work really spoke to me it
00:04:41.360 allowed me to be creative which is
00:04:43.039 something I I enjoy doing I like being
00:04:45.840 able to to work with other creatives or
00:04:48.680 think of new ideas or new ways to uh
00:04:51.160 promote something or push something out
00:04:52.800 there as opposed to just kind of working
00:04:55.479 with numbers where there’s not really
00:04:57.560 variability it’s just a a fix fix this
00:05:00.120 is where we’re going one way or the
00:05:01.680 other and so Communications ended up
00:05:04.280 being where I where I went and I’ve been
00:05:07.639 fortunate enough to work for elected
00:05:09.320 officials work on large and small
00:05:11.720 campaigns write speeches uh I’ve worked
00:05:15.600 a ton with the media you know over the
00:05:18.360 last 20 years or so however long my
00:05:20.759 career has been now I don’t want to put
00:05:22.240 a specific number on it because then
00:05:24.319 that makes me old uh but I I am really
00:05:29.319 uh you know that’s that’s really been my
00:05:31.680 my focus is is you know helping shape
00:05:34.880 the
00:05:35.800 narrative and it’s it’s always been
00:05:38.919 grounded by the experiences that I had
00:05:41.240 as a child and growing up exactly that
00:05:43.720 would be my question was there any
00:05:46.639 particular event you experienced or you
00:05:50.440 someone close to you experienced uh
00:05:53.319 which made you um you know try to change
00:05:57.600 the status quo and I know that you you
00:05:59.560 are a very uh big advocate of the Hispanic uh
00:06:02.880 Community was there something about that
00:06:05.919 or was there also maybe um some
00:06:09.440 influence from your family or some like
00:06:12.520 someone close to you who who showed you
00:06:15.440 that um politics or um driving the
00:06:18.599 message uh is one of the you know most
00:06:22.319 influential things you can do to change
00:06:25.000 things yeah that’s a great question I
00:06:27.400 mean the driving the message portion of
00:06:29.199 it was really just where is my strength
00:06:32.280 where can I get involved and I’m not a
00:06:35.039 policy wonk you know so it felt more My
00:06:39.240 Level you know how do we break this down
00:06:40.800 so people can understand it how do we
00:06:42.720 make this understandable for the average
00:06:44.720 person so they know how this policy how
00:06:47.360 this elected official how this election
00:06:49.680 how this impacts this person in their
00:06:52.599 daily life um in terms of you know
00:06:55.720 experiences and family you know my
00:06:57.960 grandfather was a Teamster uh back in
00:07:01.800 the 50s and 60s you know he was a
00:07:04.280 a truck driver he was a Latino truck
00:07:06.160 driver in Southern California um you
00:07:08.960 know he was maybe the staunchest
00:07:12.000 Democrat I think I’ve ever met in my
00:07:14.240 life and there was a lot of me growing
00:07:16.479 up just mijo you always vote Democrat
00:07:19.039 you always vote Democrat and you know if
00:07:22.199 if I had had actual disagreements with
00:07:24.560 with the policy I wouldn’t have voted
00:07:26.080 Democrat or I would have gone the other
00:07:27.440 way but there was you know you kind of
00:07:29.080 have little voice and and um you know it
00:07:32.560 it just that it spoke to me you know
00:07:35.599 this the the policies of of the the
00:07:37.879 Democratic party ended up speaking to me
00:07:39.560 and that was kind of where I went that
00:07:41.720 was my my movement um but you know a lot
00:07:44.919 of it was just kind of like I mentioned
00:07:46.960 the the things that I experienced and
00:07:49.159 and not all of it was directly me but it
00:07:51.599 was just the lack of investment in the
00:07:53.720 community it was the poverty that I saw
00:07:57.159 people uh uh facing on a daily basis it
00:08:00.120 was you know we we needed we survived on
00:08:02.960 food stamps you know so that we
00:08:05.120 could eat regularly I mean there was it
00:08:08.120 was a constant you know I would watch my
00:08:10.240 mom be just stressed out about can I pay
00:08:13.199 this bill can I pay this bill can I pay
00:08:15.039 this and you know there just wasn’t that
00:08:17.520 same opportunity it didn’t feel in our
00:08:20.199 community nobody around us was a doctor
00:08:23.000 or a lawyer or a business leader or
00:08:25.520 anything like that it was everybody was
00:08:28.919 very very blue collar very very working
00:08:31.680 class and struggling to get by and you
00:08:34.159 know at that time in in the 80s and
00:08:36.440 early 90s in in California you could
00:08:39.360 afford to live there and and have a a
00:08:43.320 very like low-paying job or very
00:08:46.000 workingclass job but you know that’s
00:08:48.080 that’s kind of a a different time now
00:08:50.240 it’s a bygone era unfortunately but um that
00:08:54.480 that’s really what I think drove me was
00:08:57.320 was this idea of like we can do better
00:08:59.240 we are better I I see the people that I
00:09:01.440 go to school with who have no resources
00:09:04.920 and are just brilliant and it’s it’s not
00:09:08.360 it always felt like there was a racial
00:09:09.640 element to you know black and brown
00:09:11.360 communities of that we’re just not as
00:09:13.560 smart as our our white counterparts and
00:09:17.200 that didn’t feel right you know as I was
00:09:19.519 as I was growing up and the things that
00:09:20.839 I saw it was no no no we are just as
00:09:22.640 smart we’re just underresourced we’re
00:09:24.519 we’re purposely disinvested in and I
00:09:26.600 guess you didn’t that really drove me
00:09:28.959 yeah and you didn’t have um the right
00:09:31.079 representation right so people
00:09:33.399 couldn’t vote they couldn’t be vocal
00:09:36.519 about their
00:09:37.920 needs yeah I mean people voted but at
00:09:40.399 the same time it was nobody had nobody
00:09:43.720 was was super super involved in what was
00:09:45.920 going on because when you’re just trying
00:09:47.560 to keep your head above water you’re not
00:09:49.480 thinking let me dive deep into these
00:09:51.200 candidates it’s just I’m going to close
00:09:53.519 my eyes and choose at the voting booth I
00:09:55.240 vote because it’s my civic duty but I
00:09:57.120 don’t vote because I’m incredibly well
00:10:00.040 informed about what’s going on because I
00:10:01.640 can’t be because I don’t have the
00:10:03.160 resources to be able to sit there and
00:10:05.279 research each of the candidates and this
00:10:06.720 is before Google this is before the
00:10:08.240 internet you know and and it just made
00:10:11.040 it much more difficult and you kind of
00:10:12.480 had to just take everything at face
00:10:13.920 value which was really unfortunate and
00:10:16.160 then there were you know especially in
00:10:17.519 the Latino Community there’s language
00:10:19.360 barriers and and people just kind of you
00:10:22.000 know I remember there were times where
00:10:23.720 people in my family would vote based off
00:10:26.040 of last name it was like oh this is a
00:10:28.040 Latino running for office vote for them
00:10:30.240 you know and they had no idea what they
00:10:33.640 were actually advocating for mhm
00:10:36.720 absolutely and that you know that can be
00:10:39.240 a that that’s not how I would Advocate
00:10:41.480 people go about choosing candidates
00:10:43.360 based on what their name
00:10:45.560 is well first step maybe like a
00:10:48.880 bit more uh chance that they will uh
00:10:53.240 favor you or like they will
00:10:55.279 listen to the Hispanic
00:10:58.360 community the community and
00:11:01.320 that’s the yeah and you um you talked
00:11:07.000 about the 2024 elections it’s going to
00:11:10.959 be worldwide worldwide uh I think
00:11:13.760 already in Taiwan the there was already
00:11:16.720 uh the first elections right like
00:11:19.040 it has to be Taiwan uh Venezuela us
00:11:23.519 obviously Russia so lots of lots of
00:11:26.360 Elections around the world and in the
00:11:29.720 article um you were saying that emerging
00:11:34.279 technology like AI um
00:11:38.120 Can in a way help people to understand
00:11:42.399 to to to even Pro process what’s being
00:11:46.959 on offer what’s what’s on the table but
00:11:49.279 also it brings um dangers um so how do
00:11:53.120 you see and let’s talk about dangers
00:11:55.800 actually because there are lots of
00:11:57.760 voices lots of lots of of opinions that
00:12:01.079 um you know the elections can be
00:12:04.560 manipulated um Whoever has more money
00:12:08.360 they can um you know they can invest in
00:12:12.399 AI algorithms which will uh spread
00:12:17.959 misinformation yeah there is definitely
00:12:19.880 a dark side to to AI um and I will say I
00:12:23.320 will preface this by saying that I am
00:12:25.240 much more positive on the potential of
00:12:28.079 AI in elections specifically here in the
00:12:30.920 United States yeah than a lot of people
00:12:33.639 there’s there’s a lot of there’s a lot
00:12:35.800 of rightful I don’t want to say rightful
00:12:37.440 fearmongering but there is a lot of
00:12:39.639 raising the alarm and and that’s
00:12:42.120 absolutely correct but I think sometimes
00:12:44.320 there’s also just a a clear push back
00:12:48.839 against anything new and anything
00:12:50.560 different so I will preface that with
00:12:52.920 this but there are there is the
00:12:55.480 potential for AI to weaken democracy I
00:12:58.199 mean and that is misinformation and deep
00:13:00.560 fakes I mean that’s I feel like that’s
00:13:02.160 one of the most significant risks is
00:13:04.399 creating and spreading false information
00:13:06.639 like you were talking about um you know
00:13:09.000 deep fakes are so hyper realistic and
00:13:11.519 they can be they’re already being used
00:13:13.760 in some instance instances to to mislead
00:13:16.480 voters or or to discredit candidates uh
00:13:19.240 there was something that happened in the
00:13:21.240 mayoral election in Chicago
00:13:23.920 Illinois um where there was uh basically
00:13:27.560 someone who manipulated one of the
00:13:29.040 candidates voices and put it out there
00:13:31.240 as though they said a thing that they
00:13:32.720 didn’t actually say um there was
00:13:35.920 something that
00:13:37.720 happened it was the DeSantis the Ron
00:13:41.320 DeSantis um campaign here in in America
00:13:46.240 um he just dropped out of the race but
00:13:48.920 it was had something to do with the the
00:13:50.639 DNC I don’t remember the specific
00:13:52.160 example but but it had to do with the
00:13:54.160 the Democratic National Committee and
00:13:56.199 and they created like these this is Joe
00:13:58.480 Biden America or this is what happens if
00:14:00.079 Joe Biden wins or something like that
00:14:02.120 where they created this dystopian
00:14:03.800 looking picture that looked real but it
00:14:06.480 was Ai and that’s that’s an issue is
00:14:08.480 people not actually um making it clear
00:14:11.519 when they’re using AI not saying hey AI
00:14:13.639 was used to do this um then there’s I
00:14:16.160 mean there’s a lot of issues though
00:14:17.199 there’s data privacy concerns um you
00:14:20.320 know AI-driven campaigns rely heavily
00:14:22.199 on data so that raises a little bit of
00:14:24.600 that concern of of privacy about ethical
00:14:27.800 use of personal information you know
00:14:30.199 voters might not be aware and there’s
00:14:32.199 not really anything in place to say you
00:14:33.720 have to tell voters how their data is
00:14:35.800 being used in terms of AI um you know we
00:14:39.120 already have an issue especially in
00:14:40.920 America and I imagine across the world
00:14:43.120 um Echo Chambers and polarization it’s
00:14:45.639 gotten worse and worse over the last 20
00:14:47.880 years I’ll say um and AI algorithms do
00:14:51.399 tend to show users content that just
00:14:53.240 kind of aligns with their existing
00:14:55.240 beliefs you know it’s it’s just a loop
00:14:58.120 exactly it’s it’s the Doom loop as we
00:14:59.759 call it um you know and then there’s
00:15:02.240 manipulation and behavioral influence I
00:15:04.120 mean AI can be it can subtly manipulate
00:15:07.440 voter behavior um opinions it raises
00:15:10.720 ethical questions too just you know the
00:15:12.600 extent to which it is acceptable to
00:15:14.320 influence voters campaigns are all about
00:15:16.320 influencing voters but when you’re
00:15:18.800 influencing utilizing this super tool
00:15:22.639 there has to be some sort of check and
00:15:25.320 balance uh you know to ensure that
00:15:27.199 you’re not going above of what is
00:15:29.720 ethically
00:15:30.959 acceptable exactly and so don’t you
00:15:34.680 worry that you know like I said whoever
00:15:38.000 has more money will have a larger
00:15:41.240 proportion of let’s say
00:15:44.160 attention and and room
00:15:47.880 for misinformation I mean the more money argument
00:15:50.880 especially here because America doesn’t
00:15:52.880 have campaign Finance where where it’s
00:15:55.279 uh Public Funding of campaigns it’s just
00:15:57.519 that’s why you see most people in
00:15:59.480 Congress most people who win elections
00:16:02.199 or who generate the most publicity are
00:16:04.120 just billionaires or people with lots
00:16:05.959 and lots of money who can afford to run
00:16:08.160 for office who can afford to take that
00:16:10.560 time and pay a staff and you know do all
00:16:13.399 of that and so utilizing AI and throwing
00:16:16.079 AI into that mix is not going to change
00:16:18.199 it but you know my big focus is there
00:16:21.199 really needs to we we really need to
00:16:23.560 focus as a as a country here and I think
00:16:25.800 just across the world on mitigating those
00:16:29.160 potentially damaging influences of AI
00:16:32.279 there’s so much transformative
00:16:34.319 possibility and benefit not just in the
00:16:37.040 political space but across uh multiple
00:16:39.800 sectors but we really need to be focused
00:16:42.440 on how do we regulate this correctly so
00:16:46.040 that we can harness the benefits and
00:16:48.920 mitigate the dangers of it um I’m that’s
00:16:53.240 one thing that I’m very concerned about
00:16:56.360 here in the US is you know the last
00:16:59.800 piece of uh technology internet uh um
00:17:04.760 regulation that we saw was in
00:17:07.000 1996 it was section 230 of the of the
00:17:10.760 communications decency act that was the
00:17:12.480 last time we did anything substantial to
00:17:15.480 regulate the internet there there is no
00:17:18.400 like AI act I I I remember there was
00:17:22.839 some some they they issued something on
00:17:26.559 the White House yes yeah yeah President
00:17:29.280 Biden issued that’s a little more of
00:17:32.039 guidelines as opposed to you know it’s
00:17:34.120 an executive order and and there’s a
00:17:35.919 little bit of binding nature to it but
00:17:37.600 it’s not a codified piece of legislation
00:17:41.039 that would have to go through Congress
00:17:43.080 would have to go through the House and
00:17:44.039 Senate and be signed you know you can
00:17:45.880 you can get around certain things with
00:17:47.640 executive orders but there’s there’s
00:17:50.440 only so much power that the president
00:17:52.600 has and that that’s kind of being shown
00:17:55.960 but but it was an important step because
00:17:59.559 you know President Biden’s executive
00:18:01.200 order um there was a republican
00:18:03.159 candidate for president who who dropped
00:18:05.240 out but his name was Will Hurd and he
00:18:07.559 had a whole um he had a whole segment on
00:18:11.559 on AI on the fact that we do need to be
00:18:14.400 regulating it and so again there’s a
00:18:17.360 conversation happening my concern is if
00:18:20.520 anybody watched the social media kind of
00:18:24.159 hearings the Congressional hearings I
00:18:26.120 want to say it was 2018 or 2019
00:18:29.240 it was embarrassing it was absolutely
00:18:31.159 embarrassing for members of Congress
00:18:32.840 because there was a clear lack of
00:18:35.799 understanding of how social media Works
00:18:38.159 what it is yeah I think it was very it
00:18:40.919 was a bit of a par it was embarrassing
00:18:43.320 absolutely it absolutely was I mean and
00:18:45.200 that’s what happens you don’t expect a
00:18:46.679 bunch of 70 and 80 year olds to be fully in
00:18:49.760 the know as to what’s happening and that
00:18:51.559 has nothing to do with like oh they’re
00:18:53.280 too old but they’re elected officials
00:18:56.240 and their focus isn’t spending all of
00:18:58.000 their time on social media and it is up
00:19:00.799 to their staff who just statistically
00:19:03.039 are much much younger than them to be
00:19:05.440 able to get them up to speed and then
00:19:07.679 there’s the issue of you have to want to
00:19:09.400 learn about this but I am concerned
00:19:12.200 because AI is I mean there is a ton of
00:19:14.960 it that I don’t know and it is extremely
00:19:18.840 complex and that is very concerning to
00:19:21.679 me as we’re thinking through how do we
00:19:25.200 regulate this how do we ensure that we
00:19:27.320 actually get something moving through
00:19:29.559 Congress that can that we can use a real
00:19:32.280 piece of like Dynamic effective
00:19:34.720 legislation that can do those things
00:19:36.679 that I mentioned that can harness those
00:19:39.120 benefits and and help to transform a lot
00:19:41.320 of elements of society but also help
00:19:44.360 mitigate those very very serious risks
00:19:48.240 um and that’s where I am concerned
00:19:50.400 because I I don’t feel a lot of
00:19:53.840 uh I don’t feel a lot of pride in the
00:19:56.960 fact that like social media completely
00:19:59.640 confounded Congress and that that makes
00:20:02.360 me nervous for how do we address
00:20:05.360 AI you
00:20:07.280 know unfortunately of you know Europe is
00:20:11.159 is not doing any better we had the AI
00:20:15.120 act released I think it was in December
00:20:17.919 but it was mainly also more guidelines and
00:20:21.600 and mainly in a in sense in in a focus
00:20:25.440 of how corporations how businesses are
00:20:28.440 supposed to act ethically not
00:20:32.679 really not really anything about
00:20:35.880 politics per se so we are all in this
00:20:40.080 chaos together I would say so yeah what
00:20:44.480 what do you think would be a good um
00:20:46.840 solution to
00:20:48.919 that uh the solution to how do we uh
00:20:52.919 yeah like to having government
00:20:55.760 understand the real like you said like
00:20:59.480 um the potential uh risks and
00:21:02.760 opportunities right so um I don’t know
00:21:05.520 is there
00:21:06.440 any Board of members is there any
00:21:09.080 advisory boards
00:21:11.120 formed which is a hopefully unbiased or
00:21:15.880 like how how does it how does it look at
00:21:19.039 this
00:21:19.919 point yeah I that’s a good question I
00:21:22.320 think there are a couple of things
00:21:23.720 popping up um one you know we’re relying
00:21:28.440 pretty heavily on Private Industry and
00:21:31.640 business to kind of do the right thing
00:21:34.640 on their own because there’s not any set
00:21:36.919 of guard rails or guidelines for them
00:21:39.520 it’s just kind of you know we use the
00:21:41.799 term here in America like the wild west
00:21:43.880 which you know is kind of just
00:21:47.279 lawlessness
00:21:48.960 like that’s kind of what it feels like
00:21:51.159 it’s kind of like the wild west right
00:21:52.720 now with AI and and that’s okay you know
00:21:55.279 if there’s a lot of good actors it’s not
00:21:57.840 like we we can’t think of it in this
00:22:00.000 really cynical way of there are only Bad
00:22:02.200 actors on the stage and we have to worry
00:22:05.200 about that so you know there are there
00:22:07.679 are organizations and companies that are
00:22:10.080 self-regulating and and trying to avoid
00:22:13.559 because of the fearmongering around AI
00:22:15.279 they are trying to avoid those negative
00:22:18.279 aspects and they’re taking it upon
00:22:19.799 themselves which is really important and
00:22:21.840 I think that’s something that gets
00:22:23.000 overlooked is the kind of ethical
00:22:25.400 business leader who is looking to do the
00:22:28.400 right way instead of just what’s going
00:22:30.039 to make me the most money I don’t care
00:22:31.640 about the consequences you know people
00:22:33.360 always people always bring up when we
00:22:34.760 talk about AI they bring up the movie
00:22:36.480 the the Terminator movie series and it’s
00:22:38.960 it’s like okay but that’s not super
00:22:42.159 likely you know we’re not we’re not in
00:22:44.960 this like crazy super villain business
00:22:47.720 leader era where you know they’re just
00:22:50.400 going to do whatever they can and
00:22:51.760 destroy the destroy the world in order
00:22:54.200 to make an extra dollar like there’s
00:22:56.679 some craving people but we don’t have I
00:22:58.640 don’t think that there’s that really
00:23:00.360 that level of person in you know in AI
00:23:03.880 at least uh I I can’t speak for other
00:23:06.000 Industries um but then there’s also
00:23:08.919 groups like AI 2030 so AI 2030 is a
00:23:11.760 spin-off of or or it’s a a project of
00:23:14.240 fintech for good which is all about
00:23:16.679 focused it’s focused on um utilizing
00:23:19.840 Tech and emerging technology and and
00:23:22.080 like I mentioned harnessing those
00:23:23.480 benefits while mitigating the risks uh
00:23:26.039 you know I full disclosure I recently
00:23:28.080 joined their Advisory Board um just to
00:23:32.200 help make my voice heard a little bit
00:23:33.960 too uh they’re they’re International um
00:23:37.600 they are really focused on creating
00:23:40.120 those regulations and and uh you know
00:23:42.760 advocating with government entities to
00:23:45.159 to put in smart effective uh regulation
00:23:50.440 that will also allow for the growth of
00:23:53.400 AI and and allow it to be utilized to to
00:23:56.279 Really um uh show its transformative
00:23:58.880 nature and show its transformative power
00:24:00.480 so I I think there’s a few things we can
00:24:02.480 do I’m sure there’s other things that
00:24:03.760 I’m completely missing that I don’t know
00:24:05.440 about yet that are possible ways to to
00:24:08.600 regulate and and use AI and and kind of
00:24:12.279 move away from this darker
00:24:14.360 side I think unfortunately you know one
00:24:16.640 of the things people are reading though
00:24:18.000 right now is just every every other
00:24:20.240 article is here’s why this person is
00:24:23.279 warning about
00:24:25.440 AI right like the
00:24:29.480 yeah clickbait and that’s that’s
00:24:31.000 unfortunate exactly it’s a lot of
00:24:32.960 clickbait sorry my voice here um that’s
00:24:36.840 uh that’s unfortunate because that kind
00:24:39.600 of puts us behind the eightball a little
00:24:41.240 bit in terms of starting the
00:24:42.880 conversation if we’re starting the
00:24:44.720 conversations from a a place of fear and
00:24:48.080 this is going to destroy Humanity well
00:24:50.799 that’s you know that’s that’s not a
00:24:52.880 productive way to start the conversation
00:24:54.720 it really should be from a neutral here
00:24:57.000 are the good here here’s the bad how do
00:24:59.279 we do more of this and do less of this
00:25:01.720 what regulations make sense and and I
00:25:03.960 think that there is a as with anything
00:25:05.760 in the world as with anything in life
00:25:07.279 there’s always a balanced uh perspective
00:25:10.480 that we can all take and so um I I’m
00:25:12.960 hoping for that is this initiative you
00:25:16.080 are a member of is it related to the work
00:25:19.440 you are doing around health and like
00:25:22.240 medical and health sector you’re doing
00:25:24.080 for um you is is it United States of of
00:25:28.320 care yeah United States of care no they
00:25:30.919 are totally separate um they’re not
00:25:34.000 intertwined with one another whatsoever
00:25:35.720 it’s more of a um you know I’m I’m a
00:25:38.320 volunteer on that Advisory Board I am
00:25:40.559 not being paid it’s it’s really just a
00:25:43.080 it’s a it’s a labor of love uh we’ll say
00:25:45.880 um like some of the things that I do
00:25:47.559 like being a parent it’s a labor of love
00:25:49.039 I don’t get paid for it but
00:25:51.200 uh you are paid with love right I’m paid
00:25:56.679 with love and and screaming fits
00:25:59.799 occasionally yes exactly yeah yeah this
00:26:03.039 is a non nonprofit organization and you
00:26:06.200 focus there on Healthcare in the US yeah
00:26:09.600 United States of care is a nonprofit
00:26:12.200 advocacy organization uh really focused
00:26:14.919 on transforming the American Health Care
00:26:17.039 System um there will be an AI component
00:26:20.480 in terms of just kind of how we’re
00:26:22.799 thinking about the world you know how
00:26:25.240 we’re thinking about AI’s potential in
00:26:27.240 healthcare but we’re very very early uh
00:26:29.880 you know in that aspect of it so deep
00:26:32.360 problems to to to be solved right
00:26:34.960 exactly especially in the American
00:26:36.960 Health Care System there are some
00:26:38.159 serious problems um you know but we we
00:26:40.720 do things a little bit different I’m the
00:26:42.000 chief Communications officer there I’ve
00:26:43.840 been there for almost two years um you
00:26:46.399 know our big focus is listening to
00:26:49.240 people you know that kind of makes us
00:26:51.039 different you look at a lot of advocacy
00:26:52.600 organizations and not to throw shade at
00:26:54.679 anyone there’s a lot of groups doing
00:26:56.279 great work but there’s you know you you
00:26:58.919 kind of get a lot of that Think Tank
00:27:00.520 mentality of we’re going to think about
00:27:03.039 it we’re going to get all the smartest
00:27:04.279 people in the room and we’re going to
00:27:05.640 decide what policy we’re going to
00:27:07.399 advocate for and then we’re going to
00:27:08.559 write a white paper and put it out there
00:27:10.720 we don’t do that you know that we’re not
00:27:12.960 here to just tell people hey this is
00:27:16.000 what you want because this is going to
00:27:17.240 be best for you we don’t feel like that
00:27:19.840 is the way to go about creating durable
00:27:22.640 change it’s really about being deeply
00:27:24.880 committed to understanding unique
00:27:27.559 healthcare challenges that are faced by
00:27:28.919 communities across the country and we do
00:27:31.360 that through extensive listening and
00:27:33.200 engagement you know we bring everybody
00:27:35.240 to the table you know policy makers
00:27:37.520 patients Business Leaders entrepreneurs
00:27:40.080 thought leaders uh you know it’s all
00:27:42.240 about how do we how do we bring
00:27:43.960 everybody to the table to the table and
00:27:45.720 create solutions that are going to work
00:27:48.159 for everyone um you know it’s it’s all
00:27:51.200 about creating lasting people- centered
00:27:54.039 policy change and that can really make a
00:27:56.080 difference in the landscape you know
00:27:57.600 it’s it’s about affordable Dependable
00:28:00.679 personalized care that people can
00:28:03.240 understand um as an organization we we
00:28:06.240 operate kind of at the intersection of
00:28:07.880 people policy and politics um because
00:28:11.159 the hope is that what we push for is
00:28:13.720 going to have a real positive impact uh
00:28:17.120 on people’s lives uh we are nonpartisan
00:28:19.960 so we do have great relationships on
00:28:22.399 both sides of the aisle um I always say
00:28:24.799 that in the political world there’s no
00:28:26.279 permanent allies no permanent enemies uh
00:28:29.159 I might we might be working with someone
00:28:31.399 on one thing and we’re working very
00:28:32.880 closely and then we’re opposed to them
00:28:34.919 on something completely different and it
00:28:36.679 doesn’t mean that they’re good or bad or
00:28:38.679 anything like that it’s just we have a
00:28:40.840 difference of of philosophy we’re
00:28:42.679 advocating for this you’re on board with
00:28:44.480 this you’re not on board with this and
00:28:46.240 that’s okay you know it’s it’s it’s okay
00:28:48.720 for there to be disagreement without
00:28:50.399 demonizing one another um you know one
00:28:53.799 of the things we’re working on right now
00:28:55.480 is site neutral billing and so this is
00:28:57.640 kind of this idea of how do we reinvent
00:29:02.279 the way that hospital billing practices
00:29:04.720 work there’s uh the billing practices
00:29:07.240 right now for some hospitals they’re
00:29:09.039 charging more um with these these things
00:29:12.960 called facility fees so uh where people
00:29:15.519 are getting these unexpected charges
00:29:18.120 after going to a doctor’s office that
00:29:19.760 they didn’t used to get but because
00:29:21.000 there’s been so much uh consolidation in
00:29:24.440 the hospital space you’re seeing larger
00:29:27.039 hospital systems kind of swallow
00:29:28.640 up small practices and people don’t get
00:29:31.600 that information beforehand there’s not
00:29:33.240 a lot of transparency and it’s it’s
00:29:36.559 difficult because people have a hard
00:29:37.679 time affording that but at the same time
00:29:39.760 hospitals are critical in so many
00:29:42.960 communities uh you know some in some
00:29:45.000 communities there is one hospital and
00:29:47.519 that’s where everybody goes to and if
00:29:49.320 that disappears there’s no hospital in
00:29:52.000 that Community anymore and that’s that’s
00:29:54.159 not a great place to be so we’re you
00:29:56.720 know some of the work we’re doing is
00:29:58.760 hospitals don’t really like it but at
00:30:00.559 the same time it doesn’t mean that we
00:30:02.440 are enemies with uh with hospitals you
00:30:07.799 know absolutely it’s just kind of a look
00:30:10.440 we disagree on this but there’s so much
00:30:12.519 we do agree on and hospitals are so
00:30:15.440 critical to the work that we do and so
00:30:17.559 you know there’s it’s that that idea
00:30:19.240 there’s no permanent allies there’s no
00:30:20.600 permanent enemies um we’re very
00:30:22.880 nonpartisan and we want to work with
00:30:25.360 everyone because our whole goal is just
00:30:27.799 making the Health Care system better for
00:30:29.840 people um you know my specific role is
00:30:32.480 all about communicating for impact uh
00:30:35.000 you know we I have to effectively
00:30:37.120 communicate uh through various mediums
00:30:39.440 our vision our Solutions uh and I have
00:30:41.919 to make sure they resonate with
00:30:43.519 reporters with the communities that we
00:30:45.840 we work with and then also with those
00:30:48.600 who might be opposed to us on certain
00:30:50.440 things and um you know so we’re kind of
00:30:53.840 we’re bridging that Gap so there’s these
00:30:56.639 trying to bridge the gap between high-level
00:30:58.360 policy discussions and then just the
00:31:00.600 day-to-day Healthcare experiences that
00:31:02.880 people in the United States have H yeah
00:31:06.240 I heard some like horror stories
00:31:09.799 about the US um Health Healthcare
00:31:13.240 systems like in in UK like you don’t
00:31:16.919 think about it
00:31:19.519 NHS has its downsides but it works and
00:31:23.080 even my friends from America like I
00:31:25.880 remember when one of them came and she
00:31:28.720 had some stomach pain or something and
00:31:30.399 she went to hospital she was treated and
00:31:33.559 then she was so surprised she didn’t
00:31:35.600 need to pay because in any emergency you
00:31:38.639 don’t need to pay even if even if you’re
00:31:41.399 a tourist so like I cannot imagine how
00:31:44.840 much of a terrible situation some
00:31:48.960 of the communities uh experience like
00:31:52.480 between not going to the hospital and
00:31:55.960 being in pain and and going to
00:31:58.519 hospital and getting horrible bills
00:32:02.360 which you are not able to
00:32:04.039 pay yeah that’s and that’s scary you
00:32:07.039 know that is a choice that people have
00:32:09.600 to make in this country which is not
00:32:12.200 something that we want you know that’s
00:32:13.679 something we we try to avoid I mean with
00:32:16.919 uh with our listening work we have found
00:32:19.760 that people do think that the Health Care
00:32:22.279 system is broken and in a way it is but
00:32:25.159 it’s not broken Beyond repair um but the
00:32:28.760 the biggest issue for most people in in
00:32:31.760 the country uh is the cost it’s the cost
00:32:34.840 of healthcare and so you know we’re
00:32:38.080 talking about high drug prices the the
00:32:41.559 facility fees that I talked about a
00:32:43.000 minute ago you know there’s
00:32:44.480 administrative uh expenses and then
00:32:46.720 there’s kind of this focus on treatment
00:32:49.720 over prevention you know the primary
00:32:51.559 system in the United States is fee for
00:32:53.159 service and so that’s essentially a
00:32:55.440 doctor or provider or whomever it is
00:32:57.440 gets paid by the amount of services that
00:33:00.000 they perform so the amount of tests that
00:33:02.399 they run the amount of different things
00:33:04.519 that they do for the patient um and then
00:33:07.760 there’s the you know the issue of
00:33:09.039 unequal or inequal uh access um you know
00:33:12.240 not everyone can get the healthcare that
00:33:13.559 they need like I talked about and that
00:33:15.200 comes down to things like where they
00:33:16.360 live or how much money they make um and
00:33:18.919 it’s complex you know the American
00:33:20.799 Healthcare System is not straightforward
00:33:22.480 it can be a maze and it can be
00:33:24.720 incredibly hard to navigate for both
00:33:26.440 patients and providers so um there are
00:33:29.039 some serious policy reforms that need to
00:33:31.480 happen you know strategies that control
00:33:33.480 costs that’s like drug price negotiation
00:33:36.039 we saw something around uh um last year
00:33:40.200 that the Biden Administration kind of
00:33:41.720 LED and that ended up being a piece of
00:33:44.559 legislation that that got through but it
00:33:47.559 uh capped the cost of insulin for uh for
00:33:51.320 those living with diabetes which is a
00:33:53.519 very serious disease especially here in
00:33:55.320 the US where it is very prevalent um you
00:33:58.360 know we’re it’s capping those costs at
00:33:59.840 $35 a month which is okay a substantial
00:34:03.200 move forward and so more drug price
00:34:05.519 negotiation would be huge um you know
00:34:08.000 you are seeing a bipartisan push kind of
00:34:09.760 what I mentioned around uh Hospital
00:34:11.918 billing practices uh and that’s helpful
00:34:13.879 but that is also just a very small part
00:34:16.199 of the issue you know on the larger
00:34:18.280 scale you know I talked about the fee
00:34:19.879 for service thing and that is a big deal
00:34:23.359 because we need to be putting an
00:34:25.520 emphasis on prevention
00:34:27.599 and not just treatment and treating the
00:34:29.839 whole person as opposed to treating them
00:34:32.440 as you know individual pieces of a yeah
00:34:35.760 yes absolutely absolutely you know we we
00:34:39.000 actually last year um we did uh uh some
00:34:43.119 some polling so we did some polling
00:34:44.879 research on and some messaging research
00:34:46.760 as well on value based care so that is
00:34:49.800 the payment model where uh um providers
00:34:52.520 get paid based on health outcomes
00:34:55.280 instead of just the number of services
00:34:56.639 they perform so by a 4-to-1 margin people
00:35:00.280 support this type of system they they
00:35:02.599 loved it they loved the fact that they
00:35:04.320 get to spend more time with their
00:35:06.079 doctors they loved that it was about
00:35:08.520 prevention they loved that it was
00:35:10.720 personalized to what they needed it
00:35:12.480 wasn’t just we’re going to throw a bunch
00:35:14.400 of services at you or a bunch of tests
00:35:16.320 at you and just see what happens they
00:35:18.560 loved that but they hate the term value
00:35:21.520 based because it makes them think of low
00:35:23.920 cost low quality value brands at like a
00:35:26.880 Supermarket so we actually recently us
00:35:29.960 of care we rebranded that model as
00:35:32.200 Patient First Care and that was based
00:35:34.520 off of what resonated in that messaging
00:35:37.320 research of what people would want to
00:35:39.720 hear what would make them excited about
00:35:42.280 that if they were to go to their
00:35:44.000 doctor’s office or or whatever it
00:35:45.880 happens to be um you know and I think a
00:35:48.320 full switch away completely away from
00:35:50.680 the fee-for-service model to this
00:35:52.760 Patient First Care model where it’s all
00:35:55.440 about the holistic person and treating
00:35:57.720 them overall as opposed to individually
00:36:00.520 you know based on their body part or the
00:36:02.520 part of their body um you know I think
00:36:05.240 that is something that would one help to
00:36:07.079 lower costs but also lead to just
00:36:09.599 overall Better Health outcomes and
00:36:11.839 technology and AI can always play a
00:36:13.880 supportive role in that um you know it’s
00:36:16.000 it’s about making healthcare more
00:36:17.280 efficient and accessible but again we
00:36:20.240 have to keep equity and privacy at the
00:36:21.800 Forefront of that though uh we can’t you
00:36:24.640 know we want to keep it efficient but
00:36:27.240 we’ve just got to be mindful and smart
00:36:29.160 about how we’re implementing these Tech
00:36:30.920 Solutions um it’s not a small task it’s
00:36:34.560 it’s very complex it’s very multifaceted
00:36:38.160 uh but I think if we with a concerted
00:36:40.280 effort and and serious serious
00:36:43.000 collaboration across different sectors I
00:36:44.680 think you know in the US specifically we
00:36:46.720 can make significant strides and and I
00:36:48.760 know that’s something we’re trying to do
00:36:49.960 at us of care yeah but you know there
00:36:52.319 are some tasks which are pretty
00:36:55.040 straightforward and pretty much manual
00:36:57.319 like um you know when I don’t know how
00:36:59.720 it works in the US I guess I guess it’s
00:37:01.960 similar where um a patient comes into
00:37:04.920 the doctor’s office everything which is
00:37:07.680 being said it can be already transcri uh
00:37:11.640 transcripted right so the doctor doesn’t
00:37:13.920 need to focus on writing on typing but
00:37:16.920 uh just speaking with the with the
00:37:19.119 patient and getting more information out
00:37:21.160 of him so I think like these these are
00:37:23.640 the things which are not so risky to
00:37:28.000 implement of course it’s a huge huge um
00:37:31.400 task in in in any way thinking of how
00:37:34.960 old school how Legacy based are some of
00:37:37.960 those systems um I work in public sector
00:37:41.680 as well so yeah tough tough but but it’s
00:37:47.760 um yeah I think it’s there’s lots of
00:37:50.119 room for improvement but another um I
00:37:53.200 wanted to say uh about the the new
00:37:56.280 approach approach um you you mentioned
00:37:59.040 about in a way it reminds me of the
00:38:02.599 Eastern um Eastern medicine right like
00:38:05.480 they I I heard that in I think it’s in
00:38:08.800 China where people pay the doctor for
00:38:12.119 keeping him keeping them healthy so they
00:38:15.760 pay to prevent to to to like for the
00:38:19.200 doctor to prevent any
00:38:22.040 illness yeah I mean absolutely this
00:38:25.319 Patient First Care idea
00:38:27.760 more widely known as value based it’s
00:38:29.720 definitely not a uniquely American idea
00:38:32.040 it’s kind of something that that is seen
00:38:33.640 across the world um you know I think the
00:38:36.920 United States has just been a
00:38:38.920 little there’s this idea of
00:38:41.280 individuality you know in the US that
00:38:43.359 everybody kind of buys into it kind of
00:38:45.319 gets in ingrained in you from an early
00:38:47.560 age and um you know it’s kind of one of
00:38:51.040 those things that we people care people
00:38:53.839 like this people like the idea but it
00:38:55.760 has to
00:38:57.040 we have to kind of put our unique
00:38:58.720 American spin on it and so um you know I
00:39:02.560 I definitely wouldn’t recommend talking
00:39:04.319 about you know China China’s Healthcare
00:39:06.839 System as a hey that’s the beacon for us
00:39:09.599 to be focused on you mentioned that
00:39:11.800 someone else was was their
00:39:14.040 first yeah exactly exactly but you know
00:39:17.240 more of I I do think though that this
00:39:19.240 kind of patient first model is is where
00:39:21.720 where we’re just kind of naturally going
00:39:23.839 and and it’s going to be important to
00:39:25.359 kind of push us that way
00:39:27.520 as well because you know like you said
00:39:29.480 treating person in a holistic manner is
00:39:32.599 very important um you know there’s cost
00:39:35.040 issues and all of that but you know it
00:39:36.920 it is it’s very very important uh um you
00:39:40.800 know for people to feel to to see that
00:39:44.280 their the impetus for their care is
00:39:47.720 better health outcomes you know
00:39:49.400 everybody wants to be healthy you know
00:39:52.160 the health industry is a multi-billion
00:39:55.000 dollar a year industry I’m not talking
00:39:56.640 about providers and insurance I’m
00:39:57.920 talking about smart watches and diet
00:40:01.200 books and all of this kind of stuff
00:40:03.240 that’s all kind of health related it’s a
00:40:06.400 huge huge industry and the only reason
00:40:08.319 it’s a huge industry is because people
00:40:09.920 care about their health people want to
00:40:11.920 be healthy and they want to find ways to
00:40:15.000 have better health outcomes because we
00:40:16.800 all want to spend more time with our
00:40:18.200 family we all want to be more active we
00:40:21.119 all want to uh you know not have to take
00:40:24.359 medicine all the time and so it is
00:40:27.040 important for us to find the ways to
00:40:29.520 integrate things into our Health Care
00:40:31.240 system that allows that and that
00:40:33.319 encourages that and you know we’ve seen
00:40:35.200 a lot of a lot of insurers are offering
00:40:37.960 you know discounts for if you get a gym
00:40:40.440 membership or they’ll offer you know my
00:40:42.720 my wife’s parents are have Medicare and
00:40:45.400 the Medicare that they have it pays for
00:40:47.280 a gym membership for them so they’ve
00:40:49.280 started going to the gym they’re retired
00:40:51.079 now they’ve started going to the gym and
00:40:53.079 it’s fully paid for and they’re seeing
00:40:55.079 the health benefits of that I mean
00:40:57.160 there’s there’s a lot it’s a people
00:41:00.560 being healthy is good business for
00:41:03.800 everyone and and so you know our whole
00:41:06.800 focus is well let’s make that Equitable
00:41:09.119 let’s make sure that everybody in
00:41:11.160 America has access to these these tools
00:41:15.640 that can help them become healthier and
00:41:18.160 live longer better
00:41:20.480 lives and do you think it’s it’s going
00:41:22.640 to be a big part of the political
00:41:25.440 campaigning is it already becoming um
00:41:29.680 you know Healthcare ends up it’s
00:41:31.160 interesting healthcare becomes a huge
00:41:34.200 part of every single election uh you
00:41:37.839 know when you in 2008 it was you know
00:41:42.599 the the Obama’s plan versus Hillary’s
00:41:45.240 plan you know and then in 2012 it was um
00:41:50.359 fear-mongering about Obamacare and then
00:41:53.599 in 2016 it was how do we push this
00:41:56.640 further how do we push Obamacare further
00:41:59.160 then in 2018 in the midterms it was
00:42:02.520 people campaigning on let’s let’s get
00:42:06.280 rid of Obamacare and by that point even
00:42:08.760 people who had previously been like no
00:42:11.680 we don’t want this there was
00:42:14.160 the now they wanted it and they were
00:42:16.480 arguing in public forums with candidates
00:42:18.760 who were saying we’re going to repeal
00:42:20.000 this the repeal and the replace ended up
00:42:21.839 failing in 2020 obviously everything was
00:42:25.480 around coronavirus and COVID and that
00:42:28.160 wasn’t an American issue that was a
00:42:29.559 worldwide issue um that became
00:42:32.680 unfortunately that argument rather than
00:42:34.680 being like a true policy you know let’s
00:42:37.800 figure out the best way to deal with
00:42:39.760 this and move forward and and rebuild
00:42:42.079 our society in a good way it kind of
00:42:43.800 became a half of the people embraced
00:42:47.440 crazy conspiracy theories and you know
00:42:50.240 and and it became this it still is kind
00:42:53.319 of this weird why did it go in this
00:42:55.400 direction instead of we should all be
00:42:58.280 working together to try and avoid this
00:43:00.200 in the future and let’s all work
00:43:02.480 together to get out of this as quick as
00:43:03.800 possible um you know that that
00:43:06.160 unfortunately was was led by some Bad
00:43:06.160 actors um but you know in 2024 you know 2022
00:43:14.200 was abortion rights you know that was
00:43:16.400 the big thing on the on the dais and so
00:43:20.160 in 24 it’s going to be
00:43:22.599 interesting I don’t know Healthcare
00:43:26.200 might be you know there we’re thinking
00:43:28.000 about it you know the way that I’m
00:43:29.200 thinking about Healthcare in terms of uh
00:43:31.599 politics is all the research shows that
00:43:35.119 most people across political parties
00:43:37.520 across demographics across racial lines
00:43:40.240 ethnic lines uh uh income across
00:43:44.640 different geographical regions of the
00:43:46.800 country they agree on most aspects of
00:43:50.160 what they want from Healthcare yeah
00:43:52.359 people mostly it’s so Universal right
00:43:55.079 you
00:43:56.720 you have nothing everybody pretty much
00:43:58.559 wants the same thing from their
00:43:59.839 healthcare system like I said earlier
00:44:01.359 they want it to be affordable they want
00:44:03.680 it to be something they can depend on
00:44:05.760 that’s there when they need it they want
00:44:07.800 it to be personalized to them they want
00:44:10.040 a personal experience with their doctor
00:44:12.480 and they want to be able to understand
00:44:13.920 it they don’t want too much jargon they
00:44:16.760 don’t want to feel confused every time
00:44:20.040 they they call their doctor or they have
00:44:21.839 an issue with with their insurance or
00:44:23.480 whatever it happens to be they don’t
00:44:24.839 want to not understand it
00:44:26.640 that’s Universal kind of across the
00:44:28.839 board so I don’t I don’t want to
00:44:31.800 prognosticate here as to what’s going to
00:44:33.720 happen in the election or or what
00:44:35.440 they’re going to talk about I think
00:44:36.760 unfortunately because of the two people
00:44:39.920 very likely to be the nominees for each
00:44:43.119 of the major parties that it’s going to
00:44:45.119 be some sort of relitigation of
00:44:47.520 2020 um so there’s that’s probably going
00:44:51.520 to be a big part of it I think the
00:44:52.680 future of democracy is also going to be
00:44:54.400 a very big uh topic in uh in that
00:44:57.440 election um so I don’t know that
00:44:59.480 Healthcare is going to be outside of
00:45:01.720 again talking a little bit more about
00:45:03.640 abortion um which is not something we
00:45:06.079 really focus on at at us of care uh but
00:45:09.760 you know outside of that I wonder if
00:45:12.400 there is going to be an opportunity to
00:45:14.440 talk about rather than here’s what
00:45:17.000 divides us as a country with all of
00:45:19.480 these topics you know with you know in
00:45:22.599 terms of right or left here’s kind of
00:45:24.760 where we all agree here’s where most
00:45:27.079 people in the country actually want to
00:45:29.079 see some movement and kind of force not
00:45:32.440 just presidential campaigns but
00:45:33.800 campaigns across the country at the
00:45:35.800 Senate level at the the house level at
00:45:38.559 the state level to really talk about you
00:45:41.119 know what regardless of who’s in office
00:45:43.240 this is the stuff that we want this is
00:45:44.960 the stuff that we need from our Health
00:45:46.440 Care system so I think that there is an
00:45:49.200 opportunity there but people have to
00:45:50.760 take that opportunity and as you
00:45:53.040 mentioned it’s not a sensational
00:45:54.480 headline so it it’s going to be one of
00:45:57.319 those things that we have to see
00:45:59.040 ethically pushed as opposed to well this
00:46:01.760 isn’t going to generate clicks so we’re
00:46:03.000 not going to talk about it yeah but like
00:46:05.319 also like you said um even in the
00:46:07.520 article when where I found you I found
00:46:10.240 found about your work is helping people
00:46:14.680 understand what’s what’s in on offer
00:46:16.760 right like helping them maybe through uh
00:46:21.240 utilizing technology like AI to even
00:46:25.400 speak their language to to yeah to show
00:46:30.200 or like to pull in information which is
00:46:32.559 relevant to to them to to like things
00:46:35.559 which they care about and also the
00:46:37.680 language
00:46:38.920 wise if it’s able to translate and and
00:46:43.119 speak your language then it’s only
00:46:46.319 better yeah absolutely I mean there’s a
00:46:49.400 lot of benefits to to AI you know that I
00:46:51.920 that I you know I talked about in that
00:46:54.200 article um you know specifically in the
00:46:57.599 election I can I can talk about a little
00:46:59.319 bit or in terms of the election process um
00:47:03.240 AI is kind of like this this super
00:47:05.319 analyst that can just sift through
00:47:07.119 mountains of data to figure out what
00:47:08.960 voters are are thinking about or what
00:47:10.920 they care about and so for campaigns
00:47:14.480 that makes it really interesting because
00:47:16.079 they can start tailoring the messages to
00:47:17.920 hit right at the heart of what matters
00:47:19.920 to people and get it down to almost an
00:47:22.520 individual level um you know there’s the
00:47:25.800 efficiency angle so AI is pretty good at
00:47:28.599 pinpointing you know for campaigns where
00:47:30.720 they should focus their their resources
00:47:33.000 and their energy so it’s kind of like
00:47:34.720 this um like a road map I know people
00:47:37.240 don’t use real Maps anymore but you know
00:47:39.520 it’s like a Google map that that tells
00:47:42.359 you where you know your message is going
00:47:43.880 to resonate the most you’re not just
00:47:45.240 kind of shooting in the dark um you know
00:47:47.839 it it’s it allows campaigns to kind of
00:47:50.839 tweak their strategies on the Fly and
00:47:54.160 and that allows them to stay aligned
00:47:55.520 with voter mood
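To make the "super analyst" idea concrete, here is a minimal sketch of how a campaign analyst might cluster free-text survey answers to see which concerns dominate and where. Everything in it is an assumption for illustration: the voter_survey.csv file, its zip_code and top_concern columns, and the choice of five clusters; real campaigns use far richer data and models.

```python
# Illustrative sketch only: group free-text voter survey answers into rough
# themes, then see where each theme concentrates geographically.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

responses = pd.read_csv("voter_survey.csv")   # assumed columns: zip_code, top_concern
texts = responses["top_concern"].fillna("").tolist()

vectorizer = TfidfVectorizer(stop_words="english", max_features=5000)
X = vectorizer.fit_transform(texts)

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0)
responses["theme"] = kmeans.fit_predict(X)

# The highest-weighted terms in each cluster centre give a rough theme label.
terms = vectorizer.get_feature_names_out()
for i, centre in enumerate(kmeans.cluster_centers_):
    top_terms = [terms[j] for j in centre.argsort()[-5:][::-1]]
    print(f"theme {i}: {', '.join(top_terms)}")

# Counting themes by ZIP code hints at where which message resonates most.
print(responses.groupby(["zip_code", "theme"]).size().sort_values(ascending=False).head(10))
```

The point is the workflow rather than the model: surface what people say they care about, then check where each concern clusters, which is also what makes the targeting questions discussed later so sensitive.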
00:47:58.520 um you know a big thing that we kind of were talking about
00:48:00.119 is like misinformation um fake news AI
00:48:04.240 can be a real Ally right there um you
00:48:07.119 know spotting and flagging
00:48:08.240 misinformation is really important and
00:48:10.280 that’s crucial for keeping things Fair
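As a rough illustration of "spotting and flagging" rather than deciding automatically, here is a toy triage classifier that routes suspicious election posts to human fact-checkers. The example posts, labels, and threshold are invented for the sketch; a production system would be trained on curated fact-check data and would always keep a human in the loop.

```python
# Toy triage sketch: score posts and flag doubtful ones for human review.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_posts = [
    "Polling places are open 7am to 8pm, check your county website for your location",
    "BREAKING you can vote by text message this year just reply VOTE",
    "Official results are certified by the secretary of state after the canvass",
    "They are secretly deleting ballots from the machines share this before it is removed",
]
train_labels = [0, 1, 0, 1]   # 0 = looks ok, 1 = likely misinformation (hand-labeled)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_posts, train_labels)

def triage(post: str, threshold: float = 0.6) -> str:
    """Flag for human fact-checkers instead of removing anything automatically."""
    score = model.predict_proba([post])[0][1]
    return "send to human review" if score >= threshold else "no action"

print(triage("Reply VOTE to this number to cast your ballot by text"))
```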
00:48:13.400 uh you know in terms of that um you know
00:48:16.760 it’s it’s chat Bots um who was it
00:48:21.200 Francisco Suarez who is the mayor of
00:48:23.800 Miami but he had a a short lived kind of
00:48:26.200 presidential campaign there he had a
00:48:28.760 chatbot on his website an AI chatbot
00:48:31.079 that was kind of like an AI Francisco
00:48:33.200 Suarez that was talking about his
00:48:36.000 platform and would answer questions
00:48:38.160 which is really interesting and that’s a
00:48:39.480 real unique way to use AI but that’s a
00:48:41.400 good way to use AI that is hearing
00:48:44.280 directly from the candidate having
00:48:46.119 something that is essentially you but
00:48:47.800 isn’t you that answers questions as
00:48:50.240 voters come to your website or you
00:48:52.200 know ask questions
00:48:54.640 of your campaign
00:48:56.319 um you
00:48:57.599 know chat Bots that can can field those
00:49:01.160 election questions in any language is
00:49:05.000 really really cool
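A Suarez-style "ask the candidate" bot can be sketched very simply. Below is a toy retrieval bot over a handful of invented platform statements; the translate() function is a placeholder for whatever translation model or service a real campaign would plug in, and none of the content reflects any actual candidate.

```python
# Toy "ask the candidate" bot: pick the platform entry with the most word
# overlap with the voter's question, then translate the answer if needed.
# Platform text, sample questions, and translate() are hypothetical.
PLATFORM = {
    "healthcare": "I support capping out-of-pocket costs and expanding community clinics.",
    "housing": "I back zoning reform and down-payment assistance to first-time homebuyers.",
    "voting": "Polls are open 7am to 8pm and polling locations are listed on the county website.",
}

def translate(text: str, target_lang: str) -> str:
    # Placeholder: a real bot would call a translation model or service here.
    return text if target_lang == "en" else f"[{target_lang}] {text}"

def score(topic: str, entry: str, question_words: set) -> int:
    entry_words = set(entry.lower().split()) | {topic}
    return len(question_words & entry_words)

def answer(question: str, lang: str = "en") -> str:
    question_words = set(question.lower().split())
    best_topic = max(PLATFORM, key=lambda t: score(t, PLATFORM[t], question_words))
    return translate(PLATFORM[best_topic], lang)

print(answer("where do I go for voting on election day"))
print(answer("what is your housing plan", lang="es"))
```

A real deployment would likely sit a language model behind this lookup, but the design choice worth copying is that the bot only restates the campaign's own published positions rather than generating new claims.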
00:49:09.119 um you know I mean the thing is it’s never going
00:49:10.599 to replace the warmth of a
00:49:12.400 face-to-face chat it’s not going to
00:49:16.400 recreate that kind
00:49:18.520 of connection that you make with
00:49:20.400 community so you still need a human
00:49:22.720 touch in elections but it can work hand
00:49:26.280 in hand with the typical model of a
00:49:29.960 campaign you know there there are there
00:49:32.200 are ways to to do that which is really
00:49:35.160 really exciting but kind of I think to
00:49:37.440 your to your point that you mentioned a
00:49:39.520 few minutes ago
00:49:40.799 too there is the issue of helping people
00:49:44.720 see the balanced view of how do we help
00:49:49.760 people see the full picture and how do
00:49:51.200 we help them get around the kind of
00:49:53.200 disinformation and misinformation that
00:49:55.319 that we see especially around um around
00:50:00.319 politics and the election and I think
00:50:02.640 the
00:50:03.359 first excuse me the first really
00:50:06.200 important thing that people need and
00:50:08.559 that needs to be mandated in some way
00:50:10.559 and you know for for right now it’s
00:50:13.559 people can go on their own and and learn
00:50:15.480 about this but I do think that at
00:50:17.960 some point we need to start offering
00:50:19.880 media literacy education um it’s just
00:50:23.440 all about teaching people how to sift
00:50:25.960 through information how to tell fact
00:50:27.960 from opinion um how to spot very clear
00:50:31.599 bias and very clearly fake news uh there
00:50:35.000 is right now an alarming increase in
00:50:37.480 disinformation you know on Tik Tok for
00:50:39.559 example aimed at younger people um and
00:50:42.480 then for those those people on Tik Tok
00:50:44.280 their grandparents are getting
00:50:45.559 information disinformation in an echo
00:50:48.040 chamber on Facebook you know their their
00:50:50.079 grandparents are and you know it’s so
00:50:52.359 easy you can enroll in media
00:50:54.760 literacy courses at most of the major
00:50:56.799 universities at least in the United
00:50:58.160 States um there’s also something called
00:51:00.280 the media literacy Institute they offer
00:51:02.799 little 90-minute courses and it’s really
00:51:05.760 it provides you like a
00:51:07.319 foundational uh understanding of being
00:51:10.319 able to spot bias um there’s
00:51:13.240 fact-checking websites there’s Snopes and
00:51:15.280 PolitiFact and factcheck.org and you
00:51:17.960 know even you know as much as I don’t
00:51:20.319 want to give too much credit to Twitter
00:51:22.680 yes Twitter Community notes and
00:51:25.680 Community notes are are mostly reliable
00:51:28.200 and that is really important I’ve
00:51:29.880 already seen it on various things shut
00:51:33.440 down misinformation where it’s clearly
00:51:36.160 there there’s a note there it’s cited
00:51:38.359 resources and that is so important that
00:51:42.119 that’s one of the few things that I
00:51:43.359 think Twitter has done really really
00:51:44.720 well is adding these Community notes to
00:51:47.000 things um transparency transparency
00:51:50.319 sorry I said that wrong transparency is
00:51:53.760 very important um you know rules around
00:51:57.400 uh political advertising and AI
00:51:59.240 use in campaigns that is transparent
00:52:01.119 like we talked about um you know that if
00:52:04.200 AI is being used to create or or Target
00:52:06.799 content that needs to be clear it needs
00:52:08.680 to be clear that this video is made with
00:52:10.640 AI this picture was made with AI uh this
00:52:13.640 script that you’re seeing somebody read
00:52:15.079 from was made with AI
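One way to make "this was made with AI" operational is a machine-readable disclosure that travels with each asset, in the spirit of content-provenance efforts such as C2PA. The sketch below is just one possible shape for such a label; the field names and the .disclosure.json sidecar convention are invented for illustration, not an existing standard or campaign tool.

```python
# Sketch of a "made with AI" disclosure written alongside a campaign asset.
# Field names and the sidecar convention are illustrative only.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from pathlib import Path

@dataclass
class AIDisclosure:
    asset_file: str          # the video, image, or script being labeled
    generated_with_ai: bool
    tool_description: str    # kept deliberately generic, e.g. "text-to-video model"
    paid_for_by: str         # who is responsible for the ad
    created_at: str

def write_disclosure(asset_path: str, tool: str, sponsor: str) -> Path:
    """Write a .disclosure.json sidecar next to the asset file."""
    record = AIDisclosure(
        asset_file=Path(asset_path).name,
        generated_with_ai=True,
        tool_description=tool,
        paid_for_by=sponsor,
        created_at=datetime.now(timezone.utc).isoformat(),
    )
    sidecar = Path(asset_path).with_suffix(".disclosure.json")
    sidecar.write_text(json.dumps(asdict(record), indent=2))
    return sidecar

# Example: label a hypothetical AI-generated ad before it is published.
print(write_disclosure("healthcare_spot.mp4", "AI video generation", "Example Campaign Committee"))
```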
00:52:18.400 um I do I talked about it earlier I I do
00:52:20.760 remain I don’t want to say dubious but I
00:52:23.440 do remain a little dubious as to the
00:52:25.599 viability of regulatory AI legislation just again
00:52:28.079 given the disaster of the social media
00:52:30.400 Congressional hearings a few years ago
00:52:32.160 um and then the fact that it’s been 28
00:52:36.079 years since internet legislation in the
00:52:37.920 United States um and even this this is
00:52:41.079 even worse right because it’s it’s the
00:52:44.119 changes happening in especially with AI
00:52:47.640 are so exponential so fast and there is
00:52:51.640 no no regulation no body which would be
00:52:57.440 up to date with what’s
00:53:00.200 happening yeah I mean and that’s and
00:53:02.359 that’s a whole other issue is does there
00:53:05.680 does there need to be a permanent task
00:53:08.640 force you know we have you know I know
00:53:11.319 in a lot of countries you have the
00:53:12.480 ministry of this you know in the US we
00:53:15.000 have the department of this that is
00:53:17.599 going to be critical is do we need a
00:53:20.480 department of AI uh you know I I don’t
00:53:24.240 know the answer to that
00:53:25.680 I don’t know if it needs to be part of
00:53:29.319 an existing uh an existing Department in
00:53:32.119 the federal government I don’t know if
00:53:34.040 it needs to be its own thing um I don’t
00:53:37.640 know if it needs to be just kind of
00:53:39.520 American specific or if we do need some
00:53:42.000 sort of AI United Nations that ensures
00:53:46.599 that we have kind of standards across
00:53:49.200 the board um I I think I remember
00:53:52.520 hearing recently that Europe has taken a
00:53:55.200 few steps um to regulate AI I think
00:53:58.640 they’ve gotten a little further than we
00:54:00.160 have uh at the moment I don’t know
00:54:02.079 enough about yeah we we like to brag
00:54:03.680 about it it’s the AI act but as I told
00:54:06.520 you it’s more like of a guidance yet
00:54:09.520 yeah yes yes the main topic is um
00:54:12.640 transparency and ethical use of of AI
00:54:16.319 but how it’s going to play out like no one
00:54:19.680 knows it’s it’s swimming in deep
00:54:22.559 water for sure I mean and that’s
00:54:25.720 again all of that makes this really
00:54:28.599 difficult is is we could call
00:54:31.520 for regulation we can call for
00:54:34.480 legislation we can call for
00:54:35.839 international cooperation but we need
00:54:38.559 experts involved we need real deep deep
00:54:42.079 AI experts who can say this doesn’t make
00:54:45.680 sense this is great this is where we
00:54:47.920 need to go you’re way off base hey you’re
00:54:50.720 on the right path we need that we need
00:54:53.559 people to constantly inform us of like this is
00:54:56.440 where we see things going because you
00:54:58.319 know you mentioned it it moves at
00:54:59.960 lightning speed the the development of
00:55:02.559 not just AI but technology in general it
00:55:04.520 moves so fast but it doesn’t come out of
00:55:08.200 nowhere it’s the general public might
00:55:10.680 feel like it comes out of nowhere but
00:55:12.240 the people working in this space can see
00:55:14.640 ahead at the very least a couple of
00:55:16.960 years they can see what is coming down
00:55:19.400 the pipe and if we have people who are
00:55:24.599 doing this behind the scenes work if we
00:55:26.720 have the Googles and the apples of the
00:55:29.160 world and the other technology gurus you
00:55:32.280 know who are able to constantly keep
00:55:36.240 international governing bodies um up to
00:55:39.680 speed as to what is happening and what’s
00:55:41.880 coming and how we need to prepare for
00:55:44.319 that we can be ahead of this instead of
00:55:46.680 constantly reacting I would love for us
00:55:50.400 as an International Community to be more
00:55:52.960 proactive in terms of regulation and seeing
00:55:55.960 the benefit and mitigating the risks of
00:55:58.400 new technology instead of oh my God this
00:56:01.960 came out of nowhere how do we regulate
00:56:04.000 it how do we fix this what do we do this
00:56:06.799 could destroy all of us I don’t want to
00:56:09.000 always see how humanity is going to be
00:56:11.599 destroyed by insert new technology here
00:56:15.039 I want to see hey we know that this is
00:56:17.200 coming we knew that this is happening
00:56:18.680 this is what we have in place to ensure
00:56:20.680 that we’re getting the best of it and
00:56:22.160 mitigating the worst um again I may be
00:56:25.480 asking too much but I I don’t think that
00:56:28.400 that is uh out of the realm of
00:56:30.400 possibility no no no you’re right but
00:56:32.880 it’s a very difficult thing
00:56:35.839 to tackle because if you over
00:56:38.960 regulate and that’s one of the biggest
00:56:41.200 concerns of like in in
00:56:43.760 Europe uh big companies like the Googles
00:56:47.000 and apples
00:56:48.680 they are ready and they have enough
00:56:53.160 um legal resources to to navigate those
00:56:56.839 regulations and those smaller players
00:57:00.079 who you know you need more players you
00:57:03.760 need more competition to balance things
00:57:06.359 out yeah and those small players will be
00:57:10.839 cut off from the race right because they
00:57:13.559 won’t be able to afford to make a
00:57:15.640 mistake say so they will
00:57:18.200 just
00:57:19.920 stop no I totally agree with that I am I
00:57:22.920 am a capitalist you know I mean
00:57:25.839 capitalism has been very good for me
00:57:27.960 you know it’s gotten me out of where I
00:57:29.839 started to where I am now but um you
00:57:32.440 know I agree with you I think
00:57:33.359 overregulation can be a problem and I
00:57:35.359 think it kind of circles back to what I
00:57:37.599 was talking about in terms of what the
00:57:39.480 organization where I work United States
00:57:41.559 of care the way that we approach
00:57:43.480 Healthcare policies we approach it from
00:57:45.480 bringing all of the players including
00:57:48.640 the patients and the the everyday people
00:57:51.599 who are impacted by this we bring
00:57:53.440 everyone to the table and it’s like
00:57:55.240 let’s find Solutions and regulation that
00:57:58.359 works for all of us I don’t think
00:58:00.280 regulation should exist in a vacuum I
00:58:02.400 think it needs to be bringing these big
00:58:04.920 companies in what do you need and what’s
00:58:07.400 too far but you know if we don’t go
00:58:11.119 where you think is too far does
00:58:13.440 that
00:58:14.280 allow these Insidious elements to
00:58:17.000 potentially pop up you know how do we
00:58:19.160 ensure that we allow for the most
00:58:21.760 possible growth and benefit and when I
00:58:24.359 say benefit I don’t just mean benefit to our
00:58:26.760 everyday lives I mean benefit for the
00:58:28.400 companies that are actually banking on
00:58:30.079 this and are providing the jobs for
00:58:31.640 people and creating you know real
00:58:33.680 Economic Opportunity from this
00:58:35.839 technology how do we create the most of
00:58:39.640 that while mitigating the potential
00:58:41.680 risks and you know the people who are
00:58:44.680 the doomsayers and the this is going to
00:58:47.039 destroy everything because with every piece
00:58:48.599 of technology we have people like that
00:58:51.480 so with that there’s always an element
00:58:53.799 of Truth you know again I I made fun of
00:58:55.920 the fact that people refer to the
00:58:57.799 Terminator movies when we talk about AI
00:58:59.880 you have the doomers and The Optimist
00:59:02.359 right exactly you have the doomers and
00:59:04.400 the optimists and I think the truth kind
00:59:05.960 of lies somewhere in the middle as with
00:59:07.880 everything there there is potential Doom
00:59:11.480 around every piece of technology and
00:59:13.119 that’s why the regulations are so
00:59:14.520 important but you know there’s there’s
00:59:17.680 so much possibility too and so you know
00:59:21.039 there there are regulatory responses but
00:59:23.799 it’s about bringing everyone to the
00:59:25.520 table and listening to what everybody
00:59:27.960 needs and figuring it out from there um
00:59:31.000 you know I mean a couple things that I’m
00:59:33.760 thinking of are like data privacy laws
00:59:36.400 you know I mentioned privacy I mentioned
00:59:38.079 data um the EU has I might get these
00:59:41.839 letters wrong the GDPR GDPR yes the
00:59:46.119 GDPR yes and and I think something like
00:59:48.200 that that works really well people
00:59:50.280 people know and agree how their data is
00:59:52.720 being used transparency is really really
00:59:54.880 really important so there needs to be
00:59:56.839 like I said rules especially in the
00:59:58.440 political world because the
01:00:00.000 political world can be explosive
01:00:02.760 sometimes and so having rules that make
01:00:04.839 it clear when AI is behind a political
01:00:07.039 ad or is behind microtargeting just so
01:00:09.240 people have that transparency you know
01:00:11.000 transparency is so important across the
01:00:13.240 board especially as we talk about AI um
01:00:16.119 and then setting up AI accountability
01:00:18.079 standards so guidelines on who is
01:00:20.559 responsible for AI driven decisions and
01:00:23.359 content uh that’s going to be important
01:00:25.400 to know and having those kind of
01:00:26.640 accountability standards are going to be
01:00:28.799 critical um you know bias and fairness
01:00:31.920 audits of AI systems could be really
01:00:33.799 helpful in catching unfair targeting of or
01:00:37.000 even exclusion of certain groups of people
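A bias or fairness audit can start with something as plain as comparing outcome rates across groups. Here is a minimal sketch of a demographic-parity check on made-up data; the column names, groups, and the 0.8 cutoff (borrowed from the common "four-fifths" rule of thumb) are all assumptions for illustration, not a complete audit.

```python
# Minimal fairness-audit sketch: compare how often an AI system selects
# (targets, approves, reaches) people from each demographic group.
import pandas as pd

decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B", "B", "B"],
    "selected": [ 1,   1,   0,   1,   0,   1,   0,   0,   0,   1 ],
})

selection_rates = decisions.groupby("group")["selected"].mean()
parity_ratio = selection_rates.min() / selection_rates.max()

print(selection_rates)
print(f"demographic parity ratio: {parity_ratio:.2f}")
if parity_ratio < 0.8:   # four-fifths rule of thumb
    print("flag for review: one group is selected far less often than another")
```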
01:00:38.880 um and then you know
01:00:41.160 misinformation I think is its own kind
01:00:42.880 of thing but you know regulations to
01:00:45.440 control AI generated misinformation and
01:00:48.079 specifically deep fakes especially
01:00:50.760 during elections would be very important
01:00:53.520 um and then I don’t know exactly what
01:00:55.960 this would look like but ethical AI
01:00:57.680 development guidelines are going to be
01:00:59.079 really important um so making sure that
01:01:01.680 AI is just being used responsibly in
01:01:05.760 every Arena where it is prevalent um
01:01:10.200 there’s there’s a lot you know there’s
01:01:11.559 the international cooperation I think is
01:01:13.440 going to be really important uh I don’t
01:01:16.559 know I don’t know if limits on AI driven
01:01:18.799 ad spending but that might be an issue
01:01:22.079 um there there’s just there’s so so much
01:01:25.280 that that needs to happen but I think if
01:01:28.480 we can even get halfway there to all of
01:01:31.440 the things that need to happen around AI
01:01:33.720 we can really harvest the benefits the
01:01:36.920 societal benefits and the economic and
01:01:39.520 and all the other benefits that we can
01:01:41.079 possibly have I mean you think of how AI
01:01:42.680 can be used it could be deployed to
01:01:45.359 improve the health care System it can be
01:01:47.599 deployed in campaigns like I’ve been
01:01:49.920 talking about in the political process
01:01:52.359 in fighting climate change I mean
01:01:55.200 there’s so much that we can we can use
01:01:58.440 it for and yeah there’s going
01:02:02.200 to be it’s gonna be a big big
01:02:04.640 conversation true but the technology is
01:02:07.160 only one part right like you need to
01:02:09.599 have humans who agree on things and who
01:02:13.079 want to communicate and there is like
01:02:15.799 you said there is so much division so
01:02:17.960 many extremes extreme communities it
01:02:22.119 feels like you know it’s black and white
01:02:25.160 but it’s not most of the
01:02:27.960 people believe in something but don’t
01:02:31.200 don’t agree with the other and there is
01:02:33.599 not
01:02:34.520 really at least in Europe what we
01:02:38.119 see is that the representatives are
01:02:40.400 always going for the extremes they don’t
01:02:43.160 represent the true citizen let’s say so
01:02:47.359 my question would be maybe you have an
01:02:50.119 answer the Holy Grail of everything um
01:02:54.119 how how can we use technology to help
01:02:57.839 people communicate better so they don’t
01:03:00.039 divide so there is not so much
01:03:03.400 division yeah
01:03:06.079 that’s tricky huh that is not an easy
01:03:10.200 question billion dollar no it really it
01:03:13.039 it really is it is the billion it might
01:03:14.520 even be the trillion dollar question as
01:03:15.839 we think about it but you know I mean
01:03:17.880 some of the things I talked about
01:03:19.319 earlier um you know factchecking media
01:03:22.599 literacy uh transparency
01:03:25.319 regulation all of that stuff is
01:03:27.240 important I think
01:03:30.400 increasing your media
01:03:32.480 consumption and and your mix of media
01:03:35.039 consumption is going to be really
01:03:36.079 important for that so not sticking to
01:03:39.079 One Source uh checking out kind of a
01:03:41.599 variety of viewpoints breaking down Echo
01:03:44.400 Chambers you know that kind of thing is
01:03:46.640 really helpful
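One low-effort way to widen that mix of sources is to pull headlines from several outlets at once instead of scrolling a single feed. A small sketch, assuming the third-party feedparser package; the RSS URLs are examples to swap for outlets you actually choose across the spectrum.

```python
# Sketch: fetch the latest headlines from several outlets so you see more
# than one framing of the day's stories. Requires: pip install feedparser
import feedparser

FEEDS = {
    "BBC": "https://feeds.bbci.co.uk/news/rss.xml",
    "NPR": "https://feeds.npr.org/1001/rss.xml",
    "The Guardian": "https://www.theguardian.com/world/rss",
}

for outlet, url in FEEDS.items():
    feed = feedparser.parse(url)
    print(f"\n== {outlet} ==")
    for entry in feed.entries[:3]:          # top three headlines per outlet
        print("-", entry.get("title", "(no title)"))
```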
01:03:50.720 um I I think that there’s a healthy skepticism of viral
01:03:53.119 content too people look at things that
01:03:55.599 go viral and they take it as gospel
01:03:58.240 truth and viral content I mean more
01:04:01.839 often than not it’s Sensational or
01:04:04.559 exaggerated information a directed script
01:04:07.440 yeah right I mean so much of it is so
01:04:11.240 being able to and maybe this is
01:04:12.920 somewhere where AI can come into play is
01:04:15.079 is utilizing that to verify
01:04:17.079 authenticity before sharing or
01:04:20.760 believing some of this stuff um you know
01:04:24.359 I’m I’m a huge advocate of of good
01:04:28.000 quality journalism and you know good
01:04:31.440 journalism is independent it digs deep
01:04:35.160 it’s balanced it’s well researched um
01:04:38.880 you know it’s not opinion and there’s
01:04:42.319 been such a fluidity between opinion actual
01:04:47.160 news and just blatant falsities that
01:04:52.319 it’s all kind of mixed together and now
01:04:55.039 people either believe the most
01:04:58.559 outlandish nonsense that they see and or
01:05:03.000 they don’t believe real
01:05:06.160 genuine you know news reporting you know
01:05:09.400 that is really just presenting the facts
01:05:11.960 and not presenting it from a point of
01:05:13.599 view um you know there’s there’s a lot
01:05:16.279 of Talking Heads you know that call
01:05:18.880 themselves journalists and that’s not
01:05:20.119 just on TV I see so many Twitter users
01:05:23.359 with journalist in their bio and they
01:05:25.200 are just blatantly pushing a point of
01:05:27.200 view they’re not conducting any kind of
01:05:29.599 informative journalism it’s just here’s
01:05:32.359 the sensationalist thing and look I’m a
01:05:34.520 journalist so you can believe me and
01:05:36.400 that’s that’s a problem you know it’s
01:05:38.520 it’s I don’t want to say it’s the same
01:05:40.760 thing but it’s not too far off from just
01:05:42.880 anybody going out and saying I’m a
01:05:44.680 doctor let me treat your illness well no
01:05:47.799 there’s training and there’s very
01:05:49.760 specific things that a doctor can do
01:05:52.200 that not any random person in society can
01:05:54.760 do and you know I think we maybe devalue
01:05:57.359 journalists and journalism like true
01:05:59.200 journalists and journalism a little bit
01:06:01.440 um and that that has definitely helped
01:06:04.880 lead to kind of a breakdown of of our
01:06:08.440 news um you know and to your
01:06:11.640 point of the
01:06:13.279 hyperpolarization in Europe there’s you
01:06:17.160 know America is no different in that way
01:06:20.720 America is so
01:06:23.000 hyperpolarized despite the fact that
01:06:25.039 most Americans live somewhere here in
01:06:26.640 the middle um but major political
01:06:30.119 parties you know tend to even minor
01:06:32.119 political parties actually tend to just
01:06:34.319 kind of go as far into embracing The
01:06:38.200 Fringe elements of their of their base
01:06:42.319 um you know I’m I’m presenting myself
01:06:45.440 here in a nonpartisan way I you know in
01:06:47.640 in my work life I do tend to be
01:06:50.319 nonpartisan because I I do kind of exist
01:06:53.079 somewhere in the middle where I can find
01:06:55.920 places to work with both sides of the
01:06:57.839 aisle um in some things and then in
01:06:59.920 other things there’s there’s just not um
01:07:02.520 I in my personal life I I am a Democrat
01:07:05.359 um and I’ve done work with the party
01:07:07.440 I’ve been involved in the party in
01:07:08.680 various elements but you know I can say
01:07:11.760 I think the specifically the National
01:07:13.599 Party does a very good job of trying to
01:07:16.720 kind of embrace that middle but they’re
01:07:18.319 also fighting against a real
01:07:22.440 serious crisis in news you know in what
01:07:25.880 is considered news and that’s that’s a
01:07:29.440 problem you know um I think they do try
01:07:33.000 to try to embrace where most people are
01:07:35.680 but there’s just so much
01:07:37.599 sensationalism there’s so much
01:07:39.200 hyperpolarization being thrown out and
01:07:41.880 there also is this issue which I think
01:07:43.680 every country in the world has where the
01:07:45.920 smallest minority of people tend to be
01:07:48.079 the loudest so the and it’s usually the
01:07:51.440 most ignorant unfortunately tend to be
01:07:54.440 the loudest people and can really
01:07:58.440 strongly influence a debate because
01:08:00.599 people are just much more swayed by
01:08:02.880 sensationalism or or it seems this way
01:08:05.160 I’m not a psychologist so I can’t say
01:08:06.559 this for sure but people seem to be much
01:08:08.720 more swayed by just how Sensational
01:08:11.559 people can be as opposed to a
01:08:13.720 well-reasoned rational argument for
01:08:16.198 something yeah because of that campaigns
01:08:19.198 Embrace that and that’s where they go
01:08:21.040 and so um I don’t know if that’s a I know
01:08:24.839 I didn’t give an answer as to like how
01:08:26.640 do we combat this but it’s it’s more of
01:08:29.560 just look verify sources check out
01:08:31.560 credibility you know blend education and
01:08:35.719 Tech and Regulation and Community
01:08:37.600 involvement I think we blend all those
01:08:39.279 together we can create a landscape where
01:08:41.679 balanced information becomes the norm
01:08:44.479 and people can make real informed
01:08:46.920 decisions and not just take whatever you
01:08:50.198 know like you mentioned the viral aspect
01:08:52.399 of things not just take
01:08:54.319 the most viral thing that is the least
01:08:56.799 informative and run with that and let
01:08:59.080 that kind of color them and and kind of
01:09:01.000 radicalize them a little bit Yeah I
01:09:03.439 think part of part of the problem is
01:09:06.719 that the issue with journalism as an
01:09:10.920 industry um it was broken right now
01:09:14.679 with the paywalls it seems to be
01:09:18.759 getting better because when you
01:09:21.238 pay you know that you’re going to
01:09:22.920 get quality
01:09:24.359 material quality uh information because
01:09:28.279 you know you can spend time and
01:09:31.080 resources on on digging into the story
01:09:34.359 instead of creating those
01:09:38.359 clickbaits chasing the
01:09:40.799 attention and getting the
01:09:42.839 advertisers which are always biased so I
01:09:45.439 think that’s part of
01:09:47.960 it and like making people understand
01:09:51.560 that if they don’t pay for something
01:09:54.040 they are being used as a product they
01:09:56.560 are being sold to
01:09:58.440 yeah yeah no I mean absolutely it’s it’s
01:10:02.880 it’s a multifaceted problem um you know
01:10:06.000 it’s not something that I think we’re
01:10:07.199 going to to just be able to solve kind
01:10:09.280 of here on this on this podcast uh
01:10:11.800 unfortunately there are some seeds some
01:10:14.360 some ideas there they’re seeds exactly
01:10:16.880 let’s let’s water those seeds and and
01:10:19.400 exactly yeah
01:10:21.840 so as a last question
01:10:24.280 because I know you are an optimist me too
01:10:26.320 like I see there is so much potential of
01:10:28.920 changing things and like um CH
01:10:32.840 challenging the status quo what would
01:10:35.400 you what would you advise to people
01:10:39.199 who want to have their voice heard
01:10:41.840 what can they do to challenge the status quo
01:10:45.440 there’s two minds
01:10:48.480 to this right I mean you have the
01:10:51.719 traditional there’s a traditional like
01:10:53.679 let’s go out and protest something and
01:10:55.440 make people uncomfortable but I think
01:10:58.640 the time for that has passed when when
01:11:01.239 you’re making people uncomfortable
01:11:02.600 you’re actually driving them against you
01:11:04.360 you’re you’re moving people against you
01:11:06.159 we’ve seen this with we’ve seen this
01:11:08.000 among young people recently um where I
01:11:12.199 think certain protests certain movements
01:11:15.480 are actually doing more harm than good
01:11:19.320 based on either who they’re targeting or
01:11:22.120 where they’re choosing to do their their
01:11:24.600 statement or how they’re choosing to do
01:11:27.560 their statements um you know I think
01:11:30.199 that that’s not a good way to to get the
01:11:33.120 word out there when you’re turning
01:11:34.560 people against you when you’re upsetting
01:11:36.080 people um you know I’ve seen certain you
01:11:38.760 know climate change is a serious issue
01:11:40.760 that is incredibly important and it
01:11:42.480 impacts all of us and we everybody in
01:11:45.480 the in the world needs to be doing more
01:11:47.440 specifically governments but when you’re
01:11:50.239 protesting by destroying works of art
01:11:53.600 you know in a
01:11:56.440 museum that’s not you know Museum
01:11:59.800 curators are not the ones who are going
01:12:01.600 to change the world and make you know do
01:12:04.920 something about climate change that
01:12:06.320 that’s it’s just a a weird way to go
01:12:08.719 about it and that’s that you’re just
01:12:10.800 pushing people against you and a lot of
01:12:12.440 times you’re pushing people who
01:12:13.480 otherwise would be on your side against
01:12:17.120 you and I’m not necessarily saying
01:12:18.800 you’re pushing them against you in terms
01:12:20.760 of the overall overarching goal what
01:12:23.600 you’re trying to do but you are pushing
01:12:25.840 them against your specific action and
01:12:28.360 your organization and that that’s not
01:12:31.080 helpful that’s not helpful for anybody
01:12:33.840 um I think there’s a couple of ways in
01:12:36.800 terms of using technology I think you
01:12:39.320 can it could be a powerful Ally to some
01:12:42.000 people you know you can leverage social
01:12:43.920 media again social media has its
01:12:46.800 problems but it is a really good
01:12:49.159 platform and it’s a unique platform that
01:12:51.239 hasn’t really been available throughout
01:12:52.960 any other time in human history to build
01:12:55.520 a following and spread your message um
01:12:58.360 you know and creating engaging content
01:13:00.199 so blogs videos podcasts all of that
01:13:03.560 kind of thing anything that gets your
01:13:05.120 your point across in a compelling way I
01:13:07.880 think is really important um you can
01:13:09.840 join online communities uh there’s a lot
01:13:12.760 of forums and groups where you can
01:13:14.840 connect with like-minded people and and
01:13:16.880 and build support and I know AI is very
01:13:19.600 much emerging so if AI is the issue that
01:13:21.639 you care about there are
01:13:23.400 absolutely online communities that you
01:13:25.560 can join there’s uh events you can
01:13:28.159 attend where you can network with other
01:13:29.920 people who agree with you I mean it’s
01:13:32.040 building networks is also very important
01:13:33.840 in this that that could be done online
01:13:35.400 or offline um and then you know kind of
01:13:38.040 to that point there’s digital tools for
01:13:39.760 organizing you know setting up events
01:13:41.840 petitions uh campaigns depending on your
01:13:44.679 time and your level of commitment um and
01:13:47.840 honestly one of the biggest things is
01:13:49.280 just staying informed so using
01:13:51.239 technology it could be Apple News it
01:13:54.080 could be having Google alerts set up it
01:13:56.560 could be following AI or Tech
01:13:59.480 journalists on social media Outlets but
01:14:02.520 keeping up with latest issues and
01:14:04.840 developments and news and that’s where
01:14:07.920 you can be informed and and know how to
01:14:12.360 challenge the status quo or if you even
01:14:14.440 need to challenge the status quo
01:14:16.120 sometimes you actually don’t sometimes
01:14:17.719 you can be a part of it and you know the
01:14:19.800 status quo or the establishment or
01:14:21.920 whatever whatever word people want to
01:14:23.880 use is kind of moving in the right
01:14:26.639 direction and you want to be a part of
01:14:29.000 that and help push them a little bit
01:14:30.520 further yeah you want to be part of that
01:14:32.679 you want to um you know sometimes you
01:14:35.679 want to be able to like hey you’re going
01:14:38.120 the right way but don’t turn that way
01:14:39.480 turn this way you know this is actually
01:14:41.520 where you want to be going a little bit
01:14:42.960 more because it’s going to be less
01:14:44.239 effective if you go in this direction um
01:14:46.880 but you can only do that if you are
01:14:48.800 fully informed and you know what’s going
01:14:50.600 on and you you have a point of view um
01:14:53.639 um that that’s really I mean I’m sure
01:14:55.800 there are other things that I’m I’m
01:14:57.520 completely missing like like you said
01:14:59.400 building a community and having your
01:15:01.600 voice heard it has never been easier
01:15:04.480 right now there are already existing
01:15:07.280 communities like Facebook like LinkedIn
01:15:09.800 wherever your point of interest
01:15:12.800 is uh you will find people who will
01:15:16.239 listen to you if you have something
01:15:18.440 interesting to say and which is factual
01:15:21.239 absolutely agree and factual that’s the
01:15:24.800 important thing don’t lie when you are
01:15:28.360 talking to the just give factual
01:15:31.239 information don’t sensationalize things
01:15:33.040 you’ll be great yeah yeah un
01:15:35.679 unfortunately if they lie and
01:15:38.679 sensationalize their thing probably they
01:15:40.960 will get more more um attention but yeah
01:15:44.639 don’t do that true but eventually that’s
01:15:46.679 eventually that’s always found out and
01:15:48.560 eventually it is you know everybody who
01:15:51.800 has utilized sensational platforms or
01:15:56.120 who has just you know pushed
01:15:58.719 disinformation or misinformation and
01:16:00.520 built a big following eventually they
01:16:02.600 crash you know it’s just it’s the
01:16:04.480 inevitability of it you know if if you
01:16:07.120 are continuously a reliable resource of
01:16:11.800 positive and correct information you’re
01:16:15.280 always going to have an opportunity to
01:16:17.400 get that information out there if if
01:16:19.320 you’re not you’re going to have a real
01:16:21.159 small window to say the most
01:16:23.159 nonsensical things you can say and then
01:16:25.639 you’re going to be done so and those
01:16:27.280 good things compound right like and
01:16:29.199 reputation can be lost in an instant
01:16:32.120 and it’s always good to do the right thing
01:16:38.760 so absolutely thank you thank you so
01:16:42.199 much for everything and it was a
01:16:44.159 pleasure to to have you here I you gave
01:16:46.719 so much information uh
01:16:50.040 so many crucial things what to do and
01:16:53.600 what not to do um in order to make this
01:16:56.960 world a little bit better because this
01:16:59.560 is what we want we want diversity we
01:17:01.560 want um the
01:17:04.440 balance yes I I think that’s what we all
01:17:07.199 want and uh I I am hopeful for the
01:17:10.639 implementation of AI into into the world
01:17:14.400 I I think there’s a lot of positive um
01:17:17.560 and as long as we can mitigate those
01:17:20.560 negatives and and the potential pitfalls
01:17:22.840 I think it’s going to be a really
01:17:24.960 transformative uh era in in human
01:17:28.360 history so we shall see yeah I’m as much
01:17:32.440 of an optimist as you are so I think
01:17:35.560 it will be
01:17:36.760 good I I hope so and and thank you for
01:17:39.679 having me on Camila I really appreciate
01:17:41.480 this has been so much fun and very
01:17:44.360 lovely and I appreciate you staying up
01:17:46.120 late to uh to do this no worries I just
01:17:49.120 have I have to wake up at 5:00 A.M
01:17:51.000 tomorrow but it’s fine