• My why
  • Are You Human
  • Understanding AI
  • Entrepreneurship Handbook
  • Skill up
  • Inspiration
Tech, business and everything in between
Are You Human
Isabel Scavetta: Global Shapers, Cybersecurity, Product Innovation | Are You Human Podcast

Download file | Play in new window | Duration: 1:00:07 | Recorded on 5th October 2023

Subscribe: Apple Podcasts | Spotify

In this episode, my guest is Isabel Scavetta. We talk about cybersecurity and deepfakes – how can we keep people safe online when society is digitising faster than policymakers and organisations can keep up? We discuss Isabel’s work with the UN and Global Shapers, and designing products with everyone’s needs in mind, not just the obvious ones. We also debate how product management can be used as a force for good, why it is important to consider intersectionality in product design (and AI products), and what value career switchers and people from underrepresented backgrounds can bring to cyber and AI teams – and much more. Tune in and don’t forget to subscribe.

Connect with Isabel: linkedin.com/in/isabelscavetta
Connect with Kamila: linkedin.com/in/hankka

Transcript

I think there are a lot of new

emerging threats in cyberspace.

We’ve spoken a bit about AI so

far in this podcast, for example.

And the combination of AI, technology

and cyberattack is really interesting.

So, for example, we talk a lot about social

engineering, so encouraging someone to take an action that

they usually wouldn’t through manipulating them online.

With the advancements of tools like creating convincing

deepfakes, being able to generate images, pictures, videos

of people saying or doing things they usually

wouldn’t, we’re starting to see that used a

lot in social engineering attacks.

AI is also enabling the scraping and processing of

data at alarming rates, which makes it easier to

target a specific user or individual, because you’re able

to get a very holistic picture of all that

data floating around in cyberspace, which, like

I spoke about earlier, people may or may not be

aware is out there, which can create a shockingly

accurate picture of what someone’s up to.

So the Nigerian prince can send you

a more accurate, more personalized request.

Knowing someone from your family. Exactly.

Using that data, they can know

your family, where you recently went.

Maybe they can pretend to be a server from

a restaurant you just went to at the weekend

because you signed up using a restaurant booking app.

And then there was a data breach.

There’s all kinds of stuff that cybercriminals

are now able to get their hands

on and leverage and use against you.

And then obviously at the national level

as well, cyberattacks are becoming more sophisticated.

It’s not just social engineering.

We’re looking at a variety of different attack

types and techniques, which are being exacerbated by

evolving technologies, being able to attack more frequently,

more often at scale, exploit vulnerabilities, which is

easier than ever, unfortunately, despite how much we’re

working to try and counter that.

Hello.

This is your host, Kamila Hankivich.

And together with my guests, we discuss how tech

is changing the way we live and work.

Are you ready?

Isabel, thanks so much for finding time.

And I know you have a crazy busy schedule,

judging by the type of projects you are working on.

So, really, thank you so much for being here.

No problem.

It’s great to be here.

So, my first question is, why cybersecurity?

Out of all the types of technical subjects,

cybersecurity seems very mysterious and is usually

chosen by guys, I would say. Yeah, it’s

a very male-dominated field in general.

Unfortunately, I left my black hoodie and mysterious cap

at home to hack on some code, and I feel

like that’s what everyone expects cybersecurity to be.

However, my journey into technology was

very unconventional in the first place.

I come from a humanities background.

I don’t think anyone would expect me

to end up in the tech industry.

But the reason why I chose cybersecurity is that I’m

really interested in how we keep people safe

in a world that is constantly evolving faster than

policy and legislation can keep up.

And I think working in the cybersecurity space

(I help build cybersecurity products) gives you

real hands-on exposure and influence to change

the way people think about security online

and keep their data private.

Okay.

And I know that you’re part of Global Shapers

and you are actively involved with the World Economic Forum.

Why do you think this is needed, and how

can youth influence global leaders to make changes?

Yeah, absolutely.

I’m the co-lead of a project at the

World Economic Forum Global Shapers called Inclusive Cyber.

Inclusive Cyber is all about getting more

diverse talent into the room when we’re

building cybersecurity policy and products.

And there are two phases to the project.

One phase is we go out and lead

workshops at London’s leading universities, coaching non-STEM

students on how to frame the skills they

already have for a career in cybersecurity.

The second part is advocacy for

the youth voice on the national and international

stage, where we take data from these workshops

and pitch to organizations like the UK Cyber Security Council

and NIST about how policymakers and organizations

can encourage more inclusive hiring and retain talent

from different backgrounds.

The core motto of Inclusive Cyber

is we want to create a cyber workforce

as diverse as the challenges that we’ll face.

We believe that having a variety of

opinions strengthens our cybersecurity approaches and therefore

security at large.

It’s been a really successful project so far.

We’ve coached hundreds of students across

the capital and we have some

really exciting speaking engagements coming up.

Sounds amazing.

And is there any project you are

particularly proud of, something which left a lasting

impression and generated some good results?

Yeah, there are some

things at the really high level and some

things at the more personal level.

At the high level, we spoke at

the UK Cyber Security Council’s Women’s Day event,

to an audience of hundreds of business leaders,

policymakers, students, et cetera, about the project,

its growth and our key strategic recommendations.

That was awesome.

But on the personal level, I always see

on my LinkedIn workshop attendees who, through

us, started to learn to code through our partner

Code First Girls, or they’ve started going for

cybersecurity interviews, or entering cybersecurity case study competitions.

And that, on a personal level, is really amazing

to see, like, the small scale impact of what

we’ve done as well as the big scale.

And why do you think we should keep people safe online?

How can we keep up with the changes?

And how can we make sure that

people are aware of digital environments?

Yeah, this would be quite a controversial one,

because often when I talk about data privacy

with people, they say, oh, but I have

nothing to hide, my life’s not that interesting.

Yes, I use these apps, I use these platforms, but

I don’t really see why that should really matter.

And to me, privacy is a human right.

The right to your data and ownership of

your data and the use and abuse of

your data should be your individual right.

So I think there’s a couple of pieces there.

I think, one, there’s education at the

public level; helping people understand not only why

their data is important, but how their data

could be leveraged, is a huge one.

When I talk to people often about, like,

AI, algorithmic design, building app services platforms, people

have no idea where their data is going

and how that can be used.

So that’s one piece like educating people.

And then the second piece is, at the organizational

and governmental level, what obligations do we have to

protect people’s privacy when they can’t themselves?

Like, how do we help people to keep

their data safe, to keep ownership of that

data, when they themselves may not even be

aware of what is happening?

And do you see governments really

taking your suggestions into consideration?

Is there any legislation happening? Yes.

I mean, obviously, at the moment, the UK

is looking at the Online Safety Bill, which

is quite controversial in many ways.

Thinking about that trade-off between how much

information governments should see so they can help

protect people, versus how much they should not

see because it’s people’s private information?

I mean, you see all kinds of debate around

this, even things like WhatsApp’s end-to-end encryption

and whether that can be used or called upon

in different court cases, for example, or as evidence.

It can’t.

It’s all end-to-end encrypted right now, but should it be?

It’s a very murky topic.

The digital transformation of society, data privacy

and cybersecurity have definitely been debated

on the national and international stage.

But in terms of conclusive actions coming out

the other side, progress is still quite slow, whilst in

society the technology is evolving day by day.

Which country do you think is

leading the way right now?

Is the UK just trying to catch up?

Yeah, that’s an interesting question.

I don’t know if I could give you a specific

country that I think is doing better than everyone else.

I think it’s a new challenge for everyone involved.

However, I think we think a lot on

the global scale about different kinds of inequalities:

economic inequality, social inequality, inequality of freedoms and rights.

And I think digital inequality is going to

be one of the huge trends that we

start to see over the next coming decades.

The countries and governments that are able to

successfully adopt, track and proliferate technology in their

societies will quite rapidly and dramatically change global

imbalances in power as they stand.

However, doing that well is really difficult because

it’s a completely new challenge in a lot

of ways for the majority of governments.

Yeah, I completely agree.

Like two weeks ago, I had a conversation with one of

the leaders in the UN who said exactly the same.

There are lots of governments which are,

how should I say it?

They don’t really see long term, so they don’t

care about educating people about the opportunities and threats.

And it’s not really about the technology itself.

When we take AI as an example, it’s

not the threat of AI per se, but it’s the

threat of being left behind, and of widening this gap

with the countries which are doing something about it.

And it will create, like you

said, more social and economic disparities. Yes.

And I feel like the introduction of kind of

like end-user-facing AI and the proliferation of

AI in a lot of tools and systems at

the moment is only exacerbating that gap.

Like, I was a UN Women UK delegate for the Commission

on the Status of Women (CSW67), and at that level, we

were even talking about the right of girls and women to

access technology, like access the Internet, access a laptop.

That digital divide, even in terms

of accessing technology, is so broad.

So then when you layer on top new and

emerging technologies, then that gap, as you say, gets

bigger and bigger and evolves faster and faster and

without intervention or consensus or better public education, it’s

only going to exacerbate a lot of the digital

inequalities that already exist.

Yes, and we live in a gap, sorry, in a bubble.

Being in London, we don’t realize,

lots of people don’t realize how

different reality is for certain countries.

And it’s tricky to make sure to include

those people in a conversation and see their

perspectives, because we are almost like creating those

regulations and policies around ourselves, excluding those types

of communities even further. Yes.

And I’m really interested in that

from a product design perspective.

So a lot of my volunteering work in my

spare time is around the digital inclusion of marginalized

groups online, or underrepresented groups at the very least.

So thinking about the broad spectrum of

gender, for example, or the LGBT+ community.

But we can take it many

more directions further than that.

But for example, I once had a catch-up with

one of the product leaders who designed the NHS

Covid app, and he talked about a lot of the

difficulties in designing it, because when you’re dealing with people’s

location data and tracking data, that can have a huge

effect on people from more difficult backgrounds.

So, for example, if we’re leveraging your location data to

see if you’re more susceptible to catching COVID, what do

you do about someone who is hiding from

an abusive partner, or someone who is going to

areas associated with people from a

certain religious background or a certain

sexuality, and things like that?

So that kind of data in certain contexts can

be hugely sensitive, and therefore that data handling and

usage is really important in product design.

So that’s kind of one way of thinking

about it, is how do we build products

that serve the needs of everyone and also

account for the fact that people have very

different realities when they’re interacting with technologies.

And how you’re designing those products can have a

huge effect on their day to day lives.

And what are some good practices

you keep seeing

for ethical product development?

I’m very interested in good product management

as a tool for responsible innovation.

So product managers are the people who are making

the decisions on how we build something, or rather,

why we’re building something for someone.

So they should be ideally very

close to their user base.

They’re understanding the problems at hand, they’re advocating for

those problems, and then they work with their team

to design a solution to that problem.

It’s really important for product managers to have a

really good understanding of their user base, the main

concerns that they have, and those concerns

from an intersectional perspective: what does

your user base look like?

How might people with different access

needs, different accessibility needs, different backgrounds,

use your product differently?

And how does your product help to facilitate that?

So we see loads of great work; for example,

Microsoft has really great accessibility guidelines

for their product development, thinking about

how someone with different access needs might need light

mode and dark mode, or read things in certain

ways, or how a screen reader might interact with

their website or their platform, et cetera.

And at the end of the day,

product managers should be

aware of all those kinds of decisions, right?

Like, how am I building something inclusive?

How am I building something that

serves my entire user base?

So that would definitely be one of my recommendations.

Just product managers always thinking holistically about who

might use their products and how they could

make sure it works for everyone, basically.

You recently posted something really cool about

cyberspace, if I can quote you.

You said, the world we live in today

is reflected in the mirror of digital society.

Very poetic.

Therefore, the inequalities that exist offline

are duplicated and amplified in cyberspace.

You realize that there are worlds which are

intersecting and that you have to take into

consideration all sorts of factors which are affecting

how we behave online as well.

So what are some emerging cyber threats that

people and organizations should be aware of, and how

can we prepare better for them?

I think there are a lot of new emerging

threats in cyberspace.

We’ve spoken a bit about AI so

far in this podcast, for example.

And the combination of AI, technology

and cyberattack is really interesting.

So, for example, we talk a lot about social

engineering, so encouraging someone to take an action that

they usually wouldn’t through manipulating them online.

With the advancements of tools like creating convincing

deepfakes, being able to generate images, pictures, videos

of people saying or doing things they usually

wouldn’t, we’re starting to see that used a

lot in social engineering attacks.

AI is also enabling the scraping and processing of

data at alarming rates, which makes it easier to

target a specific user or individual, because you’re able

to get a very holistic picture of all that

data floating around in cyberspace, which, like

I spoke about earlier, people may or may not be

aware is out there, which can create a shockingly

accurate picture of what someone’s up to.

So the Nigerian prince can send you

a more accurate, more personalized request.

Knowing someone from your family. Exactly.

Using that data, they can know

your family, where you recently went.

Maybe they can pretend to be a server from

a restaurant you just went to at the weekend

because you signed up using a restaurant booking app,

and then there was a data breach.

There’s all kinds of stuff that cybercriminals

are now able to get their hands

on and leverage and use against you.

And then obviously at the national level

as well, cyberattacks are becoming more sophisticated.

It’s not just social engineering.

We’re looking at a variety of

different attack types and techniques,

which are being exacerbated by evolving technologies,

being able to attack more frequently, more

often at scale, exploit vulnerabilities, which is

easier than ever, unfortunately, despite how much

we’re working to try and counter that. Yeah.

So are there any guidelines?

Are there any tools which can help

people who are maybe not so technical

to differentiate what’s real and what’s fake?

Yes, I think fake news was a really big public

focus a couple of years back, and now I kind

of feel like we’re seeing a second wave of that, where

you can no longer trust what you see online.

And like I spoke about in that quote, obviously

the digital world reflects our real world, but sometimes

it’s not a reflection, it’s like a false image.

And it’s really hard to tell sometimes when

it’s like a hall of mirrors, right?

Sometimes it is a real reflection,

and sometimes it’s a falsified reflection.

So how do you decide what’s real and what’s fake?

There are a couple of

techniques I would definitely recommend.

I work at an organization called CybSafe.

We do a lot of human

risk management with a behavioral science focus.

So kind of figuring out why people fall for

cybersecurity attacks and how they can protect themselves.

And a lot of the guidance that we

give is around: how does this piece of

news, information or contact make you feel?

Because often a lot of cyberattacks

are based on a very emotive hook.

Maybe you see something, you feel really scared, you

feel really nervous, you feel really excited, and just

take a pause and be like, okay, this is unusual.

It’s kind of exceptional.

Before I react or engage with this in an

emotive way, let me just fact-check it.

Let me check that URL.

Let me Google the author of this article.

Let me do a reverse search on the email

address that this has come from, for example.
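The manual checks described here (inspect the URL, look up the sender address) can be partly automated. Below is a minimal Python sketch of one such check: flagging an email whose display name claims a trusted brand while the actual sending domain is something else. The trusted-domain list and sample addresses are hypothetical, invented purely for illustration.

```python
# A toy red-flag check for phishing-style sender mismatches.
# The trusted domains and sample addresses below are made up.
from email.utils import parseaddr

TRUSTED_DOMAINS = {"mybank.com", "booking-app.example"}  # hypothetical allow-list

def looks_suspicious(from_header: str) -> bool:
    """Return True if the display name mentions a trusted brand
    but the actual sending domain does not match that brand's domain."""
    display_name, address = parseaddr(from_header)
    domain = address.rsplit("@", 1)[-1].lower() if "@" in address else ""
    for trusted in TRUSTED_DOMAINS:
        brand = trusted.split(".")[0]
        # Red flag: name-drops the brand, but sends from elsewhere.
        if brand in display_name.lower() and domain != trusted:
            return True
    return False

print(looks_suspicious('"MyBank Support" <alerts@mybank.com>'))         # → False
print(looks_suspicious('"MyBank Support" <alerts@mybank-secure.xyz>'))  # → True
```

Real mail filters combine many more signals (authentication records, link reputation, lookalike-character detection); this only illustrates the single "pause and check the sender" habit discussed above.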

And so just pausing in that moment when you have

an emotive reaction is quite a powerful way to counter

cyber threats, like phishing attacks, for example, that might

come against you. In terms of other recommendations,

I think in general, everyone practicing cyber

hygiene in the same way you brush

your teeth is really important.

So deactivating old accounts, making sure your privacy permissions are

up to date, making sure your loved ones’ privacy permissions

are up to date, just doing a bit of

a spring clean regularly of your digital presence, and tightening

things up online is also a really helpful way to

help people stay safe online.

I guess it’s tricky for people

who are acting emotionally, right?

Because checking and cleaning and double-checking the

facts is the last thing they will do.

And it’s almost like if we should have some kind

of guidelines, sorry, not guidelines, like a guide, who would

maybe give us some alert, like be careful, that doesn’t

look like a human or that doesn’t look correct.

We are seeing that more and more.

I mean, on the AI discussion, I’ve spoken a bit

about how cybercriminals might leverage AI to attack people.

But also AI is being used to defend people.

So, for example, I’m a Gmail user, and

in my Gmail I have advanced phishing protection on,

and it will often scan my emails.

And when I open something, it will

say at the top, be careful.

This has traits of being a

suspicious message or a suspicious email.

Think about it before you engage with it.

Or do you want to report it as a phishing email?

So we’re also seeing AI and LLMs

used to process large amounts of information and emails,

spot trends, and, like you say, guide

or warn people when something is risky.

And I think a lot of private

organizations are doing this quite well.

I don’t know exactly how the landscape

looks in terms of that responsibility to protect people.

I don’t know how much of that sits

at the government and policy level and how

much of that sits with private organizations developing

these apps and interfaces that people are using.

But I do think there is a lot

of opportunity as well in today’s society

to protect people and help people, coach people,

guide people on adopting more cybersecurity behaviors.

What are the trends which you think

will reshape industries in the next decades,

and how can businesses prepare?

There are definitely a lot of trends out there.

One I think is enhanced personalization through

leveraging AI every step of the process.

So us all having more personalized and

tailored days of work using tools that

help us accelerate our personal productivity.

I see a lot of innovation in that space, but also

personalization of the products that we use and the tech products

that we use, the way we shop, in store or online.

All of this because if we use large

scale AI, like data models, for example, machine

learning models, we can start to emulate and

predict different behaviors for specific people who fit

into certain groups due to their user characteristics,

which can overall be quite a positive thing.
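As a rough illustration of the idea just described (predicting behavior for people who fit into certain groups based on their user characteristics), here is a toy sketch: a per-group historical average used as the simplest possible behavioral model. The group names and numbers are invented for illustration; real personalization systems use far richer features and models.

```python
# Toy segmentation model: predict a new user's behavior from the
# historical average of the group they fall into. All data invented.
def build_group_model(history):
    """history: list of (group, value) observations -> per-group mean."""
    sums, counts = {}, {}
    for group, value in history:
        sums[group] = sums.get(group, 0.0) + value
        counts[group] = counts.get(group, 0) + 1
    return {g: sums[g] / counts[g] for g in sums}

# e.g. minutes spent in an app per day, by (hypothetical) user segment
history = [("commuter", 12.0), ("commuter", 8.0), ("gamer", 30.0), ("gamer", 34.0)]
model = build_group_model(history)
print(model["gamer"])  # → 32.0
```

The design trade-off Isabel raises applies even to something this simple: the moment predictions are keyed off group membership, the choice of grouping characteristics becomes an ethical decision, not just a technical one.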

I think that personalization is quite controversial

because it’s often used to sell us

more stuff we don’t need.

But I also think there’s a lot of

space for innovation, for making us more productive

at work, happier, healthier, et cetera.

So that’s the general AI trend.

I mean, another huge trend will be how it changes

the nature of work, with the shift to remote working.

Obviously with the pandemic, that

was one great overhaul.

But now the majority of us are

developing new tools and ways of working that fundamentally affect

our jobs, reshape our jobs, rename our jobs.

I mean, even in my work as a product

manager, there are so many tools which I

could use to collate user research, synthesize user feedback,

develop a product idea, do brainstorming and workshopping, which I

would have previously all done manually.

And now my role is starting to look different because

of some of the technologies I have, and they are

powered by AI, or it’s still pseudo-AI.

Yeah, I mean, that’s also controversial.

I think there was a study, I have to

try and find it for you, but there was

a study not that long ago of seed startups.

I think it was seed startups pitching for funding.

And of all the ones that claimed they

used AI, it was something like 25% of

them that were actually using AI in their product.

Yeah, it was the same.

It’s just trends, right?

And the names keep changing.

So before it was NFTs; now no one is

talking about NFTs, but yeah, if you were an NFT

startup, you were getting lots of funding. Yeah.

It’s funny you mentioned this bit about

designing algorithms, designing models around personalization.

But like you said at the beginning of

our conversation, I think people also want to

have transparency around what they are given: why

they are given this result and not another.

And I think one of the crucial areas of AI which

we have to tackle right now is to explain how it works,

to give people more transparency into how the

algorithms are designed. What is the right way to

make AI algorithms more transparent and understandable, especially

in critical domains like healthcare or finance?

For me, I think there’s kind of

like three main phases in which we

need to particularly consider AI transparency.

The first is the input used to train the data model.

Often the data set that you need to train

an AI model has to be huge, right?

It has to be absolutely expansive.

And because of the scale that is required,

often these data sets unintentionally hold a lot

of bias, because like I said earlier, the

data that we have reflects our current society.

Our current society contains a lot of biases.

So the models that we are training, the data

we’re training AI models on is also often biased.

So understanding what data has been trained on, what are

the implications of that, what are the risks of that

and how could that affect the results in kind of

an adverse way, I think is really important.

This also applies to processes that might be done to

the data, like data cleaning or data

fabrication, that is, adding additional or synthetic data

into a data set to try and balance it out

or make it less biased, and at what cost, et cetera.

So that’s one component.

What data has the model been trained on?

How is that going to impact how it thinks?

What are the risks, and can we be up front about them?
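The balancing idea mentioned above (adding data so under-represented groups appear as often as the others) can be sketched very simply. This is a naive oversampling illustration with invented data, not a recommended production technique; real pipelines use stratified sampling or careful synthetic data generation, with exactly the "at what cost" questions Isabel raises.

```python
# Naive rebalancing by oversampling: duplicate rows from smaller groups
# until every group matches the largest group's count. Data is invented.
import random
from collections import Counter

def oversample_balance(rows, group_key, seed=0):
    """Return a new list where every group appears equally often."""
    rng = random.Random(seed)
    by_group = {}
    for row in rows:
        by_group.setdefault(row[group_key], []).append(row)
    target = max(len(members) for members in by_group.values())
    balanced = []
    for members in by_group.values():
        balanced.extend(members)
        # Sample (with replacement) extra copies to reach the target count.
        balanced.extend(rng.choices(members, k=target - len(members)))
    return balanced

data = [{"group": "A"}] * 8 + [{"group": "B"}] * 2   # biased 8:2 sample
balanced = oversample_balance(data, "group")
print(Counter(r["group"] for r in balanced))          # both groups now equal
```

Note the cost this sketch makes visible: the minority group's rows are simply repeated, so the model sees less genuine variety for that group even though the counts look balanced.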

The second is, as you

say, the algorithmic design itself.

What are the assumptions it hinges off?

How can we validate those assumptions?

Do those assumptions work better or worse for certain

groups of people or certain contexts, for example?

And then finally, on the output, what

is the output that’s come out?

Has that output been manipulated further in any way?

Has it been cleaned in any way?

And what post processing has been done to

give us certain kind of results, and how

is that output going to be leveraged?

What decisions does that inform, and what

does it mean for those decisions?

And I think thinking about AI ethics

and transparency at every single phase of the

data pipeline is really important to give

that overall transparency that we spoke about.
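One concrete way to act on the output-phase questions above ("how is that output going to be leveraged, what decisions does it inform") is to compare a model's decision rates across groups before those decisions ship. Here is a minimal sketch; the decision data is invented purely for illustration.

```python
# Minimal output-phase audit: approval rate per group.
# Records are (group, decision) pairs with decision 1 = approved, 0 = denied.
def rate_by_group(records):
    """Return each group's share of positive decisions."""
    totals, positives = {}, {}
    for group, decision in records:
        totals[group] = totals.get(group, 0) + 1
        positives[group] = positives.get(group, 0) + int(decision)
    return {g: positives[g] / totals[g] for g in totals}

decisions = [("A", 1), ("A", 1), ("A", 0), ("B", 1), ("B", 0), ("B", 0)]
print(rate_by_group(decisions))  # approval rate per group
```

A large gap between groups is not automatically proof of unfairness, but it is exactly the kind of signal that should trigger the "why, and can we be up front about it" conversation described in this section.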

I agree.

And I think one of the most important areas is

to have diversity in who is designing those

algorithms and how those algorithms may be used.

One of the challenges for organizations and

governments is to ensure that the models

we create are more inclusive.

Since you’ve been working around diversity for quite some

time, do you have any advice what works?

Yeah, absolutely.

I spoke on a panel on this topic with Lord Tim Clement-Jones,

who’s the chair of the All-Party Parliamentary Group on

AI and emerging technologies, and also the CEO

of Code First Girls, Anna Brailsford. Because diversity,

especially when it comes to topics like national security, is

not a nice-to-have.

It is absolutely critical. If you have a group of

people who are thinking about threats in the same way,

they are going to design protection against only those threats

and not think about it from other perspectives.

And I think that goes for a variety

of things, like you say, like diverse teams

have been found to have better business outcomes.

It’s obviously ethically the right thing to do as well.

And not just because of the business results. Exactly.

But they’ve been proven time and time again to

perform better, generate more revenue, create

more holistic solutions, because you have so many

more perspectives in the room.

So when it comes to then actually creating

those diverse teams themselves, I think one of

the most important things to think about is

that the talent pipeline is broken.

Like, when we think about the funnel of talent,

by the time we get to the point where

people are thinking about who’s going to design these

AI algorithms, they’re often looking at a small percentage

of computer science grads from certain courses at certain

universities, where already the makeup of that group is

wildly inaccurate as a reflection of modern society.

And so if we can’t rely on that very

small part of the talent pipeline, where else can

we be getting other voices in the room?

Is it people who’ve done different

degrees and are career switching?

Is it people who’ve done 20 years in the healthcare industry

on the ground and now want to move into tech?

Where can we find these people and encourage them,

foster them, upskill them, to make sure that the

teams that we have building these products are as

diverse as some of the challenges they face.

You can see this issue within our generation.

I remember when I was young, I’m still young,

but when I was in primary school, we already

had this division where the boys are going to focus

more on scientific fields and the girls more on

literature or biology or chemistry, those kinds of subjects.

Naturally, you will have a smaller pool to choose from.

But as you pointed out, reskilling is very crucial

to have those different types of people who may

add a different perspective to whatever you’re trying to build.

And I think in terms of the government and national

perspective on this, there’s been some great work

done to encourage, as you say, for example, young women to

continue with STEM education and see that through.

But the issue is that this

will take at least a generation.

Yes, right.

There are a lot of great policies coming in.

I’m seeing a lot of great initiatives coming

in, like school workshops and sessions that encourage,

if we take the example, young women to

stay in STEM. However, we can’t afford to wait

another ten to 15 years for that to take effect.

So, as you said, one piece is reskilling.

So I’ve volunteered a lot with Code First Girls

previously, who teach women and non-binary people to code,

but also initiatives like Inclusive Cyber, which I run,

that teach people how to talk about their existing

skills in a way that can help them get

their first job in tech, is also important, because

a lot of the skills gap in the technology

sector is actually around communication skills, analysis skills, presenting

skills. Working in tech doesn’t just mean coding.

And lots of people work in AI bias or

ethical AI, system design, product management, all of these

disciplines as well, which don’t have a strong technical

focus, but are very complementary to some of the

goals that we discussed achieving.

Yeah, I agree.

And in a way, tech right

now is becoming more accessible. Right?

Like, you don’t even need to code

to build your web store, or you

don’t need to understand the programming language.

You can just ask LLMs to explain it to you bit by bit,

as if they were explaining it to a five-year-old.

So it’s a lot of help for those people who

don’t even know how to start, but their unique skills

in communications, as I said, are valid and needed

for a wide range of roles in technology.

Okay, so you mentioned earlier on about intersectionality in

product design, and I would like to understand it, because I’m not in product design and I haven’t heard much about this field.

Like, what does it mean?

Yes, I think it’s kind of a step

further than just inclusivity in product design, like

we were thinking about earlier, right?

Like thinking about how someone from a certain background might experience or use your product differently to the next person.

And intersectionality is just the acknowledgment that we all have various parts of our lived experience that constitute who we are.

So, for example, if we just take things at a very high level, we can think about gender, and how someone of a certain gender might use a healthcare app, for example.

But then if we take that a little bit further: I spoke to someone who was building a reproductive health app, right.

In general, like period tracking, symptom tracking, fertility tracking, all of this.

And at the very high level, you would assume, okay, people who identify as women and were assigned female at birth would use this app, right.

In general.

But if you take that a step further and

you’re like, for example, someone with the gender identity

of female who maybe has a different biological background,

so maybe a trans woman, how would she interact

with a fertility tracking app?

Or for example, if you have a trans man who wants

to use your app, what would that look like for him?

And how are his needs a bit different

from the general group of people that you

were looking to address with your app?

And then if you take another layer on top,

like if that person also has accessibility needs or

they come from a certain background, et cetera, how would that change how they interact with the app once again?

So intersectionality in product design is just

the acknowledgement that the first step of

inclusivity is thinking about the broad characteristics

someone might have that affect how they

interact with your app or product.

But intersectionality is like, how do their other identities layer on top of that?

And does that change that again?

And I think, again, it’s quite a lot,

quite a lot to think about, really.

But if you think about how these things

all stack up, I think that’s how you

create a really amazing product or, like a

really well loved product, because it serves everyone.

And I think there are a couple of products

that do that well, and there’s definitely more expectation

for larger products to do that well.

But I think product managers at every stage in

the process can be thinking about this or asking

these questions, especially when it comes to, like you

say, sensitive things like healthcare apps, financial apps.

How do you make your concepts accessible?

And how might that interact

with different intersectional identities? Wow.

It’s almost like focusing solely on finding

those edge cases and designing around them.

But don’t you think that after a certain point,

designing for everyone means designing for no one?

Shouldn’t there be specific apps, or maybe separate apps, for specific types of people with special needs?

There’s definitely a trade off to be made, right?

But I would always argue that designing for a smaller group of people can actually unintentionally help a wider group of people.

So let me give you an example.

I think it was Nike.

Again, I need to check the brand, but

they invented a kind of shoe that you

could put on without using your hands.

So you could just put your feet in and put it on.

And initially, the idea of this was for

people who have a permanent reason to not

be able to put their shoes on.

So, for example, limb loss in their upper body: they can’t use their hands, and therefore can’t easily put shoes on.

It seems like a very small use case, but when they built and launched this product, what they also found was that it was really well used by people who only sometimes or rarely had that issue.

So, for example, someone who has a temporary injury,

like a broken arm, would really benefit from shoes

you could put on without using your arms.

Or a mum who is holding her baby in her arms and trying to get out the door with a very young child would really benefit from putting shoes on without using her arms.

And I think this happens a lot in product design. It feels like you’re

designing for a really small section of society.

So, for example, take the example I gave on a reproductive app: maybe that use case has actually helped lots of other people who are experiencing things at different times, even just temporarily or sometimes, which means the user base it reflects can actually be quite broad.

And I think this differentiates your app and makes it

stand out from the rest because people see that you

really put effort into designing for specific needs.

Yeah.

And it happens all the time.

I’m sure there’s a lot of apps that you use, or I use, that we’d say we love. And I’m just like, this app is just amazing.

And all it means is some PM somewhere sat down and

was like, okay, if I was this kind of persona trying

to do this thing, what would I want to do?

And because of all that thought that’s

gone into it, it’s created features and

functionality that I love using.

So there’s definitely a lot to be

gained from empathy in product design. Okay.

And are there any such, I will keep calling them edge cases, in your current projects you are working on? Maybe they came from you or from somebody else and it just surprised you, like:

I haven’t thought about that.

Yes, it’s something that I work on.

So I’m the product manager

for our phishing simulations tool.

It goes far beyond phishing simulation; it’s basically a holistic approach to managing social engineering threats.

And when we think about social engineering, how people

are affected by phishing and other kinds of cyberattack

varies a lot based on their personal experience.

And a really easy example of this, when I talk to security awareness professionals, for example, is how different cultures receive phishing threats

and how they respond to them.

So, for example, we track a lot of insights

around why people fall for phishing scams.

Like, what kinds of emotional

triggers are working against them?

How can that be harmful?

What does that look like?

And often we do find

those are really culturally aligned.

So, for example, for a certain client, we

really found that their people really struggled with

phishing attacks, which seemed to come from people

of authority and had a tone of fear.

So that’s really interesting because it means for that specific geography, security awareness professionals in

that space can focus on creating a program

of training and learning, which is specifically addressing

how do we handle authority?

How do we handle fear?

How do we make sure that our comms

from senior management are always really clear and

people know what to do when they’re contacted

by someone senior, and they don’t panic.

And that was just in one geography.

But that’s a really interesting kind of cultural difference in how people are responding to a cyber threat, which might not be a global or universal sentiment.

Yeah, you’re right.

Some cultures will not admit that they are not sure, but, for example, they will still follow.

Yes, they will still follow

the authority voice, let’s say.

And let’s talk about your other achievements.

I know that you’ve been recognised as one

of the top hundred female voices in technology.

And I know that you advocate for diversity, but you

also do not exclude anyone and you want to make

sure that all the voices are being heard.

So how do you see the role of modern women evolving in the field of technology?

Not to be excluded, not to create this artificial barrier, and to give her opportunities that she can explore?

Yes, I think whenever I talk about

the inclusion of women in the technology

sector, I always think we started it.

The technology sector would not have begun without women.

Right.

So many of the early pioneers of

computer science and technology, like Ada Lovelace,

all came from female identifying backgrounds.

So I feel like women started the

technology industry, but as it’s grown, we’ve

become more pushed out and more marginalized.

I think it’s something like 25% of

the tech industry is female identifying.

It’s a really tiny percentage.

I think it’s even worse in cybersecurity specifically.

So that’s one piece.

Women started it.

How do we make sure women get

back into it and we continue?

I speak a lot about the power of mentorship.

I’ve been a mentor and a mentee numerous times myself.

I’ve coached probably hundreds of women by this point on starting their careers in technology, on what they can do to break into tech and how they can pitch their skills, and reviewed literally hundreds of CVs, et cetera, in my free time.

However, I think even more important than mentorship is sponsorship.

So how can people who already are more senior or

are in those decision making positions advocate for women to

get involved, to get the next opportunity, to jump for a promotion that she’s nearly ready for but not quite, to encourage her to apply for a job that she might not have applied for otherwise?

And I think sponsorship is even more powerful

for social mobility because mentorship is amazing, because

you see people where they’re at and help

them, and it’s a very reciprocal relationship.

But sponsorship has a much bigger scale of

impact in terms of accelerating women’s careers and

making sure that their names are put into

rooms where otherwise they might not be.

And how can we make sure that those leaders

understand that it’s in their benefit to sponsor, to

help pave the way for other women?

Because I don’t know if you have the same

feeling, but sometimes it feels like the women who

get to the top are not really helpful.

Yes, I think there’s a couple of

common fallacies that make this problem worse.

I think one is assuming that women’s

rights and progression is a gendered issue.

That is, that it’s women’s responsibility to

help other women to succeed and excel.

Actually, a lot of my best sponsors have been men,

because if already at those high levels, it’s already very

male dominated, then waiting for a woman to get there

so she can start pulling other people up the ladder

is a very ineffective way of thinking about it, and

it places a lot of individual pressure.

Secondly, unfortunately, just because someone is a senior woman does not necessarily mean they are particularly feminist, or that they take gender as a priority, for example.

In fact, many women actually shy away from gender equality or gender representation discussions when they are at high levels, because they’re worried about what that might mean for their reputation, or worried about the perception of what they might look like.

And also, a lot of women who got

very senior positions have often got there by

adopting conventionally masculine behaviors, insofar as they have

just adopted the styles and characteristics of how

other senior men have been acting.

And that’s helped their career progression.

But that’s not very helpful to kind

of help other people beyond that.

Kind of like playing the game a bit, which is

just symptomatic of the industry rather than the individual.

But that’s something that’s quite difficult.

And then the third thing, which I think is kind of untrue and which is not helping women’s progression in general, is this perception.

We talk a lot about the glass ceiling.

Like women can only advance so

far before they get stuck somewhere.

But there’s also kind of like this glass platform or this glass cliff, where sometimes there’ll be a really difficult business situation.

They will promote a senior woman into

that role, knowing it was risky.

And then business outcomes aren’t what

they expected them to be.

Then they let that person crumble. Right?

Or let that person fall.

And that’s not a sustainable

way to promote female talent.

Something I say quite often is that gender equality is not when we have equal representation of men and exceptional women.

It’s when we have as many mediocre

men and mediocre women doing the job.

We don’t want just these rock star women who do an amazing job and blast to the top, one in a million.

She’s incredible.

It should be that everyone of the same level of ability is being promoted or recognized at every level of the organization.

Yeah, you’re right, because we only see those, and other people who are maybe hoping to be there one day see there is a huge gap in between, and many don’t even try because they are overwhelmed.

So how do you think we can bridge the gap for all the people who are maybe not so technical, but want to have their voice heard and want to become more active in shaping technology?

I mean, there’s a lot of ways in

which people can get involved in general.

For example, like product testing.

Often we want to test our products with quite a wide cohort.

You can get involved in online communities, feedback groups,

this kind of stuff, which can be really helpful

in terms of making your voice heard in tech.

I also don’t think you need to be technical

to follow current trends in technology or share your

opinions on them, share your voices on them.

As I spoke earlier about the intersectionality piece, often

I learn a lot from reading the tweets or

articles from people who come from different backgrounds to

myself and their experiences interacting with things or their

experiences using technologies, which helps broaden my understanding as

a product decision maker on how we could be

serving people better.

And also, if you’re interested in technology in general, I often recommend: feel free to start projects, set something up, even if that’s just a blog where you share your views, or, if you have an app idea, design and build out a proof of concept of an app.

I think you can learn a huge amount about the

tech industry and the way it works and the kinds

of decisions you would need to be making in the

tech industry through setting up something yourself.

I completely agree.

It also relates to entrepreneurship or

like entrepreneurship as a general concept.

Because when you start something, you can read many books,

you can watch tutorials, you can do courses, but you

will never learn as much as when you do the

work, when you start something and you fail and you

iterate and it’s just something which you cannot get elsewhere.

And yeah, I really believe also that when you create

this community, when you start a blog or start sharing

your maybe ideas, looking for advice on LinkedIn or any

other platform, people generally want to help.

And I think it was someone on Reddit once

said that if you want to get the answer,

if you want to get help, write something which

is ridiculously wrong so they will correct you.

And, yeah, if you are trying to think

of some solution, I think this is the

best idea, the best course of action.

Yes, I think there’s a huge amount as well

about learning in public or building in public.

Throughout my tech career, I’ve always been really open

about what I’ve been doing, what I’ve been learning,

what kind of questions I’m coming across.

And I’ve met so many amazing people through doing that.

I’ve had so many people be like, oh, I’ve been

following your journey over the last couple of years.

I saw your work on this.

I would love to collaborate, and I think,

again, especially to go back to the gender point,

but maybe again, more intersectionally as well.

People from underrepresented groups, I think, are often

even more nervous to speak publicly about what

they’re doing, to ask those questions, to ask for help, et cetera.

But I have had such a positive feedback loop from

doing that, and I think when you first do it,

you feel ridiculous, and then you do it the second

time and you’re like, okay, that’s all right.

And by the third and the

fourth time, you become more confident.

And there’s just so much in the tech sector as well.

As you say, it’s always changing, always being shaped.

So learning to learn in public and putting

yourself out there is a really powerful skill.

But lots of people feel that if they

show their vulnerable self, they will be ridiculed.

How should they look at it to overcome this fear?

Is there something which works for you?

Because I know you are already established, you’re already out there and you got it validated, so it’s easier.

Yeah, but for people who are just starting, who are not sure. And I know there are lots of great people who, when you talk to them one

on one, they have immense knowledge, but they never share

it because they think, if I’m not sure about something

or if I make a mistake, someone will point that out.

Yeah, I think a couple of things that have really

helped me is people talk a lot about imposter syndrome,

but I want to talk instead about Spotlight syndrome, which is just the belief that everyone is so invested in what you’re doing. In the nicest way possible: people don’t really care what you’re doing, right?

You write a LinkedIn post and you’re like, oh, my

gosh, every single one of my connections is going to

be thinking about what I said and what I did.

Most of them won’t even see it.

Most of them will scroll past it,

but maybe five people might read it

and be like, oh, that’s really interesting.

I didn’t know that Isabelle

was working on those things.

I didn’t know she was thinking

about those kind of questions.

And then maybe one of those people will be like, oh,

let me message her and we can collaborate on that.

So I think that’s one thing: just accepting that people don’t care as much as you think they do.

Another thing I saw lately, which I thought was really nice, was: cringe or embarrassment is the cost we pay for success or exposure.

To be a beginner in anything, to learn how

to do anything, you have to be okay with

being a bit embarrassed or being a bit shy.

If you want to learn how to play the

piano, you need to be bad at the piano

for a bit before you’re going to be good.

And I think it’s the same in terms

of putting yourself out there and getting opportunities.

When I started doing it, it felt so embarrassing.

But since then, I’ve done so many international

talks, I’ve had so many amazing opportunities, multiple

advisory positions like this kind of stuff, none

of which would have happened if I hadn’t

initially started putting myself out there.

So I think that’s really helpful.

And three, you don’t have to do everything at once.

Speak on something that you

are really confident about first.

If you’re like, oh, I’m too nervous to put

my opinions out on the future of AI.

That feels like a huge topic. Bring it right down in scale.

Be like, okay, I do this day to day

in my role and I’ve noticed AI is really

impacting this one specific thing I do.

Like what do people think?

I think it’s this blah blah.

You don’t have to come out with an entire huge

content planning schedule for the year, but speak to something

you’re confident about first because that will make you more

confident in speaking about things that are more unknown to

you or are more undefined, where you can have more

debate and stuff like this, like you say.

Yeah, I agree.

And how else can people, apart from maybe your closest community within work, know about what you do, right? You want to open yourself to your second and third degree connections, because you never know who may be reading,

who may be watching what you are doing.

And maybe it’s not this person who may need

your services or help or want to collaborate, but

it’s someone whom they talked to a week earlier.

And amazing things happen once

you put yourself out there.

Okay, so just to wrap up, looking ahead, what do

you think are the most exciting things in technology?

What makes you most excited about the future

of technology, AI and cybersecurity, your fields?

And what do you hope to achieve

in the next, let’s say five years?

It’s not an interview, don’t worry.

Yeah, no, I love it. No, it’s great.

It’s really got me thinking.

I’m very excited by the prospects of how AI and

technology can help people live happier, more fulfilled lives, saving time on things we don’t need to, so we can care more about the people we care about, work

less, support each other, build meaningful communities.

I’m excited by how there’s so much opportunity

for innovation to improve a lot of the

systems that we rely on day to day.

I think that’s one of the big ones for me, personally.

I’m excited to continue developing in my

career path, making harder and harder judgment

calls, influencing products at increasingly greater scales.

And I’m also excited to share more of

my expertise and knowledge in terms of topics

we spoke about today, like AI, gender diversity,

product development, so we can create an overall

more informed society that helps people to take

control of their digital rights and digital autonomy.

It’s going to be a great time.

I myself career switched into technology from an unexpected background, and I just really think it’s the place to be.

And that belief, I think will only

continue over the next five years.

We are biased, right?

I am so biased.

We could live somewhere in the middle

of nowhere and just do gardening.

That sounds lovely.

I mean, what I also say with one of

my big bets on technology is no matter what

industry you work in, technology is relevant.

Even if in five years I was to be like, I’m going

to swap into healthcare, or I want to swap into, I probably

wouldn’t swap into fashion, but whatever it is, I want to swap

into retail or I want to swap into policy.

Agriculture.

Yeah, even agriculture.

Like, no matter what industry you’re thinking of,

there is always a technology slant to it

because every single industry, pretty much without fail,

is going to be impacted by technological advance

and change in the next five years.

So I think that’s really exciting too. Yeah.

And when I see, sometimes when I scroll over

LinkedIn profiles of C-level people, you can see

that sometimes they jump from completely different industry to

what they are doing right now.

Because like you said, lots of skills are transferable.

And just the fact that you’ve worked at maybe a specific scale, or in a specific type of geography, matters.

So we should not get attached too much

to our job titles because they always change

and there is always a new trend. Right?

Like data science.

I heard that data science is no longer the hot one.

Now it’s turned into data decision science.

So maybe in cybersecurity there’ll be

something else soon as well.

Yeah, it’s something like a third of people in schools are

going to be doing jobs that don’t exist right now.

There’s so many jobs that are becoming more and more

important with evolving technologies that you just have to learn

to cultivate your core skill set and stay open-minded, as you say, to new opportunities that arise, or see where

new trends are going in the career market. Yeah.

And when I was younger, before university,

I wouldn’t think about becoming a YouTube

influencer or influencer per se.

But this is an actual job right now.

Yeah, exactly.

Tough times, but exciting times because so many

things are changing and AI is going to

be like a copilot for us, right?

Like we are going to be more efficient.

Hopefully we can do more things at scale and

more precisely, because we will have more relevant data at hand, and maybe we will have more

time to actually live outside of our offices.

Exactly.

That’s definitely my hope.

Thank you so much, Isabel,

for this amazing conversation.

And hopefully we will have many more people inspired by

what you do and more diverse voices in cybersecurity because

this is a big topic which we have to crack. Yes.

Thank you so much for having me, Kamila. Thank you.




Kamila Hankiewicz

Entrepreneur / Host

Creativity is born in chaos. No matter if it's software, podcast or a kitchen. I share what I learn while building untrite.com, oishya.com, and hosting brilliant people on my podcast Are You Human.