TRANSCRIPT (English)

What the Facebook Whistleblower JUST Said

14m 3s · 2,293 words · 388 segments · English

FULL TRANSCRIPT

0:00

Okay folks, here is a summary of part one of the testimony of Frances Haugen: 37 years old, a former product manager at Facebook, with a degree from Harvard. This is a summary of what she said to the Senate hearing committee today.

0:17

Frances Haugen says that Facebook repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems, and its role in spreading divisive and extreme messages. She says that Facebook has the potential to bring out the worst in us, but could also bring out the best in us if we had proper controls for Facebook. She believes that Facebook harms children, stokes division, and harms democracy. She says that Facebook puts profit before people, and that it will not solve its problems without the help, or compelling, of Congress. She says that Facebook resolves conflicts within the company in favor of its own profits rather than what is best for society.

1:04

Later she mentions that Facebook pays for its profit with our safety. She says the results are more profit for Facebook, but more lies and more combativeness in our society, and sometimes this actually leads to violence.

1:21

She mentions that the company intentionally hides vital information that could potentially lead regulators to pay more attention to Facebook. She says that Facebook purposefully hides behind walls that keep researchers out. This is different from what Google does, for example, where researchers can access data and do more research, a comparison she made. These problems, she says, are very solvable, and she gives some suggestions for how to solve them, which we'll talk about in just a moment.

1:50

She also mentions that most children who use Facebook probably don't have the self-awareness to recognize what Facebook is actually doing to them and how it is actually hurting them.

2:05

She mentions that Facebook says children lie about their age, but the reality is that statistics within Facebook show most children tell the truth, and you can figure out most children's actual ages by doing some backward analysis.

2:22

And she mentions that the real problems at Facebook, beyond people potentially lying about their age (because this was a question that came up), were really staffing-related: there aren't enough humans involved in decision making or problem solving at Facebook. Ultimately, when there is understaffing, groups within Facebook are discouraged from actually trying to solve the issues, because that could lead to more work when they're already overwhelmed. For example, Frances mentioned that she was on a counterespionage group, and she said that at any given time they could only deal with one third of the caseload they had. If they worked on building a better detector to find more problems, they would just be even more overwhelmed and even more behind. So she cites a lack of humans and a lack of staff as one of the sources of the issues we have at Facebook.

3:21

She says that her goal is to fix Facebook; however, Facebook should declare moral bankruptcy. She mentions that some things that could help are measures like the KIDS Act, which, by the way, is an act that would limit things like autoplay, push alerts, and like buttons, or even follower counts, for children.

3:43

There was a mention that Facebook's Q2 revenue per user is $35.58. Annualized, that would be over $135, which is pretty incredible; actually, probably over $140.
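To sanity-check the annualized figure, here is a quick sketch, reading the quoted per-user number as $35.58 for the quarter (the annualization is the naive one the speaker implies: four identical quarters):

```python
# Sanity check of the annualized revenue-per-user figure, reading the
# quarterly number quoted in the hearing as $35.58 per user.
quarterly_arpu = 35.58  # dollars per user in Q2, as quoted

# Naive annualization: assume every quarter looks like Q2.
annual_arpu = quarterly_arpu * 4

print(f"${annual_arpu:.2f} per user per year")  # $142.32, i.e. "over $140"
```

This matches the speaker's revised estimate of "probably over $140" rather than the first "over $135" guess.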

3:58

Senator Klobuchar says that algorithms are likely to promote outrageous content, to which Frances says: absolutely. Facebook uses engagement-based content algorithms that basically amplify extremes, and in reverse engineering, Facebook found in studies that it could actually lead people to act a certain way. For example, when the question was asked whether Facebook's algorithm can lead people to anorexia, she said the answer was yes, that research had shown Facebook has that power, which is kind of incredible to think about: Facebook could essentially manipulate you by catering certain messages to you.

4:40

Facebook invests more in the users that make it more money, she says, specifically in the English language. She talked a little bit more about the algorithm, the engagement-based ranking algorithm. She counters what Facebook might say; she believes Facebook would say you won't like Facebook as much if it didn't cater content for you.

5:01

She has more of a preference for saying, hey, maybe we don't need algorithms to cater content for us; maybe we could go time-based in the way we view content in a feed. But we've also seen the counterexample, by the way, with YouTube. Remember, back in the day (and I'm making this reference here), YouTube was very subscription-based: if you subscribed to a channel, you would see a list of all the videos from your subscriptions. But YouTube found that that didn't actually increase viewership. Instead, serving you content that wasn't necessarily from your subscriptions list, but was deemed most likely to get the most view time from you, ended up growing the platform substantially more, and so that's why they switched to that sort of platform. I imagine Facebook has found very much the same thing to be true: that catered content increases engagement, and that's why they use it.

5:53

The problem is, the content that Facebook is likely to promote is content that's likely to, quote, elicit an extreme reaction from the user, because in that case it's more likely to generate likes and shares, leading other people not only to view and consume that content, and then potentially share it, but also prompting other people to create similar content, and so on.

6:15

On YouTube, it's worth noting that when YouTube suggests a video to you, it's because each video has a pre-estimated watch time per impression. Every time they show you a thumbnail, they assume that on average you're going to spend, say, five minutes here, ten minutes here, eight minutes there. That watch time per impression is really important, because it also factors in how likely you are to share that video. See, if you watch a video for eight minutes and don't share it, that's eight minutes. If you watch a video and share it with a person, who then shares it with another person, who shares it with another person, that could be 24 or 32 minutes of watch time for that one impression they gave you. So they're looking for that sort of outrageous or extreme content; YouTube does pretty much the same thing.
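That sharing multiplier can be sketched as a toy calculation; the eight-minute figure and the chain lengths are the illustrative numbers from the commentary above, not real platform data:

```python
# Toy sketch of "watch time per impression" under re-sharing, using the
# hypothetical 8-minutes-per-viewer figure from the commentary above.

def watch_time_per_impression(minutes_per_view: float, chain_length: int) -> float:
    """Total watch time credited to one impression when each of
    `chain_length` viewers in a share chain watches the video
    (chain_length = 1 means the original viewer never shares it)."""
    return minutes_per_view * chain_length

print(watch_time_per_impression(8, 1))  # 8  -> no sharing, 8 minutes
print(watch_time_per_impression(8, 3))  # 24 -> shared down a 3-person chain
print(watch_time_per_impression(8, 4))  # 32 -> one more re-share
```

The point of the sketch is just that a single impression of shareable content can be worth several viewers' worth of watch time, which is why engagement-based ranking favors it.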

7:05

Facebook says its artificial intelligence can find bad content, Frances notes, but she argues that you can't rely on artificial intelligence to solve these issues or to find bad content, because in her research, on the sorts of teams she's worked on, they found that AI was only about 10 to 15 percent effective at actually finding content that violated the terms of service or was bad content on Facebook. And so she believes it's very important not only for research to be shared with Congress, but for Congress to provide oversight; maybe Facebook could even create suggestions for how that oversight should look, and Congress could collaborate in that sense.

7:53

She was also a big fan of what Twitter is doing with link sharing: when somebody posts a link and somebody wants to retweet it, Twitter has a little notification that pops up and says, you haven't read this; do you want to read this article before you re-share it? Facebook, she says, doesn't do this, because that would reduce shares, and Facebook likes shares, so they don't want to stand in the way of people sharing content. She doesn't believe that having to click on the article before sharing it is a form of censorship; instead, she says, hey, this is a good way to help prevent mis- or disinformation.

8:30

Now, let's see here. She also mentions, and I thought this was very interesting, that another senator brought up some examples of how Facebook's ad review had failed a few times. I actually have that up right here, and here's an example. This is by the Tech Transparency Project (techtransparencyproject.org), the TTP, referenced by a senator during the hearing. They submitted ads targeted to 13- to 17-year-olds; they never ran these ads, they just wanted to see if the ads would get approved. Here's one: it says "it's time for some skittles, are you in?" over a picture of a bunch of drugs, "throw a skittles party like no other," and that was approved. Here's one that shows a picture of a skinny girl and says, when you're craving a snack, visit pro-anorexia sites to feed your motivation and reach your goal, except they shorten "anorexia" and just call it "ana," and they call it an "ana" tip. And that got approved. Here are some other dating, gambling, alcohol, and smoking ads that got promoted to 13- to 17-year-olds.

9:37

And they show how quickly these were approved, with a potential reach of 9.1 million 13- to 17-year-olds. They submitted these ads and got approval within about an hour, within an hour and one minute, and look at that: all of the ads were approved at exactly the same time. See that here, :41 pm, approve, approve, approve, approve, all of them at exactly the same time.

10:08

Now, Frances doesn't work on these particular ad teams, but she believes it's likely that algorithms are approving these, and that it's very unlikely that humans are looking at all of these ads. And if Facebook is relying on algorithms to review ads, then it's likely that only 10 to 15 percent of bad ads are actually caught, which is a problem, especially when she believes that Facebook creates mental health issues not just for women but also for children in general, which we've seen through the Wall Street Journal exposé, where there's ample evidence of mental health deterioration in children, especially girls, but also boys.

10:55

And one of the issues that she cites: she suggests that parents don't really know how to help their children, because they didn't really grow up with social media. They might say something like, hey, just stop using Facebook, but that would be bad advice, because people have a temptation to keep resorting to social media due to the way our brains are wired, the dopamine release every time we check our phones. And so she says this is an issue, because when you wake up in the morning, you see people who are cruel to you; when you go to sleep at night, you see people who are cruel to you, through Facebook. This is very different from how bullying, for example, used to work. Bullying in schools didn't use to follow you home; now it follows you home wherever you are, and you don't get a break from that sort of bullying, and that, unfortunately, continues to perpetuate negative mental health for children.

11:48

She's a big fan of promoting oversight and regulation of Section 230 of the Communications Decency Act. She mentions that platforms like Facebook have 100 percent control over their algorithms, and algorithms should not prioritize growth at the expense of public safety. She also mentions that 80 to 90 percent of sensitive content, in terms of swear words, somehow still makes it through Facebook's algorithms, and Facebook's algorithms are especially bad at identifying inappropriate content in other languages, but they're also very bad at identifying inappropriate content in English.

12:32

Then we have, let's see, we talked about the ads already. And she mentions that Facebook has a focus on scale: how can we do things cheaply and fast? It's very possible that a lot of submitted content on Facebook is never reviewed; in this case it was ads, and she believes it's very likely and very possible that most ads are not reviewed by humans. We touched on that already. She believes we need, quote, "human-scale social media, not computer-scale social media." This kind of goes back to not having enough staff.

13:08

And, you know, my sort of takeaway was really that, in her opinion, artificial intelligence sucks at preventing the negative from exploding on Facebook; that algorithms have too many biases; and that, left unchecked, you end up with, like she mentioned in her opening remarks, more divisiveness, more potential violence, more bullying, and more mental health problems.

13:32

So this gives you a summary of the first and longest part of the Senate testimony from Frances Haugen. I expect the second part to be relatively similar, but this gives you part one. So thank you so much for watching; if you found this helpful, consider subscribing, and folks, we'll see you in the next one.

13:53

[Music]
