FILE - In this Jan. 6, 2021, file photo, insurrectionists loyal to President Donald Trump try to open a door of the U.S. Capitol as they riot in Washington. New internal documents provided by former Facebook employee-turned-whistleblower Frances Haugen provide a rare glimpse into how the company, after years under the microscope for the policing of its platform, appears to have simply stumbled into the Jan. 6 riot. (AP Photo/Jose Luis Magana, File)
Amid the Capitol riot, Facebook faced its own insurrection
By ALAN SUDERMAN and JOSHUA GOODMAN
apnews.com
WASHINGTON - As supporters of Donald Trump stormed the U.S. Capitol on Jan. 6, battling police and forcing lawmakers into hiding, an insurrection of a different kind was taking place inside the world’s largest social media company.
Thousands of miles away, in California, Facebook engineers were racing to tweak internal controls to slow the spread of misinformation and inciteful content. Emergency actions — some of which were rolled back after the 2020 election — included banning Trump, freezing comments in groups with a record of hate speech, filtering out the “Stop the Steal” rallying cry and empowering content moderators to act more assertively by labeling the U.S. a “Temporary High Risk Location” for political violence.
At the same time, frustration inside Facebook erupted over what some saw as the company’s halting and often reversed response to rising extremism in the U.S.

“Haven’t we had enough time to figure out how to manage discourse without enabling violence?” one employee wrote on an internal message board at the height of the Jan. 6 turmoil. “We’ve been fueling this fire for a long time and we shouldn’t be surprised it’s now out of control.”

It’s a question that still hangs over the company today, as Congress and regulators investigate Facebook’s part in the Jan. 6 riots.
New internal documents provided by former Facebook employee-turned-whistleblower Frances Haugen provide a rare glimpse into how the company appears to have simply stumbled into the Jan. 6 riot. It quickly became clear that even after years under the microscope for insufficiently policing its platform, the social network had missed how riot participants spent weeks vowing — on Facebook itself — to stop Congress from certifying Joe Biden’s election victory.

The documents also appear to bolster Haugen’s claim that Facebook put its growth and profits ahead of public safety, opening the clearest window yet into how Facebook’s conflicting impulses — to safeguard its business and protect democracy — clashed in the days and weeks leading up to the attempted Jan. 6 coup.
This story is based in part on disclosures Haugen made to the Securities and Exchange Commission, which were provided to Congress in redacted form by her legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.
What Facebook called “Break the Glass” emergency measures put in place on Jan. 6 were essentially a toolkit of options designed to stem the spread of dangerous or violent content. The social network had first used them in the run-up to the bitter 2020 election. As many as 22 of those measures were rolled back at some point after the election, according to an internal spreadsheet analyzing the company’s response.
“As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety,” Haugen said in an interview with “60 Minutes.”

An internal Facebook report following Jan. 6, previously reported by BuzzFeed, faulted the company for having a “piecemeal” approach to the rapid growth of “Stop the Steal” pages, related misinformation sources, and violent and inciteful comments.
Facebook says the situation is more nuanced and that it carefully calibrates its controls to react quickly to spikes in hateful and violent content, as it did on Jan. 6. The company said it’s not responsible for the actions of the rioters and that having stricter controls in place prior to that day wouldn’t have helped.
Facebook’s decisions to phase certain safety measures in or out took into account signals from the Facebook platform as well as information from law enforcement, said spokeswoman Dani Lever. “When those signals changed, so did the measures.”

Lever said some of the measures stayed in place well into February and others remain active today.
Some employees were unhappy with Facebook’s handling of problematic content even before the Jan. 6 riots.
One employee who departed the company in 2020 left a long note charging that promising new tools, backed by strong research, were being constrained by Facebook for “fears of public and policy stakeholder responses” (translation: concerns about negative reactions from Trump allies and investors).

“Similarly (though even more concerning), I’ve seen already built & functioning safeguards being rolled back for the same reasons,” wrote the employee, whose name is blacked out.
Research conducted by Facebook well before the 2020 campaign left little doubt that its algorithm could pose a serious danger of spreading misinformation and potentially radicalizing users.
One 2019 study, entitled “Carol’s Journey to QAnon—A Test User Study of Misinfo & Polarization Risks Encountered through Recommendation Systems,” described the results of an experiment conducted with a test account established to reflect the views of a prototypical “strong conservative” — but not extremist — 41-year-old North Carolina woman.
This test account, using the fake name Carol Smith, indicated a preference for mainstream news sources like Fox News, followed humor groups that mocked liberals, embraced Christianity and was a fan of Melania Trump.

Within a single day, page recommendations for this account generated by Facebook itself had evolved to a “quite troubling, polarizing state,” the study found.

By day 2, the algorithm was recommending more extremist content, including a QAnon-linked group, which the fake user didn’t join because she wasn’t innately drawn to conspiracy theories.
A week later, the test subject’s feed featured “a barrage of extreme, conspiratorial and graphic content,” including posts reviving the Obama birther lie and linking the Clintons to the murder of a former Arkansas state senator.
Much of the content was pushed by dubious groups run from abroad or by administrators with a track record of violating Facebook’s rules on bot activity.
Those results led the researcher, whose name was redacted by the whistleblower, to recommend safety measures ranging from removing content with known conspiracy references and disabling “top contributor” badges for misinformation commenters to lowering the threshold number of followers required before Facebook verifies a page administrator’s identity.
Among the other Facebook employees who read the research, the response was almost universally supportive.
“Hey! This is such a thorough and well-outlined (and disturbing) study,” one user wrote, their name blacked out by the whistleblower. “Do you know of any concrete changes that came out of this?”
Facebook said the study was one of many examples of its commitment to continually studying and improving its platform.
Another study turned over to congressional investigators, titled “Understanding the Dangers of Harmful Topic Communities,” discussed how like-minded individuals embracing a borderline topic or identity can form “echo chambers” for misinformation that normalizes harmful attitudes, spurs radicalization and can even provide a justification for violence.
Examples of such harmful communities include QAnon and hate groups promoting theories of a race war.
“The risk of offline violence or harm becomes more likely when like-minded individuals come together and support one another to act,” the study concludes.
Charging documents filed by federal prosecutors against those alleged to have stormed the Capitol contain examples of such like-minded people coming together.
Prosecutors say a reputed leader in the Oath Keepers militia group used Facebook to discuss forming an “alliance” and coordinating plans with another extremist group, the Proud Boys, ahead of the riot at the Capitol.

“We have decided to work together and shut this s—t down,” Kelly Meggs, described by authorities as the leader of the Florida chapter of the Oath Keepers, wrote on Facebook, according to court records.