Fabricated claims targeting George Floyd remain on Facebook despite pledges by the social media giant to take them down, a new investigation by human rights group Avaaz has found.
USA TODAY reported in January that Facebook had not removed racist falsehoods, stereotypes and tropes about Floyd and other victims of police brutality that had been debunked by fact-checkers.
Two-thirds of the content is still online and nearly a third is not labeled false, said Avaaz campaign director Fadi Quran.
Avaaz uncovered 65 posts, with an estimated 3.4 million views, pushing 15 false narratives about Floyd's murder, including that his death was staged or filmed before COVID-19, that he died of an overdose, and that those involved in his arrest were crisis actors. It flagged the posts for Facebook in September.
A review this week after the guilty verdicts in the murder trial of former Minneapolis police officer Derek Chauvin showed that 43 of the posts and 14 out of 15 false narratives remain on the platform, Avaaz told USA TODAY.
For example, of five posts pushing the claim that Floyd is still alive and that the arresting officers were "crisis actors," only one had been removed as of Wednesday. Of the remaining four posts, just one was labeled "false information."
"Instead of playing a positive role in protecting marginalized communities from disinformation and hate, Facebook is still allowing its platform to be weaponized to spread this content," Quran said in a statement.
Facebook spokesman Andy Stone declined to comment.
In January, he told USA TODAY that the company routinely removes and labels content that violates its policies. “And we did so for the content, pages and groups identified in the Avaaz report,” Stone said at the time.
Facebook said Monday that it would take emergency steps to limit hate speech and calls for violence that “could lead to civil unrest or violence” when the verdict was announced in the Chauvin trial.
On Tuesday, a Minneapolis jury found Chauvin guilty of second-degree murder, third-degree murder and second-degree manslaughter.
Cellphone video of Floyd dying under Chauvin's knee last May went viral and set off months of protests in the U.S. and abroad condemning police brutality and calling for racial justice.
In anticipation of a verdict, Facebook pledged to remove posts from Facebook and Instagram that urged people to take up arms and any content that praised, celebrated or mocked Floyd’s death. It also designated Minneapolis a “high risk location.”
Facebook has used powerful moderation tools before to curb the flow of misinformation and calls to violence in the aftermath of the 2020 presidential election. After the Chauvin verdict, activists questioned why Facebook does not deploy them all the time.
“If Facebook can be safer for Black people, why isn’t that the default setting?” said Rashad Robinson, president of Color Of Change, a nonprofit civil rights advocacy organization.
Black users have told USA TODAY that they are routinely subjected to racially motivated hate speech yet are censored by Facebook when they talk about racism, and that the harassment only got worse after Floyd's death and during the 2020 election campaign.
One of the problems, civil rights activists say, is the dearth of underrepresented minorities at Facebook, particularly in positions of influence. Despite repeated pledges to close the racial gap, just 3.7% of Facebook's U.S. employees and 3% of senior executives are Black, according to a USA TODAY analysis of 2018 figures. Facebook is also facing an Equal Employment Opportunity Commission investigation into allegations of bias in hiring, promotion and pay.
Another issue is the company’s policy of protecting all racial and ethnic groups equally, even if they do not face oppression, marginalization or centuries of systemic racism, civil rights activists say.
Facebook said it would put a higher priority on detecting and deleting racist slurs and hate speech against Black people, Muslims, Jews, the LGBTQ community and people of more than one race than on statements such as “White people are stupid” and “Men are pigs.” As part of that initiative, the company said it would retrain automated moderation systems to focus on hate speech targeting historically marginalized and oppressed groups, which “can be the most harmful.”
Last year, Facebook hired a civil rights executive to help the company curb racial hatred and violent content on its platforms. Civil rights attorney Roy L. Austin Jr. has established a new civil rights organization inside Facebook, one of the key recommendations of an internal audit of Facebook's practices, released in July, that intensified scrutiny of the spread of racism and hate on Facebook and Instagram.
On Tuesday, Facebook CEO Mark Zuckerberg wrote about the Chauvin verdict on his Facebook page.
“Right now I’m thinking of George Floyd, his family and those who knew him. I hope this verdict brings some measure of comfort to them, and to everyone who can’t help but see themselves in his story,” Zuckerberg wrote. “We stand in solidarity with you, knowing that this is part of a bigger struggle against racism and injustice.”