Facebook's automation was 90 percent certain Trump post about Floyd protesters broke rules

The 'Facebook Papers' revealed that former President Trump's post about looters in Minneapolis resulted in a large spike in hate speech on the social media platform.

10/27/2021 11:22:00 PM

On May 28, 2020, in a message posted on both his Twitter and Facebook accounts, the then-president wrote at 9:53 a.m.: "These THUGS are dishonoring the memory of George Floyd and I won't let that happen. Any difficulty and we will assume control but, when the looting starts, the shooting starts!"

Facebook did not remove Trump's post, which resulted in much criticism directed at the company. For more reporting from the Associated Press, see below.

Facebook documents show its automation controls predicted with 90 percent certainty that a message former President Donald Trump posted about shooting looters violated its rules. In this photo, Trump speaks during a rally on July 3, 2021, in Sarasota, Florida. Eva Marie Uzcategui/Getty Images

The reports of hateful and violent posts on Facebook started pouring in on the night of May 28, soon after Trump posted his warning on social media that suggested looters in Minneapolis would be shot. Protests were already underway in Minnesota's largest city, but it wasn't until after Trump posted about Floyd's death that the reports of violence and hate speech increased "rapidly" on Facebook across the country, an internal company analysis of the ex-president's social media post reveals.

The former president has since been suspended from both Twitter and Facebook. Leaked Facebook documents provide a first-hand look at how Trump's social media posts ignited more anger in an already deeply divided country that was eventually lit "on fire" with reports of hate speech and violence across the platform.

Yet, the tech giant didn't take any action on Trump's message. Offline, the next day, protests—some of which turned violent—engulfed nearly every U.S. city, big and small. "When people look back at the role Facebook played, they won't say Facebook caused it, but Facebook was certainly the megaphone," said Lanier Holt, a communications professor at Ohio State University. "I don't think there's any way they can get out of saying that they exacerbated the situation."

Social media rival Twitter, meanwhile, responded quickly at the time by covering Trump's tweet with a warning and prohibiting users from sharing it any further. Facebook's internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen's legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

The Wall Street Journal previously reported that Trump was one of many high-profile users, including politicians and celebrities, exempted from some or all of the company's normal enforcement policies.

Hate speech and violence reports had been mostly limited to the Minneapolis region after Floyd's death, the documents reveal. That changed after Trump wrote about looters.

The internal analysis shows a five-fold increase in violence reports on Facebook, while complaints of hate speech tripled in the days following Trump's post. Reports of false news on the platform doubled. Reshares of Trump's message generated a "substantial amount of hateful and violent comments," many of which Facebook worked to remove. Some of those comments included calls to "start shooting these thugs" and "f—- the white."

By June 2, "we can see clearly that the entire country was basically 'on fire,'" a Facebook employee wrote of the increase in hate speech and violence reports in the June 5 memo. Facebook says it's impossible to separate how many of the hate speech reports were driven by Trump's post itself and how many by the broader controversy over Floyd's death.

"This spike in user reports resulted from a critical moment in history for the racial justice movement—not from a single Donald Trump post about it," a Facebook spokesperson said in a statement."Facebook often reflects what's happening in society and the only way to prevent spikes in user reports during these moments is to not allow them to be discussed on our platform at all, which is something we would never do." headtopics.com

But the internal findings also raise questions about public statements Facebook CEO Mark Zuckerberg made last year as he defended his decision to leave Trump's post untouched. On May 29, for example, Zuckerberg said the company looked closely to see if Trump's words broke any of its policies and concluded that they did not. Zuckerberg also said he left the post up because it warned people of Trump's plan to deploy troops.

"I know many people are upset that we've left the President's posts up, but our position is that we should enable as much expression as possible unless it will cause imminent risk of specific harms or dangers spelled out in clear policies," Zuckerberg wrote on his Facebook account the night of May 29, as protests erupted around the country.

Yet, Facebook's own automated enforcement controls determined the post likely did break the rules. "Our violence and incitement classifier was almost 90% certain that this (Trump) post violated Facebook's ... policy," the June 5 analysis reads.

That contradicts conversations Zuckerberg had with civil rights leaders last year to quell concerns that Trump's post was a specific threat to Black people protesting Floyd's death, said Rashad Robinson, the president of Color of Change, a civil rights advocacy group. The group also spearheaded a boycott of Facebook in the weeks following Trump's post.

"To be clear, I had a direct argument with Zuckerberg days after that post where he gaslit me and he specifically pushed back on any notion that this violated their rules," Robinson said in an interview with the AP last week.A Facebook spokesperson said that its internal controls do not always correctly predict when a post has violated rules and that human review, which was done in the case of Trump's post, is more accurate.

To curb the ex-president's ability to stoke hateful reactions on its platform, Facebook employees suggested last year that the company limit reshares of similar posts that may violate Facebook's rules in the future. But Trump continued to use his Facebook account, which had more than 32 million followers, to fire up his supporters throughout much of the remainder of his presidency. In the days leading up to the deadly siege in Washington on Jan. 6, Trump regularly promoted false claims that widespread voter fraud caused him to lose the White House, spurring hundreds of his fans to storm the U.S. Capitol and demand the results of a fair election be overturned.

It wasn't until after the Capitol riot, and as Trump was on his way out of the White House, that Facebook pulled him off the platform in January, announcing his account would be suspended until at least 2023. There's a reason Facebook waited so long to take any action, said Jennifer Mercieca, a professor at Texas A&M University who closely studied the former president's rhetoric.

"Facebook really benefited from Trump and Trump's ability to draw attention and engagement through outrage," Mercieca said."They wanted Trump to keep going on."Created to search for posts violating rules on hate speech and violence, Facebook's own automation found with almost 90 percent certainty that one particular post from Donald Trump about the death of George Floyd broke the company's regulations. Above, in this May 29, 2020, file photo, protesters gather in front of a burning fast-food restaurant in Minneapolis.
