WASHINGTON (AP) — As supporters of Donald Trump stormed the U.S. Capitol on Jan. 6, battling police and forcing lawmakers into hiding, an insurrection of a different kind was taking place inside the world’s largest social media company.
Questions about the company’s response still hang over it today, as Congress and regulators investigate Facebook’s part in the Jan. 6 riots.

New internal documents provided by former Facebook employee-turned-whistleblower Frances Haugen offer a rare glimpse into how the company appears to have simply stumbled into the Jan. 6 riot. It quickly became clear that even after years under the microscope for insufficiently policing its platform, the social network had missed how riot participants spent weeks vowing — on Facebook itself — to stop Congress from certifying Joe Biden’s election victory.
The documents also appear to bolster Haugen’s claim that Facebook put its growth and profits ahead of public safety, opening the clearest window yet into how Facebook’s conflicting impulses — to safeguard its business and protect democracy — clashed in the days and weeks leading up to the attempted Jan. 6 coup.
This story is based in part on disclosures Haugen made to the Securities and Exchange Commission and provided to Congress in redacted form by Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.
What Facebook called “Break the Glass” emergency measures put in place on Jan. 6 were essentially a toolkit of options designed to stem the spread of dangerous or violent content that the social network had first used in the run-up to the bitter 2020 election. As many as 22 of those measures were rolled back at some point after the election, according to an internal spreadsheet analyzing the company’s response.
“As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety,” Haugen said in an interview with “60 Minutes.”

An internal Facebook report following Jan. 6, previously reported by BuzzFeed, faulted the company for having a “piecemeal” approach to the rapid growth of “Stop the Steal” pages, related misinformation sources, and violent and inciteful comments.
Facebook says the situation is more nuanced and that it carefully calibrates its controls to react quickly to spikes in hateful and violent content, as it did on Jan. 6. The company said it’s not responsible for the actions of the rioters and that having stricter controls in place prior to that day wouldn’t have helped.
Facebook’s decisions to phase certain safety measures in or out took into account signals from the Facebook platform as well as information from law enforcement, said spokeswoman Dani Lever. “When those signals changed, so did the measures.”

Lever said some of the measures stayed in place well into February and others remain active today.
Some employees were unhappy with Facebook’s management of problematic content even before the Jan. 6 riots. One employee who departed the company in 2020 left a long note charging that promising new tools, backed by strong research, were being constrained by Facebook for “fears of public and policy stakeholder responses” (translation: concerns about negative reactions from Trump allies and investors).
“Similarly (though even more concerning), I’ve seen already built & functioning safeguards being rolled back for the same reasons,” wrote the employee, whose name is blacked out.

Research conducted by Facebook well before the 2020 campaign left little doubt that its algorithm could pose a serious danger of spreading misinformation and potentially radicalizing users.
One 2019 study, titled “Carol’s Journey to QAnon—A Test User Study of Misinfo & Polarization Risks Encountered through Recommendation Systems,” described the results of an experiment conducted with a test account established to reflect the views of a prototypical “strong conservative” — but not extremist — 41-year-old North Carolina woman. This test account, using the fake name Carol Smith, indicated a preference for mainstream news sources like Fox News, followed humor groups that mocked liberals, embraced Christianity and was a fan of Melania Trump.
Within a single day, page recommendations for this account generated by Facebook itself had evolved to a “quite troubling, polarizing state,” the study found. By day 2, the algorithm was recommending more extremist content, including a QAnon-linked group, which the fake user didn’t join because she wasn’t innately drawn to conspiracy theories.
A week later the test subject’s feed featured “a barrage of extreme, conspiratorial and graphic content,” including posts reviving the false Obama birther lie and linking the Clintons to the murder of a former Arkansas state senator. Much of the content was pushed by dubious groups run from abroad or by administrators with a track record of violating Facebook’s rules on bot activity.
Those results led the researcher, whose name was redacted by the whistleblower, to recommend safety measures ranging from removing content with known conspiracy references and disabling “top contributor” badges for misinformation commenters to lowering the threshold number of followers required before Facebook verifies a page administrator’s identity.
Among the other Facebook employees who read the research, the response was almost universally supportive.

“Hey! This is such a thorough and well-outlined (and disturbing) study,” one user wrote, their name blacked out by the whistleblower. “Do you know of any concrete changes that came out of this?”
Facebook said the study was one of many examples of its commitment to continually studying and improving its platform.

Another study turned over to congressional investigators, titled “Understanding the Dangers of Harmful Topic Communities,” discussed how like-minded individuals embracing a borderline topic or identity can form “echo chambers” for misinformation that normalizes harmful attitudes, spurs radicalization and can even provide a justification for violence.
Examples of such harmful communities include QAnon and hate groups promoting theories of a race war.

“The risk of offline violence or harm becomes more likely when like-minded individuals come together and support one another to act,” the study concludes.
Charging documents filed by federal prosecutors against those alleged to have stormed the Capitol have examples of such like-minded people coming together.

Prosecutors say a reputed leader in the Oath Keepers militia group used Facebook to discuss forming an “alliance” and coordinating plans with another extremist group, the Proud Boys, ahead of the riot at the Capitol.
“We have decided to work together and shut this s—t down,” Kelly Meggs, described by authorities as the leader of the Florida chapter of the Oath Keepers, wrote on Facebook, according to court records.