By: Denise Simon | Founders Code
As the U.S. braces for election-related unrest next month, Facebook executives are implementing emergency measures reserved for “at-risk” countries in a company-wide effort to bring down the online temperature.
The Wall Street Journal reported Sunday that the social media giant plans to limit the spread of viral content and lower the benchmark for suppressing potentially inflammatory posts using internal tools previously deployed in Sri Lanka and Myanmar.
The tools, now a key component of Facebook’s strategy to prepare for the contentious U.S. election, would only be activated in “dire circumstances” and instances of violence, people familiar with the matter told the Journal.
The measures would lower the threshold previously established for flagging content deemed dangerous on the platform, and would slow the dissemination of specific posts as they begin to gain traction, the Journal explains. Facebook would also adjust news feeds internally to control the content available to users.
“Deployed together, the tools could alter what tens of millions of Americans see when they log onto the platform, diminishing their exposure to sensationalism, incitements to violence and misinformation, said the people familiar with the measures,” the Journal writes. “But slowing down the spread of popular content could suppress some good-faith political discussion, a prospect that makes some Facebook employees uneasy, some of the people said.”
Facebook spokesman Andy Stone told the Journal that the company has “spent years building for safer, more secure elections,” drawing on “lessons from previous elections,” and has “hired experts, and built new teams with experience across different areas to prepare for various scenarios.”
The move comes days after Facebook censored a story from The New York Post detailing allegedly corrupt business deals by Joe Biden’s son Hunter Biden — which prompted harsh backlash from President Trump and Republicans who have long criticized the platform’s role in regulating content.
At the time, Facebook CEO Mark Zuckerberg said that the company would relax its restrictive content rules once November’s election concludes, but that for now it had implemented policy changes to address uncertainty and the spread of disinformation, according to BuzzFeed News.
“Once we’re past these events and we’ve resolved them peacefully, I wouldn’t expect that we continue to adopt a lot more policies that are restricting of a lot more content,” Zuckerberg said.
Company higher-ups have described these tools as the nuclear option, to be used only in the event of election-related violence or other dire circumstances, people familiar with the planning told the outlet. Some employees said they were uneasy about the measures, particularly the risk of suppressing legitimate political discussion and viral content, according to the Journal.
Facebook established its toolkit for humanitarian intervention after facing widespread criticism for mishandling violent hate speech against Rohingya Muslims in Myanmar. As far back as 2014, human rights activists implored Facebook to crack down on inflammatory rumors and calls for violence against the minority Rohingya population. After years of violence, mass exodus, and thousands of deaths, Facebook admitted in 2018 that it had been “too slow to act” and wasn’t “doing enough to help prevent our platform from being used to foment division and incite offline violence.” The company pledged to better prepare for future crises and promptly banned several high-profile figures who were named by the United Nations as complicit in the genocide.
Facebook announced last month that it would stop accepting new political ad submissions a week before election day and plans to ban all political ads indefinitely once the polls close. It also said it will label any premature declarations of victory by either candidate (though, really, we all know which one they’re worried about) and include “specific information…that the counting is still in progress and no winner has been determined.” Facebook’s VP of global affairs and communications, Nick Clegg, recently said that, to date, the company has rejected 2.2 million ads and withdrawn 120,000 posts in total across Facebook and Instagram that were trying to “obstruct voting” in the 2020 presidential election. More here.