Social media companies are still citing COVID-19 staff shortages to justify removing content that doesn't violate their own rules.
Both Twitter and Facebook scaled back human review of content in the early days of the pandemic. The companies employ contractors to sift through material that may break their sites' rules, and said at the time that many of those contractors could not do the work from home. Two years on, both companies blame the effects of COVID-19 for erroneous content removal and for suspending users' ability to appeal those decisions.
Both social media giants have increased content restrictions throughout the pandemic, even while claiming they can't adequately review removals. Twitter and Facebook rolled out new initiatives to ban or hide speech that suggested COVID-19 came from a Wuhan lab, although that theory has gained traction in the scientific community. Twitter regularly suspends politicians and other users who question vaccine mandates.
Both Twitter and Facebook switched to remote work in March 2020. Twitter said at the time it would rely on "machine learning and automation" to "take a wide range of actions on potentially abusive and manipulative content," and acknowledged that this change would lead to more mistakes and incorrect penalties for users.
Although Twitter previously said the increased use of automated systems was a temporary move during the pandemic, a Twitter spokeswoman told the Washington Free Beacon the company would continue to "utilize machine learning" to enforce Twitter rules. The spokeswoman did not say whether Twitter would increase human review.
Facebook says pandemic staffing problems caused it to suspend many users' ability to appeal content removals. The company says it lacks the bandwidth to conduct such reviews, even when content was erroneously removed by an automated system. Instead, Facebook promises users that their complaints will serve as training data to help the automated systems perform better "in the future."
In response to a Free Beacon inquiry, a Facebook spokesman contradicted this policy, saying the company allows users to appeal takedowns of their content in "the vast majority of cases." The spokesman also said Facebook was "actively monitoring feedback to improve the accuracy of our enforcement efforts." Facebook did not respond to requests to provide data on the percentage of users who can appeal content removal.
In a November 2020 report, Facebook said most of the content removal decisions it later reversed were made by automated systems, suggesting those systems are substantially more likely to punish behavior that doesn't violate Facebook's rules. Facebook does not publish data on the percentage of content that is successfully appealed.
Institute for Free Speech president David Keating told the Free Beacon that social media platforms should restore an appeals process for removed posts. He also criticized platforms for having a different "set of rules for powerful people," calling content moderation systems "a black box" for users.
Facebook allows most workers to work from home full time, and Twitter plans to allow remote work indefinitely. It is unclear when the contract content moderators Twitter and Facebook employ will return to regular, in-office work.