Facebook announced Wednesday that it will now limit the spread of all posts from individual Facebook users who repeatedly share content that's been debunked by fact-checkers.
That's right, all posts, even the cat pictures. Facebook already limits the reach of individual posts that contain misinformation and levies various punishments on Pages and Groups that serve as havens for it. But it had stopped short of penalizing individual users themselves.
- That matters. Research has repeatedly shown that, whether it's COVID vaccine misinformation or election falsehoods, even a small handful of individuals can become superspreaders.
- Sometimes those superspreaders, like former President Trump, are sharing misinformation on Pages. But in other cases, they're sharing posts from individual accounts with substantial reach. Until now, Facebook has taken action against the content of their posts, but not the people behind them.
Targeting repeat offenders is key to Facebook's effort to limit misinformation on the platform. Burying a single piece of misinformation does little to stop the same user from spreading the next one. Burying all posts from a problematic user might.
- Facebook also said it will begin alerting people if they are about to Like a Page that has repeatedly shared misinformation.
These policies are contingent on Facebook's fact-checkers actually debunking users' posts, which is, after all, a manual and sometimes spotty process.
- Facebook's fact-checking program launched in 2016 with a handful of partners and has since grown substantially. But critics have continued to point out that fact-checkers are unable to keep up with the sheer volume of misinformation on Facebook.
- Once a given post has been fact-checked, Facebook uses automation to find other posts that contain, say, the same debunked meme or story. But those systems sometimes fail to catch replicas that have been tweaked ever so slightly to evade detection, as the sketch below illustrates.
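To see why lightly tweaked copies are hard to catch, consider a toy comparison. This is a minimal sketch, not Facebook's actual detection system: the sample texts, the shingle size and the similarity threshold are all invented for illustration. The point is that an exact hash changes completely when a single word is edited, while a fuzzier word-overlap score degrades only slightly.

```python
# Illustrative only: why exact matching misses lightly tweaked copies
# while fuzzy matching can still flag them. Not Facebook's real system.
import hashlib
import re

def shingles(text, k=3):
    """Normalize text and break it into overlapping k-word 'shingles'."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Overlap between two shingle sets: 0.0 (disjoint) to 1.0 (identical)."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

# Invented example texts: a debunked claim and a lightly tweaked copy.
debunked = "Breaking news: scientists have now confirmed that the vaccine permanently alters human DNA"
tweaked = "BREAKING NEWS!! Scientists have now CONFIRMED that this vaccine permanently alters human DNA"

# Exact hashing: a one-word tweak yields a completely different digest,
# so an exact-match system treats the copy as brand-new content.
same_hash = (hashlib.sha256(debunked.encode()).hexdigest()
             == hashlib.sha256(tweaked.encode()).hexdigest())
print(same_hash)  # False

# Shingle similarity degrades gracefully under small edits. The 0.5
# cutoff here is an arbitrary threshold chosen for this example.
score = jaccard(shingles(debunked), shingles(tweaked))
print(round(score, 2), score > 0.5)  # 0.57 True -- flagged as near-duplicate
```

Even this toy version hints at the arms race: an evader who rewrites enough of the wording drives the overlap score below any fixed threshold, which is one reason slightly altered replicas keep slipping through.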
Plenty of questions remain about how this crackdown on individual accounts will work in practice. It's unclear, for instance, how many times a user has to share misinformation before their account is demoted. A Facebook spokesperson said the company isn't sharing those details due to "very real concerns about gaming the system."
- While Facebook alerts people each time they share misinformation, for now, users will have no way of knowing whether Facebook is demoting all posts from their account. Facebook says it's looking at how to properly notify users when they've reached their misinformation limit.
- Also fuzzy? What it takes for a user to get back in Facebook's good graces. "If they stop sharing false content after a certain period of time, their privileges will be restored," the spokesperson said. "If they continue sharing false content, it will continue to trigger penalties."
In other news, Facebook says the U.S. is one of the top five countries where influence operations originate, alongside Russia, Iran, Myanmar and Ukraine.