Facebook announced a suite of new tools for group admins Wednesday, designed to slow down heated conversations and root out spam and troublesome members.
The tools offload some of the work of cleaning up Facebook onto group administrators, who often have the most immediate insight into what's happening in their communities but, until now, have had little recourse to immediately address serious issues.
The new Admin Home platform includes tools like conflict alerts, which use AI to notify group admins when "contentious or unhealthy conversations" are taking place in a group, a Facebook blog post said. Facebook also unveiled a tool that lets moderators slow down those conversations by limiting how often certain members can comment or how often comments can be made in general.
The new tools also give admins the ability to filter out promotional materials with specific links and spam, as well as comments from people who are new to Facebook or new to the group.
"It's important for admins to be able to set, shape and reinforce a community's culture," Facebook's vice president of communities, Maria Smith, wrote. "Understanding who their members are and setting clear rules and norms for the community to follow is at the center of that."
Facebook has become increasingly aware of the problems that exist within groups, particularly in the aftermath of the Capitol riot, which was organized in part inside large Stop the Steal groups on Facebook. Facebook acted to shut the groups down, but they continued to pop back up, often with new names and set as private.
Weeks after the riot, Facebook said it would no longer promote political groups in its News Feed. Over the past year, it has also announced a slew of new rules for groups and their admins, penalizing those that repeatedly violate Facebook's policies and archiving groups whose admins are no longer active.
Those measures could help crack down on groups led by problematic users. But the tools Facebook announced Wednesday seem to be geared toward admins who genuinely want the conversations they're leading to be less toxic.
All of it is an abrupt about-face from Facebook's aggressive lean into groups just a few years ago, when, in an effort to distance itself from the Cambridge Analytica scandal, Mark Zuckerberg declared that the future of Facebook was private communities and that the company had redesigned its platform to "make communities as central as friends."
It's an even more abrupt retreat from Facebook's first decade of chasing growth at all costs. The new tools embrace the concept of adding friction to online interactions, something that Twitter is also experimenting with through its prompts that warn users before they use "harmful" language.
Now Facebook, the company that famously pushed its employees to "move fast," is taking stock of what's broken and urging its users to slow down.