Game development platform Unity has released a new study on online behavior in video games, and its findings paint a particularly grim picture of the gaming community's propensity for toxicity. The study found that seven out of 10 players say they've experienced some form of toxic behavior, which the study describes as "sexual harassment, hate speech, threats of violence, doxing" and other abusive text or voice chat.
Nearly half of players say they at least "sometimes" experience toxic behavior while playing, and around 21% report experiencing it "every time" or "often." About two-thirds of players say they're somewhat likely to stop playing a game if someone else is abusive toward them or exhibits toxic behavior. A vast majority, about 92%, think there should be better solutions for enforcing in-game codes of conduct and rules around toxic behavior. The study was conducted with 2,076 U.S.-based participants ages 18 or older in partnership with The Harris Poll.
To that end, Unity said toxicity is an addressable problem, and one solution is improved automated moderation. The study coincides with the announcement of Unity's acquisition of OTO, an artificial intelligence firm specializing in the analysis of voice chat. Unity said OTO's software, which can detect not just what someone says but the sentiment with which they say it, will let the company improve its tools for game developers, so that games can better monitor online behavior, especially spoken conversations over in-game voice chat, and root out abusive actors.
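Neither Unity nor OTO has published technical details, but the general shape of such a pipeline is easy to sketch: score each voice clip for hostile tone, and surface only the clips that cross a threshold. The function names, stub values and threshold below are hypothetical stand-ins, not OTO's actual API:

```python
from dataclasses import dataclass

# Hypothetical stand-ins for proprietary models; OTO's real API is not public.
def transcribe(audio_clip: bytes) -> str:
    return "placeholder transcript"  # a speech-to-text model would go here

def hostility_score(audio_clip: bytes) -> float:
    # Tone of the audio itself, not just the words:
    # 0.0 = friendly, 1.0 = openly hostile.
    return 0.1  # placeholder for the sentiment model's output

@dataclass
class Flag:
    speaker_id: str
    score: float
    transcript: str

def moderate_clip(speaker_id: str, audio_clip: bytes,
                  threshold: float = 0.8) -> Flag | None:
    """Flag a voice clip for follow-up if its tone crosses the threshold."""
    score = hostility_score(audio_clip)
    if score < threshold:
        return None  # nothing actionable; the clip goes no further
    return Flag(speaker_id, score, transcribe(audio_clip))
```

The point of the sentiment stage is that the score reflects how something was said, not merely which words were used, which is the signal keyword filters miss.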
"We believe that online communities, especially with the rise of online multiplayer, have a rise in toxic behavior," said Felix Thé, a vice president of product management at Unity, in an interview with Protocol. "Toxic interactions at any level are something we need to address."
Thé said online games have seen a surge in new players during the COVID-19 pandemic, and that, in turn, has exacerbated toxicity issues in gaming. Newer players are more likely to be turned off by toxic behavior, while more-seasoned players may be more inclined to act abusively toward less-experienced newcomers.
Toxicity also has an acute effect on women who play video games. The study found that men were more willing to use voice chat and other communication tools while playing, and as a result experienced toxic behavior more often. Women, however, often avoid online communication tools because of gendered abuse such as sexual harassment. Women who do participate, the study found, are more likely to stop playing a game because of such toxicity.
Thé said Unity plans to incorporate OTO's technology into its existing Vivox tool, which gives game makers easy-to-use voice and text chat features. There's no concrete timeline for the new AI-based tools, but Thé said the company is working to add them soon.
Thé also stressed the effectiveness of OTO's sentiment-based moderation. Because the software can distinguish expletives traded in a friendly way among people who know one another from language that is genuinely abusive toward strangers, he said, it can reduce the error rate in issuing bans and other forms of punishment.
"The tech is being used to assign a propensity of whether an interaction is toxic or not. From a privacy standpoint, it's far superior and more importantly, more effective," Thé said, adding that it's able to scale to more languages faster because sentiment when speaking is often the same in languages of similar origin, like European romance languages.
Because human moderators later review only the AI's sentiment analysis of reported incidents, Thé said, the approach is preferable to constantly recording players' conversations and storing them for review. "We believe this is fast, privacy-centric, much more effective and scalable," he added.
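Unity hasn't described the architecture, but the privacy claim implies a flow in which audio is scored in memory and then discarded, with only the derived analysis of reported incidents ever reaching a human review queue. The structure below is an assumption drawn from Thé's description, not a published design:

```python
from dataclasses import dataclass, field

def hostility_score(audio: bytes) -> float:
    return 0.2  # placeholder for the sentiment model's tone score

@dataclass
class Analysis:
    """Derived data only; the recording itself is never retained."""
    speaker_id: str
    hostility: float

@dataclass
class ReviewQueue:
    pending: list[Analysis] = field(default_factory=list)

    def handle_clip(self, speaker_id: str, audio: bytes,
                    was_reported: bool) -> None:
        analysis = Analysis(speaker_id, hostility_score(audio))
        del audio  # drop the raw recording as soon as it has been scored
        if was_reported:
            # Moderators review only the AI's analysis, never stored audio.
            self.pending.append(analysis)
```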