In the vast web of misinformation on social media, YouTube videos often sit at the center. Videos that may have started on YouTube, or even been suppressed by YouTube's moderators, have a way of creeping across the internet, taking on a life of their own on other social platforms.
Now, YouTube is working on ways to change that.
Neal Mohan, the platform's chief product officer, wrote in a blog post Thursday that YouTube is working to take on misinformation by stemming its reach, which could include limiting cross-platform sharing and expanding the keywords it uses to suppress misinformation content.
Part of the goal is to catch new narratives as they're bubbling up, before they go viral. Though some conspiracies have long been topics of conversation among users — take moon landing conspiracy theories and flat earthers, for example — new types of misinformation can arise and spread quickly before trusted sources have a chance to debunk them.
"Increasingly, a completely new narrative can quickly crop up and gain views," Mohan said. "Or, narratives can slide from one topic to another—for example, some general wellness content can lead to vaccine hesitancy."
One potential change is expanding keyword coverage to more languages to catch and flag misinformation in other regions, as well as working with regional analysts to identify local misinformation narratives that YouTube may have missed. The company is also considering partnerships with non-governmental organizations to better understand regional and local misinformation.
But the biggest potential change, still under consideration, is disabling the share button on misinformation videos. The company is also weighing whether to break links to videos it has already suppressed on YouTube but that may have been shared on other platforms. "We grapple with whether preventing shares may go too far in restricting a viewer’s freedoms," Mohan said. "Our systems reduce borderline content in recommendations, but sharing a link is an active choice a person can make, distinct from a more passive action like watching a recommended video."
Another potential solution Mohan proposed: embedding interstitials, which act "like a speed bump" to let users know that the video may contain misinformation.
For major breaking news events, such as natural disasters, the company is exploring new disclaimer labels for videos or search results "warning viewers there’s a lack of high quality information," Mohan said.
But there are tradeoffs to consider. "We also have to weigh whether surfacing a label could unintentionally put a spotlight on a topic that might not otherwise gain traction," he said.