Under a new policy, which YouTube announced in a blog post Thursday, the company will prohibit conspiracy theory content that is "used to justify real-world violence," including videos that accuse individuals or groups of complicity in the conspiracies QAnon promotes.
Unlike Facebook's wholesale ban on QAnon accounts, YouTube will allow QAnon videos to stay up if they don't explicitly target individuals or groups.
YouTube cast its decision as part of a years-long evolution of its hate and harassment policies. In 2018, the company changed its recommendation system to stop nudging users toward increasingly extremist content, and according to the blog post, QAnon-related YouTube channels have since seen an 80% drop in traffic from non-subscribers clicking on YouTube's recommendations since January 2019. Now, YouTube says this latest change could limit QAnon's spread even further.
"Managing misinformation and harmful conspiracy theories is challenging because the content is always shifting and evolving," the post reads. "To address this kind of content effectively, it's critical that our teams continually review and update our policies and systems to reflect the frequent changes."