A self-described group of Russian “cyber soldiers” has embraced new tactics on the information front of the country’s war in Ukraine, according to research viewed by Protocol, and the approach is helping pro-Moscow influence operations get around the policies of social platforms such as Instagram.
The prominent Moscow-allied group, known as Cyber Front Z, is increasingly organizing on Telegram, where it tells supporters how to deploy harassment of pro-Ukraine voices and disseminate pro-Russia talking points about the conflict on major social media sites. That’s according to a report from researchers at Reset, which focuses on studies that examine “the interplay between democracy and technology.”
The research suggests that by using Telegram as a staging ground, the Russian group can instruct “forces” made up of real people while avoiding much of the monitoring and many of the “coordinated inauthentic behavior” policies of bigger services like Instagram, YouTube and Facebook, where the posts and comments ultimately end up.
The findings raise questions about whether the platforms can keep up with Vladimir Putin’s administration and its evolving attempts to support its war in Ukraine. And given Russia’s habit of going global with information warfare tactics it first tries out in Eastern Europe, the report also suggests the same techniques could become a feature of potential Russian efforts to sway the upcoming midterm election in the U.S.
Previously, Russia had largely fought in cyberspace with government-allied trolls posing as everyday citizens of target countries, as well as spam bots posting the same message over and over — both of which prompted major platforms to crack down on what’s often called coordinated inauthentic behavior. Reset’s research revealed that the pro-Russian effort has instead pivoted to mobilizing and guiding real people to post their own versions of messages revolving around the same coordinated theme on major services — a form of influence that has proven potent online, from Gamergate to the Big Lie.
The conclusions from Reset build on reports of everyday Russians turning to Telegram — an encrypted app founded by Russian entrepreneurs that has developed a following among far-right groups and purveyors of disinformation — because more mainstream platforms either chose to shut down in the country in response to the war or were forced to. But the app also functions as a digital battlefront where those loyal to Putin attempt to spin the conflict, and the Kremlin has increasingly turned to real people to post on TikTok, reportedly paying existing influencers to talk up the war.
The researchers from Reset focused on the Telegram channel for Kremlin-allied group Cyber Front Z — a variation on a troll factory that the U.K. government said “has suspected links to … the founder of the most infamous and wide-ranging bot-farm the Internet Research Agency.” The study looked at Cyber Front Z’s activities during roughly the opening two months of Russia’s invasion and found more than 1,100 posts, most containing instructions to followers. The researchers then traced the resulting posts and comments back to major platforms.
Cyber Front Z first posted on March 12 with a call for Russia’s supporters to help, according to the study. The channel has since grown to more than 100,000 followers, although its most active participants likely number only around 4,000.
Many of the group’s posts encourage followers to disseminate patriotic images to boost the Russian war, according to Reset. It also routinely instructs users to target “traitors” — such as Russians deemed insufficiently enthusiastic, as well as international pro-Ukraine voices — with harassing comments, reports, downvotes and more. Such posts include explicit “links to the target’s social media platforms, and usually a clear instruction of which actions should be performed,” according to Reset.
At times, the study found, Cyber Front Z communications go for subtle menace, inviting supporters to “visit” targets or “explain” things to them. The group also employs pseudo-military terminology, referring to mobilizing and conducting “blitzkrieg,” or else suggesting followers “spam” or “drown” particular people or posts.
The study found clear instances when Cyber Front Z instructions immediately preceded a spike in activity on the major social media platforms, including comments in Russian on content where the discussion had previously been mainly in other languages. The research also suggested the Russian effort was focused primarily on Instagram, followed by YouTube — precisely those services that prior studies have blasted for their relative lack of enforcement aimed at curbing bot-driven, pro-Moscow disinformation and harassment.
A recent report from the NATO Strategic Communications Centre of Excellence, a NATO-accredited international group aimed at educating military and government personnel, found that Instagram was the cheapest major service for bots to manipulate, with comparatively poor removal of abuse — even though Facebook, which shares a parent company with Instagram, scored better among social media services. The group also found that YouTube is improving somewhat at removing content posted by bots, but is being outpaced by the increasing speed of manipulation.
In general, social media platforms have responded to the Ukraine war by doubling down on their policies against coordinated inauthentic behavior and, occasionally, coordinated mass bullying. Meta said during the early days of the war that it had taken down “a relatively small network” that targeted Ukraine and coordinated off its platform, including on Telegram. Cyber Front Z’s strategy, by contrast, involves real humans, distinct posts and activity that is coordinated only on Telegram — out of view of the social media companies themselves.
Spokesperson Ivy Choi told Protocol YouTube is “aware of the group Cyber Front Z,” adding, “since the war in Ukraine began, our Threat Analysis Group (TAG) has worked to quickly identify and terminate a number of YouTube channels as part of our investigations into coordinated influence operations linked to Russia.” YouTube has removed more than 9,000 channels and more than 70,000 videos related to the war for violating its policies, Choi said.
Reset suggested this practice creates a need for a “content-based approach” to finding users who post at the behest of Cyber Front Z, proposing that certain keywords could identify them. Focusing on content rather than behavior, however, might also sweep in everyday users who post without instructions from a Telegram channel.
The question of the platforms’ ability and willingness to tackle Russian information operations on social media comes as the U.S. midterm election season is getting into full swing. The topic of Russia’s online activities came under intense scrutiny in the U.S. following revelations of Moscow’s attempts to boost Donald Trump’s presidential campaign in 2016. To try to sow division and affect turnout among particular groups, the effort used bots as well as accounts and groups that purported to be Americans. The tactics relied on coordinated inauthentic activity that Russia had previously used in other countries in Europe, including Ukraine.
Yet the Russian endeavors continued, with similar use of fake personas in 2020, although many of the major platforms’ revised policies ended up focusing on genuine U.S. groups and politicians spreading misinformation, particularly about the integrity of the vote. If the techniques being used in the Ukraine war prove successful and Russia does indeed aim operations at the 2022 U.S. elections, however, its troll army could continue to boost propaganda and harassment campaigns on the major platforms by coordinating them off-platform.
Reset, which includes major scholars on its advisory board and is overseen day to day by the former head of the technology policy advisory group to Hillary Clinton’s 2016 presidential campaign, provided its findings to Protocol but is not otherwise publicizing them.