Russian agents believed to be connected to the government have been active in spreading divisive content and promoting extreme themes ahead of Tuesday's U.S. mid-term elections, but they are working hard to cover their tracks, according to government investigators, academics and security firms.
Researchers studying the spread of disinformation on Facebook, Twitter, Reddit and other platforms say the new, subtler tactics have allowed most of the so-called information operations campaigns to survive purges by the big social media companies and avoid government scrutiny.
"The Russians are definitely not sitting this one out," said Graham Brookie, director of the Atlantic Council's Digital Forensic Research Lab. "They have adapted over time to increased (U.S.) focus on influence operations."
U.S. intelligence and law enforcement agencies say Russia used disinformation and other tactics to support President Donald Trump's 2016 campaign.
The Russian government has rejected allegations of election interference. On Tuesday, Russian President Vladimir Putin's spokesman declined to comment on allegations of further meddling in the run-up to the mid-term elections.
"We cannot react to some abstract cybersecurity analysts because we do not know who they are and whether they understand anything about cybersecurity," Dmitry Peskov told reporters.
He said Moscow expected no significant improvement to its strained ties with Washington after the vote.
One clear sign of a continued Russian commitment to disrupting American political life came in charges unsealed last month against a Russian woman who serves as an accountant at a St. Petersburg company known as the Internet Research Agency.
After spending $12 million on a project to influence the U.S. election through social media in 2016, the company budgeted $12.2 million for last year and then proposed spending $10 million in just the first half of 2018, court filings showed.
The indictment said the Internet Research Agency used fake social media accounts to post on both sides of politically charged issues including race, gun control and immigration. The instructions were detailed, down to how to mock particular politicians during a specific news cycle.
If the goals of spreading divisive content have remained the same, the methods have evolved in multiple ways, researchers say. For one, there has been less reliance on pure fiction. People have been sensitized to look for completely false stories, and Facebook has been using outside fact-checkers to at least slow their spread on its pages.
"We've done a lot research on fake news and people are getting better at figuring out what it is, so it's become less effective as a tactic," said Priscilla Moriuchi, a former National Security Agency official who is now a threat analyst at the cybersecurity firm Recorded Future threat manager.
Instead, Russian accounts have been amplifying stories and internet "memes" that initially came from the U.S. far left or far right. Such postings seem more authentic, are harder to identify as foreign, and are easier to produce than made-up stories.
Renee DiResta, director of research at security company New Knowledge, said her company had compiled a list of suspected Russian accounts on Facebook and Twitter that were similar to those suspended after the 2016 campaign.
Some of them seized on the Brett Kavanaugh nomination to the Supreme Court to rally conservatives, while others used memes from the leftist Occupy Democrats. Some operators of the accounts in the collection established themselves as far-right pundits and had accounts on Gab, the social network favored by the far right.
Brookie said that while the Russian accounts might jump on a hot topic, the payoff often came from steering the conversation toward related divisive issues.
But that is not always necessary when the main topic is divisive enough. Take the idea of "Blexit," a call for black Americans to exit the Democratic Party. The Daily Beast said it captured 250,000 tweets with the Blexit hashtag during a 15-hour burst last week and found that 40,000 of them came from handles that had previously participated in Russian information campaigns.
Though jumping on existing bandwagons is easier than what Russia did in 2016, other new tactics have been more complex.
Records from the October indictment and from an earlier operation uncovered by Facebook showed that the instigators used Facebook's Messenger service to try to get others to buy advertisements for them and to recruit American radicals to promote real-world protests.
Those moves allowed the Russians to evade strengthened detection systems and blend in with the crowd.
"They are baiting Americans to drive more polarizing and vitriolic content," Brookie said. "Any given solution needs to focus on basing our politics on facts, first and foremost, and to focus on what holds our country closer together."