Russian internet trolls hired U.S. journalists to push their news website, Facebook says

Some of Russia’s most infamous internet trolls have launched a news website that hired real-life journalism freelancers — including Americans — to contribute, Facebook said Tuesday.

The site, called Peace Data, launched this year with coverage focused largely on the environment and on corporate and political corruption. Facebook learned through a tip from the FBI that Peace Data was run by people formerly associated with the Russian Internet Research Agency, which created a number of influential Twitter and Facebook personas to inflame political tensions during the 2016 election. Facebook has taken down the site's known affiliated accounts, which had yet to gain a serious following, said Nathaniel Gleicher, the company's head of cybersecurity policy.

“It confirms what I think we’ve all thought: Russian actors are trying to target the 2020 elections and public debate in the U.S., and they’re trying to be creative about it,” Gleicher said.

“But the second thing that it confirms is, it’s not really working,” he said. “You can run a loud, noisy influence campaign like the one we saw in 2016, and you get caught very quickly. Or you can try to run a much more subtle campaign, which is what this looks like. And A, you still get caught, and B, when you run a subtle influence campaign, you’re sort of working at cross-purposes with yourself. You don’t get a lot of attention for it.”

Peace Data aimed to court left-leaning voters, said Ben Nimmo, whose firm, Graphika, released a report on the site.

“This looks like an attempt to target left-wing audiences on a range of issues, but the operation got taken down in its early stages and didn’t score measurable impact,” Nimmo said. “The election wasn’t the only focus, but to the extent that it was, it looks like the operation wanted to divide Democratic voters, the same way the IRA tried in 2016,” he said, referring to the Internet Research Agency.

The page appears to have gotten very little attention on social media, but not for lack of trying. One of Peace Data's "editors" was a fictional persona called Alex Lacusta who tried to share the site's stories to dozens of left-leaning Facebook groups, though those posts got fewer than 200 shares in total.

Facebook’s role in tying the site to the Internet Research Agency, whose owner has close ties to Russian President Vladimir Putin and which ran unchecked on Facebook and Twitter ahead of the 2016 election, illustrates one way the U.S. government is trying to battle foreign influence efforts online.

The FBI doesn't publicly attribute state-sponsored online influence operations, which may not break U.S. law, but it does occasionally tip off social media companies to them. The FBI also provided Facebook with the tip on the Internet Research Agency's Instagram accounts and Facebook pages in 2018, which the social media giant took down. In an August op-ed, Paul Nakasone, the leader of both the National Security Agency and U.S. Cyber Command, the government's two most elite agencies in cyberspace, wrote that it was those agencies that had "shared threat indicators with the FBI to bolster that organization's efforts to counter foreign trolls on social media platforms."

Screenshot of Peace Data. (peacedata.net)

Peace Data recruited journalists on the freelance networking website Guru with a posting that didn't mention any connection to the Russian government but called for "writers for the following topics: anti-war, corruption, abuse of power, human rights violations, and such like."

While some of Peace Data's freelance journalists were real reporters, some of its "editors" were personas whose profile pictures were deepfakes, meaning algorithmically generated images, said Lee Foster, manager of information operations at the cybersecurity firm Mandiant.

“From a detection and attribution standpoint, when it’s clearly a real person writing this content and presumably also amplifying that content, it’s harder to identify as an influence operation,” Foster said.

“It allows you to produce content that is more credible,” he said.
