Facebook pushed troll farm content to over 40% of all Americans every month


Following the 2016 election, Facebook knew it had a problem. Pages and fake accounts created by the Kremlin-backed Internet Research Agency spread across the social network and garnered massive engagement from real users. Facebook knew it had to take back control.

But years later, Facebook’s own internal research teams revealed that troll farms still reach massive audiences, even if they don’t have many direct followers. The company’s own algorithms pushed content from trolls to users who had not shown interest in the pages, exponentially expanding the reach of trolls. A report detailing the research was leaked to MIT Technology Review by a former employee.

By the time the report was written in 2019, troll farms were reaching 100 million Americans and 360 million people worldwide every week. In any given month, Facebook was showing posts from troll farms to 140 million Americans. Most of those users had never followed any of the pages; instead, Facebook’s content recommendation algorithms pushed the content into their feeds. “A large majority of their ability to reach our users comes from the structure of our platform and our ranking algorithms rather than user choice,” the report says.

Troll farms seemed to single out users in the United States. While even more people worldwide saw the content in raw numbers – 360 million every week, by Facebook’s own accounting – troll farms reached over 40% of all Americans.

The report, written by Jeff Allen, a former data scientist at Facebook, found that the company’s prioritization of engagement led to the problem. Facebook, he said, knows very little about content producers; the identity of whoever posted something was not a factor in the News Feed algorithm.

“These are a lot of extremely sophisticated collaborative-filtering algorithms, but they are all engagement-based,” Allen wrote. “When the content producers who win this system are tapping into the communities on our platform rather than creating and supporting them, it becomes clear that the ranking system does not reflect our company values. So much so that it actually works against us.”
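The problem Allen describes can be illustrated with a toy ranking function. This is a hypothetical sketch, not Facebook’s actual code: the field names, weights, and page names are all invented. The point it demonstrates is that a purely engagement-based score never looks at who produced the content, so scraped-but-viral posts outrank original ones.

```python
# Hypothetical illustration of purely engagement-based ranking.
# All field names, weights, and authors are invented for this sketch;
# the real News Feed scoring is far more complex and not public.

def engagement_score(post):
    """Score a post by engagement signals alone. Note that the
    author's identity never enters the calculation."""
    return post["likes"] + 2 * post["comments"] + 3 * post["shares"]

posts = [
    {"author": "original_creator", "likes": 120, "comments": 10, "shares": 5},
    {"author": "troll_farm_page", "likes": 400, "comments": 80, "shares": 90},
]

# The scraped-but-viral post wins, regardless of its source.
ranked = sorted(posts, key=engagement_score, reverse=True)
```

Under a scheme like this, a troll farm that reposts already-proven viral content inherits its engagement and, with it, its distribution.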

Foreign influencers

Most of the pages were run from countries on the Balkan Peninsula and targeted audiences abroad, primarily Americans, the report says.

The popularity and reach of troll farms led Allen to believe that agents of the Russian Internet Research Agency were likely able to exploit the same techniques or use the same pages to reach American users. “If Troll Farms reaches 30 million US users with content aimed at African Americans, we shouldn’t be at all surprised if we find out that the IRA currently has a large audience there as well,” Allen wrote.

Additionally, troll farms were able to slip their content into Instant Articles and Ad Breaks, two Facebook programs that let partners earn a share of the revenue from ads that run alongside their content. “In Instant Articles, there was a period where maybe up to 60% of Instant Article reads were on scraped content, which is troll farms’ article-writing method of choice,” Allen said. Facebook had, in effect, been paying the troll farms.

“This was a report written by an employee who was leaving in 2019,” Facebook spokesman Joe Osborne told Ars. “Even before it was published, we had already been investigating these topics, and in the time since, we have built out teams, developed new policies, and collaborated with industry peers to address these networks. We have taken aggressive enforcement action against these kinds of foreign and domestic inauthentic groups, and we share the results publicly on a quarterly basis.”

Ars has sent additional questions to Facebook, and we’ll update this story if we get a response.

Exploited communities

Users who viewed the troll farm content tended to split into two camps, Allen wrote. “One camp doesn’t realize that the pages are run by inauthentic actors who exploit their communities. They tend to like these pages. They love how entertaining the posts are and how they reaffirm their already-held beliefs,” he wrote. “The other camp realizes that the pages are run by inauthentic actors. They hate that people are still in love with these pages. They hate these pages with a passion that even I find impossible to match.”

The latter group actively tried to tell Facebook about the problem. “Our users are literally trying to tell us that they feel taken advantage of by these pages,” Allen said.

As an example, Allen cited a user who discovered a troll farm page targeting American Indians. The troll group stole artwork and sold it reprinted on t-shirts that often never shipped to customers, the user said. “This whole group is a fraud ring,” the user wrote.

The troll farms highlighted in the report primarily targeted four groups: American Indians, Black Americans, Christian Americans, and American women. For many of these groups, the report states as of its writing in October 2019, the majority of the top pages were run by troll farms, including all of the top 15 pages targeting Christian Americans, 10 of the top 15 pages targeting Black Americans, and four of the top 15 pages targeting American Indians. When MIT Technology Review published its article, five of the troll pages were still active: three targeted Black Americans, one targeted Christian Americans, and one targeted American Indians.

Much of the content posted by these groups, though frequently stolen, apparently did not break Facebook’s content guidelines. Still, that didn’t mean it was harmless, Allen said. “Ultimately, whether or not there is an intellectual property violation, posting strictly non-original content violates our policies and exposes our communities to exploitation,” Allen explained.

Simple fixes

Allen thought the problem could be fixed fairly easily by incorporating “Graph Authority,” a way of ranking users and pages similar to Google’s PageRank, into the News Feed algorithm. “Adding even a few simple features like Graph Authority and removing the boost from purely engagement-based features would probably pay off a ton, both in the integrity space and … probably also in engagement,” he wrote.
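Facebook’s internal “Graph Authority” metric is not public, but the PageRank idea Allen compares it to can be sketched briefly. In this hedged illustration, authority flows along follow relationships: a page followed by many accounts that are themselves followed accumulates a high score, while a troll page with engagement but few authoritative in-links does not. The graph and all names below are hypothetical.

```python
# Minimal PageRank-style authority sketch over a hypothetical follow graph.
# This illustrates only the general PageRank concept, not Facebook's
# actual (non-public) Graph Authority implementation.

def pagerank(graph, damping=0.85, iterations=50):
    """Compute authority scores by power iteration.

    graph maps each node to the list of nodes it links to (e.g. follows).
    Returns a dict of node -> score; scores sum to 1.
    """
    nodes = list(graph)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, targets in graph.items():
            if targets:
                share = damping * rank[node] / len(targets)
                for target in targets:
                    new_rank[target] += share
            else:
                # Dangling node: spread its rank evenly across the graph.
                for target in nodes:
                    new_rank[target] += damping * rank[node] / n
        rank = new_rank
    return rank

# Hypothetical graph: several accounts follow "reputable_page";
# "troll_page" has only one in-link despite its viral content.
follows = {
    "alice": ["reputable_page"],
    "bob": ["reputable_page", "troll_page"],
    "carol": ["reputable_page"],
    "reputable_page": [],
    "troll_page": [],
}
scores = pagerank(follows)
```

Blending a score like this into ranking would let the feed weigh who is posting, not just how much engagement a post attracts, which is the gap Allen identified.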

Allen quit Facebook shortly after writing the document, MIT Technology Review reports, in part because the company “effectively ignored” his research, a source said.
