In the run-up to the 2020 election, the most highly contested in US history, Facebook’s most popular pages for Christian and Black American content were being run by Eastern European troll farms. These pages were part of a larger network that collectively reached nearly half of all Americans, according to an internal company report, and achieved that reach not because users chose to follow them but primarily because Facebook’s own platform design and engagement-hungry algorithm pushed their content into people’s feeds.
The report, written in October 2019 and obtained by MIT Technology Review from a former Facebook employee not involved in researching it, found that after the 2016 election, Facebook failed to prioritize fundamental changes to how its platform promotes and distributes information. The company instead pursued a whack-a-mole strategy that involved monitoring and quashing the activity of bad actors when they engaged in political discourse, and adding some guardrails that prevented “the worst of the worst.”
But this approach did little to stem the underlying problem, the report noted. Troll farms—professionalized groups that work in a coordinated fashion to post provocative content, often propaganda, to social networks—were still building massive audiences by running networks of Facebook pages. Their content was reaching 140 million US users per month—75% of whom had never followed any of the pages. They were seeing the content because Facebook’s content-recommendation system had pushed it into their news feeds.