Unprecedented research conducted in the context of the 2020 U.S. presidential election reveals that algorithms are extremely influential in shaping people's on-platform experiences and that there is significant ideological segregation in political news exposure. Among consenting study participants, however, changes to critical aspects of the algorithms that determined what they saw did not sway political attitudes.
This article was originally published on Medium.
Today, academics from U.S. colleges and universities working in collaboration with researchers at Meta published findings from the first set of four papers as part of the most comprehensive research project to date examining the role of social media in American democracy. The papers, which focus primarily on how critical aspects of the feed-ranking algorithms affect what people see and believe, were peer-reviewed and published in Science and Nature.
The academic team proposed and selected specific research questions and study designs with the explicit agreement that the only reasons Meta could reject such designs would be for legal, privacy, or logistical (i.e., infeasibility) reasons. Meta could not restrict or censor findings, and the academic lead authors had final say over writing and research decisions. With this unprecedented access to data and research collaboration, the team found:
Algorithms are extremely influential in terms of what people see and in shaping their on-platform experiences.
There is significant ideological segregation in political news exposure.
Three experiments conducted with consenting participants during the 2020 election period suggest that although algorithm adjustments significantly changed what people saw and their level of engagement on the platforms, the three-month experimental modifications did not notably affect political attitudes.
The project was announced in 2020 after internal researchers at Meta initiated a partnership with Professor Talia Jomini Stroud, founder and Director of the Center for Media Engagement at the University of Texas at Austin, and Professor Joshua A. Tucker, co-founder and co-director of the Center for Social Media and Politics at New York University and Director of the NYU Jordan Center for the Advanced Study of Russia, around the impact of Facebook and Instagram on the 2020 U.S. elections.
“Social scientists have been limited in the study of social media’s impact on U.S. democracy,” said Stroud and Tucker. “We now know just how influential the algorithm is in shaping people’s on-platform experiences, but we also know that changing the algorithm for even a few months isn’t likely to change people’s political attitudes. What we don’t know is why. It could be because the length of time for which the algorithms were changed wasn’t long enough, or these platforms have been around for decades already, or that while Facebook and Instagram are influential sources of information, they are not people’s only sources.”
The core research team included 15 additional academic researchers with expertise in the four areas this project focused on: political polarization, political participation, (mis)information and knowledge, and beliefs about democratic norms and the legitimacy of democratic institutions. The team worked with Meta researchers to design experimental studies with consenting users who answered survey questions and shared data about their on-platform behavior. The team also analyzed platform-wide phenomena based on the behavior of all adult U.S. users of the platform. Platform-wide data was only made available to the academic researchers in aggregated form to protect user privacy. Additional findings include:
Ideological segregation on Facebook
Many political news URLs were seen, and engaged with, primarily by conservatives or liberals, but not both.
Ideological segregation associated with political news URLs posted by Pages and in Groups was higher than content posted by users.
There was an asymmetry between conservative and liberal audiences: far more political news URLs were seen almost exclusively by conservatives than were seen almost exclusively by liberals.
The large majority (97%) of political news URLs on Facebook (posted at least 100 times) rated as false by Meta's third-party fact-checking program were seen by more conservatives than liberals, although the proportion of political news URLs rated as false was very low.
Impacts of removing reshared content on Facebook
Removing reshared content on Facebook substantially decreased the amount of political news and content from untrustworthy sources people saw in their feeds, decreased overall clicks and reactions, and reduced clicks on posts from partisan news sources.
Removing reshares reduced the proportion of political content in people’s feeds by nearly 20% and the proportion of political news by more than half.
Content from untrustworthy sources made up only 2.6% of Facebook feeds on average, yet removing reshares reduced it by 30.6%.
Removing reshared content on Facebook decreased news knowledge among the study participants, and did not significantly affect political polarization or other individual-level political attitudes.
Impacts of altering feed algorithms from personalized to chronological
Replacing study participants’ algorithmically ranked feeds on Facebook and Instagram with a simple chronological ranking, meaning that they saw the newest content first, substantially decreased the time participants spent on the platforms and how much they engaged with posts there.
Study participants in the Algorithmic Feed group spent 73% more time on the platforms each day than U.S. monthly active users did; the Chronological Feed group spent only 37% more.
The chronologically ordered feed significantly increased content from moderate friends and sources with ideologically mixed audiences on Facebook; it also increased the amount of political and untrustworthy content relative to the default algorithmic feed. The chronological feed decreased uncivil content.
When presented in chronological order, political content — appearing in 13.5% of participants’ feeds on Facebook and 5.3% on Instagram on average — increased by 15.2% on Facebook and 4.8% on Instagram.
When participants viewed the chronological feed, content from untrustworthy sources, making up 2.6% of Facebook feeds and 1.3% of Instagram feeds on average, increased by 68.8% and 22.1%, respectively.
Posts with uncivil content on Facebook (estimated as 3.2% of participants’ feeds on average) decreased by 43% when participants saw a chronological feed. Posts with uncivil content on Instagram (estimated as 1.6% of participants’ Instagram feeds on average), however, did not decrease.
Despite these substantial changes in participants’ on-platform experience, the chronological feed did not significantly alter levels of issue polarization, affective polarization, political knowledge, or other key attitudes during the three-month study period.
Impacts of deprioritizing content from like-minded sources on Facebook
Posts from politically ‘‘like-minded” sources constitute a majority of what people see on the platform, although political information and news represent only a small fraction of these exposures.
The median Facebook user received a majority of their content from politically like-minded sources — 50.4% versus 14.7% from cross-cutting sources (i.e., liberals seeing content from conservatives or vice versa). The remainder came from friends, Pages, and Groups classified as neither like-minded nor cross-cutting.
Reducing the prevalence of politically like-minded content in participants’ feeds during the 2020 U.S. presidential election had no measurable effects on attitudinal measures such as affective polarization, ideological extremity, candidate evaluations and belief in false claims.
“Asymmetric Ideological Segregation in Exposure to Political News on Facebook,” led by Professors Sandra González-Bailón and David Lazer from the University of Pennsylvania and Northeastern University, respectively, analyzed on-platform exposure to political news URLs during the U.S. 2020 election and compared the inventory of all the political news links U.S. users could have seen in their feeds with the information they saw and the information with which they engaged.
“This begins to answer questions about the complex interaction between social and algorithmic choices in the curation of political news and how that played out on Facebook in the 2020 election,” said Sandra González-Bailón.
“Reshares on Social Media Amplify Political News but Do Not Detectably Affect Beliefs or Opinions,” led by Professors Andrew Guess from Princeton, and Neil Malhotra and Jennifer Pan from Stanford, studied the effects of exposure to reshared content on Facebook during the 2020 U.S. election.
“Most of the news about politics that people see in their Facebook feeds comes from reshares,” said Andrew Guess. “When you take the reshared posts out of people’s feeds, that means they are seeing less virality-prone and potentially misleading content. But that also means they are seeing less content from trustworthy sources, which is even more prevalent among reshares.”
In “How Do Social Media Feed Algorithms Affect Attitudes and Behavior in an Election Campaign?,” also led by Guess, Malhotra and Pan, the team investigated the effects of Facebook and Instagram feed algorithms during the 2020 U.S. election by comparing the standard feed to a chronologically ordered feed.
“The findings suggest that chronological feed is no silver bullet for issues such as political polarization,” said Pan.
Finally, the “Like-minded Sources on Facebook Are Prevalent but Not Polarizing” study, led by Professors Brendan Nyhan from Dartmouth, Jaime Settle from William & Mary, Emily Thorson from Syracuse and Magdalena Wojcieszak from University of California, Davis, presented data from 2020 for the entire population of active adult Facebook users in the U.S., showing that content from politically like-minded sources constitutes the majority of what people see on the platform, though political information and news represent only a small fraction of these exposures. The study subsequently reduced the volume of content from like-minded sources in consenting participants’ feeds to gauge the effect on political attitudes.
“This tells us that reducing exposure to content from like-minded sources, at least in the context of the 2020 presidential election, did not substantively affect political attitudes,” said Settle.
Academics from Dartmouth, Northeastern University, Princeton, Stanford, Syracuse University, University of California, Davis, University of Pennsylvania, University of Virginia and William & Mary are the lead authors of these initial studies. The lead researchers from the Meta team were Pablo Barberá for all four papers and Meiqing Zhang for the paper on ideological segregation. Meta project leads are Annie Franco, Chad Kiewiet de Jonge, and Winter Mason.
In the coming year, additional papers from the project will be publicly released after completing the peer-review process. They will provide insight into the content circulating on the platforms, people’s behavior and the interaction between the two.
The four papers are available at the links below:
How do social media feed algorithms affect attitudes and behavior in an election campaign?
Asymmetric Ideological Segregation in Exposure to Political News on Facebook
Reshares on social media amplify political news but do not detectably affect beliefs or opinions
Like-minded sources on Facebook are prevalent but not polarizing