Echo Chambers, Rabbit Holes, and Algorithmic Bias: How YouTube Recommends Content to Real Users
We find that YouTube's recommendation algorithm does not lead the vast majority of users down extremist rabbit holes, but it does push users into mild ideological echo chambers and toward moderately conservative content.
Citation
Brown, Megan A., James Bisbee, Angela Lai, Richard Bonneau, Jonathan Nagler, and Joshua A. Tucker. "Echo Chambers, Rabbit Holes, and Algorithmic Bias: How YouTube Recommends Content to Real Users." SSRN Electronic Journal (2022). https://ssrn.com/abstract=4088828
Date Posted
May 11, 2022
Abstract
To what extent does the YouTube recommendation algorithm push users into echo chambers, ideologically biased content, or rabbit holes? Despite growing popular concern, recent work suggests that the recommendation algorithm is not pushing users into these echo chambers. However, existing research relies heavily on anonymous data collection that does not account for the personalized nature of the recommendation algorithm. We asked a sample of real users to install a browser extension that downloaded the list of videos they were recommended. We instructed these users to start on an assigned video and then click through 20 sets of recommendations, capturing what they were being shown in real time as they used the platform logged into their real accounts. Using a novel method to estimate the ideology of a YouTube video, we demonstrate that the YouTube recommendation algorithm does, in fact, push real users into mild ideological echo chambers: by the end of the data collection task, liberals and conservatives received different distributions of recommendations, though the difference is small. While we find evidence that this difference increases the longer a user follows the recommendation algorithm, we do not find evidence that many users go down "rabbit holes" that lead them to ideologically extreme content. Finally, we find that YouTube pushes all users, regardless of ideology, toward moderately conservative content and toward an increasingly narrow range of ideological content the longer they follow its recommendations.
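The paper's data pipeline is not reproduced here, but a minimal sketch may help make the study design concrete: each participant's session is a trail of recommendation steps (20 in the study), and each recommended video carries an ideology estimate so the distribution of recommendations can be compared across steps. The record schema, function names, and toy ideology scores below are hypothetical illustrations, not the authors' actual code or scoring method.

```python
import statistics
from dataclasses import dataclass

@dataclass
class Step:
    """One hop in a recommendation trail: the video a participant watched
    and the recommendations shown alongside it (hypothetical schema)."""
    video_id: str
    recommended_ids: list[str]

def mean_ideology_per_step(trail: list[Step], score: dict[str, float]) -> list[float]:
    """Mean ideology of the recommendations at each step of a trail.

    `score` maps video IDs to ideology estimates on a liberal-to-conservative
    scale; it stands in for the paper's video-ideology estimation method.
    Videos without a score are skipped.
    """
    return [
        statistics.mean(score[v] for v in step.recommended_ids if v in score)
        for step in trail
    ]

# Toy usage: a two-step trail with made-up IDs and scores.
trail = [
    Step("seed01", ["a1", "a2"]),
    Step("a1", ["b1", "b2"]),
]
scores = {"a1": -0.4, "a2": 0.1, "b1": 0.3, "b2": 0.5}
print(mean_ideology_per_step(trail, scores))  # [-0.15, 0.4]
```

Comparing these per-step distributions between liberal and conservative participants, and tracking how they shift over the 20 steps, is the kind of contrast the echo-chamber and narrowing findings rest on.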