Academic Research

CSMaP faculty, postdoctoral fellows, and students publish rigorous, peer-reviewed research in top academic journals and post working papers sharing ongoing work.

  • Journal Article

    The Trump Advantage in Policy Recall Among Voters

    American Politics Research, 2024

    View Article

    Research in political science suggests campaigns have a minimal effect on voters’ attitudes and vote choice. We evaluate the effectiveness of the 2016 Trump and Clinton campaigns at informing voters by giving respondents an opportunity to name policy positions of candidates that they felt would make them better off. The relatively high rates of respondents’ ability to name a Trump policy that would make them better off suggest that the success of his campaign can be partly attributed to its ability to communicate memorable information. Our evidence also suggests that cable television informed voters: respondents exposed to higher levels of liberal news were more likely to be able to name Clinton policies, and voters exposed to higher levels of conservative news were more likely to name Trump policies; these effects hold even after conditioning on respondents’ ideology and exposure to mainstream media. Our results demonstrate the advantages of using novel survey questions and provide additional insights into the 2016 campaign that challenge one part of the conventional narrative about the presumed non-importance of operational ideology.

    Date Posted

    Oct 30, 2024

  • Working Paper

    Survey Professionalism: New Evidence from Web Browsing Data

    Working Paper, August 2024

    View Article

    Online panels have become an important resource for research in political science, but the financial compensation involved incentivizes respondents to become “survey professionals”, which raises concerns about data quality. We provide evidence on survey professionalism using behavioral web browsing data from three U.S. samples, recruited via Lucid, YouGov, and Facebook (total n = 3,886). Survey professionalism is common but varies across samples: By our most conservative measure, we identify 1.7% of respondents on Facebook, 7.9% of respondents on YouGov, and 34.3% of respondents on Lucid as survey professionals. However, evidence that professionals lower data quality is limited: they do not systematically differ demographically or politically from non-professionals and do not respond more randomly—although they are somewhat more likely to speed, to straightline, and to take questionnaires repeatedly. While concerns are warranted, we conclude that survey professionals do not, by and large, distort inferences of research based on online panels.

    Date Posted

    Aug 30, 2024

  • Working Paper

    Reaching Across the Political Aisle: Overcoming Challenges in Using Social Media for Recruiting Politically Diverse Respondents

    Working Paper, August 2024

    View Article

    A challenge for public opinion surveys is achieving representativeness of respondents across demographic groups. We test the extent to which ideological alignment with a survey’s sponsor shapes differential partisan response and users’ choice of whether to participate in a research study on Facebook. While the use of Facebook advertisements for recruitment has increased in recent years and offers potential benefits, it can yield difficulties in recruiting politically representative samples. We recruit respondents for a short survey through two otherwise identical advertisements associated with either New York University (from a liberal state) or the University of Mississippi (from a conservative state). Contrary to our expectations, we do not find an asymmetry in completion rates between self-reported Democrats and Republicans based on the survey sponsor. Nor do we find statistically significant differences in attitudes of respondents across the two survey sponsors when we control for observables.

    Date Posted

    Aug 13, 2024

  • Journal Article

    Digital Town Square? Nextdoor's Offline Contexts and Online Discourse

    Journal of Quantitative Description: Digital Media, 2024

    View Article

    There is scant quantitative research describing Nextdoor, the world's largest and most important hyperlocal social media network. Due to its localized structure, Nextdoor data are notoriously difficult to collect and work with. We build multiple datasets that allow us to generate descriptive analyses of the platform's offline contexts and online content. We first create a comprehensive dataset of all Nextdoor neighborhoods joined with U.S. Census data, which we analyze at the community (block-group) level. Our findings suggest that Nextdoor is primarily used in communities where the populations are whiter, more educated, more likely to own a home, and with higher levels of average income, potentially impacting the platform's ability to create new opportunities for social capital formation and citizen engagement. At the same time, Nextdoor neighborhoods are more likely to have active government agency accounts, and law enforcement agencies in particular, where offline communities are more urban, have larger nonwhite populations, greater income inequality, and higher average home values. We then build a convenience sample of 30 Nextdoor neighborhoods, for which we collect daily posts and comments appearing in the feed (115,716 posts and 163,903 comments), as well as associated metadata. Among the accounts for which we collected posts and comments, posts seeking or offering services were the most frequent, while those reporting potentially suspicious people or activities received the highest average number of comments. Taken together, our study describes the ecosystem of and discussion on Nextdoor and introduces data for quantitatively studying the platform.

    Date Posted

    May 29, 2024

  • Journal Article

    The Effects of Facebook and Instagram on the 2020 Election: A Deactivation Experiment

    • Hunt Allcott, 
    • Matthew Gentzkow, 
    • Winter Mason, 
    • Arjun Wilkins, 
    • Pablo Barberá, 
    • Taylor Brown, 
    • Juan Carlos Cisneros, 
    • Adriana Crespo-Tenorio, 
    • Drew Dimmery, 
    • Deen Freelon, 
    • Sandra González-Bailón, 
    • Andrew M. Guess, 
    • Young Mie Kim, 
    • David Lazer, 
    • Neil Malhotra, 
    • Devra Moehler, 
    • Sameer Nair-Desai, 
    • Houda Nait El Barj, 
    • Brendan Nyhan, 
    • Ana Carolina Paixao de Queiroz, 
    • Jennifer Pan, 
    • Jaime Settle, 
    • Emily Thorson, 
    • Rebekah Tromble, 
    • Carlos Velasco Rivera, 
    • Benjamin Wittenbrink, 
    • Magdalena Wojcieszak, 
    • Saam Zahedian, 
    • Annie Franco, 
    • Chad Kiewiet De Jong, 
    • Natalie Jomini Stroud, 
    • Joshua A. Tucker

    Proceedings of the National Academy of Sciences, 2024

    View Article

    We study the effect of Facebook and Instagram access on political beliefs, attitudes, and behavior by randomizing a subset of 19,857 Facebook users and 15,585 Instagram users to deactivate their accounts for six weeks before the 2020 U.S. election. We report four key findings. First, both Facebook and Instagram deactivation reduced an index of political participation (driven mainly by reduced participation online). Second, Facebook deactivation had no significant effect on an index of knowledge, but secondary analyses suggest that it reduced knowledge of general news while possibly also decreasing belief in misinformation circulating online. Third, Facebook deactivation may have reduced self-reported net votes for Trump, though this effect does not meet our preregistered significance threshold. Finally, the effects of both Facebook and Instagram deactivation on affective and issue polarization, perceived legitimacy of the election, candidate favorability, and voter turnout were all precisely estimated and close to zero.

  • Journal Article

    A Synthesis of Evidence for Policy from Behavioural Science During COVID-19

    • Kai Ruggeri, 
    • Friederike Stock, 
    • S. Alexander Haslam, 
    • Valerio Capraro, 
    • Paulo Boggio, 
    • Naomi Ellemers, 
    • Aleksandra Cichocka, 
    • Karen M. Douglas, 
    • David G. Rand, 
    • Sander van der Linden, 
    • Mina Cikara, 
    • Eli J. Finkel, 
    • James N. Druckman, 
    • Michael J. A. Wohl, 
    • Richard E. Petty, 
    • Joshua A. Tucker, 
    • Azim Shariff, 
    • Michele Gelfand, 
    • Dominic Packer, 
    • Jolanda Jetten, 
    • Paul A. M. Van Lange, 
    • Gordon Pennycook, 
    • Ellen Peters, 
    • Katherine Baicker, 
    • Alia Crum, 
    • Kim A. Weeden, 
    • Lucy Napper, 
    • Nassim Tabri, 
    • Jamil Zaki, 
    • Linda Skitka, 
    • Shinobu Kitayama, 
    • Dean Mobbs, 
    • Cass R. Sunstein, 
    • Sarah Ashcroft-Jones, 
    • Anna Louise Todsen, 
    • Ali Hajian, 
    • Sanne Verra, 
    • Vanessa Buehler, 
    • Maja Friedemann, 
    • Marlene Hecht, 
    • Rayyan S. Mobarak, 
    • Ralitsa Karakasheva, 
    • Markus R. Tünte, 
    • Siu Kit Yeung, 
    • R. Shayna Rosenbaum, 
    • Žan Lep, 
    • Yuki Yamada, 
    • Sa-kiera Tiarra Jolynn Hudson, 
    • Lucía Macchia, 
    • Irina Soboleva, 
    • Eugen Dimant, 
    • Sandra J. Geiger, 
    • Hannes Jarke, 
    • Tobias Wingen, 
    • Jana Berkessel, 
    • Silvana Mareva, 
    • Lucy McGill, 
    • Francesca Papa, 
    • Bojana Većkalov, 
    • Zeina Afif, 
    • Eike K. Buabang, 
    • Marna Landman, 
    • Felice Tavera, 
    • Jack L. Andrews, 
    • Aslı Bursalıoğlu, 
    • Zorana Zupan, 
    • Lisa Wagner, 
    • Joaquin Navajas, 
    • Marek Vranka, 
    • David Kasdan, 
    • Patricia Chen, 
    • Kathleen R. Hudson, 
    • Lindsay M. Novak, 
    • Paul Teas, 
    • Nikolay R. Rachev, 
    • Matteo M. Galizzi, 
    • Katherine L. Milkman, 
    • Marija Petrović, 
    • Jay J. Van Bavel, 
    • Robb Willer

    Nature, 2023

    View Article

    Scientific evidence regularly guides policy decisions, with behavioural science increasingly part of this process. In April 2020, an influential paper proposed 19 policy recommendations (‘claims’) detailing how evidence from behavioural science could contribute to efforts to reduce impacts and end the COVID-19 pandemic. Here we assess 747 pandemic-related research articles that empirically investigated those claims. We report the scale of evidence and whether evidence supports them to indicate applicability for policymaking. Two independent teams, involving 72 reviewers, found evidence for 18 of 19 claims, with both teams finding evidence supporting 16 (89%) of those 18 claims. The strongest evidence supported claims that anticipated culture, polarization and misinformation would be associated with policy effectiveness. Claims suggesting trusted leaders and positive social norms increased adherence to behavioural interventions also had strong empirical support, as did appealing to social consensus or bipartisan agreement. Targeted language in messaging yielded mixed effects and there were no effects for highlighting individual benefits or protecting others. No available evidence existed to assess any distinct differences in effects between using the terms ‘physical distancing’ and ‘social distancing’. Analysis of 463 papers containing data showed generally large samples; 418 involved human participants with a mean of 16,848 (median of 1,699). That statistical power underscored improved suitability of behavioural science research for informing policy decisions. Furthermore, by implementing a standardized approach to evidence selection and synthesis, we amplify broader implications for advancing scientific evidence in policy formulation and prioritization.

    Date Posted

    Dec 13, 2023

  • Book

    Computational Social Science for Policy and Quality of Democracy: Public Opinion, Hate Speech, Misinformation, and Foreign Influence Campaigns

    Handbook of Computational Social Science for Policy, 2023

    View Book

    The intersection of social media and politics is yet another realm in which Computational Social Science has a paramount role to play. In this review, I examine the questions that computational social scientists are attempting to answer – as well as the tools and methods they are developing to do so – in three areas where the rise of social media has led to concerns about the quality of democracy in the digital information era: online hate; misinformation; and foreign influence campaigns. I begin, however, by considering a precursor of these topics – and also a potential hope for social media to be able to positively impact the quality of democracy – by exploring attempts to measure public opinion online using Computational Social Science methods. In all four areas, computational social scientists have made great strides in providing information to policy makers and the public regarding the evolution of these very complex phenomena but in all cases could do more to inform public policy with better access to the necessary data; this point is discussed in more detail in the conclusion of the review.

  • Journal Article

    Tweeting Beyond Tahrir: Ideological Diversity and Political Intolerance in Egyptian Twitter Networks

    World Politics, 2021

    View Article

    Do online social networks affect political tolerance in the highly polarized climate of postcoup Egypt? Taking advantage of the real-time networked structure of Twitter data, the authors find that not only is greater network diversity associated with lower levels of intolerance, but also that longer exposure to a diverse network is linked to less expression of intolerance over time. The authors find that this relationship persists in both elite and non-elite diverse networks. Exploring the mechanisms by which network diversity might affect tolerance, the authors offer suggestive evidence that social norms in online networks may shape individuals’ propensity to publicly express intolerant attitudes. The findings contribute to the political tolerance literature and enrich the ongoing debate over the relationship between online echo chambers and political attitudes and behavior by providing new insights from a repressive authoritarian context.

  • Journal Article

    Political Knowledge and Misinformation in the Era of Social Media: Evidence From the 2015 UK Election

    British Journal of Political Science, 2022

    View Article

    Does social media educate voters, or mislead them? This study measures changes in political knowledge among a panel of voters surveyed during the 2015 UK general election campaign while monitoring the political information to which they were exposed on the Twitter social media platform. The study's panel design permits identification of the effect of information exposure on changes in political knowledge. Twitter use led to higher levels of knowledge about politics and public affairs, as information from news media improved knowledge of politically relevant facts, and messages sent by political parties increased knowledge of party platforms. But in a troubling demonstration of campaigns' ability to manipulate knowledge, messages from the parties also shifted voters' assessments of the economy and immigration in directions favorable to the parties' platforms, leaving some voters with beliefs further from the truth at the end of the campaign than they were at its beginning.

  • Working Paper

    Opinion Change and Learning in the 2016 U.S. Presidential Election: Evidence from a Panel Survey Combined with Direct Observation of Social Media Activity

    Working Paper, September 2020

    View Article

    The role of the media in influencing people’s attitudes and opinions is difficult to demonstrate because media consumption by survey respondents is usually unobserved in datasets containing information on attitudes and vote choice. This paper leverages behavioral data combined with responses from a multi-wave panel to test whether Democrats who see more stories from liberal news sources on Twitter develop more liberal positions over time and, conversely, whether Republicans are more likely to revise their views in a conservative direction if they are exposed to more news on Twitter from conservative media sources. We find evidence that exposure to ideologically framed information and arguments changes voters’ own positions, but has a limited impact on perceptions of where the candidates stand on the issues.

    Date Posted

    Sep 24, 2020

  • Journal Article

    Using Social and Behavioral Science to Support COVID-19 Pandemic Response

    • Jay J. Van Bavel, 
    • Katherine Baicker, 
    • Paulo Boggio, 
    • Valerio Capraro, 
    • Aleksandra Cichocka, 
    • Mina Cikara, 
    • Molly J. Crockett, 
    • Alia Crum, 
    • Karen M. Douglas, 
    • James N. Druckman, 
    • John Drury, 
    • Oeindrila Dube, 
    • Naomi Ellemers, 
    • Eli J. Finkel, 
    • James H. Fowler, 
    • Michele Gelfand, 
    • Shihui Han, 
    • S. Alexander Haslam, 
    • Jolanda Jetten, 
    • Shinobu Kitayama, 
    • Dean Mobbs, 
    • Lucy Napper, 
    • Dominic Packer, 
    • Gordon Pennycook, 
    • Ellen Peters, 
    • Richard E. Petty, 
    • David G. Rand, 
    • Stephen D. Reicher, 
    • Simone Schnall, 
    • Azim Shariff, 
    • Linda Skitka, 
    • Sandra Susan Smith, 
    • Cass R. Sunstein, 
    • Nassim Tabri, 
    • Joshua A. Tucker, 
    • Sander van der Linden, 
    • Paul A. M. Van Lange, 
    • Kim A. Weeden, 
    • Michael J. A. Wohl, 
    • Jamil Zaki, 
    • Sean R. Zion, 
    • Robb Willer

    Nature Human Behaviour, 2020

    View Article

    The COVID-19 pandemic represents a massive global health crisis. Because the crisis requires large-scale behaviour change and places significant psychological burdens on individuals, insights from the social and behavioural sciences can be used to help align human behaviour with the recommendations of epidemiologists and public health experts. Here we discuss evidence from a selection of research topics relevant to pandemics, including work on navigating threats, social and cultural influences on behaviour, science communication, moral decision-making, leadership, and stress and coping. In each section, we note the nature and quality of prior research, including uncertainty and unsettled issues. We identify several insights for effective response to the COVID-19 pandemic and highlight important gaps researchers should move quickly to fill in the coming weeks and months.

    Date Posted

    Apr 30, 2020

  • Journal Article

    Political Psycholinguistics: A Comprehensive Analysis of the Language Habits of Liberal and Conservative Social Media Users

    Journal of Personality and Social Psychology, 2020

    View Article

    For nearly a century social scientists have sought to understand left–right ideological differences in values, motives, and thinking styles. Much progress has been made, but — as in other areas of research — this work has been criticized for relying on small and statistically unrepresentative samples and the use of reactive, self-report measures that lack ecological validity. In an effort to overcome these limitations, we employed automated text analytic methods to investigate the spontaneous, naturally occurring use of language in nearly 25,000 Twitter users. We derived 27 hypotheses from the literature on political psychology and tested them using 32 individual dictionaries. In 23 cases, we observed significant differences in the linguistic styles of liberals and conservatives. For instance, liberals used more language that conveyed benevolence, whereas conservatives used more language pertaining to threat, power, tradition, resistance to change, certainty, security, anger, anxiety, and negative emotion in general. In 17 cases, there were also significant effects of ideological extremity. For instance, moderates used more benevolent language, whereas extremists used more language pertaining to inhibition, tentativeness, affiliation, resistance to change, certainty, security, anger, anxiety, negative affect, swear words, and death-related language. These research methods, which are easily adaptable, open up new and unprecedented opportunities for conducting unobtrusive research in psycholinguistics and political psychology with large and diverse samples.

    Date Posted

    Jan 09, 2020

  • Journal Article

    Who Leads? Who Follows? Measuring Issue Attention and Agenda Setting by Legislators and the Mass Public Using Social Media Data

    American Political Science Review, 2019

    View Article

    Are legislators responsive to the priorities of the public? Research demonstrates a strong correspondence between the issues about which the public cares and the issues addressed by politicians, but conclusive evidence about who leads whom in setting the political agenda has yet to be uncovered. We answer this question with fine-grained temporal analyses of Twitter messages by legislators and the public during the 113th U.S. Congress. After employing an unsupervised method that classifies tweets sent by legislators and citizens into topics, we use vector autoregression models to explore whose priorities more strongly predict the relationship between citizens and politicians. We find that legislators are more likely to follow, than to lead, discussion of public issues, results that hold even after controlling for the agenda-setting effects of the media. We also find, however, that legislators are more likely to be responsive to their supporters than to the general public.

    Date Posted

    Jul 12, 2019

  • Journal Article

    Moral Discourse in the Twitterverse: Effects of Ideology and Political Sophistication on Language Use Among U.S. Citizens and Members of Congress

    Journal of Language and Politics, 2018

    View Article

    We analyzed Twitter language to explore hypotheses derived from moral foundations theory, which suggests that liberals and conservatives prioritize different values. In Study 1, we captured 11 million tweets from nearly 25,000 U.S. residents and observed that liberals expressed fairness concerns more often than conservatives, whereas conservatives were more likely to express concerns about group loyalty, authority, and purity. Increasing political sophistication exacerbated ideological differences in authority and group loyalty. At low levels of sophistication, liberals used more harm language, but at high levels of sophistication conservatives referenced harm more often. In Study 2, we analyzed 59,000 tweets from 388 members of the U.S. Congress. Liberal legislators used more fairness- and harm-related words, whereas conservative legislators used more authority-related words. Unexpectedly, liberal legislators used more language pertaining to group loyalty and purity. Follow-up analyses suggest that liberals and conservatives in Congress use similar words to emphasize different policy priorities.

  • Book

    Measuring Public Opinion with Social Media Data

    The Oxford Handbook of Polling and Survey Methods, 2018

    View Book

    This chapter examines the use of social networking sites such as Twitter in measuring public opinion. It first considers the opportunities and challenges that are involved in conducting public opinion surveys using social media data. Three challenges are discussed: identifying political opinion, representativeness of social media users, and aggregating from individual responses to public opinion. The chapter outlines some of the strategies for overcoming these challenges and proceeds by highlighting some of the novel uses for social media that have fewer direct analogs in traditional survey work. Finally, it suggests new directions for a research agenda in using social media for public opinion work.

    Date Posted

    Oct 01, 2017