When the Supreme Court considers the Florida and Texas laws, it should craft rulings that leave ample room for legislative and regulatory efforts to mandate transparency and access to data.
This term, the U.S. Supreme Court will hear two cases that may have far-reaching implications for the power of governments to mandate social media platform transparency and access to data.
These cases, Moody v. NetChoice and NetChoice v. Paxton, involve lawsuits brought by a trade association of Internet companies challenging Florida and Texas laws that attempt to regulate large social media platforms. Although the litigation has focused primarily on the provisions of these state laws that seek to control platforms’ content moderation policies and practices, the laws also contain other provisions that seek to mandate greater transparency from the platforms. The Court has decided to directly address some of these transparency provisions, but not others.
Today, NYU’s Center for Social Media and Politics — along with Darren Linvill and Patrick Warren of the Media Forensics Hub, and Filippo Menczer of the Observatory on Social Media — filed an amicus curiae brief in support of neither party in these cases.
The goal of the brief is to emphasize both the importance of independent platform research and the imperative for mandated access to the data and information that independent researchers, the public, and policymakers need to understand the ways in which social media platforms influence public discourse and democracy.
Summary of Argument
Over the past two decades, social media platforms and other digital technologies have transformed society. These technologies have made it easier than ever to find information, engage with politics, and connect with people across the globe. But the public and policymakers have also raised concerns about the role that platforms play in spreading misinformation, enabling harassment, and contributing to polarization. The need to understand how platforms operate and what influence they have on politics, policy, and democracy has never been more urgent.
Independent social science research has played a critical role in helping the public and policymakers understand the wide-ranging effects of these technologies. Studies have provided insights into the recommendations of algorithmic systems, the patterns of foreign influence campaigns, the relationship between social media and political behavior and beliefs, the prevalence of hate speech and harassment, and the efficacy of interventions. Yet much of this research can only be conducted by analyzing massive amounts of social media data, since assessing how these platforms shape society requires access to data on a scale commensurate with the unprecedented complexity and scope of today’s online information ecosystems.
Social media platforms, however, unilaterally control and limit access to their data, erecting significant barriers to rigorous research. Voluntary public disclosures like “transparency reports” contain insufficient information. Direct researcher access is inconsistently granted, and when it is, that access can be restrictive, incomplete, and subject to withdrawal at any time, for any reason, and with little recourse. Only some platforms have historically provided researcher access to data at all, so existing studies have skewed towards research inquiries that could make use of the data available, rather than towards the most pressing questions of public importance. Social media platforms have monopolies over information critical to the cultural, political, and social life of our democracy, and they have little incentive to help researchers paint an accurate picture of their impacts, especially when doing so may reveal them in an unflattering light.
As a result, independent researchers are limited in their efforts to study the causes, character, and scope of the various phenomena attributed to the rise of social media. There is widespread alarm over perceived problems such as a rise in hate speech across platforms, algorithmic systems that push users into ideological echo chambers or extremist rabbit holes, and the spread of inaccurate information from low-credibility news sources. Some government actors in the United States have attempted to ban the video-hosting platform TikTok based on alleged national security concerns, while others seek to regulate a host of platforms out of concern for adolescent mental health. And the rise of generative artificial intelligence (AI) raises new fears about the spread of dis- and misinformation on social media.
Without researcher access to accurate, comprehensive, and timely platform data, the public and policymakers are forced to rely on guesswork when grappling with these important cultural, social, and political issues. On the one hand, members of the public are unable to make informed decisions about social media use both as consumers in the marketplace and more fundamentally as citizens in a democratic society. On the other, policymakers are unable to develop effective social media regulation: Without an evidence-based understanding of the nature of the risks posed by platforms, they are hampered in their ability to design policies to mitigate them or evaluate those policies once implemented.
This untenable status quo points to the need for, and overriding public interest in, meaningful platform transparency mandates. Although the Court has decided not to address directly the general disclosure provisions of the Florida and Texas laws at issue in these cases, the Court’s resolution of the remaining provisions—in particular the laws’ individualized explanation requirements—implicates fundamental questions about the power of governments to mandate platform transparency and access to data. Amici file this brief to emphasize both the importance of independent research and the imperative for mandated access to the data and information that independent researchers, the public, and policymakers need to understand the ways in which social media platforms influence public discourse and democracy. Amici respectfully submit that the Court should craft rulings in these cases that leave ample room for responsible legislative and regulatory efforts aimed at mandating meaningful platform transparency and access to data. It is essential that such efforts be able to survive constitutional scrutiny.
Affiliations are provided for identification purposes only, and the views expressed in the brief do not necessarily represent those of the amici’s respective universities.