Does What We Know About Fake News Hold Up in a Pandemic?

January 12, 2021  ·   News

Joshua A. Tucker discusses the future of social media research as false information about COVID-19 spreads alongside the disease itself.

A graphic of megaphones with the COVID-19 virus emanating from them and surrounding the Earth.

Credit: Judy Zhang

As I wrote this, a slickly edited conspiracy theory video called “Plandemic” circulated on Facebook and YouTube, racking up millions of views before it was banned by both platforms. Made to look like a documentary, the 26-minute video peddled falsehoods about COVID-19, the safety of vaccines, and Dr. Anthony Fauci, the director of the National Institute of Allergy and Infectious Diseases and the most visible public health expert during the crisis.

The video is emblematic of what the World Health Organization has called an “infodemic,” or the rapid spread of dangerous false information about the COVID-19 pandemic. It represents a new information ecosystem for those who study misinformation and disinformation to explore and understand, including researchers at the Center for Social Media and Politics at NYU (CSMaP), which harvests social media data to study political attitudes and behavior online.

I spoke with Joshua Tucker, professor of politics and co-director of CSMaP, to chart a course forward for those studying the intersection of social media and politics, who must now adapt to a rapidly changing research environment. Tucker’s expertise covers mass political behavior and the relationship between social media and politics. He argues we should go back to the drawing board to figure out whether what we’ve learned about fake news holds true in the context of a global pandemic.

What follows is a lightly edited transcript of our conversation.


How does the COVID-19 infodemic change the field of social media research?


The questions we, as researchers, should ask about the infodemic fall into three categories.

One: Does what we know about political disinformation hold true for scientific disinformation? We learned a lot about who was producing and sharing disinformation during the 2016 U.S. presidential election, including a major finding we contributed to the literature on the topic: People over the age of 65 on average shared seven times as many links to fake news websites as millennials. So, older adults were more likely to share fake news. Another important finding from that same study is that sharing fake news is a rare phenomenon. More than 90 percent of our sample didn’t share a single link to one of these fake news websites, but there were some people who shared many links. We don’t know if those or any other findings will stand up in future research on false information around COVID.

Two: What’s going to happen to people’s faith in news generally, including non-COVID-related news? I worry the long-term takeaway of the fake news crisis will be that people become less trusting of the media in general — because they’re constantly reminded that fake news exists. You have politicians who say, “Look, you have the New York Times saying that there aren’t enough testing kits. That’s fake news.” Before 2016, people would believe the New York Times if it reported there weren’t enough testing kits for COVID. We now live in a world where the concept of fake news has been weaponized for political purposes by leaders around the world. Donald Trump is the most visible example of this, but Rodrigo Duterte in the Philippines and Jair Bolsonaro in Brazil also use the term “fake news” to criticize those who disagree with them. That’s why I think it’s crucial to revisit the questions we’ve asked of political misinformation and find out if the answers hold up for COVID-related misinformation: Who’s producing it? Who’s sharing it? Who’s exposed to it? And what are the effects of being exposed to it?

Three: How is this going to change the way people trust and rely on experts? Because so much is at stake, will people trust experts a lot more than they have previously? Or will they be disillusioned by the fact that things experts are saying about COVID could later turn out to be incorrect? Take the issue of wearing masks. For a month, experts told us not to wear them. Now they’re telling us we should wear masks. What is that kind of inconsistency going to do to the gravitas experts typically bring to their public appearances and statements?


Let’s talk more about those questions we should revisit in the context of COVID. I’ll start with, “What are the effects of exposure to fake news?” How does the infodemic reframe the potential consequences of consuming that kind of content?


If I pass along something that says the pope endorsed Donald Trump, at the end of the day, maybe I’ll just look foolish. But if I believe information that says drinking bleach is going to cure me of COVID, and I drink bleach and it kills me, that’s really, really consequential. Or if I believe social distancing is just a plot to make Donald Trump look bad, and I go out and participate in one of these rallies and I get COVID and infect others, that could also have serious consequences.


Another question you said we should ask in the context of COVID is, “Who’s sharing fake news?” How should we revisit our discovery that older adults share false political information at higher rates? And does this group’s vulnerability to severe illness raise the stakes for any new lines of inquiry?


It’s an interesting question, and it points to the urgency of trying to understand what we found about older Americans being more willing to share fake news in a political context. Does it translate to older Americans being more willing to believe false information during the course of a pandemic? We have to find out.
I should also point out that we didn’t find a strong age effect in a fact-checking study we’re working on with the Stanford Cyber Policy Center, which explores the ability of ordinary citizens to identify the veracity of news stories in real time. We asked Americans to examine news stories published in the last 24 hours and decide whether they were true or false and misleading. We also gave them the opportunity to say they couldn’t decide. Pretty depressingly, we found a strong “partisan-concordance” effect. That means liberals are less likely to realize news is false if it has a pro-liberal slant, and conservatives are less likely to realize news is false if it has a pro-conservative slant.


How might that partisan-concordance effect be exploited by various actors in the current information environment?


Here’s what’s really interesting about COVID-related fake news: It’s scientific misinformation and disinformation that’s being politicized. False information about death rates, about how contagious the disease is, about possible cures — all these things can be politicized. They can be used by Republicans to try to harm Democratic governors, as we’re seeing with these “liberate” rallies being held all over the country, and they can be used by people on the Left to criticize Trump.

We tend to think of scientific misinformation as just incorrect information about science, but it’s often a function of partisanship. We show this in a new paper about science misinformation patterns: People who are conservative or Republican are more likely to believe misinformation about global warming than people on the Left, while people across the spectrum believe other types of misinformation — about vaccines, for example.


So many of us are staying indoors and relying on the internet as a lifeline to the outside world. Our news diets consist of COVID updates and almost nothing else. Could this more saturated information environment fuel the kind of partisanship and polarization you’re talking about?


You and I haven’t lived through it, but maybe World War II was like this. There was a brief period after 9/11, too, but we’ve probably never known a world in which so much of our news focused on one topic for such a sustained period of time. As you said, we are all spending more time on computers right now, which includes spending more time on social media. Twitter and Facebook usage has gone up, so what does that mean for the disinformation environment generally?

Could the COVID situation be making the country even more polarized than it was previously? We don’t know, but political science research suggests in the face of scarce resources people tend to turn on outgroups. With so many people losing their jobs and the broader economic turmoil, that distrust of outgroups is going to go up even more. Is that going to interfere with people’s ability to identify misinformation? Is it going to be even more likely that they will share this misinformation? These are possibilities we’ll have to consider in future research.


You’ve just described what we don’t know about the infodemic and its effects. Are you seeing that reflected in news coverage, or are too many journalists making inferences that can’t be supported by evidence?


That’s a great question. I’m not an expert on this and don’t want to portray myself as one, but I’d remind reporters that social media is huge and optimized for search. That means you can find evidence of anything. If you see misinformation online, don’t assume everyone is sharing that misinformation. Or if you see it being shared and there are arguments about its veracity, don’t assume everyone is witnessing those arguments. That’s my biggest piece of advice — to realize what you can and cannot infer to larger populations based on what you see in front of you.

On the positive side, I think journalists can play a role in inoculating people against misinformation that’s out there. A friend texted me alarming false information back in March. The text warned of a lockdown. As in, police would prevent people from traveling on highways. Had I read a story beforehand saying, “Hey, there are rumors circulating about this and they’re nothing more than rumors,” that might have inoculated me against exposure to that particular piece of misinformation.


I want to ask you a big-picture question. Can you talk about the future of CSMaP’s research on fake news and what folks can expect from us in this new pandemic world?


We’re planning to extend the crowdsourcing fact-checking study with two main goals: The first is to study people’s vulnerability to false information around COVID. The second is to test how the pandemic environment affects people’s ability to determine the accuracy of news — both political and COVID-related — over the course of the 2020 election campaign. We’re actively working on securing the funding to do this.

And of course, we’re trying to make progress on all those questions we discussed earlier. One way to do so is to move forward the study of public opinion using social media data, because as you said, this is a new world, where things tend to change by the hour, and traditional survey research is not well-equipped to measure public opinion in real time. Social media, on the other hand, could provide some real insight into how Trump’s tweets or announcements might change people’s understanding of the issues surrounding the current crisis.

Venuri Siriwardane is the Researcher/Editor at the Center for Social Media and Politics. She holds a master’s degree in Politics and Communication from the London School of Economics, where her research interests included political legitimacy in postcolonial states. In a previous life, she worked as a business journalist in New York.

This piece was originally published on May 18, 2020.