Survey Professionalism: New Evidence from Web Browsing Data

Abstract

Online panels have become an important resource for research in political science, but the compensation offered to panelists incentivizes them to become “survey professionals,” raising concerns about data quality. We provide evidence on survey professionalism by exploring three US samples of subjects who donated their browsing data, recruited via Lucid, YouGov, and Facebook (total n = 3,886). Survey professionalism is common, but varies across samples: by our most conservative estimate, we find 1.7% of respondents on Facebook, 7.6% on YouGov, and 34.7% on Lucid to be professionals (under the assumption that professionals are as likely as non-professionals to donate data after conditioning on observable demographics available from all online survey takers). However, evidence that professionals lower data quality is limited: they do not systematically differ demographically or politically from non-professionals and do not exhibit more response instability. They are, however, somewhat more likely to speed, straightline, and attempt to take questionnaires repeatedly. To address potential selection issues in the donation of browsing data, we present sensitivity analyses with lower bounds for survey professionalism. While concerns about professionalism are warranted, we conclude that survey professionals do not, by and large, distort inferences of research based on online panels.

Background

As online research has expanded, scholars have gained access to vast pools of survey respondents through digital panels. These panels make it possible to study political behavior quickly and at scale — but they also raise new questions about how participants engage with surveys in everyday life.

Some respondents complete many surveys across multiple platforms, becoming what researchers call “survey professionals.” While this behavior could shape who ends up represented in online studies, little was previously known about how widespread it is or whether it affects data quality.

This study takes a novel approach by moving beyond self-reported survey participation. Using web-browsing data, we directly observe how frequently respondents visit survey websites and how that behavior correlates with data reliability.

Study

We analyzed digital trace data from three U.S. samples of survey respondents to measure how frequently individuals participate in online surveys and whether this “survey professionalism” affects data quality. Our samples included participants recruited through Facebook Ads (n = 707), Lucid/Cint (n = 2,222), and YouGov (n = 957) who consented to share their web-browsing histories. Across these groups, we recorded over 96 million web visits to identify survey-taking behavior and repeated participation across platforms.
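To make the measurement concrete, here is a minimal sketch of how web visits might be classified as survey-taking. The domain list and matching rule are illustrative assumptions, not the classification dictionary actually used in the study.

    # Sketch: flag browsing visits that point to survey platforms.
    from urllib.parse import urlparse

    # Hypothetical set of survey-platform domains (assumption, for illustration).
    SURVEY_DOMAINS = {
        "surveymonkey.com",
        "qualtrics.com",
        "swagbucks.com",
        "yougov.com",
    }

    def is_survey_visit(url: str) -> bool:
        """Return True if the visited URL resolves to a known survey domain."""
        host = urlparse(url).netloc.lower().removeprefix("www.")
        # Match the registered domain itself or any of its subdomains.
        return any(host == d or host.endswith("." + d) for d in SURVEY_DOMAINS)

    visits = [
        "https://www.qualtrics.com/jfe/form/SV_abc123",
        "https://news.example.com/article",
    ]
    survey_visits = [v for v in visits if is_survey_visit(v)]
    print(survey_visits)  # -> only the Qualtrics URL

In practice, a classifier like this would run over all recorded visits per respondent, yielding the per-person counts of survey activity that the analyses below build on.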

We combined these behavioral data with survey responses to test four main questions: How common is survey professionalism? Are professionals demographically or politically distinct? Does frequent participation lower data quality? And do professionals attempt to take the same survey more than once? This design allowed us to directly observe survey activity, rather than relying on self-reports, and to assess how this behavior influences data reliability.

Results

Survey professionalism is common, but its effects on data quality are limited. Across samples, professionals accounted for 1.7% of respondents on Facebook, 7.6% on YouGov, and 34.7% on Lucid, with estimates rising above 70% under broader definitions.
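How strongly the prevalence estimate depends on the definition can be illustrated with a simple threshold rule. In this sketch, each respondent is summarized by the share of their active browsing days that include a survey visit; both the summary measure and the cutoffs are hypothetical stand-ins, not the study's exact operationalizations.

    # Sketch: prevalence of "professionals" under different threshold definitions.
    # respondent -> share of active browsing days with at least one survey visit.
    # Values are made up for illustration.
    survey_day_share = {
        "r1": 0.02, "r2": 0.45, "r3": 0.80, "r4": 0.10, "r5": 0.60,
    }

    def prevalence(shares: dict[str, float], threshold: float) -> float:
        """Fraction of respondents whose survey-day share meets the threshold."""
        flagged = sum(1 for s in shares.values() if s >= threshold)
        return flagged / len(shares)

    # A stricter cutoff yields a more conservative estimate; relaxing it
    # inflates prevalence, mirroring how broader definitions push estimates up.
    for t in (0.75, 0.50, 0.25):
        print(f"threshold {t:.2f}: {prevalence(survey_day_share, t):.0%} professionals")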

Professionals differ only slightly from other respondents: they tend to be older and more educated, but show no consistent political or attitudinal differences. While they are more likely to speed through surveys or straightline responses, professionals do not provide less stable answers over time. When comparing responses across two survey waves, professionals and non-professionals show similar levels of over-time stability, suggesting that professionals are no less attentive to surveys than other respondents.
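For readers who want to operationalize these quality flags, a minimal sketch follows. The speeding cutoff (a fraction of the median completion time) and the grid-battery structure are common conventions in the survey-methods literature, not necessarily the specific rules used in this study.

    # Sketch: two common data-quality flags, speeding and straightlining.

    def is_speeder(duration_seconds: float, median_seconds: float) -> bool:
        """Flag completions far faster than the median duration.
        The 0.3 multiplier is a common but arbitrary cutoff (assumption)."""
        return duration_seconds < 0.3 * median_seconds

    def is_straightliner(grid_answers: list[int]) -> bool:
        """Flag identical answers across every item of a multi-item grid battery."""
        return len(grid_answers) > 1 and len(set(grid_answers)) == 1

    print(is_speeder(95, median_seconds=600))   # True: far below the median
    print(is_straightliner([3, 3, 3, 3, 3]))    # True: no variation at all
    print(is_straightliner([3, 4, 2, 5, 3]))    # False: varied responses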

Attempts to take the same questionnaire multiple times are common. We define this behavior as repeated participation and measure it by detecting repeated attempts to access identical survey URLs in respondents’ browsing data; this captures attempts rather than successful retakes. Many professionals attempt to take the same survey more than once, highlighting a need for improved screening tools.
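A minimal sketch of this detection logic: group visits by respondent and normalized survey URL, then flag any pair seen more than once. The normalization rule (stripping query strings and fragments) is an assumption for illustration and may differ from the study's exact matching procedure.

    # Sketch: flag repeated attempts to open the same survey questionnaire.
    from collections import Counter
    from urllib.parse import urlparse

    def normalize(url: str) -> str:
        """Strip query strings and fragments so retries of one questionnaire match."""
        p = urlparse(url)
        return f"{p.netloc.lower()}{p.path}"

    visits = [
        ("r1", "https://qualtrics.com/jfe/form/SV_abc?source=email"),
        ("r1", "https://qualtrics.com/jfe/form/SV_abc?source=retry"),
        ("r2", "https://qualtrics.com/jfe/form/SV_xyz"),
    ]

    # Count attempts per (respondent, questionnaire) pair.
    attempts = Counter((rid, normalize(url)) for rid, url in visits)

    # Any pair seen more than once counts as a repeated attempt; note this
    # captures attempts, not successful retakes.
    repeats = {key: n for key, n in attempts.items() if n > 1}
    print(repeats)  # {('r1', 'qualtrics.com/jfe/form/SV_abc'): 2}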

Overall, the study shows that survey professionalism is real and measurable, but it does not substantially distort the conclusions drawn from online panels.