YouTube More Likely to Direct Election-Fraud Videos to Users Already Skeptical about 2020 Election’s Legitimacy

October 19, 2022


New Study Shows How Site’s Algorithms Perpetuate Existing Misperceptions

YouTube was more likely to recommend videos about election fraud to users who were already skeptical about the legitimacy of the 2020 U.S. presidential election, according to a new study examining the impact of the site’s algorithms.

The results of the research, published in the Journal of Online Trust and Safety, showed that the participants most skeptical of the election’s legitimacy were shown three times as many election-fraud-related videos as the least skeptical participants—roughly 8 additional recommendations out of approximately 400 videos suggested to each study participant.

While the overall prevalence of these types of videos was low, the findings expose the consequences of a recommendation system that provides users with the content they want. For those most concerned about possible election fraud, showing them related content provided a mechanism by which misinformation, disinformation, and conspiracies can find their way to those most likely to believe them, observe the authors of the study. Importantly, these patterns reflect the independent influence of the algorithm on what real users are shown while using the platform.

“Our findings uncover the detrimental consequences of recommendation algorithms and cast doubt on the view that online information environments are solely determined by user choice,” says James Bisbee, who led the study as a postdoctoral researcher at New York University’s Center for Social Media and Politics (CSMaP).

Nearly two years after the 2020 presidential election, large numbers of Americans, particularly Republicans, don’t believe in the legitimacy of the outcome. 

“Roughly 70% of Republicans don’t see Biden as the legitimate winner,” despite “multiple recounts and audits that confirmed Joe Biden’s win,” the Poynter Institute’s PolitiFact wrote earlier this year.

While it’s well-known that social media platforms, such as YouTube, direct content to users based on their search preferences, the consequences of this dynamic may not be fully realized. 

In the CSMaP study, the researchers sampled more than 300 Americans with YouTube accounts in November and December of 2020. The subjects were asked how concerned they were with a number of aspects of election fraud, including fraudulent ballots being counted, valid ballots being discarded, foreign governments interfering, and non-U.S. citizens voting, among other questions.

These participants were then asked to install a browser extension that would record the list of recommendations they were shown. The subjects were then instructed to click on a randomly assigned YouTube video (the “seed” video), and then to click on one of the recommendations they were shown according to a randomly assigned “traversal rule.” For example, users assigned to the “second traversal rule” were required to always click on the second video in the list of recommendations shown, regardless of its content. By restricting user behavior in these ways, the researchers were able to isolate the influence of the recommendation algorithm on what real users were shown in real time.
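The traversal-rule procedure described above can be sketched in a few lines of code. This is a minimal simulation, not the study’s actual instrument (the real study used a browser extension on live YouTube); the function names and the toy recommendation source below are hypothetical.

```python
import random

def traverse(seed_video, recommendations_for, rule_index, depth=20):
    """Follow recommendations from a seed video, always clicking the
    recommendation at position `rule_index` (the assigned traversal rule),
    regardless of the video's content."""
    path = [seed_video]
    current = seed_video
    for _ in range(depth):
        recs = recommendations_for(current)  # list shown alongside this video
        if len(recs) <= rule_index:
            break
        current = recs[rule_index]  # click the Nth item, content-blind
        path.append(current)
    return path

# Toy recommendation source: each video deterministically maps to
# five pseudo-random successor IDs (stands in for YouTube's sidebar).
def fake_recs(video_id):
    rng = random.Random(video_id)
    return [f"{video_id}-{rng.randint(0, 9)}-{i}" for i in range(5)]

# A participant assigned the "second traversal rule" (index 1):
walk = traverse("seed", fake_recs, rule_index=1, depth=5)
print(walk)
```

Because the clicked position is fixed in advance, any difference in the content recommended along these walks reflects the algorithm’s behavior rather than the user’s choices—which is the point of the design.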

The subjects then proceeded through a sequence of YouTube recommended videos, allowing the researchers to observe what the YouTube algorithm suggested to its users. Bisbee and his colleagues then compared the number of videos about fraud in the 2020 U.S. presidential election recommended to participants who were more skeptical about the legitimacy of the election with the number recommended to participants who were less skeptical. Election skeptics were recommended an average of eight additional videos about possible fraud in the 2020 U.S. election, relative to non-skeptical participants (12 vs. 4).

“Many believe that automated recommendation algorithms have little influence on online ‘echo chambers’ in which users only see content that reaffirms their preexisting views,” observes Bisbee, now an assistant professor at Vanderbilt University. “Our study, however, suggests that YouTube's recommendation algorithm was able to determine which users were more likely to be concerned about fraud in the 2020 U.S. presidential election and then suggested up to three times as many videos about election fraud to these users compared to those less concerned about election fraud. This highlights the need for further investigation into how opaque recommendation algorithms operate on an issue-by-issue basis."

The paper’s other authors were Joshua A. Tucker and Jonathan Nagler, professors in NYU’s Department of Politics, and Richard Bonneau, a professor in NYU’s Department of Biology and Courant Institute of Mathematical Sciences, as well as Megan A. Brown, the senior research engineer at CSMaP, and Angela Lai, an NYU doctoral student. Tucker and Nagler are co-directors of CSMaP.

Source: New York University
