YouTube More Likely to Direct Election-Fraud Videos to Users Already Skeptical about 2020 Election’s Legitimacy

October 19, 2022


New Study Shows How Site’s Algorithms Perpetuate Existing Misperceptions

YouTube was more likely to recommend videos about election fraud to users who were already skeptical about the legitimacy of the 2020 U.S. presidential election, according to a new study examining the impact of the site’s algorithms.

The results of the research, published in the Journal of Online Trust and Safety, showed that those most skeptical of the election’s legitimacy were shown three times as many election-fraud-related videos as the least skeptical participants: roughly eight additional recommendations out of the approximately 400 videos suggested to each study participant.

While the overall prevalence of these videos was low, the findings expose the consequences of a recommendation system that serves users the content they want. For those most concerned about possible election fraud, the authors observe, this dynamic provides a mechanism by which misinformation, disinformation, and conspiracies can reach the people most likely to believe them. Importantly, these patterns reflect the independent influence of the algorithm on what real users are shown while using the platform.

“Our findings uncover the detrimental consequences of recommendation algorithms and cast doubt on the view that online information environments are solely determined by user choice,” says James Bisbee, who led the study as a postdoctoral researcher at New York University’s Center for Social Media and Politics (CSMaP).

Nearly two years after the 2020 presidential election, large numbers of Americans, particularly Republicans, don’t believe in the legitimacy of the outcome. 

“Roughly 70% of Republicans don’t see Biden as the legitimate winner,” despite “multiple recounts and audits that confirmed Joe Biden’s win,” the Poynter Institute’s PolitiFact wrote earlier this year.

While it’s well known that social media platforms such as YouTube direct content to users based on their search preferences, the consequences of this dynamic are not fully understood.

In the CSMaP study, the researchers sampled more than 300 Americans with YouTube accounts in November and December of 2020. The subjects were asked how concerned they were about a number of aspects of election fraud, including fraudulent ballots being counted, valid ballots being discarded, foreign governments interfering, and non-U.S. citizens voting, among other concerns.

These participants were then asked to install a browser extension that would record the list of recommendations they were shown. The subjects were then instructed to click on a randomly assigned YouTube video (the “seed” video) and then to click on one of the recommendations they were shown according to a randomly assigned “traversal rule.” For example, users assigned to the “second traversal rule” were required to always click on the second video in the list of recommendations shown, regardless of its content. By restricting user behavior in these ways, the researchers were able to isolate the recommendation algorithm’s influence on what real users were shown in real time.
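To make the traversal-rule procedure concrete, here is a minimal Python sketch of how such a rule removes user choice from the click path. Everything in it is a hypothetical illustration: the names traverse, get_recommendations, and fake_recommendations are stand-ins, not the study’s actual browser-extension code or any YouTube API.

```python
import random

def traverse(seed, rule_position, depth, get_recommendations):
    """Starting from a randomly assigned seed video, always click the
    recommendation at index `rule_position` for `depth` steps, regardless
    of content. `get_recommendations` stands in for whatever returns the
    ordered list of recommendations shown next to a video."""
    path = [seed]
    current = seed
    for _ in range(depth):
        recs = get_recommendations(current)
        if rule_position >= len(recs):
            break  # not enough recommendations to apply the rule
        current = recs[rule_position]  # the rule, not the user, picks the click
        path.append(current)
    return path

def fake_recommendations(video_id):
    """Toy stand-in recommender so the sketch runs end to end."""
    return [f"{video_id}/rec{i}" for i in range(10)]

# Each participant gets a random seed video and a random rule, so what they
# are shown reflects the algorithm rather than their own viewing choices.
rule = random.randrange(5)
print(traverse("seed-video", rule_position=rule, depth=5,
               get_recommendations=fake_recommendations))
```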

The subjects then proceeded through a sequence of YouTube recommended videos, allowing the researchers to observe what the YouTube algorithm suggested to its users. Bisbee and his colleagues then compared the number of videos about fraud in the 2020 U.S. presidential election that were recommended to participants who were more skeptical about the legitimacy of the election with the number recommended to participants who were less skeptical. The results showed that election skeptics were recommended an average of eight additional videos about possible fraud in the 2020 U.S. election relative to non-skeptical participants (12 vs. 4).
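Assuming one record per participant with a count of fraud-related recommendations and a skepticism flag, the comparison reduces to a difference in group means, as sketched below. The records are illustrative only, not the study’s data; they are chosen to reproduce the reported 12-vs.-4 gap.

```python
from statistics import mean

# Illustrative records only: one row per participant, with the number of
# election-fraud-related videos recommended and a skepticism indicator.
participants = [
    {"skeptic": True,  "fraud_recs": 12},
    {"skeptic": True,  "fraud_recs": 12},
    {"skeptic": False, "fraud_recs": 4},
    {"skeptic": False, "fraud_recs": 4},
]

skeptic_avg = mean(p["fraud_recs"] for p in participants if p["skeptic"])
other_avg = mean(p["fraud_recs"] for p in participants if not p["skeptic"])

print(f"skeptics: {skeptic_avg:.1f} fraud videos on average")
print(f"non-skeptics: {other_avg:.1f}")
print(f"difference: {skeptic_avg - other_avg:.1f}")  # the ~8-video gap reported
```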

“Many believe that automated recommendation algorithms have little influence on online ‘echo chambers’ in which users only see content that reaffirms their preexisting views,” observes Bisbee, now an assistant professor at Vanderbilt University. “Our study, however, suggests that YouTube’s recommendation algorithm was able to determine which users were more likely to be concerned about fraud in the 2020 U.S. presidential election and then suggested up to three times as many videos about election fraud to these users compared to those less concerned about election fraud. This highlights the need for further investigation into how opaque recommendation algorithms operate on an issue-by-issue basis.”

The paper’s other authors were Joshua A. Tucker and Jonathan Nagler, professors in NYU’s Department of Politics, and Richard Bonneau, a professor in NYU’s Department of Biology and Courant Institute of Mathematical Sciences, as well as Megan A. Brown, the senior research engineer at CSMaP, and Angela Lai, an NYU doctoral student. Tucker and Nagler are co-directors of CSMaP.

Source: New York University
