
Misinformation Experiment Has Potential to 'Inoculate' Millions of Social Media Users


Briefly exposing social media users to the tricks behind misinformation boosts awareness of harmful online falsehoods, says new research—and Google is set to deploy an anti-disinformation campaign based on the findings.

Short animations placed in YouTube's ad slot gave viewers a taste of the strategies behind misinformation, as part of a huge online experiment led by the University of Cambridge.

Working with Jigsaw, a unit within Google dedicated to tackling threats to open societies, a team of psychologists from the universities of Cambridge and Bristol created 90-second clips designed to familiarize users with manipulation techniques such as scapegoating and deliberate incoherence.

This "pre-bunking" strategy preemptively exposes people to tropes at the root of malicious propaganda, so they can better identify falsehoods online—regardless of subject matter.

Researchers behind the Inoculation Science project compare it to a vaccine: giving people a "micro-dose" of misinformation in advance helps prevent them falling for it in future—an idea based on what social psychologists call "inoculation theory."

The findings, published in Science Advances, come from seven experiments involving a total of almost 30,000 participants—including the first "real world field study" of inoculation theory on a social media platform.

The team reports that even a single viewing of one of the film clips increased awareness of misinformation.

The videos introduce concepts from the "misinformation playbook," illustrated with relatable examples from film and TV such as Family Guy or, in the case of false dichotomies, Star Wars ("Only a Sith deals in absolutes").

"YouTube has well over 2 billion active users worldwide. Our videos could easily be embedded within the ad space on YouTube to ‘pre-bunk' misinformation," said study co-author Prof. Sander van der Linden, Head of the Social Decision-Making Lab at Cambridge, who led the work.

"Our research provides the necessary proof of concept that the principle of psychological inoculation can readily be scaled across hundreds of millions of users worldwide."

Watch their video about 'scapegoating', then see another one below…

Lead author Dr. Jon Roozenbeek, also of the Cambridge lab, describes the team's videos as "source agnostic," avoiding the biases people have about where information is from, and how it aligns—or not—with what they already believe.

"Our interventions make no claims about what is true or a fact, which is often disputed. They are effective for anyone who does not appreciate being manipulated," he said.

"The inoculation effect was consistent across liberals and conservatives. It worked for people with different levels of education, and different personality types."

YouTube's parent company, Google, is already harnessing the findings. At the end of August, Jigsaw will roll out a pre-bunking campaign across several platforms in Poland, Slovakia, and the Czech Republic to get ahead of emerging disinformation relating to Ukrainian refugees. The campaign is designed to build resilience to harmful anti-refugee narratives, in partnership with local NGOs, fact checkers, academics, and disinformation experts.

"Harmful misinformation takes many forms, but the manipulative tactics and narratives are often repeated and can therefore be predicted," said Beth Goldberg, co-author and Head of Research and Development for Google's Jigsaw unit.

"Teaching people about techniques that manipulate them, like ad-hominem attacks, can help build resilience to believing and spreading misinformation, before harmful narratives take hold," Goldberg said.

The team argue that 'pre-bunking' may be more effective at fighting misinformation than fact-checking each untruth, which is impossible to do at scale—and has the potential to feed conspiracy theories.

"Propaganda, lies and misdirections are nearly always created from the same playbook," said co-author Prof. Stephan Lewandowsky from the University of Bristol. "We developed the videos by analyzing the rhetoric of demagogues, who deal in scapegoating and false dichotomies."

"Fact-checkers can only rebut a fraction of the falsehoods circulating online. We need to teach people to recognize the misinformation playbook, so they understand when they are being misled."

Six initial controlled experiments featured 6,464 participants, with the sixth experiment conducted a year after the first five to ensure earlier findings could be replicated.

Data collection for each participant was comprehensive, from basic information—gender, age, education, political leanings—to levels of numeracy (the ability to understand and work with numbers), conspiratorial thinking, hours spent on news and social media, gullibility, and a personality inventory, among other variables.

Factoring all this in, the team found that the inoculation videos improved people's ability to spot misinformation, and boosted their confidence in being able to do so again. The clips also improved the quality of "sharing decisions": whether or not to spread "damaging" content.

Two of the animations were then tested as part of a vast experiment on YouTube, with clips positioned in the pre-video advert slot that provides an option to skip after five seconds.

Google Jigsaw showed the clips to around 5.4 million US YouTube users, and almost a million watched for at least 30 seconds. The platform then gave a random 30% of users who had watched a voluntary test question within 24 hours of their initial viewing.

The clips aimed to inoculate against tactics such as hyper-emotive language and the use of false dichotomies—and the follow-up questions were based on fictional posts designed to test for detection of these tropes. YouTube also gave a "control" group of users who had not viewed a video the same test question. In total, 22,632 users answered a question.

Watch the emotive language clip below, then continue reading…

Despite the intense distractions on YouTube, the ability to recognize manipulation techniques at the heart of misinformation increased by 5% on average.

"Users participated in the tests around 18 hours on average after watching the videos, so the inoculation appears to have stuck," said van der Linden.

Researchers say that such a recognition increase could be game-changing if dramatically scaled up across social platforms—which would be inexpensive to do. The average cost for each view of significant length was a tiny $0.05; at that rate, a million such views would cost only around $50,000.

Added Roozenbeek: "If anyone wants to pay for a YouTube campaign that measurably reduces susceptibility to misinformation across millions of users, they can do so, and at a minuscule cost per view."

(Source: University of Cambridge)
