
https://www.science.org/content/article/...nformation
EXCERPTS: . . . As head of the Social Decision-Making Lab at the University of Cambridge, Van der Linden is studying the power of lies and how to keep people from believing them. He has become academia’s biggest proponent of a strategy pioneered after the Korean War to “inoculate” humans against persuasion, the way they are vaccinated against dangerous infections.
The recipe has only two steps: First, warn people they may be manipulated. Second, expose them to a weakened form of the misinformation, just enough to intrigue but not persuade anyone. “The goal is to raise eyebrows (antibodies) without convincing (infecting),” Van der Linden and his colleague Jon Roozenbeek wrote recently in JAMA.
Inoculation, also called “prebunking,” is just one of several techniques researchers are testing to stop people from falling for misinformation and spreading it further....
[...] Van der Linden’s 2023 book, Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity, has won many awards, and Google’s research arm, Jigsaw, has rolled out the approach to tens of millions of people via YouTube ads. “My reading of the literature is that it’s probably the most effective strategy,” says Jay van Bavel, a psychologist at New York University.
But others say inoculation is an analogy gone awry that wrongly focuses on recipients of misinformation instead of on its sources and the social media companies...
[...] The idea of inoculation dates back to the Cold War era. In 1954, when the Korean War ended, 21 U.S. prisoners of war decided to move to communist China instead of coming home, a choice that shocked the nation. Many assumed the soldiers were victims of “brainwashing,” a term invented a few years earlier. To resist this kind of manipulation, experts declared, young people in the United States needed to be taught more about “American” ideals at home, at school, and in the Army.
But psychologist William McGuire, who spent most of his career at Yale University, had a different idea. He argued that the soldiers were vulnerable to the endless propaganda they encountered because it was their first exposure to those ideas. Such prisoners, McGuire argued, were like someone brought up in an “aseptic” environment who, “although appearing in very good health, proves quite vulnerable when suddenly exposed to a massive dose of an infectious virus.” The remedy seemed obvious: Pre-expose people to “weakened, defense-stimulating forms of the counterarguments.”
McGuire tested the hypothesis by seeing whether inoculation could preserve students’ belief in four cultural truisms, including that they should brush their teeth after every meal and that antibiotics had been a huge benefit to humankind. Confronting students with counterarguments—for example, that antibiotics led to the development of deadly resistant strains—could drastically decrease their belief in these truisms, McGuire found. But if the students first read an essay that laid out the counterarguments and refuted them, their belief didn’t erode nearly as much.
The idea fascinated Van der Linden, who first read McGuire’s papers as a Yale student working on the public’s perception of climate change. But McGuire believed inoculation only worked if people’s beliefs had never been challenged before. Van der Linden thought that assumption was mistaken. “In the real world, we are dealing with people in various stages of infection, and inoculation can both work as a preventative and therapeutic agent,” he wrote in his book. “If I ever had a ‘lightbulb’ moment, this was it.”
Van der Linden put the theory to the test in an online study of more than 2,000 participants that probed their views of climate change—and tried to change them. Respondents on average estimated that 70% of scientists agree human-caused global climate change is real. When the subjects were told that the actual number is 97%, their estimates rose accordingly. But exposing participants to a misinformation trope that “over 31,000 American scientists have signed a petition” countering the consensus completely erased that increase.
However, if participants were warned of exactly this kind of falsehood before being exposed to both the consensus message and the misinformation, the net effect was an increase to about 84%. A preemptive warning about politically motivated attempts to spread misinformation worked, Van der Linden concluded. The effect was seen even in those who were more skeptical of climate change to begin with.
The study, published in Global Challenges in January 2017—just after Donald Trump had been sworn in as U.S. president—created a flood of media attention... (MORE - missing details)