Daryl Bem proved ESP is real. Which means science is broken

C C

EXCERPT: [...] Having served for a time as an associate editor of JPSP, Bem knew his methods would be up to snuff. With about 100 subjects in each experiment, his sample sizes were large. He’d used only the most conventional statistical analyses. He’d double- and triple-checked to make sure there were no glitches in the randomization of his stimuli.

Even with all that extra care, Bem would not have dared to send in such a controversial finding had he not been able to replicate the results in his lab, and replicate them again, and then replicate them five more times. His finished paper lists nine separate ministudies of ESP. Eight of those returned the same effect.

Bem’s 10-year investigation, his nine experiments, his thousand subjects—all of it would have to be taken seriously. He’d shown, with more rigor than anyone ever had before, that it might be possible to see into the future. Bem knew his research would not convince the die-hard skeptics. But he also knew it couldn’t be ignored.

When the study went public, about six months later, some of Bem’s colleagues guessed it was a hoax. Other scholars, those who believed in ESP—theirs is a small but fervent field of study—saw his paper as validation of their work and a chance for mainstream credibility.

But for most observers, at least the mainstream ones, the paper posed a very difficult dilemma. It was both methodologically sound and logically insane. Daryl Bem had seemed to prove that time can flow in two directions—that ESP is real. If you bought into those results, you’d be admitting that much of what you understood about the universe was wrong. If you rejected them, you’d be admitting something almost as momentous: that the standard methods of psychology cannot be trusted, and that much of what gets published in the field—and thus, much of what we think we understand about the mind—could be total bunk.

If one had to choose a single moment that set off the “replication crisis” in psychology—an event that nudged the discipline into its present and anarchic state, where even textbook findings have been cast in doubt—this might be it: the publication, in early 2011, of Daryl Bem’s experiments on second sight.

The replication crisis as it’s understood today may yet prove to be a passing worry or else a mild problem calling for a soft corrective. It might also grow and spread in years to come, flaring from the social sciences into other disciplines, burning trails of cinder through medicine, neuroscience, and chemistry. It’s hard to see into the future. But here’s one thing we can say about the past: The final research project of Bem’s career landed like an ember in the underbrush and set his field ablaze.


Had Bem made those choices in advance? The wording of his paper suggested that he had. But that’s the way papers in his field are written—or at least it’s how they were written back in 2010: People would act as if they’d preplanned everything, even when they’d bushwhacked their way through a thicket of results, ignoring all the dead ends they came across along the way. Statistician Andrew Gelman refers to this as “the garden of forking paths”: If you don’t specify your route before you start, any place you end up can be made to seem like a meaningful destination. (Gelman ascribed this problem to Bem’s research in a 2013 piece for Slate.)

[...] There’s now a movement in psychology to “pre-register” your research, so you commit yourself in advance to a plan for running your experiment and analyzing the data. Even now, the wisdom of this practice is contested, and it’s certainly the case that journal editors never would have expected Bem to pre-register anything circa the early 2000s, when he started on his ESP research.

“Clearly by the normal rules that we [used] in evaluating research, we would accept this paper,” said Lee Ross, a noted social psychologist at Stanford who served as one of Bem’s peer reviewers. “The level of proof here was ordinary. I mean that positively as well as negatively. I mean it was exactly the kind of conventional psychology analysis that [one often sees], with the same failings and concerns that most research has.”

[...] There were other replication failures, too. But then, there were also some successes. Bem has since put out a meta-analysis that includes 23 exact replications of his original experiments, going back to 2003. When he pooled all those studies with his own, creating a pool of more than 2,000 subjects, he found a positive effect. In his view, the data showed ESP was real.

Others have disputed this assessment. Wagenmakers notes that if Bem restricted his analysis to those studies that came out after his—that is to say, if he’d looked at the efforts of mainstream researchers and skipped the ones by fellow travelers who’d heard about his work at meetings of the Parapsychological Association—the positive effect would disappear.

In any case, those replications soon became a footnote. Within a month or two, the fallout from Bem’s initial paper had broadened into something bigger than a referendum on precognition. It had become a referendum on evidence itself.

In 2005, while Bem was still working on his ESP experiments, medical doctor and statistician John Ioannidis published a short but often-cited essay arguing that “most published research findings are false.” Among the major sources of this problem, according to Ioannidis, was that researchers gave themselves too much flexibility in designing and analyzing experiments—that is, they might be trying lots of different methods and reporting only the “best” results.

[...] These dodgy methods were clearly rife in academic science. A 2011 survey of more than 2,000 university psychologists had found that more than half of those researchers admitted using them. But how badly could they really screw things up? By running 15,000 simulations, Simmons, Nelson, and Simonsohn showed that a researcher could almost double her false-positive rate (often treated as if it were 5 percent) with just a single, seemingly innocuous manipulation. And if a researcher combined several questionable (but common) research practices—fiddling with the sample size and choosing among dependent variables after the fact, for instance—the false-positive rate might soar to more than 60 percent.
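The "almost double" figure can be reproduced with a few lines of simulation. This is a minimal sketch, not the authors' actual code: it assumes a researcher measures two uncorrelated dependent variables under the null hypothesis (the paper's simulated variables were correlated, which dampens the inflation per variable) and reports whichever test "worked."

```python
import random

random.seed(1)
ALPHA = 0.05
SIMS = 15_000  # same number of simulated experiments as the paper

def experiment(n_dvs):
    # Under the null hypothesis, each test's p-value is uniform on (0, 1).
    # A researcher who measures n_dvs dependent variables and reports the
    # best one effectively rejects if the smallest p-value beats ALPHA.
    return min(random.random() for _ in range(n_dvs)) < ALPHA

honest = sum(experiment(1) for _ in range(SIMS)) / SIMS
flexible = sum(experiment(2) for _ in range(SIMS)) / SIMS
print(honest, flexible)  # roughly 0.05 vs roughly 0.10
```

Stacking several such practices — optional stopping on sample size, dropping conditions, covariate shopping — compounds the same way, which is how the combined rate in the paper's simulations climbed past 60 percent.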

“It wasn’t until we ran those simulations that we understood how much these things mattered,” Nelson said. “You could have an entire field that is trying to be noble, actively generating false findings.”

[...] Bem had shown that even a smart and rigorous scientist could cart himself to crazyland, just by following the rules of the road. But Simmons, Nelson, and Simonsohn revealed that Bem’s ESP paper was not a matter of poor judgment—or not merely that—but one of flawed mechanics. They’d popped the hood on the bullshit-mobile of science and pointed to its busted engine. They’d shown that anyone could be a Daryl Bem, and any study could end up as a smoking pile of debris...

- - -
Magical Realist
If it's a question of whether ESP exists, or whether proving ESP through experiments means you can prove anything, I think I'd just go with ESP existing. Do the science and let the chips fall where they may. That's how science evolves over time. The amazing implication of Bem's experiments is that ESP can be somewhat exerted at will. Provided, of course, there's a strong enough motivation.
