One of my literary heroes! The System does not suffer lightly the journey of the original thinker...
"In 1929, Joseph Campbell made the worst career move possible.
He'd just finished studying medieval literature in Paris and Munich. He had a master's degree from Columbia. The path was clear: get your PhD, land a university job, publish papers in your narrow specialty, build your career brick by conventional brick.
Instead, Campbell walked into his faculty advisor's office and announced he wanted to study Sanskrit, modern art, psychology, AND medieval literature. They said no. Academic programs didn't work that way. Pick one lane.
Campbell walked away from the entire system.
Then the Great Depression hit. The timing couldn't have been worse. The stock market crashed a month after he returned to America. Academic jobs evaporated. His friends thought he'd destroyed his future. His family was horrified.
But Campbell did something radical: he decided to use the crisis as an opportunity.
He rented a cabin in Woodstock, New York, for twenty dollars a year. No running water. No career prospects. Just books.
For the next five years, Campbell read. Not casually—monastically. He'd wake at dawn and read for nine hours straight. Hindu texts. Buddhist scriptures. Greek mythology. Native American stories. African folklore. Carl Jung's psychology. James Joyce's experimental novels. Medieval romances.
Everything.
He wasn't preparing for exams. He wasn't writing papers for tenure committees. He was looking for something academics confined to their specialties would never see: patterns hidden across cultures and centuries.
His routine was brutal in its simplicity. Read. Take notes. Read more. Synthesize. Repeat. No social pressure. No academic approval. Just an obsessive search for connections between human stories separated by thousands of miles and millennia.
In 1934, after five years of voluntary intellectual exile, Campbell got a job teaching literature at Sarah Lawrence College. The school was perfect—it encouraged interdisciplinary thinking rather than narrow expertise. He could finally teach everything he'd been studying.
But the real work was just beginning.
For the next fifteen years, while teaching full-time, Campbell organized everything he'd discovered into a single revolutionary idea: every hero story ever told—from ancient Mesopotamia to modern Hollywood—follows the same pattern.
The hero receives a call to adventure. Refuses at first. Eventually crosses into an unknown world.
Faces tests and trials. Undergoes transformation. Returns home changed, bringing wisdom to others.
Greek myths. Hindu epics. Native American legends. Christian parables. Buddhist teachings. Arthurian romances. The specific details varied wildly, but the skeleton beneath was identical.
Campbell called it the "monomyth." The hero's journey.
In 1949, he published The Hero with a Thousand Faces. Academic reviews were mixed—some thought he was oversimplifying complex traditions. The book sold modestly.
Then nothing happened. For decades.
Campbell kept teaching. Kept researching. Kept refining his ideas. The book stayed in print but remained obscure outside academic circles.
Until 1977.
A young filmmaker named George Lucas released Star Wars. Luke Skywalker's journey—farm boy to Jedi knight—followed Campbell's pattern exactly. The call to adventure. The refusal ("I can't leave my uncle"). The mentor. The trials. The transformation. The return.
Lucas publicly credited Campbell. Suddenly, everyone wanted to know: who was this mythology professor whose work had shaped the biggest movie of the decade?
Writers discovered the book. Filmmakers studied it. A Hollywood script consultant named Christopher Vogler translated Campbell's academic framework into practical screenwriting advice. The monomyth became the secret architecture of blockbuster storytelling.
In 1988, PBS aired a six-part series in which journalist Bill Moyers interviewed Campbell at George Lucas's Skywalker Ranch, explaining mythology for general audiences. The conversations were filmed before Campbell died in October 1987.
The Power of Myth became one of the most-watched PBS series in history.
The Hero with a Thousand Faces—published nearly 40 years earlier—hit the bestseller list.
Campbell died at 83, having lived to see his cabin-in-the-woods reading project influence how millions understand stories.
Today, it's almost impossible to watch a major film without seeing Campbell's influence. The Matrix. Harry Potter. The Lion King. The Lord of the Rings. Black Panther. Every hero who refuses the call, crosses a threshold, faces trials, and returns transformed is walking Campbell's path.
Critics argue he oversimplified diverse traditions, ignored myths that didn't fit his pattern, and focused too heavily on male heroes while treating women as helpers or prizes. These criticisms are valid and important.
But his influence is undeniable.
Because in 1929, when everyone said "specialize," Joseph Campbell said "no, I need to see the whole picture." When the economy crashed and everyone scrambled for security, he chose poverty and books. When academic institutions said "stay in your lane," he spent five years reading across every lane simultaneously.
He didn't discover the monomyth by following the prescribed path.
He discovered it by rejecting the path entirely and spending years looking for patterns that academic boundaries kept separate.
The man who dropped out to read mythology in a cabin influenced some of the most successful films ever made—because he understood that the biggest insights often require stepping outside the system designed to produce them.
Sometimes the worst career move is the only one that leads somewhere truly original."
"Wassily Kandinsky had synesthesia, specifically chromesthesia, meaning he literally saw colors when he heard sounds and heard music when he saw colors, a phenomenon that profoundly shaped his transition from law to art and fueled his pioneering abstract paintings, which he often titled like musical compositions (e.g., Improvisation, Composition) to reflect this "joined perception" where color and sound merged into a spiritual symphony. He described powerful sensory experiences, like seeing vivid colors during a Wagner concert, which convinced him to pursue art to express these internal, multisensory realities, making his work a visual representation of auditory experiences."
EXCERPTS: The idea of “experiential relativity,” as Rob Boddice calls it—a recent paper also referred to his approach as “historical neurodiversity”—might seem squishy and postmodern. It’s a kind of thinking that questions whether anything is real—the sort of speculation that might emerge from a dorm room late at night. The reaction is understandable. But Boddice is interested in some very real things: the brain and the body, and the way they interact with culture to produce experience.
His approach reminded me of the philosopher William James, who also didn’t believe that human emotions are “sacramental or eternally fixed,” as he wrote in his 1890 book, The Principles of Psychology. Whereas the prevailing thought was that an internal feeling generates outward response—I’m sad, therefore I cry—James thought the causality was all wrong.
In his schema, what happens first is an external stimulus. This triggers a bodily response, and only then does an internal process of interpretation assign meaning to that response. I might see a sunset and find tears springing to my eyes, and then my mind will interpret this as missing my father, with whom I last witnessed a sunset. Because of this variability of response and interpretation, James wrote, “there is no limit to the number of possible different emotions which may exist, and why the emotions of different individuals may vary indefinitely.”
[...] Boddice makes his claim with the gusto and certainty of a Silicon Valley entrepreneur: “What we propose is a disruption of what it is and means to be human.” This is the kind of provocative statement to which Boddice is prone, and his work can induce a sense of vertigo. To unmoor people from any sense of common humanity means undermining most of the political philosophies and laws that govern our world. If we abandon our sense of shared humanness with people who lived in the past, what does that mean for other people who live in different cultural contexts today—in a village in China, or just on the other side of the same city?
And yet the notion that the same inputs may create divergent experiences has some gut-level validity to it—think of how the feeling of being an American changes today depending on whether you are wearing blue or red lenses. And then there are the dizzying advances in AI, which make Boddice’s question—what does it even mean to be human?—one that we all face as never before.
[...] The brain, Barrett told me, is trapped in the skull, “a dark, silent box,” so it has to make predictions by drawing on those concepts and categories, which are “very, very different by culture—even the concept of what an emotion is varies by culture.” This helps the brain predict and hone its perceptions, and these are very much related to concepts tied to a time and place.
The process all leads to what the behavioral neurologist Marsel Mesulam has called our “highly edited subjective version of the world.” In other words, there is no spot in our heads where a Platonic (or emoji) version of sadness or happiness resides. Feelings are not determined; they are created. And this is true for even something as seemingly universal as pain... (MORE - details)
EXCERPTS: How can we live in a meaningless world? Is there any hope of happiness, when our existence is fundamentally absurd [...] These are the questions to which Albert Camus returns over and over again in his fiction, essays and plays.
[...] For Camus, as he puts it first in the notebooks and later in L’Homme révolté: ‘Freedom is the right not to lie.’ By freedom he also means a liberation from antinomies, from being forced ‘to choose to be victim or executioner – and nothing else’. Throughout his work, throughout these notebooks, Camus rejects the ‘naivete of the … intellectual who believes a person has to be inflexible to flex their intellect’.
Some of the above sounds like Orwell, and there are similarities between the two authors. Both refused adamantine positions, both opposed fascism and communism, both experienced some heat from their peers for these positions. As Bloom writes, Camus came to see the French Communist Party and its intellectual supporters as ‘apologists for premeditated, organised, rationalised murder’. This ended his friendship with Sartre, their quarrel coming after the publication of L’Homme révolté in 1951... (MORE - details)
EXCERPTS: As important as fire has been to our species, tracing its early history has proved an immense challenge. Rain can wash away ash and charcoal, erasing the evidence of a fire. Even when scientists do uncover the rare trace of an ancient blaze, it can be hard to determine whether it was created by people or ignited by lightning.
The oldest evidence for human ancestors using fire, dating back to between 1 million and 1.5 million years ago, comes from a cave in South Africa. Human ancestors left behind tens of thousands of fragments of bones from the animals they butchered to eat. Of those fragments, 270 show signs of having been burned in a fire.
But clues like these don’t offer clear proof that those ancient people knew how to make a fire. They may have just stumbled across a wildfire from time to time, and figured out ways to take advantage of it. They might have learned to light a stick from the fire, and then carry the ember back to their cave to cook a meal.
[...] Some 400,000 years ago, in what is now eastern England, a group of Neanderthals used flint and pyrite to make fires by a watering hole — not just once, but time after time, over several generations.
That is the conclusion of a study published on Wednesday in the journal Nature. Previously, the oldest known evidence of humans making fires dated back just 50,000 years. The new finding indicates that this critical step in human history occurred much earlier.
[...] For the time being, Barnham remains the only place known for any evidence of fire-making hundreds of thousands of years ago. But that isn’t proof that the practice was rare at the time, Dr. Ashton said. After all, it had taken years of field work at Barnham to uncover the telling evidence. Similar efforts could reveal other Barnhams elsewhere in the world... (MORE - missing details)
INTRO: The creator of AI actress Tilly Norwood is addressing the backlash over potentially replacing human actors with AI ones.
Eline Van Der Velden, the founder and CEO of Particle6 and creator of the AI-generated actress, spoke with ABC News Live on Tuesday to explain how she thinks Tilly will fit in the world of film -- and sought to allay concerns that Tilly would take the jobs of human actors.
"That's not what she's here for and that's absolutely not my plan," Van Der Velden told ABC News' Kyra Phillips when asked about her past comments saying that she aims to make Tilly the next Scarlett Johansson or Natalie Portman... (MORE - details)
The political narrative: "Rosalind Franklin's story involves intellectual property theft, sexism, and deceit, and the struggle of a woman scientist to be accepted in the male-dominated scientific community of the 1950s."
- - - - - - -
EXCERPTS: Interviews with Crick from the 1960s and a close reading of the Watson and Crick research papers show that the actual process of making the breakthrough did not involve using any of Franklin’s data. Instead, the pair spent a month fiddling about with cardboard shapes corresponding to the component molecules of DNA, using the basic rules of chemistry. Once they had finally, almost by accident, made the discovery, then they could see that it corresponded to Franklin’s data. Franklin was not hostile to the pair—she continued to share her data and ideas with both men and subsequently became very close friends with Crick and his wife, Odile. (MORE - details)
"Patient AB’s story is one of the most unusual and fascinating cases in psychiatry. In 1984, she began hearing voices that were not her own thoughts, but unlike the distressing hallucinations often described in medical literature, these voices were calm, supportive, and strangely precise. They reassured her, told her she was ill, and urged her to go to the hospital. More astonishingly, they gave her a specific diagnosis: a brain tumor.
She listened, sought medical help, and doctors confirmed the voices were right. A tumor was discovered, surgery was performed, and she recovered. Throughout this process, the voices seemed invested in her wellbeing. They expressed pleasure when she was well again, and once their mission was complete, they bid her farewell and disappeared entirely. This was not just rare; it was unprecedented.
The case was later documented by psychiatrist Ikechukwu Obialo Azuonye and published in the British Medical Journal in 1997. It was described as “the first and only instance in which hallucinatory voices sought to reassure the patient, offered her a diagnosis, directed her to hospital, expressed joy at her recovery, bid farewell, and then disappeared.” That description alone captures how extraordinary the event was.
What makes Patient AB’s experience so compelling is how it challenges our assumptions about hallucinations. Typically, voices are seen as symptoms of illness, often tormenting or misleading. Yet here they acted almost like guardians, guiding her toward lifesaving treatment. It raises profound questions about the brain, consciousness, and whether the mind can sometimes protect itself in ways science struggles to explain.
Even today, Patient AB’s case is cited as a reminder of the mysteries that lie within human perception. It is a story that blends medicine, psychology, and something almost miraculous. Whether viewed as a medical anomaly or a moment of inexplicable grace, her experience continues to spark wonder, proof that not all voices are destructive, and sometimes, they can save a life."