
http://nautil.us/blog/yes-your-brain-doe...nformation
EXCERPT: [...] Computation actually has a specific definition—not necessarily related to information processing—formulated by Alan Turing, forefather of the modern computer. In 1928, the mathematician David Hilbert wondered whether it was possible to create a machine capable of answering any mathematical question. Turing, in answering Hilbert in 1936, “invented a mathematical model that we now know as the Turing machine,” says Krakauer. It showed that there are some mathematical statements that are fundamentally uncomputable—impossible, in other words, to prove as true or false. Later, in the 1940s, Turing realized that the Turing machine was not just a model for solving math problems, says Krakauer—“it was actually the model of problem solving itself, and the model of problem solving itself is what we mean by computation.”
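Turing's model is concrete enough to sketch in a few lines. The following is a minimal, hypothetical Turing-machine simulator (the tape-as-dictionary representation, state names, and example rules are illustrative choices, not from the source); note that the `max_steps` cutoff reflects the uncomputability point above — in general there is no procedure that decides in advance whether a given machine halts.

```python
def run_turing_machine(rules, tape, state="start", max_steps=1000):
    """rules maps (state, symbol) -> (new_symbol, move, new_state);
    move is +1 (right) or -1 (left). Runs until the machine enters
    'halt' or max_steps is exceeded -- we cannot, in general, decide
    halting ahead of time, so a step bound stands in for that."""
    pos = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(pos, 0)  # unwritten cells read as blank (0)
        new_symbol, move, state = rules[(state, symbol)]
        tape[pos] = new_symbol
        pos += move
    return tape, state

# Example machine: erase a run of 1s (flip each 1 to 0), halt at the
# first blank.
rules = {
    ("start", 1): (0, +1, "start"),  # overwrite 1 with 0, move right
    ("start", 0): (0, +1, "halt"),   # blank found: stop
}
tape, state = run_turing_machine(rules, {0: 1, 1: 1, 2: 1})
print(state, sorted(tape.items()))  # halt [(0, 0), (1, 0), (2, 0), (3, 0)]
```

The point of the example is not the particular rule table but the shape of the model: a finite rule set plus an unbounded tape is, per Krakauer's gloss, "the model of problem solving itself."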
But [Robert] Epstein uses “computation” as if it’s synonymous with any sort of information-processing, in effect setting up a straw man: “The idea,” writes Epstein, “that humans must be information processors just because computers are information processors is just plain silly.” This argument, says Krakauer, “is so utterly confused that it’s almost not worth attending to.”
The brain does process information because information is in fact “the negative of thermodynamic entropy,” as Claude Shannon, the founder of information theory, realized, says Krakauer. Entropy is the degree of disorder or randomness in a system, so information, says Krakauer, amounts to “the reduction of uncertainty,” or disorder, in a system. Our brains, he says, are constantly reducing uncertainty about the world, for example, by transforming sensory inputs, like light hitting our eyes and atmospheric vibrations bumping our ears, into the perceptual outputs of a visual scene accompanied by sound. In a sense, every waking moment is an effort to stave off and diminish the unpredictability of the world....
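Shannon's "reduction of uncertainty" has a precise arithmetic: entropy H = -Σ p·log₂(p), measured in bits. A small, hypothetical sketch (the four-outcome prior and the observation that rules out two outcomes are invented for illustration, not from the source) shows how an observation reduces entropy, which is exactly what "gaining information" means in Shannon's sense:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Prior: four equally likely possibilities -> 2 bits of uncertainty.
prior = [0.25, 0.25, 0.25, 0.25]

# A sensory observation rules out two of them -> 1 bit of uncertainty left.
posterior = [0.5, 0.5, 0.0, 0.0]

information_gained = shannon_entropy(prior) - shannon_entropy(posterior)
print(information_gained)  # 1.0 (bits)
```

Halving the space of possibilities always yields exactly one bit, whatever the domain; on this reading, the brain's transformation of raw sensory input into a structured percept is a continual narrowing of the distribution over possible states of the world.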