Sep 24, 2025 01:26 AM
https://thereader.mitpress.mit.edu/is-li...mputation/
EXCERPTS: Nonetheless, Turing and von Neumann grasped something fundamental: Computation doesn’t require a central processor, logic gates, binary arithmetic, or sequential programs. There are infinitely many ways to compute, and, crucially, they are all equivalent. This insight is one of the greatest accomplishments of theoretical computer science.
This “platform independence” or “multiple realizability” means that any computer can emulate any other one. If the computers are of different designs, though, the emulation may be glacially slow. For that reason, von Neumann’s self-reproducing cellular automaton has never been physically built — though that would be fun to see!
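To make the emulation claim concrete, here is a minimal sketch (Python; the code and names are mine, not the article's) of a conventional sequential program emulating a very different model of computation: an elementary cellular automaton. Rule 110 is known to be Turing-complete, so in principle these few lines emulate a universal computer, if slowly, which is exactly the caveat above.

```python
# Minimal sketch: a sequential program emulating a very different
# model of computation, an elementary cellular automaton.
# Rule 110 is known to be Turing-complete, so these few lines are,
# in principle, emulating a universal computer.

RULE = 110  # the rule number's binary digits encode the update table

def step(cells: list[int]) -> list[int]:
    """Apply one synchronous update to every cell (wrap-around edges)."""
    n = len(cells)
    out = []
    for i in range(n):
        # Read the 3-cell neighborhood as a number from 0 to 7 ...
        hood = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        # ... and look up the corresponding bit of the rule number.
        out.append((RULE >> hood) & 1)
    return out

cells = [0] * 40 + [1]  # a single live cell at the right edge
for _ in range(20):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

The on-screen replicator described next is this kind of emulation at much larger scale: one machine reproducing the behavior of a radically different one.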
[...] In 1994, a strange, pixelated machine came to life on a computer screen. It read a string of instructions, copied them, and built a clone of itself — just as the Hungarian-American polymath John von Neumann had predicted half a century earlier. It was a striking demonstration of a profound idea: that life, at its core, might be computational.
Although this is seldom fully appreciated, von Neumann was one of the first to establish a deep link between life and computation. He showed that reproduction, like computation, could be carried out by machines following coded instructions. In his model, based on Alan Turing’s universal machine, self-replicating systems read and execute instructions much as DNA does: “if the next instruction is the codon CGA, then add an arginine to the protein under construction.” It’s not a metaphor to call DNA a “program” — that is literally the case.
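The quoted codon rule is mechanical enough to be written out directly. A minimal sketch, assuming nothing beyond a small fragment of the standard 64-entry genetic code:

```python
# Minimal sketch of the rule quoted above: read coded instructions
# three letters at a time and execute each one by appending an amino
# acid. Only a fragment of the standard 64-entry codon table is shown.
CODON_TABLE = {
    "AUG": "Met",   # methionine, also the "start" instruction
    "CGA": "Arg",   # the arginine example from the text
    "GGC": "Gly",
    "UUC": "Phe",
    "UAA": "STOP", "UAG": "STOP", "UGA": "STOP",
}

def translate(mrna: str) -> list[str]:
    """Execute an mRNA 'program': one codon in, one amino acid out."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        instruction = CODON_TABLE[mrna[i:i + 3]]
        if instruction == "STOP":  # the halt instruction
            break
        protein.append(instruction)
    return protein

print(translate("AUGCGAGGCUUCUAA"))  # ['Met', 'Arg', 'Gly', 'Phe']
```

The loop is a fetch-decode-execute cycle in miniature, which is the substance of the analogy: fetch three letters, look up the instruction, execute it or halt.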
Of course, there are meaningful differences between biological computing and the kind of digital computing done by a personal computer or your smartphone. [...]
Biological computing is “massively parallel,” decentralized, and noisy. [...] The movements of hinged components, the capture and release of smaller molecules, and the manipulation of chemical bonds are all individually random, reversible, and inexact, driven this way and that by constant thermal buffeting. Only a statistical asymmetry favors one direction over another, with clever origami moves tending to “lock in” certain steps so that the next step becomes likely to happen.
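To see how a bare statistical asymmetry plus occasional lock-in steps can yield reliable forward progress, here is a toy Monte Carlo sketch; the 55/45 bias and the lock-in interval are assumed illustration values, not measured chemistry:

```python
import random

# Toy model of a noisy, reversible process. Each step moves forward or
# backward almost at random; only a slight statistical asymmetry
# (55/45 here, an assumed number) favors forward motion, and an
# occasional "lock-in" makes sliding back past that point impossible.
random.seed(0)

FORWARD_BIAS = 0.55  # assumed: barely better than a coin flip
LOCK_IN_EVERY = 10   # assumed: every 10th position can be locked in

def run(steps: int) -> int:
    position, floor = 0, 0
    for _ in range(steps):
        position += 1 if random.random() < FORWARD_BIAS else -1
        if position > floor and position % LOCK_IN_EVERY == 0:
            floor = position              # this step "locks in"
        position = max(position, floor)   # no backsliding past the lock-in
    return position

print([run(1000) for _ in range(5)])  # each run noisy, but reliably forward
```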
This differs greatly from the operation of “logic gates” in a computer, basic components that process binary inputs into outputs using fixed rules. They are irreversible and engineered to be 99.99 percent reliable and reproducible.
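For contrast, the fixed rule of a logic gate takes only a few lines. Determinism shows up in the four-row truth table, and irreversibility in the fact that three distinct input pairs collapse onto the same output, so the computation cannot be run backwards:

```python
# A logic gate as a fixed rule: the same inputs give the same output,
# every time.
def nand(a: int, b: int) -> int:
    return 1 - (a & b)

# Deterministic: the gate's entire behavior is a four-row truth table.
for a in (0, 1):
    for b in (0, 1):
        print(f"NAND({a}, {b}) = {nand(a, b)}")

# Irreversible: the inputs cannot be recovered from the output alone,
# because three distinct input pairs all map to output 1.
preimages = [(a, b) for a in (0, 1) for b in (0, 1) if nand(a, b) == 1]
print(preimages)  # [(0, 0), (0, 1), (1, 0)]
```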
Biological computing is computing, nonetheless. And its use of randomness is a feature, not a bug... (MORE - missing details)