Does new physics lurk inside living matter?


EXCERPTS (Paul Davies): . . . Asked whether physics can explain life, most physicists would answer yes. The more pertinent question, however, is whether known physics is up to the job, or whether something fundamentally new is required. In the 1930s many of the architects of quantum mechanics -- most notably Niels Bohr, Eugene Wigner, and Werner Heisenberg -- had a hunch that there is indeed something new and different in the physics of living matter. Schrödinger was undecided, but open to the possibility. “One must be prepared to find a new kind of physical law prevailing in it,” he conjectured. But he didn’t say what that might be.

[...] The gulf between physics and biology is more than a matter of complexity; a fundamental difference in conceptual framework exists. Physicists study life using concepts such as energy, entropy, molecular forces, and reaction rates. Biologists offer a very different narrative, with terms such as signals, codes, transcription, and translation -- the language of information. ... Life is invested in information storage and processing at all levels, not just in DNA. Genes -- DNA sequences that serve as encrypted instruction sets -- can switch other genes on or off using chemical messengers, and they often form complex networks. Those chemical circuits resemble electronic or computing components, sometimes constituting modules or gates that enact logical operations.

At the cellular level, a variety of physical mechanisms permit signaling and can lead to cooperative behavior. Slime molds [...] provide a striking example. They are aggregations of single cells that can self-organize into elaborate shapes and sometimes behave coherently as if they were a single organism. Likewise, social insects such as ants and bees exchange complex information and engage in collective decision making... The informational basis of life has led some scientists to pronounce the informal dictum, Life = Matter + Information. For that linking equation to acquire real explanatory and predictive power, however, a formal theoretical framework is necessary that couples information to matter.

[...] information must be quantified and formally incorporated into the laws of thermodynamics. The basis for modern information theory was laid down by Claude Shannon in the late 1940s. Shannon defined information as reduction in uncertainty -- for example, by inspecting the outcome of a coin toss. The familiar binary digit, or bit, is the information gained by determining heads or tails from flipping a coin. The synthesis of Shannon’s information theory and thermodynamics led to the identification of information as negative entropy...
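Shannon's definition above can be made concrete with a few lines of code. This is a minimal sketch of the standard entropy formula H = -Σ pᵢ log₂ pᵢ (not code from the article); it shows why a fair coin toss yields exactly one bit, and why a biased coin conveys less:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: two equally likely outcomes -> exactly 1 bit of uncertainty
# is removed by inspecting the result of the toss.
fair = shannon_entropy([0.5, 0.5])
print(fair)  # 1.0

# A biased coin is more predictable, so learning its outcome conveys
# less information per toss.
biased = shannon_entropy([0.9, 0.1])
print(round(biased, 3))  # 0.469
```

The fair coin maximizes the entropy of a two-outcome source; any bias makes the outcome partly predictable in advance, which is exactly Shannon's "reduction in uncertainty" picture.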

[...] the lesson of Maxwell’s demon is that information is actually a physical quantity that can profoundly affect the way that matter behaves. Information, as defined by Shannon, is more than an informal parameter; it is a fundamental physical variable that has a defined place in the laws of thermodynamics.
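The claim that information has "a defined place in the laws of thermodynamics" has a quantitative anchor: Landauer's principle, which sets a minimum heat cost of k_B T ln 2 for erasing one bit. The short calculation below (an illustration I am adding, not part of the excerpt) puts a number on it at room temperature:

```python
import math

# Landauer's principle: erasing one bit of information dissipates at least
# k_B * T * ln(2) of heat into the environment.
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # assumed room temperature, K

E_bit = k_B * T * math.log(2)
print(f"{E_bit:.3e} J per erased bit")  # ~2.871e-21 J
```

The tiny magnitude (a few zeptojoules) is why the demon's accounting went unnoticed for so long, yet it is enough to rescue the second law: the demon's information processing carries an unavoidable entropy bill.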

Shannon stressed that his information theory dealt purely with the efficiency and capacity of information flow; it said nothing about the meaning of the information communicated. But in biology, meaning or context is critical. [...] Today, information transfer in biology is known to be a two-way process, involving feedback loops and top-down information flow. ... Thinking about the physics of living matter in informational terms rather than purely molecular terms is analogous to the difference between software and hardware in computing. Just as a full understanding of a particular computer application -- PowerPoint, for example -- requires a grasp of the principles of software engineering as much as the physics of computer circuitry, so life can only be understood when the principles of biological information dynamics are fully elucidated.

Since the time of Isaac Newton, a fundamental dualism has pervaded physics. Although physical states evolve with time, the underlying laws of physics are normally regarded as immutable. That assumption underlies Hamiltonian dynamics, trajectory integrability, and ergodicity. But immutable laws are a poor fit for biological systems, in which dynamical patterns of information couple to time-dependent chemical networks and where expressed information -- for example, the switching on of genes -- depends on global or systemic physical forces as well as local chemical signaling.

Biological evolution, with its open-ended variety, novelty, and lack of predictability, also stands in stark contrast to the way that nonliving systems change over time. Yet biology is not chaos: Many examples of rules at work can be found. [...] A more realistic description of change in biosystems would allow the dynamical rules themselves to vary as a function of the state of the system. State-dependent dynamics opens up a rich landscape of novel behavior, but it is far from a formal mathematical theory. To appreciate what it might entail, consider the analogy to a game of chess.

In standard chess, the system is closed and the rules are fixed. From the conventional initial state, chess players are free to explore a state space that, while vast, is nevertheless constrained by immutable rules to be but a tiny subset of all possible configurations of pieces on the board. Although an enormous number of patterns are possible, an even greater number of patterns are not permitted -- for example, having all bishops occupy squares of the same color.

Now imagine a modified game of chess in which the rules can change according to the overall state of play -- a system-level, or top-down, criterion. To take a somewhat silly example, if white is winning, then black might be permitted to move pawns backward as well as forward. In that extended version of chess, the system is open, and states of play will arise that are simply impossible using the fixed rules of standard chess. That imaginary game is reminiscent of biology, in which organisms are also open systems, able to accomplish things that are seemingly impossible for nonliving systems.
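The modified-chess idea can be sketched in a few lines. The toy system below is my own invented illustration (neither of its two rules comes from the article): which update rule applies to a bit string at each step is decided by a global, system-level property of the whole state, so the trajectory visits configurations that neither fixed rule could produce on its own.

```python
# Toy state-dependent dynamics: the local update rule is selected by a
# top-down criterion computed from the entire state. Both rules and the
# criterion are hypothetical choices made purely for illustration.

def step(state):
    # Global (system-level) criterion: is more than half the system "on"?
    if sum(state) > len(state) // 2:
        # "Rule B": invert every cell -- unavailable under Rule A alone.
        return [1 - x for x in state]
    # "Rule A": cyclically shift the pattern one cell to the right.
    return [state[-1]] + state[:-1]

s = [1, 1, 1, 0]
for _ in range(4):
    s = step(s)
    print(s)
# Step 1 fires Rule B (majority on), then Rule A governs the sparse state.
```

A fixed-rule system would follow one of the two update laws forever; here the rule in force is itself a function of the state, which is the structural feature the chess analogy is pointing at.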

[...] If biology deploys new physics, such as state-dependent dynamical rules, then at what point between simple molecules and living cells does it emerge? Cellular automaton (CA) models may be instructive, but they are cartoons, not physics; they tell us nothing about where to look for new emergent phenomena. As it happens, standard physics already contains a familiar example of state-dependent dynamics: quantum mechanics.

Left in isolation, a pure quantum state described by a coherent wavefunction evolves predictably according to a well-understood mathematical prescription known as unitary evolution. But when a measurement is made, the state changes abruptly -- a phenomenon often called the collapse of the wavefunction. In an ideal measurement, the jump projects the system into one possible eigenstate corresponding to the observable being measured. For that step, the unitary evolution rule is replaced by the Born rule, which predicts the relative probabilities of the measurement outcomes and introduces into quantum mechanics the element of indeterminism or uncertainty. That marks the transition from the quantum to the classical domain. Could quantum mechanics therefore point us to what makes life tick?
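The two regimes described above can be shown side by side in a minimal sketch of a two-level system (my illustration, not code from the article): a deterministic unitary step, followed by the Born rule's probabilistic assignment at measurement.

```python
import math

# Two-level quantum system |psi> = a|0> + b|1>, stored as amplitudes [a, b].

# Unitary evolution: a smooth, deterministic linear map on the state.
# Here we use the Hadamard gate as a concrete example of a unitary.
s = 1 / math.sqrt(2)
H = [[s,  s],
     [s, -s]]

def evolve(U, psi):
    return [U[0][0] * psi[0] + U[0][1] * psi[1],
            U[1][0] * psi[0] + U[1][1] * psi[1]]

psi = [1 + 0j, 0 + 0j]   # start in the pure state |0>
psi = evolve(H, psi)     # rule 1: unitary evolution

# Born rule: at measurement the unitary rule is replaced by a different,
# probabilistic one -- outcome probabilities are |amplitude|^2.
p0, p1 = abs(psi[0]) ** 2, abs(psi[1]) ** 2
print(p0, p1)  # each ~0.5: measuring projects onto |0> or |1> with equal odds
```

The point of the analogy: which rule governs the system (unitary evolution or the Born-rule projection) depends on the physical situation the system is in, which is precisely a state-dependent change of dynamical law.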

[...] The huge advances in molecular biology of the past few decades may be largely attributed to the application of mechanical concepts to biosystems -- that is, to physics infiltrating biology. Curiously, the reverse is now happening. Many physicists, particularly those working on foundational questions in quantum mechanics, advocate placing information at the heart of physics, while others conjecture that new physics lurks in the remarkable and baffling world of biological organisms. Biology is shaping up to be the next great frontier of physics... (MORE - precise details, diagrams, references)
