https://iai.tv/articles/tim-palmer-quant..._auid=2020
EXCERPTS (Tim Palmer): . . . One of the extraordinary applications of quantum mechanics, our theory of physics on small scales, is quantum computing in which certain computations, such as factoring composite numbers into primes, can be performed exponentially faster on a quantum computer than a classical computer. From where does the extra processing power come?
One could say the extra power is simply encoded in the mathematics of quantum mechanics. But that merely raises the deeper question: What is the underlying physics, encoded in the mathematics of quantum mechanics, that gives quantum computers their power?
One answer to this question, the answer favoured by one of the pioneers of quantum computing, David Deutsch, is “quantum parallelism”. David is a believer in the so-called Many Worlds Interpretation of quantum mechanics, proposed by Hugh Everett in the 1950s.
The key idea in the Many Worlds interpretation is that when we make a measurement of a quantum system, the universe somehow branches into multiple copies, corresponding to each of the possible outcomes of the measurement. According to Deutsch, a quantum computer allows useful tasks to be performed in collaboration between these copies of the universe.
The quantum computer’s processing power comes from an outsourcing of work in which calculations take place in other universes. Entangled quantum particles function as paths of communication between different universes, sharing information and gathering the results.
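Whatever one makes of the many-worlds picture, the formal sense of "quantum parallelism" can be seen in the mathematics itself: put n qubits into uniform superposition and a single unitary operation acts on all 2^n amplitudes at once. A minimal classical simulation of the statevector sketches this (NumPy; the Boolean function `f` is a made-up example, not anything from the article):

```python
import numpy as np

n = 3  # 3 qubits: the state vector has 2**3 = 8 complex amplitudes
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

# Build H (x) H (x) H and apply it to |000> for a uniform superposition.
H_all = H
for _ in range(n - 1):
    H_all = np.kron(H_all, H)
state = np.zeros(2**n)
state[0] = 1.0                 # |000>
state = H_all @ state          # all 8 amplitudes are now 1/sqrt(8)

# A phase oracle for an arbitrary Boolean function f: one diagonal unitary
# "evaluates" f on every computational-basis state simultaneously.
f = lambda x: x % 3 == 0       # hypothetical example function
oracle = np.diag([(-1.0) ** f(x) for x in range(2**n)])
state = oracle @ state         # one matrix application touches every amplitude

# The sign pattern of the amplitudes now encodes f on all 8 inputs at once.
print(np.round(state * np.sqrt(8)))
```

Of course, reading out all those evaluations is exactly what measurement forbids; the art of quantum algorithms is arranging interference so that the one answer you want survives.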
Sounds bonkers? Well, let me put my cards on the table. I think Deutsch is absolutely right that the resource that explains the power of quantum computation lies in these parallel worlds.
However, I personally don’t buy into Everett’s Many Worlds interpretation at a technical level. It has a number of problems, notably with how to attach probabilities to each of the worlds that branch after a measurement. In my view, the solution to this problem is not to give up on quantum parallelism, but to give up on a key assumption in the Everettian interpretation, that the Schrödinger equation in quantum mechanics is literally true and not just a good approximation to a deeper theory of quantum physics.
I have my own model of quantum physics, called Invariant Set Theory, described in my book The Primacy of Doubt. Somewhat as in the Everettian interpretation, the quantum wavefunction represents an ensemble of parallel worlds, each world lying close to the others. But when we perform a measurement, worlds do not branch or split. Instead, they simply diverge from each other, like the divergence of state-space trajectories in chaotic systems (as described in The Primacy of Doubt). Indeed, in the book I propose that the universe is itself a deterministic system evolving on a “cosmological fractal attractor”.
I described this model in an earlier IAI article, as I believe it provides a novel way to understand some of the most conceptually difficult issues in quantum physics (like uncertainty and spooky action at a distance) [v]. Here the Schrödinger equation is only an approximation to deeper underlying laws of physics.
[...] I want to suggest that our brains are quantum computers in the sense of having a cognitive awareness of nearby counterfactual worlds on the cosmological fractal attractor. But hang on, you may complain ... Well, I never said that your quantum brain could factor large numbers. That won’t be possible, not least because the warm, noisy environment of the brain would prevent entangled quantum superpositions lasting long enough to do serious quantum computational calculations ... But that doesn't mean that there aren't some vestiges of quantum parallelism in our cognitive capabilities.
But why should our brains ever make use of quantum physics in the first place? Here I believe the primal answer is energy efficiency... (MORE - missing details)