Today I’m pleased to announce that we have a guest post from a very distinguished colleague of mine, Len Adleman. Len is best known as the “A” in RSA and the inventor of DNA computing. He is a Turing Award laureate. However, he considers himself “a rank amateur” (his words!) as a physicist. He’s one of my colleagues on whom I can *always* rely for a fun and interesting conversation, even if it is just for a fleeting moment during a chance encounter in an elevator. The other day he told me he’d been thinking a lot about quantum mechanics, and it seemed like it would be fun to share his thoughts with others here on the blog. So join in using the comment form if you’ve some thoughts of your own in response.

Here’s Len.

-cvj

_________________________________________________________________________________

For a long time, physicists have struggled with perplexing “meta-questions” (my phrase): Does God play dice with the universe? Does a theory of everything exist? Do parallel universes exist? As the physics community is acutely aware, these are extremely difficult questions and one may despair of ever finding meaningful answers. The mathematical community has had its own meta-questions that are no less daunting: What is “truth”? Do infinitesimals exist? Is there a single set of axioms from which all of mathematics can be derived? In what many consider to be on the short list of great intellectual achievements, Frege, Russell, Tarski, Turing, Gödel, and other logicians were able to clear away the fog and sort these questions out. The framework they created, mathematical logic, has put a foundation under mathematics, provided great insights, and produced profound results. After many years of consideration, I have come to believe that mathematical logic, suitably extended and modified (perhaps to include complexity theoretic ideas), has the potential to provide the same benefits to physics. In the following remarks, I will explore this possibility.

But, be warned: I am not a physicist and these ideas are embryonic. At best they indicate a possible direction; a fully functional theoretical framework, if possible at all, would be the work of lifetimes.

For most of my academic life, my primary topic of research (and affection) has been number theory. Number theory is the study of “the standard model of arithmetic”: the set {0,1,2,…} together with the operations of addition and multiplication. By addition and multiplication, I mean those functions computed by the algorithms we learned when we were little.
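The grade-school algorithms Len has in mind can be made concrete. Here is a minimal sketch (my illustration, not part of the post) of the carrying algorithm for addition, on numbers written as base-10 digit lists:

```python
# Grade-school addition with carrying, on base-10 digit lists
# written least-significant digit first (e.g. 47 is [7, 4]).

def add(a, b):
    """Add two numbers given as digit lists, least-significant digit first."""
    result, carry = [], 0
    for i in range(max(len(a), len(b))):
        da = a[i] if i < len(a) else 0
        db = b[i] if i < len(b) else 0
        carry, digit = divmod(da + db + carry, 10)  # carry out, digit to write
        result.append(digit)
    if carry:
        result.append(carry)
    return result

# 47 + 85 = 132, i.e. [7, 4] + [5, 8] = [2, 3, 1]
print(add([7, 4], [5, 8]))  # [2, 3, 1]
```

The multiplication algorithm we learned alongside it can be built the same way, as repeated shifted additions.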

Despite how simple the standard model may seem, finding out what is true about it is sometimes remarkably difficult. For example, only by building on over three centuries of prior research was Andrew Wiles able to establish that Fermat’s Last Theorem is true in the standard model.

In the 1930s, Gödel proved the famous incompleteness theorem. Gödel’s theorem shows that if you “know” a set K of statements (e.g., Giuseppe Peano’s axioms) which are true in the standard model, there must exist a statement S which cannot be proven true and cannot be proven false using K as axioms. Further, there is an infinite collection of “parallel models” (my phrase), in all of which the statements in K are true, but in infinitely many of which S is true, and in infinitely many of which S is false. Such an S is said to be independent of K, and you cannot use K to figure out whether the standard model is one of those parallel models where S is true or one of those parallel models where S is false.

Now, consider the following story of a physicist and a mathematician (no, they do not go into a bar).

Our physicist has designed an experiment during which the spin of an electron will be measured. He attempts to predict whether the outcome of the measurement will be “up” or the outcome of the measurement will be “down”. He knows many laws of our universe (e.g., quantum mechanics, relativity, etc.). The physicist may be able to use his “known laws” to deduce many things about the outcome of the experiment (e.g., perhaps he can use quantum mechanics to deduce that the probability of the experimental measurement producing an “up” is 0.5 and the probability of the experimental measurement producing a “down” is 0.5). But, unfortunately, the known laws do not allow him to deduce whether the outcome of his measurement will be “up” or the outcome of his measurement will be “down”.
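The physicist’s predicament can be put in a few lines of code. This sketch (mine, not the post’s; the function names are illustrative) simulates a Born-rule spin measurement: the probabilities are computable in advance, but no single outcome is.

```python
# Born-rule measurement of a spin: quantum mechanics predicts the
# probabilities of "up"/"down", but not the outcome of any single run.

import random

def measure_spin(amp_up, amp_down):
    """Measure a spin in state a|up> + b|down>; return 'up' or 'down'."""
    p_up = abs(amp_up) ** 2 / (abs(amp_up) ** 2 + abs(amp_down) ** 2)
    return "up" if random.random() < p_up else "down"

# Equal superposition: each outcome has probability 0.5.
amp = 1 / 2 ** 0.5
outcomes = [measure_spin(amp, amp) for _ in range(10000)]
print(outcomes.count("up") / 10000)  # close to 0.5, but each run is unpredictable
```

The simulation, of course, smuggles the indeterminacy in through a pseudorandom number generator; the interpretive question in the post is precisely what plays that role in nature.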

Though it may seem strange, it is very important to keep in mind that “known laws” and “laws” mean different things. Special relativity has (presumably) always been a law of our universe; however, in 1905, it became a “known law” of our universe.

Our mathematician has written an algorithm. He attempts to determine whether the algorithm halts on all inputs. He knows many true statements about the standard model (e.g., Peano’s axioms). The mathematician may be able to use his known truths to deduce many things about the algorithm. But, unfortunately, the known truths do not allow him to deduce whether his algorithm halts on all inputs.
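The mathematician’s predicament rests on Turing’s halting problem. As a hedged illustration (the names here are mine, not the post’s), the classic diagonal argument shows that no program can correctly decide halting for all programs:

```python
# Turing's diagonal argument: given any claimed halting decider
# halts(f, x), we can build a program that the decider must misjudge.

def build_contrarian(halts):
    """Given a claimed halting decider, build a program it gets wrong."""
    def contrarian(x):
        # Do the opposite of whatever the decider predicts about ourselves.
        if halts(contrarian, x):
            while True:   # decider said "halts", so loop forever
                pass
        return None       # decider said "loops", so halt immediately
    return contrarian

# Any concrete decider is wrong about its own contrarian. For example,
# a decider that always answers True is refuted, because its contrarian loops:
always_yes = lambda f, x: True
c = build_contrarian(always_yes)
# c(0) would loop forever, yet always_yes(c, 0) claims it halts.
print(always_yes(c, 0))  # True, but the claim is false
```

The same construction refutes a decider that always answers False: its contrarian simply halts. Since every candidate decider fails on some program, the mathematician’s known truths can never settle every halting question.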

How does our mathematician interpret his situation?

The mathematician considers the fact that the known truths do not allow him to deduce whether the algorithm halts on all inputs. He concludes that the statement “the algorithm halts on all inputs” is independent of the known truths. He realizes that there is an infinite collection of parallel models, in all of which all of the known truths are true, but in infinitely many of which the algorithm halts on all inputs, and in infinitely many of which there exists an input on which the algorithm does not halt. The mathematician’s difficulty is that despite all he knows, he does not know whether the standard model is one of the infinitely many in which the algorithm halts on all inputs or the infinitely many in which there is an input on which it does not halt.

The mathematician accepts that while he is having difficulty, the standard model is not. In the standard model either the algorithm halts on all inputs or there exists an input on which the algorithm does not halt. Whichever it is, it does not change over time and does not change because our mathematician has wondered about the question or even run the algorithm.

The mathematician views his difficulty as stemming from the inadequacy of the truths he knows. He is aware that Gödel’s results establish that neither he nor any future mathematician can ever know all of the truths about the standard model. Even if future generations of mathematicians are equipped with a bigger set of known truths, algorithms will still exist about which these future mathematicians will be unable to deduce whether or not they halt on all inputs. Essentially, the mathematician has come to understand that the object of his study is simply too complex to yield all of its truths to one with only the tools he has available.

What have mathematicians gained from this interpretation? A great deal: for example, it has provided them with answers to many of their meta-questions, including all those asked at the beginning of this note.

Now how does our physicist interpret his situation? The Copenhagen interpretation would have the physicist view the electron as existing in a superposition of “up”/“down” until the moment of measurement when its wavefunction would collapse and the result of the measurement would be determined. This interpretation places the physicist’s difficulty within our universe rather than in the inadequacy of the known laws.

Many other interpretations have been proposed, notably the Everett many-worlds interpretation, but I will reserve a discussion of them for another time.

But, what about a new interpretation based upon the mathematical one given above? I suspect that such an interpretation is possible, that it is distinct from those previously proposed, and that it would have profound implications. With such an interpretation, the physicist, like the mathematician before him, must accept that the tools he possesses for understanding the object of his study are insufficient to unlock all of its truths. With such an interpretation, the physicist’s lack of knowledge is not imposed upon the universe itself. The electron probability cloud is no longer seen as a picture from our universe, but rather as a diagram of our lack of knowledge.

–*Len Adleman*

**Acknowledgment:** I want to thank David Deutsch – a real physicist – for acting as a sounding board for some of these ideas. The remarks above are partially derived from a series of emails between us a few years ago. I also want to thank my students for helping me compose this note. Finally, thanks to Clifford Johnson for providing a forum.

_________________________________________________________________________________

I cannot relate to the mathematical theories as I have never found the inclination to devote sufficient time to maths… However, it is my understanding that quantum theory goes way beyond what we are able to perceive from our three dimensional perspective.

What I can comment on is derived from my life experiences, which have included beliefs (as opposed to knowledge) which at times in my life seemed very real to me from very different viewpoints (depending on what I believed to be true at the time). I think that mass consciousness is coming to the same understanding that whatever it is we believe in becomes our reality.

But we are also bounded by the illusion of time… So we look to our mathematical friends to find a way of explaining their theories in layman’s terms, whilst we continue to follow what we believe to be our own current truth.

I could go on… but I think I’ll send everyone to sleep (including myself).

Suzi.

[…] science. Leonard Adleman, of RSA and DNA computing fame, and also my mentor, has posted his essay on his views of interpreting Quantum Mechanics on Clifford Johnson’s blog. Coming from a […]

This sounds like motivation for the Conway-Kochen Free Will Theorem.

Just curious, Len, what you feel about Venn logic and how it might apply to your circumstance.

It is strange to me in one sense that logic can be so cold and emotionless while thinking about developing robotic features. That it can somehow evolve into a perfect and developed robot, with the intention of developing the perfect human being? :)

Also, more often than not, I have noticed the programming feature that SNO or LIGO needs in terms of looking through the data. How do you go about “writing that program” according to what you are looking for? In a way it seems that we are writing the results occurring in nature before they appear?

Best,

Good to hear from you, Len!

There is one key difference between algorithms and photons. The algorithm either terminates, or does not. There are no two ways about this. In contrast, it is not always possible to say that a photon is either in spin up or spin down. Quantum superposition is a different kind of beast. It is a flavor of uncertainty that we have not seen before.

What I would really like to see is how you deal with the phenomenology of the double slit experiment ( http://en.wikipedia.org/wiki/Double-slit_experiment ).

Manoj

“The mathematician accepts that while he is having difficulty, the standard model is not.” Lol.

It seems that Category theory is supposed to provide a foundation – better than set theory? Just wondering if there is something like the incompleteness theorem for Category theory, the way Gödel affected the Russell-Whitehead approach.

“Further, there is an infinite collection of “parallel models” (my phrase), in all of which the statements in K are true, but in infinitely many of which S is true, and in infinitely many of which S is false.”

Isn’t this a statement of religious faith?

Basically, you are positing the many-worlds (MW) interpretation of mathematical logic (L), analogous to the MW interpretation of quantum mechanics (QM). I bet not all mathematicians believe in this MW interpretation of L, just like not all believe in God. Believing in either of these MW interpretations does not allow you to predict any distinctively-MW measurements, by definition. So, why care? At least in religion, having faith may score you some points in the afterlife. What is the benefit of faith in an MW interpretation of L?

“I suspect that such an interpretation is possible, that it is distinct from those previously proposed, and that it would have profound implications.”

Good luck. The interpretation of QM you posit already exists, has existed at least since Everett’s PhD thesis in 1957, and hasn’t produced any distinctively-MW predictions, because, guess what, it’s not supposed to, by definition.

You may be interested in these papers following similar lines of thought:

arXiv:0811.4542

arXiv:0901.3327

See also …

‘Mathematical undecidability and quantum randomness’.

Authors: Tomasz Paterek, Johannes Kofler, Robert Prevedel, Peter Klimek, Markus Aspelmeyer, Anton Zeilinger, Caslav Brukner

http://arxiv.org/abs/0811.4542

Abstract: We propose a new link between mathematical undecidability and quantum physics. We demonstrate that the states of elementary quantum systems are capable of encoding mathematical axioms and show that quantum measurements are capable of revealing whether a given proposition is decidable or not within the axiomatic system. Whenever a mathematical proposition is undecidable within the axioms encoded in the state, the measurement associated with the proposition gives random outcomes. Our results support the view that quantum randomness is irreducible and a manifestation of mathematical undecidability.

Engineers appreciate that real-world measurement processes have a very large number of possible outcomes … and this leads to a very different way of teaching engineering students about the origins of quantum randomness.

In our quantum spin imaging experiments the number of possible experimental data records is (say) 2^(10^16) … this being the number of possible ways that photons can flow through the interferometer. This is such a large number of data records that (per the Kolmogorov-Chaitin definition of randomness), the set of non-random experimental data records necessarily has measure zero.

In short, any theoretical framework that allows for a large number of experimental outcomes—which is to say, any realistic theory—predicts randomness … and quantum theories are not special in this regard.

As for the uncertainty principle, that too can be explained using wholly classical language. Measurement processes (and equivalently, noise processes) concentrate quantum trajectories onto low-dimensional manifolds. Viewed as a dynamical pullback onto a Kählerian state-space, this concentration preserves the Lie invariance of the symplectic structure (which is why thermodynamics works for both classical and quantum systems), but not the Lie invariance of the metric structure.

Then, if we associate quantum operators with Berezin symbol functions, and associate each Berezin symbol with a Lie generator, the Lie commutators on the reduced-dimension state-space enforce all of the standard quantum limits and uncertainty principles.

This approach treats all processes—classical and quantum—as dynamical flows.

The bottom line: we teach quantum systems engineering students that real-world quantum systems have the same symplectic and metric dynamics as classical systems … and therefore, precisely the same spukhafte Fernwirkung (“spooky action at a distance”).

Whether this is true philosophically is beyond my competence. But this framework works pretty well for pedagogic and engineering purposes. 🙂

“The Copenhagen interpretation would have the physicist view the electron as existing in a superposition of “up”/“down” until the moment of measurement when its wavefunction would collapse and the result of the measurement would be determined.”

Ah, no. The Copenhagen interpretation wouldn’t talk about there being a particle, certainly not about there being a particle having any properties, except at the point of measurement. Physicists who claim to adhere to the Copenhagen interpretation might say so (almost everyone until 20 years ago), but I would say that linear superposition is a mathematical operation on quantum states, not a property of particles or particle properties.

Talk of collapse is almost ubiquitous, but the Copenhagen interpretation could be taken to include instrumental interpretations that do not need to invoke collapse as well as von Neumann-type interpretations that feel the need to do so.

I should clarify that applying computational complexity theory to QM is a fruitful endeavor that has already produced many new results. What I was questioning is your call for a “new interpretation” that is “distinct from those previously proposed, and that … would have profound implications.”

I’ve seen the idea of an analogy between quantum indeterminacy and incompleteness before. But it doesn’t strike me as a very good analogy. For one thing, quantum mechanics has a quantitative aspect that isn’t captured by this analogy at all: a measurement returns a given outcome with a definite probability that you can calculate, given knowledge of the state. For another, this analogy doesn’t even touch the measurement problem — e.g., the act of asking about an undecidable proposition doesn’t affect whether that proposition is true or false. So I’m left wondering whether new understanding is gained from this analogy, or whether it essentially amounts to “quantum mechanics is weird, incompleteness is also weird, therefore, maybe there’s a relationship between them.”

I think the best point of view for Gödel’s incompleteness theorem is from computability theory: the set of sentences of number theory that are true (in the standard model) is not decidable, or even semi-decidable. (In fact a lot more can be said about the complexity of the set of true sentences: it is omega “jumps” above decidability.)

The less I say about quantum mechanics, the better.

This is an interesting idea, and the folks at Quantum Pontiff are talking about it. Something to remember: math has to produce definite results itself (logical necessity!), and can’t generate actual random outcomes. (Not to be confused with “probability math” which tells us the various proportions of outcomes.) For that reason and others I reject the MUH (mathematical universe hypothesis), it from bit, modal realism, etc.

I have always been partial to an information-theoretic interpretation of quantum mechanics. To pose an analogy, it’s as though the information contained in a measurement of any phenomenon must be communicated over an information channel. That channel “bandwidth” is directly associated with the size of the phenomenon. We get the uncertainty for very small phenomena due to the lack of sufficient bandwidth to transmit enough information in a particular period of time to accurately measure position, momentum, etc. Hence the uncertainty.

Why I prefer this viewpoint is that it does not introduce the observer as having a special role, which always troubled me about other interpretations of QM.

I may very well be wrong. But I’ve been wrong before and when it comes to interpretations of QM, I don’t think anyone has effectively cornered this market.

e.

Elliot says:

“I have always been partial to an information-theoretic interpretation of quantum mechanics. To pose an analogy, it’s as though the information contained in a measurement of any phenomenon must be communicated over an information channel.”

Elliot, in quantum spin microscopy it is natural to elevate this principle from an analogy to a mathematical symmetry: “you observe the sample; the sample observes you.” The resulting quantitative design implications for von Neumann’s dream of comprehensive biomicroscopy are highly encouraging:

http://www.pnas.org/content/106/8/2477.extract

Elliot says: “Why I prefer this viewpoint is that it does not introduce the observer as having a special role which always troubled me about other interpretation of QM.”

This principle too can be elevated from an analogy to a mathematical symmetry; with reference to Nielsen and Chuang, the necessary identity is Theorem 8.2, named by them “unitary freedom in the operator-sum representation”.

In pullback form this informatically symmetric point-of-view yields a description of measurement (and noise) as Ito-Lindblad stochastic processes on Kähler state-spaces.

http://faculty.washington.edu/sidles/QSEPACK/Kavli/QSE_summary.pdf

This is a very natural extension of the symplectic framework of classical dynamics that was pioneered by (e.g.) Kolmogorov and Arnol’d; these symplectic ideas nowadays provide the mathematical foundations of radically new conformational biology tools (e.g., Rosetta, Anton).

There are many good ways to appreciate quantum mechanics; in quantum systems engineering we find that the “pullback” point-of-view is pedagogically compact, yields efficient recipes for computation and—most important of all!—links naturally to broad classes of wonderful “yellow book” mathematics.

Elliot, we still have the problem of what happens to the various elements of superpositions, why the photon “hits” one place and not another place, etc. Also, the popular treatment of decoherence is a dodge (see http://plato.stanford.edu/entries/qm-decoherence/ and my own link), and without “observers” we have evolving Schrödinger waves with “no place to go.”

Neil,

A photon hits where its information channel tells us it hits. That is the phenomenon. No observer necessary or required. The collapse of the wave through an observation is just reading the data stream at a point in time.

I admit it is unorthodox. But what about QM is not?

Call me crazy, but I believe that QM predates observers. How can you reconcile that with the Copenhagen interpretation?

e.

[…] QM & Gödel. Posted in Philosophy, Science by hadyba on September 24, 2009. Len Adleman seems to think that quantum physicists should take inspiration from the mathematicians and consider that […]

Do we selectively ignore other models from artificial intelligence, such as Zadeh’s Fuzzy Logic? This is a logic used to model perception and used in newly designed “smart” cameras. Where standard logic must give a true or false value to every proposition, fuzzy logic assigns a certainty value between zero and one to each proposition, so that we say a statement is .7 true and .3 false. Is this theory selectively ignored to support our theories? Using quantum interrogation, it seemed relevant when held in the context of Quanglement?

Best,

Thanks John,

Looks like I’ve got some reading/digesting to do.

e.

Elliot, the photon could click in either A or B etc., and there is no “logical preference” for one or the other AFAWK. There is nothing that we can imagine in the system, given an evolving Schrödinger wave, that would pick one versus the other – and break the logical symmetry. That is the problem, and comfortable-sounding platitudes or diverting phrasing (sorry) won’t solve any real problems about it. If you don’t need an “observer”, then nature has to be really weird to hash it out. Maybe a relational universe can handle it, but local realism is gone and a lost cause. Everyone needs to “get over it” and move on (in legitimate ways), whatever “on” happens to be.

Neil,

Can you address the issue of which came first Quantum Mechanics or observers?

e.

Elliot, no one, not even I 😉 knows whether we “need” observers or whether the universe and observers need each other to exist etc. (Wheeler played with a sort of interactively created universe.) The universe might well do just fine by itself, if it’s really real and not a Matrix type simulation (look that up, and “modal realism” etc.) But the key issue is, imagining the wave function as real and subject to normal travel time of causality, does not work. We can argue about the implications, how to solve it, what it says about other questions etc. – but that feature of the world should be admitted. And schemes like “decoherence”, and likely the information channel concept too, can’t resolve it. Maybe we just can’t understand – the universe has no (?) obligation to be comprehensible to human minds, does it?

Neil,

You are cleverly sidestepping the question. Let me rephrase it. Do you believe that the laws of quantum mechanics operated in the early universe prior to the existence of any intelligent information processing phenomenon that could be classified as an observer?

e.

Quantum Mechanics and Mathematical Logic is my passion too. What is more, I have found mathematical undecidability within the quantum formalism itself.

This derives from a logical excluded middle under the Field Axioms and relates to scalars whose logical statuses are distinct. Some scalars exist as theorems of the Field Axioms; others merely satisfy them. Model Theory proves the undecidability. It then propagates fully throughout a theoremology indicative of causelogy in Nature that explains the “causal anomalies” of Quantum Physics. Some details are in my blog.

If you are interested in the origins of indeterminacy in Quantum Mechanics, read my newly finished paper titled:

“The Mathematical Undecidability within Quantum Physics: The Origin of Indeterminacy and Mechanism of Decision at Measurement”

To get a copy click on the following link:

http://steviefaulkner.files.wordpress.com/2010/04/undecidability-in-qm_1034.pdf

[…] at Asymptotia, Len Adleman (the A in RSA, founder of DNA computation (but not the A in DNA!), and a discoverer of the APR […]