Tag Archives: foundations

Foundations Mailing Lists

Bob Coecke has recently set up an email mailing list for announcements in the foundations of quantum theory (conference announcements, job postings and the like). You can subscribe by sending a blank email to quantum-foundations-subscribe@maillist.ox.ac.uk. The mailing list is moderated so you will not get inundated by messages from cranks.

On a similar note, I thought I would mention the philosophy of physics mailing list, which has been going for about seven years and also often features announcements that are relevant to the foundations of quantum theory. Obviously, the focus is more on the philosophy side, but I have often heard about interesting conferences and workshops via this list.

Job/Course/Conference Announcements

Here are a few announcements that have arrived in my inbox in the past few days.

Perimeter Scholars International

Canada’s Perimeter Institute for Theoretical Physics (PI), in partnership with the University of Waterloo, welcomes applications to the Master’s level course, Perimeter Scholars International (PSI). Exceptional students with an undergraduate honours degree in Physics, Math, Engineering or Computer Science are encouraged to apply. Students must have a minimum of 3 upper level undergraduate or graduate courses in physics. PSI recruits a diverse group of students and especially encourages applications from qualified women candidates. The due date for applications to PSI is February 1st, 2011. Complete details are available at www.perimeterscholars.org.

Foundations Postdocs

Also a reminder that it is currently postdoc hiring season at Perimeter Institute. Although the deadline for applications has passed, they will always consider applications from qualified candidates if not all positions have been filled. Anyone looking for a postdoc in quantum foundations should definitely apply. In fact, if you are looking for a foundations job and you have not applied to PI then you must be quite mad, since there are not many foundations positions in physics to be had elsewhere. Details are here.

Quantum Interactions

I will admit that this next conference announcement is a little left-field, but some of the areas it covers are very interesting and worthwhile in my opinion, particularly the biological and artificial intelligence applications.

————————–

CALL FOR PAPERS

————————–

The Fifth International Symposium on Quantum Interaction (QI’2011, http://www.rgu.ac.uk/qi2011), 27-29 June 2011, Aberdeen, United Kingdom.

Quantum Interaction (QI) is an emerging field which is applying quantum theory (QT) to domains such as artificial intelligence, human language, cognition, information retrieval, biology, political science, economics, organisations and social interaction.

After highly successful previous meetings (QI’2007 at Stanford, QI’2008 at Oxford, QI’2009 at Saarbruecken, QI’2010 at Washington DC), the Fifth International Quantum Interaction Symposium will take place in Aberdeen, UK from 27 to 29 June 2011.

This symposium will bring together researchers interested in how QT addresses problems in non-quantum domains. QI’2011 will also include a half-day tutorial session on 26 June 2011, with a number of leading researchers delivering tutorials on the foundations of QT, the application of QT to human cognition and decision making, and QT-inspired semantic information processing.

***Call for Papers***

We are seeking submissions of high-quality and original research papers that have not been previously published and are not under review for another conference or journal. Papers should address one or more of the following broad content areas, although submissions are not limited to them:

– Artificial Intelligence (Logic, planning, agents and multi-agent systems)

– Biological or Complex Systems

– Cognition and Brain (memory, cognitive processes, neural networks, consciousness)

– Decision Theory (political, psychological, cultural, organisational, social sciences)

– Finance and Economics (decision-making, mergers, corporate cultures)

– Information Processing and Retrieval

– Language and Linguistics

The post-conference proceedings of QI’2011 will be published by Springer in its Lecture Notes in Computer Science (LNCS) series. Authors will be required to submit a final version 14 days after the conference to reflect the comments made at the conference. We will also consider organizing a special issue for a suitable journal to publish selected best papers.

***Important Dates***

28th March 2011: Abstract submission deadline

1st April 2011: Paper submission deadline

1st May 2011: Notification of acceptance

1st June 2011: Camera-Ready Copy

26th June 2011: Tutorial Session

27th – 29th June 2011: Conference

***Submission***

Authors are invited to submit research papers up to 12 pages. All submissions should be prepared in English using the LNCS template, which can be downloaded from http://www.springer.com/computer/lncs?SGWID=0-164-6-793341-0.

Please submit online at:

http://www.easychair.org/conferences/?conf=qi2011

***Organization***

Steering Committee:

Peter Bruza (Queensland University of Technology, Australia)

William Lawless (Paine College, USA)

Keith van Rijsbergen (University of Glasgow, UK)

Donald Sofge (Naval Research Laboratory, USA)

Dominic Widdows (Google, USA)

General Chair:

Dawei Song (Robert Gordon University, UK)

Programme Committee Chair:

Massimo Melucci (University of Padua, Italy)

Publicity Chair:

Sachi Arafat (University of Glasgow, UK)

Proceedings Chair:

Ingo Frommholz (University of Glasgow, UK)

Local Organization co-Chairs:

Jun Wang and Peng Zhang (Robert Gordon University, UK)

Quantum Foundations Meetings

Prompted in part by the Quantum Pontiff’s post about the APS March meeting, I thought it would be a good idea to post one of my extremely irregular lists of interesting conferences about the foundations of quantum theory that are coming up. A lot of my usual sources for this sort of information have become defunct in the couple of years I was away from work, so if anyone knows of any other interesting meetings then please post them in the comments.

  • March 21st-25th 2011: APS March Meeting (Dallas, Texas) – Includes a special session on Quantum Information For Quantum Foundations. Abstract submission deadline Nov. 19th.
  • April 29th-May 1st 2011: New Directions in the Foundations of Physics (Washington DC) – Always one of the highlights of the foundations calendar, but invite only.
  • May 2nd-6th 2011: 5th Feynman Festival (Brazil) – Includes foundations of quantum theory as one of its topics, but likely there will be more quantum information/computation talks. Registration deadline Feb. 1st, Abstract submission deadline Feb. 15th.
  • July 25th-30th 2011: Frontiers of Quantum and Mesoscopic Thermodynamics (Prague, Czech Republic) – Not strictly a quantum foundations conference, but there are a few foundations speakers and foundations of thermodynamics is interesting to many quantum foundations people.

Time Travel and Information Processing

Lately, the quant-ph section of the arXiv has seen a flurry of papers investigating what would happen to quantum information processing if time travel were possible (see the more recent papers here). I am not sure exactly why this topic has become fashionable, but it may well be an example of the Bennett effect in quantum information research. That is, a research topic can meander along slowly at its own pace for a few years until Charlie Bennett publishes an (often important) paper [1] on the subject, and then everyone is suddenly talking and writing about it for a couple of years. In any case, there have been a number of counter-intuitive claims that time travel enables quantum information processing to be souped up. Specifically, it supposedly enables super-hard computational problems in complexity classes larger than NP to be solved efficiently [2][3][4][5], and it supposedly allows nonorthogonal quantum states to be perfectly distinguished [6][7]. These claims are based on two different models for quantum time travel, one due to David Deutsch [8] and one, due to a multitude of independent authors, based on post-selected teleportation (the introduction of this paper [9] does a good job of recounting the history).

In this post, I am going to give a basic introduction to the physics of time travel. In later posts, I will explain the Deutsch and teleportation-based models and evaluate the information processing claims that have been made about them. What is most interesting to me about this whole topic is that the correct model for time-travelling quantum systems, and hence their information processing power, seems to depend sensitively on both the formalism and the interpretation of quantum theory that is adopted [10]. For this reason, it is a useful test-bed for ideas in quantum foundations.

Basic Concepts of Time-Travel

Everyone is familiar with the sensation of time-travel into the future. We all do it at a rate of one second per second every day of our lives. If you would like to speed up your rate of future time travel, relative to Earth, then all you have to do is take a space trip at a speed close to the speed of light. When you get back, a lot more time will have elapsed on Earth than you will have experienced on your journey. This is the time-dilation effect of special relativity. Therefore, the problem of time-travel into the future is completely solved in theory, although in practice you would need a vast source of energy in order to accelerate yourself fast enough to make the effect significant. It also causes no conceptual problems for physics, since we have a perfectly good framework for quantum theories that are compatible with special relativity, known as quantum field theory.
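For reference, the relevant special-relativistic relation (added here for illustration; it is not part of the original discussion) between the time \(\Delta t\) elapsed on Earth and the proper time \(\Delta \tau\) experienced by the traveller moving at speed \(v\) is

\[ \Delta t = \gamma \, \Delta \tau, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \]

so, for example, at \(v = 0.99c\) we have \(\gamma \approx 7\), and roughly seven years pass on Earth for every year experienced on the journey.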

On the other hand, time travel into the past is a much more tricky and conceptually interesting proposition. For one thing, it seems to entail time-travel paradoxes, such as the grandfather paradox where you go back in time and kill your grandfather before your parents were born, so that you are never born, so that you cannot go back in time and kill your grandfather, so that you are born, so that you can go back in time and kill your grandfather etc. (see this article for a philosophical and physics-based discussion of time travel paradoxes). For this reason, many physicists are highly sceptical of the idea that time travel into the past is possible. However, General Relativity (GR) provides a reason to temper our skepticism.

Closed Timelike Curves in GR

It has been well-known for a long time that GR admits solutions that include closed timelike curves (CTCs), i.e. world-lines that return to their starting point and loop around. If you happened to be travelling along a CTC then you would eventually end up in the past of where you started from. Actually, it is a bit more complicated than that because the usual notions of past and future do not really make sense on a CTC. However, imagine what it would look like to an observer in a part of the universe that respects causality in the usual sense. First of all, she would see you appear out of nowhere, claiming to have knowledge of events that she regards as being in the future. Some time later she would see you disappear out of existence. From her perspective it certainly looks like time-travel into the past. What things would feel like from your point of view is more of a mystery, as the notion of a CTC makes a mockery of our usual notion of “now”, i.e. it is a fundamentally block-universe construct.

The possibility of CTCs in GR was first noticed by Willem van Stockum in 1937 [11] and later by Kurt Gödel in 1949 [12]. Perhaps the most important solution that incorporates CTCs is the Kerr vacuum, which is the solution that describes an uncharged rotating black hole. Since most black holes in the universe are likely to be rotating, there is a sense in which one can say that CTCs are generic. The caveat is that the CTCs in the Kerr vacuum only occur in the interior of the black hole so that the physics outside the event horizon respects causality in the usual sense. Many physicists believe that the CTCs in the Kerr vacuum are mathematical artifacts, which will perhaps not occur in a full theory of quantum gravity. Nevertheless, the conceptual possibility of CTCs in General Relativity is a good reason to look at their physics more closely.

There have been a few attempts to look for solutions of GR that incorporate CTCs that a human being would actually be able to travel along without getting torn to pieces. This is a bit beyond my current knowledge, but, as far as I am aware, all such solutions involve large quantities of negative energy, so they are unlikely to exist in nature and it is unlikely that we can construct them artificially. For this reason, CTCs are currently more of a curiosity for foundationally inclined physicists like myself than they are a practical method of time-travel.

Other Retrocausal Effects in Physics

Apart from GR, other forms of backwards-in-time, or retrocausal, effect have been proposed in physics from time to time. For example, there is the Wheeler-Feynman absorber theory of electrodynamics, which postulates a backwards-in-time propagating field in addition to the usual forwards-in-time propagating field, and Feynman also postulated that positrons might be electrons travelling backwards in time. There is also Cramer’s transactional interpretation of quantum theory [13], which does a similar thing with quantum wavefunctions, and the distinct, but conceptually similar, two-state vector formalism of Aharonov and collaborators [14]. Finally, retrocausal influences have been suggested as a mechanism to reproduce the violations of Bell inequalities in quantum theory without the need for Lorentz-invariance-violating nonlocal influences [15].

However, none of these proposals is as compelling an argument for taking the physics of time travel into the past seriously as the existence of CTCs in General Relativity. This is because none of these theories provides a method for exploiting the retrocausal effect to actually travel back in time. Also, in each case, there is an alternative approach to the same phenomena that does not involve retrocausal influences. Nevertheless, it is possible that the models to be discussed have applications to these alternative approaches to physics.

Consistency Constraints and The Interpretation of Quantum Theory

Any viable theory of time travel into the past has to rule out things like the grandfather paradox. Consistency conditions have to be imposed on any physical model so that time travel cannot be used to alter the past. This raises interesting questions about free will, e.g. what exactly stops someone from freely deciding to pull the trigger on their grandfather? Whilst these questions are philosophically interesting, physicists are more inclined to just lay out the mathematics of consistency and see what it leads to. The different models of quantum time travel are essentially just different methods of imposing this sort of consistency constraint on quantum systems.

That is pretty much it for the basic introduction, but I want to leave you with a quick thought experiment to illustrate the sort of quantum foundational issues that come up when considering time-travel into the past. Suppose you prepare a spin-\(\frac{1}{2}\) particle in a spin up state in the z direction and then measure it in the x direction, so that it has a 50-50 chance of giving the spin up or spin down outcome. After observing the outcome you jump onto a CTC, travel back into the past and watch yourself perform the experiment again. The question is, would you see the experiment have the same outcome the second time around?
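For concreteness (a standard textbook calculation, added here for reference), the 50-50 statistics follow from expanding the prepared state in the measurement basis:

\[ |{\uparrow_z}\rangle = \tfrac{1}{\sqrt{2}}\left(|{\uparrow_x}\rangle + |{\downarrow_x}\rangle\right), \qquad P(\uparrow_x) = P(\downarrow_x) = \left|\tfrac{1}{\sqrt{2}}\right|^2 = \tfrac{1}{2}. \]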

A consistency condition for time travel has to say something like “the full ontic state (the state of the things that exist in reality) of the universe must be the same the second time round as it was the first time round”, albeit that your subjective position within it has changed. If you believe, as many-worlds supporters do, that the quantum wavefunction is the complete description of reality then it, and only it, must be the same the second time around. Therefore, it must be the case that the probabilities are still 50-50 and you could see either outcome. This is not inconsistent, because the many-worlds supporters believe that both outcomes happened the first time round in any case. If you are a Bohmian then the ontic state includes the positions of all particles in addition to the wavefunction and these, taken together, can be used to determine the outcome of the experiment uniquely. Therefore, a Bohmian must believe that the measurement outcome has to be the same the second time around. Finally, if you are some sort of anti-realist neo-Copenhagen type then it is not clear exactly what you believe, but, then again, it is not clear exactly what you believe even when there is no time travel.

There are some subtleties in these arguments. For example, it is not clear what happens to the correlations between you and the observed system when you go around the causal loop. If they still exist then this may restrict the ability of the earlier version of you to prepare a pure state. On the other hand, perhaps they get wiped out or perhaps your memory of the outcome gets wiped. The different models for the quantum physics of CTCs differ on how they handle this sort of issue, and this is what I will be looking at in future posts. If you have travelled along a CTC and happen to have brought a copy of these future posts with you then I would be very grateful if you could email them to me because that would be much easier for me than actually writing them.

‘Till next time!

References

  1. Bennett, C. H. et al. (2009). “Can closed timelike curves or nonlinear quantum mechanics improve quantum state discrimination or help solve hard problems?”. Phys. Rev. Lett. 103:170502. eprint arXiv:0908.3023.
  2. Brun, T. A. and Wilde, M. M. (2010). “Perfect state distinguishability and computational speedups with postselected closed timelike curves”. eprint arXiv:1008.0433.
  3. Aaronson, S. and Watrous, J. (2009). “Closed timelike curves make quantum and classical computing equivalent”. Proc. R. Soc. A 465:631-647. eprint arXiv:0808.2669.
  4. Bacon, D. (2004). “Quantum Computational Complexity in the Presence of Closed Timelike Curves”. Phys. Rev. A 70:032309. eprint arXiv:quant-ph/0309189.
  5. Brun, T. A. (2003). “Computers with closed timelike curves can solve hard problems”. Found. Phys. Lett. 16:245-253. eprint arXiv:gr-qc/0209061.
  6. Brun, T. A. and Wilde, M. M. (2010). “Perfect state distinguishability and computational speedups with postselected closed timelike curves”. eprint arXiv:1008.0433.
  7. Brun, T. A., Harrington, J. and Wilde, M. M. (2009). “Localized closed timelike curves can perfectly distinguish quantum states”. Phys. Rev. Lett. 102:210402. eprint arXiv:0811.1209.
  8. Deutsch, D. (1991). “Quantum mechanics near closed timelike lines”. Phys. Rev. D 44:3197-3217.
  9. Lloyd, S. et al. (2010). “The quantum mechanics of time travel through post-selected teleportation”. eprint arXiv:1007.2615.
  10. I should mention that Joseph Fitzsimons (@jfitzsimons) disagreed with this statement in our Twitter conversations on this subject, and no doubt many physicists would too, but I hope to convince you that it is correct by the end of this series of posts.
  11. van Stockum, W. J. (1937). “The gravitational field of a distribution of particles rotating around an axis of symmetry”. Proc. Roy. Soc. Edinburgh A 57:135.
  12. Gödel, K. (1949). “An Example of a New Type of Cosmological Solution of Einstein’s Field Equations of Gravitation”. Rev. Mod. Phys. 21:447.
  13. Cramer, J. G. (1986). “The transactional interpretation of quantum mechanics”. Rev. Mod. Phys. 58:647-687.
  14. Aharonov, Y. and Vaidman, L. (2001). “The Two-State Vector Formalism of Quantum Mechanics: An Updated Review”. In Muga, J. G., Sala Mayato, R. and Egusquiza, I. L. (eds.), “Time in Quantum Mechanics”. eprint arXiv:quant-ph/0105101.
  15. For example, see Price, H. (1997). “Time’s Arrow and Archimedes’ Point”. OUP.

Quantum Foundations Resources

Since I get asked a lot, I have added a collection of links to resources on quantum foundations to the About page.  Any suggestions for additions will be gratefully received, especially if you know of any good quality popular talks that can be viewed online.

P.S.  In case you were thinking of asking, neither “The Tao of Physics” nor “What The Bleep Do We Know?” is ever going to be added.

Baez on Quantum Foundations

I just wrote another post on the fqxi site, but to cut a long story short it gives a link to the latest “This Week’s Finds..” on quantum foundations.

Foundations at APS, take 2

It doesn’t seem that a year has gone by since I wrote about the first sessions on quantum foundations organized by the topical group on quantum information, concepts and computation at the APS March meeting. Nevertheless it has, and I am here in Denver after possibly the longest day of continuous sitting through talks in my life. I arrived at 8am to chair the session on Quantum Limited Measurements, which was interesting, but readers of this blog won’t want to hear about such practical matters, so instead I’ll spill the beans on the two foundations sessions that followed.

In the first foundations session, things got off to a good start with Rob Spekkens as the invited speaker explaining to us once again why quantum states are states of knowledge. OK, I’m biased because he’s a collaborator, but he did throw us a new tidbit on how to make an analog of the Elitzur-Vaidman bomb experiment in his toy theory by constructing a version for field theory.

Next, there was a talk by some complete crackpot called Matt Leifer. He talked about this.

Frank Schroeck gave an overview of his formulation of quantum mechanics on phase space, which did pique my interest, but 10 minutes was really too short to do it justice. Someday I’ll read his book.

Chris Fuchs gave a talk which was surprisingly not the same as his usual quantum Bayesian propaganda speech. It contained some new results about Symmetric Informationally Complete POVMs, including the fact that the states the POVM elements are proportional to are minimum uncertainty states with respect to mutually unbiased bases. This should be hitting an arXiv near you very soon.

Caslav Brukner talked about his recent work on the emergence of classicality via coarse graining. I’ve mentioned it before on this blog, and it’s definitely a topic I’m becoming much more interested in.

Later on, Jeff Tollaksen talked about generalizing a theorem proved by Rob Spekkens and myself about pre- and post-selected quantum systems to the case of weak measurements. I’m not sure I agree with the particular spin he gives on it, especially his idea of “quantum contextuality”, but you can decide for yourself by reading this.

Jan-Åke Larsson gave a very comprehensible talk about a “loophole” (he prefers the term “experimental problem”) in Bell inequality tests to do with coincidence times of photon detection. You can deal with it by having a detection efficiency just a few percent higher than that needed to overcome the detection loophole. Read all about it here.

Most of the rest of the talks in this session were more quantum information oriented, but I suppose you can argue they were at the foundational end of quantum information. Animesh Datta talked about the role of entanglement in the Knill-Laflamme model of quantum computation with one pure qubit; Anil Shaji talked about using easily computable entanglement measures to put bounds on those that aren’t so easy to compute; and finally Ian Durham made some interesting observations about the connections between entropy, information and Bell inequalities.

The second foundations session was more of a mixed bag, but let me just mention a couple of the talks that appealed to me. Marcello Sarandy and Alioscia Hamma talked about generalizing the quantum adiabatic theorem to open systems, where you don’t necessarily have a Hamiltonian with well-defined eigenstates to talk about, and Kicheon Kang talked about a proposal for a quantum eraser experiment with electrons.

On Tuesday, Bill Wootters won a prize for best research at an undergraduate teaching college. He gave a great talk about his discrete Wigner functions, which included some new stuff about minimum uncertainty states and analogs of coherent states.

That’s pretty much it for the foundations talks at APS this year. It’s all quantum information from here on in. That is unless you count Zeilinger, who is talking on Thursday. He’s supposed to be talking about quantum cryptography, but perhaps he will say something about the more foundationy experiments going on in his lab as well.

What can decoherence do for us?

OK, so it’s time for the promised post about decoherence, but where to begin? Decoherence theory is now a vast subject with an enormous literature covering a wide variety of physical systems and scenarios. I will not deal with everything here, but just make some comments on how the theory looks from my point of view about the foundations of quantum theory. Alexei Grinbaum pointed me to a review article by Maximilian Schlosshauer on the role of decoherence in solving the measurement problem and in interpretations of quantum theory. That’s a good entry into the literature for people who want to know more.

OK, let me start by defining two problems that I take to be at the heart of understanding quantum theory:

1) The Emergence of Classicality: Our most fundamental theories of the world are quantum mechanical, but the world appears classical to us at the everyday level. Explain why we do not find ourselves making mistakes in using classical theories to make predictions about the everyday world of experience. By this I mean not only classical dynamics, but also classical probability theory, information theory, computer science, etc.

2) The ontology problem: The mathematical formalism of quantum theory provides an algorithm for computing the probabilities of outcomes of measurements made in experiments. Explain what things exist in reality and what laws they obey in such a way as to account for the correctness of the predictions of the theory.

I take these to be the fundamental challenges of understanding quantum mechanics. You will note that I did not mention the measurement problem, Schroedinger’s cat, or the other conventional ways of expressing the foundational challenges of quantum theory. This is because, as I have argued before, these problems are not interpretation neutral. Instead, one begins with something like the orthodox interpretation and shows that unitary evolution and the measurement postulates are in apparent conflict within that interpretation depending on whether we choose to view the measuring apparatus as a physical system obeying quantum theory or to leave it unanalysed. The problems with this are twofold:

i) It is not the case that we cannot solve the measurement problem. Several solutions exist, such as the account given by Bohmian mechanics, that of Everett/many-worlds, etc. The fact that there is more than one solution, and that none of them have been found to be universally compelling, indicates that it is not solving the measurement problem per se that is the issue. You could say that it is solving the measurement problem in a compelling way that is the issue, but I would say it is better to formulate the problem in such a way that it is obvious how it applies to each of the different interpretations.

ii) The standard way of describing the problems essentially assumes that the quantum state-vector corresponds more or less directly to whatever exists in reality, and that it is in fact all that exists in reality. This is an assumption of the orthodox interpretation, so we are talking about a problem with the standard interpretation and not with quantum theory itself. Assuming the reality of the state-vector simply begs the question. What if it does not correspond to an element of reality, but is just an epistemic object with a status akin to a probability distribution in classical theories? This is an idea that I favor, but now is not the time to go into detailed arguments for it. The mere fact that it is a possibility, and is taken seriously by a significant section of the foundations community, means that we should try to formulate the problems in a language that is independent of the ontological status of the state-vector.

Given this background viewpoint, we can now ask to what extent decoherence can help us with 1) and 2), i.e. the emergence and ontology problems. Let me begin with a very short description of what decoherence is in this context. The first point is that it takes seriously the idea that quantum systems, particularly the sort that we usually describe as “classical”, are open, i.e. interact strongly with a large environment. Correlations between system and environment are typically established very quickly in some particular basis, determined by the form of the system-environment interaction Hamiltonian, so that the density matrix of the system quickly becomes diagonal in that basis. Furthermore, the basis in which the correlations exist is stable over a very long period of time, which can typically be much longer than the lifetime of the universe. Finally, for many realistic Hamiltonians and a wide variety of systems, the decoherence basis corresponds very well to the kind of states we actually observe.
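As a rough numerical illustration of this diagonalization (a minimal sketch, not taken from any particular decoherence model in the literature; the controlled-rotation coupling, the angle and the qubit numbers are assumptions made purely for illustration), the following Python snippet couples a single system qubit, initially in an equal superposition, to a growing number of environment qubits and then traces the environment out:

import numpy as np

def controlled_ry(theta):
    """4x4 unitary: rotate the target qubit by theta only when the control qubit is |1>."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    u = np.eye(4, dtype=complex)
    u[2:, 2:] = [[c, -s], [s, c]]
    return u

def reduced_system_state(n_env, theta):
    """Couple the system qubit to n_env environment qubits, then trace the environment out."""
    # System starts in (|0> + |1>)/sqrt(2); each environment qubit starts in |0>.
    psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
    for _ in range(n_env):
        psi = np.kron(psi, np.array([1, 0], dtype=complex))  # append a fresh |0> environment qubit
        dim = psi.size
        # View the state as (system, everything else, newest qubit) and apply the coupling
        # between the system qubit (control) and the newest environment qubit (target).
        psi = psi.reshape(2, dim // 4, 2)
        gate = controlled_ry(theta).reshape(2, 2, 2, 2)  # indices: (c_out, t_out, c_in, t_in)
        psi = np.einsum('abcd,cmd->amb', gate, psi).reshape(dim)
    # Reduced density matrix of the system: sum over all environment configurations.
    psi = psi.reshape(2, -1)
    return psi @ psi.conj().T

for n in [0, 1, 5, 20]:
    rho = reduced_system_state(n, theta=1.0)
    print(f"environment qubits: {n:2d}   |rho_01| = {abs(rho[0, 1]):.4f}")

With these assumed parameters, the printed off-diagonal element \(|\rho_{01}|\) falls from 0.5 (no environment) towards zero as more environment qubits become correlated with the system, while the diagonal entries are unchanged; this is the sense in which the system’s density matrix “becomes diagonal” in the basis singled out by the interaction.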

From my point of view, the short answer to the role of decoherence in foundations is that it provides a good framework for addressing emergence, but has almost nothing to say about ontology.  The reason for saying that should be clear:  we have a good correspondence with our observations, but at no point in my description of decoherence did I find it necessary to mention a reality underlying quantum mechanics.  Having said that, a couple of caveats are in order. Firstly, decoherence can do much more if it is placed within a framework with a well defined ontology. For example, in Everett/many-worlds, the ontology is the state-vector, which always evolves unitarily and never collapses. The trouble with this is that the ontology doesn’t correspond to our subjective experience, so we need to supplement it with some account of why we see collapses, definite measurement outcomes, etc. Decoherence theory does a pretty good job of this by providing us with rules to describe this subjective experience, i.e. we will experience the world relative to the basis that decoherence theory picks out. However, the point here is that the work is not being done by decoherence alone, as claimed by some physicists, but also by a nontrivial ontological assumption about the state-vector. As I remarked earlier, the latter is itself a point of contention, so it is clear that decoherence alone is not providing a complete solution.

The second caveat is that some people, including Max Schlosshauer in his review, would argue that we can plausibly deny the need to answer the ontology question at all. So long as we can account for our subjective experience in a compelling manner, why should we demand any more of our theories? The idea is then that decoherence can solve the emergence problem, and then we are done because the ontology problem need not be solved at all. One could argue for this position, but to do so is thoroughly wrongheaded in my opinion, and this is so independently of my conviction that physics is about trying to describe a reality that goes beyond subjective experience. The simple point is that someone who takes this view seriously really has no need for decoherence theory at all. Firstly, given that we are not assigning ontological status to anything, let alone the state-vector, you are free to collapse it, uncollapse it, evolve it, swing it around your head or do anything else you like with it. After all, if it is not supposed to represent anything existing in reality then there need not be any physical consequences for reality of any mathematical manipulation, such as a projection, that you might care to do. The second point is that if we are prepared to give a privileged status to observers in our physical theories, by saying that physics needs to describe their experience and nothing more, then we can simply say that the collapse is a subjective property of the observer’s experience and leave it at that. We already have privileged systems in our theory on this view, so what extra harm could that do?

Of course, I don’t subscribe to this viewpoint myself, but on both views described so far, decoherence theory either needs to be supplemented with an ontology, or is not needed at all for addressing foundational issues.

Finally, I want to make a couple of comments about how odd the decoherence solution looks from my particular point of view as a believer in the epistemic nature of wavefunctions. The first is that, from this point of view, the decoherence solution appears to have things backwards. When constructing a classical probabilistic theory, we first identify the ontological entities, e.g. particles that have definite trajectories, and describe their dynamics, e.g. Hamilton’s equations. Only then do we introduce probabilities and derive the corresponding probabilistic theory, e.g. Liouville mechanics. Decoherence theory does things in the other direction, starting from Schroedinger mechanics and then seeking to define the states of reality in terms of the probabilistic object, i.e. the state-vector. Whilst this is not obviously incorrect, since we don’t necessarily have to do things the same way in classical and quantum theories, it does seem a little perverse from my point of view. I’d rather start with an ontology and derive the fact that the state-vector is a good mathematical object for making probabilistic predictions, instead of the other way round.

The second comment concerns an analogy between the emergence of classicality in QM and the emergence of the second law of thermodynamics from statistical mechanics. For the latter, we have a multitude of conceptually different approaches, which all arrive at somewhat similar results from a practical point of view. For a state-vector epistemist like myself, the interventionist approach to statistical mechanics seems very similar to the decoherence approach to the emergence problem in QM. Both say that the respective problems cannot be solved by looking at a closed Hamiltonian system, but only by considering interaction with a somewhat uncontrollable environment. In the case of stat-mech, this is used to explain the statistical fluctuations observed in what would be an otherwise deterministic system. The introduction of correlations between system and environment is the mechanism behind both processes. Somewhat bizarrely, most physicists currently prefer closed-system approaches to the derivation of the second law, based on coarse-graining, but prefer the decoherence approach when it comes to the emergence of classicality from quantum theory. Closed-system approaches have the advantage of being applicable to the universe as a whole, where there is no external environment to rely on. However, apart from special cases like this, one can broadly say that the two types of approach are complementary for stat-mech, and neither has a monopoly on explaining the second law. It is then natural to ask whether closed-system approaches to emergence in QM are available making use of coarse graining, and whether they ought to be given equal weight to the decoherence explanation. Indeed, such arguments have been given – here is a recent example, which has many precursors too numerous to go through in detail. I myself am thinking about a similar kind of approach at the moment. Right now, such arguments have a disadvantage over decoherence in that the “measurement basis” has to be put in by hand, rather than emerging from the physics as in decoherence. However, further work is needed to determine whether this is an insurmountable obstacle.

In conclusion, decoherence theory has done a lot for our understanding of the emergence of classicality from quantum theory. However, it does not solve all the foundational questions about quantum theory, at least not on its own. Further, its importance may have been overemphasized by the physics community, because other less-developed approaches to emergence could turn out to be of equal importance.

Steane Roller

Earlier, I promised some discussion of Andrew Steane’s new paper: Context, spacetime loops, and the interpretation of quantum mechanics. Whilst it is impossible to summarize everything in the paper, I can give a short description of what I think are the most important points.

  • Firstly, he does believe that the whole universe obeys the laws of quantum mechanics, which are not required to be generalized.
  • Secondly, he does not think that Everett/Many-Worlds is a good way to go because it doesn’t give a well-defined rule for when we see one particular outcome of a measurement in one particular basis.
  • He believes that collapse is a real phenomenon and so the problem is to come up with a rule for assigning a basis in which the wavefunction collapses, as well as, roughly speaking, a spacetime location at which it occurs.
  • For now, he describes collapse as an unanalysed, fundamentally stochastic process that achieves this, but he recognizes that it might be useful to come up with a more detailed mechanism by which this occurs.

Steane’s problem therefore reduces to picking a basis and a spacetime location. For the former, he uses the standard ideas from decoherence theory, i.e. the basis in which collapse occurs is the basis in which the reduced state of the system is diagonal. However, the location of collapse is what is really novel about the proposal, and it makes the proposal more interesting and more bizarre than most of the others I have seen so far.

Firstly, note that the process of collapse destroys the phase information between the system and the environment. Therefore, if the environmental degrees of freedom could ever be gathered together and re-interacted with the system, then QM would predict interference effects that would not be present if a genuine collapse had occurred. Since Steane believes in the universal validity of QM, he has to come up with a way of having a genuine collapse without getting into a contradiction with this possibility.

His first innovation is to assert that the collapse need not be associated to an exactly precise location in spacetime. Instead, it can be a function of what is going on in a larger region of spacetime. Presumably, for events that we would normally regard as “classical” this region is supposed to be rather small, but for coherent evolutions it could be quite large.

The rule is easiest to state for special cases, so for now we will assume that we are talking about particles with a discrete quantum degree of freedom, e.g. spin, but that the position and momentum can be treated classically. Now, suppose we have 3 qubits and that they are in the state \(\tfrac{1}{\sqrt{2}}\left(|000\rangle + e^{i\phi}|111\rangle\right)\). The state of the first two qubits is a density operator, diagonal in the basis \(\{|00\rangle, |11\rangle\}\), with a probability 1/2 for each of the two states. The phase \(e^{i\phi}\) will only ever be detectable if the third qubit re-interacts with the first two. Whether or not this can happen is determined by the relative locations of the qubits, since the interaction Hamiltonians in nature are local. Since we are treating position and momentum classically at the moment, there is a matter of fact about whether this will occur, and Steane’s rule is simple: if the qubits re-interact in the future then there is no collapse, but if they don’t then the first two qubits have collapsed into the state \(|00\rangle\) or the state \(|11\rangle\) with probability 1/2 for each one.
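To make the claim about the reduced state explicit (a standard partial-trace computation, added here for illustration), tracing out the third qubit gives

\[ \rho_{12} = \mathrm{Tr}_3\!\left[\tfrac{1}{2}\left(|000\rangle + e^{i\phi}|111\rangle\right)\left(\langle 000| + e^{-i\phi}\langle 111|\right)\right] = \tfrac{1}{2}\left(|00\rangle\langle 00| + |11\rangle\langle 11|\right), \]

since the cross terms pick up a factor of \(\langle 0|1\rangle = 0\) from the third qubit. The phase \(\phi\) therefore drops out of the first two qubits entirely unless the third qubit is brought back to interact with them.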

Things are going to get more complicated if we quantize the position and momentum, or indeed if we move to quantum field theory, since then we don’t have definite particle trajectories to work with. It is not entirely clear to me whether Steane’s proposal can be made to work in the general case, and he does admit that further technical work is needed. However, he still asserts that whether or not a system has collapsed at a given point in spacetime is in principle a function of its entire future, i.e. whether or not it will eventually re-interact with the environment it has decohered with respect to.

At this point, I want to highlight a bizarre physical prediction that can be made if you believe Steane’s point of view. Really, it is metaphysics, since the experiment is not at all practical. For starters, the fact that I experience myself being in a definite state rather than a superposition means that there are environmental degrees of freedom that I have interacted with in the past that have decohered me into a particular basis. We can in principle imagine an omnipotent “Maxwell’s demon” type character, who can collect up every degree of freedom I have ever interacted with, bring it all together and reverse the evolution, eliminating me in the process. Whilst this is impractical, there is nothing in principle to stop it happening if we believe that QM applies to the entire universe. However, according to Steane, the very fact that I have a definite experience means that we can predict with certainty that no such interaction happens in the future. If it did, there would be no basis for my definite experience at the moment.

Contrast this with a many-worlds account a la David Wallace. There, the entire global wavefunction still exists, and the fact that I experience the world in a particular basis is due to the fact that only certain special bases, the ones in which decoherence occurs, are capable of supporting systems complex enough to achieve consciousness. There is nothing in this view to rule out the Maxwell’s demon conclusively, although we may note that he is very unlikely to be generated by a natural process due to the second law of thermodynamics.

Therefore, there is something comforting about Steane’s proposal. If true, my very existence can be used to infer that I will never be wiped out by a Maxwell’s demon. All we need to do to test the theory is to try and wipe out a conscious being by constructing such a demon, which is obviously impractical and also unethical. Needless to say, there is something troubling about drawing such a strong metaphysical conclusion from quantum theory, which is why I still prefer the many-worlds account over Steane’s proposal at the moment. (That’s not to say that I agree with the former either though.)

Against Interpretation

It appears that I haven’t had a good rant on this blog for some time, but I have been stimulated into doing so by some of the discussion following the Quantum Pontiff‘s recent post about Bohmian Mechanics. I don’t want to talk about Bohm theory in particular, but to answer the following general question:

  • Just what is the goal of studying the foundations of quantum mechanics?

Before answering this question, note that its answer depends on whether you are approaching it as a physicist, mathematician, philosopher, or religious crank trying to seek justification for your outlandish worldview. I’m approaching the question as a physicist and to a lesser extent as a mathematician, but philosophers may have legitimate alternative answers. Since the current increase of interest in foundations is primarily amongst physicists and mathematicians, this seems like a natural viewpoint to take.

Let me begin by stating some common answers to the question:

1. To provide an interpretation of quantum theory, consistent with all its possible predictions, but free of the conceptual problems associated with orthodox and Copenhagen interpretations.

2. To discover a successor to quantum theory, consistent with the empirical facts known to date, but making new predictions in untested regimes as well as resolving the conceptual difficulties.

Now, let me give my proposed answer:

  • To provide a clear path for the future development of physics, and possibly to take a few steps along that path.

To me, this statement applies to the study of the foundations of any physical theory, not just quantum mechanics, and the success of the strategy has been borne out in practice. For example, consider thermodynamics. The earliest complete statements of the principles of thermodynamics were in terms of heat engines. If you wanted to apply the theory to some physical system, you first had to work out how to think of it as a kind of heat engine before you started. This was often possible, but a rather unnatural thing to do in many cases. The introduction of the concept of entropy eliminated the need to talk about heat engines and allowed the theory to be applied to virtually any macroscopic system. Further, it facilitated the discovery of statistical mechanics. The formulation in terms of entropy is formally mathematically equivalent to the earlier formulations, and thus it might be thought superfluous to requirements, but in hindsight it is abundantly clear that it was the best way of looking at things for the progress of physics.

Let’s accept my answer to the foundational question for now and examine what becomes of the earlier answers. I think it is clear that answer 2 is consistent with my proposal, and is a legitimate task for a physicist to undertake. For those who wish to take that road, I wish you the best of luck. On the other hand, answer 1 is problematic.

Earlier, I wrote a post about criteria that a good interpretation should satisfy. Now I would like to take a step back from that and urge the banishment of the word interpretation entirely. The problem with 1 is that it ring-fences the experimental predictions of quantum theory, so that the foundational debate has no impact on them at all. This is the antithesis of the approach I advocate, since on my view foundational studies are supposed to feed back into improved practice of the theory. I think that the separation of foundations and practice did serve a useful role in the historical development of quantum theory, since rapid progress required focussing attention on practical matters, and the time was not ripe for detailed foundational investigations. For one thing, experiments that probe the weirder aspects of quantum theory were not possible until the last couple of decades. It can also serve a useful role for a subsection of the philosophy community, who may wish to focus on interpretation without having to keep track of modern developments in the physics. However, the view is simply a hangover from an earlier age, and should be abandoned as quickly as possible. It is a debate that can never be resolved, since how can physicists be convinced to adopt one interpretation over another if it makes no difference at all to how they understand the phenomenology of the theory?

On the other hand, if one looks closely it is evident that many “interpretations” that are supposedly of this type are not mere interpretations at all. For example, although Bohmian Mechanics is equivalent to standard quantum theory in its predictions, it immediately suggests a generalization to a “nonequilibrium” hidden variable theory, which would make new predictions not possible within the standard theory. Similar remarks can be made about other interpretations. For example, many-worlds, despite not being a favorite of mine, does suggest that it is perfectly fine to apply standard quantum theory to the entire universe. In Copenhagen this is not possible in any straightforward way, since there is always supposed to be a “classical” world out there at some level, to which the state of the quantum system is referred. In short, the distinction between “the physics” and “the interpretation” often disappears on close inspection, so we are better off abandoning the word “interpretation” and instead viewing the project as providing alternative frameworks for the future progress of physics.

Finally, the more observant amongst you will have noticed that I did not include “solving the measurement problem” as a possible major goal of quantum foundations, despite its frequent appearance in this context. Deconstructing the measurement problem requires its own special rant, so I’m saving it for a future occasion.