Category Archives: Quantum Quandaries

Teaching Quantum Theory

The recent article by Chandralekha Singh, Mario Belloni and Wolfgang Christian on students’ understanding of quantum mechanics in Physics Today provoked an interesting series of letters in response. Both Robert Griffiths and Travis Norsen argue that students’ understanding would be improved by replacing the usual Copenhagen/orthodox dogma with a discussion of some more recent developments in the foundations of quantum theory.

Given that I don’t actually have much experience teaching quantum theory (I have only covered a lecturer’s absence for two lectures) it is perhaps a bit presumptuous for me to contribute my thoughts on this topic. Nevertheless, I agree wholeheartedly with the basic sentiment of both these letters. At least some of the misconceptions that Singh, Belloni and Christian have written about could be remedied by a bit more foundational discussion at the ground level. For example, I think the common misconception that stationary states are the only allowed states of a quantum system could be dispelled by a deeper discussion of the sense in which quantum theory is analogous to classical probability theory.
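To make that misconception concrete, here is a minimal illustration (my own example, not one from the article or the letters): a superposition of two stationary states is a perfectly valid state, and it evolves nontrivially in time,

\[
|\psi(t)\rangle = c_1 e^{-iE_1 t/\hbar}|E_1\rangle + c_2 e^{-iE_2 t/\hbar}|E_2\rangle ,
\]

so the expectation value of any observable \(\hat{A}\), with matrix elements \(A_{jk} = \langle E_j|\hat{A}|E_k\rangle\), oscillates at the Bohr frequency:

\[
\langle \hat{A} \rangle_t = |c_1|^2 A_{11} + |c_2|^2 A_{22} + 2\,\mathrm{Re}\!\left[ c_1^* c_2\, e^{i(E_1 - E_2)t/\hbar} A_{12} \right] .
\]

The analogy with classical probability is that a distribution does not have to be concentrated on a single point to count as a legitimate state; likewise, a quantum state does not have to be an energy eigenstate to be an allowed state of the system.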

However, I think both Griffiths and Norsen make a mistake in the approaches they advocate in their letters. Griffiths suggests replacing the orthodoxy with his own favored approach, namely decoherent/consistent histories, and Norsen thinks we should teach students Bohmian mechanics. In fact, in his letter Griffiths gives the misleading impression that his approach is universally and unproblematically accepted by all right-thinking physicists. Whilst the formalism certainly has quite a few adherents in quantum cosmology, it is far from true that it has received universal support from all serious thinkers on the foundations of quantum theory. Similarly, whilst I agree that Bohmian mechanics presents the clearest counterexample to many common misconceptions about quantum theory, it is far from clear that it represents the best road to future progress.

In my view, the problem is not that we are teaching the wrong orthodoxy to students, but rather that we are teaching them any orthodoxy at all, since foundations is a subject that remains mired in controversy to this day. It is hard for me to imagine any physicist who is not directly involved in foundations taking either Griffiths’ or Norsen’s arguments seriously, since their letters directly contradict each other about the best approach to teach, and a non-specialist really has no way of deciding which one of them to trust. The view that foundations is a murky area, with no clear reason for choosing one approach over any other, is only reinforced by such arguments, and they are unlikely to persuade a skeptic to change their whole teaching strategy.

On the other hand, I do believe that there are a lot of developments in foundations that have made our current understanding much clearer, and these could be usefully communicated to students. For example, we have a much clearer understanding of the “no-go” theorems, such as Bell’s theorem, and their possible loopholes, and a much clearer understanding of the space of possible realist interpretations of quantum theory. We have an improved understanding of the classical limit, via decoherence theory amongst other approaches, and quantum information theory has shown that entanglement and the understanding of quantum theory as a generalized probability theory actually have useful consequences. I believe we should teach these things as a central part of quantum mechanics courses, and not just as peripheral topics covered in the last one or two lectures, which students are instructed not to worry about because it won’t be on the final exam! We should also give students an understanding of the space of possible resolutions to foundational problems, to equip them with a BS detector for statements they are likely to hear about quantum theory. Why do I believe this? Well, simply because I think it will leave students less confused about how to understand quantum theory and because I think these areas are all increasingly fruitful avenues of research that we might want to encourage them to pursue.

The difficult question, I think, is not the why but the how. It would entail battling against the prevailing wisdom that foundations are to be de-emphasised and relegated to the end of the course. Also, good teaching materials at an appropriate level that could supplement the existing curriculum are not readily available, and that is a problem we definitely have to address if we want this to happen.

Foundations Summer School: Apply Now!

Just a short note to let you know that the application form for the Perimeter Institute Quantum Foundations Summer School is now available online from here. The application deadline is 20th May.

Update: I should have mentioned that for successful applicants who are grad students all expenses will be paid by Perimeter. That should make it easier to persuade your advisor to let you go. You don’t have to be an expert on foundations and we are hoping that students studying a wide variety of areas of Physics will attend.

Update 2: Whether non-students, e.g. postdocs, will be allowed to attend is still an open question. I’m waiting to hear more about this from the organizers. Clearly, the priority for a summer school has to be grad students, so I would speculate that it will depend on the number and quality of applications that we get. I’m just guessing at the moment though and I’ll post another update once I hear the official word.

Update 3: I have just heard that up to 10 places will be made available at the summer school for postdocs and junior faculty.

Foundations at APS, take 2

It doesn’t seem that a year has gone by since I wrote about the first sessions on quantum foundations organized by the topical group on quantum information, concepts and computation at the APS March meeting. Nevertheless it has, and I am here in Denver after possibly the longest day of continuous sitting through talks in my life. I arrived at 8am to chair the session on Quantum Limited Measurements, which was interesting, but readers of this blog won’t want to hear about such practical matters, so instead I’ll spill the beans on the two foundations sessions that followed.

In the first foundations session, things got off to a good start with Rob Spekkens as the invited speaker explaining to us once again why quantum states are states of knowledge. OK, I’m biased because he’s a collaborator, but he did throw us a new tidbit on how to make an analog of the Elitzur-Vaidman bomb experiment in his toy theory by constructing a version for field theory.

Next, there was a talk by some complete crackpot called Matt Leifer. He talked about this.

Frank Schroeck gave an overview of his formulation of quantum mechanics on phase space, which did pique my interest, but 10 minutes was really too short to do it justice. Someday I’ll read his book.

Chris Fuchs gave a talk which was surprisingly not the same as his usual quantum Bayesian propaganda speech. It contained some new results about Symmetric Informationally Complete POVMs, including the fact that the states to which the POVM elements are proportional are minimum-uncertainty states with respect to mutually unbiased bases. This should be hitting an arXiv near you very soon.

Časlav Brukner talked about his recent work on the emergence of classicality via coarse graining. I’ve mentioned it before on this blog, and it’s definitely a topic I’m becoming much more interested in.

Later on, Jeff Tollaksen talked about generalizing a theorem proved by Rob Spekkens and myself about pre- and post-selected quantum systems to the case of weak measurements. I’m not sure I agree with the particular spin he gives on it, especially his idea of “quantum contextuality”, but you can decide for yourself by reading this.

Jan-Åke Larsson gave a very comprehensible talk about a “loophole” (he prefers the term “experimental problem”) in Bell inequality tests to do with coincidence times of photon detection. You can deal with it by having a detection efficiency just a few percent higher than that needed to overcome the detection loophole. Read all about it here.

Most of the rest of the talks in this session were more quantum information oriented, but I suppose you can argue they were at the foundational end of quantum information. Animesh Datta talked about the role of entanglement in the Knill-Laflamme model of quantum computation with one pure qubit, Anil Shaji talked about using easily computable entanglement measures to put bounds on those that aren’t so easy to compute, and finally Ian Durham made some interesting observations about the connections between entropy, information and Bell inequalities.

The second foundations session was more of a mixed bag, but let me just mention a couple of the talks that appealed to me. Marcello Sarandy and Alioscia Hamma talked about generalizing the quantum adiabatic theorem to open systems, where you don’t necessarily have a Hamiltonian with well-defined eigenstates to talk about, and Kicheon Kang talked about a proposal for a quantum eraser experiment with electrons.

On Tuesday, Bill Wootters won a prize for best research at an undergraduate teaching college. He gave a great talk about his discrete Wigner functions, which included some new stuff about minimum uncertainty states and analogs of coherent states.

That’s pretty much it for the foundations talks at APS this year. It’s all quantum information from here on in. That is unless you count Zeilinger, who is talking on Thursday. He’s supposed to be talking about quantum cryptography, but perhaps he will say something about the more foundationy experiments going on in his lab as well.

Tao on Many-Worlds and Tomb Raider

Terence Tao has an interesting post on why many-worlds quantum theory is like Tomb Raider.  I think it’s de Broglie-Bohm theory that is more like Tomb Raider though, as you can see from the comments.

Dates for your diary

Update: I am informed that the Oxford Everett meeting will be in the summer rather than in September and is invitation only.  Also, there will be a Symposium on the Foundations of Modern Physics in Vienna 7th-10th June.  Registration for that is open until the end of March.

I haven’t been contemplating too many quantum quandaries recently because I was away at a workshop on Operator Structures in Quantum Information in Banff (a very interesting meeting and a highly recommended location) and am currently visiting Caltech. My brain is mostly full of mathematics and non-foundations oriented physics. In the meantime, here are some interesting foundations events coming up this summer.

Firstly, Perimeter Institute is organising its first Summer School on Quantum Foundations August 27th-31st. There have been several summer schools in other locations in the past, which have mostly been philosophy/interpretations oriented. The PI School will have a distinctly “physics” flavor, e.g. it will include lectures on experiments amongst other things. I’ve seen the list of speakers and it looks like it’s going to be really interesting. For grad students and postdocs interested in foundations, summer schools are highly recommended because of the sparsity of experts in the subject at most institutions. It’s how I became reasonably competent in the subject at any rate. Please don’t write to me requesting further details because I can’t help you. All the information is going to be posted on your favorite quantum websites/mailing lists very soon. Alternatively, you’ll be able to get to the school website via this link once it is up and running.

Secondly, the Institute for Quantum Computing and Perimeter are jointly running a series of quantum oriented workshops this summer under the banner Taming the Quantum World. There’s lots of interesting events for quantum information folks, so check out the website, but the workshop on Operational Quantum Physics and the Quantum-Classical Contrast, June 4th-7th, organized by Paul Busch and Lucien Hardy will be of special interest to readers of this blog.

Since I’m plugging foundations meetings at my own institutions, I should also mention Many Worlds at 50, organized by Jonathan Barrett, Adrian Kent and David Wallace, taking place September 21st-24th.

Given the number of meetings in Waterloo this year, it is somewhat surprising that the foundations community has also found time to organise some events at other locations. Here’s the rundown of the rest:

– March 5th-9th: APS March Meeting, Denver – Two focus sessions on quantum foundations have been organised.

– March 29th-31st: 15th UK and European Meeting on the Foundations of Physics, Leeds.

– April 13th-15th: New Directions in the Foundations of Physics, Maryland. It’s invitation only (and full) I’m afraid.

– June 11th-16th: Quantum Theory: Reconsideration of Foundations 4, Växjö.

– July 2nd-13th: Operational probabilistic theories as foils to quantum theory, Cambridge. It’s invitation only (and full).

– Sometime in September: Everett at 50, Oxford.

If I’ve missed any meetings or you have any new info on any of these then please leave a comment.

Quantum Brains

OK, I should be preparing a talk, but it is late and my mind is wandering, so it’s not going to happen tonight.  Instead, I’ll pose this puzzler:  If quantum computers are more efficient than classical ones then why didn’t our brains evolve to take advantage of quantum information processing?

I have a vague recollection of seeing this question on a physics blog somewhere before, and it does have a family resemblance to Scott’s infamous post, albeit a more politically correct version.

There are a number of assumptions behind this question:

  • Evolution usually does a very efficient job of coming up with information processing devices.  As evidence for this, note that the best algorithms we have for some tasks simply imitate nature, e.g. neural networks, simulated annealing, etc.
  • Some functions of the brain, such as the ability to solve math problems, are best understood by regarding the brain as a kind of computer.  Note that we don’t need to say that the brain is merely a computer, only that it can be regarded as such for understanding some of its functions, i.e. we don’t need to get into a big philosophical debate about consciousness and artificial intelligence.
  • Further, in these respects the brain is a classical computer and not a quantum one.  It certainly seems that the information processing function of neurons can be understood in classical terms, i.e. neural networks again.  There is a small minority of experts who believe that quantum mechanics plays an essential role in the information processing functions of the brain, for whom my question is nonsense.

Here are all the possible explanations I can think of.

  • The set of problems in BQP but not in P does not include anything that would have conferred a significant survival advantage on our ancestors.  Admittedly, efficient factoring could be useful for surviving high-school math class, as well as for cracking codes, but this wouldn’t have mattered so much to cave-people.  This would be disappointing, although not devastating, news for people trying to come up with new quantum algorithms.
  • There is some big problem with building a stable quantum computer of any appreciable size, and so present day experimentalists will eventually run into the same problems that nature did.
  • Dumb luck.  Evolution tends to find local minima in the landscape of all possible species.  Having a quantum brain is indeed a lower minimum than our current classical brain, but we never got a big enough hit to get over the mountain separating that solution from ourselves.

The first two explanations seem like the most interesting ones.  If the third explanation wasn’t a possibility then there would have to be a tradeoff between the amount of progress possible in developing quantum algorithms and the amount possible in actually building a quantum computer.  Given that much quantum computing funding is predicated on the idea that massive progress is possible in both areas, I’d say we should thank Darwin for dumb luck!

What can decoherence do for us?

OK, so it’s time for the promised post about decoherence, but where to begin? Decoherence theory is now a vast subject with an enormous literature covering a wide variety of physical systems and scenarios. I will not deal with everything here, but just make some comments on how the theory looks from my point of view about the foundations of quantum theory. Alexei Grinbaum pointed me to a review article by Maximilian Schlosshauer on the role of decoherence in solving the measurement problem and in interpretations of quantum theory. That’s a good entry into the literature for people who want to know more.

OK, let me start by defining two problems that I take to be at the heart of understanding quantum theory:

1) The Emergence of Classicality: Our most fundamental theories of the world are quantum mechanical, but the world appears classical to us at the everyday level. Explain why we do not find ourselves making mistakes in using classical theories to make predictions about the everyday world of experience. By this I mean not only classical dynamics, but also classical probability theory, information theory, computer science, etc.

2) The Ontology Problem: The mathematical formalism of quantum theory provides an algorithm for computing the probabilities of outcomes of measurements made in experiments. Explain what things exist in reality and what laws they obey in such a way as to account for the correctness of the predictions of the theory.

I take these to be the fundamental challenges of understanding quantum mechanics. You will note that I did not mention the measurement problem, Schroedinger’s cat, or the other conventional ways of expressing the foundational challenges of quantum theory. This is because, as I have argued before, these problems are not interpretation neutral. Instead, one begins with something like the orthodox interpretation and shows that unitary evolution and the measurement postulates are in apparent conflict within that interpretation depending on whether we choose to view the measuring apparatus as a physical system obeying quantum theory or to leave it unanalysed. The problems with this are twofold:

i) It is not the case that we cannot solve the measurement problem. Several solutions exist, such as the account given by Bohmian mechanics, that of Everett/many-worlds, etc. The fact that there is more than one solution, and that none of them have been found to be universally compelling, indicates that it is not solving the measurement problem per se that is the issue. You could say that it is solving the measurement problem in a compelling way that is the issue, but I would say it is better to formulate the problem in such a way that it is obvious how it applies to each of the different interpretations.

ii) The standard way of describing the problems essentially assumes that the quantum state-vector corresponds more or less directly to whatever exists in reality, and that it is in fact all that exists in reality. This is an assumption of the orthodox interpretation, so we are talking about a problem with the standard interpretation and not with quantum theory itself. Assuming the reality of the state-vector simply begs the question. What if it does not correspond to an element of reality, but is just an epistemic object with a status akin to a probability distribution in classical theories? This is an idea that I favor, but now is not the time to go into detailed arguments for it. The mere fact that it is a possibility, and is taken seriously by a significant section of the foundations community, means that we should try to formulate the problems in a language that is independent of the ontological status of the state-vector.

Given this background viewpoint, we can now ask to what extent decoherence can help us with 1) and 2), i.e. the emergence and ontology problems. Let me begin with a very short description of what decoherence is in this context. The first point is that it takes seriously the idea that quantum systems, particularly the sort that we usually describe as “classical”, are open, i.e. they interact strongly with a large environment. Correlations between system and environment are typically established very quickly in some particular basis, determined by the form of the system-environment interaction Hamiltonian, so that the density matrix of the system quickly becomes diagonal in that basis. Furthermore, the basis in which the correlations exist is stable over a very long period of time, which can typically be much longer than the lifetime of the universe. Finally, for many realistic Hamiltonians and a wide variety of systems, the decoherence basis corresponds very well to the kind of states we actually observe.
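Schematically, and glossing over model-specific details (this is the standard textbook sketch rather than any particular system): if the interaction correlates system states \(|s_i\rangle\) with environment states \(|e_i(t)\rangle\),

\[
\Big( \sum_i c_i |s_i\rangle \Big) \otimes |e_0\rangle \;\longrightarrow\; \sum_i c_i\, |s_i\rangle \otimes |e_i(t)\rangle ,
\]

then the reduced state of the system is

\[
\rho_S(t) = \mathrm{Tr}_E\, |\Psi(t)\rangle\langle\Psi(t)| = \sum_{i,j} c_i c_j^*\, \langle e_j(t)|e_i(t)\rangle\, |s_i\rangle\langle s_j| ,
\]

and because distinct environment states rapidly become nearly orthogonal, \(\langle e_j(t)|e_i(t)\rangle \approx \delta_{ij}\), the off-diagonal terms are suppressed and \(\rho_S\) becomes effectively diagonal in the decoherence basis \(\{|s_i\rangle\}\).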

From my point of view, the short answer to the role of decoherence in foundations is that it provides a good framework for addressing emergence, but has almost nothing to say about ontology.  The reason for saying that should be clear: we have a good correspondence with our observations, but at no point in my description of decoherence did I find it necessary to mention a reality underlying quantum mechanics.  Having said that, a couple of caveats are in order. Firstly, decoherence can do much more if it is placed within a framework with a well-defined ontology. For example, in Everett/many-worlds, the ontology is the state-vector, which always evolves unitarily and never collapses. The trouble with this is that the ontology doesn’t correspond to our subjective experience, so we need to supplement it with some account of why we see collapses, definite measurement outcomes, etc. Decoherence theory does a pretty good job of this by providing us with rules to describe this subjective experience, i.e. we will experience the world relative to the basis that decoherence theory picks out. However, the point here is that the work is not being done by decoherence alone, as claimed by some physicists, but also by a nontrivial ontological assumption about the state-vector. As I remarked earlier, the latter is itself a point of contention, so it is clear that decoherence alone is not providing a complete solution.

The second caveat is that some people, including Max Schlosshauer in his review, would argue that we can plausibly deny the need to answer the ontology question at all. So long as we can account for our subjective experience in a compelling manner, why should we demand any more of our theories? The idea is then that decoherence can solve the emergence problem, and then we are done because the ontology problem need not be solved at all. One could argue for this position, but to do so is thoroughly wrongheaded in my opinion, and this is so independently of my conviction that physics is about trying to describe a reality that goes beyond subjective experience. The simple point is that someone who takes this view seriously really has no need for decoherence theory at all. Firstly, given that we are not assigning ontological status to anything, let alone the state-vector, you are free to collapse it, uncollapse it, evolve it, swing it around your head or do anything else you like with it. After all, if it is not supposed to represent anything existing in reality then there need not be any physical consequences for reality of any mathematical manipulation, such as a projection, that you might care to do. The second point is that if we are prepared to give a privileged status to observers in our physical theories, by saying that physics needs to describe their experience and nothing more, then we can simply say that the collapse is a subjective property of the observer’s experience and leave it at that. We already have privileged systems in our theory on this view, so what extra harm could that do?

Of course, I don’t subscribe to this viewpoint myself, but on both views described so far, decoherence theory either needs to be supplemented with an ontology, or is not needed at all for addressing foundational issues.

Finally, I want to make a couple of comments about how odd the decoherence solution looks from my particular point of view as a believer in the epistemic nature of wavefunctions. The first is that, from this point of view, the decoherence solution appears to have things backwards. When constructing a classical probabilistic theory, we first identify the ontological entities, e.g. particles that have definite trajectories, and describe their dynamics, e.g. Hamilton’s equations. Only then do we introduce probabilities and derive the corresponding probabilistic theory, e.g. Liouville mechanics. Decoherence theory does things in the other direction, starting from Schroedinger mechanics and then seeking to define the states of reality in terms of the probabilistic object, i.e. the state-vector. Whilst this is not obviously incorrect, since we don’t necessarily have to do things the same way in classical and quantum theories, it does seem a little perverse from my point of view. I’d rather start with an ontology and derive the fact that the state-vector is a good mathematical object for making probabilistic predictions, instead of the other way round.
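To spell out the direction of construction in each case (a schematic summary of my own, suppressing technical details): classically one goes from an ontology of trajectories to a probabilistic theory over them,

\[
\dot{q} = \frac{\partial H}{\partial p}, \qquad \dot{p} = -\frac{\partial H}{\partial q} \qquad \Longrightarrow \qquad \frac{\partial \rho}{\partial t} = \{H, \rho\} \quad \text{(Liouville mechanics)},
\]

whereas decoherence starts from the dynamics of the probabilistic object and tries to read an ontology off at the end,

\[
i\hbar\, \frac{\partial}{\partial t} |\psi\rangle = \hat{H} |\psi\rangle \qquad \Longrightarrow \qquad \rho_S = \mathrm{Tr}_E\, |\psi\rangle\langle\psi| \;\; \text{approximately diagonal in some preferred basis}.
\]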

The second comment concerns an analogy between the emergence of classicality in QM and the emergence of the second law of thermodynamics from statistical mechanics. For the latter, we have a multitude of conceptually different approaches, which all arrive at somewhat similar results from a practical point of view. For a state-vector epistemist like myself, the interventionist approach to statistical mechanics seems very similar to the decoherence approach to the emergence problem in QM. Both say that the respective problems cannot be solved by looking at a closed Hamiltonian system, but only by considering interaction with a somewhat uncontrollable environment. In the case of stat-mech, this is used to explain the statistical fluctuations observed in what would be an otherwise deterministic system. The introduction of correlations between system and environment is the mechanism behind both processes. Somewhat bizarrely, most physicists currently prefer closed-system approaches to the derivation of the second law, based on coarse-graining, but prefer the decoherence approach when it comes to the emergence of classicality from quantum theory. Closed-system approaches have the advantage of being applicable to the universe as a whole, where there is no external environment to rely on. However, apart from special cases like this, one can broadly say that the two types of approach are complementary for stat mech, and neither has a monopoly on explaining the second law. It is then natural to ask whether closed-system approaches to emergence in QM are available making use of coarse graining, and whether they ought to be given equal weight to the decoherence explanation. Indeed, such arguments have been given – here is a recent example, which has many precursors too numerous to go through in detail. I myself am thinking about a similar kind of approach at the moment. Right now, such arguments have a disadvantage compared to decoherence in that the “measurement basis” has to be put in by hand, rather than emerging from the physics as it does in decoherence. However, further work is needed to determine whether this is an insurmountable obstacle.

In conclusion, decoherence theory has done a lot for our understanding of the emergence of classicality from quantum theory. However, it does not solve all the foundational questions about quantum theory, at least not on its own. Further, its importance may have been overemphasized by the physics community, because other less-developed approaches to emergence could turn out to be of equal importance.

Universitas Magistrorum et Scholarium

I have arrived back in Waterloo to start my new hybrid University/Perimeter Institute position.  It’s been quite a long break from posting, because – strangely enough – having two affiliations means I had to do twice the amount of paperwork to get myself set up this time.  As much as I loved being at PI, it is nice to be back in a university and to have some small role in educating the next generation of quantum mechanics.

Over the break, Andrew Thomas has left a few comments about the role of decoherence in interpretations of quantum theory in my Professional Jealousy post.  There are some who think that understanding decoherence alone is enough to “solve” the conceptual difficulties with quantum theory.  This is quite a popular opinion in some quarters of the physics community, where one often finds people mumbling something about decoherence when asked about the measurement problem.  However, there are also many deep thinkers on foundations who have denied that decoherence completely solves the problems, and I tend to agree with them, so we’ll have a post on “What can decoherence do for us?” later on this week.

To clarify, I’m not going to argue that decoherence isn’t an important and real physical effect, nor am I going to say that it has no role at all in foundational studies, so please hold your fire until after the next post if you were thinking of commenting to that effect.

Happy Holidays!

As I don’t expect to be able to blog again before the Xmas break, I’d like to wish all readers of QQ a happy whatever-you’re-celebrating.

The holidays are one of those times of year when relatives get the opportunity to ask you, “So, what exactly is it that you do research on?”. This dreaded question will come with certainty, regardless of how many times you have previously explained it to them. It’s not their fault because the average person does not have physics on their mind for any significant amount of time, so it’s easy to forget what it’s all about.

The question is especially bad if you spend any time thinking about the foundations of quantum theory, because it’s difficult to describe quantum theory accurately in a few words. Here’s my best shot at an answer at the moment.

Miscellaneous Relative: So, what is this quantum theory thing all about then?

Me: Well, it’s not exactly about the fact that particles sometimes behave like waves and waves like particles.

MR: Go on.

Me: There is this thing called the Heisenberg uncertainty relation, but strictly speaking it doesn’t say that a measurement of position necessarily disturbs the momentum and vice-versa.

MR: OK.

Me: And it’s definitely not that there are multiple universes.

MR: That’s a shame. I enjoy science fiction, so that was the bit I liked the most.

Me: There are these things called wavefunctions, which can be in superpositions, but it’s not entirely clear what the true significance of that is.

MR: I’m not getting much insight into what you actually do from this by the way.

Me: It seems that John Bell proved that locality and realism are jointly incompatible with the predictions of quantum theory, but people are still debating the significance of that, so it’s definitely not the whole story either.

MR: Now I really have no clue what you are talking about.

Me: It’s not just about “finding the right language” with which to talk about physics. In particular, I don’t think that revising logic is really the right thing to do.

MR: That sounds sensible enough.

Me: Some people think the whole thing is just about doing something called “solving the measurement problem”, but I don’t think that’s an entirely helpful way of looking at things.

MR: So just what IS the whole thing about then?

Me: That’s the whole question. Welcome to my research programme.

Steane Roller

Earlier, I promised some discussion of Andrew Steane’s new paper: Context, spacetime loops, and the interpretation of quantum mechanics. Whilst it is impossible to summarize everything in the paper, I can give a short description of what I think are the most important points.

  • Firstly, he does believe that the whole universe obeys the laws of quantum mechanics, which are not required to be generalized.
  • Secondly, he does not think that Everett/Many-Worlds is a good way to go because it doesn’t give a well-defined rule for when we see one particular outcome of a measurement in one particular basis.
  • He believes that collapse is a real phenomenon and so the problem is to come up with a rule for assigning a basis in which the wavefunction collapses, as well as, roughly speaking, a spacetime location at which it occurs.
  • For now, he describes collapse as an unanalysed, fundamentally stochastic process that achieves this, but he recognizes that it might be useful to come up with a more detailed mechanism by which this occurs.

Steane’s problem therefore reduces to picking a basis and a spacetime location. For the former, he uses the standard ideas from decoherence theory, i.e. the basis in which collapse occurs is the basis in which the reduced state of the system is diagonal. However, the location of collapse is what is really interesting about the proposal, and makes it more interesting and more bizarre than most of the proposals I have seen so far.

Firstly, note that the process of collapse destroys the phase information between the system and the environment. Therefore, if the environmental degrees of freedom could ever be gathered together and re-interacted with the system, then QM would predict interference effects that would not be present if a genuine collapse had occurred. Since Steane believes in the universal validity of QM, he has to come up with a way of having a genuine collapse without getting into a contradiction with this possibility.

His first innovation is to assert that the collapse need not be associated to an exactly precise location in spacetime. Instead, it can be a function of what is going on in a larger region of spacetime. Presumably, for events that we would normally regard as “classical” this region is supposed to be rather small, but for coherent evolutions it could be quite large.

The rule is easiest to state for special cases, so for now we will assume that we are talking about particles with a discrete quantum degree of freedom, e.g. spin, but that the position and momentum can be treated classically. Now, suppose we have 3 qubits and that they are in the state (|000> + e^{iφ}|111>)/√2. The reduced state of the first two qubits is a density operator diagonal in the basis {|00>, |11>}, with probability 1/2 for each of the two states. The phase e^{iφ} will only ever be detectable if the third qubit re-interacts with the first two. Whether or not this can happen is determined by the relative locations of the qubits, since the interaction Hamiltonians in nature are local. Since we are treating position and momentum classically at the moment, there is a matter of fact about whether this will occur, and Steane’s rule is simple: if the qubits re-interact in the future then there is no collapse, but if they don’t then the first two qubits have collapsed into the state |00> or the state |11> with probability 1/2 for each one.
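For anyone who wants to verify the claim about the reduced state, here is a minimal numpy sketch (my own illustration, not anything from Steane’s paper): it traces out the third qubit and shows that the result is independent of the phase.

```python
import numpy as np

def ket(bits):
    """Computational basis state |bits> as a complex vector."""
    v = np.zeros(2 ** len(bits), dtype=complex)
    v[int(bits, 2)] = 1.0
    return v

phi = 0.7  # arbitrary phase; the output below does not depend on it
psi = (ket("000") + np.exp(1j * phi) * ket("111")) / np.sqrt(2)

# Full 3-qubit density matrix, reshaped so qubit 3 can be traced out:
# indices are (first two qubits, third qubit) for both row and column.
rho = np.outer(psi, psi.conj()).reshape(4, 2, 4, 2)
rho_12 = np.einsum("ikjk->ij", rho)  # partial trace over qubit 3

print(np.round(rho_12.real, 3))
# Output: diag(0.5, 0, 0, 0.5), i.e. an equal mixture of |00> and |11>
# with no trace of phi. The phase only becomes observable if qubit 3
# re-interacts with the other two.
```

This is just the standard partial-trace calculation, but it makes the point plain: everything Steane’s rule conditions on, namely whether the phase is ever detectable, lives in the correlations with the third qubit and not in the reduced state itself.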

Things are going to get more complicated if we quantize the position and momentum, or indeed if we move to quantum field theory, since then we don’t have definite particle trajectories to work with. It is not entirely clear to me whether Steane’s proposal can be made to work in the general case, and he does admit that further technical work is needed. However, he still asserts that whether or not a system has collapsed at a given point in spacetime is in principle a function of its entire future, i.e. whether or not it will eventually re-interact with the environment it has decohered with respect to.

At this point, I want to highlight a bizarre physical prediction that can be made if you believe Steane’s point of view. Really, it is metaphysics, since the experiment is not at all practical. For starters, the fact that I experience myself being in a definite state rather than a superposition means that there are environmental degrees of freedom that I have interacted with in the past that have decohered me into a particular basis. We can in principle imagine an omnipotent “Maxwell’s demon” type character, who can collect up every degree of freedom I have ever interacted with, bring it all together and reverse the evolution, eliminating me in the process. Whilst this is impractical, there is nothing in principle to stop it happening if we believe that QM applies to the entire universe. However, according to Steane, the very fact that I have a definite experience means that we can predict with certainty that no such interaction happens in the future. If it did, there would be no basis for my definite experience at the moment.

Contrast this with a many-worlds account à la David Wallace. There, the entire global wavefunction still exists, and the fact that I experience the world in a particular basis is due to the fact that only certain special bases, the ones in which decoherence occurs, are capable of supporting systems complex enough to achieve consciousness. There is nothing in this view to rule out the Maxwell’s demon conclusively, although we may note that he is very unlikely to be generated by a natural process due to the second law of thermodynamics.

Therefore, there is something comforting about Steane’s proposal. If true, my very existence can be used to infer that I will never be wiped out by a Maxwell’s demon. All we need to do to test the theory is to try and wipe out a conscious being by constructing such a demon, which is obviously impractical and also unethical. Needless to say, there is something troubling about drawing such a strong metaphysical conclusion from quantum theory, which is why I still prefer the many-worlds account over Steane’s proposal at the moment. (That’s not to say that I agree with the former either though.)