
Why is many-worlds winning the foundations debate?

Almost every time the foundations of quantum theory are mentioned on another science blog, the comments contain a lot of debate about many-worlds. I find it kind of depressing how readily people jump on board with this interpretation without asking too many questions. In fact, it is almost as depressing as the fact that Copenhagen has been the dominant interpretation for so long, even though most of Bohr’s writings on the subject are pretty much incoherent.

Well, this year is the 50th anniversary of Everett’s paper, so perhaps it is appropriate to lay out exactly why I find the claims of many-worlds so unbelievable.

WARNING: The following rant contains Philosophy!

Traditionally, philosophers have made a distinction between analytic and synthetic truths. Analytic truths are those things that you can prove to be true by deduction alone. They are necessary truths and essentially they are just the tautologies of classical logic, e.g. either this is a blog post or this is not a blog post. On the other hand, synthetic truths are things we could imagine to have been another way, or things that we need to make some observation of the world in order to confirm or deny, e.g. Matt Leifer has never written a blog post about his pet cat.

Perhaps the central problem of the philosophy of science is whether the correctness of the scientific method is an analytic or a synthetic truth. Of course, this depends a lot on how exactly you decide to define the scientific method, which is a topic of considerable controversy in itself. However, it’s pretty clear that the principle of induction is not an analytic truth, and even if you are a falsificationist you have to admit that it has some role in science, i.e. if a single experiment contradicts the predictions of a dominant theory then you call it an anomaly rather than a falsification. Of the other alternatives, if you’re a radical Kuhnian then you’re probably not reading this blog, since you are busy writing incoherent postmodern junk for a sociology journal. If you are a follower of Feyerabend then you are a conflicted soul and I sympathize. Anyway, back to the plot for people who do believe that induction has some role to play in science.

Kant’s resolution to this dilemma was to divide the synthetic truths into two categories, the a priori truths and the rest (I don’t know a good name for non-a priori synthetic truths). The a priori synthetic truths are things that cannot be directly deduced, but are nevertheless so basic to our functioning as beings living in this world that we must assume them to be true, i.e. it would be impossible to make any sense of the world without them. For example, we might decide that the fact that the universe is regular enough to perform approximately repeatable scientific experiments and to draw reliable inferences from them should be in the category of a priori truths. This seems reasonable because it is pretty hard to imagine that any kind of intelligent life could exist in a universe where the laws of physics were in continual flux.

One problem with this notion is that we can’t know a priori exactly what the a priori truths are. We can write down a list of what we currently believe to be a priori truths – our assumed a priori truths – but this is open to revision if we find that we can in fact still make sense of the world when we discard some of these assumed truths. The most famous example of this comes from Kant himself, who assumed that the way our senses are hooked up meant that we must describe the world in terms of events happening in space and time, implicitly assuming a Euclidean geometry. As we now know, the world still makes sense if we drop the Euclidean assumption, unifying space and time and working with much more general geometries. Still, even in relativity we have the concept of events occurring at spacetime locations as a fundamental primitive. If you like, you can modify Kant’s position to take this as the fundamental a priori truth, and explain that he was simply misled by the synthetic fact that our spacetime is approximately flat on ordinary length scales.

At this point, it is useful to introduce Quine’s pudding-bowl analogy for the structure of knowledge (I can’t remember what kind of bowl Quine actually used, but he’s making pudding as far as we are concerned). If you make a small chip at the top of a pudding bowl, then you won’t have any problem making pudding with it and the chip can easily be fixed up. On the other hand, if you make a hole near the bottom then you will have a sticky mess in the oven. It will take some considerable surgery to fix up the bowl and you are likely to consider just throwing it out and sitting down at the pottery wheel to build a new one. The moral of the story is that we should be more skeptical of changes in the structure of our knowledge that seem to undermine assumptions that we think are fundamental. We need to have very good reasons to make such changes, because it is clear that there is a lot of work to be done in order to reconstruct all the dependent knowledge further up the bowl that we rely on every day. The point is not that we should never make such changes – just that we should be careful to ensure that there isn’t an equally good explanation that doesn’t require making such a drastic change.

Aside: Although Quine has in mind a hierarchical structure for knowledge – the parts of the pudding bowl near the bottom are the foundation that supports the rest of the bowl – I don’t think this is strictly necessary. We just need to believe that some areas of knowledge have higher connectivity than others, i.e. that more important things depend on them. It would work equally well if you think knowledge is structured like a power-law graph, for example.

The Quinean argument is often levelled against proposed interpretations of quantum theory, e.g. the idea that quantum theory should be understood as requiring a fundamental revision of logic or probability theory, rather than these being convenient mathematical formalisms that can coexist happily with their classical counterparts. The point here is that it is bordering on the paradoxical for a scientific theory to entail changes to things on which the scientific method itself seems to depend, since we used logical and statistical arguments to confirm quantum theory in the first place. Thus, if we revise logic or probability then the theory seems to be “eating its own tail”. This is not to say that this is an actual paradox, because it could be the case that when we reconstruct the entire edifice of knowledge according to the new logic or probability theory we will still find that we were right to believe quantum theory, but just mistaken about the reasons why we should believe it. However, the whole exercise is question begging, because if we allow changes to such basic things then why not make a more radical change and consider the whole space of possible logics or probability theories? There are clearly some possible alternatives under which all hell breaks loose and we are seriously deluded about the validity of all our knowledge. In other words, we’ve taken a sledgehammer to our pudding bowl and we can’t even make jelly (jello for North American readers) any more.

At this point, you might be wondering whether a Quinean argument can be levelled against the revision of geometry implied by general relativity as well. The difference is that we do have a good handle on what the space of possible alternative geometries looks like. We can imagine writing down countless alternative theories in the language of differential geometry and figuring out what the world looks like according to them. We can adopt the assumed a priori truth that the world is describable in terms of events in some spacetime geometry, and then we find the synthetic fact that general relativity is in accordance with our observations, while most of the other theories are not. We did some significant damage close to the bottom of the bowl, but it turned out that we could fix it relatively easily. There are still some fancy puddings – like the theory of quantum gravity (baked Alaska) – that we haven’t figured out how to make in the repaired bowl, but we can live without them most of the time.

Now, is there a Quinean argument to be made against the many-worlds interpretation? I think so. The idea is that when we apply the scientific method we assume we can do experiments that have actual definite outcomes. These are the basic data from which we build a confirmation or refutation of our theories. Many-worlds says that this assumption is wrong: there are no fundamental definite outcomes – it just appears that way to us because we are all entangled up in the wavefunction of the universe. This is a pretty dramatic assertion and it does seem to be bordering on the “theory eating its own tail” type of assertion. We need to be pretty sure that there isn’t an equally good alternative explanation in which experiments really do have definite outcomes before accepting it. Also, as with the case of revising logic or probability, we don’t have a good understanding of the space of possible theories in which experiments do not have definite outcomes. I can think of one other theory of this type, namely a bizarre interpretation of classical probability theory in which all outcomes that are assigned nonzero probability occur in different universes, but two possible theories does not amount to much in the grand scheme of things. The problem is that on dropping the assumption of definite outcomes, we have not replaced it with an adequate new assumed a priori truth. That the world is describable by vectors in Hilbert space that evolve unitarily seems much too specific to be considered as a potential candidate. Until we do come up with such an assumption, I can’t see why many-worlds is any less radical than proposing a revision of logic or probability theory. Until then, I won’t be making any custard tarts in that particular pudding bowl myself.
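
Aside: to make the “no definite outcomes” claim concrete, here is the textbook schematic of a unitary measurement interaction (standard material, not specific to any particular Everettian paper). An apparatus that reliably records each eigenstate,

\[
U\,|i\rangle_S\,|\mathrm{ready}\rangle_A = |i\rangle_S\,|\mathrm{saw}\ i\rangle_A ,
\]

when applied by linearity to a superposition, ends up entangled with the system rather than registering a single result:

\[
U\left(\alpha|0\rangle_S + \beta|1\rangle_S\right)|\mathrm{ready}\rangle_A
= \alpha\,|0\rangle_S\,|\mathrm{saw}\ 0\rangle_A + \beta\,|1\rangle_S\,|\mathrm{saw}\ 1\rangle_A .
\]

If unitary evolution is the whole story then the right-hand side contains both records at once, and that is exactly the branching structure my Quinean worry is aimed at.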

Teaching Quantum Theory

The recent article by Chandralekha Singh, Mario Belloni and Wolfgang Christian on students’ understanding of quantum mechanics in Physics Today provoked an interesting series of letters in response. Both Robert Griffiths and Travis Norsen argue that students’ understanding would be improved by replacing the usual Copenhagen/Orthodox dogma with discussion of some more recent developments in the foundations of quantum theory.

Given that I don’t actually have much experience teaching quantum theory (I have only covered a lecturer’s absence for two lectures), it is perhaps a bit presumptuous for me to contribute my thoughts on this topic. Nevertheless, I do agree wholeheartedly with the basic sentiment of both these letters. I think one can easily see that at least some of the misconceptions that Singh, Belloni and Christian have written about could be remedied by a bit more foundational discussion at the ground level. For example, I think the common misconception that stationary states are the only allowed states of a quantum system could be dispelled by a deeper discussion of the sense in which quantum theory is analogous to classical probability theory.
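
To illustrate with nothing more than the superposition principle: a superposition of two energy eigenstates is a perfectly good quantum state, it just isn’t stationary, since its relative phase evolves in time,

\[
|\psi(t)\rangle = c_1 e^{-iE_1 t/\hbar}\,|E_1\rangle + c_2 e^{-iE_2 t/\hbar}\,|E_2\rangle ,
\qquad |c_1|^2 + |c_2|^2 = 1 ,
\]

so expectation values oscillate at the Bohr frequency \((E_2 - E_1)/\hbar\) instead of staying constant.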

However, I think both Griffiths and Norsen make a mistake in the approaches they advocate in their letters. Griffiths suggests replacing the orthodoxy with his own favored approach, namely decoherent/consistent histories, and Norsen thinks we should teach students Bohmian mechanics. In fact, in his letter Griffiths gives the misleading impression that his approach is universally and unproblematically accepted by all right-thinking physicists. Whilst the formalism certainly has quite a few adherents in quantum cosmology, it is far from true that it has received universal support from all serious thinkers on the foundations of quantum theory. Similarly, whilst I agree that Bohmian mechanics presents the clearest counterexample to many common misconceptions about quantum theory, it is far from clear that it represents the best road to future progress.

In my view, the problem is not that we are teaching the wrong orthodoxy to students, but rather that we are teaching them any orthodoxy at all, since foundations is a subject that is still mired in controversy to this day. It is hard for me to imagine any physicist who is not directly involved in foundations taking either Griffiths’ or Norsen’s arguments seriously, since their letters directly contradict each other about what is the best approach to teach, and a non-specialist really has no way of deciding which one of them to trust. The view that foundations is a murky area, with no clear reason for choosing one approach over any other, is only reinforced by such arguments, and it is unlikely to persuade a skeptic to change their whole teaching strategy.

On the other hand, I do believe that there are a lot of developments in foundations that have made our current understanding much clearer, and these could be usefully communicated to students. For example, we have a much clearer understanding of the “no-go” theorems, such as Bell’s theorem, and their possible loopholes, and a much clearer understanding of the space of possible realist interpretations of quantum theory. We have an improved understanding of the classical limit, via decoherence theory amongst other approaches, and quantum information theory has shown that entanglement and the understanding of quantum theory as a generalized probability theory actually have useful consequences. I believe we should teach these things as a central part of quantum mechanics courses, and not just as peripheral topics covered in the last one or two lectures, which students are instructed not to worry about because they won’t be on the final exam! We should also give students an understanding of the space of possible resolutions to foundational problems, to equip them with a BS detector for statements they are likely to hear about quantum theory. Why do I believe this? Well, simply because I think it will leave students less confused about how to understand quantum theory, and because I think these areas are all increasingly fruitful avenues of research that we might want to encourage them to pursue.

The difficult question, I think, is not the why but the how. It would entail battling against the prevailing wisdom that foundations are to be de-emphasised and relegated to the end of the course. Also, good teaching materials at an appropriate level that could supplement the existing curriculum are not readily available, and that is a problem we definitely have to address if we want this to happen.

Site Updated

I have updated my publications and my CV.

Foundations Summer School: Apply Now!

Just a short note to let you know that the application form for the Perimeter Institute Quantum Foundations Summer School is now available online from here. The application deadline is 20th May.

Update: I should have mentioned that for successful applicants who are grad students, all expenses will be paid by Perimeter. That should make it easier to persuade your advisor to let you go. You don’t have to be an expert on foundations, and we are hoping that students studying a wide variety of areas of physics will attend.

Update 2: Whether non-students, e.g. postdocs, will be allowed to attend is still an open question. I’m waiting to hear more about this from the organizers. Clearly, the priority for a summer school has to be grad students, so I would speculate that it will depend on the number and quality of applications that we get. I’m just guessing at the moment though and I’ll post another update once I hear the official word.

Update 3: I have just heard that up to 10 places will be made available at the summer school for postdocs and junior faculty.

Foundations at APS, take 2

It doesn’t seem that a year has gone by since I wrote about the first sessions on quantum foundations organized by the topical group on quantum information, concepts and computation at the APS March meeting. Nevertheless it has, and I am here in Denver after possibly the longest day of continuous sitting through talks in my life. I arrived at 8am to chair the session on Quantum Limited Measurements, which was interesting, but readers of this blog won’t want to hear about such practical matters, so instead I’ll spill the beans on the two foundations sessions that followed.

In the first foundations session, things got off to a good start with Rob Spekkens as the invited speaker, explaining to us once again why quantum states are states of knowledge. OK, I’m biased because he’s a collaborator, but he did throw us a new tidbit on how to make an analog of the Elitzur-Vaidman bomb experiment in his toy theory by constructing a version for field theory.

Next, there was a talk by some complete crackpot called Matt Leifer. He talked about this.

Frank Schroeck gave an overview of his formulation of quantum mechanics on phase space, which did pique my interest, but 10 minutes was really too short to do it justice. Someday I’ll read his book.

Chris Fuchs gave a talk which was surprisingly not the same as his usual quantum Bayesian propaganda speech. It contained some new results about Symmetric Informationally Complete POVMs, including the fact that the states the POVM elements are proportional to are minimum uncertainty states with respect to mutually unbiased bases. This should be hitting an arXiv near you very soon.

Caslav Brukner talked about his recent work on the emergence of classicality via coarse graining. I’ve mentioned it before on this blog, and it’s definitely a topic I’m becoming much more interested in.

Later on, Jeff Tollaksen talked about generalizing a theorem proved by Rob Spekkens and myself about pre- and post-selected quantum systems to the case of weak measurements. I’m not sure I agree with the particular spin he gives on it, especially his idea of “quantum contextuality”, but you can decide for yourself by reading this.

Jan-Åke Larsson gave a very comprehensible talk about a “loophole” (he prefers the term “experimental problem”) in Bell inequality tests to do with coincidence times of photon detection. You can deal with it by having a detection efficiency just a few percent higher than that needed to overcome the detection loophole. Read all about it here.

Most of the rest of the talks in this session were more quantum information oriented, but I suppose you can argue they were at the foundational end of quantum information. Animesh Datta talked about the role of entanglement in the Knill-Laflamme model of quantum computation with one pure qubit, Anil Shaji talked about using easily computable entanglement measures to put bounds on those that aren’t so easy to compute, and finally Ian Durham made some interesting observations about the connections between entropy, information and Bell inequalities.

The second foundations session was more of a mixed bag, but let me just mention a couple of the talks that appealed to me. Marcelo Sarandy and Alioscia Hamma talked about generalizing the quantum adiabatic theorem to open systems, where you don’t necessarily have a Hamiltonian with well-defined eigenstates to talk about, and Kicheon Kang talked about a proposal for a quantum eraser experiment with electrons.

On Tuesday, Bill Wootters won a prize for best research at an undergraduate teaching college. He gave a great talk about his discrete Wigner functions, which included some new stuff about minimum uncertainty states and analogs of coherent states.

That’s pretty much it for the foundations talks at APS this year. It’s all quantum information from here on in. That is unless you count Zeilinger, who is talking on Thursday. He’s supposed to be talking about quantum cryptography, but perhaps he will say something about the more foundationy experiments going on in his lab as well.

Tao on Many-Worlds and Tomb Raider

Terence Tao has an interesting post on why many-worlds quantum theory is like Tomb Raider.  I think it’s de Broglie-Bohm theory that is more like Tomb Raider though, as you can see from the comments.

Dates for your diary

Update: I am informed that the Oxford Everett meeting will be in the summer rather than in September and is invitation only.  Also, there will be a Symposium on the Foundations of Modern Physics in Vienna 7th-10th June.  Registration for that is open until the end of March.

I haven’t been contemplating too many quantum quandaries recently because I was away at a workshop on Operator Structures in Quantum Information in Banff (a very interesting meeting and a highly recommended location) and am currently visiting Caltech. My brain is mostly full of mathematics and non-foundations oriented physics. In the meantime, here are some interesting foundations events coming up this summer.

Firstly, Perimeter Institute is organising its first Summer School on Quantum Foundations, August 27th-31st. There have been several summer schools in other locations in the past, which have mostly been philosophy/interpretations oriented. The PI School will have a distinctly “physics” flavor, e.g. it will include lectures on experiments amongst other things. I’ve seen the list of speakers and it looks like it’s going to be really interesting. For grad students and postdocs interested in foundations, summer schools are highly recommended because of the scarcity of experts in the subject at most institutions. It’s how I became reasonably competent in the subject, at any rate. Please don’t write to me requesting further details, because I can’t help you. All the information is going to be posted on your favorite quantum websites/mailing lists very soon. Alternatively, you’ll be able to get to the school website via this link once it is up and running.

Secondly, the Institute for Quantum Computing and Perimeter are jointly running a series of quantum oriented workshops this summer under the banner Taming the Quantum World. There are lots of interesting events for quantum information folks, so check out the website, but the workshop on Operational Quantum Physics and the Quantum-Classical Contrast, June 4th-7th, organized by Paul Busch and Lucien Hardy, will be of special interest to readers of this blog.

Since I’m plugging foundations meetings at my own institutions, I should also mention Many Worlds at 50, organized by Jonathan Barrett, Adrian Kent and David Wallace, taking place September 21st-24th.

Given the number of meetings in Waterloo this year, it is somewhat surprising that the foundations community has also found time to organise some events at other locations. Here’s the rundown of the rest:

– March 5th-9th: APS March Meeting, Denver – Two focus sessions on quantum foundations have been organised.

– March 29th-31st: 15th UK and European Meeting on the Foundations of Physics, Leeds.

– April 13th-15th: New Directions in the Foundations of Physics, Maryland. It’s invitation only (and full) I’m afraid.

– June 11th-16th: Quantum Theory: Reconsideration of Foundations 4, Vaxjo.

– July 2nd-13th: Operational probabilistic theories as foils to quantum theory, Cambridge. It’s invitation only (and full).

– Sometime in September: Everett at 50, Oxford.

If I’ve missed any meetings or you have any new info on any of these then please leave a comment.

Quantum Brains

OK, I should be preparing a talk, but it is late and my mind is wandering, so it’s not going to happen tonight.  Instead, I’ll pose this puzzler:  If quantum computers are more efficient than classical ones, then why didn’t our brains evolve to take advantage of quantum information processing?

I have a vague recollection of seeing this question on a physics blog somewhere before, and it does have a family resemblance to Scott’s infamous post, albeit a more politically correct version.

There are a number of assumptions behind this question:

  • Evolution usually does a very efficient job of coming up with information processing devices.  As evidence for this, note that the best algorithms we have for some tasks simply imitate nature, e.g. neural networks, simulated annealing, etc.
  • Some functions of the brain, such as the ability to solve math problems, are best understood by regarding the brain as a kind of computer.  Note that we don’t need to say that the brain is merely a computer, only that it can be regarded as such for understanding some of its functions, i.e. we don’t need to get into a big philosophical debate about consciousness and artificial intelligence.
  • Further, in these respects the brain is a classical computer and not a quantum one.  It certainly seems that the information processing function of neurons can be understood in classical terms, i.e. neural networks again.  There is a small minority of experts who believe that quantum mechanics plays an essential role in the information processing functions of the brain, for whom my question is nonsense.

Here are all the possible explanations I can think of.

  • The set of problems in BQP but not in P does not include anything that would have conferred a significant survival advantage on our ancestors.  Admittedly, efficient factoring could be useful for surviving high-school math class, as well as for cracking codes, but this wouldn’t have mattered so much to cave-people.  This would be disappointing, although not devastating, news for people trying to come up with new quantum algorithms.  (For a sense of the scale of the resource gap at stake here, see the sketch after this list.)
  • There is some big problem with building a stable quantum computer of any appreciable size, and so present day experimentalists will eventually run into the same problems that nature did.
  • Dumb luck.  Evolution tends to find local minima in the landscape of all possible species.  Having a quantum brain is indeed a lower minimum than our current classical brain, but we never got a big enough hit to get over the mountain separating that solution from ourselves.
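
As a rough illustration of the resource gap behind the first explanation, here is a toy Python sketch (my own back-of-envelope numbers, nothing more; the 16 bytes per amplitude assumes double-precision complex numbers) of the classical memory needed just to store the state vector of n qubits:

# Toy illustration: a dense n-qubit state vector has 2**n complex
# amplitudes, so brute-force classical storage grows exponentially.
def classical_memory_for_qubits(n_qubits, bytes_per_amplitude=16):
    """Bytes needed to store a dense n-qubit state vector."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 50):
    gib = classical_memory_for_qubits(n) / 2.0 ** 30
    print(f"{n} qubits: {gib:.6g} GiB of amplitudes")

Fifty qubits already exceed any plausible classical memory, which is the usual intuition behind the conjecture that BQP is strictly larger than P (an intuition, mind you, not a proof).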

The first two explanations seem like the most interesting ones.  If the third explanation wasn’t a possibility then there would have to be a tradeoff between the amount of progress possible in developing quantum algorithms and the amount possible in actually building a quantum computer.  Given that much quantum computing funding is predicated on the idea that massive progress is possible in both areas, I’d say we should thank Darwin for dumb luck!

Scirate – Digg for the arXiv

Dave Bacon has started a very interesting new website called Scirate. It is a Digg-like site for the physics arXiv. You can read his post about the site here. It only works for quant-ph at the moment, but I’d urge all fellow quantum travellers to sign up and take part in the experiment.

Geek blog is on the move

This is just a gentle prod to remind you that I have an even geekier blog than this one, called Academic Tech.  I’ve actually started writing things for it now, and there will be lots of interesting links, such as this one.  This is the last time I’ll mention it here, unless there is something to do with quantum theory, because I want to keep this a quantum foundations only zone.