# Tag Archives: probability

## FQXi Essay Contest

I wrote an essay for the FQXi essay contest.  This year’s theme is “It from bit or bit from it?” and I decided to write about the extent to which Wheeler’s “it from bit” helps us to understand the origin of quantum probabilities from a subjective Bayesian point of view.   You can go here to read and rate the essay and it would be especially great if any fellow FQXi members would do that.

## Anyone for frequentist fudge?

Having just returned from several evenings of Bayesian discussion in Växjö, I was inspired to read Facts, Values and Quanta by Marcus Appleby. Whilst not endorsing a completely subjectivist view of probability, the paper is an appropriate remedy for anyone who thinks that the frequentist view is the way to understand probability in physics, and particularly in quantum theory.

In fact, Appleby's paper provides good preparation for tackling a recent paper by Buniy, Hsu and Zee, pointed out by the Quantum Pontiff. The problem they address is how to derive the Born rule within the many-worlds interpretation, or simply from the eigenvalue-eigenstate (EE) link. The EE link says that if you have a system in an eigenstate of some operator, then the system possesses a definite value (the corresponding eigenvalue) for the associated physical quantity with certainty. Note that this is much weaker than the Born rule, since it says nothing about the probabilities for observables of which the system is not in an eigenstate.

An argument dating back to Everett, and also discussed by Graham, Hartle, and by Farhi, Goldstone and Gutmann, runs as follows. Suppose you have a long sequence of identically prepared systems in a product state:

|psi>|psi>|psi>…|psi>

For the sake of definiteness, suppose these are qubits. Now suppose we are interested in some observable, with an eigenbasis given by |0>,|1>. We can construct a sequence of relative frequency operators, the first few of which are:

F1 = |1><1|

F2 = 1/2(|01><01| + |10><10|) + 1|11><11|

F3 = 1/3(|001><001| + |010><010| + |100><100|) + 2/3( |011><011| + |101><101| + |110><110|) + 1|111><111|

It is straightforward to show that, in the limit of infinitely many copies, the state |psi>|psi>|psi>…|psi> becomes an eigenstate of Fn with eigenvalue |&lt;psi|1>|^2. Thus, in this limit, the infinite system possesses a definite value for the relative frequency operator, given by the Born probability rule. The argument is also relevant for many-worlds, since one can show that if the |0> vs. |1> measurement is repeated on the state |psi>|psi>|psi>…|psi>, then the total squared norm of the worlds in which non-Born-rule relative frequencies were found tends to zero.
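To see the convergence concretely, here is a small numerical sketch in plain NumPy (my own illustration, not taken from any of the papers mentioned). It builds Fn as a diagonal matrix in the computational basis and checks that the variance of Fn in the product state shrinks like p(1-p)/n, where p = |&lt;1|psi>|^2:

```python
import numpy as np

def freq_operator(n):
    # F_n is diagonal in the computational basis; the eigenvalue of a
    # basis state is its fraction of 1s, e.g. |011> has eigenvalue 2/3.
    return np.diag([bin(i).count("1") / n for i in range(2 ** n)])

def tensor_power(psi, n):
    # |psi>|psi>...|psi> (n copies)
    out = np.array([1.0])
    for _ in range(n):
        out = np.kron(out, psi)
    return out

p = 0.3
psi = np.array([np.sqrt(1 - p), np.sqrt(p)])  # |psi> = sqrt(1-p)|0> + sqrt(p)|1>

for n in (2, 6, 10):
    F = freq_operator(n)
    state = tensor_power(psi, n)
    mean = state @ F @ state
    var = state @ F @ F @ state - mean ** 2
    # mean stays at p = |<1|psi>|^2 while var = p(1-p)/n shrinks,
    # which is the statement that the product state approaches an
    # eigenstate of F_n as n grows
    print(n, mean, var)
```

The vanishing variance is exactly the sense in which the infinite product state becomes an eigenstate of the frequency operator.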

Of course, there are many possible objections to this argument (see Caves and Schack for a rebuttal of the Farhi, Goldstone and Gutmann version). One is that there are no infinite sequences available in the real world. For finite but large sequences, one can show that although the total squared norm of the worlds with non-Born relative frequencies is small, there are actually still far more of them than worlds which do exhibit Born-rule frequencies. Therefore, since we have no a priori reason to assign a small probability to worlds with small amplitudes (and we do not, because that is precisely what we are trying to derive), we should expect to see non-Born-rule frequencies.
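This count-versus-weight discrepancy is easy to exhibit numerically. Among n qubits, the C(n, k) branches with k ones each carry squared amplitude p^k (1-p)^(n-k). The following sketch (my own, with an arbitrary tolerance delta on the relative frequency) shows that the branches whose frequencies deviate from p carry a tiny total squared norm but vastly outnumber the Born-like branches:

```python
from math import comb, lgamma, log, exp

n, p, delta = 1000, 0.3, 0.05  # delta is an arbitrary illustrative tolerance

def log_pmf(k):
    # log of the total squared norm carried by the C(n, k) branches with k ones
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

born = [k for k in range(n + 1) if abs(k / n - p) <= delta]
non_born = [k for k in range(n + 1) if abs(k / n - p) > delta]

norm_non_born = sum(exp(log_pmf(k)) for k in non_born)
count_non_born = sum(comb(n, k) for k in non_born)
count_born = sum(comb(n, k) for k in born)

print(norm_non_born)                 # tiny total squared norm
print(count_non_born / count_born)   # yet these branches vastly outnumber Born ones
```

The non-Born branches dominate the count because the sheer number of bit strings peaks at frequency 1/2, not at p, while almost all of the squared norm concentrates near frequency p.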

Buniy, Hsu and Zee point out that this problem can be avoided if we assume that the state space is fundamentally discrete, i.e. if || |psi> - |phi> || < epsilon for some small epsilon then |psi> and |phi> are actually the same physical state. They provide a way of discretizing the Hilbert space such that the small-amplitude worlds disappear for some large but finite number of copies of the state. They also argue that this discreteness of the state space might be derived from some future theory of quantum gravity.
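As a rough caricature of how such a cutoff would act (my own sketch, not their actual discretization scheme): lump all the non-Born-frequency branches into a single component of the post-measurement state, and ask how many copies n are needed before that component's norm drops below a putative discreteness scale epsilon, at which point identifying nearby states would erase it:

```python
from math import lgamma, log, exp, sqrt

p, delta, eps = 0.3, 0.05, 1e-6  # delta and eps are arbitrary illustrative choices

def tail_norm(n):
    # Norm of the component of the post-measurement state spanned by
    # branches whose relative frequency deviates from p by more than delta.
    def log_pmf(k):
        return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
                + k * log(p) + (n - k) * log(1 - p))
    return sqrt(sum(exp(log_pmf(k)) for k in range(n + 1)
                    if abs(k / n - p) > delta))

n = 100
while tail_norm(n) >= eps:
    n += 100
# Beyond this n the full state lies within eps of the state with the
# non-Born branches deleted, so a discretization at scale eps would
# identify the two and the deviant worlds would vanish.
print(n)
```

Because the tail norm falls off exponentially in n, a finite (if large) number of copies always suffices for any fixed epsilon, which is the finite-n escape hatch their proposal relies on.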

I have to say that I do not buy their argument at all. For one thing, I hope that the conceptual problems of quantum theory have good answers independently of anything to do with quantum gravity. In any case, the question of whether the successful theory will really entail a discrete state space is still open to doubt. More seriously, it should be realized that the problem they are trying to solve is not unique to quantum mechanics. The same issue exists if one tries to give a frequentist account of classical probability based on large but finite ensembles. In that case, their solution would amount to the Procrustean method of simply throwing away probabilities that are smaller than some epsilon. Hopefully, this already seems like a silly thing to do, but if you still have doubts then you can find persuasive arguments against this approach in the Appleby paper.

For me, the bottom line is that the problem being addressed has nothing to do with quantum theory, but is based on an erroneous frequentist notion of probability. Better to throw out frequentism and use something more sensible, i.e. Bayesian. Even then, the notion of probability in many-worlds remains problematic, but I think that Wallace has given the closest we are likely to get to a derivation of the Born rule for many-worlds along Bayesian lines.