Consistency conditions in consistent histories

[This is akin to a living review, which may improve from time to time. Last edited 2023-4-10.]

This post will summarize the various consistency conditions that are discussed in the consistent histories literature. Most of the conditions have gone by different names under different authors (and sometimes even under the same author), so I’ll try to give all the aliases I know; just follow the footnote markers.

There is an overarching schism in the choice of terminology in the literature between the terms “consistent” and “decoherent”. Many[1] authors, including Gell-Mann and Hartle, now use the term “decoherent” very loosely and no longer employ “consistent” as an official label for any particular condition (or for the formalism as a whole). Zurek and I believe this is a significant loss in terminology, and we are stubbornly resisting it. In our recent arXiv offering, our rant was thus:

…we emphasize that decoherence is a dynamical physical process predicated on a distinction between system and environment, whereas consistency is a static property of a set of histories, a Hamiltonian, and an initial state. For a given decohering quantum system, there is generally a preferred basis of pointer states [1, 8]. In contrast, the mere requirement of consistency does not distinguish a preferred set of histories which describe classical behavior from any of the many sets with no physical interpretation.

(See also the first footnote on page 3347 of “Classical Equations for Quantum Systems”[2], which agrees on the importance of this conceptual distinction.) Since Gell-Mann and Hartle did many of the investigations of consistency conditions, some conditions have only appeared in the literature using their terminology (like “medium-strong decoherence”). Nevertheless, I will exploit my home-blog advantage by putting my preferred terminology in italics and all alternate terminology in the footnotes. I eagerly tilt at windmills.

This post is largely based on papers by Halliwell, Gell-Mann and Hartle, and us, in which many of the various consistency conditions are reviewed.

Notation and terminology

A Heisenberg-picture projector is denoted P(t), and a set of mutually exclusive and exhaustive orthogonal projectors is \{P_a(t)\}, so that P_a(t) P_b(t) = \delta_{ab} P_a(t), P^\dagger_a(t) = P_a(t), and \sum_a P_a(t) = I. For multiple times t_1, \ldots, t_M, a fine-grained history \vec{a}= (a_1 , \ldots, a_M) is constructed by selecting a single projector at each time, where a_m is the alternative at time t_m. The set of all fine-grained histories forms the sample space \Omega. A coarse-grained history \alpha = \{\vec{a}_1,\vec{a}_2,\ldots\} is a subset of \Omega, and it corresponds to a class operator C_\alpha = \sum_i C_{\{\vec{a}_i\}}, where the class operator of a fine-grained history is C_{\{\vec{a}\}} = P_{a_M}^{(M)}(t_M) \cdots P_{a_1}^{(1)}(t_1).
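
To make the notation concrete, here is a minimal numerical sketch (my own illustration, not drawn from any of the papers above) of a toy model I will reuse below: a system qubit S whose z-basis alternatives are followed at two times, with an environment qubit E that gets correlated with S by a CNOT in between. All variable and function names are ad hoc.

    # Toy model: system qubit S and environment qubit E, with a CNOT between t_1 and t_2
    # so that E ends up correlated with S's z-basis value. Heisenberg-picture projectors
    # are taken relative to the initial time. (Illustrative sketch only.)
    import numpy as np
    from itertools import product

    P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]       # z-basis projectors {P_0, P_1} on one qubit
    proj_S = [np.kron(Pa, np.eye(2)) for Pa in P]        # extended by the identity on E

    # Evolution between t_1 and t_2: CNOT with S as control and E as target
    U = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0],
                  [0, 0, 0, 1],
                  [0, 0, 1, 0]], dtype=complex)

    def heisenberg(A, V):
        """Heisenberg-picture operator V^dagger A V, where V evolves from the reference time to t."""
        return V.conj().T @ A @ V

    P_t = [[heisenberg(Pa, np.eye(4)) for Pa in proj_S],   # {P_a(t_1)}: no evolution yet
           [heisenberg(Pa, U) for Pa in proj_S]]           # {P_a(t_2)}: after the CNOT

    # Each set is exclusive and exhaustive
    for Ps in P_t:
        assert np.allclose(Ps[0] @ Ps[1], 0) and np.allclose(sum(Ps), np.eye(4))

    def class_operator(history):
        """C for a fine-grained history (a_1, ..., a_M): the product P_{a_M}(t_M) ... P_{a_1}(t_1)."""
        C = np.eye(4, dtype=complex)
        for m, a in enumerate(history):
            C = P_t[m][a] @ C
        return C

    fine_grained = list(product(range(2), repeat=2))       # the sample space Omega
    coarse = [(0, 0), (0, 1)]                              # a coarse-grained history: a subset of Omega
    C_coarse = sum(class_operator(h) for h in coarse)      # its class operator is the sum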

A history is said to be homogeneous[3] (or is said to have the “chain form”) if its class operator can be written as a chain of projectors at the different times; this obviously includes all the fine-grained histories, but it also includes coarse-grained histories where each time step is coarse-grained independently of the other time steps. Histories that cannot be written in this way are called inhomogeneous.

There is a coarse-graining partial order on the space of all possible coarse-grained histories, denoted by \alpha \preceq \tilde{\alpha}, when the fine-grained histories composing one are contained within the other: \alpha \subseteq \tilde{\alpha}.

The state is given by \rho. The decoherence functional is \mathfrak{D}(\alpha,\beta) = \mathrm{Tr}[ C_\alpha \rho C_\beta^\dagger]. The (unnormalized) conditional states are \rho_\alpha \equiv C_\alpha \rho C_\alpha^\dagger. For a given tensor factorization \mathcal{H} = \mathcal{A} \otimes \mathcal{B} of the Hilbert space, the partial-trace decoherence functional with respect to \mathcal{A} is \mathfrak{D}_\mathcal{A} (\alpha,\beta) = \mathrm{Tr}_\mathcal{A}[ C_\alpha \rho C_\beta^\dagger].
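
The decoherence functional and its partial-trace variant are also easy to transcribe directly. A sketch (mine; the function names and the little Bell-state example are illustrative assumptions, not anything from the literature):

    # Decoherence functional, conditional states, and the partial-trace decoherence functional
    # for a bipartite factorization H = A (x) B with dimensions (dA, dB).
    import numpy as np

    def decoherence_functional(C_a, C_b, rho):
        """D(alpha, beta) = Tr[ C_alpha rho C_beta^dagger ]."""
        return np.trace(C_a @ rho @ C_b.conj().T)

    def conditional_state(C_a, rho):
        """Unnormalized conditional state rho_alpha = C_alpha rho C_alpha^dagger."""
        return C_a @ rho @ C_a.conj().T

    def pt_decoherence_functional(C_a, C_b, rho, dA, dB):
        """D_A(alpha, beta) = Tr_A[ C_alpha rho C_beta^dagger ], an operator on B
        (so 'D_A = 0' is an operator equation, not a single number)."""
        M = C_a @ rho @ C_b.conj().T
        return np.trace(M.reshape(dA, dB, dA, dB), axis1=0, axis2=2)

    # Tiny example: qubits A and B in a Bell state, single-time histories projecting A onto |0> or |1>.
    psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
    rho = np.outer(psi, psi.conj())
    C = [np.kron(np.diag([1.0, 0.0]), np.eye(2)),
         np.kron(np.diag([0.0, 1.0]), np.eye(2))]

    print(decoherence_functional(C[0], C[1], rho))           # 0: the two histories do not interfere
    print(np.trace(conditional_state(C[0], rho)).real)       # p_0 = 1/2
    print(pt_decoherence_functional(C[0], C[1], rho, 2, 2))  # also vanishes as an operator on B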

A (single) consistent history is a history that is a member of any consistent set of histories (defined below); equivalently, a consistent history is one for which \mathrm{Tr}[(I-C_\alpha) \rho C_\alpha^\dagger]=0.

Orthogonal, mutually exclusive projection operators R_\alpha are said to be records[4] of the histories \alpha when C_\alpha \rho = R_\alpha \rho.

In the special case that the state is pure, \rho = \vert \psi \rangle \langle \psi \vert, we give the name branches (or “branch state vectors”[5]) to the conditional pure states \vert \psi_\alpha \rangle \equiv C_\alpha \vert \psi \rangle.
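
For pure states the branches are easy to play with numerically. A small sketch (mine, with ad hoc names) of a single-qubit example; note that the branches always resum to the full state, whether or not the set is consistent, and that for a pure state \mathfrak{D}(\alpha,\beta) is just the overlap \langle \psi_\beta \vert \psi_\alpha \rangle:

    # Branches |psi_alpha> = C_alpha |psi> for a pure state. Illustration: a single qubit in |+>,
    # z-basis projectors at two times with no evolution in between (so P(t) = P).
    import numpy as np
    from itertools import product

    P = [np.diag([1.0, 0.0]).astype(complex), np.diag([0.0, 1.0]).astype(complex)]
    psi = np.array([1, 1], dtype=complex) / np.sqrt(2)       # |+>

    def branch(history):
        v = psi.copy()
        for a in history:                                    # apply P_{a_1}, then P_{a_2}, ...
            v = P[a] @ v
        return v

    branches = {h: branch(h) for h in product(range(2), repeat=2)}

    # The branches always resum to the state:
    assert np.allclose(sum(branches.values()), psi)

    # p_alpha = <psi_alpha|psi_alpha>; these sum to one here because the branches happen
    # to be mutually orthogonal (only the histories (0,0) and (1,1) are nonzero).
    p = {h: np.vdot(v, v).real for h, v in branches.items()}
    print(p, sum(p.values()))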

Fragile probabilities, logical conditions, and physical conditions

The conditions are listed below in roughly increasing order of strength[6]. The conditions in the first group below are generally considered too weak to be physically reasonable, and are now merely of historical and conceptual interest for understanding the formalism. Indeed, Diosi has pointed out that a set of histories required to satisfy only one of these weaker conditions yields a mapping from quantum states to outcome probabilities that is unacceptably sensitive to purely formal adjustments (such as appending non-interacting ancilla systems). In contrast, bona fide probabilities[7] can be assigned to sets of histories satisfying one of the later conditions. The key condition delineating this distinction is that of consistency. All the later conditions are strictly stronger than consistency, and so inherit its robustness.

For this reason, we explicitly state how probabilities are to be defined for each condition in that first group. For all remaining conditions, the probabilities are given by p_\alpha \equiv \mathfrak{D}(\alpha,\alpha) and, under those conditions, are necessarily real, positive, and sum to unity.

All conditions stronger than consistency are motivated at least partially by the desire to resolve the so-called set-selection problem. In other words, the later conditions are built on the foundation of bona fide probabilities (and, hence, consistency), and seek to pare down the enormous space of possible sets of histories in search of ones we could actually identify with the external classical reality we perceive. Such conditions can be separated into two major categories: those based on abstract logic, and those motivated by physical principles.[8] Most of the conditions of the latter type require a preferred tensor factorization of the Hilbert space, namely the system-environment decomposition featured in the decoherence program or the decomposition into many subsystems featured in the quantum Darwinism program.

Consistency conditions

Below, \rho is assumed to be fixed. The histories \alpha and \beta are assumed to range over a set of histories \{ \alpha \} that is exclusive and exhaustive, i.e. \{ \alpha \} forms a partition of \Omega.

Logical conditions admitting fragile probabilities

  • A set of histories \{ \alpha \} is linearly positive[9] when

    (1)   \begin{align*} \mathrm{Re}\, \mathrm{Tr}[ C_\alpha \rho ] \ge 0. \end{align*}

    The probabilities are defined by p_\alpha \equiv \mathrm{Re}\, \mathrm{Tr}[ C_\alpha \rho ].

  • A set of histories \{ \alpha \} satisfies partial consistency[10] when

    (2)   \begin{align*} \mathrm{Re}\, \mathrm{Tr}[ C_\alpha \rho ] \ge 0,\\  \mathrm{Im}\, \mathrm{Tr}[ C_\alpha \rho ] = 0. \end{align*}

    The probabilities are defined by p_\alpha \equiv \mathrm{Re}\, \mathrm{Tr}[ C_\alpha \rho ].

  • A set of histories \{ \alpha \} satisfies weak consistency[11] when

    (3)   \begin{align*} \mathrm{Re}\, \mathfrak{D}(\alpha,\beta) = 0 \quad , \quad \alpha \neq \beta. \end{align*}

    The probabilities are defined by p_\alpha \equiv \mathrm{Re}\, \mathfrak{D}(\alpha,\alpha). (The three conditions in this group are checked numerically in the sketch following this list.)
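
As a concrete illustration, here is a sketch (mine; helper names are ad hoc) of a toy set for which linear positivity and partial consistency hold but weak consistency (and hence consistency, defined below) fail: a single qubit prepared in \vert 0 \rangle, with x-basis projectors at t_1 followed by z-basis projectors at t_2 and trivial dynamics.

    # Checking linear positivity, partial consistency, and weak consistency on a toy set.
    import numpy as np
    from itertools import product

    plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
    minus = np.array([1, -1], dtype=complex) / np.sqrt(2)
    Px = [np.outer(plus, plus.conj()), np.outer(minus, minus.conj())]                 # x-basis at t_1
    Pz = [np.diag([1.0, 0.0]).astype(complex), np.diag([0.0, 1.0]).astype(complex)]   # z-basis at t_2

    rho = np.diag([1.0, 0.0]).astype(complex)                                         # |0><0|
    Cs = {(a, b): Pz[b] @ Px[a] for a, b in product(range(2), repeat=2)}              # class operators
    D = {(h1, h2): np.trace(Cs[h1] @ rho @ Cs[h2].conj().T) for h1 in Cs for h2 in Cs}

    linearly_positive = all(np.trace(C @ rho).real >= -1e-12 for C in Cs.values())
    partially_consistent = linearly_positive and all(
        abs(np.trace(C @ rho).imag) < 1e-12 for C in Cs.values())
    weakly_consistent = all(abs(D[h1, h2].real) < 1e-12 for h1 in Cs for h2 in Cs if h1 != h2)
    consistent = all(abs(D[h1, h2]) < 1e-12 for h1 in Cs for h2 in Cs if h1 != h2)

    # For this set: linear positivity and partial consistency hold, but weak consistency
    # (and hence consistency) fail, because the x-then-z histories interfere.
    print(linearly_positive, partially_consistent, weakly_consistent, consistent)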

Logical conditions not admitting fragile probabilities

  • A set of histories \{ \alpha \} is consistent[12] when

    (4)   \begin{align*} \mathfrak{D}(\alpha,\beta) = 0 \quad , \quad \alpha \neq \beta. \end{align*}

  • A set of histories \{ \alpha \} satisfies preclusive consistency[13] when it is consistent and, for any history \tilde{\alpha}, we have

    (5)   \begin{align*} \left. \begin{array}{c} \alpha \preceq \tilde{\alpha}\\ p_\alpha \neq 0\end{array}\right\} \quad \Rightarrow \quad p_{\tilde{\alpha}}\neq 0, \end{align*}

    even when \tilde{\alpha} \notin \{ \alpha \}.

  • A set of histories \{ \alpha \} is ordered consistent[14] when it is consistent and, for any consistent history \tilde{\alpha}, we have

    (6)   \begin{align*} \alpha \preceq {\tilde{\alpha}} \quad \Rightarrow \quad p_\alpha \le p_{\tilde{\alpha}}, \\ \alpha \succeq {\tilde{\alpha}} \quad \Rightarrow \quad p_\alpha \ge p_{\tilde{\alpha}}, \end{align*}

    even when \tilde{\alpha} \notin \{ \alpha \}.

  • A set of homogeneous histories \{ \alpha \} is strongly ordered consistent[15] when it is consistent and

    (7)   \begin{align*} P_{\alpha_n}(t_n) \le P_{\alpha_m}(t_m) \quad , \quad t_n > t_m \end{align*}

    where the homogeneous histories are defined by C_\alpha = P_{\alpha_M}(t_M) \cdots P_{\alpha_1}(t_1). (A numerical check of the basic consistency condition, and of the probability additivity it guarantees, appears after this list.)
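
Here is a sketch (mine; names are ad hoc) checking the consistency condition for the system–environment toy model from the notation section, along with the payoff for the probabilities: they sum to one and are additive under coarse-graining, which is exactly what fails for an inconsistent set like the x-then-z example above.

    # Consistency and probability additivity for the S-E toy model:
    # z-basis projectors on S at t_1 and t_2 with a CNOT (S -> E) in between.
    import numpy as np
    from itertools import product

    P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
    U = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)  # CNOT
    P_t1 = [np.kron(Pa, np.eye(2)) for Pa in P]                        # P_a(t_1)
    P_t2 = [U.conj().T @ np.kron(Pa, np.eye(2)) @ U for Pa in P]       # P_a(t_2) = U^dag (P_a x I) U

    psi = np.kron(np.array([1, 1], dtype=complex), np.array([1, 0], dtype=complex)) / np.sqrt(2)
    rho = np.outer(psi, psi.conj())                                    # |+>_S |0>_E

    Cs = {(a1, a2): P_t2[a2] @ P_t1[a1] for a1, a2 in product(range(2), repeat=2)}
    D = lambda Ca, Cb: np.trace(Ca @ rho @ Cb.conj().T)

    # Consistency: the decoherence functional is diagonal on this set
    assert all(abs(D(Cs[h1], Cs[h2])) < 1e-12 for h1 in Cs for h2 in Cs if h1 != h2)

    p = {h: D(C, C).real for h, C in Cs.items()}
    assert abs(sum(p.values()) - 1) < 1e-12                            # probabilities sum to one

    # Additivity under coarse-graining: the probability of the union {(0,0), (0,1)}, computed
    # from the summed class operator, equals the sum of the fine-grained probabilities.
    C_union = Cs[(0, 0)] + Cs[(0, 1)]
    assert abs(D(C_union, C_union).real - (p[(0, 0)] + p[(0, 1)])) < 1e-12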

Physical conditions

  • A set of histories \{ \alpha \} is recorded[16] when there exist records R_\alpha such that

    (8)   \begin{align*} C_\alpha \rho = R_\alpha \rho. \end{align*}

  • Given a tensor factorization \mathcal{H} = \mathcal{S} \otimes \mathcal{E} of the Hilbert space, a set of histories \{ \alpha \} is partial-trace consistent with respect to \mathcal{E}, abbreviated \mathcal{E}-consistent[17], when

    (9)   \begin{align*} \mathfrak{D}_{\mathcal{E}}(\alpha,\beta) = 0 \quad , \quad \alpha \neq \beta. \end{align*}

  • Given a tensor factorization \mathcal{H} = \mathcal{S} \otimes \mathcal{E} of the Hilbert space, a set of histories \{ \alpha \} is recorded in \mathcal{E} when there exist records R_\alpha^\mathcal{E} acting non-trivially only on \mathcal{E} such that

    (10)   \begin{align*} C_\alpha \rho = (I^\mathcal{S} \otimes R_\alpha^\mathcal{E}) \rho. \end{align*}

  • Given a tensor factorization \mathcal{H} = \mathcal{H}_1 \otimes \cdots \otimes \mathcal{H}_N of the Hilbert space, a set of histories \{ \alpha \} is redundantly consistent[18] when it is \mathcal{H}_n-consistent for all n. In other words,

    (11)   \begin{align*} \mathfrak{D}_{\mathcal{H}_n}(\alpha,\beta) = 0 \quad , \quad \alpha \neq \beta \quad , \quad \forall n. \end{align*}

  • Given a tensor factorization \mathcal{H} = \mathcal{H}_1 \otimes \cdots \otimes \mathcal{H}_N of the Hilbert space, a set of histories \{ \alpha \} is redundantly recorded when it is recorded in \mathcal{H}_n for all n. In other words, there exist records R_\alpha^{\mathcal{H}_n} acting non-trivially only on {\mathcal{H}_n} such that

    (12)   \begin{align*} C_\alpha \rho = (I^{\setminus \mathcal{H}_n} \otimes R_\alpha^{\mathcal{H}_n} ) \rho \quad , \quad \forall n, \end{align*}

    where I^{\setminus \mathcal{H}_n} denotes the identity on all subsystems except \mathcal{H}_n. (The environment-referenced conditions in this group are checked on a small example in the sketch following this list.)
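
To illustrate the environment-referenced conditions, here is a sketch (mine; the example and helper names are illustrative assumptions) in which a system qubit’s z-bit has already been copied into two environment qubits in the initial state, with trivial dynamics, so the single-time z-basis histories of the system are redundantly recorded and therefore redundantly consistent.

    # Redundant records and redundant consistency on a three-qubit example:
    # factorization H = S (x) E1 (x) E2, initial state (|000> + |111>)/sqrt(2),
    # trivial dynamics, histories given by z-basis projectors on S at a single time.
    import numpy as np

    dims = [2, 2, 2]

    def embed(op, site, dims):
        """Tensor a single-site operator with identities on the other factors."""
        ops = [np.eye(d, dtype=complex) for d in dims]
        ops[site] = op
        out = ops[0]
        for o in ops[1:]:
            out = np.kron(out, o)
        return out

    def partial_trace(M, dims, traced_sites):
        """Trace M over the factors listed in traced_sites, returning an operator on the rest."""
        T = M.reshape(dims + dims)                  # one row index and one column index per factor
        for s in sorted(traced_sites, reverse=True):
            T = np.trace(T, axis1=s, axis2=s + T.ndim // 2)
        d = int(np.sqrt(T.size))
        return T.reshape(d, d)

    Pz = [np.diag([1.0, 0.0]).astype(complex), np.diag([0.0, 1.0]).astype(complex)]
    psi = np.zeros(8, dtype=complex)
    psi[0] = psi[7] = 1 / np.sqrt(2)                # (|000> + |111>)/sqrt(2)
    rho = np.outer(psi, psi.conj())

    C = [embed(Pz[a], 0, dims) for a in range(2)]   # class operators: project S onto |a>

    # Redundantly consistent: Tr_{H_n}[ C_0 rho C_1^dag ] = 0 for every single factor H_n
    X = C[0] @ rho @ C[1].conj().T
    assert all(np.allclose(partial_trace(X, dims, [n]), 0) for n in range(3))

    # Redundantly recorded: a z-basis projector on any single factor reproduces C_a rho
    for n in range(3):
        for a in range(2):
            assert np.allclose(C[a] @ rho, embed(Pz[a], n, dims) @ rho)

    # Tracing the rest of a vanishing partial-trace functional recovers the full functional,
    # so any one of these partial-trace conditions already implies ordinary consistency.
    assert abs(np.trace(X)) < 1e-12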

Other conditions

For completeness, we also include these two other conditions which are rarely used or discussed.

  • A set of histories satisfies homogeneous-specific weak consistency[19] when, for all the homogeneous class operators C_\alpha and C_\beta whose sum C_\alpha + C_\beta is also homogeneous, we have

    (13)   \begin{align*} \mathrm{Re}\, \mathfrak{D}(\alpha,\beta) = 0 \quad , \quad \alpha \neq \beta. \end{align*}

    The probabilities are defined by p_\alpha \equiv \mathrm{Re}\,  \mathfrak{D}(\alpha,\alpha).

  • Let \rho = \sum_\mu q_\mu \vert \psi_\mu \rangle \langle \psi_\mu \vert be the expansion of \rho in terms of its eigenstates \vert \psi_\mu \rangle and eigenvalues q_\mu. A set of histories satisfies medium strong consistency[20] when, for each \mu with q_\mu strictly positive, there exists a set of mutually exclusive orthogonal projectors R_\alpha^{(\mu)} such that

    (14)   \begin{align*} C_\alpha \vert \psi_\mu \rangle = R_\alpha^{(\mu)} \vert \psi_\mu \rangle. \end{align*}

    If the non-zero q_\mu are degenerate, then this must hold for at least one choice of the eigenstates \vert \psi_\mu \rangle.

The condition of homogeneous-specific weak consistency only made sense in the early days of the consistent histories program, when attention was generally restricted to homogeneous histories.

The condition of medium strong consistency made only one brief appearance, in a paper by Gell-Mann and Hartle. It is unappealing because (1) it is probably not robust under a more complicated purification of \rho and (2) it becomes ungainly upon degeneracy.

[Figure: Venn diagram showing the relationship between those consistency conditions with important implications for probability robustness. Only “consistency” (aka “medium decoherence”) is considered sufficient to guarantee a sensible assignment. This image is just a colorized knock-off of one by Halliwell.]

Discussion

The condition of homogeneous-specific weak consistency is only of historical interest, having been superseded by weak consistency, which dispenses with the restriction to homogeneous class operators. See footnote 4 on page 3353 of “Classical Equations for Quantum Systems”[21] for more info.

The condition of partial consistency was first discussed by Diosi in an arXiv paper that was never fully published. Most of the content in that paper eventually made its way to the PRL version over a decade later, but the discussion of this condition did not appear there. It was first discussed in print, and named “partial decoherence”, by Halliwell.[22]

Records always imply consistency and, when the state is pure (\rho = \vert \psi \rangle \langle \psi \vert), they are equivalent (see the sketch below). The same is true for records in \mathcal{E} and \mathcal{E}-consistency, and for redundant records and redundant consistency. Consistency also implies medium strong consistency for pure states.
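
For the pure-state case, the converse direction (consistency implies records) can be made concrete: one may simply take R_\alpha to be the projector onto the normalized branch \vert \psi_\alpha \rangle. A sketch of that construction (mine, with ad hoc names), reusing the consistent S–E toy set from above:

    # For a pure state, build records directly from the branches of a consistent set:
    # R_alpha = |psi_alpha><psi_alpha| / <psi_alpha|psi_alpha>.
    import numpy as np
    from itertools import product

    P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
    U = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)  # CNOT
    P_t1 = [np.kron(Pa, np.eye(2)) for Pa in P]
    P_t2 = [U.conj().T @ np.kron(Pa, np.eye(2)) @ U for Pa in P]
    psi = np.kron(np.array([1, 1], dtype=complex), np.array([1, 0], dtype=complex)) / np.sqrt(2)

    Cs = {(a1, a2): P_t2[a2] @ P_t1[a1] for a1, a2 in product(range(2), repeat=2)}
    branches = {h: C @ psi for h, C in Cs.items()}

    # Zero-probability histories need no record; for the rest, project onto the branch.
    records = {h: np.outer(v, v.conj()) / np.vdot(v, v).real
               for h, v in branches.items() if np.vdot(v, v).real > 1e-12}

    # These are mutually orthogonal projectors (because the branches of a consistent set
    # are orthogonal), and they satisfy the record condition C_alpha |psi> = R_alpha |psi>.
    for h, R in records.items():
        assert np.allclose(R @ R, R) and np.allclose(R.conj().T, R)
        assert np.allclose(R @ psi, branches[h])
    for h1 in records:
        for h2 in records:
            if h1 != h2:
                assert np.allclose(records[h1] @ records[h2], 0)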

Partial-trace consistency with respect to a subsystem implies partial-trace consistency with respect to any larger subsystem that contains it, which in turn implies consistency, since taking further partial traces of a vanishing operator still gives zero.

Of those conditions stronger than consistency, the strictly logical ones (preclusive consistency and ordered consistency) are designed to avoid the so-called contrary inferences that are possible when considering two consistent but mutually incompatible sets. Ordered consistency is a strictly stronger condition than preclusive consistency. Wallden argues that requiring preclusive consistency is more essential than ordered consistency because propositions assigned extremal probability (0 or 1) have special ontological status, and that ordered consistency may be unreasonably difficult to check computationally.

When the histories are homogeneous, strongly ordered consistency is a strictly stronger condition than ordered consistency. No one has tried to define it for sets containing inhomogeneous histories.

One may also consider weakened forms of redundant recording and redundant consistency, where the respective equation holds only for n ranging over some large subset of the N subsystems. For instance, it might be interesting to look at a thermodynamic limit where histories are redundantly recorded in “most” subsystems as N \to \infty.

[Future tasks: (1) better understand Gell-Mann and Hartle’s two-part “strong decoherence” and connect it to the importance of homogeneity (chains), (2) discuss the dependence of preclusive and ordered consistency on the choice of the sample space \Omega, (3) add feasibility, (4) say whether homogeneous-specific weak consistency applies only to sets which contain only homogeneous histories, and (5) discuss the relationship between Finkelstein, records, and strong decoherence as it appears in “Classical Equations for Quantum Systems”[23].]


Footnotes


  1. I used to say “most authors” here, but some quick Google Scholar searches suggest “consistent” is now more popular than “decoherent”, and that might always have been true.
  2. Gell-Mann and Hartle
  3. Similarly, Isham has divided various attempts at set selection principles into ones that are mathematical (i.e. maximizing some function) and physical. However, the physical principle he considers is too vague to be useful, and if taken literally (rather than as a guiding principle) would produce far too many histories for them to remain consistent for realistic time scales.
  4. Gell-Mann and Hartle call these “generalized records” to emphasize that they need not be feasibly accessible to any observer. Indeed, they will contrast with our concept of “redundant records”, which will be much closer to the intuitive idea of a record.
  5. Gell-Mann and Hartle
  6. The order the conditions form under the relation of logical implication is not total, i.e. they can’t all be neatly ordered from weakest to strongest because some are not comparable.
  7. In the words of Diosi, “von Neumann probabilities”.
  8. Similarly, Anastopoulos has suggested a split between precise mathematical postulates and physically inspired ones.
  9. Goldstein and Page
  10. This was named “partial decoherence” by Halliwell.
  11. This is usually called “weak decoherence” (by Gell-Mann and Hartle) or, occasionally, just “consistency” (by Halliwell).
  12. This is most commonly called “medium decoherence” (e.g. by Gell-Mann and Hartle). It was called “Gell-Mann-Hartle (GH) decoherence” by Finkelstein when contrasting it with partial-trace (PT) decoherence. It has also been called just “decoherence” (e.g. by Halliwell) and it is equivalent to the “non-interference condition” in the context of path integrals discussed by Griffiths. We like to follow Kent and others who call it just “consistency”. This is because we no longer consider the weaker conditions of “weak consistency”, “partial consistency”, or “linear positivity” to be viable, and because we wish to reserve “decoherence” for dynamical processes.
  13. Wallden, “Contrary Inferences in Consistent Histories and a Set Selection Criterion” [arXiv:1402.3733].
  14. Kent, “Quantum Histories and Their Implications” [arXiv:gr-qc/9607073].
  15. This condition has never been given its own name in the literature. Gell-Mann and Hartle considered this condition to be part I of their (new) “strong decoherence” condition. The other half of that condition, Part II, is apparently something akin to Finkelstein’s partial-trace consistency. However, I cannot for the life of me understand the notation in the paper in which they introduce this term.
  16. This has been called “strong decoherence” by Gell-Mann and Hartle, but in later work they decided the condition was too strong to be useful. Years afterwards they appropriated the term “strong decoherence” for something else.
  17. This has been called “Partial-trace (PT) decoherence” by Finkelstein where the system being traced out is assumed to be some given environment. This could potentially be called “\mathcal{E}-decoherence” when following the terminology of Gell-Mann and Hartle. Also, note that Gell-Mann and Hartle apparently considered something reminiscent of this condition to be part II of their (new) “strong decoherence” condition. However, I haven’t yet been able to understand the notation in that paper.
  18. This could potentially be called “redundant decoherence” when following the terminology of Gell-Mann and Hartle.
  19. This condition was never formally named. It was featured in the work of Omnes and Griffiths.
  20. This was called “medium strong decoherence” by Gell-Mann and Hartle.
  21. Gell-Mann and Hartle
  22. I am unsure whether partial consistency has been ruled out quite as conclusively as linear positivity and weak consistency were by Diosi (it being the condition identified by Diosi in an attempt to rescue linear positivity). However, I know of no reason to think that it is at all attractive.
  23. Gell-Mann and Hartle.