I’m happy to announce the recent publication of a paper by Mike, Wojciech, and me.
Motivated by the advances of quantum Darwinism and recognizing the role played by redundancy in identifying the small subset of quantum states with resilience characteristic of objective classical reality, we explore the implications of redundant records for consistent histories. The consistent histories formalism is a tool for describing sequences of events taking place in an evolving closed quantum system. A set of histories is consistent when one can reason about them using Boolean logic, i.e., when probabilities of sequences of events that define histories are additive. However, the vast majority of the sets of histories that are merely consistent are flagrantly nonclassical in other respects. This embarras de richesses (known as the set selection problem) suggests that one must go beyond consistency to identify how the classical past arises in our quantum universe. The key intuition we follow is that the records of events that define the familiar objective past are inscribed in many distinct systems, e.g., subsystems of the environment, and are accessible locally in space and time to observers. We identify histories that are not just consistent but redundantly consistent using the partial-trace condition introduced by Finkelstein as a bridge between histories and decoherence. The existence of redundant records is a sufficient condition for redundant consistency. It selects, from the multitude of the alternative sets of consistent histories, a small subset endowed with redundant records characteristic of the objective classical past. The information about an objective history of the past is then simultaneously within reach of many, who can independently reconstruct it and arrive at compatible conclusions in the present.
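For the curious, the additivity requirement mentioned in the abstract can be stated compactly in the formalism's standard notation (this is a textbook reminder, not a result of the paper): each history corresponds to a class operator $C_\alpha$, a time-ordered product of projectors, and probabilities are additive when the off-diagonal terms of the decoherence functional vanish:

```latex
D(\alpha,\beta) = \operatorname{Tr}\!\big[ C_\alpha \,\rho\, C_\beta^\dagger \big],
\qquad
\operatorname{Re} D(\alpha,\beta) = 0 \ \ \text{for } \alpha \neq \beta,
\qquad
p(\alpha) = D(\alpha,\alpha).
```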
Consistent histories is the (essentially unique) formalism for describing the past in a fully quantum universe, and this paper shows how to talk about realistic, localized, and above all redundant records about that past. The paper is longer than usual, but that’s mostly pedagogy; the new content is very compact.
FIG. 4. Branching structure of the evolution in the CNOT example. An initial product state of the system and all the environment subsystems evolves according to sequential branching and recording events. Here, the records are made in a single two-level subsystem of the environment for ease of depiction. However, each recording event can be redundant, making many copies of the system's state after each branching event.
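To make the cartoon concrete, here is a minimal numerical sketch of one branching event followed by redundant recording (my toy illustration, not code from the paper): a system qubit in an equal superposition is copied, in the computational basis, into several environment qubits by CNOTs, leaving a GHZ-type state in which every environment qubit holds a perfect record of the branch.

```python
import numpy as np

# Toy sketch of the CNOT example (illustrative only): one system qubit
# branches into a superposition, and CNOT "recording events" copy its
# computational-basis state into n_env environment qubits.

n_env = 3            # number of redundant records
n = 1 + n_env        # total qubits: system (qubit 0) + environment

plus = np.array([1.0, 1.0]) / np.sqrt(2.0)   # system after branching
zero = np.array([1.0, 0.0])                  # blank environment qubit

state = plus
for _ in range(n_env):
    state = np.kron(state, zero)

def cnot(control, target, n_qubits):
    """Permutation matrix for a CNOT on an n_qubits register (qubit 0 = MSB)."""
    dim = 2 ** n_qubits
    U = np.zeros((dim, dim))
    for i in range(dim):
        bits = [(i >> (n_qubits - 1 - q)) & 1 for q in range(n_qubits)]
        if bits[control]:
            bits[target] ^= 1
        j = 0
        for b in bits:
            j = (j << 1) | b
        U[j, i] = 1.0
    return U

# Recording events: copy the system's branch label into each env qubit.
for k in range(1, n):
    state = cnot(0, k, n) @ state

# The result is (|00...0> + |11...1>)/sqrt(2): a GHZ-type state in which
# every environment qubit is perfectly correlated with the system.
probs = np.abs(state) ** 2
for i in np.flatnonzero(probs > 1e-12):
    bits = [(int(i) >> (n - 1 - q)) & 1 for q in range(n)]
    assert all(b == bits[0] for b in bits)   # every record agrees with the system
```

Each environment qubit alone suffices to infer which branch was taken, which is exactly the redundancy the figure depicts.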
The introduction contains a summary of the set selection problem, with a historical overview and many references. If my ranting about the importance of this problem has got you curious enough to dive into the literature, this is where I recommend you start (even if you don’t care about redundant information and Darwinism).
The second and third sections of the paper contain condensed summaries of decoherence & quantum Darwinism, and the consistent histories framework, respectively. We’re trying to tie together two pretty different topics, so we figured many readers would be familiar with only one or the other. But if you’ve seen that stuff before, then skip it.
The meat is in the fourth section, where we extend an old and underappreciated idea due to Finkelstein that links decoherence with consistent histories. Sections five and six are examples and discussion.
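Schematically, and glossing over details spelled out in the paper, Finkelstein's move replaces the full trace in the usual decoherence functional with a partial trace over only a subsystem (the label $\mathcal{B}$ is my notation here). If orthogonal records of the alternatives reside in $\mathcal{B}$, the off-diagonal operators themselves vanish under the partial trace:

```latex
\operatorname{Tr}_{\mathcal{B}}\!\big[ C_\alpha \,\rho\, C_\beta^\dagger \big] = 0
\quad \text{for } \alpha \neq \beta .
```

Since the full trace factors through the partial one, this is strictly stronger than ordinary consistency.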
This paper will eventually be followed up by a sequel that fleshes out the connection with prior work on quantum Darwinism, but it may have to wait until I’ve got a permanent job…