
Non-Markovianity hinders Quantum Darwinism
Fernando Galve, Roberta Zambrini, and Sabrina Maniscalco

We investigate Quantum Darwinism and the emergence of a classical world from the quantum one in connection with the spectral properties of the environment. We use a microscopic model of quantum environment in which, by changing a simple system parameter, we can modify the information backflow from environment into the system, and therefore its non-Markovian character. We show that the presence of memory effects hinders the emergence of classical objective reality, linking these two apparently unrelated concepts via a unique dynamical feature related to decoherence factors.

Galve and collaborators recognize that the recent Nat. Comm. by Brandao et al. is not as universal as it is sometimes interpreted, because the records that are proved to exist can be trivial (containing no information). So Galve et al. correctly emphasize that Darwinism depends on the particular dynamics found in our universe, and the effectiveness of record production is in principle an open question.
Their main model is a harmonic oscillator in an oscillator bath (with bilinear spatial couplings, as usual) and with a spectral density that is concentrated as a hump in some finite window. (See black line with grey shading in Fig. 3.) They then vary the system's frequency with respect to this window. Outside the window, the system and environment decouple and nothing happens. Inside the window, there is good production of records and Darwinism. At the edges of the window, there is non-Markovianity as information about the system leaks into the environment but then flows back into the system from time to time. They measure non-Markovianity by the periods during which the distinguishability between system states at different times increases (rather than decreasing monotonically, as it must for divisible completely positive dynamics).
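To make the witness concrete, here is a minimal numpy sketch (my own toy qubit model, not the paper's oscillator bath) of a BLP-style non-Markovianity measure: a qubit dephases with a hypothetical decoherence factor r(t) chosen to have revivals, and we sum up the intervals during which the trace distance between two evolving states increases.

```python
import numpy as np

# Toy dephasing channel: the off-diagonal element of the density matrix is
# multiplied by a decoherence factor r(t). This r(t) is hypothetical, chosen
# only so that it has revivals (non-monotonic decay).
def decoherence_factor(t):
    return np.exp(-0.3 * t) * np.abs(np.cos(2.0 * t))

def trace_distance(rho, sigma):
    # D(rho, sigma) = (1/2) * sum |eigenvalues| of the Hermitian difference
    return 0.5 * np.sum(np.abs(np.linalg.eigvalsh(rho - sigma)))

def evolve(rho, t):
    out = rho.astype(complex).copy()
    r = decoherence_factor(t)
    out[0, 1] *= r
    out[1, 0] *= r
    return out

# Two initial states chosen to be maximally distinguishable under dephasing
plus = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)    # |+><+|
minus = 0.5 * np.array([[1, -1], [-1, 1]], dtype=complex)  # |-><-|

ts = np.linspace(0, 10, 2001)
D = np.array([trace_distance(evolve(plus, t), evolve(minus, t)) for t in ts])

# BLP-style witness: total growth of distinguishability over all revivals.
increments = np.diff(D)
N = increments[increments > 0].sum()
print(N > 0)  # True: the revivals in r(t) signal non-Markovian dynamics
```

For a Markovian (divisible CP) evolution the increments would all be non-positive and N would vanish; any positive N flags information flowing back into the system.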

An Introduction to Quantum Error Correction
Daniel Gottesman

Quantum states are very delicate, so it is likely some sort of quantum error correction will be necessary to build reliable quantum computers. The theory of quantum error-correcting codes has some close ties to and some striking differences from the theory of classical error-correcting codes. Many quantum codes can be described in terms of the stabilizer of the codewords. The stabilizer is a finite Abelian group, and allows a straightforward characterization of the error-correcting properties of the code. The stabilizer formalism for quantum codes also illustrates the relationships to classical coding theory, particularly classical codes over GF(4), the finite field with four elements.
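A toy illustration of the stabilizer idea (my own sketch, not from the paper): the three-qubit bit-flip code is stabilized by Z1Z2 and Z2Z3, and measuring these two parities yields a syndrome that uniquely identifies any single X error without disturbing the encoded information. Since Z-type stabilizers just compute parities of the X-error pattern, the bookkeeping can be simulated classically:

```python
# Stabilizer generators of the 3-qubit bit-flip code, Z1Z2 and Z2Z3, written
# as parity checks on the X-error pattern (a Z-type stabilizer anticommutes
# with an X error exactly where the check has support).
checks = [(0, 1), (1, 2)]

def syndrome(x_errors):
    # x_errors: tuple of 0/1 flags marking which qubits suffered an X error
    return tuple(sum(x_errors[q] for q in c) % 2 for c in checks)

# Syndrome lookup table built from the correctable errors (weight <= 1)
table = {syndrome(e): e for e in [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]}

def correct(x_errors):
    # Apply the recovery operation suggested by the measured syndrome
    guess = table[syndrome(x_errors)]
    return tuple((a + b) % 2 for a, b in zip(x_errors, guess))

# Every single bit-flip error is mapped back to the identity
for e in [(1, 0, 0), (0, 1, 0), (0, 0, 1)]:
    assert correct(e) == (0, 0, 0)
print("all single bit-flip errors corrected")
```

The four possible syndromes partition the single-error patterns exactly, which is the (classical shadow of the) stabilizer characterization of the code's error-correcting properties.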
The density-matrix renormalization group in the age of matrix product states
Ulrich Schollwoeck

The density-matrix renormalization group method (DMRG) has established itself over the last decade as the leading method for the simulation of the statics and dynamics of one-dimensional strongly correlated quantum lattice systems. In the further development of the method, the realization that DMRG operates on a highly interesting class of quantum states, so-called matrix product states (MPS), has allowed a much deeper understanding of the inner structure of the DMRG method, its further potential and its limitations. In this paper, I want to give a detailed exposition of current DMRG thinking in the MPS language in order to make the advisable implementation of the family of DMRG algorithms in exclusively MPS terms transparent. I then move on to discuss some directions of potentially fruitful further algorithmic development: while DMRG is a very mature method by now, I still see potential for further improvements, as exemplified by a number of recently introduced algorithms.

Another excellent introduction, this time to matrix product states and the density-matrix renormalization group, albeit as part of a much larger review.
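As a minimal illustration of the MPS idea (a standard construction, sketched by me rather than taken from Schollwoeck's review): any state vector can be brought to matrix product form by a left-to-right sweep of SVDs, and truncating the singular values at each bond gives the compressed representation DMRG works with.

```python
import numpy as np

def to_mps(psi, n, chi_max=None):
    """Decompose an n-qubit state vector into matrix product form by
    sweeping left to right with SVDs. chi_max, if given, truncates each
    bond dimension (lossy compression); otherwise the decomposition is exact."""
    tensors = []
    rest = psi.reshape(1, -1)              # (bond dim, remaining physical dims)
    for _ in range(n - 1):
        bond = rest.shape[0]
        rest = rest.reshape(bond * 2, -1)  # split off one physical index
        u, s, vh = np.linalg.svd(rest, full_matrices=False)
        if chi_max is not None:
            u, s, vh = u[:, :chi_max], s[:chi_max], vh[:chi_max]
        tensors.append(u.reshape(bond, 2, -1))  # (left, physical, right)
        rest = s[:, None] * vh             # push singular values to the right
    tensors.append(rest.reshape(-1, 2, 1))
    return tensors

def from_mps(tensors):
    # Contract the chain back into a dense state vector
    out = tensors[0]
    for t in tensors[1:]:
        out = np.tensordot(out, t, axes=(out.ndim - 1, 0))
    return out.reshape(-1)

rng = np.random.default_rng(0)
n = 6
psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
psi /= np.linalg.norm(psi)
mps = to_mps(psi, n)                       # no truncation: exact
print(np.allclose(from_mps(mps), psi))     # True
```

For a random state the bond dimensions grow exponentially toward the middle of the chain; the point of DMRG is that ground states of gapped 1D Hamiltonians are well approximated with a small, fixed chi_max.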

The Logic of the Past Hypothesis
David Wallace

I attempt to get as clear as possible on the chain of reasoning by which irreversible macrodynamics is derivable from time-reversible microphysics, and in particular to clarify just what kinds of assumptions about the initial state of the universe, and about the nature of the microdynamics, are needed in these derivations. I conclude that while a 'Past Hypothesis' about the early Universe does seem necessary to carry out such derivations, that Hypothesis is not correctly understood as a constraint on the early Universe's entropy.
This paper is interesting because it casts Jaynes's method as imposing the "simple" initial condition (i.e., low algorithmic entropy of description, I guess?) at the arbitrary time at which we start analyzing a system.
Under this interpretation of Jaynes's work, the maximum-entropy principle predicts anti-thermodynamic behavior (i.e., a reversed second law) for all systems before we start analyzing them. This is mostly a consequence of Wallace's realist stance.
Wallace claims there is a difference between the traditional "Low Entropy Past" hypothesis of Boltzmann and everyone else, and his personal "Simple Past" hypothesis. This is the crux, but I don't understand yet how these are different.

Some key questions: what would the world look like if the simple initial condition were imposed at a time only modestly in the past rather than at the very beginning? What would our records look like? Could we look at the evidence and determine the exact time? What if there were multiple times for multiple systems? What would that look like?

In the video, Wallace says it's wrong to attribute our choice of coarse-graining to those variables that we find "interesting". (I might find the value of the stock market interesting, but that doesn't mean I can integrate everything else out and find dynamical equations for it.) Rather, he says, it's important that the choice of coarse-grained variables doesn't throw away key information that allows one to get evolution laws. (I.e., local gas densities might obey predictive laws, but a random coarse-graining function on 3N positions and velocities will not.)
But this isn't a complete story, since we obviously do throw away some information (e.g., the higher-level laws are often stochastic), and it's not clear that there aren't alternative variables we could choose that do obey predictive laws but are unrelated to our experience or anything we care about.

He says it's wrong to summarize the past hypothesis as "the early universe was in a low-entropy state", because that's a statement about it being in a certain macrostate associated with low entropy. Rather, the past hypothesis is the assumption that the early universe isn't very carefully correlated at the micro-level so as to conspire today to cause statistical mechanics to break (in the same way that if you took an irreversible process and looked at its time-inverse, it would look like there were a bunch of carefully fine-tuned conspiracies).
But is that true? Hypothesizing that the early universe had a region at high temperature separate from one at low temperature (which is low entropy) is the sort of thing we need to get irreversible laws, right? If we were to hypothesize merely that the state of the early universe didn't have careful fine-tuned conspiracies, but was otherwise high-entropy, then we would already be in heat death now. It's true that the no-conspiracy hypothesis would guarantee we'd persist in thermal equilibrium, rather than miraculously evolving into a low-entropy state, but that's not enough to explain the world around us. (So maybe he's not trying to explain the world around us, and is just trying to explain the time-irreversibility of macroscopic laws? I don't think so, because if we were and continued to be in thermal equilibrium, then the macroscopic laws would be time-reversible!)
Likewise, he says "the fact that our macro laws are entropy increasing means it's not very difficult to infer from the historical record that the past had lower entropy". No way! The macro laws are entropy non-decreasing, but (without a past hypothesis) the most likely state agreeing with a mid-entropy present-day observation is a conspiratorial high-entropy state in the past.

Licence
foreXiv by C. Jess Riedel is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.