
Non-Markovianity hinders Quantum Darwinism
Fernando Galve, Roberta Zambrini, and Sabrina Maniscalco

We investigate Quantum Darwinism and the emergence of a classical world from the quantum one in connection with the spectral properties of the environment. We use a microscopic model of a quantum environment in which, by changing a simple system parameter, we can modify the information backflow from environment into the system, and therefore its non-Markovian character. We show that the presence of memory effects hinders the emergence of classical objective reality, linking these two apparently unrelated concepts via a unique dynamical feature related to decoherence factors.

Galve and collaborators recognize that the recent Nat. Comm. by Brandao et al. is not as universal as it is sometimes interpreted, because the records that are proved to exist can be trivial (containing no information). So Galve et al. correctly emphasize that Darwinism depends on the particular dynamics found in our universe, and that the effectiveness of record production is in principle an open question.
Their main model is a harmonic oscillator in an oscillator bath (with bilinear spatial couplings, as usual) and with a spectral density concentrated in a hump over some finite frequency window. (See the black line with grey shading in Fig. 3.) They then vary the system's frequency with respect to this window. Outside the window, the system and environment decouple and nothing happens. Inside the window, there is good production of records and Darwinism. At the edges of the window, there is non-Markovianity: information about the system leaks into the environment but then flows back into the system from time to time. They quantify non-Markovianity by the intervals during which the fidelity between the system's states at two different times is increasing, rather than changing monotonically as it must for divisible (Markovian) dynamics.
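The revival-based witness can be sketched numerically. Below is a minimal toy model, not the authors' oscillator model: a single qubit under pure dephasing, with an illustrative decoherence factor f(t) of my own choosing that oscillates while decaying. I use the common BLP-style trace-distance witness of information backflow in place of the paper's fidelity measure; all parameters are arbitrary.

```python
import numpy as np

def trace_distance(r1, r2):
    """Trace distance D = 0.5 * ||r1 - r2||_1 for Hermitian 2x2 matrices."""
    evals = np.linalg.eigvalsh(r1 - r2)
    return 0.5 * np.sum(np.abs(evals))

def dephased(rho, f):
    """Pure-dephasing channel: off-diagonals multiplied by decoherence factor f."""
    out = rho.copy()
    out[0, 1] *= f
    out[1, 0] *= f
    return out

# Two initially orthogonal states |+> and |-> (an optimal pair for dephasing).
plus = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)
minus = 0.5 * np.array([[1, -1], [-1, 1]], dtype=complex)

ts = np.linspace(0, 10, 2000)
f = np.exp(-0.1 * ts) * np.cos(ts)   # toy decoherence factor with revivals
D = np.array([trace_distance(dephased(plus, fi), dephased(minus, fi)) for fi in f])

# BLP-style measure: total growth of distinguishability over intervals
# where it increases; positive value signals memory effects (backflow).
dD = np.diff(D)
N_blp = dD[dD > 0].sum()
print(f"non-Markovianity measure N = {N_blp:.3f}")
```

For a purely decaying decoherence factor (e.g., drop the cosine) the distinguishability shrinks monotonically and the measure vanishes, matching the Markovian case.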

An Introduction to Quantum Error Correction
Daniel Gottesman

Quantum states are very delicate, so it is likely some sort of quantum error correction will be necessary to build reliable quantum computers. The theory of quantum error-correcting codes has some close ties to and some striking differences from the theory of classical error-correcting codes. Many quantum codes can be described in terms of the stabilizer of the codewords. The stabilizer is a finite Abelian group, and allows a straightforward characterization of the error-correcting properties of the code. The stabilizer formalism for quantum codes also illustrates the relationships to classical coding theory, particularly classical codes over GF(4), the finite field with four elements.
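The stabilizer description is easy to make concrete with the smallest example. A minimal sketch (my illustration, not taken from the paper): the three-qubit bit-flip code, whose stabilizer is generated by Z1Z2 and Z2Z3, written in the binary symplectic (x|z) representation; the error syndrome is just the pattern of anticommutation between the error and the generators.

```python
import numpy as np

# Stabilizer generators of the three-qubit bit-flip code in binary
# symplectic form (x-part | z-part): Z Z I and I Z Z have only z-parts.
stabilizers = np.array([
    [0, 0, 0, 1, 1, 0],   # Z Z I
    [0, 0, 0, 0, 1, 1],   # I Z Z
])

def syndrome(error, gens):
    """Symplectic inner product of the error with each generator (mod 2).
    A 1 means the error anticommutes with that generator."""
    n = gens.shape[1] // 2
    ex, ez = error[:n], error[n:]
    gx, gz = gens[:, :n], gens[:, n:]
    return (gx @ ez + gz @ ex) % 2

# An X error on each qubit yields a distinct syndrome, so single
# bit-flips can be identified and corrected.
for k in range(3):
    err = np.zeros(6, dtype=int)
    err[k] = 1                      # X on qubit k
    print(f"X on qubit {k}: syndrome {syndrome(err, stabilizers)}")
```

The same symplectic bookkeeping, extended over GF(4), is what connects stabilizer codes to the classical codes mentioned in the abstract.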
The density-matrix renormalization group in the age of matrix product states
Ulrich Schollwoeck

The density-matrix renormalization group method (DMRG) has established itself over the last decade as the leading method for the simulation of the statics and dynamics of one-dimensional strongly correlated quantum lattice systems. In the further development of the method, the realization that DMRG operates on a highly interesting class of quantum states, so-called matrix product states (MPS), has allowed a much deeper understanding of the inner structure of the DMRG method, its further potential and its limitations. In this paper, I want to give a detailed exposition of current DMRG thinking in the MPS language in order to make the advisable implementation of the family of DMRG algorithms in exclusively MPS terms transparent. I then move on to discuss some directions of potentially fruitful further algorithmic development: while DMRG is a very mature method by now, I still see potential for further improvements, as exemplified by a number of recently introduced algorithms.

Another excellent introduction, this time to matrix product states and the density-matrix renormalization group, albeit as part of a much larger review.

The Logic of the Past Hypothesis
David Wallace

I attempt to get as clear as possible on the chain of reasoning by which irreversible macrodynamics is derivable from time-reversible microphysics, and in particular to clarify just what kinds of assumptions about the initial state of the universe, and about the nature of the microdynamics, are needed in these derivations. I conclude that while a 'Past Hypothesis' about the early Universe does seem necessary to carry out such derivations, that Hypothesis is not correctly understood as a constraint on the early Universe's entropy.
This paper is interesting because it casts Jaynes's method as imposing the "simple" initial condition (i.e., low algorithmic entropy of description, I guess?) at an arbitrary time, namely whenever we start analyzing a system.
Under this interpretation of Jaynes's work, the maximum-entropy principle is predicting anti-thermodynamic behavior (i.e., a reversed second law) for all systems before we start analyzing them. This is mostly a result of Wallace's realist stance.
Wallace claims there is a difference between the traditional "Low Entropy Past" hypothesis of Boltzmann and everyone else, and his personal "Simple Past" hypothesis. This is the crux, but I don't understand yet how these are different.

Some key questions: what would the world look like if the simple initial condition were imposed at a time only modestly in the past rather than at the very beginning?
What would our records look like? Could we look at evidence and determine the exact time? What if there were multiple times for multiple systems? What would that look like?

In the video, Wallace says it's wrong to attribute our choice of coarse-graining to those variables that we find "interesting". (I might find the value of the stock market interesting, but that doesn't mean I can integrate everything else out and find dynamical equations for it.) Rather, he says, it's important that the choice of coarse-grained variables doesn't throw away key information that allows one to get evolution laws. (I.e., local gas densities might have predictive laws, but a random coarse-graining function on 3N positions and velocities will not.)
But this isn't a complete story, since we obviously do throw away some information (e.g., the higher-level laws are often stochastic), and it's not clear that there aren't alternative variables we could choose that do have predictive laws but are unrelated to our experience or anything we care about.

He says it's wrong to summarize the past hypothesis as "the early universe was in a low-entropy state" because that's a statement about it being in a certain macrostate associated with low entropy. Rather, the past hypothesis is about assuming the early universe isn't very carefully correlated at the micro-level so as to conspire today to cause statistical mechanics to break (in the same way that if you took an irreversible process and looked at its time-inverse, it would look like there were a bunch of carefully fine-tuned conspiracies).
But is that true? Hypothesizing that the early universe had a region of high temperature separate from one at low temperature (which is low entropy) is the sort of thing we need to get irreversible laws, right? If we were to hypothesize merely that the state of the early universe didn't have careful fine-tuned conspiracies, but was otherwise high-entropy, then we would already be in heat death now. It's true that the no-conspiracy hypothesis would guarantee we'd persist in thermal equilibrium, rather than miraculously evolving into a low-entropy state, but that's not enough to explain the world around us. (So maybe he's not trying to explain the world around us, and just trying to explain the time-irreversibility of macroscopic laws? I don't think so, because if we were and continued to be in thermal equilibrium, then the macroscopic laws would be time-reversible!)
Likewise, he says "the fact that our macro laws are entropy increasing means it's not very difficult to infer from the historical record that the past had lower entropy". No way! The macro laws are entropy non-decreasing, but (without a past hypothesis) the most likely microstate that agrees with a mid-entropy present-day observation is a conspiratorial one with a high-entropy state in the past.
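This objection can be checked in a toy model (my own illustration; all parameters are arbitrary): free-streaming particles on a ring, a reversible Hamiltonian dynamics. Starting from a mid-entropy macrostate (all particles in the left half) with a typical random choice of velocities, the coarse-grained entropy is higher both forward and backward in time; naive retrodiction without a past hypothesis therefore gives a higher-entropy past.

```python
import numpy as np

rng = np.random.default_rng(0)
N, L, nbins = 100_000, 1.0, 20

def coarse_entropy(x):
    """Shannon entropy of the coarse-grained position histogram."""
    counts, _ = np.histogram(x % L, bins=nbins, range=(0, L))
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

# Mid-entropy macrostate: all particles in the left half of a periodic
# box, with a *typical* (random) velocity assignment for that macrostate.
x0 = rng.uniform(0, L / 2, N)
v = rng.normal(0, 1, N)

t = 0.3
S_now = coarse_entropy(x0)
S_future = coarse_entropy(x0 + v * t)   # evolve forward in time
S_past = coarse_entropy(x0 - v * t)     # evolve backward (retrodiction)

print(f"S(past) = {S_past:.3f}, S(now) = {S_now:.3f}, S(future) = {S_future:.3f}")
# Typical microstates give S(past) > S(now) and S(future) > S(now):
# retrodiction without a past hypothesis predicts a higher-entropy past.
```

The forward/backward symmetry here is exactly the point: only an extra assumption about the past (not derivable from the dynamics) breaks it.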

Licence
foreXiv by C. Jess Riedel is licensed under a Creative Commons AttributionShareAlike 4.0 International License.