Reeh–Schlieder property in a separable Hilbert space

As has been discussed here before, the Reeh–Schlieder theorem is an initially confusing property of the vacuum in quantum field theory. It is difficult to find an illuminating discussion of it in the literature, whether in the context of algebraic QFT (from which it originated) or the more modern QFT grounded in RG and effective theories. I expect this to change once more field theorists get trained in quantum information.

The Reeh–Schlieder theorem states that the vacuum \vert 0 \rangle is cyclic with respect to the algebra \mathcal{A}(\mathcal{O}) of observables localized in some subset \mathcal{O} of Minkowski space. (For a single field \phi(x), the algebra \mathcal{A}(\mathcal{O}) is defined to be generated by all finite smearings \phi_f = \int\! dx\, f(x)\phi(x) for f(x) with support in \mathcal{O}.) Here, “cyclic” means that the subspace \mathcal{H}^{\mathcal{O}} \equiv \mathcal{A}(\mathcal{O})\vert 0 \rangle is dense in \mathcal{H}, i.e., any state \vert \chi \rangle \in \mathcal{H} can be arbitrarily well approximated by a state of the form A \vert 0 \rangle with A \in \mathcal{A}(\mathcal{O}). This is initially surprising because \vert \chi \rangle could be a state with particle excitations localized (essentially) to a region far from \mathcal{O} that looks (essentially) like the vacuum everywhere else. The resolution derives from the fact that the vacuum is highly entangled, such that every region is entangled with every other region, albeit by an exponentially small amount.
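
A finite-dimensional warm-up helps build intuition: cyclicity itself does not require infinitely many degrees of freedom. For a maximally entangled pair of qubits, operators acting on one qubit alone already generate the whole two-qubit Hilbert space. (This is a sketch of my own, not the theorem itself; the genuinely surprising part of Reeh–Schlieder is that the vacuum of a local field theory is entangled enough for this to work.)

```python
import numpy as np

# A maximally entangled state is cyclic for the algebra of operators
# acting on only ONE tensor factor.
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)  # (|00> + |11>)/sqrt(2)

# Apply the matrix units E_ij on qubit A only: (E_ij (x) I)|bell>
vectors = []
for i in range(2):
    for j in range(2):
        E = np.zeros((2, 2))
        E[i, j] = 1.0
        vectors.append(np.kron(E, np.eye(2)) @ bell)

# The four resulting vectors span the entire 4-dimensional space,
# so A(O)|bell> is dense (here: everything) -- |bell> is cyclic.
rank = np.linalg.matrix_rank(np.column_stack(vectors))
print(rank)  # 4
```

For a product state like \vert 00 \rangle the same construction only reaches rank 2, which is why entanglement is the essential ingredient.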

One mistake that’s easy to make is to be fooled into thinking that this property can only be found in systems, like a field theory, with an infinite number of degrees of freedom. So let me exhibit a quantum state with the Reeh–Schlieder property that lives in the tensor product of a finite number of separable Hilbert spaces (most likely a state with this property already exists in the quantum info literature, but I’ve got a habit of re-inventing the wheel; for my last paper, I spent the better part of a month rediscovering the Shor code…):

    \[\mathcal{H} = \bigotimes_{n=1}^N \mathcal{H}_n, \qquad \mathcal{H}_n = \mathrm{span}\left\{ \vert s \rangle_n \right\}_{s=1}^\infty\]

As emphasized above, a separable Hilbert space is one that has a countable orthonormal basis, and is therefore isomorphic to L^2(\mathbb{R}), the space of square-normalizable functions.… [continue reading]

Abstracts for July 2017

  • Modewise entanglement of Gaussian states
    Alonso Botero and Benni Reznik
    We address the decomposition of a multimode pure Gaussian state with respect to a bipartite division of the modes. For any such division the state can always be expressed as a product state involving entangled two-mode squeezed states and single-mode local states at each side. The character of entanglement of the state can therefore be understood modewise; that is, a given mode on one side is entangled with only one corresponding mode of the other, and therefore the total bipartite entanglement is the sum of the modewise entanglement. This decomposition is generally not applicable to all mixed Gaussian states. However, the result can be extended to a special family of “isotropic” states, characterized by a phase space covariance matrix with a completely degenerate symplectic spectrum.

    It is well known that, despite the misleading imagery conjured by the name, entanglement in a multipartite system cannot be understood in terms of pair-wise entanglement of the parts. Indeed, there are only N(N-1)/2 pairs of N systems, but the number of qualitatively distinct types of entanglement scales exponentially in N. A good way to think about this is to recognize that a quantum state of a multipartite system is, in terms of parameters, much more akin to a classical probability distribution than a classical state. When we ask about the information stored in a probability distribution, there are lots and lots of “types” of information, and correlations can be much more complex than just knowing all the pairwise correlations. (“It’s not just that A knows something about B, it’s that A knows something about B conditional on a state of C, and that information can only be unlocked by knowing information from either D or E, depending on the state of F…”).
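
On the Gaussian side, the symplectic spectrum invoked in the abstract is easy to compute directly. Here is a numerical sketch of my own (with the vacuum covariance normalized to the identity): the two-mode squeezed vacuum has a completely degenerate symplectic spectrum, while each reduced single mode has symplectic eigenvalue cosh(2r), which quantifies its modewise entanglement.

```python
import numpy as np

r = 0.8                      # squeezing parameter (arbitrary choice)
c, s = np.cosh(2 * r), np.sinh(2 * r)

# Covariance matrix of a two-mode squeezed vacuum, ordering (x1, p1, x2, p2),
# in the convention where the vacuum covariance is the identity.
sigma = np.array([[c, 0, s, 0],
                  [0, c, 0, -s],
                  [s, 0, c, 0],
                  [0, -s, 0, c]])

J = np.array([[0.0, 1.0], [-1.0, 0.0]])
Omega = np.block([[J, np.zeros((2, 2))],
                  [np.zeros((2, 2)), J]])          # symplectic form

# Symplectic eigenvalues = moduli of the eigenvalues of i * Omega * sigma.
nu = np.abs(np.linalg.eigvals(1j * Omega @ sigma))
print(np.allclose(nu, 1.0))        # True: pure state, completely degenerate

# Reduced covariance of mode 1 alone: symplectic eigenvalue cosh(2r) > 1,
# i.e., a thermal-looking mixed state whose entropy is the entanglement.
nu_red = np.abs(np.linalg.eigvals(1j * J @ sigma[:2, :2]))
print(np.allclose(nu_red, c))      # True
```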

[continue reading]

Legendre transform

The way that most physicists teach and talk about partial differential equations is horrible, and has surprisingly big costs for the typical understanding of the foundations of the field even among professionals. The chief victims are students of thermodynamics and analytical mechanics, and I’ve mentioned before that the preface of Sussman and Wisdom’s Structure and Interpretation of Classical Mechanics is a good starting point for thinking about these issues. As a pointed example, in this blog post I’ll look at how badly the Legendre transform is taught in standard textbooks (I was pleased to note, as this essay went to press, that my choices of Landau, Goldstein, and Arnold were confirmed as the “standard” suggestions by the top Google results), and compare it to how it could be taught. In a subsequent post, I’ll use this as a springboard for complaining about the way we record and transmit physics knowledge.

Before we begin: turn away from the screen and see if you can remember what the Legendre transform accomplishes mathematically in classical mechanics. (If not, can you remember the definition? I couldn’t, a month ago.) I don’t just mean that the Legendre transform converts the Lagrangian into the Hamiltonian and vice versa, but rather: what key mathematical/geometric property does the Legendre transform have, compared to the cornucopia of other function transforms, that allows it to connect these two conceptually distinct formulations of mechanics?

(Analogously, the question “What is useful about the Fourier transform for understanding translationally invariant systems?” can be answered by something like “Translationally invariant operations in the spatial domain correspond to multiplication in the Fourier domain” or “The Fourier transform is a change of basis, within the vector space of functions, using translationally invariant basis elements, i.e., the Fourier modes”.)
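
That convolution-theorem answer for the Fourier case can be checked in a few lines. A quick numerical illustration of the discrete version:

```python
import numpy as np

rng = np.random.default_rng(0)
f = rng.standard_normal(64)
g = rng.standard_normal(64)

# A translationally invariant operation in the spatial domain:
# circular convolution, computed directly from the definition.
conv_direct = np.array([sum(f[m] * g[(n - m) % 64] for m in range(64))
                        for n in range(64)])

# The same operation is pointwise multiplication in the Fourier domain.
conv_fft = np.fft.ifft(np.fft.fft(f) * np.fft.fft(g)).real

print(np.allclose(conv_direct, conv_fft))  # True
```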

The status quo

Let’s turn to the canonical text by Goldstein for an example of how the Legendre transform is usually introduced.… [continue reading]
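
Before opening Goldstein, here is the punchline made concrete: the Legendre transform can be computed numerically as f*(p) = max_x [px - f(x)], and on convex functions it is an involution, so no information is lost when passing between the Lagrangian and Hamiltonian pictures. A sketch of my own (not from any of the textbooks named), using the free-particle Lagrangian:

```python
import numpy as np

m = 3.0                                   # particle "mass" (arbitrary)
v = np.linspace(-10, 10, 20001)           # velocity grid
L = 0.5 * m * v**2                        # free-particle Lagrangian

def legendre(x, f, y):
    """Numerical Legendre transform f*(y) = max_x [x*y - f(x)]."""
    return np.array([np.max(x * yi - f) for yi in y])

# Transforming the Lagrangian in v gives the Hamiltonian in p = mv.
p = np.linspace(-5, 5, 101)               # momenta with maximizer in-grid
H = legendre(v, L, p)
print(np.allclose(H, p**2 / (2 * m), atol=1e-4))   # True

# Involution: transforming back recovers the original convex function.
v_small = v[np.abs(v) < 1.5]              # keep the maximizer inside p grid
L_back = legendre(p, H, v_small)
print(np.allclose(L_back, 0.5 * m * v_small**2, atol=1e-3))  # True
```

Note the restriction to arguments whose maximizer lies inside the sampled grid; the supremum in the definition is otherwise not attained numerically.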

Links for May 2017

  • Methane hydrates will be the new shale gas. There is perhaps an order of magnitude more methane worldwide in hydrates than in shale deposits, but it’s harder to extract. “…it’s thought that only by 2025 at the earliest we might be able to look at realistic commercial options.”
  • Sperm whales have no (external) teeth on their upper jaw, which instead features holes into which the teeth on their narrow lower jaw fit.
  • Surprising and heartening to me: GiveWell finds that distributing antiretroviral therapy drugs to HIV positive patients (presumably in developing countries) is potentially cost-effective compared to their top recommendations.
  • Relatedly: the general flow of genetic information is DNA-RNA-protein. At a crude level, viruses are classified as either RNA viruses or DNA viruses depending on what sort of genetic material they carry. Generally, as parasites dependent on the host cell machinery, this determines where in the protein construction process they inject their payload. However, retroviruses (like HIV) are RNA viruses that bring along their own reverse transcriptase enzyme that, once inside the cell, converts their payload back into DNA and then grafts it into the host’s genome (which is then copied as part of the host cell’s lifecycle). Once this happens, it is very difficult to tell which cells have been infected and very difficult to root out the infection.
  • Claims about what makes Amazon’s vertical integration different:

    I remember reading about the common pitfalls of vertically integrated companies when I was in school. While there are usually some compelling cost savings to be had from vertical integration (either through insourcing services or acquiring suppliers/customers), the increased margins typically evaporate over time as the “supplier” gets complacent with a captive, internal “customer.”

    There are great examples of this in the automotive industry, where automakers have gone through alternating periods of supplier acquisitions and subsequent divestitures as component costs skyrocketed.

[continue reading]

Toward relativistic branches of the wavefunction

I prepared the following extended abstract for the Spacetime and Information Workshop as part of my continuing mission to corrupt physicists while they are still young and impressionable. I reproduce it here for your reading pleasure.


Finding a precise definition of branches in the wavefunction of closed many-body systems is crucial to conceptual clarity in the foundations of quantum mechanics. Toward this goal, we propose amplification, which can be quantified, as the key feature characterizing anthropocentric measurement; this immediately and naturally extends to non-anthropocentric amplification, such as the ubiquitous case of classically chaotic degrees of freedom decohering. Amplification can be formalized as the production of redundant records distributed over spatially disjoint regions, a certain form of multi-partite entanglement in the pure quantum state of a large closed system. If this definition can be made rigorous and shown to be unique, it is then possible to ask many compelling questions about how branches form and evolve.

A recent result shows that branch decompositions are highly constrained just by this requirement that they exhibit redundant local records. The set of all redundantly recorded observables induces a preferred decomposition into simultaneous eigenstates unless their records are highly extended and delicately overlapping, as exemplified by the Shor error-correcting code. A maximum length scale for records is enough to guarantee uniqueness. However, this result is grounded in a preferred tensor decomposition into independent microscopic subsystems associated with spatial locality. This structure breaks down in a relativistic setting on scales smaller than the Compton wavelength of the relevant field. Indeed, a key insight from algebraic quantum field theory is that finite-energy states are never exact eigenstates of local operators, and hence never have exact records that are spatially disjoint, although they can approximate this arbitrarily well on large scales.… [continue reading]

Links for April 2017

  • Why does a processor need billions of transistors if it’s only ever executing a few dozen instructions per clock cycle?
  • Nuclear submarines as refuges from global catastrophes.
  • “Elite Law Firms Cash in on Market Knowledge”:

    …corporate transactions such as mergers and acquisitions or financings are characterized by several salient facts that lack a complete theoretical account. First, they are almost universally negotiated through agents. Transactional lawyers do not simply translate the parties’ bargain into legally enforceable language; rather, they are actively involved in proposing and bargaining over the transaction terms. Second, they are negotiated in stages, often with the price terms set first by the parties, followed by negotiations primarily among lawyers over the remaining non-price terms. Third, while the transaction terms tend to be tailored to the individual parties, in negotiations the parties frequently resort to claims that specific terms are (or are not) “market.” Fourth, the legal advisory market for such transactions is highly concentrated, with a half-dozen firms holding a majority of the market share.

    [Our] claim is that, for complex transactions experiencing either sustained innovation in terms or rapidly changing market conditions, (1) the parties will maximize their expected surplus by investing in market information about transaction terms, even under relatively competitive conditions, and (2) such market information can effectively be purchased by hiring law firms that hold a significant market share for a particular type of transaction.

    …The considerable complexity of corporate transaction terms creates an information problem: One or both parties may simply be unaware of the complete set of surplus-increasing terms for the transaction, and of their respective outside options should negotiations break down. This problem is distinct from the classic problem of valuation uncertainty.

[continue reading]

Branches and matrix-product states

I’m happy to use this bully pulpit to advertise that the following paper has been deemed “probably not terrible”, i.e., published.

When the wave function of a large quantum system unitarily evolves away from a low-entropy initial state, there is strong circumstantial evidence it develops “branches”: a decomposition into orthogonal components that is indistinguishable from the corresponding incoherent mixture with feasible observations. Is this decomposition unique? Must the number of branches increase with time? These questions are hard to answer because there is no formal definition of branches, and most intuition is based on toy models with arbitrarily preferred degrees of freedom. Here, assuming only the tensor structure associated with spatial locality, I show that branch decompositions are highly constrained just by the requirement that they exhibit redundant local records. The set of all redundantly recorded observables induces a preferred decomposition into simultaneous eigenstates unless their records are highly extended and delicately overlapping, as exemplified by the Shor error-correcting code. A maximum length scale for records is enough to guarantee uniqueness. Speculatively, objective branch decompositions may speed up numerical simulations of nonstationary many-body states, illuminate the thermalization of closed systems, and demote measurement from fundamental primitive in the quantum formalism.
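
As a toy illustration of what “redundant local records” means (my own sketch, using the GHZ state rather than anything from the letter): each qubit of a GHZ state is a record of the same binary branch observable, and any two records share a full bit of purely classical mutual information.

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

# Toy branching state: GHZ = (|000> + |111>)/sqrt(2). Each qubit is a
# redundant local record of the same binary "which branch" observable.
psi = np.zeros(8)
psi[0] = psi[7] = 1 / np.sqrt(2)
rho6 = np.outer(psi, psi).reshape(2, 2, 2, 2, 2, 2)  # (a,b,c,a',b',c')

# Reduced states of the pair AB and of the single qubits A and B.
rho_AB = np.trace(rho6, axis1=2, axis2=5).reshape(4, 4)
rho_A = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=1, axis2=3)
rho_B = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=0, axis2=2)

# Mutual information I(A:B) = S(A) + S(B) - S(AB) = 1 bit: reading
# either record alone reveals the branch, and the two records agree.
I_AB = entropy(rho_A) + entropy(rho_B) - entropy(rho_AB)
print(round(I_AB, 6))  # 1.0
```

The Shor-code states mentioned in the abstract are precisely the ones where this simple picture fails: their records are extended and overlapping, so no single-site readout reveals the branch.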

Here’s the figure (the editor tried to convince me that this figure appeared on the cover for purely aesthetic reasons and that this does not mean my letter is the best thing in the issue… but I know better!) and caption:


Spatially disjoint regions with the same coloring (e.g., the solid blue regions \mathcal{F}, \mathcal{F}', \ldots) denote different records for the same observable (e.g., \Omega_a = \{\Omega_a^{\mathcal{F}},\Omega_a^{\mathcal{F}'},\ldots\}).
[continue reading]

Comments on Cotler, Penington, & Ranard

One way to think about the relevance of decoherence theory to measurement in quantum mechanics is that it reduces the preferred basis problem to the preferred subsystem problem; merely specifying the system of interest (by delineating it from its environment or measuring apparatus) is enough, in important special cases, to derive the measurement basis. But this immediately prompts the question: what are the preferred systems? I spent some time in grad school with my advisor trying to see if I could identify a preferred system just by looking at a large many-body Hamiltonian, but never got anything worth writing up.

I’m pleased to report that Cotler, Penington, and Ranard have tackled a closely related problem, and made a lot more progress:

Locality from the Spectrum
Jordan S. Cotler, Geoffrey R. Penington, Daniel H. Ranard
Essential to the description of a quantum system are its local degrees of freedom, which enable the interpretation of subsystems and dynamics in the Hilbert space. While a choice of local tensor factorization of the Hilbert space is often implicit in the writing of a Hamiltonian or Lagrangian, the identification of local tensor factors is not intrinsic to the Hilbert space itself. Instead, the only basis-invariant data of a Hamiltonian is its spectrum, which does not manifestly determine the local structure. This ambiguity is highlighted by the existence of dualities, in which the same energy spectrum may describe two systems with very different local degrees of freedom. We argue that in fact, the energy spectrum alone almost always encodes a unique description of local degrees of freedom when such a description exists, allowing one to explicitly identify local subsystems and how they interact.
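The basis-invariance point in the abstract is easy to see in a two-qubit example (my own illustration, not from the paper): conjugating a noninteracting Hamiltonian by an entangling unitary preserves the spectrum exactly, yet changes which tensor factorization makes it local.

```python
import numpy as np

I2 = np.eye(2)
Z = np.diag([1.0, -1.0])

# A manifestly local two-qubit Hamiltonian: no interaction terms.
H1 = np.kron(Z, I2) + np.kron(I2, Z)

# Conjugating by an entangling unitary (CNOT) preserves the spectrum...
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
H2 = CNOT @ H1 @ CNOT.T

print(np.allclose(np.linalg.eigvalsh(H1), np.linalg.eigvalsh(H2)))  # True

# ...but not the local structure: H2 = Z(x)I + Z(x)Z acquires a genuine
# two-body interaction term in the original tensor factorization.
coeff = np.trace(np.kron(Z, Z) @ H2) / 4   # Pauli coefficient of Z(x)Z
print(coeff)  # 1.0
```

The claim of the paper is that for generic spectra this ambiguity disappears: almost no unitary change of basis preserves the existence of a local description.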
[continue reading]

Links for March 2017

[continue reading]

Research debt

Chris Olah coins the term “research debt” to discuss a bundle of related destructive phenomena in research communities:

  • Poor Exposition – Often, there is no good explanation of important ideas and one has to struggle to understand them. This problem is so pervasive that we take it for granted and don’t appreciate how much better things could be.
  • Undigested Ideas – Most ideas start off rough and hard to understand. They become radically easier as we polish them, developing the right analogies, language, and ways of thinking.
  • Bad abstractions and notation – Abstractions and notation are the user interface of research, shaping how we think and communicate. Unfortunately, we often get stuck with the first formalisms to develop even when they’re bad. For example, an object with extra electrons is negative, and pi is wrong.
  • Noise – Being a researcher is like standing in the middle of a construction site. Countless papers scream for your attention and there’s no easy way to filter or summarize them. We think noise is the main way experts experience research debt.

Shout it from the rooftops (my emphasis):

It’s worth being clear that research debt isn’t just about ideas not being explained well. It’s a lack of digesting ideas – or, at least, a lack of the public version of ideas being digested. It’s a communal messiness of thought.

Developing good abstractions, notations, visualizations, and so forth, is improving the user interfaces for ideas. This helps both with understanding ideas for the first time and with thinking clearly about them. Conversely, if we can’t explain an idea well, that’s often a sign that we don’t understand it as well as we could…

Distillation is also hard.

[continue reading]

Abstracts for March 2017

  • Recent progress in synthetic chemistry and molecular quantum optics has enabled demonstrations of the quantum mechanical wave–particle duality for complex particles, with masses exceeding 10 kDa. Future experiments with even larger objects will require new optical preparation and manipulation methods that shall profit from the possibility to cleave a well-defined molecular tag from a larger parent molecule. Here we present the design and synthesis of two model compounds as well as evidence for the photoinduced beam depletion in high vacuum in one case.

    The technique of using “laser grating”, in place of physical grating (slits), for producing spatial interference of molecules relies on the laser’s ability to ionize the molecule. (Once ionized, standing electric fields can sweep it out of the way.) But for some molecules, especially large nanoparticles, this is ineffective. Solution: attach a molecular tag to the nanoparticle that reliably cleaves in the presence of a laser, allowing the nanoparticle to be vacuumed up. Rad.

  • This chapter discusses the asymptotics, singularities, and the reduction of theories. The reduction must involve the study of limits—asymptotics. The reduction is obstructed by the fact that the limit is highly singular. In addition, the type of singularity is important, and the singularities are directly connected to the existence of emergent phenomena and underlie some of the most difficult and intensively studied problems in physics today. The chapter provides six examples of singular limits and emergent phenomena such as special relativity and statistical mechanics. Reduction in its simplest form is well illustrated by special relativity.
[continue reading]

Links for February 2017

  • If you are a high school student, or know one, who would be interested in the SPARC summer camp, the deadline is March 1.

    SPARC helps talented high school students apply their quantitative thinking skills to their lives and the world.

    SPARC will be hosted in the San Francisco Bay Area from August 6 – 17, with students arriving the evening of the 6th and leaving the morning of the 17th. Room and board are provided free of charge.

    The curriculum covers topics from causal modeling and probability to game theory and cognitive science. But the focus of SPARC is on applying the same quantitative and rigorous spirit outside of the classroom. How can we understand our own reasoning and behavior? How can we think more clearly and better achieve our goals?

  • Indian Space Research Organisation’s Polar Satellite Launch Vehicle successfully launched 104 satellites into orbit on the same mission. Onboard video of the deployment:

    Pictures of some of the cubesats, including Planet’s 88 imaging satellites for continuous Earth monitoring.
  • What is a ‘Shavers Only’ Electrical Outlet?
  • A possible rare shake-up of the GiveWell list: temporary subsidies for migrant workers in India.
  • How to think about cell walls:

    In most cells, the cell wall is flexible, meaning that it will bend rather than holding a fixed shape, but has considerable tensile strength. The apparent rigidity of primary plant tissues is enabled by cell walls, but is not due to the walls’ stiffness. Hydraulic turgor pressure creates this rigidity, along with the wall structure. The flexibility of the cell walls is seen when plants wilt, so that the stems and leaves begin to droop, or in seaweeds that bend in water currents.

[continue reading]

Links for January 2017

[continue reading]

Weinberg on the measurement problem

In his new article in the NY Review of Books, the titan Steven Weinberg expresses more sympathy for the importance of the measurement problem in quantum mechanics. The article has nothing new for folks well-versed in quantum foundations, but Weinberg demonstrates a command of the existing arguments and considerations. The lengthy excerpts below characterize what I think are the most important aspects of his view.

Many physicists came to think that the reaction of Einstein and Feynman and others to the unfamiliar aspects of quantum mechanics had been overblown. This used to be my view. After all, Newton’s theories too had been unpalatable to many of his contemporaries…Evidently it is a mistake to demand too strictly that new physical theories should fit some preconceived philosophical standard.

In quantum mechanics the state of a system is not described by giving the position and velocity of every particle and the values and rates of change of various fields, as in classical physics. Instead, the state of any system at any moment is described by a wave function, essentially a list of numbers, one number for every possible configuration of the system….What is so terrible about that? Certainly, it was a tragic mistake for Einstein and Schrödinger to step away from using quantum mechanics, isolating themselves in their later lives from the exciting progress made by others. Even so, I’m not as sure as I once was about the future of quantum mechanics. It is a bad sign that those physicists today who are most comfortable with quantum mechanics do not agree with one another about what it all means. The dispute arises chiefly regarding the nature of measurement in quantum mechanics…

The introduction of probability into the principles of physics was disturbing to past physicists, but the trouble with quantum mechanics is not that it involves probabilities.

[continue reading]

Singular value decomposition in bra-ket notation

In linear algebra, and therefore quantum information, the singular value decomposition (SVD) is elementary, ubiquitous, and beautiful. However, I only recently realized that its expression in bra-ket notation is very elegant. The SVD is equivalent to the statement that any operator \hat{M} can be expressed as

(1)   \begin{align*} \hat{M} = \sum_i \vert A_i \rangle \lambda_i \langle B_i \vert \end{align*}

where \vert A_i \rangle and \vert B_i \rangle are orthonormal sets of vectors, possibly in Hilbert spaces with different dimensionality, and the \lambda_i \ge 0 are the singular values.
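
In numpy, equation (1) is np.linalg.svd almost verbatim: the columns of U are the \vert A_i \rangle, the rows of Vh are the \langle B_i \vert, and s holds the \lambda_i. A quick check:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 5)) + 1j * rng.standard_normal((3, 5))

# numpy returns M = U @ diag(s) @ Vh, with orthonormal columns of U
# (the kets |A_i>) and orthonormal rows of Vh (the bras <B_i|).
U, s, Vh = np.linalg.svd(M, full_matrices=False)

# Rebuild M as sum_i |A_i> lambda_i <B_i|.
M_rebuilt = sum(s[i] * np.outer(U[:, i], Vh[i, :]) for i in range(len(s)))
print(np.allclose(M, M_rebuilt))  # True

# The singular values are nonnegative and the kets are orthonormal.
print(np.all(s >= 0), np.allclose(U.conj().T @ U, np.eye(3)))  # True True
```

Note the two Hilbert spaces really can differ in dimension (here 3 and 5), which is why the SVD, unlike the eigendecomposition, applies to any operator.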

That’s it.… [continue reading]