## Symmetries and solutions

Here is an underemphasized way to frame the relationship between trajectories and symmetries (in the sense of Noether’s theorem). Consider the space of all possible trajectories $x$ for a system, a real-valued Lagrangian functional $L[x]$ on that space, the “directions” $\eta$ at each point, and the corresponding functional gradient in each direction. Classical solutions $x_{\mathrm{cl}}$ are exactly those trajectories such that the Lagrangian is stationary for perturbations in any direction $\eta$, and continuous symmetries $\eta_{\mathrm{sym}}$ are exactly those directions such that the Lagrangian is stationary for any trajectory $x$. That is,

(1)   $\left.\frac{\mathrm{d}}{\mathrm{d}\epsilon} L[x_{\mathrm{cl}} + \epsilon\, \eta]\right|_{\epsilon=0} = 0 \;\;\forall \eta \qquad \text{and} \qquad \left.\frac{\mathrm{d}}{\mathrm{d}\epsilon} L[x + \epsilon\, \eta_{\mathrm{sym}}]\right|_{\epsilon=0} = 0 \;\;\forall x.$

There are many subtleties obscured in this cartoon presentation, like the fact that a symmetry, being a tangent direction on the manifold of trajectories, can vary with the tangent point it is attached to (as for rotational symmetries). If you’ve never spent a long afternoon with a good book on the calculus of variations, I recommend it.
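This cartoon can be made concrete numerically. Below is a minimal sketch of my own (with made-up discretization choices, not from the review cited above) using a 2D harmonic oscillator, whose continuous symmetry is rotation: the discretized Lagrangian functional is stationary along the rotation direction for any trajectory, but along a generic direction only at a classical solution. Note that the rotation direction attached to a trajectory $(x, y)$ is $(-y, x)$, illustrating the subtlety above.

```python
import numpy as np

# 2D harmonic oscillator with m = k = 1 (arbitrary illustrative units):
#   L = (1/2)(xdot^2 + ydot^2) - (1/2)(x^2 + y^2).
# Discretized trajectories make the action an ordinary function on a
# finite-dimensional space, so directional derivatives are easy to take.

T, n = 2 * np.pi, 400
t = np.linspace(0, T, n)
dt = t[1] - t[0]

def action(x, y):
    xd, yd = np.gradient(x, dt), np.gradient(y, dt)
    return np.sum(0.5 * (xd**2 + yd**2) - 0.5 * (x**2 + y**2)) * dt

def dS(x, y, ex, ey, eps=1e-6):
    """Directional derivative of the action along the direction (ex, ey)."""
    return (action(x + eps * ex, y + eps * ey)
            - action(x - eps * ex, y - eps * ey)) / (2 * eps)

x_cl, y_cl = np.cos(t), np.sin(t)                             # a classical solution
x_any, y_any = np.cos(t) + 0.3 * t / T, 0.1 * np.sin(3 * t)   # not a solution

eta = np.sin(t / 2)   # a generic perturbation direction, vanishing at the endpoints

# 1. The symmetry direction (-y, x) is stationary for ANY trajectory.
print(dS(x_any, y_any, -y_any, x_any))   # ~ 0

# 2. A generic direction is stationary only at a SOLUTION.
print(dS(x_cl, y_cl, eta, 0 * t))        # ~ 0 (up to discretization error)
print(dS(x_any, y_any, eta, 0 * t))      # manifestly nonzero, ~ -0.6
```

Since the discretized action is quadratic and exactly rotation-invariant, the first derivative vanishes to machine precision, while the second vanishes only up to $O(\mathrm{d}t^2)$ discretization error.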

### Footnotes


1. You can find this presentation in “A short review on Noether’s theorems, gauge symmetries and boundary terms” by Máximo Bañados and Ignacio A. Reyes (H/t Godfrey Miller).

• Popular-level introduction to the five methods used to identify exoplanets.
• Another good profile of the SEP.
• ArXiv gets some money to improve stuff.
• Flying fish are hard to believe. It’s something of a tragedy that fish capable of long-distance flight never evolved (that we know of?). They are so birdlike it’s startling, and this ability has evolved independently multiple times.
• In addition to Russia and China, the US also at one time had ICBMs deployed by rail.
• For nuclear power plants governed by the United States Nuclear Regulatory Commission, SAFSTOR (SAFe STORage) is one of the options for nuclear decommissioning of a shut down plant. During SAFSTOR the de-fuelled plant is monitored for up to sixty years before complete decontamination and dismantling of the site, to a condition where nuclear licensing is no longer required. During the storage interval, some of the radioactive contaminants of the reactor and power plant will decay, which will reduce the quantity of radioactive material to be removed during the final decontamination phase.

The other options set by the NRC are nuclear decommissioning, which is immediate dismantling of the plant and remediation of the site, and nuclear entombment, which is the enclosure of contaminated parts of the plant in a permanent layer of concrete. Mixtures of options may be used, for example, immediate removal of steam turbine components and condensers, and SAFSTOR for the more heavily radioactive containment vessel. Since NRC requires decommissioning to be completed within 60 years, ENTOMB is not usually chosen since not all activity will have decayed to an unregulated background level in that time.

• The fraction of the federal budget devoted to NASA peaked in 1966, three years before the Moon landing.

## How to think about Quantum Mechanics—Part 7: Quantum chaos and linear evolution

[Other parts in this series: 1,2,3,4,5,6,7.]

You’re taking a vacation to Granada to enjoy a Spanish ski resort in the Sierra Nevada mountains. But as your plane is coming in for a landing, you look out the window and realize the airport is on a small tropical island. Confused, you ask the flight attendant what’s wrong. “Oh”, she says, looking at your ticket, “you’re trying to get to Granada, but you’re on the plane to Grenada in the Caribbean Sea.” A wave of distress comes over your face, but she reassures you: “Don’t worry, Granada isn’t that far from here. The Hamming distance is only 1!”.

After you’ve recovered from that side-splitting humor, let’s dissect the frog. What’s the basis of the joke? The flight attendant is conflating two different metrics: the geographic distance and the Hamming distance. The distances are completely distinct, as two named locations can be very nearby in one and very far apart in the other.
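For the record, the Hamming distance really is 1; a two-line sketch (my own, obviously not part of the joke):

```python
def hamming(s, t):
    """Number of positions at which two equal-length strings differ."""
    if len(s) != len(t):
        raise ValueError("Hamming distance is only defined for equal-length strings")
    return sum(a != b for a, b in zip(s, t))

print(hamming("granada", "grenada"))  # -> 1 (they differ only in the third letter)
```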

Now let’s hear another joke from renowned physicist Chris Jarzynski:

The linear Schrödinger equation, however, does not give rise to the sort of nonlinear, chaotic dynamics responsible for ergodicity and mixing in classical many-body systems. This suggests that new concepts are needed to understand thermalization in isolated quantum systems. – C. Jarzynski, “Diverse phenomena, common themes” [PDF]

Ha! Get it? This joke is so good it’s been told by S. Wimberger: “Since quantum mechanics is the more fundamental theory we can ask ourselves if there is chaotic motion in quantum systems as well.” [continue reading]

• Elephants are secretly wearing high heels.

• Cost per unit hard-drive space is flattening (for consumer models).
• Crux was known to the Ancient Greeks due to the fact that it can be seen from southern Egypt; Ptolemy regarded it as part of the constellation Centaurus. It was entirely visible as far north as Britain in the fourth millennium BC. However, the precession of the equinoxes gradually lowered its stars below the European horizon, and they were eventually forgotten by the inhabitants of northern latitudes. By AD 400, most of the constellation never rose above the horizon for Athenians.

• Zotero 5.0 has significant changes and is out now.
• “Mainland China has 36 nuclear power reactors in operation, 21 under construction, and more about to start construction.” See also Wikipedia and this long piece on Chinese investment in Namibia. In comparison, the US gets essentially all nuclear power from reactors built at least 30 years ago, and has just 4 new reactors under construction.
• Sentience Institute: “In discussions of effective animal advocacy (EAA) — the field of study for how we can most effectively help animals, also known as effective altruism for animals — there are several important, challenging, and sometimes controversial foundational questions that come up over and over. This post attempts to summarize and catalog the key evidence cited by EAA supporters on each side of these debates for easy reference.”
• Third black hole merger detected by LIGO. No neutron stars yet. Binary BH distribution might be more massive and have more misaligned spins than popular models. Nothing revelatory.
• Vulcan Aerospace unveils airplane with world’s largest wingspan (by far) as part of an air-launch orbital rocket service.

## Selsam on formal verification of machine learning

Here is the first result out of the project Verifying Deep Mathematical Properties of AI Systems, funded through the Future of Life Institute. (Technical abstract available here. Note that David Dill has taken over as PI from Alex Aiken.)

Noisy data, non-convex objectives, model misspecification, and numerical instability can all cause undesired behaviors in machine learning systems. As a result, detecting actual implementation errors can be extremely difficult. We demonstrate a methodology in which developers use an interactive proof assistant to both implement their system and to state a formal theorem defining what it means for their system to be correct. The process of proving this theorem interactively in the proof assistant exposes all implementation errors since any error in the program would cause the proof to fail. As a case study, we implement a new system, Certigrad, for optimizing over stochastic computation graphs, and we generate a formal (i.e. machine-checkable) proof that the gradients sampled by the system are unbiased estimates of the true mathematical gradients. We train a variational autoencoder using Certigrad and find the performance comparable to training the same model in TensorFlow.

Q: Is the correctness specification usually a fairly singular statement? Or will it often be of the form “The program satisfied properties A, B, C, D, and E”? (And then maybe you add “F” later.)

Daniel Selsam: There are a few related issues: how singular is a specification, how much of the functionality of the system is certified (coverage), and how close the specification comes to proving that the system actually does what you want (validation).… [continue reading]
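For intuition about the specific property Certigrad certifies, here is a toy sketch in plain Python. It is entirely my own example, unrelated to Certigrad’s actual proof-assistant implementation: a sampled (score-function) gradient whose expectation equals the true mathematical gradient.

```python
import numpy as np

# Toy analogue of the certified property. For the stochastic objective
#   f(theta) = E_{x ~ N(theta, 1)}[x^2] = theta^2 + 1,  so f'(theta) = 2*theta,
# the score-function (REINFORCE) estimator g(x) = x^2 * (x - theta) is an
# unbiased estimate of f'(theta), since d/dtheta log p(x|theta) = x - theta.

rng = np.random.default_rng(0)
theta = 1.5

x = rng.normal(theta, 1.0, size=1_000_000)
g = x**2 * (x - theta)

print(g.mean(), 2 * theta)   # sample mean of the estimator vs the true gradient, ~3.0
```

Certigrad’s contribution is a machine-checked proof that such equalities hold for the system as implemented; this sketch only spot-checks one instance numerically.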

## Reeh–Schlieder property in a separable Hilbert space

As has been discussed here before, the Reeh–Schlieder theorem is an initially confusing property of the vacuum in quantum field theory. It is difficult to find an illuminating discussion of it in the literature, whether in the context of algebraic QFT (from which it originated) or the more modern QFT grounded in RG and effective theories. I expect this to change once more field theorists get trained in quantum information.

The Reeh–Schlieder theorem states that the vacuum $|\Omega\rangle$ is cyclic with respect to the algebra $\mathcal{A}(R)$ of observables localized in some subset $R$ of Minkowski space. (For a single field $\phi$, the algebra is defined to be generated by all finite smearings $\phi[f] = \int \mathrm{d}x\, f(x)\phi(x)$ for $f$ with support in $R$.) Here, “cyclic” means that the subspace $\mathcal{A}(R)|\Omega\rangle$ is dense in the Hilbert space $\mathcal{H}$, i.e., any state can be arbitrarily well approximated by a state of the form $A|\Omega\rangle$ with $A \in \mathcal{A}(R)$. This is initially surprising because $A|\Omega\rangle$ could be a state with particle excitations localized (essentially) to a region far from $R$ and that looks (essentially) like the vacuum everywhere else. The resolution derives from the fact that the vacuum is highly entangled, such that every region is entangled with every other region by an exponentially small amount.

One mistake that’s easy to make is to be fooled into thinking that this property can only be found in systems, like a field theory, with an infinite number of degrees of freedom. So let me exhibit a quantum state with the Reeh–Schlieder property that lives in the tensor product of a finite number of separable Hilbert spaces. (Most likely a state with this property already exists in the quantum info literature, but I’ve got a habit of re-inventing the wheel. For my last paper, I spent the better part of a month rediscovering the Shor code.)

As emphasized above, a separable Hilbert space is one that has a countable orthonormal basis, and is therefore isomorphic to $L^2(\mathbb{R})$, the space of square-normalizable functions.… [continue reading]

## Abstracts for July 2017

• Modewise entanglement of Gaussian states
Alonso Botero and Benni Reznik
We address the decomposition of a multimode pure Gaussian state with respect to a bipartite division of the modes. For any such division the state can always be expressed as a product state involving entangled two-mode squeezed states and single-mode local states at each side. The character of entanglement of the state can therefore be understood modewise; that is, a given mode on one side is entangled with only one corresponding mode of the other, and therefore the total bipartite entanglement is the sum of the modewise entanglement. This decomposition is generally not applicable to all mixed Gaussian states. However, the result can be extended to a special family of “isotropic” states, characterized by a phase space covariance matrix with a completely degenerate symplectic spectrum.

It is well known that, despite the misleading imagery conjured by the name, entanglement in a multipartite system cannot be understood in terms of pair-wise entanglement of the parts. Indeed, there are only $N(N-1)/2$ pairs of systems, but the number of qualitatively distinct types of entanglement scales exponentially in $N$. A good way to think about this is to recognize that a quantum state of a multipartite system is, in terms of parameters, much more akin to a classical probability distribution than a classical state. When we ask about the information stored in a probability distribution, there are lots and lots of “types” of information, and correlations can be much more complex than just knowing all the pairwise correlations. (“It’s not just that A knows something about B, it’s that A knows something about B conditional on a state of C, and that information can only be unlocked by knowing information from either D or E, depending on the state of F…”).
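The counting behind that first point is easy to make concrete (a quick illustrative sketch of my own, using qubits as the example subsystems):

```python
from math import comb

# Pairwise data grows only quadratically with the number of parts N, while the
# number of complex amplitudes specifying a generic pure state of N qubits
# grows exponentially -- so pairwise correlations cannot capture everything.
for N in (4, 8, 16, 32):
    pairs = comb(N, 2)        # N*(N-1)/2 pairwise relationships
    amplitudes = 2**N         # amplitudes for a pure state of N qubits
    print(N, pairs, amplitudes)
```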

## Legendre transform

The way that most physicists teach and talk about partial differential equations is horrible, and has surprisingly big costs for the typical understanding of the foundations of the field even among professionals. The chief victims are students of thermodynamics and analytical mechanics, and I’ve mentioned before that the preface of Sussman and Wisdom’s Structure and Interpretation of Classical Mechanics is a good starting point for thinking about these issues. As a pointed example, in this blog post I’ll look at how badly the Legendre transform is taught in standard textbooks, and compare it to how it could be taught. (I was pleased to note as this essay went to press that my choice of Landau, Goldstein, and Arnold was confirmed as the “standard” suggestions by the top Google results.) In a subsequent post, I’ll use this as a springboard for complaining about the way we record and transmit physics knowledge.

Before we begin: turn away from the screen and see if you can remember what the Legendre transform accomplishes mathematically in classical mechanics. (If not, can you remember the definition? I couldn’t, a month ago.) I don’t just mean that the Legendre transform converts the Lagrangian into the Hamiltonian and vice versa, but rather: what key mathematical/geometric property does the Legendre transform have, compared to the cornucopia of other function transforms, that allows it to connect these two conceptually distinct formulations of mechanics?

(Analogously, the question “What is useful about the Fourier transform for understanding translationally invariant systems?” can be answered by something like “Translationally invariant operations in the spatial domain correspond to multiplication in the Fourier domain” or “The Fourier transform is a change of basis, within the vector space of functions, using translationally invariant basis elements, i.e., the Fourier modes”.)
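The Fourier claim in that analogy can be checked in a few lines (a self-contained numerical sketch with arbitrary made-up test signals, not part of the original post):

```python
import numpy as np

# A translationally invariant operation -- here, circular convolution with a
# fixed kernel -- becomes pointwise multiplication in the Fourier domain.

rng = np.random.default_rng(42)
f = rng.standard_normal(64)
k = rng.standard_normal(64)

# Direct circular convolution: (k * f)[n] = sum_m f[m] k[(n - m) mod N]
direct = np.array([np.sum(f * k[(n - np.arange(64)) % 64]) for n in range(64)])

# The same operation computed by multiplying in the Fourier domain
via_fft = np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(k)))

print(np.allclose(direct, via_fft))  # True
```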

#### The status quo

Let’s turn to the canonical text by Goldstein for an example of how the Legendre transform is usually introduced.… [continue reading]
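One concrete way to see the transform at work: compute it as a maximization (the convex-conjugate form) rather than as the textbook variable substitution. Below is a numerical sketch of my own, with an arbitrary choice of mass and velocity grid, recovering the free-particle Hamiltonian from the free-particle Lagrangian:

```python
import numpy as np

# Legendre transform as a convex conjugate:  H(p) = max_v [ p*v - L(v) ],
# applied to the free-particle Lagrangian L(v) = (1/2) m v^2, which should
# give the familiar H(p) = p^2 / (2m).  m = 2.0 is an arbitrary choice.

m = 2.0
v = np.linspace(-50.0, 50.0, 200_001)   # dense grid of velocities

def L(v):
    return 0.5 * m * v**2

def H(p):
    return np.max(p * v - L(v))         # sup over v of p*v - L(v)

for p in (0.0, 1.0, 3.0):
    print(H(p), p**2 / (2 * m))         # the two columns agree
```

The maximizer is the velocity $v^* = p/m$ at which the slope of $L$ equals $p$, which is the slope-to-coordinate swap that the post goes on to discuss.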

• Methane hydrates will be the new shale gas. There is perhaps an order of magnitude more methane worldwide in hydrates than in shale deposits, but it’s harder to extract. “…it’s thought that only by 2025 at the earliest we might be able to look at realistic commercial options.”
• Sperm whales have no (external) teeth on their upper jaw, which instead features holes into which the teeth on their narrow lower jaw fit.

• Surprising and heartening to me: GiveWell finds that distributing antiretroviral therapy drugs to HIV positive patients (presumably in developing countries) is potentially cost-effective compared to their top recommendations.
• Relatedly: the general flow of genetic information is DNA-RNA-protein. At a crude level, viruses are classified as either RNA viruses or DNA viruses depending on what sort of genetic material they carry. Generally, as parasites dependent on the host cell machinery, this determines where in the protein construction process they inject their payload. However, retroviruses (like HIV) are RNA viruses that bring along their own reverse transcriptase enzyme that, once inside the cell, converts their payload back into DNA and then grafts it into the host’s genome (which is then copied as part of the host cell’s lifecycle). Once this happens, it is very difficult to tell which cells have been infected and very difficult to root out the infection.
• I remember reading about the common pitfalls of vertically integrated companies when I was in school. While there are usually some compelling cost savings to be had from vertical integration (either through insourcing services or acquiring suppliers/customers), the increased margins typically evaporate over time as the “supplier” gets complacent with a captive, internal “customer.”

There are great examples of this in the automotive industry, where automakers have gone through alternating periods of supplier acquisitions and subsequent divestitures as component costs skyrocketed.

## Toward relativistic branches of the wavefunction

I prepared the following extended abstract for the Spacetime and Information Workshop as part of my continuing mission to corrupt physicists while they are still young and impressionable. I reproduce it here for your reading pleasure.

Finding a precise definition of branches in the wavefunction of closed many-body systems is crucial to conceptual clarity in the foundations of quantum mechanics. Toward this goal, we propose amplification, which can be quantified, as the key feature characterizing anthropocentric measurement; this immediately and naturally extends to non-anthropocentric amplification, such as the ubiquitous case of classically chaotic degrees of freedom decohering. Amplification can be formalized as the production of redundant records distributed over spatially disjoint regions, a certain form of multi-partite entanglement in the pure quantum state of a large closed system. If this definition can be made rigorous and shown to be unique, it is then possible to ask many compelling questions about how branches form and evolve.

A recent result shows that branch decompositions are highly constrained just by this requirement that they exhibit redundant local records. The set of all redundantly recorded observables induces a preferred decomposition into simultaneous eigenstates unless their records are highly extended and delicately overlapping, as exemplified by the Shor error-correcting code. A maximum length scale for records is enough to guarantee uniqueness. However, this result is grounded in a preferred tensor decomposition into independent microscopic subsystems associated with spatial locality. This structure breaks down in a relativistic setting on scales smaller than the Compton wavelength of the relevant field. Indeed, a key insight from algebraic quantum field theory is that finite-energy states are never exact eigenstates of local operators, and hence never have exact records that are spatially disjoint, although they can approximate this arbitrarily well on large scales.… [continue reading]

• Why does a processor need billions of transistors if it’s only ever executing a few dozen instructions per clock cycle?
• Nuclear submarines as refuges from global catastrophes.
• …corporate transactions such as mergers and acquisitions or financings are characterized by several salient facts that lack a complete theoretical account. First, they are almost universally negotiated through agents. Transactional lawyers do not simply translate the parties’ bargain into legally enforceable language; rather, they are actively involved in proposing and bargaining over the transaction terms. Second, they are negotiated in stages, often with the price terms set first by the parties, followed by negotiations primarily among lawyers over the remaining non-price terms. Third, while the transaction terms tend to be tailored to the individual parties, in negotiations the parties frequently resort to claims that specific terms are (or are not) “market.” Fourth, the legal advisory market for such transactions is highly concentrated, with a half-dozen firms holding a majority of the market share.

[Our] claim is that, for complex transactions experiencing either sustained innovation in terms or rapidly changing market conditions, (1) the parties will maximize their expected surplus by investing in market information about transaction terms, even under relatively competitive conditions, and (2) such market information can effectively be purchased by hiring law firms that hold a significant market share for a particular type of transaction.

…The considerable complexity of corporate transaction terms creates an information problem: One or both parties may simply be unaware of the complete set of surplus-increasing terms for the transaction, and of their respective outside options should negotiations break down. This problem is distinct from the classic problem of valuation uncertainty.

## Branches and matrix-product states

I’m happy to use this bully pulpit to advertise that the following paper has been deemed “probably not terrible”, i.e., published.

When the wave function of a large quantum system unitarily evolves away from a low-entropy initial state, there is strong circumstantial evidence it develops “branches”: a decomposition into orthogonal components that is indistinguishable from the corresponding incoherent mixture with feasible observations. Is this decomposition unique? Must the number of branches increase with time? These questions are hard to answer because there is no formal definition of branches, and most intuition is based on toy models with arbitrarily preferred degrees of freedom. Here, assuming only the tensor structure associated with spatial locality, I show that branch decompositions are highly constrained just by the requirement that they exhibit redundant local records. The set of all redundantly recorded observables induces a preferred decomposition into simultaneous eigenstates unless their records are highly extended and delicately overlapping, as exemplified by the Shor error-correcting code. A maximum length scale for records is enough to guarantee uniqueness. Speculatively, objective branch decompositions may speed up numerical simulations of nonstationary many-body states, illuminate the thermalization of closed systems, and demote measurement from fundamental primitive in the quantum formalism.

Here’s the figure and caption (the editor tried to convince me that this figure appeared on the cover for purely aesthetic reasons and this does not mean my letter is the best thing in the issue…but I know better!):

Spatially disjoint regions with the same coloring (e.g., the solid blue regions ) denote different records for the same observable (e.g., ).

## Comments on Cotler, Penington, & Ranard

One way to think about the relevance of decoherence theory to measurement in quantum mechanics is that it reduces the preferred basis problem to the preferred subsystem problem; merely specifying the system of interest (by delineating it from its environment or measuring apparatus) is enough, in important special cases, to derive the measurement basis. But this immediately prompts the question: what are the preferred systems? I spent some time in grad school with my advisor trying to see if I could identify a preferred system just by looking at a large many-body Hamiltonian, but never got anything worth writing up.

I’m pleased to report that Cotler, Penington, and Ranard have tackled a closely related problem, and made a lot more progress:

Locality from the Spectrum
Jordan S. Cotler, Geoffrey R. Penington, Daniel H. Ranard
Essential to the description of a quantum system are its local degrees of freedom, which enable the interpretation of subsystems and dynamics in the Hilbert space. While a choice of local tensor factorization of the Hilbert space is often implicit in the writing of a Hamiltonian or Lagrangian, the identification of local tensor factors is not intrinsic to the Hilbert space itself. Instead, the only basis-invariant data of a Hamiltonian is its spectrum, which does not manifestly determine the local structure. This ambiguity is highlighted by the existence of dualities, in which the same energy spectrum may describe two systems with very different local degrees of freedom. We argue that in fact, the energy spectrum alone almost always encodes a unique description of local degrees of freedom when such a description exists, allowing one to explicitly identify local subsystems and how they interact.

## Research debt

Chris Olah coins the term “research debt” to discuss a bundle of related destructive phenomena in research communities:

• Poor Exposition – Often, there is no good explanation of important ideas and one has to struggle to understand them. This problem is so pervasive that we take it for granted and don’t appreciate how much better things could be.
• Undigested Ideas – Most ideas start off rough and hard to understand. They become radically easier as we polish them, developing the right analogies, language, and ways of thinking.
• Bad abstractions and notation – Abstractions and notation are the user interface of research, shaping how we think and communicate. Unfortunately, we often get stuck with the first formalisms to develop even when they’re bad. For example, an object with extra electrons is negative, and pi is wrong.
• Noise – Being a researcher is like standing in the middle of a construction site. Countless papers scream for your attention and there’s no easy way to filter or summarize them. We think noise is the main way experts experience research debt.

Shout it from the rooftops (my emphasis):

It’s worth being clear that research debt isn’t just about ideas not being explained well. It’s a lack of digesting ideas – or, at least, a lack of the public version of ideas being digested. It’s a communal messiness of thought.

Developing good abstractions, notations, visualizations, and so forth, is improving the user interfaces for ideas. This helps both with understanding ideas for the first time and with thinking clearly about them. Conversely, if we can’t explain an idea well, that’s often a sign that we don’t understand it as well as we could…

Distillation is also hard.