I prepared the following extended abstract for the Spacetime and Information Workshop as part of my continuing mission to corrupt physicists while they are still young and impressionable. I reproduce it here for your reading pleasure.
Finding a precise definition of branches in the wavefunction of closed many-body systems is crucial to conceptual clarity in the foundations of quantum mechanics. Toward this goal, we propose amplification, which can be quantified, as the key feature characterizing anthropocentric measurement; this immediately and naturally extends to non-anthropocentric amplification, such as the ubiquitous case of classically chaotic degrees of freedom decohering. Amplification can be formalized as the production of redundant records distributed over spatially disjoint regions, a certain form of multi-partite entanglement in the pure quantum state of a large closed system. If this definition can be made rigorous and shown to be unique, it is then possible to ask many compelling questions about how branches form and evolve.
A recent result shows that branch decompositions are highly constrained just by this requirement that they exhibit redundant local records. The set of all redundantly recorded observables induces a preferred decomposition into simultaneous eigenstates unless their records are highly extended and delicately overlapping, as exemplified by the Shor error-correcting code. A maximum length scale for records is enough to guarantee uniqueness. However, this result is grounded in a preferred tensor decomposition into independent microscopic subsystems associated with spatial locality. This structure breaks down in a relativistic setting on scales smaller than the Compton wavelength of the relevant field. Indeed, a key insight from algebraic quantum field theory is that finite-energy states are never exact eigenstates of local operators, and hence never have exact records that are spatially disjoint, although they can approximate this arbitrarily well on large scales.… [continue reading]
I’m happy to use this bully pulpit to advertise that the following paper has been deemed “probably not terrible”, i.e., published.
When the wave function of a large quantum system unitarily evolves away from a low-entropy initial state, there is strong circumstantial evidence it develops “branches”: a decomposition into orthogonal components that is indistinguishable from the corresponding incoherent mixture with feasible observations. Is this decomposition unique? Must the number of branches increase with time? These questions are hard to answer because there is no formal definition of branches, and most intuition is based on toy models with arbitrarily preferred degrees of freedom. Here, assuming only the tensor structure associated with spatial locality, I show that branch decompositions are highly constrained just by the requirement that they exhibit redundant local records. The set of all redundantly recorded observables induces a preferred decomposition into simultaneous eigenstates unless their records are highly extended and delicately overlapping, as exemplified by the Shor error-correcting code. A maximum length scale for records is enough to guarantee uniqueness. Speculatively, objective branch decompositions may speed up numerical simulations of nonstationary many-body states, illuminate the thermalization of closed systems, and demote measurement from fundamental primitive in the quantum formalism.
Here’s the figure (the editor tried to convince me that this figure appeared on the cover for purely aesthetic reasons, and that this does not mean my letter is the best thing in the issue… but I know better!) and caption:
Spatially disjoint regions with the same coloring (e.g., the solid blue regions) denote different records for the same observable… [continue reading]
One way to think about the relevance of decoherence theory to measurement in quantum mechanics is that it reduces the preferred basis problem to the preferred subsystem problem; merely specifying the system of interest (by delineating it from its environment or measuring apparatus) is enough, in important special cases, to derive the measurement basis. But this immediately prompts the question: what are the preferred systems? I spent some time in grad school with my advisor trying to see if I could identify a preferred system just by looking at a large many-body Hamiltonian, but never got anything worth writing up.
I’m pleased to report that Cotler, Penington, and Ranard have tackled a closely related problem, and made a lot more progress:
Essential to the description of a quantum system are its local degrees of freedom, which enable the interpretation of subsystems and dynamics in the Hilbert space. While a choice of local tensor factorization of the Hilbert space is often implicit in the writing of a Hamiltonian or Lagrangian, the identification of local tensor factors is not intrinsic to the Hilbert space itself. Instead, the only basis-invariant data of a Hamiltonian is its spectrum, which does not manifestly determine the local structure. This ambiguity is highlighted by the existence of dualities, in which the same energy spectrum may describe two systems with very different local degrees of freedom. We argue that in fact, the energy spectrum alone almost always encodes a unique description of local degrees of freedom when such a description exists, allowing one to explicitly identify local subsystems and how they interact.
… [continue reading]
Chris Olah coins the term “research debt” to discuss a bundle of related destructive phenomena in research communities:
Poor Exposition – Often, there is no good explanation of important ideas and one has to struggle to understand them. This problem is so pervasive that we take it for granted and don’t appreciate how much better things could be.
Undigested Ideas – Most ideas start off rough and hard to understand. They become radically easier as we polish them, developing the right analogies, language, and ways of thinking.
Bad abstractions and notation – Abstractions and notation are the user interface of research, shaping how we think and communicate. Unfortunately, we often get stuck with the first formalisms to develop even when they’re bad. For example, an object with extra electrons is negative, and pi is wrong.
Noise – Being a researcher is like standing in the middle of a construction site. Countless papers scream for your attention and there’s no easy way to filter or summarize them. We think noise is the main way experts experience research debt.
Shout it from the rooftops (my emphasis):
It’s worth being clear that research debt isn’t just about ideas not being explained well. It’s a lack of digesting ideas – or, at least, a lack of the public version of ideas being digested. It’s a communal messiness of thought.
Developing good abstractions, notations, visualizations, and so forth, is improving the user interfaces for ideas. This helps both with understanding ideas for the first time and with thinking clearly about them. Conversely, if we can’t explain an idea well, that’s often a sign that we don’t understand it as well as we could…
Distillation is also hard.
… [continue reading]
In his new article in the NY Review of Books, the titan Steven Weinberg expresses more sympathy for the importance of the measurement problem in quantum mechanics. The article has nothing new for folks well-versed in quantum foundations, but Weinberg demonstrates a command of the existing arguments and considerations. The lengthy excerpts below characterize what I think are the most important aspects of his view.
Many physicists came to think that the reaction of Einstein and Feynman and others to the unfamiliar aspects of quantum mechanics had been overblown. This used to be my view. After all, Newton’s theories too had been unpalatable to many of his contemporaries…Evidently it is a mistake to demand too strictly that new physical theories should fit some preconceived philosophical standard.
In quantum mechanics the state of a system is not described by giving the position and velocity of every particle and the values and rates of change of various fields, as in classical physics. Instead, the state of any system at any moment is described by a wave function, essentially a list of numbers, one number for every possible configuration of the system….What is so terrible about that? Certainly, it was a tragic mistake for Einstein and Schrödinger to step away from using quantum mechanics, isolating themselves in their later lives from the exciting progress made by others. Even so, I’m not as sure as I once was about the future of quantum mechanics. It is a bad sign that those physicists today who are most comfortable with quantum mechanics do not agree with one another about what it all means. The dispute arises chiefly regarding the nature of measurement in quantum mechanics…
The introduction of probability into the principles of physics was disturbing to past physicists, but the trouble with quantum mechanics is not that it involves probabilities.
… [continue reading]
In linear algebra, and therefore quantum information, the singular value decomposition (SVD) is elementary, ubiquitous, and beautiful. However, I only recently realized that its expression in bra-ket notation is very elegant. The SVD is equivalent to the statement that any operator \(\hat{A}\) can be expressed as
\[
\hat{A} = \sum_i \lambda_i \vert u_i \rangle \langle v_i \vert ,
\]
where \(\{\vert u_i \rangle\}\) and \(\{\vert v_i \rangle\}\) are orthonormal sets of vectors, possibly in Hilbert spaces with different dimensionality, and the \(\lambda_i \ge 0\) are the singular values.
That’s it.… [continue reading]
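As a concrete sketch of this ket-bra form (my illustration, using NumPy's SVD conventions, where the rows of `Vh` are already the bras \(\langle v_i \vert\)):

```python
import numpy as np

# Rebuild a random complex operator A as a sum of rank-one terms
# lambda_i |u_i><v_i|, with {|u_i>} and {|v_i>} orthonormal and
# lambda_i >= 0 the singular values.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))

U, s, Vh = np.linalg.svd(A, full_matrices=False)

# Each term s[i] * |u_i><v_i|; np.outer does not conjugate, and the
# rows of Vh are already the conjugated bras.
A_rebuilt = sum(s[i] * np.outer(U[:, i], Vh[i, :]) for i in range(len(s)))

assert np.allclose(A, A_rebuilt)
```

Note that the two orthonormal sets here live in spaces of dimension 4 and 3, so this works even for non-square (and hence non-Hermitian) operators.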
Late, alas. Also: there have been a couple of complaints about the spam filter for comments on this blog, and I’m trying to track down the issue. The filter is supposed to tell you what’s wrong and help you successfully post the comment. If you’ve been unable to get past the filter, or if it’s just too much of a hassle even when you can get past it, please let me know so I can try to fix this.
Europe’s Galileo satellite navigation system recently went online, although without yet a complete constellation. In just a few years, there will be four fully independent navigation systems from the great powers: the EU, the US (GPS), Russia (GLONASS), and China (BeiDou). Devices are already being built to use all four systems at once. Everyone wins through the increased redundancy and satellite count.
Design of the Solo cup.
I highly recommend this semi-technical talk on ARC fusion reactor design by Dennis Whyte.
(Video DownloadHelper allows downloading video off YouTube.)
Proposed in 2014 by Whyte and collaborators, ARC is a newer, still-under-development alternative to the traditional tokamak-style reactor, in which rare-earth barium copper oxide (ReBCO) superconductors play a crucial role. Whyte argues that the key hold-up on fusion reactors is their absolute size, which necessitates large-scale, lumbering international collaboration. ReBCO superconductors are the key technical advance allowing smaller magnetic confinement, and the parameters of these designs scale extremely well with increased magnetic field. Significant downsides include increased vessel pressure and pulsed operation because of intrinsic limitations on neutron shielding. The fusion fuel is deuterium and tritium, the most amenable choice of reactants on the fusion slope of the nuclear binding energy curve.
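To get a rough feel for why stronger fields matter so much (my back-of-envelope numbers, not figures from the talk): at fixed plasma beta and machine size, fusion power density scales like \(B^4\), so the higher fields that ReBCO tape enables pay off steeply.

```python
# Approximate on-axis fields; my illustrative numbers, not from the talk.
B_iter = 5.3   # ITER design field, tesla (approximate)
B_arc = 9.2    # ARC design field, tesla (approximate)

# At fixed beta and volume, fusion power density scales ~ B^4.
power_density_gain = (B_arc / B_iter) ** 4
print(f"~{power_density_gain:.0f}x power density at the same beta and volume")
```

So a field less than twice as strong buys roughly an order of magnitude in power density, which is what lets the machine shrink.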
… [continue reading]
Bousso has a recent paper bounding the maximum information that can be sent by a signal from first principles in QFT:
I derive a universal upper bound on the capacity of any communication channel between two distant systems. The Holevo quantity, and hence the mutual information, is at most of order \(ET/\hbar\), where \(E\) is the average energy of the signal, and \(T\) is the amount of time for which detectors operate. The bound does not depend on the size or mass of the emitting and receiving systems, nor on the nature of the signal. No restrictions on preparing and processing the signal are imposed. As an example, I consider the encoding of information in the transverse or angular position of a signal emitted and received by systems of arbitrarily large cross-section. In the limit of a large message space, quantum effects become important even if individual signals are classical, and the bound is upheld.
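To get a sense of scale (my order-of-magnitude illustration, not a calculation from the paper), the bound \(ET/\hbar\) is enormous even for tiny signals, which is why it never bites in everyday communication:

```python
# The bound says the mutual information is at most of order E*T/hbar (nats).
hbar = 1.054571817e-34  # reduced Planck constant, J*s

E = 1e-19  # signal energy, joules (~1 eV; an arbitrary example)
T = 1e-9   # detector operation time, seconds (arbitrary example)

bound_nats = E * T / hbar
print(f"capacity bound ~ {bound_nats:.2e} nats")
```

Even a single-eV signal detected over a nanosecond could in principle carry nearly a million nats, so the bound only becomes interesting in extreme regimes.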
Here’s his first figure:
This all stems from vacuum entanglement, an oft-neglected aspect of QFT that Bousso doesn’t emphasize in the paper as the key ingredient. (I thank Scott Aaronson for first pointing this out.) The gradient term in the Hamiltonian for QFTs means that the value of the field at two nearby locations is always entangled. In particular, the values \(\phi(x)\) and \(\phi(x+\Delta x)\) are sometimes considered independent degrees of freedom but, for a state with bounded energy, they can’t actually take arbitrarily different values as \(\Delta x\) becomes small, or else the gradient contribution to the Hamiltonian violates the energy bound. Technically this entanglement exists over arbitrary distances, but it is exponentially suppressed on scales larger than the Compton wavelength of the field.… [continue reading]
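This is easy to see numerically in a toy model (my construction, not from Bousso's paper): discretize a free scalar field on a 1-D lattice, so \(H = \tfrac{1}{2}\sum_i p_i^2 + \tfrac{1}{2}\phi^T K \phi\) with the gradient coupling and mass term in \(K\). The Gaussian ground state then has \(\langle \phi_i \phi_j \rangle = \tfrac{1}{2}(K^{-1/2})_{ij}\), so nearby sites are correlated (entangled), with correlations suppressed beyond roughly a Compton wavelength \(\sim 1/m\):

```python
import numpy as np

# Lattice scalar field: K has mass term m^2 on the diagonal plus the
# gradient coupling (2 on the diagonal, -1 on the off-diagonals).
N, m = 200, 0.5  # number of sites; mass in units of inverse lattice spacing
K = np.diag((m**2 + 2.0) * np.ones(N)) \
    - np.diag(np.ones(N - 1), 1) - np.diag(np.ones(N - 1), -1)

# Ground-state field covariance: <phi_i phi_j> = (1/2) (K^{-1/2})_ij.
w2, V = np.linalg.eigh(K)                # K = V diag(w2) V^T, w2 > 0
K_inv_sqrt = V @ np.diag(w2**-0.5) @ V.T
corr = 0.5 * K_inv_sqrt

i = N // 2
print("correlation at distance 1: ", corr[i, i + 1])
print("correlation at distance 20:", corr[i, i + 20])  # ~10 Compton lengths
```

Neighboring sites are strongly correlated, while sites many Compton wavelengths apart are correlated only at an exponentially suppressed level, just as the prose above describes.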
[This post was originally “Part 0”, but it’s been moved. Other parts in this series: 1,2,3,4,5,6,7.]
In an ideal world, the formalism that you use to describe a physical system is in a one-to-one correspondence with the physically distinct configurations of the system. But sometimes it can be useful to introduce additional descriptions, in which case it is very important to understand the unphysical over-counting (e.g., gauge freedom). A scalar potential \(V(x)\) is a very convenient way of representing the vector force field, \(\vec{F}(x) = -\vec{\nabla} V(x)\), but any constant shift in the potential, \(V(x) \to V(x) + C\), yields forces and dynamics that are indistinguishable, and hence the value of the potential on an absolute scale is unphysical.
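A minimal numerical check of this over-counting (my toy example): the forces derived from \(V(x)\) and \(V(x)+C\) are identical, so no dynamics can distinguish them.

```python
import numpy as np

# An arbitrary 1-D potential and an arbitrary constant shift.
x = np.linspace(0.0, 1.0, 101)
V = np.sin(2 * np.pi * x)
C = 17.3

F1 = -np.gradient(V, x)        # F = -dV/dx from the original potential
F2 = -np.gradient(V + C, x)    # force from the shifted potential

assert np.allclose(F1, F2)     # the shift is pure gauge: same forces
```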
One often hears that a quantum experiment measures an observable, but this is wrong, or very misleading, because it vastly over-counts the physically distinct sorts of measurements that are possible. It is much more precise to say that a given apparatus, with a given setting, simultaneously measures all observables with the same eigenvectors. More compactly, an apparatus measures an orthogonal basis – not an observable. (We can also allow for the measured observable to be degenerate, in which case the apparatus simultaneously measures all observables with the same degenerate eigenspaces. To be abstract, you could say it measures a commuting subalgebra, with the nondegenerate case corresponding to the subalgebra having maximum dimensionality, i.e., the same number of dimensions as the Hilbert space. Commuting subalgebras with maximum dimension are in one-to-one correspondence with orthonormal bases, modulo multiplying the vectors by pure phases.) You can probably start to see this by just noting that there’s no actual, physical difference between measuring an observable \(\hat{A}\) and measuring \(\hat{A}^3\); the apparatuses that would perform the two measurements are identical.… [continue reading]
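This is easy to verify directly (my illustration): a Hermitian observable and its cube share the same eigenvectors, so the Born-rule outcome probabilities for any state are identical, and the two "measurements" are physically the same procedure.

```python
import numpy as np

# A random 4x4 Hermitian observable A.
rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (M + M.conj().T) / 2

# A^3 has the same eigenvectors as A (eigenvalues get cubed).
vals, vecs = np.linalg.eigh(A)
vals3, vecs3 = np.linalg.eigh(A @ A @ A)

# A random normalized state.
psi = rng.standard_normal(4) + 1j * rng.standard_normal(4)
psi /= np.linalg.norm(psi)

# Born-rule probabilities for "measuring A" vs "measuring A^3".
p_A = np.abs(vecs.conj().T @ psi) ** 2
p_A3 = np.abs(vecs3.conj().T @ psi) ** 2

assert np.allclose(np.sort(p_A), np.sort(p_A3))
```

The outcomes differ only in the labels attached to them (eigenvalues \(\lambda_i\) versus \(\lambda_i^3\)), not in which basis vector the state collapses onto or with what probability.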