The way that most physicists teach and talk about partial differential equations is horrible, and has surprisingly big costs for the typical understanding of the foundations of the field even among professionals. The chief victims are students of thermodynamics and analytical mechanics, and I’ve mentioned before that the preface of Sussman and Wisdom’s Structure and Interpretation of Classical Mechanics is a good starting point for thinking about these issues. As a pointed example, in this blog post I’ll look at how badly the Legendre transform is taught in standard textbooks (I was pleased to note, as this essay went to press, that my choice of Landau, Goldstein, and Arnold was confirmed as the “standard” suggestions by the top Google results), and compare it to how it could be taught. In a subsequent post, I’ll use this as a springboard for complaining about the way we record and transmit physics knowledge.
Before we begin: turn away from the screen and see if you can remember what the Legendre transform accomplishes mathematically in classical mechanics. (If not, can you remember the definition? I couldn’t, a month ago.) I don’t just mean that the Legendre transform converts the Lagrangian into the Hamiltonian and vice versa, but rather: what key mathematical/geometric property does the Legendre transform have, compared to the cornucopia of other function transforms, that allows it to connect these two conceptually distinct formulations of mechanics?… [continue reading]
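(For readers whose memory needs jogging: the standard textbook statements, which are separate from the geometric question posed above, are the convex-analysis definition and its application connecting the Lagrangian to the Hamiltonian.)

```latex
f^*(p) \;=\; \sup_x \bigl[\, p\,x - f(x) \,\bigr],
\qquad
H(q,p) \;=\; p\,\dot q - L(q,\dot q)
\quad\text{with}\quad
p = \frac{\partial L}{\partial \dot q}.
```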
I prepared the following extended abstract for the Spacetime and Information Workshop as part of my continuing mission to corrupt physicists while they are still young and impressionable. I reproduce it here for your reading pleasure.
Finding a precise definition of branches in the wavefunction of closed many-body systems is crucial to conceptual clarity in the foundations of quantum mechanics. Toward this goal, we propose amplification, which can be quantified, as the key feature characterizing anthropocentric measurement; this immediately and naturally extends to non-anthropocentric amplification, such as the ubiquitous case of classically chaotic degrees of freedom decohering. Amplification can be formalized as the production of redundant records distributed over spatially disjoint regions, a certain form of multi-partite entanglement in the pure quantum state of a large closed system. If this definition can be made rigorous and shown to be unique, it is then possible to ask many compelling questions about how branches form and evolve.
A recent result shows that branch decompositions are highly constrained just by this requirement that they exhibit redundant local records. The set of all redundantly recorded observables induces a preferred decomposition into simultaneous eigenstates unless their records are highly extended and delicately overlapping, as exemplified by the Shor error-correcting code.… [continue reading]
I’m happy to use this bully pulpit to advertise that the following paper has been deemed “probably not terrible”, i.e., published.
When the wave function of a large quantum system unitarily evolves away from a low-entropy initial state, there is strong circumstantial evidence it develops “branches”: a decomposition into orthogonal components that is indistinguishable from the corresponding incoherent mixture with feasible observations. Is this decomposition unique? Must the number of branches increase with time? These questions are hard to answer because there is no formal definition of branches, and most intuition is based on toy models with arbitrarily preferred degrees of freedom. Here, assuming only the tensor structure associated with spatial locality, I show that branch decompositions are highly constrained just by the requirement that they exhibit redundant local records. The set of all redundantly recorded observables induces a preferred decomposition into simultaneous eigenstates unless their records are highly extended and delicately overlapping, as exemplified by the Shor error-correcting code. A maximum length scale for records is enough to guarantee uniqueness. Speculatively, objective branch decompositions may speed up numerical simulations of nonstationary many-body states, illuminate the thermalization of closed systems, and demote measurement from fundamental primitive in the quantum formalism.
… [continue reading]
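As a toy illustration of redundant local records (my own minimal sketch, not taken from the paper), consider a GHZ-type state: each qubit separately carries a record of which branch it is in, and locally the branches look like an incoherent mixture.

```python
import numpy as np

# Two "branches", (|000> + |111>)/sqrt(2), each redundantly recorded
# in every one of the three qubits.
psi = np.zeros(8, dtype=complex)
psi[0b000] = psi[0b111] = 1 / np.sqrt(2)

# Reduced density matrix of the first qubit: trace out the other two.
rho0 = psi.reshape(2, 4) @ psi.reshape(2, 4).conj().T
print(rho0.real)  # diag(1/2, 1/2): no local coherence between branches
```

A Z-basis measurement of any single qubit reveals the branch, which is what makes the records redundant.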
One way to think about the relevance of decoherence theory to measurement in quantum mechanics is that it reduces the preferred basis problem to the preferred subsystem problem; merely specifying the system of interest (by delineating it from its environment or measuring apparatus) is enough, in important special cases, to derive the measurement basis. But this immediately prompts the question: what are the preferred systems? I spent some time in grad school with my advisor trying to see if I could identify a preferred system just by looking at a large many-body Hamiltonian, but never got anything worth writing up.
I’m pleased to report that Cotler, Penington, and Ranard have tackled a closely related problem, and made a lot more progress:
Essential to the description of a quantum system are its local degrees of freedom, which enable the interpretation of subsystems and dynamics in the Hilbert space. While a choice of local tensor factorization of the Hilbert space is often implicit in the writing of a Hamiltonian or Lagrangian, the identification of local tensor factors is not intrinsic to the Hilbert space itself. Instead, the only basis-invariant data of a Hamiltonian is its spectrum, which does not manifestly determine the local structure.
… [continue reading]
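The abstract’s point that the spectrum alone does not determine local structure can be illustrated with a tiny two-qubit toy model (my own sketch, not from the paper): conjugating a non-interacting Hamiltonian by an entangling unitary preserves the spectrum but changes which tensor factorization makes it look local.

```python
import numpy as np

Z, I = np.diag([1.0, -1.0]), np.eye(2)
H_local = np.kron(Z, I) + np.kron(I, Z)  # non-interacting: sum of single-site terms

# Conjugate by CNOT, an entangling basis change.
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]], float)
H_rot = CNOT @ H_local @ CNOT.T

# Identical spectra ...
print(np.sort(np.linalg.eigvalsh(H_local)), np.sort(np.linalg.eigvalsh(H_rot)))
# ... but H_rot acquires a two-body ZZ interaction (Pauli coefficient tr(H (Z kron Z))/4):
print(np.trace(H_local @ np.kron(Z, Z)) / 4, np.trace(H_rot @ np.kron(Z, Z)) / 4)
```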
Chris Olah coins the term “research debt” to discuss a bundle of related destructive phenomena in research communities:
Poor Exposition – Often, there is no good explanation of important ideas and one has to struggle to understand them. This problem is so pervasive that we take it for granted and don’t appreciate how much better things could be.
Undigested Ideas – Most ideas start off rough and hard to understand. They become radically easier as we polish them, developing the right analogies, language, and ways of thinking.
Bad abstractions and notation – Abstractions and notation are the user interface of research, shaping how we think and communicate. Unfortunately, we often get stuck with the first formalisms to develop even when they’re bad. For example, an object with extra electrons is negative, and pi is wrong.
Noise – Being a researcher is like standing in the middle of a construction site. Countless papers scream for your attention and there’s no easy way to filter or summarize them. We think noise is the main way experts experience research debt.
Shout it from the rooftops (my emphasis):
It’s worth being clear that research debt isn’t just about ideas not being explained well. It’s a lack of digesting ideas – or, at least, a lack of the public version of ideas being digested.
… [continue reading]
In his new article in the NY Review of Books, the titan Steven Weinberg expresses more sympathy for the importance of the measurement problem in quantum mechanics. The article has nothing new for folks well-versed in quantum foundations, but Weinberg demonstrates a command of the existing arguments and considerations. The lengthy excerpts below characterize what I think are the most important aspects of his view.
Many physicists came to think that the reaction of Einstein and Feynman and others to the unfamiliar aspects of quantum mechanics had been overblown. This used to be my view. After all, Newton’s theories too had been unpalatable to many of his contemporaries…Evidently it is a mistake to demand too strictly that new physical theories should fit some preconceived philosophical standard.
In quantum mechanics the state of a system is not described by giving the position and velocity of every particle and the values and rates of change of various fields, as in classical physics. Instead, the state of any system at any moment is described by a wave function, essentially a list of numbers, one number for every possible configuration of the system….What is so terrible about that? Certainly, it was a tragic mistake for Einstein and Schrödinger to step away from using quantum mechanics, isolating themselves in their later lives from the exciting progress made by others.
… [continue reading]
In linear algebra, and therefore quantum information, the singular value decomposition (SVD) is elementary, ubiquitous, and beautiful. However, I only recently realized that its expression in bra-ket notation is very elegant. The SVD is equivalent to the statement that any operator $\hat{A}$ can be expressed as
$$\hat{A} = \sum_i \lambda_i \vert l_i \rangle \langle r_i \vert,$$
where $\{\vert l_i \rangle\}$ and $\{\vert r_i \rangle\}$ are orthonormal sets of vectors, possibly in Hilbert spaces with different dimensionality, and the $\lambda_i \ge 0$ are the singular values.
That’s it.… [continue reading]
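A quick numerical check of this form (my own sketch): NumPy’s SVD returns exactly the ingredients of the sum above, with $\vert l_i \rangle$ the columns of `U` and $\langle r_i \vert$ the rows of `Vh`.

```python
import numpy as np

rng = np.random.default_rng(0)
# A generic map between Hilbert spaces of different dimension (3 and 5).
A = rng.normal(size=(3, 5)) + 1j * rng.normal(size=(3, 5))

U, s, Vh = np.linalg.svd(A, full_matrices=False)
# A = sum_i  s_i |l_i><r_i| , with |l_i> = U[:, i] and <r_i| = Vh[i, :]
A_rebuilt = sum(s[i] * np.outer(U[:, i], Vh[i, :]) for i in range(len(s)))
print(np.allclose(A, A_rebuilt))  # True
```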
Late, alas. Also: there have been a couple of complaints about the spam filter for comments on this blog, and I’m trying to track down the issue. The filter is supposed to tell you what’s wrong and help you successfully post the comment. If you’ve been unable to get past the filter, or if it’s just too much of a hassle even when you can get past it, please let me know so I can try to fix this.
… [continue reading]
Bousso has a recent paper bounding the maximum information that can be sent by a signal from first principles in QFT:
I derive a universal upper bound on the capacity of any communication channel between two distant systems. The Holevo quantity, and hence the mutual information, is at most of order $ET/\hbar$, where $E$ is the average energy of the signal, and $T$ is the amount of time for which detectors operate. The bound does not depend on the size or mass of the emitting and receiving systems, nor on the nature of the signal. No restrictions on preparing and processing the signal are imposed. As an example, I consider the encoding of information in the transverse or angular position of a signal emitted and received by systems of arbitrarily large cross-section. In the limit of a large message space, quantum effects become important even if individual signals are classical, and the bound is upheld.
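To get a feel for the scale of the bound, here is a back-of-envelope sketch with numbers of my own choosing (not from the paper): a signal carrying roughly one near-infrared photon’s worth of energy, detected over a one-second window.

```python
# Bousso's bound: mutual information is at most of order E*T/hbar nats.
hbar = 1.054571817e-34   # J*s, reduced Planck constant
E = 2.5e-19              # J: roughly one near-infrared photon (assumed example)
T = 1.0                  # s: detector operating time (assumed example)

bound_nats = E * T / hbar
bound_bits = bound_nats / 0.6931471805599453  # divide by ln 2
print(f"~{bound_nats:.2e} nats, ~{bound_bits:.2e} bits")
```

Even a single photon over a second allows an enormous (though finite) number of bits, which is why the bound only bites in regimes like the large-message-space limit discussed in the abstract.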
Here’s his first figure:
This all stems from vacuum entanglement, an oft-neglected aspect of QFT that Bousso doesn’t emphasize in the paper as the key ingredient. (I thank Scott Aaronson for first pointing this out.) The gradient term in the Hamiltonian for QFTs means that the value of the field at two nearby locations is always entangled.… [continue reading]
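This can be seen already in the simplest discretization (a sketch of my own, with assumed parameters): the ground state of a periodic chain of coupled harmonic oscillators, i.e., a lattice-regularized free scalar field, has nonzero two-point correlations $\langle \phi_0 \phi_d \rangle$ between distinct sites, which for a Gaussian ground state reflects the vacuum entanglement between them.

```python
import numpy as np

# Periodic harmonic chain: H = sum_i [ pi_i^2/2 + (phi_{i+1}-phi_i)^2/2 + m^2 phi_i^2/2 ]
N, m = 64, 0.1                                  # assumed: 64 sites, small mass
k = 2 * np.pi * np.arange(N) / N                # lattice momenta
omega = np.sqrt(m**2 + 4 * np.sin(k / 2)**2)    # dispersion relation

def corr(d):
    """Ground-state two-point function <phi_0 phi_d> (exact for this Gaussian state)."""
    return np.sum(np.cos(k * d) / (2 * omega)) / N

print([round(corr(d), 4) for d in (0, 1, 5)])   # nonzero, decaying with separation
```

The correlations at separation $d > 0$ come entirely from the gradient coupling; setting the hopping term to zero would make the ground state a product state over sites.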