Don Weingarten’s new^{a } attack [2105.04545] on the problem of defining wavefunction branches is the most important paper on this topic in several years — and hence, by my strange twisted lights, one of the most important recent papers in physics. Ultimately I think there are significant barriers to the success of this approach, but these may be surmountable. Regardless, the paper makes tons of progress in understanding the advantages and drawbacks of a definition of branches based on quantum complexity.
Here’s the abstract:
Don Weingarten
In this post I will give a non-rigorous bird’s-eye-view of the main ideas that seem most important to me. In particular, I will mostly concentrate on the non-relativistic setting and comment only briefly on the relativistic case (which Weingarten handles in detail). The following table of contents provides an outline of this blog post.
Summary of the paper’s core
My one-sentence summary of Weingarten’s proposal: The correct decomposition of the wavefunction into branches is given by the sum of orthogonal components that minimizes a linear combination of branch-norm entropy and (norm-weighted) mean squared branch complexity.
Review: geometric quantum circuit complexity [Sec III. A-C]
Quantum circuit complexity comes in several flavors. Weingarten uses Nielsen’s geometric version of complexity [quant-ph/0502070, quant-ph/0603161] with nearest-neighbor generators only^{b } and a zero-complexity reference class of product states, as I now define. Given the tensor-product Hilbert space $\mathcal{H} = \bigotimes_i \mathcal{H}_i$ over lattice sites $i$, the relative complexity between two states is the minimum amount of “time” $T$ necessary to evolve one state to the other using a “Hamiltonian” $H(t)$ with unit Hilbert-Schmidt norm constructed as the sum of nearest-neighbor terms. This means we consider

$$H(t) = \sum_{\langle i,j \rangle} h_{ij}(t), \qquad \|H(t)\|_{\mathrm{HS}} = 1, \qquad (1)$$

where the sum is over all nearest-neighbor pairs $\langle i,j \rangle$ and where each $h_{ij}(t)$ is a Hermitian operator on $\mathcal{H}_i \otimes \mathcal{H}_j$. More specifically, the relative complexity $\mathcal{C}(\psi, \psi')$ between two pure states $|\psi\rangle$ and $|\psi'\rangle$ is defined to be the minimum value of $T$ such that there exists at least one such schedule $H(t)$ satisfying

$$|\psi'\rangle = \mathcal{T} \exp\!\left( -i \int_0^T H(t)\, \mathrm{d}t \right) |\psi\rangle, \qquad (2)$$

where the exponentiated integral is time ordered by $\mathcal{T}$ in the usual way. Then, picking the zero-complexity reference class to be the set of all first-quantized^{c } product states, the (absolute) complexity of a single state is its minimum relative complexity with a product state:

$$\mathcal{C}(\psi) = \min_{\phi\ \mathrm{product}} \mathcal{C}(\psi, \phi). \qquad (3)$$

Very importantly, the squares of the complexity add over uncorrelated spatial regions: $\mathcal{C}^2(\psi_A \otimes \psi_B) = \mathcal{C}^2(\psi_A) + \mathcal{C}^2(\psi_B)$, where $A$ and $B$ are disjoint spatial regions.
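To make the ingredients of Eqs. (1)–(2) concrete, here is a small numerical sketch (my own illustration, not code from the paper): a random nearest-neighbor “Hamiltonian” on a four-qubit chain, normalized to unit Hilbert-Schmidt norm, driving a product state for time $T$. The helper names (`herm`, `embed`) are mine.

```python
import numpy as np

rng = np.random.default_rng(0)

def herm(d):
    """Random d x d Hermitian matrix."""
    a = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    return (a + a.conj().T) / 2

def embed(h2, i, n):
    """Embed a two-site operator h2 acting on qubits (i, i+1) into an n-qubit chain."""
    op = np.eye(1)
    for k in range(n):
        if k == i:
            op = np.kron(op, h2)      # covers sites i and i+1 at once
        elif k != i + 1:
            op = np.kron(op, np.eye(2))
    return op

n = 4
H = sum(embed(herm(4), i, n) for i in range(n - 1))
H /= np.sqrt(np.trace(H @ H).real)    # unit Hilbert-Schmidt norm, as in Eq. (1)
assert np.isclose(np.trace(H @ H).real, 1.0)

# Evolve a product state |0...0> for "time" T; for a time-independent
# schedule the time-ordered exponential in Eq. (2) reduces to exp(-i H T).
T = 2.0
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * T)) @ evecs.conj().T
psi = np.zeros(2 ** n, dtype=complex); psi[0] = 1.0
psi_T = U @ psi
assert np.isclose(np.linalg.norm(psi_T), 1.0)   # unitarity preserves the norm
```

The relative complexity is then the minimum such $T$ over all schedules reaching the target state, which is of course the hard part and not attempted here.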
Net complexity and the Weingarten decomposition [Sec V.]
For any candidate decomposition of the state into orthogonal (unnormalized) components, $|\psi\rangle = \sum_n |\psi_n\rangle$, Weingarten defines the net complexity of the decomposition to be

$$Q = \overline{\mathcal{C}^2} + b\, S, \qquad \overline{\mathcal{C}^2} = \sum_n p_n\, \mathcal{C}^2\!\left(\psi_n / \sqrt{p_n}\right), \qquad S = -\sum_n p_n \ln p_n, \qquad (4)$$

where $p_n = \langle \psi_n | \psi_n \rangle$ are the norms of the candidate branches, $\overline{\mathcal{C}^2}$ is the norm-weighted mean-squared-complexity of the decomposition, $S$ is the Shannon entropy of the branch norms, and $b$ is a free parameter with units of volume^{d }.
We can now state the heart of Weingarten’s proposal: At any given time, the correct decomposition of the wavefunction into branches is given by the orthogonal decomposition that minimizes the net complexity.
The trade-off between the amount of branching and the per-branch complexity is quantified by the as-yet unspecified parameter $b$. The interpretation and determination of $b$ are very subtle, which I will address in a forthcoming blog post, but for now let me point out two things.
- $b$ controls the “aggressiveness” of the branching: for $b \to \infty$ there is never any branching because the penalty from the branch entropy is too large, and for $b \to 0$ the branches are chosen to be unphysically fine-grained, since the net complexity vanishes for (and hence is minimized by) any decomposition into product states.
- $b$ shares some close similarities with the free parameter(s) in my proposed (partial) definition of branches and in objective collapse models. Among other things, $b$ sets a preferred length scale which as yet has no clear grounding.
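To make the trade-off concrete, here is a toy sketch (mine, not Weingarten’s) of the net-complexity functional of Eq. (4), with half-chain entanglement entropy standing in as a crude, computable proxy for the squared Nielsen complexity. The function names and the proxy are my own assumptions.

```python
import numpy as np

def net_complexity(branches, b, proxy_c2):
    """Net complexity of a candidate decomposition {psi_n} (unnormalized,
    mutually orthogonal): norm-weighted mean squared 'complexity' plus
    b times the Shannon entropy of the branch norms.  `proxy_c2` is a
    caller-supplied stand-in for the squared Nielsen complexity."""
    p = np.array([np.vdot(v, v).real for v in branches])
    p = p / p.sum()                              # branch norms, normalized
    mean_c2 = sum(pi * proxy_c2(v / np.linalg.norm(v))
                  for pi, v in zip(p, branches))
    entropy = float(-np.sum(p * np.log(p)))
    return mean_c2 + b * entropy

def half_chain_entropy(psi):
    """Entanglement entropy across the middle cut (toy complexity proxy)."""
    d = int(np.sqrt(psi.size))
    s = np.linalg.svd(psi.reshape(d, -1), compute_uv=False)
    lam = s[s > 1e-12] ** 2
    return float(-np.sum(lam * np.log(lam)))

# Two-qubit example: keep the Bell state whole, or branch it into |00> and |11>.
bell = np.zeros(4); bell[0] = bell[3] = 1 / np.sqrt(2)
e00 = np.zeros(4); e00[0] = 1 / np.sqrt(2)
e11 = np.zeros(4); e11[3] = 1 / np.sqrt(2)

q_unbranched = net_complexity([bell], 0.1, half_chain_entropy)
q_branched = net_complexity([e00, e11], 0.1, half_chain_entropy)
assert q_branched < q_unbranched   # small b: splitting into product branches wins

# Large b: the entropy penalty dominates and branching is disfavored.
assert net_complexity([e00, e11], 2.0, half_chain_entropy) \
       > net_complexity([bell], 2.0, half_chain_entropy)
```

The two assertions illustrate exactly the $b$-dependence described in the bullets above.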
Review: Second law of quantum complexity [Sec. III. D]
The viability of this proposal for wavefunction branches depends in large part on its satisfying the physically sensible properties discussed in the next subsection. Weingarten gives arguments that these properties are almost always satisfied, relying on a reasonable but imprecise conjecture from Brown, Susskind, and Zhao^{e } called the “Second Law of Quantum Complexity” [1507.02287, 1608.02612, 1701.01107], which I’ll briefly state.

The claim is that, for almost all nearest-neighbor Hamiltonians with unit Hilbert-Schmidt norm, and for almost all initial states with complexity much less than the maximum possible, the quantum-circuit complexity of a state grows nearly as fast as it can until it reaches maximum complexity, i.e., it nearly saturates the “speed limit”

$$\frac{\mathrm{d}\, \mathcal{C}(\psi(t))}{\mathrm{d}t} \le 1. \qquad (5)$$

One argument for this is based on the distribution of complexity over the space of states: the vast majority of states in Hilbert space have near-maximal complexity because $\mu(B_C) \ll \mu(\mathcal{S})$ unless $C$ is nearly maximal, where $\mu$ is the Haar measure, $\mathcal{S}$ is the full space of states, and $B_C$ is the set of states with complexity no larger than $C$. Thus, of all the directions in Hilbert space that Hamiltonian evolution might drive a state, the overwhelming majority lead to higher complexity.
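The “complexity almost always grows” intuition can be illustrated numerically, again with entanglement entropy standing in as a crude, computable proxy for complexity (my own illustration, not from the paper): evolving a product state under sweeps of Haar-random nearest-neighbor gates, the proxy rises from zero and saturates near its maximum.

```python
import numpy as np

rng = np.random.default_rng(1)

def haar_2q():
    """Haar-random two-qubit unitary via QR of a complex Gaussian matrix."""
    z = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))   # fix phases for Haar measure

def apply_2q(psi, u, i, n):
    """Apply a two-qubit gate u on sites (i, i+1) of an n-qubit state."""
    psi = psi.reshape(2 ** i, 4, 2 ** (n - i - 2))
    return np.einsum('ab,xby->xay', u, psi).reshape(-1)

def half_entropy(psi, n):
    """Half-chain entanglement entropy (our stand-in for complexity)."""
    s = np.linalg.svd(psi.reshape(2 ** (n // 2), -1), compute_uv=False)
    lam = s[s > 1e-12] ** 2
    return float(-np.sum(lam * np.log(lam)))

n = 8
psi = np.zeros(2 ** n, dtype=complex); psi[0] = 1.0
entropies = [half_entropy(psi, n)]
for layer in range(30):
    for i in range(n - 1):            # sweep of random nearest-neighbor gates
        psi = apply_2q(psi, haar_2q(), i, n)
    entropies.append(half_entropy(psi, n))

# The proxy starts at zero and saturates near its maximum (n/2) * ln 2.
assert entropies[0] < 1e-10
assert entropies[-1] > 0.7 * (n // 2) * np.log(2)
```

Of course entanglement saturates far sooner than complexity is conjectured to, which is one reason complexity, not entanglement, is the natural clock for irreversibility here.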
Claimed properties of the Weingarten decomposition [Sec. VI]
Weingarten argues for the following “binary-tree properties” of the branch decomposition of an evolving state, which are necessary for the time-dependent decomposition to form a binary tree that can be interpreted as wavefunction branches.
- The times when the number of branches changes, “branching times”, are isolated discrete moments.
- In between branching times, when the number of branches is constant, the decomposition is continuous in time, i.e., each branch at an earlier time is carried smoothly into one at a later time by the unitary evolution, rather than there being a “glitch” in the decomposition.
- At a branching time, the number of branches always increases by one, and in particular does not go down.
- At a branching time, two of the new branches always sum to one of the old ones; the rest of the branches are carried smoothly through the branching time by the unitary evolution.
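These properties amount to simple bookkeeping constraints on a literal binary tree, in which each branching event replaces one leaf with two orthogonal children that sum to it. The `Branch` class below is my own illustrative sketch, not anything defined in the paper.

```python
import numpy as np

class Branch:
    """A node in the branching tree: a (possibly unnormalized) state vector."""
    def __init__(self, psi):
        self.psi = np.asarray(psi, dtype=complex)
        self.children = []

    def split(self, psi_a, psi_b, atol=1e-10):
        """Branching event: replace this leaf by two orthogonal children
        that sum to it, per the binary-tree properties above."""
        psi_a = np.asarray(psi_a, dtype=complex)
        psi_b = np.asarray(psi_b, dtype=complex)
        assert np.allclose(psi_a + psi_b, self.psi, atol=atol)  # children sum to parent
        assert abs(np.vdot(psi_a, psi_b)) < atol                # children orthogonal
        self.children = [Branch(psi_a), Branch(psi_b)]
        return self.children

    def leaves(self):
        if not self.children:
            return [self]
        return [leaf for c in self.children for leaf in c.leaves()]

root = Branch([1 / np.sqrt(2), 0, 0, 1 / np.sqrt(2)])   # Bell-like state
assert len(root.leaves()) == 1
root.split([1 / np.sqrt(2), 0, 0, 0], [0, 0, 0, 1 / np.sqrt(2)])
assert len(root.leaves()) == 2    # branch count goes up by exactly one
```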
These are claimed to hold generically (i.e., for almost all states) on the sub-exponential (in system size) timescales for which the Brown-Susskind quantum-circuit complexity conjecture holds. Although Weingarten offers detailed arguments for the above conjectured properties, I believe some of them may fail in unusual or not-so-unusual situations. However, they, or a slightly weakened version of them, must hold for this circuit-complexity definition of branches to be viable as it currently stands.
Reading guide for skeptics
I believe many potential readers will be sufficiently wary of this overall project to not commit to an initial time investment of 49 dense pages. For them I suggest initially reading only Secs. I – XI (18 pages), passing over the proofs in Appendices A – C and in Secs. VII – X, but making sure not to miss the details in Secs. III – VII. Here’s why.
First, the bulk of the paper is divided cleanly between the non-relativistic case (Secs. III – XI and Appendices A – C) and the relativistic case (Secs. XII – XXIV and Appendices D – G). The basic ideas of complexity-based branches are well captured in the non-relativistic discussion, whereas the relativistic discussion is (in my opinion) relevant mostly if you are already convinced that this branch decomposition is very promising and you want to see the extent to which it can be reconciled with Lorentz covariance.
Second, the detailed proofs of the complexity bounds found in Sec. IV (proved in Appendices B and C) and in Secs. VII – X, impressive as they are, are not necessary for understanding the core of the paper. The proofs are a bit brute-force (e.g., carefully tracking the Schmidt spectra of a state) and are probably most interesting for thinking about how things would change if the complexity function were modified, e.g., if non-nearest-neighbor Hamiltonians or norms other than Hilbert-Schmidt were used.^{f } But if you’re trying to quickly figure out whether you should care about Weingarten’s main ideas, I would not get hung up on these details on an initial reading.
On the other hand, I found the (non-rigorous) arguments in Sec. VI for how the branching develops under time evolution to be super interesting beyond the bottom-line result. Although it could be cleaned up a bit, this discussion is really novel, bridging the two key ideas, complexity and wavefunction branches, that Weingarten has tied together for the first time.
Discussion
Here I’ll emphasize some key ideas that I think could easily be under-appreciated.
Weingarten’s generalized GHZ state
Readers of Weingarten’s paper should know that the multi-fermion state discussed in Sec. IV (“Complexity of entangled multi-fermion state”), which initially appeared quite mysterious to me, is just a particular lattice instantiation of an $M$-partite GHZ state with $N$ components (branches):

$$|\psi\rangle = \sum_{n=1}^{N} \frac{\omega_n}{\sqrt{N}}\, |\chi_1^{(n)}\rangle \otimes \cdots \otimes |\chi_M^{(n)}\rangle \otimes |\Omega\rangle \qquad (6)$$

with $|\omega_n| = 1$. Here, $\omega_n$ is the unit-norm complex phase of the $n$-th branch, $|\chi_k^{(n)}\rangle$ is the normalized state of subsystem $k$ (a subset of the lattice) conditional on the $n$-th branch, and $|\Omega\rangle$ is the vacuum state on the rest of the universe. Weingarten takes this basic multipartite entanglement structure and encodes it into the lattice in different spatial ways, which are parameterized by choices of spatial regions, spin choices, and surfaces. In particular, the subsystems (parts) of the GHZ state are spatial regions $R_k$, and, conditional on branch $n$, the state of such a region is constructed from the vacuum by placing fermions of a branch-dependent spin at each site in a subregion (while leaving the rest of $R_k$ as vacuum)^{g }; this ensures that the $|\chi_k^{(n)}\rangle$ are orthogonal as required. The surfaces are used by Weingarten to specify spatial gaps between the regions which, as he shows, have implications for the complexity of the state.
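A minimal numerical version of this entanglement structure (my own construction; the qudit encoding here is much cruder than Weingarten’s fermionic lattice encoding) builds an $M$-partite, $N$-component GHZ state with unit-modulus branch phases and checks that the branches are orthogonal with equal norms.

```python
import numpy as np

M, N = 3, 4            # M parts, N branches; each part is one qudit of dimension N
rng = np.random.default_rng(2)
omega = np.exp(2j * np.pi * rng.random(N))   # unit-modulus branch phases

def basis(n, d=N):
    e = np.zeros(d, dtype=complex); e[n] = 1.0
    return e

def branch(n):
    """Branch n: basis label n placed on every one of the M parts."""
    v = np.array([omega[n] / np.sqrt(N)], dtype=complex)
    for _ in range(M):
        v = np.kron(v, basis(n))
    return v

branches = [branch(n) for n in range(N)]
psi = sum(branches)

# Branches are mutually orthogonal, each with norm 1/sqrt(N); psi is normalized.
for a in range(N):
    for m in range(a + 1, N):
        assert abs(np.vdot(branches[a], branches[m])) < 1e-12
assert np.isclose(np.linalg.norm(branches[0]), 1 / np.sqrt(N))
assert np.isclose(np.linalg.norm(psi), 1.0)
```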
Motivating net complexity
When we model the macroscopic world with quantum states, we generally assume that widely separated spatial regions do not initially share much entanglement. (For example, when we model a measurement, we generally assume the measuring device begins completely uncorrelated with the measured system.) Roughly speaking, the macroscopic world looks like a tensor network of bounded bond dimension or, at least, such states are consistent with our observations to good accuracy. Yet, if the wavefunction of the world is just evolving unitarily, the amount of entanglement should be increasing steadily, as we evolve from an out-of-equilibrium area-law toward a heat-death volume-law.
The putative explanation for this is that, although the full wavefunction of the world is quickly accruing entanglement on larger scales, the bond dimension of each branch remains bounded. This requires a sufficiently high rate of branching to “absorb” all the entanglement being generated and, of course, that the branches are chosen so that each individually has low entanglement even though their sum does not.
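The key linear-algebra fact behind this picture is that a sum of low-bond-dimension branches generically has higher bond dimension than any individual branch, so splitting off branches can keep per-branch bond dimension bounded. A tiny sketch (mine):

```python
import numpy as np

def bond_rank(psi, d_left):
    """Schmidt rank across a cut with left Hilbert-space dimension d_left."""
    s = np.linalg.svd(psi.reshape(d_left, -1), compute_uv=False)
    return int(np.sum(s > 1e-12))

up = np.array([1, 0], dtype=complex)
down = np.array([0, 1], dtype=complex)

# Each candidate branch is a product state (bond dimension 1 at every cut) ...
branch_a = np.kron(np.kron(up, up), up) / np.sqrt(2)
branch_b = np.kron(np.kron(down, down), down) / np.sqrt(2)
assert bond_rank(branch_a, 2) == 1 and bond_rank(branch_b, 2) == 1

# ... but their sum (a 3-qubit GHZ state) needs bond dimension 2 at every cut.
psi = branch_a + branch_b
assert bond_rank(psi, 2) == 2 and bond_rank(psi, 4) == 2
```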
There are of course very many ways to quantify many-body entanglement, but if you already think that complexity is a particularly elegant one, and in particular if you think the proposed second law of complexity is likely to capture important features of the irreversibility necessary for a sensible definition of branches, then Weingarten’s decomposition based on net complexity is a fairly natural guess: it optimizes for the smallest amount of branching that achieves the lowest per-branch complexity.
As an aside, Weingarten’s proposal also suggests an alternative version: perhaps, in one spatial dimension, branches should be given by the decomposition that minimizes a “net MPS entanglement”, a quantity

$$\widetilde{Q} = \sum_n p_n\, E(\psi_n) + b\, S, \qquad (7)$$

obtained from net complexity by replacing the squared complexity function on candidate branches with some function $E$ that measures the degree of entanglement in the MPS representation of $\psi_n$. For instance, one might consider the spatial mean bond entropy

$$E(\psi_n) = \frac{1}{L-1} \sum_{\ell=1}^{L-1}\ \sum_{\lambda \in \Lambda_\ell^{(n)}} \left( -\lambda \ln \lambda \right), \qquad (8)$$

where the first sum is over the MPS bonds for a spatial lattice of size $L$ and where $\Lambda_\ell^{(n)}$ is the set of unnormalized Schmidt values for branch $n$ at the bond between sites $\ell$ and $\ell+1$. Besides its not being immediately clear whether this is well behaved mathematically, it seems somewhat less likely to me that this would correctly describe the all-important irreversibility of branches. But maybe this general direction could be worth thinking about.
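For concreteness, here is a sketch (my own, using normalized states rather than the unnormalized branches of Eq. (8)) of the spatial mean bond entropy, computed by sweeping SVDs across the bonds of a qubit chain:

```python
import numpy as np

def mean_bond_entropy(psi, L):
    """Average, over the L-1 bonds of a chain of L qubits, of the entanglement
    entropy of the Schmidt values at each bond (a sketch of the idea in Eq. (8),
    here with normalized states)."""
    total = 0.0
    for ell in range(1, L):
        s = np.linalg.svd(psi.reshape(2 ** ell, -1), compute_uv=False)
        lam = s[s > 1e-12] ** 2
        total += float(-np.sum(lam * np.log(lam)))
    return total / (L - 1)

L = 4
ghz = np.zeros(2 ** L); ghz[0] = ghz[-1] = 1 / np.sqrt(2)
prod = np.zeros(2 ** L); prod[0] = 1.0

assert np.isclose(mean_bond_entropy(ghz, L), np.log(2))  # ln 2 at every bond
assert np.isclose(mean_bond_entropy(prod, L), 0.0)       # product state: no entanglement
```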
Relation between complexity-based and records-based branches
One notable deficiency of the purely records-based definition [1608.05377] of branches I wrote about is that it cannot absorb sufficient entanglement to ensure the bond dimension necessary to represent branches remains bounded, as collaborators^{h } and I later discovered. Some form of record-based branches could still be consistent with observations and basically the correct choice, but they would have a bunch of hidden entanglement that was infeasible to detect and would not admit a tensor-network representation. (We will be writing about this in more detail in the future.)
In contrast, Weingarten’s complexity-based definition looks like it may identify much finer-grained branches that have bounded bond dimension and consequently obey an area law. See Weingarten’s discussion, using different language, in Sec. X. Insofar as complexity-based branches and records-based branches are roughly compatible (very speculative), the complexity-based decomposition would be a fine-graining of the records-based one. This would mean that each coarse-grained branch (distinguished from other such branches by records) would be a sum of fine-grained branches that are macroscopically indistinguishable yet essentially independently evolving in the sense that it would be infeasible to detect coherence between the different fine-grained branches. This has important implications for using branch decompositions to run faster numerical simulations.
Spacetime independence and Poincaré symmetry
In both the relativistic and non-relativistic cases, one could ask that a reasonable definition of branching obey:
- Spatial independence. Non-interacting and uncorrelated spatial regions branch independently: if $\psi_A$ in isolation is branched into $\psi_A^1 + \psi_A^2$, and likewise $\psi_B$ in isolation is branched into $\psi_B^1 + \psi_B^2$, then $\psi_A \otimes \psi_B$ should be branched into $\sum_{i,j} \psi_A^i \otimes \psi_B^j$.
- Temporal independence. Branching happens independently of past branching, conditional on those branches: if the full wavefunction has already branched into $\psi_1 + \psi_2$ (in the same spatial region), and if additionally $\psi_1$ would individually be branched into $\psi_1^a + \psi_1^b$ as above, then the wavefunction will branch into $\psi_1^a + \psi_1^b + \psi_2$.
For the Weingarten decomposition, the spatial independence follows from the additivity of Shannon^{i } entropy and the aforementioned additivity of the squared complexity. Temporal independence would follow from the claimed binary-tree properties.
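The additivity underlying spatial independence is elementary to check numerically: the Shannon entropy of the product of two branch-norm distributions is the sum of the individual entropies, so the entropy penalty in the net complexity does not couple the two regions’ minimizations (and the squared complexity adds similarly). A quick sketch (mine):

```python
import numpy as np

def shannon(p):
    """Shannon entropy of a probability distribution (no zero entries)."""
    p = np.asarray(p, dtype=float)
    return float(-np.sum(p * np.log(p)))

pA = np.array([0.7, 0.3])          # branch norms of region A in isolation
pB = np.array([0.5, 0.25, 0.25])   # branch norms of region B in isolation

# Joint branch norms of the product decomposition psi_A^i (x) psi_B^j.
pAB = np.outer(pA, pB).ravel()

assert np.isclose(pAB.sum(), 1.0)
assert np.isclose(shannon(pAB), shannon(pA) + shannon(pB))  # additivity
```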
In the second half of the paper, Weingarten proposes a method for defining branches in a (flat-space) relativistic setting that enjoys a certain flavor of Lorentz invariance. He makes use of a random lattice (on spacetime) with a Lorentz-invariant density and, loosely following Adrian Kent’s ideas [0708.3710, 1311.0249, 1608.04805], defines a branch decomposition at asymptotically late time that plausibly is boost- and translation-covariant in the limit of small lattice density. A frame-dependent branch decomposition at finite time can then be obtained by evolving the decomposition backward. As Weingarten correctly emphasizes, this frame dependence is to be expected: spacelike separated branching events will occur in different time order in different frames.
Although I don’t know enough about relativistic random lattices to say anything with authority, Weingarten’s approach to defining complexity in the relativistic setting looks like a novel and very interesting set of ideas quite independent of its application to branching.
I’ll close by noting that space and time will not be treated on completely equal footing by any sensible notion of branching, and this is to be expected: branching is a thermodynamic process, characterized by effective irreversibility, and so is intimately connected to the specialness of the low-entropy state on a spacelike hypersurface in the distant past (with no similar assumption about timelike hypersurfaces at large spatial distances).
Footnotes
(↵ returns to text)
- I previously blogged about earlier work by Weingarten on a related topic. This new paper directly addresses my previous concerns.↵
- That is, complexity is defined with a metric on the space of all unitaries that assigns infinite distance to all directions except those generated by nearest-neighbor Hamiltonians.↵
- A “first-quantized product state” is a particle-number eigenstate in which each particle has a pure spatial wavefunction uncorrelated with the others; it is constructed from the vacuum state $|\Omega\rangle$ with field operators $\hat{a}^\dagger(x)$ and single-particle wavefunctions $\phi_k(x)$ as $\prod_k \left( \int \mathrm{d}x\, \phi_k(x)\, \hat{a}^\dagger(x) \right) |\Omega\rangle$. This is distinct from a “second-quantized product state”, which is the global state on a lattice formed as the tensor product of single-site pure states.↵
- The units here are slightly tricky. Recall that the spatial continuum wavefunction $\psi(x)$ has units of $(\text{volume})^{-1/2}$, so that $|\psi(x)|^2$ is a probability density over space, which has units of inverse volume. Complexity is dimensionless when defined on a discrete spatial lattice of spacing $a$, but it has to be re-normalized by an appropriate power of the lattice spacing to get a well-behaved continuum limit. (The existence of this limit is not proven by Weingarten, which I will discuss in a future post.) The parameter $b$ then must have units of volume to agree with the units of squared complexity.↵
- Brown et al. often work with the circuit version of quantum complexity, but Weingarten uses Nielsen’s geometric version.↵
- Also: There are, to my knowledge, very few rigorous bounds on the complexity of quantum states in the literature. These proof techniques may be of significant interest even to people who don’t care about branches.↵
- In other words, you can roughly think of this as a many-subsystem, many-branch GHZ state where the subsystems are disjoint spatial regions and the branch is recorded on a region using the location of a bundle of fermions.↵
- Dan Ranard, Markus Hauru, Curt von Keyserlingk.↵
- In particular, this would not hold if some other entropy like the Rényi entropy were used in the definition of the net complexity.↵
In a Facebook comment, Howard Wiseman writes
My response:
Yes, I agree this is unlikely to be a literal measure-zero type situation, as Weingarten claims. (I’m in the process of writing another post about some of the problems with his proposal, of which this is one.) But I think it’s plausible something similar to Weingarten’s proposal could work. The aspect of this proposal that is particularly promising to me, and that may help evade the situations you raise, is that it is trying to directly bake in *irreversibility* (without modifying quantum mechanics).
There are (at least) three key issues:
1. Is it actually feasible to reverse Wigner’s friend in the real universe? It seems likely to me that the resources necessary for this grow exponentially in some measure of the relative complexity between the two versions of the friend, and thus the chance that any Wigner anywhere in the universe accomplishes this task is negligible. One could still object that FAPP isn’t good enough for a fundamental theory, but I would still say massive progress had been made in understanding quantum mechanics without measurement.
2. Will we accept a theory that decides whether a candidate branching event “really happened” at a certain time based on whether those branches recombine in the (perhaps distant) future? It seems distasteful…but hard to reject out of hand given our limitations as physical agents who have access only to our physically encoded memories.
3. How the heck do we think about the post-heat-death universe when FAPP irreversibility breaks down?