Links for July 2016

  • Strokes affecting the language-processing parts of the brain often manifest as expressive aphasia or fluent aphasia. Both are very grave disabilities, but they can be fascinating. The latter looks like this:
  • Good HN discussion surrounding a Nature article on how bicycles are steered.
  • The results from the arXiv survey are in. Nature characterizes them as very conservative, but I was shocked to find that ~58% of respondents thought “Allow readers to comment on papers” was very important, important, or somewhat important. From Andrej Karpathy:

    I developed and maintain Arxiv Sanity Preserver (http://www.arxiv-sanity.com/), one of the Arxiv overlays the article mentions. I built it to try address some of the pains that the “raw” arXiv introduces, such as being flooded by paper submissions without any support or tools for sifting through them.

    I’m torn on how Arxiv should proceed in becoming more complex. I support what seems to be the cited poll consensus (“The message was more or less ‘stay focused on the basic dissemination task, and don’t get distracted by getting overextended or going commercial’”) and I think the simplicity/rawness of arXiv was partly what made it succeed, but there is also a clear value proposition offered by more advanced search/filter/recommendation tools like Arxiv Sanity Preserver. It’s not clear to me to what extent arXiv should strive to develop these kinds of features internally.

    Whether they go a simple or more complex route, I really hope that they keep their API open and allow 3rd party developers such as myself to explore new ways of making the arXiv repository useful to researchers. Somewhat disappointingly, the arXiv poll they ran did not include any mention of their API, which in my opinion is critical, overlooked and somehow undervalued.

[continue reading]

Bleg: Classical theory of measurement and amplification

I’m in search of an authoritative reference giving a foundational/information-theoretic approach to classical measurement. What abstract physical properties are necessary and sufficient?

Motivation: The Copenhagen interpretation treats the measurement process as a fundamental primitive, and this persists in most uses of quantum mechanics outside of foundations. Of course, the modern view is that the measurement process is just another physical evolution, where the state of a macroscopic apparatus is conditioned on the state of a microscopic quantum system in some basis determined by their mutual interaction Hamiltonian. The apparent nonunitary aspects of the evolution inferred by the observer arise because the measured system is coupled to the observer himself; the global evolution of the system-apparatus-observer system is formally modeled as unitary (although the philosophical meaningfulness/ontology/reality of the components of the wavefunction corresponding to different measurement outcomes is disputed).
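
Schematically, in the textbook von Neumann picture of premeasurement, the interaction unitarily correlates the system’s pointer states |s_i\rangle with macroscopically distinct apparatus states |A_i\rangle:

\left( \sum_i c_i |s_i\rangle \right) \otimes |A_0\rangle \;\longrightarrow\; \sum_i c_i \, |s_i\rangle \otimes |A_i\rangle .

This is globally unitary; the apparent collapse appears only in the reduced description used by an observer who is himself ultimately correlated with the apparatus.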

Eventually, we’d like to be able to identify all laboratory measurements as just an anthropocentric subset of wavefunction branching events. I am very interested in finding a mathematically precise criterion for branching. Note that the branches themselves may be only precisely defined in some large-N or thermodynamic limit. a   Ideally, I would like to find a property that everyone agrees must apply, at the least, to laboratory measurement processes, and (with as little change as possible) use this to find all branches — not just ones that result from laboratory measurements. Right now I find the structure of spatially-redundant information in the many-body wavefunction to be a very promising approach. b  
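
To give a flavor of the sort of criterion I have in mind (a sketch, not a settled definition; the notation is mine): call a record of a system \mathcal{S} redundant if many disjoint fragments \mathcal{F} of the environment each carry nearly all of the accessible information about it,

I(\mathcal{S} : \mathcal{F}) \geq (1 - \delta) \, H(\mathcal{S}) ,

where I is the quantum mutual information, H(\mathcal{S}) is the entropy of the decohered system, and \delta is a small information deficit. The redundancy R_\delta is the number of such disjoint fragments, and candidate branches would be decompositions of the wavefunction whose outcomes are recorded with R_\delta \gg 1.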

It seems sensible to begin with what is necessary for a classical measurement, since classical measurements ought to be analyzable without all the philosophical baggage that plagues discussions of quantum measurement.… [continue reading]

Comments on an essay by Wigner

[PSA: Happy 4th of July. Juno arrives at Jupiter tonight!]

This is short and worth reading:

The sharp distinction between Initial Conditions and Laws of Nature was initiated by Isaac Newton and I consider this to be one of his most important, if not the most important, accomplishment. Before Newton there was no sharp separation between the two concepts. Kepler, to whom we owe the three precise laws of planetary motion, tried to explain also the size of the planetary orbits, and their periods. After Newton's time the sharp separation of initial conditions and laws of nature was taken for granted and rarely even mentioned. Of course, the first ones are quite arbitrary and their properties are hardly parts of physics while the recognition of the latter ones are the prime purpose of our science. Whether the sharp separation of the two will stay with us permanently is, of course, as uncertain as is all future development but this question will be further discussed later. Perhaps it should be mentioned here that the permanency of the validity of our deterministic laws of nature became questionable as a result of the realization, due initially to D. Zeh, that the states of macroscopic bodies are always under the influence of their environment; in our world they can not be kept separated from it.

This essay has no formal abstract; the above is the second paragraph, which I find to be profound. Here is the PDF. The essay shares the same name and much of the material with Wigner’s 1963 Nobel lecture [PDF]. The Nobel lecture has a nice bit contrasting invariance principles with covariance principles, and dynamical invariance principles with geometrical invariance principles. [continue reading]

Links for June 2016

  • Another transmissible cancer found, this time in mollusks.
  • The “modern” pentathlon is bizarre:

    The modern pentathlon is an Olympic sport that comprises five very different events: fencing, 200 m freestyle swimming, show jumping, and a final combined event of pistol shooting and a 3200 m cross-country run. The sport has been a core sport of the Olympic Games since 1912 despite dispute…

    The addition of modern to the name distinguished it from the original pentathlon of the ancient Olympic Games, which consisted of the stadion foot race, wrestling, long jump, javelin, and discus. As the events of the ancient pentathlon were modeled after the skills of the ideal soldier of that time, Coubertin created the contest to simulate the experience of a 19th-century cavalry soldier behind enemy lines: he must ride an unfamiliar horse, fight enemies with pistol and sword, swim, and run to return to his own soldiers.

  • Sketches of the flying car design being funded by Larry Page. (H/t Scott Alexander.)
  • Why keep making new car commercials when you can just make one with a dummy car and digitally add in the car after the fact?
  • Everyone should know Moore’s here-is-one-hand argument:

    In his 1925 essay A Defence of Common Sense, Moore argues against idealism and skepticism toward the external world on the grounds that skeptics could not give reasons to accept their metaphysical premises that were more plausible to him than the reasons he had to accept the common sense claims about our knowledge of the world that skeptics and idealists must deny. In other words, he is more willing to believe that he has a hand than to believe the premises of what he deems “a strange argument in a university classroom.” “I do not think it is rational to be as certain of any one of these … propositions”….

[continue reading]

Abstracts for May-June 2016

Lots of matter interference experiments this time, because they are awesome.

  • Quantum Interference of a Microsphere
    H. Pino, J. Prat-Camps, K. Sinha, B. P. Venkatesh, and O. Romero-Isart
    We propose and analyze an all-magnetic scheme to perform a Young’s double slit experiment with a micron-sized superconducting sphere of mass 10^{13} amu. We show that its center of mass could be prepared in a spatial quantum superposition state with an extent of the order of half a micrometer. The scheme is based on magnetically levitating the sphere above a superconducting chip and letting it skate through a static magnetic potential landscape where it interacts for short intervals with quantum circuits. In this way a protocol for fast quantum interferometry is passively implemented. Such a table-top earth-based quantum experiment would operate in a parameter regime where gravitational energy scales become relevant. In particular we show that the faint parameter-free gravitationally-induced decoherence collapse model, proposed by Diósi and Penrose, could be unambiguously falsified.


    An extremely exciting and ambitious proposal. I have no ability to assess the technical feasibility, and my prior is that this is too hard, but the authors are solid. Their formalism and thinking are very clean, and hence quite abstracted away from the nitty-gritty of the experiment.

  • Do the laws of quantum physics still hold for macroscopic objects -- this is at the heart of Schrodinger's cat paradox -- or do gravitation or yet unknown effects set a limit for massive particles? What is the fundamental relation between quantum physics and gravity? Ground-based experiments addressing these questions may soon face limitations due to limited free-fall times and the quality of vacuum and microgravity.
[continue reading]

Comments on Hanson’s The Age of Em

One of the main sources of hubris among physicists is that we think we can communicate essential ideas faster and more exactly than many others. This isn’t just a choice of compact terminology or an ability to recall shared knowledge. It also has to do with a responsive throttling of the level of detail to match the listener’s ability to follow, and quick questions which allow the listener to home in on things they don’t understand. This leads to a sense of frustration when talking to others who use different methods. Of course this sensation isn’t overwhelming evidence that our methods actually are better and function as described above, just that they are different. But come on. a   Robin Hanson’s Age of Em is an incredible written example of the efficient transfer of (admittedly speculative) insights. I highly recommend it.

In places where I am trained to expect writers to insert fluff and repeat themselves — without actually clarifying — Hanson states his case concisely once, then plows through to new topics. There are several times where I think he leaps without sufficient justification (at least given my level of background knowledge), but there is a stunning lack of fluff. The ideas are jammed in edgewise.



There are usually two reasons academic papers must be read slowly: explicit unpacking of complex subjects, and convoluted language. Hanson’s book is a great example of something that must be read slowly because of the former, with no hint of the latter. Although he freely calls on economics concepts that non-economists might have to look up, his language is always incredibly direct and clear. Hanson is an academic Hemingway.… [continue reading]

My talk on ideal quantum Brownian motion

I have blogged before about the conceptual importance of ideal, symplectic-covariant quantum Brownian motion (QBM). In short: QBM is to open quantum systems as the harmonic oscillator is to closed quantum systems. Like the harmonic oscillator, (a) QBM is universal because it’s the leading-order behavior of a Taylor series expansion; (b) QBM evolution has a very intuitive interpretation in terms of wavepackets evolving under classical flow; and (c) QBM is exactly solvable.

If that sounds like a diatribe up your alley, then you are in luck. I recently ranted about it here at PI. It’s just a summary of the literature; there are no new results. As always, I recommend downloading the raw video file so you can run it at arbitrary speed.


Abstract: In the study of closed quantum systems, the simple harmonic oscillator is ubiquitous because all smooth potentials look quadratic locally, and exhaustively understanding it is very valuable because it is exactly solvable. Although not widely appreciated, Markovian quantum Brownian motion (QBM) plays almost exactly the same role in the study of open quantum systems. QBM is ubiquitous because it arises from only the Markov assumption and linear Lindblad operators, and it likewise has an elegant and transparent exact solution. QBM is often introduced with specific non-Markovian models like Caldeira-Leggett, but this makes it very difficult to see which phenomena are universal and which are idiosyncratic to the model. Like frictionless classical mechanics or nonrenormalizable field theories, the exact Markov property is aphysical, but handling this subtlety is a small price to pay for the extreme generality.
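
For reference, the master equation I have in mind is (up to conventions) the most general Lindblad equation with a Hamiltonian at most quadratic, and Lindblad operators at most linear, in \hat{x} and \hat{p}:

\partial_t \rho = -\frac{i}{\hbar} [\hat{H}, \rho] + \sum_k \left( \hat{L}_k \rho \hat{L}_k^\dagger - \frac{1}{2} \{ \hat{L}_k^\dagger \hat{L}_k , \rho \} \right) , \qquad \hat{L}_k = a_k \hat{x} + b_k \hat{p} .

Expanding the dissipator yields the familiar position- and momentum-diffusion terms, and (together with the allowed quadratic \hat{x}\hat{p} term in \hat{H}) the friction term of QBM; I am suppressing the coefficients, so treat this as a schematic form rather than a canonical normalization.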
[continue reading]

Bullshit in science

Francisco Azuaje (emphasis mine):

According to American philosopher Harry Frankfurt (here’s Frankfurt’s popular essay [PDF] a  ), a key difference between liars and bullshitters is that the former tend to accept that they are not telling the truth, while the latter simply do not care whether something is true or not.

Bullshitters strive to maximize personal gain through a continuing distortion of reality. If something is true and can be manipulated to achieve their selfish objectives, then good. If something is not true, who cares? All the same. These attributes make bullshitting worse than lying.

Furthermore, according to Frankfurt, it is the bullshitter’s capacity to get away with bullshitting so easily that makes them particularly dangerous. Individuals in prominent positions of authority may be punished for lying, especially if lying has serious damaging consequences. Professional and casual bullshitters at all levels of influence typically operate with freedom. Regardless of their roles in society, their exposure is not necessarily accompanied by negative legal or intellectual consequences, at least for the bullshitter…

Researchers may also be guilty of bullshitting by omission. This is the case when they do not openly challenge bullshitting positions, either in the public or academic settings. Scientists frequently wrongly assume that the public always has knowledge of well-established scientific facts. Moreover, scientists sometimes over-estimate the moderating role of the media or their capacity to differentiate facts from falsehood, and solid from weaker evidence.

Bullshitting happens. But very often it is a byproduct of indifference. Indifference frequently masking a fear of appearing confrontational to peers and funders. Depending on where you are or with whom you work, frontal bullshit fighting may not be good for career advancement.

[continue reading]

Comments on Rosaler’s “Reduction as an A Posteriori Relation”

In a previous post of abstracts, I mentioned philosopher Josh Rosaler’s attempt to clarify the distinction between empirical and formal notions of “theoretical reduction”. Reduction is just the idea that one theory reduces to another in some limit, like special relativity reduces to Galilean kinematics in the limit of small velocities. Confusingly, philosophers use a reversed convention; they say that Galilean mechanics reduces to special relativity. a   Formal reduction is when this takes the form of some mathematical limiting procedure (e.g., v/c \to 0), whereas empirical reduction is an explanatory statement about observations (e.g., “special relativity can explain the empirical usefulness of Galilean kinematics”).
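
To make the formal notion concrete with the standard example: the Lorentz boost

x' = \gamma (x - v t) , \qquad t' = \gamma \left( t - \frac{v x}{c^2} \right) , \qquad \gamma = \left( 1 - v^2/c^2 \right)^{-1/2} ,

goes over to the Galilean transformation x' = x - v t, t' = t as v/c \to 0, since \gamma \to 1 and the v x / c^2 term becomes negligible. That limiting procedure is a formal reduction; the corresponding empirical reduction is the further claim that this limit explains why Galilean kinematics worked so well in the low-velocity experiments that preceded relativity.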

Rosaler’s criticism, which I mostly agree with, is that folks often conflate these two. Usually this isn’t a serious problem, since the holes can be patched up on the fly by a competent physicist, but sometimes it leads to serious trouble. The most egregious case, and the one that got me interested in all this, is the quantum-classical transition, and in particular the serious insufficiency of existing \hbar \to 0 limits to explain the appearance of macroscopic classicality. Even though this limiting procedure recovers the classical equations of motion, it fails spectacularly to recover the state space. There are multiple quantum states that have the same classical analog as \hbar \to 0, and there are quantum states that have no classical analog as \hbar \to 0. b  

In this post I’m going to comment on Rosaler’s recent elaboration of this idea. I thank him for discussing this topic and, full disclosure, we’re drafting a paper about set selection together. c  :

Reduction between theories in physics is often approached as an a priori relation in the sense that reduction is often taken to depend only on a comparison of the mathematical structures of two theories.
[continue reading]

Links for May 2016

  • The Peacock Spider (Maratus speciosus):

    If you haven’t already seen the BBC Earth bit on the birds of paradise, check it out.
  • If you use Zotero and iOS, then check out PaperShip. I have two or three minor complaints, but on the whole it is very high quality.
  • The New Mexico whiptail is like a mule in that it’s a hybrid of two species, but unlike the mule it can reproduce by semi-cloning:

    The New Mexico whiptail (Cnemidophorus neomexicanus) is a female species of lizard found in the southern United States in New Mexico and Arizona, and in northern Mexico in Chihuahua. It is the official state reptile of New Mexico. It is one of many lizard species known to be parthenogenic. Individuals of the species can be created either through the hybridization of the little striped whiptail (C. inornatus) and the western whiptail (C. tigris), or through the parthenogenic reproduction of an adult New Mexico whiptail.

    The hybridization of these species prevents healthy males from forming whereas males do exist in both parent species (see Sexual differentiation). Parthenogenesis allows the resulting all-female population to reproduce and thus evolve into a unique species capable of reproduction. This combination of interspecific hybridization and parthenogenesis exists as a reproductive strategy in several species of whiptail lizard within the Cnemidophorus genus to which the New Mexico whiptail belongs.

    And in the extremely unlikely event that you don’t already know what parthenogenesis is…

    Parthenogenesis… is a natural form of asexual reproduction in which growth and development of embryos occur without fertilization. In animals, parthenogenesis means development of an embryo from an unfertilized egg cell and is a component process of apomixis.

[continue reading]

Links for April 2016

  • Paul Christiano has bet me $500 at even odds that a self-driving car can be reliably hailed by a member of the general public in at least 10 North American cities by July 2023.

    Details: At least 8 cities must be outside the San Francisco Bay Area. The car must be available on at least 50% of days, i.e., not confined to very narrow weather or traffic situations. The car must be self-delivering, in the sense that it drives itself to the user, but not necessarily fully self-driving, in the sense that the user might need to drive it to the destination. (It’s easy to imagine tech and regulatory scenarios where self-driven cars are limited to speeds that are unacceptably slow during transportation of passengers, like the ~20 mph that Google’s car usually does, but are sufficient for getting to the hailing passenger if the density is high enough.) Carl Shulman will adjudicate any edge cases.

    I ascribe a 45% chance that a self-delivering car reaches this threshold, and 38% chance that a fully self-driving car does.

    Here’s a list of optimistic predictions for self-driving car timelines, which notably doesn’t mention the recent Google pessimism.

  • People I know build great stuff!

    My brother Will is an electrical engineer at Apple. He has been heavily involved in improving Apple display technology for the past two years, especially the True Tone feature and especially the iPad Pro 9.7. Well, the reviews from the experts are in:

    The Absolute Color Accuracy of the iPad Pro 9.7 is Truly Impressive as shown in these Figures. It is the most color accurate display that we have ever measured. It is visually indistinguishable from perfect, and is very likely considerably better than any mobile display, monitor, TV or UHD TV that you have.

[continue reading]

Abstracts for March-April 2016

  • Unruh effect without trans-horizon entanglement
    Carlo Rovelli and Matteo Smerlak
    We estimate the transition rates of a uniformly accelerated Unruh-DeWitt detector coupled to a quantum field with reflecting conditions on a boundary plane (a “mirror”). We find that these are essentially indistinguishable from the usual Unruh rates, viz. that the Unruh effect persists in the presence of the mirror. This shows that the Unruh effect (thermality of detector rates) is not merely a consequence of the entanglement between left and right Rindler quanta in the Minkowski vacuum. Since in this setup the state of the field in the Rindler wedge is pure, we argue furthermore that the relevant entropy in the Unruh effect cannot be the von Neumann entanglement entropy. We suggest, as an alternative, that it is the Shannon entropy associated with Heisenberg uncertainty.

    See also the related works by Gooding and Unruh, which connect to Pikovski et al. (blogged here).

  • What is the Entropy in Entropic Gravity?
    Sean M. Carroll and Grant N. Remmen
    We investigate theories in which gravity arises as a consequence of entropy. We distinguish between two approaches to this idea: holographic gravity, in which Einstein's equation arises from keeping entropy stationary in equilibrium under variations of the geometry and quantum state of a small region, and thermodynamic gravity, in which Einstein's equation emerges as a local equation of state from constraints on the area of a dynamical lightsheet in a fixed spacetime background. Examining holographic gravity, we argue that its underlying assumptions can be justified in part using recent results on the form of the modular energy in quantum field theory. For thermodynamic gravity, on the other hand, we find that it is difficult to formulate a self-consistent definition of the entropy, which represents an obstacle for this approach.
[continue reading]

Redundant consistency

I’m happy to announce the recent publication of a paper by Mike, Wojciech, and myself.

The Objective Past of a Quantum Universe: Redundant Records of Consistent Histories
C. Jess Riedel, Wojciech H. Zurek, and Michael Zwolak
Motivated by the advances of quantum Darwinism and recognizing the role played by redundancy in identifying the small subset of quantum states with resilience characteristic of objective classical reality, we explore the implications of redundant records for consistent histories. The consistent histories formalism is a tool for describing sequences of events taking place in an evolving closed quantum system. A set of histories is consistent when one can reason about them using Boolean logic, i.e., when probabilities of sequences of events that define histories are additive. However, the vast majority of the sets of histories that are merely consistent are flagrantly nonclassical in other respects. This embarras de richesses (known as the set selection problem) suggests that one must go beyond consistency to identify how the classical past arises in our quantum universe. The key intuition we follow is that the records of events that define the familiar objective past are inscribed in many distinct systems, e.g., subsystems of the environment, and are accessible locally in space and time to observers. We identify histories that are not just consistent but redundantly consistent using the partial-trace condition introduced by Finkelstein as a bridge between histories and decoherence. The existence of redundant records is a sufficient condition for redundant consistency. It selects, from the multitude of the alternative sets of consistent histories, a small subset endowed with redundant records characteristic of the objective classical past. The information about an objective history of the past is then simultaneously within reach of many, who can independently reconstruct it and arrive at compatible conclusions in the present.
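
For readers who don’t know the formalism, here is the basic structure, schematically. A history \alpha is specified by a time-ordered product of Heisenberg-picture projectors, the class operator C_\alpha = P^{(n)}_{\alpha_n}(t_n) \cdots P^{(1)}_{\alpha_1}(t_1), and the decoherence functional of a set of histories is

D(\alpha, \beta) = \mathrm{Tr}\left[ C_\alpha \, \rho \, C_\beta^\dagger \right] .

Consistency is the requirement that the off-diagonal terms vanish (at minimum, their real parts), so that the diagonal entries D(\alpha, \alpha) are additive probabilities for the histories. Roughly speaking, the partial-trace condition mentioned in the abstract strengthens this by demanding that the off-diagonal terms vanish already under a partial trace over the environment, which is what ties consistency to records; see the paper for the precise statement.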
[continue reading]

ArXiv and Zotero surveys

Quick note: the arXiv is administering a survey of user opinion on potential future changes, many of which were discussed previously on this blog. It can be reached by clicking the banner on the top of the arXiv homepage. I encourage you to take the survey if you haven’t already. (Doubly so if you agree with me…)

Likewise, Zotero is administering a somewhat shorter survey about what sorts of folks use Zotero and what they do with it.

To the question “Do you have suggestions for any of the above-mentioned new services, or any other new services you would like to see in arXiv?”, I responded:

I think the most important thing for the arXiv to do would be to “nudge” authors toward releasing their work with a copyleft, e.g., Creative Commons – Attribution. (Or at least stop nudging them toward the minimal arXiv license, as is done now in the submission process.) For instance, make it clear to authors that if they publish in various open access journals then they should release the arXiv post under a similarly permissive license. Also, make it easier for authors to make the license more permissive at a later date, once they know where they are publishing. So long as there is informed consent, anything that would increase the number of papers which can be built on (not just distributed) would be an improvement.

I would also like the arXiv to think about allowing for more fine-grained contribution tracking in the long term. I predict that collaboratively written documents will become much more common, and for this it will be necessary to produce a record of who changed what, as on GitHub, with greater detail than merely the list of authors.

[continue reading]

Links for March 2016

  • With AlphaGo’s victory, Carl Shulman won his $100 bet with me (announced before the match here). In hindsight, here is a bit more evidence that AlphaGo’s win isn’t that shocking — perhaps even right on schedule — and therefore shouldn’t cause you to update much on overall AI progress:

    Comment from mjn:

    Fwiw, the point where the Go curve massively changes slope is when Monte-Carlo Tree Search (MCTS) began to be used in its modern form. I think that’s been an underreported part of AlphaGo’s success: deep networks get the lion’s share of the press, but AlphaGo is a hybrid deep-learning / MCTS system, and MCTS is arguably the most important of the algorithmic breakthroughs that led to computer Go being able to reach expert human level strength.

    (HN discussion.) John Langford concurs on the importance of MCTS.

  • Also: Ken Jennings welcomes Lee Sedol to the Human Loser Club. And: Do the Go prodigies of Asia have a future? (H/t Tyler Cowen.) These articles basically write themselves.
  • Also from Tyler: It was only a matter of time before Facebook began to hire reporters. And: “Will all of economic growth be absorbed into life extension?“:

    Some technologies save lives—new vaccines, new surgical techniques, safer highways. Others threaten lives—pollution, nuclear accidents, global warming, and the rapid global transmission of disease. How is growth theory altered when technologies involve life and death instead of just higher consumption? This paper shows that taking life into account has first-order consequences. Under standard preferences, the value of life may rise faster than consumption, leading society to value safety over consumption growth. As a result, the optimal rate of consumption growth may be substantially lower than what is feasible, in some cases falling all the way to zero.

[continue reading]