PI accepting 2017 master’s student applications

The Perimeter Scholars International (PSI) program is now accepting applications for its Master’s program, to start next fall. The due date is February 1st. Me previously:

If you’re in your last year as an undergrad, I strongly advise you (seriously) to consider applying. Your choice of grad school is 80% of the selection power determining your thesis topic, and that topic places very strong constraints on your entire academic career. The more your choice is informed by actual physics knowledge (rather than the apparent impressiveness of professors and institutions), the better. An additional year at a new institution taking classes with new teachers can really help.


Here’s the poster and a brand new propaganda video:
[continue reading]

Sank argues for a SciRate issue tracker

SciRate is the best location I know of for public discussion and feedback on academic papers, and is an impressive open-source achievement by Aram Harrow and collaborators. Right now it has the most traction in the field of quantum information (quantum info leading the way, as usual…), but it could stand to become more popular, and to expand into other fields.

My colleague and good friend Dan Sank proposes a small but important tweak for SciRate: issue tracking, à la GitHub.

Issues in Scirate?

Scirate enables us to express comments/opinions on published works. Another very useful kind of feedback for research papers is issues. By “issue” I mean exactly the kind of thing I’m writing right now: a description of

  1. a problem with the work which can be definitively fixed, or
  2. a possible improvement to that product.

This differs from comments, which are statements of opinion that don’t require any reaction from the author. We all know that issues are essential in developing software, and based on a recent experience where I used GitHub to host development of a research paper with three coauthors and more than a dozen group members providing feedback, I think that issues should also be used for research papers.

It might be nice to attach an issue tracker to Scirate, or at least have Scirate give links to an external issue tracker attached to each paper.

Why not just use a public github repo and get the issue tracker for free?

Making a GitHub repo public makes everything public, including any sensitive information, such as comments about particular works/people. Having written a paper using GitHub, I can imagine the authors would not want to make that repo public before going through the entire issue history making sure nobody said anything embarrassing/demeaning/etc.
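As a sketch of how SciRate could link each paper to an external tracker, here is a minimal Python example. GitHub’s real REST route for listing a repository’s issues is `GET /repos/{owner}/{repo}/issues`, but the org/repo naming scheme and the `opinion_only` field below are invented purely for illustration:

```python
# Hypothetical glue between arXiv IDs and per-paper GitHub issue trackers.
# Only the GitHub API route is real; the "paper-issues" org is made up.

def issues_endpoint(arxiv_id: str, org: str = "paper-issues") -> str:
    # GitHub's actual route is GET /repos/{owner}/{repo}/issues;
    # the repo naming convention here is an invented example.
    repo = "arxiv-" + arxiv_id.replace("/", "-").replace(".", "-")
    return f"https://api.github.com/repos/{org}/{repo}/issues?state=open"

def actionable(issues: list[dict]) -> list[dict]:
    # Sank's distinction: an *issue* asks for a definite fix or improvement,
    # unlike a comment, so filter out records flagged as mere opinion.
    return [i for i in issues if not i.get("opinion_only", False)]

sample = [
    {"title": "Eq. (7) sign error", "opinion_only": False},
    {"title": "Nice paper!", "opinion_only": True},
]

print(issues_endpoint("1606.03103"))
print([i["title"] for i in actionable(sample)])
```

The point of the sketch is only that the mapping from paper to tracker is trivial plumbing; the hard part, as noted above, is the social question of what authors are willing to make public.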

[continue reading]

Abstracts for October 2016

  • One of von Neumann's motivations for developing the theory of operator algebras and his and Murray's 1936 classification of factors was the question of possible decompositions of quantum systems into independent parts. For quantum systems with a finite number of degrees of freedom the simplest possibility, i.e. factors of type I in the terminology of Murray and von Neumann, are perfectly adequate. In relativistic quantum field theory (RQFT), on the other hand, factors of type III occur naturally. The same holds true in quantum statistical mechanics of infinite systems. In this brief review some physical consequences of the type III property of the von Neumann algebras corresponding to localized observables in RQFT and their difference from the type I case will be discussed. The cumulative effort of many people over more than 30 years has established a remarkable uniqueness result: The local algebras in RQFT are generically isomorphic to the unique hyperfinite type III₁ factor in Connes' classification of 1973. Specific theories are characterized by the net structure of the collection of these isomorphic algebras for different space-time regions, i.e. the way they are embedded into each other.
    [arXiv:math-ph/0411058]

    One of the key subtleties in trying to study quantum information in a field theory is that you can’t formally decompose the Hilbert space into a tensor product of spatially local subsystems. The reasons are technical, and rarely explained well. This paper is an exception, giving an excellent introduction to the key ideas, in a manner accessible to a quantum (non-field) information theorist. (See related work by Yngvason, this blogpost by Tobias Osborne, and my previous discussion of the Reeh-Schlieder theorem.)

[continue reading]

Links for October 2016

I will start writing actual blog posts again soon, I promise. But until then, more nerdy space stuff…

  • ExoMars is approaching the Red Planet. The lander enters the atmosphere tomorrow.
  • The United States operated continuous airborne alert — the maintenance of multiple nuclear-armed bomber aircraft continuously in flight, to prevent a sneak attack from neutralizing the bomber force — only during the ’60s, because the accident rate was too high. However, Operation Looking Glass kept at least one emergency command platform in the air around the clock for almost 30 years.

    At DEFCON 2 or higher, the Looking Glass pilot and co-pilot were both required to wear an eye patch, retrieved from their Emergency War Order (EWO) kit. In the event of a surprise blinding flash from a nuclear detonation, the eye patch would prevent blindness in the covered eye, thus enabling them to see in at least one eye and continue flying. Later, the eye patch was replaced by goggles that would instantaneously turn opaque when exposed to a nuclear flash, then rapidly clear for normal vision.

    They also continuously maintained airplanes flying over the ocean, dangling antennas into the water, to ensure constant communication with submarines. This stopped in 1991.

  • Very relatedly, former Secretary of Defense William Perry is teaching a MOOC about the continuing modern risk of nuclear weapons.
  • A history of Project Orion. Abstract:

    The race to the Moon dominated manned space flight during the 1960s, and culminated in Project Apollo, which placed 12 humans on the Moon. Unbeknownst to the public at that time, several U.S. government agencies sponsored a project that could have conceivably placed 150 people on the Moon, and eventually sent crewed expeditions to Mars and the outer planets.

[continue reading]

Executive branch reasonable on AI

President Obama was directly asked in a Wired interview about the dangers Bostrom raises regarding AI. From the transcript:

DADICH: I want to center our conversation on artificial intelligence, which has gone from science fiction to a reality that’s changing our lives. When was the moment you knew that the age of real AI was upon us?

OBAMA: My general observation is that it has been seeping into our lives in all sorts of ways, and we just don’t notice; and part of the reason is because the way we think about AI is colored by popular culture. There’s a distinction, which is probably familiar to a lot of your readers, between generalized AI and specialized AI. In science fiction, what you hear about is generalized AI, right? Computers start getting smarter than we are and eventually conclude that we’re not all that useful, and then either they’re drugging us to keep us fat and happy or we’re in the Matrix. My impression, based on talking to my top science advisers, is that we’re still a reasonably long way away from that. It’s worth thinking about because it stretches our imaginations and gets us thinking about the issues of choice and free will that actually do have some significant applications for specialized AI, which is about using algorithms and computers to figure out increasingly complex tasks. We’ve been seeing specialized AI in every aspect of our lives, from medicine and transportation to how electricity is distributed, and it promises to create a vastly more productive and efficient economy. If properly harnessed, it can generate enormous prosperity and opportunity. But it also has some downsides that we’re gonna have to figure out in terms of not eliminating jobs.

[continue reading]

Links for August-September 2016

[continue reading]

Abstracts for July-August 2016

  • Local dark matter searches with LISA
    Massimo Cerdonio, Roberto De Pietri, Philippe Jetzer and Mauro Sereno
    The drag-free satellites of LISA will maintain the test masses in geodesic motion over many years with residual accelerations at unprecedented small levels and time delay interferometry (TDI) will keep track of their differential positions at a level of picometers. This may allow investigations of fine details of the gravitational field in the solar system previously inaccessible. In this spirit, we present the concept of a method for measuring directly the gravitational effect of the density of diffuse local dark matter (LDM) with a constellation of a few drag-free satellites, by exploiting how peculiarly it would affect their relative motion. Using as a test-bed an idealized LISA with rigid arms, we find that the separation in time between the test masses is uniquely perturbed by the LDM, so that they acquire a differential breathing mode. Such an LDM signal is related to the LDM density within the orbits and has characteristic spectral components, with amplitudes increasing in time, at various frequencies of the dynamics of the constellation. This is the relevant result in that the LDM signal is brought to non-zero frequencies.
  • We review some recent developments in the statistical mechanics of isolated quantum systems. We provide a brief introduction to quantum thermalization, paying particular attention to the eigenstate thermalization hypothesis (ETH) and the resulting single-eigenstate statistical mechanics. We then focus on a class of systems that fail to quantum thermalize and whose eigenstates violate the ETH: These are the many-body Anderson-localized systems; their long-time properties are not captured by the conventional ensembles of quantum statistical mechanics.
[continue reading]

Three arguments on the measurement problem

When talking to folks about the quantum measurement problem, and its potential partial resolution by solving the set selection problem, I’ve recently been deploying three nonstandard arguments. To a large extent, these are dialectic strategies rather than unique arguments per se. That is, they are notable for me mostly because they avoid getting bogged down in some common conceptual dispute, not necessarily because they demonstrate something that doesn’t formally follow from traditional arguments. At least two of these seem new to me, in the sense that I don’t remember anyone else using them, but I strongly suspect that I’ve just appropriated them from elsewhere and forgotten. Citations to prior art are highly appreciated.

Passive quantum mechanics

There are good reasons to believe that, at the most abstract level, the practice of science doesn’t require a notion of active experiment. Rather, a completely passive observer could still in principle derive all fundamental physical theories simply by sitting around and watching. Science, at this level, is about explaining as many observations as possible starting from assumptions as minimal as possible. Abstractly, we frame science as a compression algorithm that searches for the program with the smallest Kolmogorov complexity that reproduces the observed data.
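The compressor’s-eye view of science can be illustrated with a toy Python sketch, using compressed length as a crude, computable stand-in for Kolmogorov complexity (the true quantity is uncomputable); the data strings here are of course invented:

```python
import os
import zlib

def description_length(data: bytes) -> int:
    # Length of a zlib-compressed encoding: a rough upper bound on the
    # Kolmogorov complexity of the data (the exact quantity is uncomputable).
    return len(zlib.compress(data, 9))

# A passive observer's log of a lawful, regular phenomenon compresses far
# better than noise, so the "science as compression" picture assigns it a
# much shorter (simpler) theory.
regular = b"planet rises in the east\n" * 200
noisy = os.urandom(len(regular))

assert description_length(regular) < description_length(noisy)
```

The inequality is the whole point: lawful observations admit a short program, while incompressible ones do not.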

Active experiments are of course useful for at least two important reasons: (1) they gather strong evidence for causality by feeding a source of randomness into a system to test a causal model, and (2) they produce sources of data that are directly correlated with systems of interest, rather than relying on highly indirect (and perhaps computationally intractable) correlations. But ultimately these are practical considerations, and an inert but extraordinarily intelligent observer could in principle derive general relativity, quantum mechanics, and field theory. (Of course, there may be RG reasons to think that scales decouple, and that to a good approximation the large-scale dynamics are compatible with lots of possible small-scale dynamics.) … [continue reading]

Links for July 2016

  • Strokes to the language-processing parts of the brain often manifest as expressive aphasia or fluent aphasia. Both are very grave disabilities, but can be fascinating. The latter looks like this:
  • Good HN discussion surrounding a Nature article on how bicycles are steered.
  • The results from the arXiv survey are in. Nature characterizes them as very conservative, but I was shocked to find that ~58% of respondents thought “Allow readers to comment on papers” was very important, important, or somewhat important. From Andrej Karpathy:

    I developed and maintain Arxiv Sanity Preserver (http://www.arxiv-sanity.com/), one of the Arxiv overlays the article mentions. I built it to try to address some of the pains that the “raw” arXiv introduces, such as being flooded by paper submissions without any support or tools for sifting through them.

    I’m torn on how Arxiv should proceed in becoming more complex. I support what seems to be the cited poll consensus (“The message was more or less ‘stay focused on the basic dissemination task, and don’t get distracted by getting overextended or going commercial’”) and I think the simplicity/rawness of arXiv was partly what made it succeed, but there is also a clear value proposition offered by more advanced search/filter/recommendation tools like Arxiv Sanity Preserver. It’s not clear to me to what extent arXiv should strive to develop these kinds of features internally.

    Whether they go a simple or more complex route, I really hope that they keep their API open and allow 3rd party developers such as myself to explore new ways of making the arXiv repository useful to researchers. Somewhat disappointedly, the arXiv poll they ran did not include any mention of their API, which in my opinion is critical, overlooked, and somewhat undervalued.

[continue reading]

Bleg: Classical theory of measurement and amplification

I’m in search of an authoritative reference giving a foundational/information-theoretic approach to classical measurement. What abstract physical properties are necessary and sufficient?

Motivation: The Copenhagen interpretation treats the measurement process as a fundamental primitive, and this persists in most uses of quantum mechanics outside of foundations. Of course, the modern view is that the measurement process is just another physical evolution, where the state of a macroscopic apparatus is conditioned on the state of a microscopic quantum system in some basis determined by their mutual interaction Hamiltonian. The apparent nonunitary aspects of the evolution inferred by the observer arise because the measured system is coupled to the observer himself; the global evolution of the system-apparatus-observer complex is formally modeled as unitary (although the philosophical meaningfulness/ontology/reality of the components of the wavefunction corresponding to different measurement outcomes is disputed).
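To make the conditioning explicit, here is the standard textbook von Neumann pre-measurement schematic (not tied to any particular apparatus):

```latex
% The apparatus pointer |A_i> becomes conditioned on the system state |s_i>,
% while the global evolution remains unitary:
U\,\lvert s_i \rangle \lvert A_0 \rangle
  = \lvert s_i \rangle \lvert A_i \rangle ,
\qquad
U \Big( \sum_i c_i \lvert s_i \rangle \Big) \lvert A_0 \rangle
  = \sum_i c_i\, \lvert s_i \rangle \lvert A_i \rangle .
```

Nothing nonunitary happens globally; the apparent collapse is the observer’s conditional state relative to a single pointer outcome.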

Eventually, we’d like to be able to identify all laboratory measurements as just an anthropocentric subset of wavefunction branching events. I am very interested in finding a mathematically precise criterion for branching. (Note that the branches themselves may be precisely defined only in some large-N or thermodynamic limit.) Ideally, I would like to find a property that everyone agrees must apply, at the least, to laboratory measurement processes, and (with as little change as possible) use this to find all branches — not just the ones that result from laboratory measurements. (Right now I find the structure of spatially redundant information in the many-body wavefunction to be a very promising approach.)

It seems sensible to begin with what is necessary for a classical measurement since these ought to be analyzable without all the philosophical baggage that plagues discussion of quantum measurement.… [continue reading]

Comments on an essay by Wigner

[PSA: Happy 4th of July. Juno arrives at Jupiter tonight!]

This is short and worth reading:

The sharp distinction between Initial Conditions and Laws of Nature was initiated by Isaac Newton and I consider this to be one of his most important, if not the most important, accomplishment. Before Newton there was no sharp separation between the two concepts. Kepler, to whom we owe the three precise laws of planetary motion, tried to explain also the size of the planetary orbits, and their periods. After Newton's time the sharp separation of initial conditions and laws of nature was taken for granted and rarely even mentioned. Of course, the first ones are quite arbitrary and their properties are hardly parts of physics while the recognition of the latter ones are the prime purpose of our science. Whether the sharp separation of the two will stay with us permanently is, of course, as uncertain as is all future development but this question will be further discussed later. Perhaps it should be mentioned here that the permanency of the validity of our deterministic laws of nature became questionable as a result of the realization, due initially to D. Zeh, that the states of macroscopic bodies are always under the influence of their environment; in our world they can not be kept separated from it.

This essay has no formal abstract; the above is the second paragraph, which I find to be profound. Here is the PDF. The essay shares the same name and much of its material with Wigner’s 1963 Nobel lecture [PDF]. (The Nobel lecture has a nice bit contrasting invariance principles with covariance principles, and dynamical invariance principles with geometrical invariance principles.) … [continue reading]

Links for June 2016

  • Another transmissible cancer found, this time in mollusks.
  • The “modern” pentathlon is bizarre:

    The modern pentathlon is an Olympic sport that comprises five very different events: fencing, 200 m freestyle swimming, show jumping, and a final combined event of pistol shooting and a 3200 m cross-country run. The sport has been a core sport of the Olympic Games since 1912, despite dispute…

    The addition of modern to the name distinguished it from the original pentathlon of the ancient Olympic Games, which consisted of the stadion foot race, wrestling, long jump, javelin, and discus. As the events of the ancient pentathlon were modeled after the skills of the ideal soldier of that time, Coubertin created the contest to simulate the experience of a 19th-century cavalry soldier behind enemy lines: he must ride an unfamiliar horse, fight enemies with pistol and sword, swim, and run to return to his own soldiers.

  • Sketches of the flying car design being funded by Larry Page. (H/t Scott Alexander.)
  • Why keep making new car commercials when you can just make one with a dummy car and digitally add in the car after the fact?
  • Everyone should know Moore’s here-is-one-hand argument:

    In his 1925 essay A Defence of Common Sense, Moore argues against idealism and skepticism toward the external world on the grounds that skeptics could not give reasons to accept their metaphysical premises that were more plausible to him than the reasons he had to accept the common sense claims about our knowledge of the world that skeptics and idealists must deny. In other words, he is more willing to believe that he has a hand than to believe the premises of what he deems “a strange argument in a university classroom.” “I do not think it is rational to be as certain of any one of these … propositions”….

[continue reading]

Abstracts for May-June 2016

Lots of matter interference experiments this time, because they are awesome.

  • Quantum Interference of a Microsphere
    H. Pino, J. Prat-Camps, K. Sinha, B. P. Venkatesh, and O. Romero-Isart
    We propose and analyze an all-magnetic scheme to perform a Young’s double slit experiment with a micron-sized superconducting sphere of mass 10^13 amu. We show that its center of mass could be prepared in a spatial quantum superposition state with an extent of the order of half a micrometer. The scheme is based on magnetically levitating the sphere above a superconducting chip and letting it skate through a static magnetic potential landscape where it interacts for short intervals with quantum circuits. In this way a protocol for fast quantum interferometry is passively implemented. Such a table-top earth-based quantum experiment would operate in a parameter regime where gravitational energy scales become relevant. In particular we show that the faint parameter-free gravitationally-induced decoherence collapse model, proposed by Diósi and Penrose, could be unambiguously falsified.


    An extremely exciting and ambitious proposal. I have no ability to assess the technical feasibility, and my prior is that this is too hard, but the authors are solid. Their formalism and thinking are very clean, and hence quite abstracted away from the nitty-gritty of the experiment.

  • Do the laws of quantum physics still hold for macroscopic objects -- this is at the heart of Schrodinger's cat paradox -- or do gravitation or yet unknown effects set a limit for massive particles? What is the fundamental relation between quantum physics and gravity? Ground-based experiments addressing these questions may soon face limitations due to limited free-fall times and the quality of vacuum and microgravity.
[continue reading]

Comments on Hanson’s The Age of Em

One of the main sources of hubris among physicists is that we think we can communicate essential ideas faster and more exactly than many others. (This isn’t just a choice of compact terminology or the ability to recall shared knowledge. It also has to do with a responsive throttling of the level of detail to match the listener’s ability to follow, and quick questions which allow the listener to hone in on things they don’t understand.) This leads to a sense of frustration when talking to others who use different methods. Of course this sensation isn’t overwhelming evidence that our methods actually are better and function as described above, just that they are different. But come on. Robin Hanson‘s Age of Em is an incredible written example of efficient transfer of (admittedly speculative) insights. I highly recommend it.

In places where I am trained to expect writers to insert fluff and repeat themselves — without actually clarifying — Hanson states his case concisely once, then plows through to new topics. There are several times where I think he leaps without sufficient justifications (at least given my level of background knowledge), but there is a stunning lack of fluff. The ideas are jammed in edgewise.



Academic papers usually have two reasons that they must be read slowly: explicit unpacking of complex subjects, and convoluted language. Hanson’s book is a great example of something that must be read slowly because of the former with no hint of the latter. Although he freely calls on economics concepts that non-economists might have to look up, his language is always incredibly direct and clear. Hanson is an academic Hemingway.

Most of what I might have said on the book’s substance was very quickly eclipsed by other reviews, so you should just read Bryan Caplan, Richard Jones, or Scott Alexander, along with some replies by Hanson.… [continue reading]

My talk on ideal quantum Brownian motion

I have blogged before about the conceptual importance of ideal, symplectic covariant quantum Brownian motion (QBM). In short: QBM is to open quantum systems as the harmonic oscillator is to closed quantum systems. Like the harmonic oscillator, (a) QBM is universal because it’s the leading-order behavior of a Taylor series expansion; (b) QBM evolution has a very intuitive interpretation in terms of wavepackets evolving under classical flow; and (c) QBM is exactly solvable.

If that sounds like a diatribe up your alley, then you are in luck. I recently ranted about it here at PI. It’s just a summary of the literature; there are no new results. As always, I recommend downloading the raw video file so you can run it at arbitrary speed.


Abstract: In the study of closed quantum systems, the simple harmonic oscillator is ubiquitous because all smooth potentials look quadratic locally, and exhaustively understanding it is very valuable because it is exactly solvable. Although not widely appreciated, Markovian quantum Brownian motion (QBM) plays almost exactly the same role in the study of open quantum systems. QBM is ubiquitous because it arises from only the Markov assumption and linear Lindblad operators, and it likewise has an elegant and transparent exact solution. QBM is often introduced with specific non-Markovian models like Caldeira-Leggett, but this makes it very difficult to see which phenomena are universal and which are idiosyncratic to the model. Like frictionless classical mechanics or nonrenormalizable field theories, the exact Markov property is aphysical, but handling this subtlety is a small price to pay for the extreme generality.
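For reference, the setting the abstract describes is the standard Lindblad master equation with an (at most) quadratic Hamiltonian and Lindblad operators linear in position and momentum; this is textbook notation, not taken from the talk itself:

```latex
% Markovian QBM: Lindblad equation with linear Lindblad operators
% L_k = a_k x + b_k p and a quadratic Hamiltonian.
\dot{\rho} = -\frac{i}{\hbar} [H, \rho]
  + \sum_k \Big( L_k \rho L_k^{\dagger}
      - \tfrac{1}{2} \{ L_k^{\dagger} L_k,\, \rho \} \Big),
\qquad
H = \frac{p^2}{2m} + \frac{m \omega^2 x^2}{2},
\quad
L_k = a_k x + b_k p .
```

The linearity of the L_k in x and p is what makes the evolution exactly solvable, in close analogy with the quadratic potential of the closed-system harmonic oscillator.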
[continue reading]