Links for October 2016

I will start writing actual blog posts again soon, I promise. But until then, more nerdy space and military stuff…

  • ExoMars is approaching the Red Planet. The lander enters the atmosphere tomorrow.
  • The United States operated continuous airborne alert — keeping multiple nuclear-armed bombers continuously in flight so that a sneak attack could not neutralize the bomber force — only during the ’60s; it was discontinued because the accident rate was too high. However, Operation Looking Glass kept at least one emergency command platform in the air around the clock for almost 30 years.

    At DEFCON 2 or higher, the Looking Glass pilot and co-pilot were both required to wear an eye patch, retrieved from their Emergency War Order (EWO) kit. In the event of a surprise blinding flash from a nuclear detonation, the eye patch would prevent blindness in the covered eye, thus enabling them to see in at least one eye and continue flying. Later, the eye patch was replaced by goggles that would instantaneously turn opaque when exposed to a nuclear flash, then rapidly clear for normal vision.

    They also continuously maintained airplanes flying over the ocean, dangling antennas into the water, to ensure constant communication with submarines.

[continue reading]

Executive branch reasonable on AI

President Obama was directly asked in a Wired interview about the dangers Bostrom raises regarding AI. From the transcript:

DADICH: I want to center our conversation on artificial intelligence, which has gone from science fiction to a reality that’s changing our lives. When was the moment you knew that the age of real AI was upon us?

OBAMA: My general observation is that it has been seeping into our lives in all sorts of ways, and we just don’t notice; and part of the reason is because the way we think about AI is colored by popular culture. There’s a distinction, which is probably familiar to a lot of your readers, between generalized AI and specialized AI. In science fiction, what you hear about is generalized AI, right? Computers start getting smarter than we are and eventually conclude that we’re not all that useful, and then either they’re drugging us to keep us fat and happy or we’re in the Matrix. My impression, based on talking to my top science advisers, is that we’re still a reasonably long way away from that. It’s worth thinking about because it stretches our imaginations and gets us thinking about the issues of choice and free will that actually do have some significant applications for specialized AI, which is about using algorithms and computers to figure out increasingly complex tasks.

[continue reading]

Links for August-September 2016

[continue reading]

Abstracts for July-August 2016

  • Local dark matter searches with LISA
    Massimo Cerdonio, Roberto De Pietri, Philippe Jetzer and Mauro Sereno
    The drag-free satellites of LISA will maintain the test masses in geodesic motion over many years with residual accelerations at unprecedented small levels and time delay interferometry (TDI) will keep track of their differential positions at a level of picometers. This may allow investigations of fine details of the gravitational field in the solar system previously inaccessible. In this spirit, we present the concept of a method for measuring directly the gravitational effect of the density of diffuse local dark matter (LDM) with a constellation of a few drag-free satellites, by exploiting how peculiarly it would affect their relative motion. Using as a test-bed an idealized LISA with rigid arms, we find that the separation in time between the test masses is uniquely perturbed by the LDM, so that they acquire a differential breathing mode. Such an LDM signal is related to the LDM density within the orbits and has characteristic spectral components, with amplitudes increasing in time, at various frequencies of the dynamics of the constellation. This is the relevant result in that the LDM signal is brought to non-zero frequencies.
[continue reading]

Three arguments on the measurement problem

When talking to folks about the quantum measurement problem, and its potential partial resolution by solving the set selection problem, I’ve recently been deploying three nonstandard arguments. To a large extent, these are dialectic strategies rather than unique arguments per se. That is, they are notable for me mostly because they avoid getting bogged down in some common conceptual dispute, not necessarily because they demonstrate something that doesn’t formally follow from traditional arguments. At least two of these seem new to me, in the sense that I don’t remember anyone else using them, but I strongly suspect that I’ve just appropriated them from elsewhere and forgotten. Citations to prior art are highly appreciated.

Passive quantum mechanics

There are good reasons to believe that, at the most abstract level, the practice of science doesn’t require a notion of active experiment. Rather, a completely passive observer could still in principle derive all fundamental physical theories simply by sitting around and watching. Science, at this level, is about explaining as many observations as possible starting from assumptions that are as minimal as possible. Abstractly, we frame science as a compression algorithm that tries to find the program with the smallest Kolmogorov complexity that reproduces the observed data.… [continue reading]
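
As a toy illustration of that compression framing: Kolmogorov complexity itself is uncomputable, so a crude but computable proxy is the length of an ordinary compressed encoding, and observations generated by a simple law then compress far better than patternless ones. A minimal Python sketch (the particular law and data sizes are arbitrary choices for the demo):

    import random
    import zlib

    def description_length(data: bytes) -> int:
        # Compressed size: a crude, computable stand-in for Kolmogorov complexity.
        return len(zlib.compress(data, 9))

    # "Observations" produced by a simple law versus patternless noise.
    lawful = bytes((x * x) % 251 for x in range(10_000))
    noise = bytes(random.randrange(256) for _ in range(10_000))

    print(description_length(lawful))  # small: the regularity is compressible
    print(description_length(noise))   # near 10,000: nothing to exploit

A passive observer playing scientist is, in this caricature, just searching for the shortest such description of everything they have seen.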

Links for July 2016

  • Strokes to the language-processing parts of the brain often manifest as expressive aphasia or fluent aphasia. Both are very grave disabilities, but can be fascinating. The latter looks like this:
  • Good HN discussion surrounding a Nature article on how bicycles are steered.
  • The results from the arXiv survey are in. Nature characterizes them as very conservative, but I was shocked to find that ~58% of respondents thought “Allow readers to comment on papers” was very important, important, or somewhat important. From Andrej Karpathy:

    I developed and maintain Arxiv Sanity Preserver, one of the Arxiv overlays the article mentions. I built it to try address some of the pains that the “raw” arXiv introduces, such as being flooded by paper submissions without any support or tools for sifting through them.

    I’m torn on how Arxiv should proceed in becoming more complex. I support what seems to be the cited poll consensus (“The message was more or less ‘stay focused on the basic dissemination task, and don’t get distracted by getting overextended or going commercial’”) and I think the simplicity/rawness of arXiv was partly what made it succeed, but there is also a clear value proposition offered by more advanced search/filter/recommendation tools like Arxiv Sanity Preserver.

[continue reading]

Bleg: Classical theory of measurement and amplification

I’m in search of an authoritative reference giving a foundational/information-theoretic approach to classical measurement. What abstract physical properties are necessary and sufficient for a process to count as a measurement?

Motivation: The Copenhagen interpretation treats the measurement process as a fundamental primitive, and this persists in most uses of quantum mechanics outside of foundations. Of course, the modern view is that the measurement process is just another physical evolution, where the state of a macroscopic apparatus is conditioned on the state of a microscopic quantum system in some basis determined by their mutual interaction Hamiltonian. The apparently nonunitary aspects of the evolution inferred by the observer arise because the measured system is coupled to the observer himself; the global evolution of the system-apparatus-observer system is formally modeled as unitary (although the philosophical meaningfulness/ontology/reality of the components of the wavefunction corresponding to different measurement outcomes is disputed).
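
For a minimal sketch of what “conditioned on” means here (the standard von Neumann measurement scheme, textbook material rather than anything specific to this bleg): if the system starts in \sum_i c_i |s_i\rangle and the apparatus in a ready state |A_0\rangle, an interaction Hamiltonian that correlates the two in the |s_i\rangle basis drives the unitary evolution

    \Big( \sum_i c_i |s_i\rangle \Big) \otimes |A_0\rangle \;\longrightarrow\; \sum_i c_i \, |s_i\rangle \otimes |A_i\rangle ,

after which each apparatus record |A_i\rangle is correlated with the corresponding system state |s_i\rangle. The evolution looks nonunitary only once attention is restricted to a single observer’s branch, or once part of the global state is traced out.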

Eventually, we’d like to be able to identify all laboratory measurements as just an anthropocentric subset of wavefunction branching events. I am very interested in finding a mathematically precise criterion for branching. [Note that the branches themselves may only be precisely defined in some large-N or thermodynamic limit.] Ideally, I would like to find a property that everyone agrees must apply, at the least, to laboratory measurement processes, and (with as little change as possible) use this to find all branches — not just ones that result from laboratory measurements.… [continue reading]

Comments on an essay by Wigner

[PSA: Happy 4th of July. Juno arrives at Jupiter tonight!]

This is short and worth reading:

The sharp distinction between Initial Conditions and Laws of Nature was initiated by Isaac Newton and I consider this to be one of his most important, if not the most important, accomplishment. Before Newton there was no sharp separation between the two concepts. Kepler, to whom we owe the three precise laws of planetary motion, tried to explain also the size of the planetary orbits, and their periods. After Newton's time the sharp separation of initial conditions and laws of nature was taken for granted and rarely even mentioned. Of course, the first ones are quite arbitrary and their properties are hardly parts of physics while the recognition of the latter ones are the prime purpose of our science. Whether the sharp separation of the two will stay with us permanently is, of course, as uncertain as is all future development but this question will be further discussed later. Perhaps it should be mentioned here that the permanency of the validity of our deterministic laws of nature became questionable as a result of the realization, due initially to D.
[continue reading]

Links for June 2016

  • Another transmissible cancer found, this time in mollusks.
  • The “modern” pentathlon is bizarre:

    The modern pentathlon is an Olympic sport that comprises five very different events: fencing, 200 m freestyle swimming, show jumping, and a final combined event of pistol shooting and a 3200 m cross-country run. The sport has been a core sport of the Olympic Games since 1912 despite dispute…

    The addition of modern to the name distinguished it from the original pentathlon of the ancient Olympic Games, which consisted of the stadion foot race, wrestling, long jump, javelin, and discus. As the events of the ancient pentathlon were modeled after the skills of the ideal soldier of that time, Coubertin created the contest to simulate the experience of a 19th-century cavalry soldier behind enemy lines: he must ride an unfamiliar horse, fight enemies with pistol and sword, swim, and run to return to his own soldiers.

  • Sketches of the flying car design being funded by Larry Page. (H/t Scott Alexander.)
  • Why keep making new car commercials when you can just make one with a dummy car and digitally add in the car after the fact?
  • Everyone should know Moore’s here-is-one-hand argument:

    In his 1925 essay A Defence of Common Sense, Moore argues against idealism and skepticism toward the external world on the grounds that skeptics could not give reasons to accept their metaphysical premises that were more plausible to him than the reasons he had to accept the common sense claims about our knowledge of the world that skeptics and idealists must deny.

[continue reading]

Abstracts for May-June 2016

Lots of matter interference experiments this time, because they are awesome.

  • Quantum Interference of a Microsphere
    H. Pino, J. Prat-Camps, K. Sinha, B. P. Venkatesh, and O. Romero-Isart
    We propose and analyze an all-magnetic scheme to perform a Young’s double slit experiment with a micron-sized superconducting sphere of mass 10^{13} amu. We show that its center of mass could be prepared in a spatial quantum superposition state with an extent of the order of half a micrometer. The scheme is based on magnetically levitating the sphere above a superconducting chip and letting it skate through a static magnetic potential landscape where it interacts for short intervals with quantum circuits. In this way a protocol for fast quantum interferometry is passively implemented. Such a table-top earth-based quantum experiment would operate in a parameter regime where gravitational energy scales become relevant. In particular we show that the faint parameter-free gravitationally-induced decoherence collapse model, proposed by Diósi and Penrose, could be unambiguously falsified.

    An extremely exciting and ambitious proposal. I have no ability to assess the technical feasibility, and my prior is that this is too hard, but the authors are solid. Their formalism and thinking are very clean, and hence quite abstracted away from the nitty-gritty of the experiment.

[continue reading]

Comments on Hanson’s The Age of Em

One of the main sources of hubris among physicists is that we think we can communicate essential ideas faster and more exactly than many others. [This isn’t just a choice of compact terminology or an ability to recall shared knowledge. It also has to do with a responsive throttling of the level of detail to match the listener’s ability to follow, and quick questions which allow the listener to home in on things they don’t understand. This leads to a sense of frustration when talking to others who use different methods. Of course this sensation isn’t overwhelming evidence that our methods actually are better and function as described above, just that they are different. But come on.] Robin Hanson’s Age of Em is an incredible written example of the efficient transfer of (admittedly speculative) insights. I highly recommend it.

In places where I am trained to expect writers to insert fluff and repeat themselves — without actually clarifying — Hanson states his case concisely once, then plows through to new topics. There are several places where I think he leaps without sufficient justification (at least given my level of background knowledge), but there is a stunning lack of fluff. The ideas are jammed in edgewise.… [continue reading]

My talk on ideal quantum Brownian motion

I have blogged before about the conceptual importance of ideal, symplectic covariant quantum Brownian motion (QBM). In short: QBM is to open quantum systems as the harmonic oscillator is to closed quantum systems. Like the harmonic oscillator, (a) QBM is universal because it’s the leading-order behavior of a Taylor series expansion; (b) QBM evolution has a very intuitive interpretation in terms of wavepackets evolving under classical flow; and (c) QBM is exactly solvable.
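
For concreteness, one common way of writing a Markovian QBM master equation — the high-temperature Caldeira–Leggett form, quoted here only as an illustration and not necessarily the exact parameterization used in the talk — for a particle of mass m with damping rate \gamma coupled to a bath at temperature T is

    \partial_t \rho = -\frac{i}{\hbar}[H,\rho] - \frac{i\gamma}{\hbar}\,[x,\{p,\rho\}] - \frac{2 m \gamma k_B T}{\hbar^2}\,[x,[x,\rho]] ,

i.e., an equation whose generator is built from x and p at most quadratically, which is what makes it the open-system analog of the quadratic Hamiltonian of the harmonic oscillator.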

If that sounds like a diatribe up your alley, then you are in luck. I recently ranted about it here at PI. It’s just a summary of the literature; there are no new results. As always, I recommend downloading the raw video file so you can run it at arbitrary speed.

Abstract: In the study of closed quantum systems, the simple harmonic oscillator is ubiquitous because all smooth potentials look quadratic locally, and exhaustively understanding it is very valuable because it is exactly solvable. Although not widely appreciated, Markovian quantum Brownian motion (QBM) plays almost exactly the same role in the study of open quantum systems. QBM is ubiquitous because it arises from only the Markov assumption and linear Lindblad operators, and it likewise has an elegant and transparent exact solution.
[continue reading]

Bullshit in science

Francisco Azuaje (emphasis mine):

According to American philosopher Harry Frankfurt [here’s Frankfurt’s popular essay (PDF)], a key difference between liars and bullshitters is that the former tend to accept that they are not telling the truth, while the latter simply do not care whether something is true or not.

Bullshitters strive to maximize personal gain through a continuing distortion of reality. If something is true and can be manipulated to achieve their selfish objectives, then good. If something is not true, who cares? All the same. These attributes make bullshitting worse than lying.

Furthermore, according to Frankfurt, it is the bullshitter’s capacity to get away with bullshitting so easily that makes them particularly dangerous. Individuals in prominent positions of authority may be punished for lying, especially if lying has serious damaging consequences. Professional and casual bullshitters at all levels of influence typically operate with freedom. Regardless of their roles in society, their exposure is not necessarily accompanied by negative legal or intellectual consequences, at least for the bullshitter…

Researchers may also be guilty of bullshitting by omission. This is the case when they do not openly challenge bullshitting positions, either in the public or academic settings. Scientists frequently wrongly assume that the public always has knowledge of well-established scientific facts.

[continue reading]

Comments on Rosaler’s “Reduction as an A Posteriori Relation”

In a previous post of abstracts, I mentioned philosopher Josh Rosaler’s attempt to clarify the distinction between empirical and formal notions of “theoretical reduction”. Reduction is just the idea that one theory reduces to another in some limit, like special relativity reduces to Galilean kinematics in the limit of small velocities. [Confusingly, philosophers use a reversed convention; they say that Galilean mechanics reduces to special relativity.] Formal reduction is when this takes the form of some mathematical limiting procedure (e.g., v/c \to 0), whereas empirical reduction is an explanatory statement about observations (e.g., “special relativity explains the empirical usefulness of Galilean kinematics”).
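
The standard textbook example of the formal side (nothing specific to Rosaler’s paper): a Lorentz boost

    x' = \gamma\,(x - v t), \qquad t' = \gamma\left(t - \frac{v x}{c^2}\right), \qquad \gamma = \left(1 - v^2/c^2\right)^{-1/2},

goes over to the Galilean transformation x' = x - v t, t' = t as v/c \to 0 with x, t, and v held fixed. The empirical claim — that this limit accounts for why Galilean kinematics worked so well in its historical domain of validity — is the separate, a posteriori statement.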

Rosaler’s criticism, which I mostly agree with, is that folks often conflate these two. Usually this isn’t a serious problem, since the holes can be patched up on the fly by a competent physicist, but sometimes it leads to serious trouble. The most egregious case, and the one that got me interested in all this, is the quantum-classical transition, and in particular the serious insufficiency of existing \hbar \to 0 limits to explain the appearance of macroscopic classicality. Even though this limiting procedure recovers the classical equations of motion, it fails spectacularly to recover the state space.… [continue reading]

Links for May 2016

  • The Peacock Spider (Maratus speciosus):

    If you haven’t already seen the BBC Earth bit on the birds of paradise, check it out.
  • If you use Zotero and iOS, then check out PaperShip. I have two or three minor complaints, but on the whole it is very high quality.
  • The New Mexico whiptail is like a mule in that it’s a hybrid of two species, but unlike the mule it can reproduce by semi-cloning:

    The New Mexico whiptail (Cnemidophorus neomexicanus) is a female species of lizard found in the southern United States in New Mexico and Arizona, and in northern Mexico in Chihuahua. It is the official state reptile of New Mexico. It is one of many lizard species known to be parthenogenic. Individuals of the species can be created either through the hybridization of the little striped whiptail (C. inornatus) and the western whiptail (C. tigris), or through the parthenogenic reproduction of an adult New Mexico whiptail.

    The hybridization of these species prevents healthy males from forming whereas males do exist in both parent species (see Sexual differentiation). Parthenogenesis allows the resulting all-female population to reproduce and thus evolve into a unique species capable of reproduction. This combination of interspecific hybridization and parthenogenesis exists as a reproductive strategy in several species of whiptail lizard within the Cnemidophorus genus to which the New Mexico whiptail belongs.

[continue reading]