Research debt

Chris Olah coins the term “research debt” to discuss a bundle of related destructive phenomena in research communities:

  • Poor Exposition – Often, there is no good explanation of important ideas and one has to struggle to understand them. This problem is so pervasive that we take it for granted and don’t appreciate how much better things could be.
  • Undigested Ideas – Most ideas start off rough and hard to understand. They become radically easier as we polish them, developing the right analogies, language, and ways of thinking.
  • Bad Abstractions and Notation – Abstractions and notation are the user interface of research, shaping how we think and communicate. Unfortunately, we often get stuck with the first formalisms to develop even when they’re bad. For example, an object with extra electrons is negative, and pi is wrong.
  • Noise – Being a researcher is like standing in the middle of a construction site. Countless papers scream for your attention and there’s no easy way to filter or summarize them. We think noise is the main way experts experience research debt.

Shout it from the rooftops (my emphasis):

It’s worth being clear that research debt isn’t just about ideas not being explained well. It’s a lack of digesting ideas – or, at least, a lack of the public version of ideas being digested. It’s a communal messiness of thought.

Developing good abstractions, notations, visualizations, and so forth, is improving the user interfaces for ideas. This helps both with understanding ideas for the first time and with thinking clearly about them. Conversely, if we can’t explain an idea well, that’s often a sign that we don’t understand it as well as we could…

Distillation is also hard.

[continue reading]

Abstracts for March 2017

  • The technique of using “laser grating”, in place of physical grating (slits), for producing spatial interference of molecules relies on the laser’s ability to ionize the molecule. (Once ionized, standing electric fields can sweep it out of the way.) But for some molecules, especially large nanoparticles, this is ineffective. Solution: attach a molecular tag to the nanoparticle that reliably cleaves in the presence of a laser, allowing the nanoparticle to be vacuumed up. Rad.

  • Berry points out that the \hbar \to 0 limit of quantum mechanics is singular, implying that things like Ehrenfest’s theorem and the canceling of the path integral are not adequate to describe the quantum-classical transition. A similar situation can be found with critical points in statistical mechanics, where the N \to \infty limit similarly becomes ill-defined. If you think that the huge intellectual investment in understanding critical points is justified by their fundamental significance (regardless of practical applications), I claim you should think similarly about the quantum-classical limit.

    Even in what philosophers might regard as the simplest reductions, between different areas within physics, the detailed working-out of how one theory can contain another has been achieved in only a few cases and involves sophisticated ideas on the forefront of physics and mathematics today…. It should be clear from the foregoing that a subtle and sophisticated understanding of the relation between theories within physics requires real mathematics, and not only verbal, conceptual and logical analysis as currently employed by philosophers.

    For introductions, see these popular and non-technical treatments.

  • I have no intelligent comments about this, and have no idea if the paper is interesting. It’s just a crazy long coherence time.

  • (Note that the ACM version is a much shorter “abstract”, missing most of the content.)

  • (H/t Sean Carroll.) Jarzynski’s equality and the Crooks fluctuation theorem are recent and important strengthenings of the second law of thermodynamics.

[continue reading]

Links for February 2017

  • If you are a high school student who would be interested in the SPARC summer camp, or know one, the deadline is March 1.

    SPARC helps talented high school students apply their quantitative thinking skills to their lives and the world.

    SPARC will be hosted in the San Francisco Bay Area from August 6 – 17, with students arriving the evening of the 6th and leaving the morning of the 17th. Room and board are provided free of charge.

    The curriculum covers topics from causal modeling and probability to game theory and cognitive science. But the focus of SPARC is on applying the same quantitative and rigorous spirit outside of the classroom. How can we understand our own reasoning and behavior? How can we think more clearly and better achieve our goals?

  • Indian Space Research Organisation’s Polar Satellite Launch Vehicle successfully launched 104 satellites into orbit on the same mission. Onboard video of the deployment:

    Pictures of some of the cubesats, including Planet‘s 88 imaging satellites for continuous Earth monitoring.
  • What is a ‘Shavers Only’ Electrical Outlet?
  • A possible rare shake-up of the GiveWell list: temporary subsidies for migrant workers in India.
  • How to think about cell walls:

    In most cells, the cell wall is flexible, meaning that it will bend rather than holding a fixed shape, but has considerable tensile strength. The apparent rigidity of primary plant tissues is enabled by cell walls, but is not due to the walls’ stiffness. Hydraulic turgor pressure creates this rigidity, along with the wall structure. The flexibility of the cell walls is seen when plants wilt, so that the stems and leaves begin to droop, or in seaweeds that bend in water currents.

[continue reading]

Links for January 2017

[continue reading]

Weinberg on the measurement problem

In his new article in the NY Review of Books, the titan Steven Weinberg expresses more sympathy for the importance of the measurement problem in quantum mechanics. The article has nothing new for folks well-versed in quantum foundations, but Weinberg demonstrates a command of the existing arguments and considerations. The lengthy excerpts below characterize what I think are the most important aspects of his view.

Many physicists came to think that the reaction of Einstein and Feynman and others to the unfamiliar aspects of quantum mechanics had been overblown. This used to be my view. After all, Newton’s theories too had been unpalatable to many of his contemporaries…Evidently it is a mistake to demand too strictly that new physical theories should fit some preconceived philosophical standard.

In quantum mechanics the state of a system is not described by giving the position and velocity of every particle and the values and rates of change of various fields, as in classical physics. Instead, the state of any system at any moment is described by a wave function, essentially a list of numbers, one number for every possible configuration of the system….What is so terrible about that? Certainly, it was a tragic mistake for Einstein and Schrödinger to step away from using quantum mechanics, isolating themselves in their later lives from the exciting progress made by others. Even so, I’m not as sure as I once was about the future of quantum mechanics. It is a bad sign that those physicists today who are most comfortable with quantum mechanics do not agree with one another about what it all means. The dispute arises chiefly regarding the nature of measurement in quantum mechanics…

The introduction of probability into the principles of physics was disturbing to past physicists, but the trouble with quantum mechanics is not that it involves probabilities.

[continue reading]

Singular value decomposition in bra-ket notation

In linear algebra, and therefore quantum information, the singular value decomposition (SVD) is elementary, ubiquitous, and beautiful. However, I only recently realized that its expression in bra-ket notation is very elegant. The SVD is equivalent to the statement that any operator \hat{M} can be expressed as

(1)   \begin{align*} \hat{M} = \sum_i \vert A_i \rangle \lambda_i \langle B_i \vert \end{align*}

where \vert A_i \rangle and \vert B_i \rangle are orthonormal sets of vectors, possibly in Hilbert spaces with different dimensionality, and the \lambda_i \ge 0 are the singular values.
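As a quick numerical sanity check (my own sketch using NumPy, not anything from the post), one can build a random rectangular matrix, compute its SVD, and reconstruct it as \sum_i \vert A_i \rangle \lambda_i \langle B_i \vert:

```python
import numpy as np

# A random complex "operator" mapping a 3-dim space into a 4-dim space,
# illustrating that the |A_i> and |B_i> can live in spaces of different dimension.
rng = np.random.default_rng(0)
M = rng.normal(size=(4, 3)) + 1j * rng.normal(size=(4, 3))

# Columns of U are the kets |A_i>, rows of Vh are the bras <B_i|,
# and s holds the singular values lambda_i >= 0.
U, s, Vh = np.linalg.svd(M, full_matrices=False)

# Rebuild M as sum_i |A_i> lambda_i <B_i| via rank-one outer products.
M_rebuilt = sum(s[i] * np.outer(U[:, i], Vh[i, :]) for i in range(len(s)))

assert np.allclose(M, M_rebuilt)          # decomposition reproduces M
assert np.all(s >= 0)                     # singular values are nonnegative
assert np.allclose(U.conj().T @ U, np.eye(3))   # <A_i|A_j> = delta_ij
assert np.allclose(Vh @ Vh.conj().T, np.eye(3)) # <B_i|B_j> = delta_ij
```

Note that NumPy’s `Vh` already contains the conjugated bras, so no extra conjugation is needed in the outer products.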

That’s it.… [continue reading]

Links for December 2016

Late, alas. Also: there have been a couple of complaints about the spam filter for comments on this blog, and I’m trying to track down the issue. The filter is supposed to tell you what’s wrong and help you successfully post the comment. If you’ve been unable to get past the filter, or if it’s just too much of a hassle even when you can get past it, please let me know so I can try to fix this.

  • Europe’s Galileo satellite navigation system recently went online, although without a complete constellation yet. In just a few years, there will be four fully independent satellite navigation systems from great powers: the EU (Galileo), the US (GPS), Russia (GLONASS), and China (BeiDou). Devices are already being built to use all four systems at once. Everyone wins through the increased redundancy and satellite count.
  • Design of the Solo cup.
  • I highly recommend this semi-technical talk on ARC fusion reactor design by Dennis Whyte.

    (Video DownloadHelper allows downloading video off YouTube.)

    Proposed in 2014 by Whyte and collaborators, ARC is a newer, still-under-development alternative to traditional tokamak-style reactors, in which rare-earth barium copper oxide (ReBCO) superconductors play a crucial role. Whyte argues that the key hold-up on fusion reactors is their absolute size, which necessitates large-scale, lumbering international collaboration. ReBCO superconductors are the key technical advance allowing smaller magnetic confinement, and the parameters of these designs scale extremely well with increased magnetic field. Significant downsides include increased vessel pressure and pulsed operation because of intrinsic limitations on neutron shielding. The fusion fuel is deuterium and tritium, the most amenable choice of reactants on the fusion slope of the nuclear binding-energy curve.

[continue reading]

Comments on Bousso’s communication bound

Bousso has a recent paper bounding the maximum information that can be sent by a signal from first principles in QFT:

Here’s his first figure:

This all stems from vacuum entanglement, an oft-neglected aspect of QFT that Bousso doesn’t emphasize in the paper as the key ingredient. (I thank Scott Aaronson for first pointing this out.) The gradient term in the Hamiltonian for QFTs means that the value of the field at two nearby locations is always entangled. In particular, the values of \phi(x) and \phi(x+\Delta x) are sometimes considered independent degrees of freedom but, for a state with bounded energy, they can’t actually take arbitrarily different values as \Delta x becomes small, or else the gradient contribution to the Hamiltonian violates the energy bound. Technically this entanglement exists over arbitrary distances, but it is exponentially suppressed on scales larger than the Compton wavelength of the field. For massless fields (infinite Compton wavelength), the entanglement is long range, but the amount you can actually measure is suppressed exponentially on a scale given by the length of your measuring apparatus.

In this case Bob’s measuring apparatus has effective size c \Delta t, which of course Bousso just calls \Delta t. (You can tell this is a HEP theorist playing with some recently-learned quantum information because he sets c=1 but leaves \hbar explicit. 😀) It may actually be of size L = c \Delta t or, like a radio antenna, it may effectively be this size by integrating the measurement over a time long enough for a wave of that length to pass by. Such a device is necessarily noisy when trying to measure modes whose wavelength is longer than this scale. So Alice can only communicate to Bob with high fidelity through excitations of energy at least \hbar/\Delta t.… [continue reading]
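To put a rough number on that bound (my own back-of-the-envelope sketch, not a calculation from Bousso’s paper), here is the minimum excitation energy \hbar/\Delta t for a nanosecond-scale measurement:

```python
# Back-of-the-envelope sketch (my own illustration): for a measurement
# integrated over Delta t, Bob's apparatus has effective size c * Delta t
# and Alice must use excitations of energy at least hbar / Delta t.
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 299792458.0         # speed of light, m/s

delta_t = 1e-9                 # measurement time: 1 nanosecond
apparatus_size = c * delta_t   # effective size c * Delta t, about 0.3 m
E_min = hbar / delta_t         # minimum excitation energy, in joules

print(f"apparatus size ~ {apparatus_size:.3f} m")
print(f"E_min ~ {E_min:.3e} J")
```

Shorter measurement times shrink the effective apparatus but raise the energy floor proportionally, which is the trade-off behind the bound.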