I’m happy to announce the recent publication of a paper by Mike, Wojciech, and myself.
Motivated by the advances of quantum Darwinism and recognizing the role played by redundancy in identifying the small subset of quantum states with resilience characteristic of objective classical reality, we explore the implications of redundant records for consistent histories. The consistent histories formalism is a tool for describing sequences of events taking place in an evolving closed quantum system. A set of histories is consistent when one can reason about them using Boolean logic, i.e., when probabilities of sequences of events that define histories are additive. However, the vast majority of the sets of histories that are merely consistent are flagrantly nonclassical in other respects. This embarras de richesses (known as the set selection problem) suggests that one must go beyond consistency to identify how the classical past arises in our quantum universe. The key intuition we follow is that the records of events that define the familiar objective past are inscribed in many distinct systems, e.g., subsystems of the environment, and are accessible locally in space and time to observers.
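For readers who want the additivity requirement stated concretely, here is the standard textbook formulation of consistency (this is generic notation, not necessarily the conventions of our paper). A history α is a sequence of projectors applied at successive times, and consistency is the vanishing of the off-diagonal parts of the decoherence functional:

```latex
% Class operator for a history \alpha = (\alpha_1, \ldots, \alpha_n):
C_\alpha = P^{(n)}_{\alpha_n}(t_n) \cdots P^{(1)}_{\alpha_1}(t_1)

% Decoherence functional for initial state \rho:
D(\alpha, \alpha') = \mathrm{Tr}\!\left[ C_\alpha \, \rho \, C_{\alpha'}^\dagger \right]

% Consistency condition:
\mathrm{Re}\, D(\alpha, \alpha') = 0 \quad \text{for } \alpha \neq \alpha'

% which guarantees the probabilities p(\alpha) = D(\alpha,\alpha)
% are additive under coarse-graining:
p(\alpha \vee \alpha') = p(\alpha) + p(\alpha')
```

Any set of histories satisfying this can be reasoned about with ordinary Boolean logic, which is exactly why the set selection problem is so striking: the condition is satisfied by vastly many nonclassical sets.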
… [continue reading]
Quick note: the arXiv is administering a survey of user opinion on potential future changes, many of which were discussed previously on this blog. It can be reached by clicking the banner on the top of the arXiv homepage. I encourage you to take the survey if you haven’t already. (Doubly so if you agree with me…)
Likewise, Zotero is administering a somewhat shorter survey about what sorts of folks use Zotero and what they do with it.
To the question “Do you have suggestions for any of the above-mentioned new services, or any other new services you would like to see in arXiv?”, I responded:
I think the most important thing for the arXiv to do would be to “nudge” authors toward releasing their work with a copyleft, e.g., Creative Commons – Attribution. (Or at least stop nudging them toward the minimal arXiv license, as is done now in the submission process.) For instance, make it clear to authors that if they publish in various open access journals, they should release the arXiv post under a similarly permissive license. Also, make it easier for authors to make the license more permissive at a later date, once they know where they are publishing.
… [continue reading]
Question: What sort of physics — if any — should be funded on the margin right now by someone trying to maximize positive impact for society, perhaps over the very long term?
First, it’s useful to separate the field into fundamental physics and non-fundamental physics, where the former is concerned with discovering new fundamental laws of the universe (particle physics, high-energy theory, cosmology, some astrophysics) and the latter applies accepted laws to understand physical systems (condensed matter, material physics, quantum information and control, plasma physics, nuclear physics, fluid dynamics, biophysics, atomic/molecular/optical physics, geophysics). [Footnote: Some folks, like David Nelson, dispute the importance/usefulness of this distinction: PDF. In my opinion, he is correct, but only about the most boring part of fundamental physics (which has unfortunately dominated most of those subfields). More speculative research, such as the validity (!!!) of quantum mechanics, is undeniably of a different character from the investigation of low-energy field theories. But that point isn’t important for the present topic.]
That distinction made, let’s dive in.
Let’s first list some places where non-fundamental physics might have a social impact:
condensed matter and material science discoveries that give high-temperature superconductors, stronger/lighter/better-insulating/better-conducting materials, higher density batteries, new computing architectures, better solar cells;
quantum information discoveries that make quantum computers more useful than we currently think they will be, especially a killer app for quantum simulations;
plasma physics discoveries that make fusion power doable, or fission power cheaper;
quantum device technologies that allow for more precise measurements;
climate physics (vague); [Footnote: Added 2016-Dec-20.]
… [continue reading]
Just in the nick of time…
Eliezer Yudkowsky has a large Facebook thread resulting in many public bets on the Lee Sedol vs DeepMind’s AlphaGo match.
In particular, I have bet Carl Shulman $100 at even odds that Sedol will win. (For the record, my confidence is low, and if I win it will be mostly luck.) The match, taking place March 9-15, will be streamed live on YouTube.
Relatedly, here is an excellent (if slightly long-winded) discussion of why the apparent jump in AI Go ability may be partially attributable to the purposeful application of additional computing power and Go-specific researcher expertise, rather than purely a large jump in domain-general AI power.
SciHub has been in the news recently, and I guess they decided to upgrade their appearance.
Want a postdoc doing theoretical physics, machine learning, and genomics? You’re in luck.
Luke Muehlhauser has a good quote from Bill Gates on AI timelines.
“Assortative Mating—A Missing Piece in the Jigsaw of Psychiatric Genetics”.
Why are psychiatric disorders so highly heritable when they are associated with reduced fecundity? Why are some psychiatric disorders so much more highly heritable than others? Why is there so much genetic comorbidity across psychiatric disorders?
… [continue reading]
David L. Stern on changing incentives in science by getting rid of journals:
Instead, I believe, we will do better to rely simply on the scientific process itself. Over time, good science is replicated, elevated, and established as most likely true; bad science may be unreplicated, flaws may be noted, and it usually is quietly dismissed as untrue. This process may take considerable time—sometimes years, sometimes decades. But, usually, the most egregious papers are detected quickly by experts as most likely garbage. This self-correcting aspect of science often does not involve explicit written documentation of a paper’s flaws. The community simply decides that these papers are unhelpful and the field moves in a different direction.
In sum, we should stop worrying about peer review….
The real question that people seem to be struggling with is “How will we judge the quality of the science if it is not peer reviewed and published in a journal that I ‘respect’?” Of course, the answer is obvious. Read the papers! But here is where we come to the crux of the incentive problem. Currently, scientists are rewarded for publishing in “top” journals, on the assumption that these journals publish only great science. Since this assumption is demonstrably false, and since journal publishing involves many evils that are discussed at length in other posts, a better solution is to cut journals out of the incentive structure altogether.
… [continue reading]
A new paper of mine (PRA 93, 012107 (2016), arXiv:1507.04083) just came out. The main theorem of the paper is not deep, but I think it’s a clarifying result within a formalism that is deep: ideal quantum Brownian motion (QBM) in symplectic generality. In this blog post, I’ll refresh you on ideal QBM, quote my abstract, explain the main result, and then — going beyond the paper — show how it’s related to the Kolmogorov-Sinai entropy and the speed at which macroscopic wavefunctions branch.
If you Google around for “quantum Brownian motion”, you’ll come across a bunch of definitions that have quirky features and aren’t obviously related to each other. This is a shame. As I explained in an earlier blog post, ideal QBM is the generalization of the harmonic oscillator to open quantum systems. If you think harmonic oscillators are important, and you think decoherence is important, then you should understand ideal QBM.
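For orientation, here is what a QBM master equation looks like in the most familiar special case, the high-temperature Caldeira-Leggett form for a single degree of freedom (this is the standard textbook equation, not the full symplectic generality treated in the paper):

```latex
\frac{\partial \rho}{\partial t}
  = -\frac{i}{\hbar}\left[ H, \rho \right]
    \;-\; \frac{i \gamma}{\hbar} \left[ x, \{ p, \rho \} \right]
    \;-\; \frac{2 m \gamma k_B T}{\hbar^2} \left[ x, \left[ x, \rho \right] \right],
\qquad
H = \frac{p^2}{2m} + \frac{m \omega^2 x^2}{2},
```

where γ is the damping rate and T the bath temperature. The double-commutator (decoherence) term suppresses superpositions of distinct positions at a rate growing with the square of their separation, which is why even weak coupling decoheres macroscopic superpositions almost instantly.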
Harmonic oscillators are ubiquitous in the world because all smooth potentials look quadratic locally. Exhaustively understanding harmonic oscillators is very valuable because they are exactly solvable in addition to being ubiquitous. In an almost identical way, all quantum Markovian degrees of freedom look locally like ideal QBM, and their completely positive (CP) dynamics can be solved exactly.… [continue reading]
I thought this criticism by Ars Technica of the woeful state of Wikipedia’s science articles was mostly off the mark. (HN Comments.) The author framed it as a conflict between laymen and specialists, claiming that scientific articles are targeted at specialists at the expense of laymen, with lots of jargon, etc. I eagerly agree with him that there are lots of terrible science articles, and that some technical articles could use better context and introductory bits. But I think this is largely a problem of not having enough skilled science writers rather than a purposeful choice between laymen and specialists. Due to the curse of knowledge the specialists literally do not understand what is and isn’t accessible to laymen; they see through the jargon like the matrix. And the laymen do not get in their gut how many true technical dependencies there really are, that unless you understand topics X and Y, topic Z is pretty much useless. They assume that all this jargon is used by the specialists either because they are too lazy to translate, or are purposefully constructing barriers to entry. I empathize with skilled science writers (which are unfortunately rare), because their best articles often go unnoticed as both laymen and scientists read them and shrug “Yea, that’s pretty clear.… [continue reading]
Perimeter Institute runs a pretty great and unusual 1-year master’s program called Perimeter Scholars International. [Footnote: PSI…ha!] If you’re in your last year as an undergrad, I strongly advise you (seriously) to consider applying. Your choice of grad school is 80% of the selection power determining your thesis topic, and that topic places very strong constraints on your entire academic career. The more your choice is informed by actual physics knowledge (rather than the apparent impressiveness of professors and institutions), the better. An additional year at a new institution taking classes with new teachers can really help.
(Older academics can advertise this to students by printing out this poster.)
Here’s the blurb:
Each year, Canada’s Perimeter Institute for Theoretical Physics recruits approximately 30 exceptional science graduates for an immersive, 10-month physics boot camp: Perimeter Scholars International (PSI). This unique Master’s program seeks not only students with stellar undergraduate physics track records, but also those with diverse backgrounds, collaborative spirit, creativity, and other attributes that will set them apart as future innovators.
Features of the program include:
All student costs (tuition and living) are covered, removing financial and/or geographical barriers to entry
Students learn from world-leading theoretical physicists – resident Perimeter researchers and visiting scientists – within the inspiring environment of Perimeter Institute.
… [continue reading]
[Just shooting from the hip here, for fun.]
I think we should send humans to Mars, but I don’t really think it’s possible to justify it as an instrumental means of achieving other more concrete goals. (I just take it as an intrinsic goal.) But here is Robert Zubrin making the best instrumental case I’ve heard.
My biggest criticism is that not finding evidence of life on Mars does not imply life is extraordinarily rare, because there are other options besides easy-starting life (with the great filter somewhere after) and extremely-hard-starting life. If you think it’s possible that there’s a filter strong enough to prevent single-cell life from developing interstellar travel [Footnote: I’m skeptical. When it comes to estimating extremely unlikely events, with multiple independent unlikely steps that all need to happen quickly, the development of the first replicator seems to require vastly more steps than relatively simple things like sexual reproduction. The only thing that makes me uncertain is the possibility that there are extremely simple replicators that resemble nothing like minimal cells, and there is a relatively natural progression to minimal cells that simply isn’t large enough to leave fossils. I would love to update on this if you know something I’m not thinking of.]… [continue reading]
I gave a talk recently on Itay’s and my latest results for detecting dark matter through the decoherence it induces in matter interferometers.
Quantum superpositions of matter are unusually sensitive to decoherence by tiny momentum transfers, in a way that can be made precise with a new diffusion standard quantum limit. Upcoming matter interferometers will produce unprecedented spatial superpositions of over a million nucleons. What sorts of dark matter scattering events could be seen in these experiments as anomalous decoherence? We show that extremely weak but medium-range interactions between matter and dark matter would be most visible, such as scattering through a Yukawa potential. We construct toy models for these interactions, discuss existing constraints, and delineate the expected sensitivity of forthcoming experiments. In particular, the OTIMA interferometer under development at the University of Vienna will directly probe many orders of magnitude of parameter space, and the proposed MAQRO satellite experiment would be vastly more sensitive still. This is a multidisciplinary talk that will be accessible to a non-specialized audience.
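For context, a Yukawa interaction between ordinary matter and dark matter has the usual screened-Coulomb form (the coupling g and range λ below are generic placeholders, not the specific parameter choices from the talk):

```latex
V(r) = g^2 \, \frac{e^{-r/\lambda}}{r},
\qquad
\lambda = \frac{\hbar}{m_\phi c},
```

where m_φ is the mass of the mediating boson. Roughly speaking, “medium range” means λ is comparable to the spatial extent of the superposition, so that a single scattering event transfers only a tiny momentum yet can still distinguish the two branches.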
If you ever have problems finding the direct download link for videos on PI’s website (they are sometimes missing), this Firefox extension seems to do the trick.
… [continue reading]