Decoherence Detection FAQ—Part 1: Dark matter

[Updated 2016-7-2]

I’ve submitted my papers (long and short arXiv versions) on detecting classically undetectable new particles through decoherence. The short version introduces the basic idea and states the main implications for dark matter and gravitons. The long version covers the dark matter case in depth. Abstract for the short version:

Detecting Classically Undetectable Particles through Quantum Decoherence

Some hypothetical particles are considered essentially undetectable because they are far too light and slow-moving to transfer appreciable energy or momentum to the normal matter that composes a detector. I propose instead directly detecting such feeble particles, like sub-MeV dark matter or even gravitons, through their uniquely distinguishable decoherent effects on quantum devices like matter interferometers. More generally, decoherence can reveal phenomena that have arbitrarily little classical influence on normal matter, giving new motivation for the pursuit of macroscopic superpositions.

This is figure 1:

Decoherence detection with a Mach-Zehnder interferometer. System \mathcal{N} is placed in a coherent superposition of spatially displaced wavepackets \vert N_{L} \rangle and \vert N_{R} \rangle that each travel a separate path and then are recombined. In the absence of system \mathcal{E}, the interferometer is tuned so that \mathcal{N} will be detected at the bright port with near-unit probability, and at the dim port with near-vanishing probability. However, if system \mathcal{E} scatters off \mathcal{N}, these two paths can decohere and \mathcal{N} will be detected at the dim port 50% of the time.

Below are some FAQs I have received.

Won’t there always be momentum transfer in any nontrivial scattering?

For any nontrivial scattering of two particles, there must be some momentum transfer.  But the momentum transfer can be made arbitrarily small simply by making the mass of the dark particle as tiny as desired (while keeping its velocity fixed).  For the dark matter particle to “see” the atom, its de Broglie wavelength need only be short enough to distinguish between the two halves of the atom’s superposition (i.e., the distance between the arms of the interferometer).  In principle, this wavelength can be arbitrarily long for fixed properties of the atom, since the arm separation is set by the interferometer rather than the atom.
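
As a rough illustration (the numbers here are mine, chosen only for intuition), even a keV-mass particle at a typical galactic speed has a micron-scale de Broglie wavelength, comparable to achievable superposition separations:

```python
# Illustrative estimate: de Broglie wavelength of a light dark matter
# particle at a galactic speed v ~ 1e-3 c. Example numbers only.
h = 6.626e-34        # Planck constant, J*s
c = 2.998e8          # speed of light, m/s
eV = 1.602e-19       # joules per eV

m_dm_eV = 1e3        # assumed DM mass: 1 keV/c^2
v = 1e-3 * c         # assumed DM speed: ~galactic virial velocity

m_dm_kg = m_dm_eV * eV / c**2
wavelength = h / (m_dm_kg * v)   # lambda = h / (m v)

print(f"de Broglie wavelength: {wavelength * 1e6:.2f} microns")
# -> about a micron for these inputs; lighter masses give longer wavelengths
```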

As a matter of principle, it’s also worth pointing out that you can get exactly zero momentum transfer by looking at a three-particle scattering event.  Just consider two distinguishable dark matter particles with the same mass and equal but opposite momenta that scatter off the atom from opposite directions at the same time.  (This is a measure-zero type situation, of course.) They both bounce off the atom and head back in opposite directions.  The net transfer of momentum to the atom is zero, but the two dark matter particles have recorded the position of the atom—remember that they are distinguishable—and so have decohered it. This idea can be extended to a dense isotropic gas: there is zero expected momentum transfer, and if one imagines breaking the gas molecules into tinier and tinier parts, the momentum-transfer fluctuations can go to zero while holding the decoherence rate constant.
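
Schematically (my notation, not the paper’s): writing \vert p, -p \rangle for the incoming pair of distinguishable dark particles, the scattering acts as

\vert L \rangle \vert p, -p \rangle \to \vert L \rangle \vert \chi_L \rangle  and  \vert R \rangle \vert p, -p \rangle \to \vert R \rangle \vert \chi_R \rangle,

where each out-state \vert \chi_{L,R} \rangle carries zero net momentum transfer to the atom, yet \langle \chi_L \vert \chi_R \rangle \approx 0 because the pair has recorded which-path information. The branches therefore decohere without any net kick.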

Finally, it might also be worth noting the possibility that the center-of-mass degree of freedom can be decohered by fully internal degrees of freedom.

But what about the uncertainty principle? If you decohere in the position basis, aren’t you necessarily increasing the momentum uncertainty?

No. The uncertainty principle is an inequality, and the probe states that are sensitive to decoherence by negligible-momentum particles are far from saturating it. Indeed, a minimal-uncertainty wavepacket has position and momentum spreads \sigma_x and \sigma_p which satisfy \sigma_x \sigma_p = \hbar/2. If I create a superposition by adding a second wavepacket identical to the first except spatially displaced by a large distance \Delta x \gg \sigma_x, then the position uncertainty grows very large, to order \Delta x, but the momentum uncertainty does not change. Furthermore, it’s possible to decohere this superposition into an incoherent sum of two possible states, both with the same momentum uncertainty \sigma_p.
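
To see this explicitly (a standard computation, not specific to the paper): if \psi_0(x) is the minimal wavepacket, the superposition \psi(x) \propto \psi_0(x) + \psi_0(x - \Delta x) has momentum-space wavefunction

\tilde\psi(p) \propto \tilde\psi_0(p)\left(1 + e^{-i p \Delta x/\hbar}\right),  so that  \vert\tilde\psi(p)\vert^2 \propto \vert\tilde\psi_0(p)\vert^2 \cos^2\!\left(p \Delta x / 2\hbar\right).

The momentum distribution acquires fringes but remains confined to the same envelope of width \sigma_p; only the position spread has grown.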

If there is no momentum transfer, does the dark matter scatter coherently over both arms? That is, should we take the coherent sum of final states? Then it seems that the paths of the normal matter would still interfere.

The dark matter scatters coherently (i.e., does not become entangled) only insofar as the out-state conditional on scattering from the left arm of the interferometer is the same as the out-state conditional on scattering from the right arm.  To distinguish the arms, the dark matter must have a sufficiently short de Broglie wavelength to resolve the spatial separation between them. Trivially, a very short-wavelength dark matter particle can be fired out of a gun aimed at only one arm.  If it scatters, the atom is on that arm.  If it does not, the atom is on the other arm.  For long-wavelength dark matter (compared to the separation of the arms) there is only partial decoherence (i.e., 0 < \vert \gamma \vert < 1), and it requires multiple scattering events to cause complete decoherence.
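
In terms of the standard decoherence factor: if a single scattering event takes the arms to conditional dark matter out-states \vert \chi_L \rangle and \vert \chi_R \rangle, the fringe visibility is multiplied by \vert \gamma \vert = \vert \langle \chi_L \vert \chi_R \rangle \vert. After n independent scattering events the visibility is suppressed by \vert \gamma \vert^n, which is why even weak per-event distinguishability (\vert \gamma \vert close to 1) produces complete decoherence given a sufficient scattering rate.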

It’s true that in order for the dark matter out-state to be different at all, there must be non-zero momentum transfer. But this transfer can be made arbitrarily small compared to the characteristics of the atom by considering very small dark matter masses, and (as described above) it could in principle be canceled out through a second, distinguishable dark matter particle scattering simultaneously in the opposite direction.

Wouldn’t one detect interactions with gas molecules and cosmic rays long before dark matter? How can you detect decoherence from DM collisions when keV DM only has a density of ~3×10^4/cm^3, much smaller than the molecule number density of the vacuum reasonably achievable in the laboratory?

These decoherence rates are in the same ballpark. The vacuums necessary for the OTIMA experiment are of order 10^{-10} mbar, or a few million molecules per cubic centimeter. The density of keV DM is a few tens of thousands per cubic centimeter, but the DM is traveling about a thousand times faster (~10^{-3} c). Even for short-range forces, the cross section of the superposed target to DM can be of order the geometric cross section (its ultimate limit), even though individual nucleons are weakly coupled to DM, because of the coherent scattering enhancement. When the forces are longer range (e.g., 100 nm), the target’s cross section to DM can be of order that size, i.e., much larger than its cross section for scattering with the gas molecules remaining in the vacuum.
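
For a back-of-the-envelope comparison of the two number fluxes (illustrative inputs only; the temperature, speeds, and DM parameters are my assumptions):

```python
# Compare the number flux (n*v) of residual gas molecules against that of
# hypothetical keV dark matter. Ballpark inputs only.
k_B = 1.381e-23                  # Boltzmann constant, J/K

P_gas = 1e-10 * 100.0            # 1e-10 mbar expressed in Pa (1 mbar = 100 Pa)
T = 300.0                        # assumed lab temperature, K
n_gas = P_gas / (k_B * T)        # ideal-gas number density, m^-3
v_gas = 500.0                    # typical thermal molecular speed, m/s

n_dm = 3e4 * 1e6                 # ~3e4 per cm^3 converted to m^-3 (keV DM)
v_dm = 3e5                       # ~1e-3 c, in m/s

print(f"gas: n = {n_gas / 1e6:.1e} /cm^3, flux = {n_gas * v_gas:.1e} /m^2/s")
print(f"DM : n = {n_dm / 1e6:.1e} /cm^3, flux = {n_dm * v_dm:.1e} /m^2/s")
# The two fluxes come out within an order of magnitude of each other, so
# the decoherence rates can indeed be comparable when the cross sections are.
```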

More generally, the decoherence rate from all backgrounds is already accounted for in the existing experimental proposals.  When someone claims “I will produce a superposition of particle X over distance D lasting time T“, they must justify this claim by showing that they will be able to suppress all known forms of decoherence.  For instance, the AGIS satellite’s limiting source of decoherence is (as you suspect) the solar wind, which sets the maximum parameters of the superposition it will achieve.  Its sensitivity to dark matter, then, is just the weakest dark matter that causes more decoherence than the solar wind.

Finally, even DM decoherence which is weaker than the background rate from residual gas scattering can be identified through its variation over the sidereal day.
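
As a cartoon of what that analysis might look like (entirely my own sketch; the rates and modulation amplitude are invented), one can extract the component of the measured decoherence rate oscillating at the sidereal frequency:

```python
# Toy extraction of a sidereal modulation from a noisy decoherence-rate
# time series. All numbers are invented for illustration.
import numpy as np

T_SID = 86164.1                          # sidereal day, seconds
rng = np.random.default_rng(0)

t = np.linspace(0, 10 * T_SID, 2000)     # ten sidereal days of data
gamma_bg = 1.0                           # steady background rate (arb. units)
gamma_dm = 0.05                          # assumed DM modulation amplitude
rate = gamma_bg + gamma_dm * np.cos(2 * np.pi * t / T_SID) \
       + rng.normal(0.0, 0.02, t.size)   # measurement noise

# Lock-in style estimate of the component at the sidereal frequency:
c = 2 * np.mean(rate * np.cos(2 * np.pi * t / T_SID))
s = 2 * np.mean(rate * np.sin(2 * np.pi * t / T_SID))
print(f"recovered modulation amplitude: {np.hypot(c, s):.3f}")  # ~0.05
```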

Is the cross-section enhancement due to coherent scattering the result of contributions from multiple nucleons?  How can you have multiple scattering events when the mean free path is much larger than the sample size?

Coherent elastic scattering does not require multiple scattering events; more precisely, it does not require additional factors of the coupling constant compared to the single-nucleon case. It just requires a single dark matter particle to scatter once from the entire target (which recoils uniformly, albeit negligibly).  This feature is discussed in the Appendix of the long-version article.
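
The enhancement can be stated in one line (this is the standard coherent elastic scattering argument; see the Appendix for the careful version): for momentum transfer \hbar q small compared to the inverse size of the target, the single-event amplitude is the sum over the N nucleons,

f_{\mathrm{tot}}(\mathbf{q}) = f_1(\mathbf{q}) \sum_{j=1}^{N} e^{i \mathbf{q} \cdot \mathbf{r}_j} \approx N f_1(\mathbf{q}),

so the cross section scales as \sigma_{\mathrm{tot}} \approx N^2 \sigma_1: one scattering event and one power of the coupling in the amplitude, but an N^2 enhancement from constructive interference.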

In order to detect dark matter with the next generation of matter interferometers, your proposal requires operating them in space.  Won’t the sensitivity of matter interferometers to dark matter be worse when they are adapted for satellite missions?

In fact, interferometers placed on satellites will enjoy increased, not decreased, precision. Many groups are already planning interferometers for space, and they are doing it because space is a more pristine environment that allows for greater sensitivity.  The chief deterrent for space missions is not technical feasibility, but cost in money and time.  The space environment has two highly desirable features: (1) isolation from vibration during operation and (2) unlimited free-fall time.  Vibrations and free-fall times are the restrictions that currently limit terrestrial atom interferometer experiments, and free-fall time is projected to be the ultimate limiting factor for the OTIMA design.  I have it on good authority that we can expect to achieve superpositions of between 10^6 and 10^7 amu on Earth, and up to 10^{10} or 10^{11} amu in orbit.

The attractiveness of the weightless environment is evidenced by the many calls for atom interferometers in space, such as the MWXG, GAUGE, QUANTUS, HYPER, and SAI proposals.[1]  (The STE-QUEST experiment has recently been accepted by the European Space Agency as a candidate M-class mission.  It incorporates elements of the SAI and QUANTUS proposals.)

Why haven’t you performed a full analysis of the conventional sources of decoherence for each experiment you discuss?

Such an analysis of decoherence has already been done in the three existing experimental proposals. Conventional sources of decoherence are exhaustively enumerated by these proposals, and their effects are shown to be manageable enough that coherence can be maintained (in the absence of dark matter) over a certain spatial and temporal extent.

Furthermore, this is fully sufficient for my purposes.  Decoherence from dark matter and decoherence from conventional sources add together strictly linearly; there is no interaction between them.  In fact, once the experiment successfully demonstrates interference, one does not need to understand the conventional sources of decoherence whatsoever in order to exclude certain types of dark matter.
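
Concretely (my schematic notation): if the branches accumulate decoherence at total rate \Gamma = \Gamma_{\mathrm{conv}} + \Gamma_{\mathrm{DM}}, the fringe visibility after time t is V = e^{-\Gamma t}. Observing visibility V_{\mathrm{obs}} therefore implies \Gamma_{\mathrm{DM}} \le -\ln(V_{\mathrm{obs}})/t no matter how large or poorly understood \Gamma_{\mathrm{conv}} is, which is exactly why demonstrated interference excludes dark matter candidates without any modeling of the conventional backgrounds.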

The point of my proposal is not to champion any particular experiment as a good way to detect dark matter, and certainly not to propose specific new experimental designs.  (Indeed, none of the experiments I discuss were designed with any consideration of dark matter whatsoever!)  Rather, this proposal provides a general mechanism for translating experimental interferometry results into dark matter sensitivity.  If one of the particular experiments is later found to be unable to establish interference as promised because of some neglected source of conventional decoherence, then this is a flaw with the theoretical analysis of that experiment rather than with the dark matter detection method.

Lastly, it is very hard to imagine that anomalous decoherence (if it’s detected) will ever be attributed to dark matter through the understanding of conventional sources of decoherence.  The nature of decoherence is that—in practice—it’s extremely difficult to be confident that all decoherence should have been eliminated and that the remaining decoherence must be from dark matter.  (In this sense, systematic uncertainties from decoherence are perhaps more challenging than traditional background events in particle physics.) Instead, good evidence for dark matter will only be established by understanding and observing the distinctive functional behavior of the dark-matter-induced decoherence, e.g. how it varies with interferometer orientation, superposition separation, or degree of shielding.

Do you have a specific model of dark matter you would be testing with this proposal?  Why should I be interested in this range of parameters?

My philosophy is to remain agnostic about the nature of dark matter.  Physicists have been pouring resources (both mental and monetary) into subtle theoretical hints for decades: the WIMP miracle, the strong CP problem, supersymmetry, sterile neutrinos. (And many more that simply never became popular.) This has produced exactly zero positive dark matter results in more than half a century.

Precious few model-independent properties of dark matter are known. Although there is no limit to the number of ways to parameterize potential models, I argue—given that we know its galactic density and rough momentum distribution at least as well as we know anything about dark matter—that a pretty good parameterization is “how heavy is it, and how often does it collide with atoms?”.  My proposal offers a way to rule out huge swaths of this space.
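A minimal sketch of that translation (my own toy version, with the density figure taken from the keV estimate above and everything else invented): given a demonstrated coherence time, any cross section large enough to have caused at least one decohering scattering event is excluded.

```python
# Toy translation of a demonstrated coherence time into an excluded
# region in the (mass, cross-section) plane. Benchmark numbers assumed.

RHO_DM = 3e4 * 1e3  # local DM energy density in eV/cm^3, i.e. the value
                    # implied by the ~3e4/cm^3 keV density figure above
V_DM = 3e7          # typical DM speed, cm/s (~1e-3 c)

def excluded_sigma(m_ev, t_coh):
    """Smallest excluded cross section (cm^2) for DM mass m_ev (in eV/c^2),
    given interference maintained for t_coh seconds: require the expected
    number of scatterings n*sigma*v*t to stay below ~1."""
    n = RHO_DM / m_ev               # number density, cm^-3
    return 1.0 / (n * V_DM * t_coh)

for m in (1e3, 1e6, 1e9):           # 1 keV, 1 MeV, 1 GeV
    print(f"m = {m:.0e} eV: sigma > {excluded_sigma(m, 1.0):.1e} cm^2 excluded")
```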

Moreover, dark matter in this region is widely—but incorrectly—regarded as being undetectable.  I think it’s reasonable to suspect this leads to reduced theoretical exploration of this area.  My claim is that the novelty of the method (and the many new experimental possibilities it suggests both inside and outside the study of dark matter) is much more important than the previous existence of a complete cosmological model.   It’s also worth considering the historical record of new experimental abilities leading to discovery even in the absence of detailed theoretical demand for those abilities.

[These questions and answers were adapted from communication with referees, editors, and conference participants. I have appended “Part 1: Dark matter” to the title since I am preparing another FAQ for the more general case.]

Edit: After much gnashing of teeth, the long version has now appeared in PRD.

Footnotes


  1. The SAI group expects sensitivity to accelerations increased by 2 orders of magnitude when operating their interferometer in space rather than on Earth (See page 553 of Sorrentino, F. et al. “A Compact Atom Interferometer for Future Space Missions.” Microgravity Science and Technology 22, no. 4 (2010): 551–561.)

    The GAUGE group expects 1 to 3 orders of magnitude increased sensitivity to new spin-coupling forces for their satellite-based interferometer compared to state-of-the-art experiments on Earth. (See Figure 8 of Amelino-Camelia, G. et al. “GAUGE: The GrAnd Unification and Gravity Explorer.” Experimental Astronomy 23, no. 2 (2008): 549–572.)

    The HYPER group expects 2 to 3 orders of magnitude greater sensitivity to angular rotation rates and accelerations compared to state-of-the-art experiments on Earth. (See page 2216 of Jentsch, C. et al. “HYPER: A Satellite Mission in Fundamental Physics Based on High Precision Atom Interferometry.” General Relativity and Gravitation 36, no. 10 (2004): 2197–2221.)

    The MWXG group expects their interferometer to be 3 to 4 orders of magnitude more sensitive to accelerations when operated in space than on Earth. (See page 620 of Ertmer, W. et al. “Matter Wave Explorer of Gravity (MWXG).” Experimental Astronomy 23, no. 2 (2009): 611–649.)

    The proposed MAQRO experiment is a satellite-based superposition device based on the Nanosphere proposal.  It would increase exposure time (and hence sensitivity) by 3 orders of magnitude while holding constant or improving the other key parameters of the superposition.  (See Kaltenbaek, R. et al. “Macroscopic Quantum Resonators (MAQRO).” Experimental Astronomy 34, no. 2 (2012): 1–42.)
