I keep a text file on Evernote called “paper ideas”, now numbering 20 or so half-baked bullet points. Some of these topics are interesting physics problems that I think ought to be solved, and some of them are already well-understood but aren’t covered comprehensively and clearly anywhere.
It seemed like I should take advantage of the explosion in traffic coming from Peter’s nice comments about yesterday’s post. So I figure now’s as good a time as any for me to put up this list. (EDIT: Needless to say, let me know if you think these are already answered!)
Here is a list of topics I’d like to write a paper about with a collaborator. I encourage you to shoot me an email if one of them piques your interest, regardless of where you’re located geographically. Some of these topics would need an expert, but a lot of them would be suitable for an enthusiastic grad student. If you’re a student at PI or Waterlooand are interested in working with me, this list is a good place to start.
- What dark matter masses and cross sections could be probed by the MAQRO satellite experiment, as currently proposed? Back-of-the-envelope calculations suggest this could reach cross sections as low as for some dark matter masses between and . This is well below the scale of solar neutrino cross sections. Writing this paper would require understanding shielding from the Earth and the satellite, the time stability of conventional sources of decoherence, and the unusual quadratic scaling of the coherent scattering enhancement that appears in the appendix here.
- Related to previous: Construct a concrete dark matter model that would be tested by MAQRO. It would have to evade other restrictions, e.g. from the LHC. See one tentative idea by Bateman et al.
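To make the quadratic scaling concrete, here is a toy numerical sketch of how the coherent elastic cross section grows like N² with the nucleon number N when the momentum transfer is too small to resolve the target. All numbers are made-up placeholders, not values from any actual MAQRO forecast, and the sharp crossover at qR = 1 is a crude stand-in for a proper form factor.

```python
# Illustrative sketch only: coherent elastic scattering off a nanosphere
# is enhanced by ~N^2 relative to the per-nucleon cross section when the
# momentum transfer q satisfies q*R << 1, i.e., when the transferred
# wavelength cannot resolve individual nucleons.

def coherent_cross_section(sigma_nucleon, n_nucleons, q, radius):
    """Toy coherent cross section with a crude form-factor cutoff.

    sigma_nucleon : per-nucleon elastic cross section (cm^2)
    n_nucleons    : number of nucleons in the target
    q             : momentum transfer (1/cm)
    radius        : target radius (cm)
    """
    if q * radius < 1.0:                    # coherent regime: amplitudes add
        return sigma_nucleon * n_nucleons**2
    return sigma_nucleon * n_nucleons       # incoherent regime: rates add

# A ~100 nm sphere has very roughly 1e10 nucleons; the per-nucleon cross
# section below is a hypothetical placeholder:
sigma_n = 1e-45   # cm^2 (hypothetical)
N = 1e10
print(coherent_cross_section(sigma_n, N, q=1e3, radius=1e-5))
# 1e-45 * (1e10)^2 = 1e-25 cm^2 in the coherent regime
```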
- What, if anything, would micromechanical resonators be better than matter interferometers at detecting?
- What are the prospects for detecting the following phenomena with quantum enhanced measurements, i.e., through anomalous decoherence or phase shifts of large superpositions of matter?
- Millicharges [Possible collaboration with Natalia Toro and/or Savas Dimopoulos.]
- Ultralight coherent boson field (as dark matter?) [Possible collaboration with Asimina Arvanitaki.]
- Relic neutrinos (Very unlikely through decoherence, since the latter never appears at second order. But maybe phase shifts?) [Possible collaboration with Gordan Krnjaic and Asimina Arvanitaki.]
- Anomalous fluxes of eV to keV neutrinos that might fill the gap in this plot.
- Long-wavelength photons? (What are the longest single photons currently detectable?)
- Slow neutrons?
- Sterile neutrinos (not as DM)?
- Unruh radiation [Possible collaboration with Rob Spekkens.]
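For all of the candidates above, the basic observables are the same two things: visibility loss from decohering scatterers, and a fringe phase shift from a coherent energy difference between the arms. Here is a minimal numerical sketch of both, with every parameter value a hypothetical placeholder; it also assumes the worst case where each scattering event fully resolves the two arms.

```python
import math

# Toy interferometer signal: an anomalous flux that scatters off the
# superposed mass shows up as visibility decay, while a smooth potential
# difference between the arms only shifts the fringe phase.
# All numbers below are illustrative placeholders.

def fringe(phase, visibility):
    """Interference signal P(phase) = (1 + V cos(phase)) / 2."""
    return 0.5 * (1.0 + visibility * math.cos(phase))

def visibility(flux, sigma, t):
    """V = exp(-Gamma t) with Gamma = flux * sigma, assuming each
    scattering event fully resolves the two arms."""
    return math.exp(-flux * sigma * t)

HBAR = 1.054571817e-34   # J s

flux = 1e15      # hypothetical particles per cm^2 per s
sigma = 1e-16    # hypothetical cross section, cm^2
t = 10.0         # coherence time, s

V = visibility(flux, sigma, t)      # exp(-1): order-unity visibility loss
phi = 1e-36 * t / HBAR              # hypothetical 1e-36 J arm energy split
print(V, phi)
```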
- How would superconducting qubits be decohered if the electrons scatter a flux of long-wavelength particles like low-mass dark matter? This requires understanding the microscopic wavefunction of a SQUID.
- Could low-mass dark matter become hotter after entering the atmosphere but still reach the ground, thereby increasing the rate at which it decoheres interferometers? [Possible collaboration with Phillip Schuster.]
- As mentioned in the appendix of my dark matter paper, the spatial separation of a superposition takes the place of detector acceptance in neutron experiments for setting the maximum scale of coherent elastic scattering. Can any insights be gained by fleshing out this analogy?
- Construct a simple toy model of a physical system that includes all of the following time scales: (1) An “objective branching time” (i.e., the average time between unambiguous quantum amplification events that resemble a von Neumann measurement). (2) A redundancy time (i.e., a rate at which records about branching events are created). (3) A mixing/thermalization time. (4) A Poincaré recurrence time. [Collaboration with Charlie Bennett.]
- Variant on previous: Construct a concrete example of a chaotic quantum system that provides a natural, obvious measurement rate. Can we define a spatial density of branching events multiplied by their inverse Lyapunov times, and identify this with the “instantaneous” Kolmogorov-Sinai entropy? How are hardware quantum random number generators built, and what sets the limits on their bandwidth? Is the quantum entropy production captured by a notion of work? [Collaboration with Charlie Bennett.]
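As a purely classical warm-up for the Lyapunov-time ingredient, here is a short numerical estimate for the fully chaotic logistic map, whose Lyapunov exponent is known analytically to be ln 2 and which, by Pesin’s identity, equals its Kolmogorov-Sinai entropy rate. (The quantum branching-rate question above is of course much harder; this just fixes notation.)

```python
import math

def lyapunov_logistic(x0=0.3, n=50_000, burn=1_000):
    """Estimate the Lyapunov exponent of x -> 4x(1-x) by averaging
    log|f'(x)| = log|4 - 8x| along a trajectory."""
    x = x0
    for _ in range(burn):                 # discard the transient
        x = 4.0 * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        # guard against the measure-zero point x = 1/2 where f'(x) = 0
        total += math.log(max(abs(4.0 - 8.0 * x), 1e-300))
        x = 4.0 * x * (1.0 - x)
    return total / n

print(lyapunov_logistic())   # ≈ 0.693 (= ln 2, the KS entropy rate)
```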
- Quantum computers don’t work in the presence of the wrong kind of decoherence, and there are several proposals — mostly motivated by trying to make quantum mechanics less weird — for objective collapse processes that look experimentally like decoherence. Is it possible to protect quantum computations from GRW/CSL collapse models by running the computations in subspaces that aren’t affected by that flavor of collapse? If so, GRW/CSL doesn’t really make quantum mechanics any less weird, does it? (Has Scott Aaronson already answered this? 🙂 ) Any such protected subspace would still be exponentially large. [Possible collaboration with Rob Spekkens.]
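The subspace idea can be illustrated with the textbook decoherence-free subspace for collective dephasing — this is not GRW/CSL noise, just an analogy to pin down what “running in an unaffected subspace” means. If both qubits acquire the same random phase, the span of |01⟩ and |10⟩ only picks up a global phase, so a logical qubit encoded there is untouched:

```python
import cmath

def collective_dephasing(a, amps):
    """Apply diag(1, e^{ia}, e^{ia}, e^{2ia}) -- both qubits get the same
    phase on |1> -- to amplitudes over the |00>,|01>,|10>,|11> basis."""
    phases = [1.0, cmath.exp(1j * a), cmath.exp(1j * a), cmath.exp(2j * a)]
    return [p * c for p, c in zip(phases, amps)]

def overlap(u, v):
    """|<u|v>| for amplitude lists."""
    return abs(sum(x.conjugate() * y for x, y in zip(u, v)))

# An arbitrary encoded state inside span{|01>, |10>}:
psi = [0.0, 1 / 2**0.5, 1j / 2**0.5, 0.0]
print(overlap(psi, collective_dephasing(0.7, psi)))   # 1.0: protected

# Contrast: a |00> + |11> superposition is NOT protected:
phi = [1 / 2**0.5, 0.0, 0.0, 1 / 2**0.5]
print(overlap(phi, collective_dephasing(0.7, phi)))   # |cos 0.7| ≈ 0.76
```

The open question in the bullet above is whether GRW/CSL noise admits any analogous invariant subspace at all.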
Reviews (not necessarily extremely long):
- A structured, annotated, and referenced list of toy decoherence models. The key idea here would be to be precise about the various limits and conditions assumed by each article (and to describe what it calculates) in a single unified language. A lot of the notes I have are for decoherence of a single continuous degree of freedom, which is mostly quantum Brownian motion plus many generalizations (e.g., Markovian versus non-Markovian, first order Lindblad operators versus higher order, etc.).
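As an example of the kind of entry such a list would contain: in the high-temperature (Caldeira-Leggett) limit of quantum Brownian motion, the spatial off-diagonals of the density matrix decay at a rate Λ(Δx)² with Λ = 2mγk_BT/ħ² (the O(1) prefactor depends on conventions for the damping rate γ). A quick numerical sketch, with illustrative parameter values only:

```python
HBAR = 1.054571817e-34   # J s
KB = 1.380649e-23        # J / K

def decoherence_rate(m, gamma, T, dx):
    """1/tau_dec for a superposition separated by dx, in the
    high-temperature QBM (Caldeira-Leggett) limit; the factor of 2
    is convention-dependent."""
    lam = 2.0 * m * gamma * KB * T / HBAR**2
    return lam * dx**2

# A ~1e-15 kg dust grain, very weak damping, room temperature,
# superposed over 1 nm (all values illustrative):
rate = decoherence_rate(m=1e-15, gamma=1e-6, T=300.0, dx=1e-9)
print(rate, rate / 1e-6)   # decoherence rate, and its ratio to gamma
```

The punchline of this standard toy model is the enormous ratio of decoherence rate to dissipation rate, and its quadratic growth with the separation Δx.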
- A review of what I call the “set selection problem”, but which is very similar to the “quantum factorization problem” (Tegmark), the “physics-from-scratch problem” (Tegmark), the “what’s-a-system problem” (Susskind), the “static-state problem” (Jan-Markus Schwindt), the “quantum reality problem” (Kent), and doubtlessly many others. This would have a significant philosophical component.
- A review of consistency conditions and set-selection principles in the consistent histories formalism. This would be building from this post. (Very ambitiously, this could grow into a contrast/compare article of the different formulations of consistent histories, including less well known approaches like Isham’s.)