Abstracts for March 2017

  • Recent progress in synthetic chemistry and molecular quantum optics has enabled demonstrations of the quantum mechanical wave–particle duality for complex particles, with masses exceeding 10 kDa. Future experiments with even larger objects will require new optical preparation and manipulation methods that shall profit from the possibility to cleave a well-defined molecular tag from a larger parent molecule. Here we present the design and synthesis of two model compounds as well as evidence for the photoinduced beam depletion in high vacuum in one case.

    The technique of using a “laser grating”, in place of a physical grating (slits), to produce spatial interference of molecules relies on the laser’s ability to ionize the molecule. (Once ionized, electric fields can sweep it out of the way.) But for some molecules, especially large nanoparticles, ionization is ineffective. Solution: attach a molecular tag to the nanoparticle that reliably cleaves in the presence of the laser, allowing the nanoparticle to be vacuumed up. Rad.

  • This chapter discusses the asymptotics, singularities, and the reduction of theories. The reduction must involve the study of limits—asymptotics. The reduction is obstructed by the fact that the limit is highly singular. In addition, the type of singularity is important, and the singularities are directly connected to the existence of emergent phenomena and underlie some of the most difficult and intensively studied problems in physics today. The chapter provides six examples of singular limits and emergent phenomena such as special relativity and statistical mechanics. Reduction in its simplest form is well illustrated by special relativity. The connection between the encompassing theory of special relativity and the less general theory of Newtonian mechanics is contained in the “low-speed” series expansion. The limits of physical theories are not analytic, they are singular; and the emergent phenomena associated with reduction are contained in the singularity. Often, these emergent phenomena inhabit the borderland between theories. Thermodynamics is a continuum theory, so reduction has to show that density fluctuations arising from interatomic forces have a finite range.

    Berry points out that the $\hbar \to 0$ limit of quantum mechanics is singular, implying that things like Ehrenfest’s theorem and the stationary-phase cancellation of paths in the path integral are not adequate to describe the quantum-classical transition. A similar situation can be found with critical points in statistical mechanics, where the $N \to \infty$ limit is similarly singular. If you think that the huge intellectual investment in understanding critical points is justified by their fundamental significance (regardless of practical applications), I claim you should think similarly about the quantum-classical limit.
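
    A standard way to see the singularity (my gloss, not a quote from Berry): semiclassical wavefunctions behave like $e^{iS(x)/\hbar}$, which has an essential singularity at $\hbar = 0$, just as $e^{1/z}$ does at $z = 0$. There is no convergent power series in $\hbar$ about that point, so the classical limit cannot be reached term by term in the way Newtonian mechanics is recovered from special relativity by expanding in $(v/c)^2$.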

    Even in what philosophers might regard as the simplest reductions, between different areas within physics, the detailed working-out of how one theory can contain another has been achieved in only a few cases and involves sophisticated ideas on the forefront of physics and mathematics today…. It should be clear from the foregoing that a subtle and sophisticated understanding of the relation between theories within physics requires real mathematics, and not only verbal, conceptual and logical analysis as currently employed by philosophers.

    For introductions, see these popular and non-technical treatments.

  • A long-time quantum memory capable of storing and measuring quantum information at the single-qubit level is an essential ingredient for practical quantum computation and communication. Recently, there has been remarkable progress in increasing the coherence time of ensemble-based quantum memories of trapped ions, nuclear spins of ionized donors, or nuclear spins in a solid. Until now, however, the record coherence time of a single qubit has been on the order of a few tens of seconds, demonstrated in trapped-ion systems. The qubit coherence time in a trapped ion is mainly limited by magnetic field fluctuations and by the decreasing state-detection efficiency associated with motional heating of the ion in the absence of laser cooling. Here we report a single-qubit coherence time of over 10 minutes in the hyperfine states of a Yb ion sympathetically cooled by a Ba ion in the same Paul trap, which eliminates the heating of the qubit ion even at room temperature. To reach such a coherence time, we apply a few thousand dynamical decoupling pulses to suppress the field-fluctuation noise. The long-time quantum memory demonstrated in this experiment is an important step toward constructing the memory zone in scalable quantum computer architectures, or toward ion-trap-based quantum networks. With further improvement of the coherence time by techniques such as magnetic field shielding, and an increase in the number of qubits in the quantum memory, our demonstration also provides a basis for other applications, including quantum money.

    I have no intelligent comments about this, and have no idea if the paper is interesting. It’s just a crazy long coherence time.
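
    For intuition about why dynamical decoupling buys coherence time (my sketch, not taken from the paper): a single spin-echo $\pi$ pulse at the midpoint of an interval flips the sign of subsequent phase accumulation, so a detuning $\delta(t)$ that is roughly constant over the interval contributes a net phase $\phi \approx \int_0^{\tau/2} \delta \, dt - \int_{\tau/2}^{\tau} \delta \, dt \approx 0$. Repeating this thousands of times cancels magnetic-field drifts slower than the pulse spacing, leaving only the faster (and typically much weaker) noise components to dephase the qubit; the sympathetic cooling by the Ba ion addresses the separate problem of motional heating degrading state detection.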

  • Several researchers, including Leonid Levin, Gerard ’t Hooft, and Stephen Wolfram, have argued that quantum mechanics will break down before the factoring of large numbers becomes possible. If this is true, then there should be a natural set of quantum states that can account for all quantum computing experiments performed to date, but not for Shor’s factoring algorithm. We investigate as a candidate the set of states expressible by a polynomial number of additions and tensor products. Using a recent lower bound on multilinear formula size due to Raz, we then show that states arising in quantum error-correction require $n^{\Omega(\log n)}$ additions and tensor products even to approximate, which incidentally yields the first superpolynomial gap between general and multilinear formula size of functions. More broadly, we introduce a complexity classification of pure quantum states, and prove many basic facts about this classification. Our goal is to refine vague ideas about a breakdown of quantum mechanics into specific hypotheses that might be experimentally testable in the near future.

    (Note that the ACM version is a much shorter “abstract”, missing most of the content.)
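
    To get a feel for this complexity measure (my example, not one from the paper): a product state like $|+\rangle^{\otimes n}$ is built from $n-1$ tensor products, and even the GHZ state $\frac{1}{\sqrt{2}}\left( |0\rangle^{\otimes n} + |1\rangle^{\otimes n} \right)$ needs only a single addition on top of two such products, so both have small “tree” descriptions. The result above says that certain codeword states arising in quantum error correction admit no comparable shortcut: any formula built from additions and tensor products needs size $n^{\Omega(\log n)}$ even to approximate them.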

  • The reason we never observe violations of the second law of thermodynamics is in part a matter of statistics: When $10^{23}$ degrees of freedom are involved, the odds are overwhelmingly stacked against the possibility of seeing significant deviations away from the mean behavior. As we turn our attention to smaller systems, however, statistical fluctuations become more prominent. In recent years it has become apparent that the fluctuations of systems far from thermal equilibrium are not mere background noise, but satisfy strong, useful, and unexpected properties. In particular, a proper accounting of fluctuations allows us to rewrite familiar inequalities of macroscopic thermodynamics as equalities. This review describes some of this progress, and argues that it has refined our understanding of irreversibility and the second law.

    (H/t Sean Carroll.) Jarzynski’s equality and the Crooks fluctuation theorem are recent and important strengthenings of the second law of thermodynamics. You can think of the second law as the lowest-order term in an expansion of Jarzynski’s equality, and the one term that survives the thermodynamic limit. In particular, Jarzynski’s equality bounds the probability of a fluctuation violating the second law as a function of the size of the fluctuation (a probability which goes to zero exponentially fast for macroscopic fluctuations). This introduction is elementary and excellent. The results can also be framed as a consequence of a more general result based merely on the Markov property. To go deeper, start reading Crooks’s thesis [PDF].
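
    For concreteness, here is my sketch of the standard argument (not a quote from the review): Jarzynski’s equality states that $\langle e^{-\beta W} \rangle = e^{-\beta \Delta F}$, where $W$ is the work performed during a nonequilibrium process and $\Delta F$ is the equilibrium free-energy difference between the endpoints. Jensen’s inequality, $\langle e^{-\beta W} \rangle \ge e^{-\beta \langle W \rangle}$, immediately gives the second-law statement $\langle W \rangle \ge \Delta F$, and Markov’s inequality gives the exponential bound on second-law-violating fluctuations: $P[W \le \Delta F - \zeta] \le e^{-\beta \zeta}$.
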
  • Yang-Mills for pedestrians.

3 Comments

  1. Symmetry breaking may be understood mathematically as the non-commutation of the following two limits:

     $\lim_{h\to0}\lim_{N\to\infty}\neq\lim_{N\to\infty}\lim_{h\to0},$

     where h is the external field that breaks the global symmetry, and N is the number of degrees of freedom. Bogoliubov pointed out that the correct order is to first take the thermodynamic limit and then the zero-field limit. I wonder if one can make a similar statement about the emergence of classical mechanics. For instance,

     $\lim_{\hbar\to0}\lim_{g\to0}\neq\lim_{g\to0}\lim_{\hbar\to0},$

     where g is the coupling between the system and the instrument / observer / bath.
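
     A toy function exhibiting this kind of non-commutation (my illustration, standing in for the magnetization): take $f(h, N) = \tanh(N h)$. Then $\lim_{h\to 0^{+}} \lim_{N\to\infty} f = \lim_{h\to 0^{+}} 1 = 1$, whereas $\lim_{N\to\infty} \lim_{h\to 0^{+}} f = \lim_{N\to\infty} 0 = 0$.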

    • Hmm. Is there a good cite for the Bogoliubov observation?

      (Btw, if you include “[latexpage]” (without the quotes) in a comment, the latex will be rendered. I’ve edited your comment to add it.)

      • Thanks for fixing the LaTeX. It’s not easy to dig out the original paper from Bogoliubov (written in Russian and published as an internal Dubna report). But the following historical note may be useful:

        https://arxiv.org/abs/1003.1363

        Bogoliubov pointed out that the naive Gibbs ensemble is problematic when spontaneous symmetry breaking (SSB) is present. The cure, known as “quasi-averaging” in the Russian literature, is to add an infinitesimal symmetry-breaking field. Berry’s essay makes me wonder how far one can push the analogy between SSB and the emergence of classical mechanics. Could the emergence of classical mechanics be understood as some sort of “symmetry breaking” among all possible quantum bases?
