Abstracts for October-November 2015

  • High energy particle colliders have been in the forefront of particle physics for more than three decades. At present the near term US, European and international strategies of the particle physics community are centered on full exploitation of the physics potential of the Large Hadron Collider (LHC) through its high-luminosity upgrade (HL-LHC). A number of the next generation collider facilities have been proposed and are currently under consideration for the medium and far-future of accelerator-based high energy physics. In this paper we offer a uniform approach to evaluation of various accelerators based on the feasibility of their energy reach, performance potential and cost range.
    (H/t Sabine.) The contraction has been happening for quite some time:

    maximum c.o.m. energy has drastically slowed down since the early 1990’s and the lepton colliders even went backwards in energy to study rare processes…Moreover, the number of the colliding beam facilities in operation has dropped from 9 two decades ago to 5 now…

  • Conditions for Quantum Violation of Macroscopic Realism
    Johannes Kofler and Časlav Brukner
Why do we not experience a violation of macroscopic realism in everyday life? Normally, no violation can be seen because of either decoherence or the restriction to coarse-grained measurements, both of which transform the time evolution of any quantum state into a classical time evolution of a statistical mixture. We find the sufficient condition for these classical evolutions for spin systems under coarse-grained measurements. However, there exist "nonclassical" Hamiltonians whose time evolution cannot be understood classically, even though at every instant of time the quantum state appears as a classical mixture. We suggest that such Hamiltonians are unlikely to be realized in nature because of their high computational complexity.
    Possibly relevant to set selection problem or, more specifically, what are the quasiclassical degrees of freedom?

    This is one of the rare papers[1] where I’ve seen someone clearly point out these two conceptually distinct but tightly related explanations for classicality:

    In our everyday life, to experience macrorealism it is usually sufficient to employ a certain type of decoherence (where the system is isolated [4] and only at the times of measurement the environment makes a premeasurement on the apparatus [5]) or the restriction of coarse-grained measurements [6–9].

    I don’t think these explanations are actually in tension, in the sense that the more one applies, the less it is necessary to appeal to the other in order to understand the emergence of classicality. More precisely, I think these are currently both “underspecified” explanations, and that a complete, precise understanding will be expressible in either language by making certain terms in those languages more exact.

    One can talk about decoherence of the system by an environment, but then one needs to answer: What are systems? What are the environments? Alternatively, one can talk about certain preferred variables that can only be followed in a coarse-grained manner, but then one needs to answer: What are the variables? What sort of coarse-graining is necessary and why? In both cases, the follow-up questions can be sort-of answered with appeals to intuition, but never with mathematical precision, so it largely just passes the buck.
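    The coarse-graining side of this can at least be made concrete in a toy model. Here is a minimal numerical sketch (my own illustration, not code from the Kofler–Brukner paper) of the basic setup: a large spin-j precesses under H = ωJx, and Jz is measured only in bins much wider than a single m step, so the quantum uncertainty is unresolvable. All parameter choices (j = 20, bin width ~√j) are my own assumptions for illustration.

    ```python
    import numpy as np

    # Toy illustration of coarse-grained spin measurements: a large spin-j
    # precessing under H = omega * Jx, with Jz read out only in coarse bins.

    def spin_ops(j):
        """Angular momentum matrices Jx, Jz in the |j, m> basis (m = j..-j)."""
        m = np.arange(j, -j - 1, -1.0)
        Jz = np.diag(m)
        # <j, m+1 | J+ | j, m> = sqrt(j(j+1) - m(m+1)), on the superdiagonal
        mp = m[1:]
        Jp = np.diag(np.sqrt(j * (j + 1) - mp * (mp + 1)), k=1)
        Jx = (Jp + Jp.conj().T) / 2
        return Jx, Jz

    j = 20
    Jx, Jz = spin_ops(j)

    # Spin coherent state pointing along +z, i.e. |j, m = j>
    psi0 = np.zeros(int(2 * j + 1), dtype=complex)
    psi0[0] = 1.0

    # Evolve under H = omega * Jx for time t (hbar = 1), by exact diagonalization
    omega, t = 1.0, 0.7
    evals, evecs = np.linalg.eigh(Jx)
    U = evecs @ np.diag(np.exp(-1j * omega * t * evals)) @ evecs.conj().T
    psit = U @ psi0

    # Fine-grained Jz statistics: one probability per sharp outcome m
    m_vals = np.arange(j, -j - 1, -1.0)
    probs = np.abs(psit) ** 2
    mean_Jz = float(m_vals @ probs)

    # Coarse-grain: lump outcomes into slabs of width ~sqrt(j), much wider
    # than single-m resolution, as in the coarse-grained-measurement picture
    bin_width = int(np.ceil(np.sqrt(j)))
    n_bins = int(np.ceil(len(m_vals) / bin_width))
    coarse = np.array([probs[k * bin_width:(k + 1) * bin_width].sum()
                       for k in range(n_bins)])

    # The mean tracks the classical precession j * cos(omega * t) exactly here
    print(mean_Jz, j * np.cos(omega * t))
    ```

    For a spin coherent state this mean follows the classical precession exactly; the interesting (and harder) part, which this sketch omits, is that under such coarse bins even superposition states become operationally indistinguishable from classical mixtures.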

  • I distinguish two types of reduction within the context of quantum-classical relations, which I designate “formal” and “empirical”. Formal reduction holds or fails to hold solely by virtue of the mathematical relationship between two theories; it is therefore a two-place, a priori relation between theories. Empirical reduction requires one theory to encompass the range of physical behaviors that are well-modeled in another theory; in a certain sense, it is a three-place, a posteriori relation connecting the theories and the domain of physical reality that both serve to describe. Focusing on the relationship between classical and quantum mechanics, I argue that while certain formal results concerning singular ℏ→0 limits have been taken to preclude the possibility of reduction between these theories, such results at most provide support for the claim that singular limits block reduction in the formal sense; little if any reason has been given for thinking that they block reduction in the empirical sense. I then briefly outline a strategy for empirical reduction that is suggested by work on decoherence theory, arguing that this sort of account remains a fully viable route to the empirical reduction of classical to quantum mechanics and is unaffected by such singular limits.

    This paper is very clear, and is very much in the spirit of my earlier post on casting quantum indeterminism as an anomaly.

  • Free Nano-Object Ramsey Interferometry for Large Quantum Superpositions
    C. Wan, M. Scala, G. W. Morley, ATM. A. Rahman, H. Ulbricht, J. Bateman, P. F. Barker, S. Bose, M. S. Kim
    We propose an interferometric scheme based on an untrapped nano-object subjected to classical gravity. The center of mass (CM) of the free object is coupled to its internal spin system magnetically, and a free flight scheme is developed in which the matter wave of the test object is split and merged in a double slit interferometry fashion. It shows the capability of generating a large spatially separated superposition of the composite system and consequently evidencing it via a Ramsey interferometry that reveals a gravity induced dynamical phase accrued solely on the spin. We find a remarkable immunity to the motional noise in the CM so that our scheme would work for a thermal initial state with moderate cooling. The mass independence of our scheme makes it viable for nano-object ensembles with a high mass variability. The 100 nm scale of spatial separation of the superposed components, as well as the high visibility of the resulting Ramsey interference over 100 μs, provides a route to test postulated modifications of quantum theory such as continuous spontaneous localisation.

    If feasible, this is big news for decoherence detection, since this experiment would be two orders of magnitude more sensitive than OTIMA in the coherent scattering regime.

We characterize the pointer states generated by the master equation of quantum Brownian motion and derive stochastic equations for the dynamics of their trajectories in phase space. Our method is based on a Poissonian unraveling of the master equation whose deterministic part exhibits soliton-like solutions that can be identified with the pointer states. In the semiclassical limit, their phase space trajectories turn into those of classical diffusion, yielding a clear picture of the induced quantum-classical transition.



  1. The first place I can remember reading this distinction coming up clearly is the dispute over Zurek and Paz’s Hyperion example.
