
Monday, November 14, 2016

Quick Physics Hits

* Torsion fields interacting with an expanded Higgs doublet could provide a dark matter candidate.

* Erik Verlinde is known for his conjecture that gravity is an emergent property of quantum mechanical entanglement, which would provide a natural and parsimonious first-principles theory of quantum gravity. Several of his new talks on the subject are mentioned in the link. Peter Woit has also chimed in on this work, saying that he doesn't understand the argument being made and doesn't yet see a case comparing it with the empirical evidence.
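For a flavor of what "emergent" means here, the heuristic entropic-force argument from Verlinde's earlier 2010 paper (my own summary from memory, not anything taken from the new talks) combines the Unruh temperature with a Bekenstein-style entropy gradient:

\[
T = \frac{\hbar a}{2\pi k_B c}, \qquad \Delta S = 2\pi k_B \frac{mc}{\hbar}\,\Delta x, \qquad F\,\Delta x = T\,\Delta S \;\Rightarrow\; F = ma,
\]

and an analogous counting of degrees of freedom on a holographic screen recovers Newton's inverse-square law. The new work pushes that line of reasoning considerably further.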

* The DUNE experiment, which is just getting started, should be able to shed light on the "two major unknowns in neutrino oscillation physics. These are [the] octant of θ23 (i.e. if θ23 is <45 or >45) and Dirac CP phase δCP."
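To see why those two quantities are hard to pin down, consider the leading-order vacuum oscillation formulas (standard textbook expressions, not taken from the DUNE paper):

\[
P(\nu_\mu \to \nu_\mu) \simeq 1 - \sin^2 2\theta_{23}\,\sin^2\!\left(\frac{\Delta m^2_{31} L}{4E}\right),
\qquad
P(\nu_\mu \to \nu_e) \simeq \sin^2\theta_{23}\,\sin^2 2\theta_{13}\,\sin^2\!\left(\frac{\Delta m^2_{31} L}{4E}\right) + \cdots
\]

The disappearance channel depends only on \(\sin^2 2\theta_{23}\), which is identical for \(\theta_{23} = 45^\circ \pm x\) (the octant ambiguity), while the leading appearance term scales with \(\sin^2\theta_{23}\) and the subleading interference terms hidden in the ellipsis carry the \(\delta_{CP}\) dependence. Measuring both channels over a long baseline is what lets DUNE address both unknowns.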

* The 4gravitons blog has correspondence from one of the authors of a paper casting doubt on the strength of the evidence for dark energy and cosmic acceleration, which argues that the evidence has been overstated through the use of inappropriate statistical methods by its proponents. The authors still find evidence for acceleration, but at slightly less than 3 sigma significance, as opposed to the more than 5 sigma discovery-level significance claimed by supporters of the theory. Thus, even if there is dark energy or a positive cosmological constant, its magnitude may be smaller than previously claimed. This would tend to bring the amount of dark energy in the universe closer to the share of the universe's total makeup attributed to dark matter, heightening the so-called "coincidence problem" and tending to favor theories in which the dark matter and dark energy phenomena have a common source.
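As a rough illustration of what such a downgrade means in likelihood-ratio terms (a minimal sketch of the standard asymptotic conversion, not the paper's actual analysis):

import math

# Under Wilks' theorem, a likelihood-ratio test statistic
#   q = 2 * [ln L(best fit) - ln L(no acceleration)]
# with one parameter of interest corresponds to roughly sqrt(q) Gaussian sigma.

def significance_sigma(two_delta_lnL):
    """Approximate Gaussian significance of a one-parameter likelihood-ratio test."""
    return math.sqrt(two_delta_lnL)

for q in (8.0, 25.0):
    print("2*Delta(lnL) = %4.1f  ->  about %.1f sigma" % (q, significance_sigma(q)))

# So "a bit under 3 sigma" versus "more than 5 sigma" is roughly the difference
# between the data preferring the accelerating fit by q of about 8 versus about 25.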

* Sabine Hossenfelder joins Steve Weinberg in expressing concern that the definition of a measurement that collapses the wave function in quantum mechanics is not yet satisfactory. As she explains:
My misgivings of quantum mechanics are pretty much identical to the ones which Weinberg expresses in his lecture. The axioms of quantum mechanics, whatever interpretation you chose, are unsatisfactory for a reductionist. They should not mention the process of measurement, because the fundamental theory should tell you what a measurement is.
She is a fan of superdeterminism.

* We have a good model for explaining the exclusive leptoproduction of the neutral rho meson, and it is confirmed experimentally. The neutral rho is normally described as a superposition of an up/anti-up quark pair and a down/anti-down quark pair (the flavor wave function is written out after this item). It is a vector meson (spin-1, negative P parity and negative C parity) with a mass of 775.49 ± 0.34 MeV/c².

It decays swiftly, with a mean lifetime of about 4.5 × 10⁻²⁴ seconds (the corresponding decay width is worked out below), to a positively and negatively charged pion pair about 99.9% of the time, and to an electron-positron pair or a muon pair (possibly in an interaction that also involves an additional photon) roughly 0.005% of the time each, with the helicity predicted by the Standard Model and QCD.

The neutral rho meson is one of several mesons that carry the residual strong force binding nucleons together in atomic nuclei.
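For reference, the standard flavor wave function and quantum numbers behind the description above (textbook values, not from the linked paper) are

\[
|\rho^0\rangle = \tfrac{1}{\sqrt{2}}\left(|u\bar{u}\rangle - |d\bar{d}\rangle\right), \qquad J^{PC} = 1^{--},
\]

and the quoted mean lifetime corresponds, via the uncertainty relation \(\Gamma \approx \hbar/\tau\), to

\[
\Gamma \approx \frac{6.58 \times 10^{-22}\ \mathrm{MeV\,s}}{4.5 \times 10^{-24}\ \mathrm{s}} \approx 146\ \mathrm{MeV},
\]

consistent with the measured full width of roughly 147 MeV.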

* In contrast, we struggle to explain the decay of a boson called X(3915) with QCD and the Standard Model. As the abstract of a November 11, 2016 preprint by P. Gonzales explains:
Strong decays of X(3915) are analyzed from two quark model descriptions of X(3915), a conventional one in terms of the Cornell potential and an unconventional one from a Generalized Screened potential. We conclude that the experimental suppression of the OZI allowed decay X(3915) → DD̄ might be explained in both cases as due to the momentum dependence of the decay amplitude. However, the experimental significance of the OZI forbidden decay X(3915) → ωJ/ψ could favor an unconventional description.
The myth that the Standard Model fully explains all experimental observations is particularly exaggerated in the case of QCD, in which there are many unexplained anomalies like this one. But nobody has proposed an alternative that does a consistently better job in all circumstances (although some alternatives to Standard Model QCD seem to do better in particular kinds of circumstances). It is also hard to say what true Standard Model QCD really predicts, because in practice everyone uses approximations: the Standard Model equations cannot, in general, be solved analytically in most circumstances that present themselves in real life. Existing methods get quite close to reality up to the limited accuracy of current calculations, but they don't reliably and consistently explain all observations in anything approaching a straightforward manner.

* The search for flavor changing neutral currents and for violations of the Standard Model's conservation of baryon number and lepton number continues to come up empty and produces ever tighter constraints. One way to search more accurately for lepton number violation in muon decays is to develop a more accurate Standard Model prediction for the process that provides most of the background in those searches; the improved calculation happens to predict a smaller background than lower order calculations had suggested. The same approach can also provide a template for predicting the tau lepton decays that contribute to the background in lepton number violation searches.

Together with neutrinoless double beta decay searches and proton decay searches, the evidence is overwhelming that baryon number and lepton number are perfectly conserved in Nature. But tightening these limits is important because they constrain beyond the Standard Model physics. There is a powerful desire, in formulating grand unified theories (GUTs) and theories of everything (TOEs), to allow baryon number and lepton number violations, because without them it is impossible to explain baryogenesis, leptogenesis, and the matter-antimatter asymmetry of the universe if one assumes that the Big Bang originated from pure energy.

No Standard Model process explains how the current universe could arise from that initial state, largely because of baryon number conservation, lepton number conservation, and the absence of any CP violating process that treats matter and antimatter differently to a strong enough degree. Naively, the Standard Model predicts that a universe starting from a pure energy state should give rise to equal amounts of matter and antimatter, separately on the quark side of the ledger and on the lepton side.

So, either (1) the assumption of a pure energy initial condition for the Big Bang is false (something that has always been entirely possible), or (2) there are new processes that do not conserve these quantities at energies beyond the domain of applicability of the Standard Model (in which case highly sensitive high energy collider experiments might glimpse them).

My own pet theory to resolve this is that the Big Bang gave rise to two separate universes, our own and another dominated by antimatter, in which the arrows of time and thermodynamics point away from the Big Bang in the direction opposite to ours, with the "bang" in the Big Bang caused by matter-antimatter collisions at the point of intersection as antimatter tries to make its way back through the singularity and matter tries to make its way forward from the universe on the other side of the Big Bang. (I recognize this has no real solid support, but it is parsimonious and simple compared to many of the equally speculative alternatives.)
