Several recent papers have confirmed the BMW calculation of the hadronic contribution to muon g-2 (the anomalous magnetic moment of the muon), whose Standard Model prediction is consistent with the experimental result. These papers show strong tensions with the Theory Initiative calculation of the Standard Model value of the hadronic contribution to muon g-2, which is based on electron-positron collision data.
A new preprint today continues that trend.
Why Care?
For those of you who haven't been paying attention, this is a big deal because muon g-2 is an observable that provides a global test, at extreme precision, of the consistency of lower energy Standard Model physics with reality. It implicates all three of the Standard Model forces, although, predictably (since QCD calculations, which involve the strong force, are always the least precise), the greatest uncertainty is in the QCD part of the calculation, even though that part is responsible for only a very small share of the aggregate value of muon g-2.
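For concreteness, the quantity at issue is defined and decomposed as follows (the magnitudes quoted just below are approximate and are not from the preprint):

```latex
a_\mu \equiv \frac{g_\mu - 2}{2}, \qquad
a_\mu^{\rm SM} = a_\mu^{\rm QED} + a_\mu^{\rm EW} + a_\mu^{\rm HVP} + a_\mu^{\rm HLbL}
```

The total is roughly 1.166 × 10⁻³, of which the hadronic vacuum polarization (HVP) term contributes only about 7 × 10⁻⁸, on the order of 60 parts per million of the whole, yet it dominates the theoretical uncertainty, and it is the term on which BMW and the Theory Initiative disagree.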
If the correct Standard Model prediction matches the experimental result, which it would if the BMW calculation is correct, then beyond the Standard Model physics is extremely tightly constrained and possibilities like electroweak scale supersymmetry are ruled out. In this case, most beyond the Standard Model physics would have to be at energy scales sufficiently in excess of those implicated in the material terms of the muon g-2 calculation to allow the new physics to "decouple" from this calculation. So, even if there is new physics out there to be discovered, it isn't likely to appear at the next generation particle collider.
On the other hand, if the correct Standard Model prediction differs significantly from the experimental result, which it would if the Theory Initiative calculation were correct, then new physics beyond the Standard Model would have to be just around the corner: likely visible at a next generation particle collider and, since it hasn't been definitively discovered yet, hinted at in the highest energy data from the current Large Hadron Collider (LHC). Moreover, the magnitude of the deviation from the Standard Model would be quantified quite precisely, further narrowing the range of possible BSM theories.
The Theory Initiative paper's data driven approach, if it is flawed, is most likely to be wrong either because the data were inserted into an otherwise theoretical calculation incorrectly in some subtle respect, or because the systematic uncertainty in the data it relies upon was understated.
It also bears noting that the Theory Initiative's discrepancy with the BMW value of the Standard Model prediction is confined to the QCD portion of the calculation. But despite the fact that discrepancies between theory and experiment are most common in the various methods of operationalizing QCD calculations, very few theories propose that the QCD portion of the Standard Model is what needs to be tweaked with new physics. Instead, almost all of the scientific debate is over how best to do calculations with a theory that is profoundly challenging to calculate with.
Most of the theoretical proposals to reconcile the experimentally measured value of muon g-2 with the Theory Initiative's calculation involve particles and forces beyond the three Standard Model components, and assume that any new physics, if present, doesn't contaminate the electron-positron collision data used to make the data based estimate of the hadronic part of the muon g-2 calculation.
But, instead, the replications of the BMW calculation of the hadronic component of muon g-2 produce a strong tension between the estimate based upon the electron-positron collision data (which the Theory Initiative assumes is free of new physics) and the first-principles lattice QCD calculations done by BMW and, increasingly, by the multiple other groups confirming BMW's result. If the Theory Initiative's assumptions about its data driven methods are correct, this is an apples to apples comparison.
On the other hand, if the BMW group has done its calculations right, if the Theory Initiative is relying on data with a correctly estimated magnitude of systematic error, and if it has integrated this data correctly into the overall calculation, then it would seem that the electron-positron collision data itself is defying the Standard Model, but in a way that somehow doesn't manifest with muons.
Increasingly, conventional wisdom is starting to conclude that one (or both) of those possible flaws in the Theory Initiative's approach is real, and that the BMW calculation of the expected Standard Model value of muon g-2 is correct, in which case we are in a "physics desert."
Other Tests Of The Standard Model
The Particle Data Group's comprehensive assembly and organization of particle physics data directly rules out, experimentally, all manner of specific possible deviations from the Standard Model and beyond the Standard Model theories. Usually, these exclusions aren't absolute, but the parameter space for new physics below the 1 TeV energy scale (and sometimes beyond it) has been profoundly narrowed by direct exclusions in the LHC data.
Muon g-2 isn't the only global test of the Standard Model that is reinforcing this conclusion either.
The branching fractions of Higgs boson decays are another global test of the Standard Model, since any beyond the Standard Model particle that acquires its mass via the Higgs mechanism, and that is not greatly above the energy scale of a top quark-antiquark pair (about 350 GeV), would greatly change all of the branching fractions. But the more measurements the LHC makes, the closer the properties of the Higgs boson come to those predicted by the Standard Model, and the less room there is for novel decays not predicted by it.
The decays of the W and Z bosons, likewise, have long closely confirmed the Standard Model's predictions and are a global test of the set of Standard Model particles that interact via the weak force (i.e. all of the massive fundamental particles of the Standard Model, but not gluons and photons).
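As a back-of-the-envelope illustration of why equal leptonic branching fractions are expected no matter how the W is produced (my own sketch, not drawn from any particular paper), a simple tree-level channel count suffices:

```python
# Tree-level counting of W boson decay channels (illustrative sketch;
# ignores QCD and phase space corrections).
lepton_channels = 3      # e nu, mu nu, tau nu: one per lepton flavor
quark_channels = 2 * 3   # (u d'), (c s') pairs, each in 3 colors
total_channels = lepton_channels + quark_channels  # = 9

per_lepton_flavor = 1 / total_channels
print(f"tree-level B(W -> l nu) per flavor: {per_lepton_flavor:.3f}")  # ~0.111

# The measured leptonic branching fractions (roughly 10.6%-11.4% each,
# per PDG averages) are equal within errors, i.e., lepton universality
# holds in W decays.
```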
The deviations between the experimental values of the CKM matrix entries and the theoretical expectation that the probabilities of any given quark transitioning via a W boson to each of the three other possible quarks should sum to 100% (i.e., that the CKM matrix is unitary) are manageable, although there are some mild tensions.
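To make the "sum to 100%" statement concrete, here is a minimal first-row unitarity check; the |V| magnitudes below are approximate, round-number stand-ins for the current PDG fits:

```python
# First-row CKM unitarity check (illustrative; replace these approximate
# magnitudes with current PDG values for a real test).
V_ud, V_us, V_ub = 0.9737, 0.2243, 0.0038

row_sum = V_ud**2 + V_us**2 + V_ub**2
print(f"|V_ud|^2 + |V_us|^2 + |V_ub|^2 = {row_sum:.4f}")  # ~0.9984

# A sum a couple of standard deviations below 1.0 is the kind of mild
# tension referred to above; its significance depends entirely on the
# estimated uncertainties in the inputs.
```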
Among other things, all of these global measures strongly disfavor a fourth generation of Standard Model particles, additional heavy gauge bosons like a W' or Z', and extra Higgs bosons.
As we better understand how mass is generated by gluons in QCD, we are finding that our QCD methods are sound. See, e.g., here (a global review) and here (noting that omitting third-loop contributions from QCD calculations produces smaller errors than had previously been expected in calculations where energy scales are low enough for bottom quark contributions to be ignored). We are also finding that QCD bound structures more complex than two valence quark mesons and three valence quark baryons, which are theoretically possible in QCD but had not been definitively identified until the last decade or so, really do exist.
There are some remaining anomalies in particle physics that are obvious measurement errors, like the outlier recalculated W boson mass from old CDF experiment data collected at Fermilab (even if we can't yet tell exactly what the source of this error is), the discrepancy in the neutron lifetime between two measurement methods, and Russian false alarms of neutrinoless double beta decay detections that have been repeatedly contradicted by experiments everywhere else.
Limits on Lorentz invariance and CPT violations also continue to be very strict. See also here.
There Is Only One Credible Anomaly Left
So, we are left with only one really credible remaining anomaly, which is the apparent violation of charged lepton universality in semi-leptonic B meson decays. Figuring out what is going on there is challenging.
It could be that there are flaws in the calculation of the Standard Model prediction, which may ignore a mass dependent source of differences between decays to tau leptons, muons, and electrons (and their antiparticles), perhaps by overlooking some theoretically sound but neglected process that, when you calculate it, turns out to be more important than believed.
The hypothesis that omitted processes are responsible for the apparent violation of lepton universality is further supported by the fact that, when lepton universality is apparently violated, the excess is always in the less massive charged leptons rather than the heavier ones. This suggests an additional omitted process that generates enough mass-energy in the end state to produce lighter, but not heavier, charged leptons.
For example, if the process produces fewer tau leptons than muons, the muons produced, up to roughly the number of tau leptons produced, probably come from the main W boson decay considered in the predicted ratio, while the excess of muons over the number of tau leptons produced may come from some other process that doesn't have the 1.78 GeV of mass-energy needed to produce a final state tau lepton.
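A toy numerical version of this threshold argument (the masses are approximate PDG values; the 1.0 GeV energy budget for the hypothetical secondary process is a made-up illustrative number):

```python
# Toy kinematic-threshold check for charged lepton production
# (masses in GeV, approximate PDG values).
m_e, m_mu, m_tau = 0.000511, 0.1057, 1.777

def open_channels(available_energy_gev):
    """Return the charged leptons that a process with this much spare
    mass-energy in the final state could produce."""
    return [name for name, mass in
            [("e", m_e), ("mu", m_mu), ("tau", m_tau)]
            if available_energy_gev > mass]

# Roughly the B-to-D mass difference, so the main W-mediated decay
# can reach all three charged lepton flavors:
print(open_channels(3.4))   # ['e', 'mu', 'tau']
# A hypothetical secondary process with a smaller energy budget
# could produce electrons and muons, but no taus:
print(open_channels(1.0))   # ['e', 'mu']
```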
It could be that the experiments aren't actually distinguishing between the backgrounds they are trying to exclude and the signals they are looking for in the way that they believe they are. It could also be that the results are simply due to some other experimental systematic error, or to statistical flukes of the kind expected given the look elsewhere effect.
The biggest issue with this anomaly, in my view, is that in the Standard Model, the leptons in a semi-leptonic decay of a hadron are always produced via the production and subsequent decay of a W boson, and W bosons should decay in the same way no matter how they are produced. But in every other circumstance in which we observe semi-leptonic or fully leptonic decays of hadrons via an intermediate W boson, of which there are perhaps half a dozen, lepton universality is observed.
So, any explanation of lepton universality violations in semi-leptonic B meson decays has to preserve the lepton universality seen in all other W boson mediated processes, and shouldn't be due to some new property of the W boson, a particle that is at the root of most of the other complexities of the Standard Model.
My Bayesian priors strongly favor the prediction that, somehow or other, these lepton universality violations will go away, leaving the Standard Model completely successful in all circumstances.
Over my decade and a half of carefully following the field (which is itself younger than I am), all sorts of anomalies have been touted only to be resolved without new physics: the muonic proton radius, the superluminal neutrino, the 750 GeV particle, the 20something GeV particle, the muon g-2 anomaly, and the reactor anomalies in neutrino physics that led to the hypothesis that there are sterile neutrinos. I'm sure that I've omitted half a dozen less notable ones.
Really, the only truly "new physics" discovered in the last forty years is that neutrinos have mass and oscillate (which, of course, has largely been accepted as part of the core theory of fundamental physics by now, even though not all of the details have been worked out completely).
Similarly, "dark energy" while it seems mysterious, is the Lambda in the Standard Model of Cosmology's LambdaCDM model, and the cosmological constant in the equations of general relativity. It doesn't require new physics because it is already part of gravity.
I'd be remiss, of course, if I didn't mention the biggest anomaly of all, which hasn't shown up at any particle collider: dark matter phenomena.
As I haven't been shy in mentioning, I think that Alexandre Deur has it right: dark matter phenomena (and indeed dark energy phenomena too) are due to the self-interaction of the gravitational fields present in plain old classical General Relativity, without a cosmological constant, which has been around for more than a century. This effect has been widely overlooked, but no one has rebutted it for more than a decade, because everyone else is using a Newtonian approximation in astronomy and cosmology applications where it is inappropriate to do so.
This is quite an out-on-a-limb position. It has only one fundamental experimentally measured physical constant (Newton's constant G), which has already been measured to the parts-per-ten-thousand level. You are stuck with one set of equations (although they can be expressed in more than one way) that have been set in stone for more than a century.
Nobody disputes that gravitational field self-interaction is a thing and leads to non-linear effects in general relativity, as is routinely considered in strong gravitational fields. But the justification for the Newtonian approximation in the physics of galaxies and larger structures is very shallow and back-of-napkin in character for mass-energy distributions that aren't spherically symmetric, and much of the universe is not spherically symmetric at the scale of galaxies, galaxy clusters, and the cosmic web.
But, amazingly, it seems to work: replicating the results of MOND in spiral galaxies, doing MOND one better in the case of galaxy clusters, and refining it slightly in the case of elliptical galaxies.
Certainly, the cold dark matter that is the CDM of the LambdaCDM model is wrong. Not every single other option has been ruled out, but all of the others posit either a fifth force (whether a dark matter self-interaction term or a very weak interaction with ordinary matter), or quantum behavior associated with exceedingly light dark matter particles (which, in the case of axion-like particle theories, converges on a quantum gravity variation of General Relativity in which ultra-low energy graviton self-interactions produce the same results). So, all of the dark matter particle alternatives eventually start looking like gravitational modifications anyway, and they still struggle to reproduce what we observe.
But, if I'm wrong, I think it is more likely that General Relativity needs an actual slight modification in weak fields than that dark matter particles exist, even though the latter is the dominant paradigm right now. MOND works too well and too consistently for vanilla CDM to be right.