The latest results reaffirm the conclusion that lepton universality (i.e. the principle that electrons, muons, and tau leptons are identical except for their masses, which is part of the Standard Model of particle physics) holds in all interactions experimentally measured to date. Earlier tensions with this conclusion were due to a flawed analysis of the data.
This is a particular blow to proposals for leptoquarks or vector-like quarks that had been put forward to address the lepton universality violation anomalies that previously seemed to be present.
The measurement of the R(Xe/µ) ratio in the region p(l)∗ > 1.3 GeV is 1.033 ± 0.010stat ± 0.020sys, which is compatible with the Standard Model prediction of 1.006 ± 0.001 within 1.2σ [13]. At the time of presentation, this result was the most precise branching-fraction based test of lepton flavour universality, since superseded by the latest R(K), R(K∗) measurement from LHCb [14]. Importantly, the R(Xe/µ) result is compatible with the 2019 exclusive result from Belle on R(D∗ e/µ), which was measured to be 1.01 ± 0.01stat ± 0.03sys [15].
From the body text of Priyanka Cheema, "Semileptonic and Leptonic B Physics at Belle II" arXiv:2303.01730 (March 3, 2023) (conference paper in advance of PRL publication).
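The quoted 1.2σ compatibility can be reproduced by combining the quoted uncertainties in quadrature; a minimal sketch in Python, using the central values and errors from the excerpt above:

```python
import math

# Belle II measurement of R(Xe/mu) and the Standard Model prediction,
# central values and uncertainties as quoted in the excerpt above
measured, stat, syst = 1.033, 0.010, 0.020
predicted, pred_err = 1.006, 0.001

# Combine statistical, systematic, and prediction uncertainties in quadrature
total_err = math.sqrt(stat**2 + syst**2 + pred_err**2)
tension = abs(measured - predicted) / total_err

print(f"{tension:.1f} sigma")  # -> 1.2 sigma
```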
The Big Picture In High Energy Physics
There is very little room left to deviate from the Standard Model.
The demise of the lepton universality violation anomaly in the experimental data coincides with the probable disappearance of the muon g-2 anomaly due to better calculations of the Standard Model expectation, and with data that continue to disfavor and virtually rule out sterile neutrino theories. And, of course, there is the W boson mass anomaly based upon reanalyzed Tevatron data, which was never credible, as further analysis is revealing, since it is so out of step with all other high precision measurements of that particle's mass, in part because the Tevatron reanalysis used a subtly atypical definition of the W boson mass.
Of these, the experimental confirmation of the improved lattice QCD calculations of muon g-2 is perhaps the most important, because muon g-2 is a global measure of low to medium energy scale deviations from the Standard Model, at least up to the energy scales that can be reached by a next generation particle collider.
Similarly, after a decade of Higgs boson experimentation at the Large Hadron Collider (LHC), all of the data are consistent with a Standard Model Higgs boson with a mass of about 125 GeV, although uncertainties in some of the measurements leave wiggle room for alternatives, wiggle room that is narrowed incrementally by the results released annually from each new LHC run. There have been no statistically significant signs of additional Higgs bosons. The measured Higgs boson mass is also significant because its value prevents a mathematical breakdown of the Standard Model calculations up to the "grand unification theory" (GUT) scale and implies at least a "metastable" vacuum (i.e. one where a vacuum collapse of the universe could happen, but is mathematically unlikely to occur on timeframes comparable to the current lifetime of the universe).
Slowly but surely, the minimum experimentally supported lifetime of the proton (which is completely stable in the Standard Model) and the maximum frequency with which neutrinoless double beta decay can occur (a process which is impossible in the Standard Model) are getting ever closer to the Standard Model expectation. The absence of proton decay at the levels probed so far ruled out the simplest grand unified theories of particle physics long ago.
The non-detection of neutrinoless double beta decay to date (the rate of which is a function of Majorana neutrino mass if neutrinos have Majorana mass, and which does not occur at all if neutrinos have Dirac mass) is so far consistent with three other lines of evidence: neutrino oscillation data suggesting very small (meV scale) mass differences between the three neutrino mass eigenstates; cosmology data suggesting that the sum of the three neutrino masses is close to the minimum allowed by neutrino oscillation data; and direct measurements of neutrino masses, which, while much less strict than the other methods mentioned above, still constrain the lightest neutrino mass to be significantly less than 1 eV at the 95% confidence level, with best fit values that are smaller still.
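The "minimum allowed by neutrino oscillation data" can be made concrete with a back-of-the-envelope calculation, assuming a normal mass ordering and approximate (illustrative, not authoritative) values for the mass-squared splittings:

```python
import math

# Approximate oscillation-data mass-squared splittings (illustrative values, eV^2)
dm21_sq = 7.4e-5   # "solar" splitting
dm31_sq = 2.5e-3   # "atmospheric" splitting, normal ordering assumed

# Minimum-sum scenario: the lightest mass eigenstate m1 is exactly zero
m1 = 0.0
m2 = math.sqrt(dm21_sq)
m3 = math.sqrt(dm31_sq)

print(f"minimum sum of neutrino masses ~ {m1 + m2 + m3:.3f} eV")  # ~ 0.059 eV
```

Cosmology bounds on the sum of the neutrino masses are already pressing down toward this roughly 0.06 eV floor.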
High energy physics (HEP) is not entirely free of tensions between experiment and Standard Model predicted values, particularly with respect to the consistency of the measured CKM matrix element values with each other (there are nine experimental observables that the Standard Model proposes can be summarized with four parameters that yield unitary weak force flavor transition probabilities for quarks) and when it comes to the properties of exotic hadrons.
But, none of these tensions have a character or magnitude that is widely seen in the HEP community as a likely sign of beyond the Standard Model physics.
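The best known of these CKM consistency tensions, the first-row unitarity deficit, is easy to illustrate. A minimal sketch using rough central magnitudes for the first-row elements (illustrative values only; the precise values and their uncertainties, which are what make this a mild tension rather than a discovery, are not reproduced here):

```python
# First-row CKM unitarity test: in the Standard Model the squared magnitudes
# of the first-row elements must sum to exactly 1.
# Illustrative central values (approximate, uncertainties omitted).
Vud, Vus, Vub = 0.97373, 0.2243, 0.00382

row_sum = Vud**2 + Vus**2 + Vub**2
print(f"|Vud|^2 + |Vus|^2 + |Vub|^2 = {row_sum:.4f}")  # ~ 0.9985, slightly below 1
```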
However, there is currently also sufficient uncertainty in the two dozen or so experimentally measured parameters of the Standard Model (fifteen masses, four CKM matrix parameters, four PMNS matrix parameters, and three coupling constants, for a total of 26 parameters, with a few fewer degrees of freedom since some of these experimentally measured parameters are functionally related to each other) that it is impossible to distinguish between a variety of "within the Standard Model" additional functional relationships between these parameters that could greatly reduce the number of degrees of freedom in the Standard Model. There are also a few experimentally fixed parameters, like Planck's constant, the magnitude of the electron charge, and the speed of light (now used to define units of measurement, at a value equal to the most accurate measurements made before it was fixed by definition), which, while they are not normally considered Standard Model parameters, are necessary to do Standard Model physics.
Most extrapolations of the Standard Model to very high energies (its parameters vary with energy scale according to equations called beta functions that can be determined exactly from theory) also exclude the impact of unifying the Standard Model with gravity, which in a quantum gravity theory should have a subtle but exactly calculable impact on the very high energy running of the Standard Model physical constants. This is a known flaw, but one that can be ignored at experimental energy scales given current precisions.
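As a concrete example of running governed by a beta function, the one-loop evolution of the strong coupling constant can be sketched as follows (a simplified illustration that ignores quark-mass thresholds and higher-loop terms, so the numbers are approximate):

```python
import math

# One-loop running of the strong coupling alpha_s.
# Illustrative only: ignores flavor thresholds and two-loop-and-beyond terms.
alpha_s_mz = 0.118   # alpha_s at the Z boson mass, approximate
m_z = 91.19          # Z boson mass in GeV
n_f = 5              # active quark flavors

# Leading beta-function coefficient for QCD
b0 = (33 - 2 * n_f) / (12 * math.pi)

def alpha_s(q_gev):
    """One-loop evolved strong coupling at energy scale q_gev (in GeV)."""
    return alpha_s_mz / (1 + b0 * alpha_s_mz * math.log(q_gev**2 / m_z**2))

print(f"alpha_s(1 TeV) ~ {alpha_s(1000):.3f}")  # -> alpha_s(1 TeV) ~ 0.088
```

The coupling shrinks as the energy scale rises (asymptotic freedom); the quantum-gravity correction mentioned above would modify this running only subtly at scales far beyond any collider.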
In short, there is no strong high energy physics motivation at this time for beyond the Standard Model physics of any kind. Phenomenology efforts to suggest new physics are mostly of the "God of the gaps" variety, exploiting experimental results that are fully consistent with the Standard Model but, due to their uncertainties, can tolerate moderate deviations from it.
In all probability, the next generation collider will not have high enough energies to detect any new physics that existing experiments don't already strongly disfavor. Existing constraints strongly disfavor new physics at least up to energy scales about a hundred times as great as those of the LHC. And there are basically no hints that there will be anything but a new physics desert anywhere between the energies of the LHC and the GUT scale. Nothing we observe positively suggests any need or motivation for new physics at these scales.
We are firmly ensconced in what particle physicists call the "nightmare scenario" - meaning one in which there are no prospects for them to discover new physics in the foreseeable future.
Really, the best they can do at this point is to better grasp the strong force (QCD) physics of extremely ephemeral and rare exotic hadrons like tetraquarks, pentaquarks, hexaquarks, and top quark hadrons (which are so improbable that they are virtually impossible to create). And this is mostly a question of more precise measurements and better computing, not of higher energy colliders.
Similarly, bridging the gap at a first principles level from hadrons like protons, neutrons, and pions to the complex assemblies of hadrons found in nuclear physics is a long term project that we are not particularly close to achieving. But the laws of physics and relevant fundamental parameters involved are known perfectly for all practical purposes for these kinds of inquiries, except that more precise measurements of the Standard Model parameters (especially the strong coupling constant and some of the quark masses) would help.
There are likewise parameters of neutrino physics to be measured more precisely. But this situation should be dramatically improved in a decade or two by experiments currently in the pipeline.
The Situation In Astrophysics
The situation in astrophysics is quite different.
On one hand, the standard version of General Relativity with a cosmological constant, as used in the LambdaCDM Standard Model of Cosmology, has only two experimentally measured physical constants that go into this law of nature: Newton's constant G (known to a part per 10,000 or so), and the cosmological constant (known much less precisely). The free parameters of the relevant laws of physics are few and don't need a lot of tuning.
But the data tell us that General Relativity with a cosmological constant as conventionally applied with ordinary matter alone is extremely inconsistent with what astronomers observe. There is either something wrong with our theory of gravity as conventionally applied, or there is a lot of dark matter out there.
Astronomy observations fit to the simple six parameter LambdaCDM model (plus a couple of parameters for a massive neutrino extension of it) can fix the relative proportions by mass-energy equivalent of ordinary matter, dark matter, neutrinos, and dark energy (a.k.a. the effects of the cosmological constant), as well as elucidating the topology of the universe (very nearly Euclidean).
But, while LambdaCDM is a decent first order approximation of our universe's cosmology, upon closer inspection it starts to fall apart.
Its simplest form with sterile collisionless dark matter is inconsistent with galaxy dynamics, produces galaxies too late, gets key aspects of galaxy cluster dynamics wrong, and has other more or less independent shortcomings, more recently noticed or more technical, that amount to a dozen or so in all. Simple variations, like very light dark matter particles with quantum properties that impact their dynamics, and self-interacting dark matter models, don't solve all of these problems. Yet departures from truly sterile dark matter should have been observed more directly than they have been thus far, and this has greatly narrowed the parameter space of almost all of the leading dark matter candidates.
Also, in the LambdaCDM model, the Hubble constant should actually be constant, and the large scale structure of the universe should be homogeneous and isotropic. But all three of those assumptions are increasingly in tension with observation. Early 21cm background radiation measurements are also inconsistent with dark matter, although serious questions have been posed as to the accuracy of these early observations.
The supersymmetric thermal WIMP candidate for dark matter at the electroweak mass scale, which was the original prime candidate for dark matter, has been basically ruled out. Warm dark matter and sterile neutrinos are likewise in deep trouble. MACHOs (Massive compact halo objects) were ruled out decades ago. Primordial black holes and exotic hadron dark matter particles made out of ordinary matter are also both basically ruled out as well. The viable parameter space for self-interacting dark matter is small, and it faces serious challenges at solving the problems it was devised to solve.
Also, once you get beyond a single class of sterile dark matter particles, Occam's Razor ceases to favor dark matter particle explanations for dark matter phenomena. If the single sterile particle model doesn't work, dark matter particle theories are introducing not just a new particle but a new force.
The hypothetical X17 boson was proposed based upon apparent deviations between the predicted and observed angles of decay products in certain nuclear decays that are very hard to model correctly with QCD (an accurate but low precision part of the Standard Model). Even if it existed, it wouldn't have the right properties to be a viable dark matter candidate. There is very good reason so far to believe that the X17 boson does not exist, although a couple of experiments in the near future should confirm this conclusion.
Axion-like particle (ALP) dark matter is much less well motivated than its proponents claim, because the case for the axion itself, a particle that was invented to explain why the strong force is indifferent to the direction of time, is weak and ill-motivated. By process of elimination, however, this ultralight bosonic dark matter candidate (whose particles start to approach the minuscule mass-energy of gravitons) is perhaps the least definitively constrained or ruled out dark matter particle candidate. To be perfectly honest, however, I haven't followed the various papers narrowing the ALP parameter space, largely because the exclusions in individual experiments and papers are so small, and because they are highly model dependent, which makes them hard to compare. Most ALP models, however, propose particles that ought to be relatively easy to detect experimentally with small to medium budget instruments, if you have an experiment tuned to the right ALP properties.
We've now reached the point where gravity based solutions are starting to look good by comparison.
The most well known gravitational tweak, MOND (for Modified Newtonian Dynamics), performs remarkably well in the weak field regime of galaxy scale systems. It improves on baryonic-matter-only dynamics for galaxy clusters and large scale structure, but doesn't completely solve those problems. The right way to generalize it relativistically isn't clear, although progress has been made in explaining the impossible early galaxy problem and the cosmic microwave background patterns without dark matter particles using relativistic generalizations.
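MOND's weak field success boils down to one empirical relation: in the deep-MOND limit, galaxy rotation curves flatten out at a velocity set by v⁴ = G·M·a0, the baryonic Tully-Fisher relation, where a0 is MOND's empirical acceleration scale. A minimal sketch with illustrative numbers (a Milky-Way-like baryonic mass, chosen for this example):

```python
# Deep-MOND prediction for the flat rotation velocity of a galaxy:
# v^4 = G * M_baryonic * a0 (the baryonic Tully-Fisher relation).
# Illustrative inputs only; a0 is MOND's empirical acceleration scale.
G = 6.674e-11        # m^3 kg^-1 s^-2
a0 = 1.2e-10         # m/s^2
m_sun = 1.989e30     # kg

m_baryonic = 1e11 * m_sun  # a roughly Milky-Way-like baryonic mass (assumption)

v_flat = (G * m_baryonic * a0) ** 0.25
print(f"predicted flat rotation velocity ~ {v_flat / 1e3:.0f} km/s")  # ~ 200 km/s
```

The striking point is that a single universal constant a0, with no per-galaxy dark matter halo fitting, lands on realistic rotation velocities.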
There are also several gravity based approaches, like Deur's work considering general relativity self-interactions in the weak gravitational fields of large systems, Moffat's MOG theories, emergent gravity, conformal gravity, non-local gravity, and some other proposals (which I've explored less deeply at this blog) that call attention to usually neglected GR effects or modify gravity. These show promise of having easier relativistic generalizations and a broader range of applicability than MOND while reproducing its successes. Many of these are promising but starved for attention, although some early efforts like TeVeS, one of the first relativistic generalizations of MOND, have lost some of their shine.
Other proposals for gravity modification, like massive gravity, are qualitatively strongly disfavored, even though they are interesting models to consider in order to understand relativistic physics.
The good news for astrophysics is that we have a torrent of data gushing in from improved observational instruments (crudely "telescopes" in the most general sense of the word) and improved analytical tools to crunch this big data. So, we can afford to let the data guide the way forward for us and can try all of the approaches to gather different kinds of data in different ways. Gravitational wave detectors are the latest addition to the types of "telescopes" that are out there.
There are also big picture theoretical issues to deal with in astrophysics. Classical GR is not easily made consistent with the quantum physics foundations of the Standard Model, which would be much easier to integrate with a quantum gravity model that either quantizes space-time or has a force-carrying massless spin-2 graviton, or both. And there are well known practical problems with doing Standard Model type calculations with massless spin-2 gravitons in non-trivial cases, especially at higher energies, because hypothetical gravitons would have self-interactions (more technically, the theory is non-linear and non-Abelian), and because the naive theory is "non-renormalizable," i.e. it lacks the methodological trick that makes other Standard Model physics problems mathematically tractable.
We have no strong observational cues to tell us if gravity really is quantum or classical in character, and conventional wisdom holds that distinguishing the differences between the two is very hard, especially outside ultra-strong gravitational fields like those found near black holes and at the time of the Big Bang.
My strong intuition is that just a very subtle tweak to General Relativity as conventionally applied, probably without a cosmological constant, like Deur's work on gravity, can solve all of the outstanding problems associated with dark matter, dark energy, and more. It can do this without new particles or truly new fields. It can also be generalized to a quantum theory much more easily than GR with a cosmological constant.
The calculation issues associated with a non-Abelian, non-renormalizable theory are surmountable, as illustrated by the successes of the "gravity as QCD squared" paradigm and by other methodological tools that are just starting to get more attention, like lattice quantum gravity, mean field approximations, and scalar field approximations with higher order refinements.
There are likewise good theoretical foundations for most of the open questions in cosmology, some of which arguably aren't even true "scientific" questions and instead verge on philosophy.
We aren't there yet, and it may take the deaths of the current generation of scientists, replaced by a new generation with more up to date assessments of the state of the data and an openness to new world views and paradigms, to get there.
But, I'm an optimist. I've seen glimpses of the promised land from the border of that era and I'm confident that eventually, we will get there, and leave the "dark ages" of astrophysics and cosmology behind us for good, if not in my children's lives, in the lives of my grandchildren.
Another cause for optimism is that even if human physicists, with disciplinary sociological blinders that deter them from discovering the truth, flounder, sociology-free machine learning approaches to deducing physical laws may make progress where human scientists fail.
Footnote
The relatively large mass of the Higgs boson, mH≃125 GeV coupled with the (as yet) lack of discovery of any supersymmetric particle at the LHC, has pushed the supersymmetry breaking scale to several TeV or higher.
From here.
still waiting for x17
Investigation of a light Dark Boson existence: The New JEDI project
Beyhan Bastin, Jürgen Kiener, Isabelle Deloncle, Alain Coc, Maxim Pospelov, Jaromir Mrazek, Livio
Published online: 3 February 2023
Abstract
Several experiments around the world are looking for a new particle, named the Dark Boson, which may provide the link between Ordinary Matter (which basically forms stars, planets, interstellar gas...) and the Hidden Sectors of the Universe. This particle, if it exists, would act as the messenger of a new fundamental interaction of nature. In this paper, the underlying Dark Sectors theory will be introduced first. A non-exhaustive summary of experimental studies carried out to date and foreseen in the coming years will be presented after, including the 8Be anomaly. The last section will provide a status of the New JEDI project, which aims to investigate the existence or not of a Dark Boson in the MeV range.
https://www.epj-conferences.org/articles/epjconf/abs/2023/01/epjconf_enas112023_01012/epjconf_enas112023_01012.html
arXiv:2302.13281 (hep-ex)
[Submitted on 26 Feb 2023]
A Time Projection Chamber to Search for Feebly Interacting Bosons via Proton Induced Nuclear Reactions
Martin Sevior, Michael Baker, Lindsey Bignell, Catalina Curceanu, Jackson T.H. Dowie, Tibor Kibedi, David Jamieson, Andrew Stuchbery, Andrea Thamm, Martin White
We propose a new Time Projection Chamber particle detector (TPC) to search for the existence of feebly-interacting bosons and to investigate the existence of the X17 boson, proposed by the ATOMKI group to explain anomalous results in the angular distributions of electron-positron pairs created in proton-induced nuclear reactions. Our design will provide 200 times greater sensitivity than ATOMKI and the program of research will also provide world-leading limits on feebly interacting bosons in the mass range of 5 - 25 MeV.
Comments: 26 pages, 16 figures, 2 tables
Subjects: High Energy Physics - Experiment (hep-ex); High Energy Physics - Phenomenology (hep-ph)
Cite as: arXiv:2302.13281 [hep-ex]
There are also several gravity based approaches, like Deur's general relativity considering self-interactions in weak gravitational fields in large system, Moffat's MOG theories, emergent gravity, conformal gravity, non-local gravity, and some other proposals that call attention to GR effects usually neglected or modify gravity that I've explored less deeply at this blog
and refracted gravity