In conclusion, we have made a measurement of the branching fractions of the radiative leptonic τ decays τ → eγν ν̄ and τ → µγν ν̄, for a minimum photon energy of 10 MeV in the τ rest frame, using the full dataset of e+e− collisions collected by BABAR at the center-of-mass energy of the Υ(4S) resonance. We find B(τ → µγν ν̄) = (3.69 ± 0.03 ± 0.10) × 10^-3, and B(τ → eγν ν̄) = (1.847 ± 0.015 ± 0.052) × 10^-2, where the first error is statistical and the second is systematic. These results are more precise by a factor of three compared to previous experimental measurements. Our results are in agreement with the Standard Model values at tree level, B(τ → µγν ν̄) = 3.67 × 10^-3, and B(τ → eγν ν̄) = 1.84 × 10^-2 [3], and with current experimental bounds.

From here.
The pertinent language in the cited source for the Standard Model prediction, published October 23, 2013, states:
For radiative τ− decays, with the same threshold E_γ^min = 10 MeV, we obtain 1.84 × 10^-2 (l = e) and 3.67 × 10^-3 (l = µ), to be compared with the values measured by the CLEO Collaboration, (1.75 ± 0.06 ± 0.17) × 10^-2 and (3.61 ± 0.16 ± 0.35) × 10^-3, respectively, where the first error is statistical and the second one is systematic [41].
The experimental results from CLEO cited at [41] in the October 23, 2013 paper were published in the year 2000. The BABAR result was obviously much more accurate and much closer to the theoretical prediction as well. Indeed, the BABAR result is consistent with a true systematic error of zero, rather than the conservative estimate given, with all of the observed error being simply a function of statistical sample size. I noted similar instances of measurements matching theoretical predictions extremely closely in another post a year ago.
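As a quick back-of-the-envelope check on that claim, one can compute the pull of each BABAR central value from the tree-level Standard Model prediction using the statistical error alone (a sketch using only the numbers quoted above; the pull values are my own arithmetic, not figures from the paper):

```python
# (measured value, statistical error, tree-level SM prediction), as quoted above
results = {
    "tau -> mu gamma nu nubar": (3.69e-3, 0.03e-3, 3.67e-3),
    "tau -> e gamma nu nubar":  (1.847e-2, 0.015e-2, 1.84e-2),
}

for channel, (value, stat, sm) in results.items():
    pull = (value - sm) / stat
    print(f"{channel}: pull = {pull:+.2f} sigma (statistical error only)")

# Both pulls come out well under 1 sigma, so the central values agree with the
# SM predictions even before the systematic error is folded in.
```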
Good article on this tau (τ) decay issue. Yet, looking forward, the LHC is about ready for Run II in 6 to 7 weeks.
With the 13 TeV energy, the first thing that we can detect could be the excited quarks. If so, then quarks could be composite. Can you address this issue? I put my view here for your review.
There are two issues for this.
I1, “language” has no true/false value by itself. When the items (entities) which are described by a language are true, the language must be true. Although the Standard Model is known to be incomplete, its quark world is true. Thus, a language which can ‘describe’ it (the quark world) must be true, and this ‘language truth’ (not theory) needs no additional scientific proof.
The language (not theory) which describes the quark and lepton world was discussed at http://putnamphil.blogspot.com/2014/06/a-final-post-for-now-on-whether-quine.html?showComment=1403375810880#c249913231636084948 and
http://scientiasalon.wordpress.com/2014/11/10/the-ongoing-evolution-of-evolutionary-theory/comment-page-1/#comment-9581 .
In general, for a composite particle, we can pump energy into it (such as a quark) to excite it, and then watch the excited particle (quark) relax back to its ground state. Excited quarks, if they exist, could be readily produced because their ingredients are the most commonly found particles in the proton.
Yet, there are two types of compositeness.
T1, the sub-particle compositeness: your analogy is readily applicable.
T2, the iceberg type compositeness: an iceberg (the visible part of a big chunk of ice) consists of three parts {a big chunk of ice, a large body of water, a huge void space above}. These parts are zillions of times larger in size than the visible iceberg. Most of the energy could be absorbed by the large body of water or the huge void space without getting into an excited state. Thus, the current LHC might not have enough energy to produce an excited quark. But, I hope it does.
What do you think about this prospect?
A Standard Model with four or more fermion generations, rather than the exactly three of the actual Standard Model, was once very promising (and it is true of all composite hadrons that there are an infinite number of possible excited states with higher J at high enough energies), but it was dealt a serious experimental blow in August of 2011, and the LHC results since then have only further disfavored this model. I currently believe that there are exactly three generations of fermions, no more and no less, based upon a variety of independent sources: (1) the very small difference in mean lifetime between the top quark and the W boson, (2) the exclusion of a fourth generation neutrino up to a very high mass relative to the tau neutrino mass, (3) the absence of a fourth generation charged lepton in the vicinity predicted by Koide's rule, which the LHC should have seen by now, and (4) the accuracy of the precision electric and magnetic moment measurements that have been made.
I have previously noted that (with corrections made without notation):
"If one extends the [extended Koide's rule for quarks] formula based upon recent data on the mass of the bottom and top quarks and presumes that there is a b', t, b triple, and uses masses of 173,400 MeV for the top quark and 4,190 MeV for the bottom quark, then the predicted b' mass would be 3,563 GeV and the predicted t' mass would be about 83.75 TeV (i.e. 83,750 GeV).
Since they would be produced as t'-anti-t' and b'-anti-b' pairs, it would take about 167.5 TeV of energy to produce a t' and 7.1 TeV of energy to produce a b'.
"The up-to-the-minute direct exclusion range at the LHC for the b' and t' is that there can be no b' with a mass of less than 670 GeV and no t' with a mass of less than 656 GeV (per ATLAS), and the comparable exclusions from CMS are similar (well under 1 TeV)."
The b' exclusion is now 675 GeV and the t' exclusion is now 782 GeV. Special relativistic effects that would allow b' or t' decay products at the several TeV scale or more to escape detection would also produce an unprecedented amount of missing energy, which would raise all sorts of flags and wouldn't be missed.
The extended Koide's rule prediction for a 4th generation charged lepton would be about 43.6 GeV, versus an experimental exclusion of any such lepton with a mass of 100.8 GeV or less.
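For what it's worth, the b', t', and tau prime figures quoted above can all be reproduced with the sequential-triple form of the extended Koide relation: given the two lighter masses of a triple, solve (m1 + m2 + m3)/(√m1 + √m2 + √m3)² = 2/3 for the heavier root. A minimal sketch of that arithmetic (my own illustration, not code from the earlier post; the input masses are the ones quoted above plus approximate muon and tau masses):

```python
import math

def koide_third_mass(m1, m2):
    """Given the two lighter masses of a Koide triple (in MeV), return the
    heavier root m3 of (m1 + m2 + m3) / (sqrt(m1) + sqrt(m2) + sqrt(m3))**2 = 2/3."""
    r = math.sqrt(m1) + math.sqrt(m2)
    # With s = sqrt(m3), the relation reduces to s**2 - 4*r*s + 3*(m1 + m2) - 2*r**2 = 0;
    # the larger quadratic root is the extrapolated next-generation mass.
    s = 2 * r + math.sqrt(6 * r**2 - 3 * (m1 + m2))
    return s**2

m_b, m_t = 4_190.0, 173_400.0                     # bottom and top quark masses in MeV, as quoted above
m_b_prime = koide_third_mass(m_b, m_t)            # (b, t, b') triple  -> ~3.56e6 MeV, i.e. ~3,563 GeV
m_t_prime = koide_third_mass(m_t, m_b_prime)      # (t, b', t') triple -> ~8.37e7 MeV, i.e. ~83.75 TeV
m_tau_prime = koide_third_mass(105.66, 1776.8)    # (mu, tau, tau') triple -> ~43.7 GeV

print(f"b'   ~ {m_b_prime / 1e3:,.0f} GeV")
print(f"t'   ~ {m_t_prime / 1e6:.2f} TeV")
print(f"tau' ~ {m_tau_prime / 1e3:.1f} GeV")
```

The small difference between the ~43.7 GeV this returns and the 43.6 GeV quoted above presumably just reflects slightly different input lepton masses.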
The heaviest neutrino mass is about 0.05 eV in a normal hierarchy, and less than 0.15 eV in an inverted hierarchy. A fourth generation neutrino is definitively ruled out by Z boson decays up to 45 GeV (45,000,000,000 eV), a mass ratio of at least ~10^11 relative to the known neutrino masses.
The top quark decay width (which is inversely proportional to its mean lifetime) is about 2.00 +/- 0.47 GeV. The W boson decay width is 2.085 +/- 0.042 GeV. We would expect a b' or t' decay width to be much more than 2.00 GeV, a value which is itself narrower than the width of the W boson by which they would decay.
The electric and magnetic moment data strongly disfavor any new fundamental particles that behave like their lower generation counterparts, such as excited or 4th-or-greater generation quarks or leptons, up to about 10 TeV.
A model with exactly three generations of quarks is much harder to hypothesize a mechanism for than one with an infinite number of excited states at higher energies.
"the LHC is about ready for the Run II in 6 to 7 weeks." But, meaningful results from Run II will probably take until October at the earliest to be generated and released.
ReplyDelete"Although the Standard Model is known not complete, its quark world is true."
I am agnostic on this point. The quark model does a lot of things very well, but until we know more about the structure of scalar and axial-vector mesons, and the problem that quarks do not seem to be the direct source of total angular momentum (J) in hadrons as the quark model would lead one to expect, it is hard to be certain that it is "true" as opposed to "mostly true."
The logic of this post suggests that the CKM matrix element squared for a first to fourth generation transition would be expected to be on the order of 100 million to one. So the fact that the CKM matrix is unitary is not necessarily a factor ruling out SM4+, given the margins of error involved, especially for a very heavy next generation. But the PMNS mixing angles are much bigger, so the absence of a fourth generation neutrino continues to be very problematic.
Correction: 10^28 to one. The top to b' transition probability would be about 10^16 to one (in cases where there was sufficient energy).
Thus, even at 13 TeV of energy, you'd need to make about 10^16 top quarks to have a 50-50 chance of seeing a b', and you'd have to assume an asymptotic bound on its decay width somewhere between the top quark width and the W boson width.
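The 10^16 figure is just expected-count arithmetic for a rare transition. As a rough sketch (assuming, per the correction above, a top-to-b' transition probability of about one in 10^16 per top quark, and enough energy in each event):

```python
import math

p_transition = 1e-16   # assumed probability of a top -> b' transition per top quark, per the estimate above

# The chance of at least one b' in N top quarks is 1 - (1 - p)**N;
# setting that equal to 0.5 and solving for N gives N = ln(2) / p for small p.
n_tops_for_even_odds = math.log(2) / p_transition
print(f"~{n_tops_for_even_odds:.1e} top quarks for a 50-50 chance of one b'")   # ~6.9e15, i.e. roughly 10^16
```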
Honestly, the quark side is much less of an experimental barrier than the lepton side, due to the high expected masses and very small CKM matrix elements, and the charged lepton bound isn't so profoundly in excess of the Koide expectation that it can be ruled out either (and you wouldn't expect a 101+ GeV tau prime/anti-tau prime pair to be produced in Higgs boson decays at any meaningful rate either, since that would be 202+ GeV from a 125-126 GeV particle).
The lack of a fourth generation neutrino under 45 GeV (which might not show up in Higgs boson decays if neutrinos don't get mass via a Higgs mechanism and instead get it by some other means like weak force self-energy) is much harder to find an out for. And, you need a fourth generation neutrino to allow your fourth generation charged lepton to decay properly, and it would need to be much lighter than the fourth generation charged lepton (which incidentally means that it wouldn't impact Neff and would be a WIMP candidate except for its lack of stability, although if the PMNS matrix element were tiny, it might be metastable).
Still, seeing a b' or a tau prime at the LHC would be a long shot.
To get around the neutrino problem you'd need (1) an inverted mass hierarchy, so that the new neutrino would have less than about 1 meV of mass or so, (2) pretty strong lepton flavor conservation, so that new tau prime neutrinos or antineutrinos were only created in the very rare instances when fourth generation charged leptons are created, and (3) a tiny PMNS matrix angle that makes oscillations to tau prime neutrino states so rare that they don't contribute meaningfully to Neff.
But, this should still screw up W and Z boson decays, which should include tau prime neutrino and anti-neutrino pairs with the same frequency as the other neutrino flavors, and we know that they do not. That version of a tau prime would be a huge violation of lepton universality. But, if hints of lepton universality violation that grows with generation bear out (something the experiment in the original post strongly disfavors in muon vs. electron comparisons, despite hints in other experiments), then maybe.
Thanks for the replies.
andrew: "Still, seeing a b' or a tau prime at the LHC would be a long shot."
Agree. But, my point is that there is 'theoretical truth' which is much more prominent than the empirical evidence. As the entire quark and lepton (fermion) world is able to be described with a LANGUAGE (see the Putnam link), it is a theoretical truth and needs no empirical support.
andrew: "And, you need a fourth generation neutrino to allow your fourth generation charged lepton to decay properly..."
My Alpha equation consists of ONLY two numbers {64, 48}, and these two numbers give rise to only three (3) generations, no more and no less. I don't truly give a damn about Koide's rule.
One of the better techniques for making plausible hypotheses is to extrapolate patterns in the existing data to see what the next term would be, or to generalize an observed relationship by parameterizing deviations from the accepted rule and then placing experimental bounds on what values the parameters could take while remaining consistent with the data.
Koide's rule in its formal original sense doesn't hold for anything but the three charged leptons. But, it does do a decent job of approximating the texture of the Standard Model mass matrix in a pretty model-independent, phenomenological way, which makes it a useful tool for thinking about what the fourth generation fermion masses would look like if they existed (a quite straightforward extrapolation of the Standard Model).
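For reference, the original charged-lepton relation that does hold takes only a couple of lines to check; this is a sketch with approximate PDG masses in MeV as my own inputs:

```python
import math

m_e, m_mu, m_tau = 0.511, 105.658, 1776.86   # approximate charged lepton masses in MeV

# Koide's original relation: Q = (m_e + m_mu + m_tau) / (sqrt(m_e) + sqrt(m_mu) + sqrt(m_tau))**2 = 2/3
q = (m_e + m_mu + m_tau) / (math.sqrt(m_e) + math.sqrt(m_mu) + math.sqrt(m_tau)) ** 2
print(f"Q = {q:.6f} vs. 2/3 = {2/3:.6f}")   # agrees with 2/3 to roughly one part in 10^5
```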
I don't really think that there are more than three generations now, and obviously, there aren't fewer, even though it wasn't so clear a few years ago. But, I also don't claim to have the "theoretical truth" and believe it is worth keeping an open mind so long as it isn't absolutely impossible to reach a different result.