Tuesday, May 5, 2026

Cracking Linear Elamite


Four years ago, French archaeologist François Desset reportedly cracked the 4,000-year-old Linear Elamite script. "[M]ade up of 77 signs – diamonds, curves, and other geometric patterns – the writing system comes from the Bronze Age civilisation of Elam" in Southwest Iran, which collapsed long ago. The script was rediscovered in 1903, but had eluded decipherment until now.


[Map: Elam is shown in red.]

The Elamite language was partially known long before this breakthrough, from inscriptions made in an Elamite cuneiform script that was adapted from Akkadian cuneiform. The language was still spoken in the first century CE, and probably went extinct around the eleventh century CE. Most linguists consider it to be a language isolate (and in my review of this literature, I have not found the Elamo-Dravidian hypothesis to be a credible one). As Wikipedia summarizes:
Elamite is regarded by the vast majority of linguists as a language isolate,[31][32] as it has no demonstrable relationship to the neighbouring Semitic languages, Indo-European languages, or to Sumerian, despite having adopted the Sumerian-Akkadian cuneiform script.

An Elamo-Dravidian family connecting Elamite with the Brahui language of Pakistan and Dravidian languages of India was suggested in 1967 by Igor M. Diakonoff[33] and later, in 1974, defended by David McAlpin and others.[34][35] In 2012, Southworth proposed that Elamite forms the "Zagrosian family" along with Brahui and, further down the cladogram, the remaining Dravidian languages; this family would have originated in Southwest Asia (southern Iran) and was widely distributed in South Asia and parts of eastern West Asia before the Indo-Aryan migration.[36] Recent discoveries regarding early population migration based on ancient DNA analysis have revived interest in the possible connection between proto-Elamite and proto-Dravidian.[37][38][39][40] A critical reassessment of the Elamo-Dravidian hypothesis was published by Filippo Pedron in 2023.[41]

Václav Blažek proposed a relation with the Semitic languages.[42]

In 2002 George Starostin published a lexicostatistic analysis finding Elamite to be approximately equidistant from Nostratic and Semitic.[43]

None of these ideas have been accepted by mainstream historical linguists.[31]
Desset accomplished this primarily by using proper names to decode the meaning of those signs and by applying that method to ten new Linear Elamite texts inscribed on vases, according to an April 28, 2026 story from France 24. The breakthrough is also reported in National Geographic, January 2026, pp. 110-131, in an article entitled "Decoding the Lost Scripts of the Ancient World" by Joshua Hammer. Hat tip to Language Log. The Smithsonian magazine also has a recent article on the topic. It isn't entirely clear to me why this development is making headlines now, four years after the leading article on the topic was published (which I blogged at the time; see also an earlier post on a related topic).

Wikipedia (at the link above) explains that:
In 2022, Desset et al. (2022) argued that Linear Elamite is an alpha-syllabary, which would make it the oldest known purely phonographic writing system.[5] However, they admit that some logograms may have been used, although only rarely and not systematically, arguing that Elamite scribes rejected logographic writing in the 3rd millennium BCE.[30] Other researchers, such as the linguist Michael Mäder, dispute this, arguing that only around 70 percent of Linear Elamite characters are likely to be purely phonographic and that the remainder are logograms, as evidenced by mathematical analyses of Linear Elamite inscriptions.[3][31]

His 2022 article is the capstone of the project. Its abstract states:

Linear Elamite writing was used in southern Iran in the late 3rd/early 2nd millennium BCE (ca. 2300–1880 BCE). First discovered during the French excavations at Susa from 1903 onwards, it has so far resisted decipherment. The publication of eight inscribed silver beakers in 2018 provided the materials and the starting point for a new attempt; its results are presented in this paper. A full description and analysis of Linear Elamite writing, employed for recording the Elamite language, is given here for the first time, together with a discussion of Elamite phonology and the biscriptualism that characterizes this language in its earliest documented phase.

Desset's main publications on the subject are as follows: 

Desset, François (2018a). "Linear Elamite Writing". In Álvarez-Mon, Javier; Basello, Gian Pietro; Wicks, Yasmina (eds.). The Elamite World. Abingdon, Oxon: Routledge. pp. 397–415. ISBN 978-1-315-65803-2.



Desset, François (2020b). A New History of Writing on The Iranian Plateau – via YouTube.

Desset, François (1 September 2021). "On The Decipherment of Linear Elamite Writing". The Postil (Interview). Interviewed by Robert M. Kerr.

Desset, François; Tabibzadeh, Kambiz; Kervran, Matthieu; Basello, Gian Pietro; Marchesi, Gianni (2022). "The Decipherment of Linear Elamite Writing". Zeitschrift für Assyriologie und vorderasiatische Archäologie. 112 (1): 11–60. doi:10.1515/za-2022-0003. ISSN 0084-5299. S2CID 250118314.

Surfaceology


A new technique called "surfaceology" (described in the linked Quanta magazine article) provides a profoundly more efficient method for calculating the probabilities of Standard Model particle interactions than the path integrals implied by Feynman diagrams.

It is also useful in doing calculations in "double copy" approaches to quantum gravity, in which one does a calculation in QCD and "squares" it to get an answer for a parallel problem in quantum gravity.

Surfaceology flows from the same line of reasoning as the amplituhedron of theoretical physics superstar Nima Arkani-Hamed (which only works for supersymmetric theories). It was devised by a junior member of his research group, Carolina Figueiredo, in 2022, with a pair of preprints (here and here) first published in September of 2023. But it works for real Standard Model particles and not just for simplified theoretical physics models.

Further developments in the winter of 2023-2024 concerned "hidden zeros": outcomes that looked possible when considered in individual Feynman diagram calculations, but that turn out to be effectively impossible once the many contributions are combined. Figueiredo and Arkani-Hamed, along with Qu Cao, Jin Dong, and Song He, posted these findings in a series of preprints.

The more efficient calculations that this method facilitates could turn many particle physics and quantum gravity problems that were theoretically possible to calculate, but as a practical matter impossible to numerically work out, into practically solvable problems, and could make very difficult calculations vastly easier to solve.

Hat tip to 4Gravitons.

Thursday, April 30, 2026

The Standard Model Still Works (Again)

The LHCb experiment at the Large Hadron Collider (LHC) has made a statistically significant observation (although not yet a definitive discovery) of a rare decay of a particular kind of positively charged bottom quark meson: to a positively charged pion and an electron-positron pair, an example of what is called a semi-leptonic decay because the final state mixes a hadron (the pion) with leptons (the electron and positron). The decay occurs at a frequency of about one per 40 million decays of this kind of meson (a kind of meson which, itself, doesn't make up a large share of the mesons produced at LHCb).

This just happens to be statistically consistent with the Standard Model prediction for this decay, B(B+→ π+ℓ+ℓ−) = (2.04 ± 0.21) × 10^−8, which is about one decay per 50 million. The same decay, but with muons rather than electrons, was first seen in 2012 at a branching fraction of about one per 55 million decays, which was also statistically consistent with the Standard Model expectation (an expectation that is the same for electrons and for muons due to lepton universality).
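As a rough sanity check on those "one per N decays" figures, here is my own back-of-the-envelope conversion (not taken from the papers):

# Back-of-the-envelope conversion between a branching fraction and
# "one decay per N decays" (my own arithmetic, not from the papers).
predicted = 2.04e-8   # Standard Model prediction for B(B+ -> pi+ l+ l-)
measured = 2.4e-8     # central value of the new LHCb electron-channel result
print(f"predicted: about one per {1 / predicted:,.0f} decays")  # ~49 million
print(f"measured:  about one per {1 / measured:,.0f} decays")   # ~42 million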

The first evidence for the decay B+→π+e+e− is reported using proton-proton collision data recorded by the LHCb experiment at centre-of-mass energies of 7, 8 and 13 TeV, corresponding to an integrated luminosity of 9 fb^−1. 
A signal excess with a significance of 3.2σ is observed and the branching fraction is measured to be B(B+→ π+e+e−) = (2.4 +0.9/−0.8 +0.4/−0.2) × 10^−8, where the first set of uncertainties is statistical and the second is systematic. The result is consistent with the Standard Model expectation.
LHCb collaboration, "First evidence of the decay B+→π+e+e−" arXiv:2604.26784 (April 29, 2026).

Combining the statistical and systematic uncertainties, the total uncertainty is about ±0.9 × 10^−8, with the error bar slightly larger on the high side, so a larger branching fraction (i.e. more events) is slightly favored over a smaller one (i.e. fewer events), relative to the best fit value.
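A minimal sketch of that combination, adding the statistical and systematic errors in quadrature separately on each side (a common rough convention, not necessarily the collaboration's own procedure):

# Rough quadrature combination of the asymmetric uncertainties, in units
# of 10^-8 (an illustrative convention; the collaboration may combine
# them differently).
stat_up, stat_dn = 0.9, 0.8
syst_up, syst_dn = 0.4, 0.2
total_up = (stat_up**2 + syst_up**2) ** 0.5   # ~ +0.98
total_dn = (stat_dn**2 + syst_dn**2) ** 0.5   # ~ -0.82
print(f"total uncertainty: +{total_up:.2f} / -{total_dn:.2f}")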

The deviation from the Standard Model expectation in the muon measurement was about 0.7 sigma (in the opposite direction, relative to the best fit value, of the deviation in the electron measurement), while the deviation from the Standard Model expectation in the electron measurement was about 0.4 sigma. This suggests that the systematic uncertainty estimates in the Standard Model prediction and in the experiments were probably conservatively somewhat high.
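The roughly 0.4 sigma figure for the electron channel can be reproduced with the same sort of crude arithmetic, treating the uncertainties as roughly symmetric and independent (an approximation, not the collaboration's statistical treatment):

# Crude pull for the electron channel, in units of 10^-8, assuming
# roughly symmetric and independent uncertainties.
measured, meas_err = 2.4, 0.9     # combined uncertainty from above
predicted, pred_err = 2.04, 0.21  # Standard Model prediction
pull = (measured - predicted) / (meas_err**2 + pred_err**2) ** 0.5
print(f"deviation of about {pull:.1f} sigma")  # ~ 0.4 sigma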

This particular hadron decay isn't extremely significant on its own (hadrons are either mesons, like the B+, or baryons, like the proton). But comparing the rate of the decay to a positively charged pion plus a muon-antimuon pair with the rate of the decay to a positively charged pion plus an electron-positron pair is a good test of "lepton universality" (i.e. the Standard Model rule that electrons, muons, and tau leptons have identical properties except for their masses). For several years there were experimental anomalies that made it appear that lepton universality was violated, but those anomalies were recently resolved in favor of the Standard Model prediction that lepton universality is not violated.

There are about a hundred plain vanilla mesons and baryons in the Standard Model like the B+ meson studied here, and some of the heavier ones have perhaps hundreds of decay modes with a predicted branching fraction of less than one decay per billion decays. So, the universe of Standard Model predicted meson decays to look for is somewhere on the order of 10,000.

The B+ meson has two "valence quarks": an up quark and an anti-b quark. It has a rest mass of 5279.26 ± 0.17 MeV/c^2 (about 5.6 times the mass of a proton and a little less massive than a Lithium-6 atom). It has total angular momentum (a.k.a. "spin") of 0 and odd (i.e. negative) parity, which means that it is a "pseudo-scalar" meson. It is ephemeral, with a mean lifetime of (1.638 ± 0.004) × 10^−12 seconds (i.e. a little more than a trillionth of a second). It has more than two dozen measured decay modes that happen in more than one in a million decays, and the vast majority of the time B+ mesons decay to particles that include some kind of charm quark hadron. It has hundreds of decay modes more probable than this one.
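The mass comparisons are easy to verify with standard reference values (the Lithium-6 figure below is my conversion of the atomic mass to MeV/c^2):

# Quick check of the mass comparisons, using standard reference values;
# the Lithium-6 figure is the atomic mass (~6.015 u) converted to MeV/c^2.
b_plus_mass = 5279.26          # MeV/c^2
proton_mass = 938.272          # MeV/c^2
li6_mass = 6.0151 * 931.494    # ~5603 MeV/c^2
print(f"B+/proton mass ratio: {b_plus_mass / proton_mass:.2f}")  # ~5.63
print(f"Li-6 atom: {li6_mass:.0f} MeV/c^2, a bit heavier than the B+")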

The Standard Model was devised in the early 1970s, and the b quark was discovered at Fermilab in 1977. The full set of fundamental particles (except the Higgs boson, which was discovered at the LHC in 2012, and setting aside the later discovery that neutrinos have mass) was in place by 1995, more than three decades ago.

The Standard Model prediction for the frequency of this particular B+ meson decay was cited in connection with the first observation of the parallel muon decay in 2012 and again in 2015, and derives from a 2008 paper (i.e. it was made roughly 18 years before this decay was observed just as predicted).

The Higgs boson and the neutrino masses don't (meaningfully) enter into the calculation of the branching fractions of the B+ meson, so the only thing that has changed in the Standard Model since 1995 that is relevant to this calculation is that the measurements of some of the fundamental physical constants involved, especially the relevant CKM matrix elements (as noted at page 13 of the 2008 paper), have gotten more precise over that time. (The precision with which we know another, non-fundamental physical constant, the "form factor" of the B+ meson, which is too hard to calculate from first principles at this point, has also improved and is material to this calculation.)

The physical constants whose improved precision matters most in this context are the CKM matrix elements governing the b quark to up quark transition probability in W boson interactions and the top quark to down quark transition probability in W boson interactions, both of which are low: about 0.14% and 0.007% respectively.

The respective 3% and 2% uncertainties in the world average measurements of these physical constants are probably among the leading sources of the roughly 10% uncertainty in the Standard Model prediction for this B+ meson decay branching fraction. It is hard to say exactly how large a share of the uncertainty in the predicted value comes from this source, however, because while the papers linked above provide error budget charts for the uncertainties in their experimental measurements, none of them provides an exact error budget chart for the Standard Model prediction of this decay frequency, probably because this was considered too elementary to publish.
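To illustrate the kind of error propagation involved, here is a schematic sketch under the simplifying assumption that the branching fraction scales with the square of each relevant CKM element and that their contributions add in quadrature (which glosses over the form factor and other inputs):

# Schematic error propagation: if the branching fraction scaled simply as
# |V|^2 for each relevant CKM element, a fractional uncertainty u in |V|
# would contribute about 2*u to the prediction's fractional uncertainty.
# (An illustrative assumption; the real prediction also depends on form
# factors and other inputs.)
ckm_fractional_uncertainties = [0.03, 0.02]   # the ~3% and ~2% figures above
combined = sum((2 * u) ** 2 for u in ckm_fractional_uncertainties) ** 0.5
print(f"combined CKM contribution: about {combined:.0%}")  # ~7% of a ~10% total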

Computer processing capacity has also improved greatly since then, which makes these calculations much less cumbersome to actually carry out.

In isolation, this experimental confirmation of the Standard Model prediction could be just a lucky fluke, although it would be a quite remarkable one even on its own. But taken together with thousands of other measured hadron decay branching fractions, the Standard Model is really unstoppable.

Experimental anomalies, where results deviate from the Standard Model prediction, are few and far between, modest in statistical significance, and usually go away quickly upon closer inspection with more experiments and analysis. Experiments testing the Standard Model in contexts other than hadron decay branching fractions, involving completely different kinds of calculations, are just as consistently confirmed. It is an extremely robustly tested theory.

Even if there is beyond the Standard Model physics that the Standard Model misses, it is very close to the truth. The open parameter space for deviations from it is very small.