Tuesday, May 3, 2016

Planet Nine Lacks A Good Origin Story

While a variety of investigators have been hot on the trail of determining where Planet Nine might be and what characteristics it might have, all of the proposed origin stories for it are low probability ones.

Dilution Or Selection?

A new paper in Nature focuses on 51 ancient genomes from the Upper Paleolithic.

One notable observation is that Neanderthal admixture falls from 3%-6% in the early Upper Paleolithic to current levels of about 2%. The paper attributes this decline to selection, although dilution with less admixed populations could produce the same top line result. (Oase1, from about 40kya, is an outlier at roughly 10%.)

The paper's analysis also suggests that most Neanderthal admixture is quite old (long predating the period when modern humans and Neanderthals both occupied Europe) and that it decayed slowly and steadily through slight natural selection, with the admixture itself presumably taking place in West or Southwest Asia rather than in Europe. Indeed, the model is consistent with zero admixture between modern humans and Neanderthals in Europe itself.
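To make the dilution-versus-selection distinction concrete, here is a minimal back-of-the-envelope sketch in Python. The numbers (a starting level of 6%, a two-thirds contribution from a less admixed population, a loss of 0.08% per generation over roughly 1,400 generations) are purely illustrative round numbers of mine, not the paper's fitted values; the point is only that either mechanism can take early Upper Paleolithic levels down to roughly the modern 2%.

# Toy illustration only: the parameter values below are my own round numbers,
# not estimates from the paper.

def dilute(neanderthal_fraction, unadmixed_share):
    # One pulse of admixture with a population carrying ~0% Neanderthal ancestry.
    return neanderthal_fraction * (1.0 - unadmixed_share)

def select_against(neanderthal_fraction, loss_per_generation, generations):
    # Weak selection stripping out a constant small share of Neanderthal
    # ancestry each generation.
    return neanderthal_fraction * (1.0 - loss_per_generation) ** generations

start = 0.06  # ~6% Neanderthal ancestry in the early Upper Paleolithic

print(dilute(start, unadmixed_share=2 / 3))              # 0.02, i.e. ~2%
print(select_against(start, 0.0008, generations=1400))   # ~0.0196, i.e. ~2%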

Eurogenes covers the paper, and many notable observations appear in the comments there.

Broad brush, modern populations are really only in continuity with earlier populations in Europe back to the Epipaleolithic era ca. 14,000 years ago, when Europe's population was replaced following the Last Glacial Maximum and the Western Hunter-Gatherer (WHG) autosomal population began to gel.  Earlier individuals loosely cluster around MA1, from 24,000 years ago, with one individual from ca. 19,000 years ago looking like a transitional figure.

There is one Epipaleolithic individual from Northern Italy, ca. 14,000 years ago, with Y-DNA R1b.  There are stronger Asian affinities in European individuals than would be expected for most of the Upper Paleolithic.

The picture from uniparental ancient DNA until now has been one of a very narrow, homogeneous gene pool with a small effective population size.  To the extent that this is true, it is a post-LGM phenomenon, as the ancient autosomal DNA over the tens of thousands of years and thousands of miles spanned by the sample shows only fairly loose affinities.

Two out of 21 pre-LGM samples carry mtDNA haplogroup M (now found almost exclusively in East Eurasia).  Of 13 pre-LGM Y-DNA samples, three are haplogroup C (now found almost exclusively in East Eurasia) and four belong to haplogroups that predate the East-West divide in Y-DNA haplogroups.

Saturday, April 30, 2016

Frank Wilczek Teases Renormalization Breakthrough?

As I discussed recently, renormalization is a technique at the heart of modern Standard Model physics, and our inability to renormalize what we naively expect the equations of quantum gravity to look like is a major conundrum in theoretical physics.
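For readers who want something concrete to hang that on, the textbook payoff of renormalization is that the infinities of perturbation theory can be absorbed into a finite number of scale-dependent ("running") parameters. The standard example, which is all this is (it has nothing specific to do with Wilczek's work), is the one-loop running of the QED coupling:

\[
\mu \frac{d\alpha}{d\mu} \approx \frac{2\alpha^{2}}{3\pi}
\qquad\Longrightarrow\qquad
\alpha(\mu) \approx \frac{\alpha(\mu_{0})}{1 - \dfrac{2\alpha(\mu_{0})}{3\pi}\,\ln\dfrac{\mu}{\mu_{0}}}
\]

The usual power-counting heuristic for why the same trick fails for naive quantum gravity is that Newton's constant carries negative mass dimension, so ever more independent counterterms are needed at each order in perturbation theory.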

Peter Woit, at his blog "Not Even Wrong," calls attention to a recent interview with Frank Wilczek, in which Wilczek suggests that he is right on the brink of making a major breakthrough.

This breakthrough sounds very much like he and his collaborators have finally managed to develop a mathematically rigorous understanding of renormalization that has eluded some of the brightest minds in physics for four decades. First and foremost among them was Richard Feynman, who was instrumental in inventing renormalization in the first place and who frequently and publicly expressed concerns about this foundational technique's lack of mathematical rigor.

This topic came up not so long ago in a discussion in the comments at the 4gravitons blog. The blog's author (a graduate student in theoretical physics) noted in that regard that "Regarding renormalization, the impression I had is that defining it rigorously is one of the main obstructions to the program of Constructive Quantum Field Theory." The linked survey article on the topic concludes as follows:
It is evident that the efforts of the constructive quantum field theorists have been crowned with many successes. They have constructed superrenormalizable models, renormalizable models and even nonrenormalizable models, as well as models which fall outside of that classification scheme since they apparently do not correspond to some classical Lagrangian. And they have found means to extract rigorously from these models physically and mathematically crucial information. In many of these models the HAK and the Wightman axioms have been verified. In the models constructed to this point, the intuitions/hopes of the quantum field theorists have been largely confirmed. 
However, local gauge theories such as quantum electrodynamics, quantum chromodynamics and the Standard Model — precisely the theories whose approximations of various kinds are used in a central manner by elementary particle theorists and cosmologists — remain to be constructed. These models present significant mathematical and conceptual challenges to all those who are not satisfied with ad hoc and essentially instrumentalist computation techniques.

Why haven’t these models of greatest physical interest been constructed yet (in any mathematically rigorous sense which preserves the basic principles constantly evoked in heuristic QFT and does not satisfy itself with an uncontrolled approximation)? Certainly, one can point to the practical fact that only a few dozen people have worked in CQFT. This should be compared with the many hundreds working in string theory and the thousands who have worked in elementary particle physics. Progress is necessarily slow if only a few are working on extremely difficult problems. It may well be that patiently proceeding along the lines indicated above and steadily improving the technical tools employed will ultimately yield the desired rigorous constructions.

It may also be the case that a completely new approach is required, though remaining within the CQFT program as described in Section 1, something whose essential novelty is analogous to the differences between the approaches in Section 2, 3, 5 and 6. It may even be the case that, as Gurau, Magnen and Rivasseau have written, “perhaps axiomatization of QFT might have been premature”; in other words, perhaps the Wightman and HAK axioms do not provide the proper mathematical framework for QED, QCD, SM, even though, as the constructive quantum field theorists have so convincingly demonstrated, that framework is quite suitable for so many models of such varying types and, as the algebraic quantum field theorists have just as convincingly demonstrated, that framework is flexible and powerful when dealing with the conceptual and mathematical problems in QFT which go beyond mathematical existence. 
But it is possible that the mathematically and conceptually essential core of a rigorous formulation of QFT that can include the missing models lies somewhere else. Certainly, there are presently many attempts to understand aspects of QFT from the perspective of mathematical ideas which are quite unexpected when seen from the vantage point of current QFT and even from the vantage point of quantum theory itself, as rigorously formulated by von Neumann and many others. These speculations, as suggestive as some may be, are currently beyond the scope of this article.
The 4gravitons author may also have been referring to papers like this one (suggesting a way to rearrange an important infinite series widely believed to be divergent into a set of several convergent infinite series).
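I can't vouch for that particular construction, but the generic problem it addresses is easy to demonstrate. A standard textbook example (my choice, not the series treated in that paper) is Euler's series sum_n (-1)^n n! x^n, which diverges for every x > 0 even though its first several partial sums hover close to the finite value of the corresponding Borel integral:

# Toy illustration of a divergent series with a finite resummed value (Euler's
# classic example), not the rearrangement used in the linked paper.
import math
from scipy.integrate import quad

x = 0.1

partial_sums, total = [], 0.0
for n in range(35):
    total += (-1) ** n * math.factorial(n) * x ** n
    partial_sums.append(total)

# Borel sum of the series: integral of exp(-t) / (1 + x*t) from 0 to infinity.
borel_value, _ = quad(lambda t: math.exp(-t) / (1.0 + x * t), 0.0, math.inf)

print([round(s, 5) for s in partial_sums[5:10]])   # hover near 0.9156
print(round(borel_value, 5))                       # ~0.91563
print(partial_sums[-1])                            # the partial sums eventually diverge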

Wilczek says in the interview (conducted the day after his most recent preprint was posted):
What I’ve been thinking about today specifically is something of a potential breakthrough in understanding our fundamental theories of physics. We have something called a standard model, but its foundations are kind of scandalous. We have not known how to define an important part of it mathematically rigorously, but I think I have figured out how to do that, and it’s very pretty. I’m in the middle of calculations to check it out....
It’s a funny situation where the theory of electroweak or weak interactions has been successful when you calculate up to a certain approximation, but if you try to push it too far, it falls apart. Some people have thought that would require fundamental changes in the theory, and have tried to modify the theory so as to remove the apparent difficulty. 
What I’ve shown is that the difficulty is only a surface difficulty. If you do the mathematics properly, organize it in a clever way, the problem goes away. 
It falsifies speculative theories that have been trying to cure a problem that doesn’t exist. It’s things like certain kinds of brane-world models, in which people set up parallel universes where that parallel universe's reason for being was to cancel off difficulties in our universe—we don’t need it. It's those kinds of speculations about how the foundations might be rotten, so you have to do something very radical. It’s still of course legitimate to consider radical improvements, but not to cure this particular problem. You want to do something that directs attention in other places.
There are other concepts in the Standard Model whose foundations aren't terribly solid.  But I'd be hard pressed to identify a better fit to his comments than renormalization. His publications linking particle physics to condensed matter physics also make this a plausible target, because renormalization is a technique that particle physicists borrowed by analogy from condensed matter physics.
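On that condensed matter pedigree: the simplest worked example of a renormalization group calculation is real-space decimation of the one-dimensional Ising chain, where tracing out every other spin maps the dimensionless coupling K to K' = (1/2) ln cosh(2K), and iterating that map sends any finite K toward zero (the RG statement that the 1D chain has no finite-temperature phase transition). A quick sketch of that standard exercise, offered purely as background and with no connection to Wilczek's new paper:

import math

def decimate(K):
    # One real-space RG step for the 1D Ising chain: summing over every other
    # spin renormalizes the nearest-neighbor coupling K -> 0.5 * ln(cosh(2K)).
    return 0.5 * math.log(math.cosh(2.0 * K))

K = 2.0  # start at strong coupling (low temperature)
for step in range(8):
    print(f"step {step}: K = {K:.6f}")
    K = decimate(K)
# K flows toward the trivial fixed point K = 0, i.e. no finite-temperature
# phase transition in one dimension.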

His first new preprint in nearly two years came out earlier this month and addresses this subject, so I took a look to see if I could unearth any more hints lurking there. Here's what he says about renormalization in the new paper:
- the existence of sharp phase transitions, accompanying changes in symmetry. 
Understanding the singular behavior which accompanies phase transitions came from bringing in, and sharpening, sophisticated ideas from quantum field theory (the renormalization group). The revamped renormalization group fed back in to quantum field theory, leading to asymptotic freedom, our modern theory of the strong interaction, and to promising ideas about the unification of forces. The idea that changes in symmetry take place through phase transitions, with the possibility of supercooling, is a central part of the inflationary universe scenario.
But there was nothing in the paper that obviously pointed in the direction I'm thinking, so perhaps I am barking up the wrong tree about this breakthrough.