Saturday, April 30, 2016

Frank Wilczek Teases Renormalization Breakthrough?

As I discussed recently, renormalization is a technique at the heart of modern Standard Model physics, and the fact that the equations we naively expect quantum gravity to obey cannot be renormalized is a major conundrum in theoretical physics.

Peter Woit, at his blog "Not Even Wrong", calls attention to a recent interview with Frank Wilczek, in which Wilczek suggests that he is right on the brink of making a major breakthrough.

This breakthrough sounds very much like he and his collaborators have finally managed to develop a mathematically rigorous understanding of renormalization that has eluded some of the brightest minds in physics for four decades. First and foremost among them was Richard Feynman, who was instrumental in inventing renormalization in the first place and who frequently and publicly expressed concerns about this foundational technique's lack of mathematical rigor.

This topic came up not so long ago in a discussion in the comments at the 4 gravitons blog. The blog's author (a graduate student in theoretical physics) noted in that regard that "Regarding renormalization, the impression I had is that defining it rigorously is one of the main obstructions to the program of Constructive Quantum Field Theory." The linked survey article on the topic concludes as follows:
It is evident that the efforts of the constructive quantum field theorists have been crowned with many successes. They have constructed superrenormalizable models, renormalizable models and even nonrenormalizable models, as well as models which fall outside of that classification scheme since they apparently do not correspond to some classical Lagrangian. And they have found means to extract rigorously from these models physically and mathematically crucial information. In many of these models the HAK and the Wightman axioms have been verified. In the models constructed to this point, the intuitions/hopes of the quantum field theorists have been largely confirmed. 
However, local gauge theories such as quantum electrodynamics, quantum chromodynamics and the Standard Model — precisely the theories whose approximations of various kinds are used in a central manner by elementary particle theorists and cosmologists — remain to be constructed. These models present significant mathematical and conceptual challenges to all those who are not satisfied with ad hoc and essentially instrumentalist computation techniques.

Why haven’t these models of greatest physical interest been constructed yet (in any mathematically rigorous sense which preserves the basic principles constantly evoked in heuristic QFT and does not satisfy itself with an uncontrolled approximation)? Certainly, one can point to the practical fact that only a few dozen people have worked in CQFT. This should be compared with the many hundreds working in string theory and the thousands who have worked in elementary particle physics. Progress is necessarily slow if only a few are working on extremely difficult problems. It may well be that patiently proceeding along the lines indicated above and steadily improving the technical tools employed will ultimately yield the desired rigorous constructions.

It may also be the case that a completely new approach is required, though remaining within the CQFT program as described in Section 1, something whose essential novelty is analogous to the differences between the approaches in Sections 2, 3, 5 and 6. It may even be the case that, as Gurau, Magnen and Rivasseau have written, “perhaps axiomatization of QFT might have been premature”; in other words, perhaps the Wightman and HAK axioms do not provide the proper mathematical framework for QED, QCD, SM, even though, as the constructive quantum field theorists have so convincingly demonstrated, that framework is quite suitable for so many models of such varying types and, as the algebraic quantum field theorists have just as convincingly demonstrated, that framework is flexible and powerful when dealing with the conceptual and mathematical problems in QFT which go beyond mathematical existence. 
But it is possible that the mathematically and conceptually essential core of a rigorous formulation of QFT that can include the missing models lies somewhere else. Certainly, there are presently many attempts to understand aspects of QFT from the perspective of mathematical ideas which are quite unexpected when seen from the vantage point of current QFT and even from the vantage point of quantum theory itself, as rigorously formulated by von Neumann and many others. These speculations, as suggestive as some may be, are currently beyond the scope of this article.
The 4 gravitons author may also have been referring to papers like this one (suggesting a way to rearrange an important infinite series, widely believed to be divergent, into a set of several convergent infinite series).
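
To give a feel for what resumming a divergent series can look like, here is a toy sketch in Python using the classic textbook Borel example (this is not the linked paper's construction, just the standard illustration of the general idea):

```python
# A toy illustration of resumming a divergent series (the textbook Borel
# example, NOT the construction in the linked paper).
import math
from scipy.integrate import quad  # assumes SciPy is available

x = 0.1

# The terms n! * x^n shrink until n ~ 1/x, then grow without bound,
# so the partial sums of sum_n (-1)^n n! x^n never converge.
for n in (5, 10, 20, 30):
    print(f"|term {n:2d}| = {math.factorial(n) * x**n:.3g}")

# Borel resummation trades the series for a convergent integral,
# integral_0^inf exp(-t) / (1 + x*t) dt, which is finite for all x > 0.
borel, _ = quad(lambda t: math.exp(-t) / (1 + x * t), 0, math.inf)
print(f"Borel sum = {borel:.4f}")  # ~0.9156 for x = 0.1
```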

Wilczek says in the interview (conducted the day after his most recent preprint was posted):
What I’ve been thinking about today specifically is something of a potential breakthrough in understanding our fundamental theories of physics. We have something called a standard model, but its foundations are kind of scandalous. We have not known how to define an important part of it mathematically rigorously, but I think I have figured out how to do that, and it’s very pretty. I’m in the middle of calculations to check it out....
It’s a funny situation where the theory of electroweak or weak interactions has been successful when you calculate up to a certain approximation, but if you try to push it too far, it falls apart. Some people have thought that would require fundamental changes in the theory, and have tried to modify the theory so as to remove the apparent difficulty. 
What I’ve shown is that the difficulty is only a surface difficulty. If you do the mathematics properly, organize it in a clever way, the problem goes away. 
It falsifies speculative theories that have been trying to cure a problem that doesn’t exist. It’s things like certain kinds of brane-world models, in which people set up parallel universes where that parallel universe's reason for being was to cancel off difficulties in our universe—we don’t need it. It's those kinds of speculations about how the foundations might be rotten, so you have to do something very radical. It’s still of course legitimate to consider radical improvements, but not to cure this particular problem. You want to do something that directs attention in other places.
There are other concepts in the standard model whose foundations aren't terribly solid. But I'd be hard pressed to identify a better fit to his comments than renormalization. His publications linking particle physics to condensed matter physics also make this a plausible target, because renormalization is a technique that particle physicists borrowed by analogy from condensed matter physics.

His first new preprint in nearly two years came out earlier this month and addresses this subject, so I took a look to see if I could unearth any more hints lurking there. Here's what he says about renormalization in his new paper:
- the existence of sharp phase transitions, accompanying changes in symmetry. 
Understanding the singular behavior which accompanies phase transitions came from bringing in, and sharpening, sophisticated ideas from quantum field theory (the renormalization group). The revamped renormalization group fed back into quantum field theory, leading to asymptotic freedom, our modern theory of the strong interaction, and to promising ideas about the unification of forces. The idea that changes in symmetry take place through phase transitions, with the possibility of supercooling, is a central part of the inflationary universe scenario.
But there was nothing in the paper that obviously points in the direction I'm thinking of, so perhaps I am barking up the wrong tree about this breakthrough.

2 comments:

Tienzen said...

"...so perhaps I am barking up the wrong tree about this breakthrough."

Any tree that cannot produce (calculate) the fruits of the nature constants (alpha, etc.) and the Planck CMB data is a dead tree.

So, you are not barking up the wrong tree but up a dead tree.

andrew said...

This isn't about determining Nature's constants, and I don't know why you are mentioning the Planck CMB data (are you perhaps trying to comment on a different post?), as I don't see anything in the post about it. That said, I don't see in principle why Planck CMB data, like any other measurement, couldn't be used to measure some physical constant that would place bounds on more fundamental physical constants. It wouldn't produce an exact calculated value, but it would provide an experimental test of any value you might determine by other means. For example, you could use Planck CMB data to establish that the cosmological constant is not consistent with a source equal to the vacuum expectation value of the Higgs field.
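
For what it's worth, here is a back-of-the-envelope version of that example, using standard round numbers (an observed dark energy density of roughly (2.3 meV)^4 and the 246 GeV Higgs vacuum expectation value; neither figure comes from this thread):

```python
# Rough comparison of the observed dark energy density with a naive
# electroweak-scale vacuum energy, in natural units of GeV^4.
# (Standard round numbers; assumptions, not values from this thread.)
import math

rho_lambda = (2.3e-12) ** 4  # observed: ~(2.3 meV)^4 ~ 2.8e-47 GeV^4
v_higgs = 246.0              # Higgs vacuum expectation value, in GeV
rho_ew = v_higgs ** 4        # naive EW-scale vacuum energy ~ 3.7e9 GeV^4

print(f"observed dark energy density : {rho_lambda:.1e} GeV^4")
print(f"naive EW vacuum energy       : {rho_ew:.1e} GeV^4")
print(f"mismatch                     : ~10^{math.log10(rho_ew / rho_lambda):.0f}")
```

The mismatch of roughly fifty-six orders of magnitude is exactly the kind of bound a measurement can place on a proposed source.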

The issue in this post is that for the four decades since it was invented, there has been real doubt about whether the renormalization technique used in particle physics is mathematically valid. Now, a leading physicist is hinting that it is indeed mathematically valid, as demonstrated by his novel proof, although this purported breakthrough is not discussed in his latest published paper.

The constants that go into the beta function used for renormalization are known exactly and follow entirely from mathematics without resort to any experimental measurements, except for the renormalization cutoff scale, which is chosen arbitrarily by the physicist and without any pretension that it represents an experimentally measurable quantity.

Now, you can't do any quantum physics calculations unless you measure the physical constant to be renormalized at some given momentum scale. But if, for example, you measure the strong force coupling constant alpha sub s at the Z boson mass momentum scale, you can determine the equivalent strong force coupling constant alpha sub s at any other momentum scale you wish to use, so long as it isn't too close to your arbitrarily chosen renormalization calculation cutoff scale.
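
As a concrete sketch of that conversion, assuming the standard one-loop QCD approximation with the conventional inputs alpha_s(M_Z) ≈ 0.118 and five active quark flavors (and ignoring flavor thresholds and higher-loop corrections):

```python
# A minimal sketch of one-loop QCD running of alpha_s from the Z mass.
# Inputs alpha_s(M_Z) ~ 0.118 and n_f = 5 are conventional values,
# not numbers taken from this post; thresholds and higher loops ignored.
import math

ALPHA_S_MZ = 0.118   # the single measured input: alpha_s at the Z mass
M_Z = 91.1876        # Z boson mass in GeV
N_F = 5              # active quark flavors near the Z scale
B0 = (33 - 2 * N_F) / (12 * math.pi)  # one-loop beta coefficient: pure math

def alpha_s(q_gev: float) -> float:
    """One-loop running of alpha_s from the Z-mass reference point."""
    return ALPHA_S_MZ / (1 + ALPHA_S_MZ * B0 * math.log(q_gev**2 / M_Z**2))

for q in (10.0, 91.1876, 1000.0):
    print(f"alpha_s({q:7.1f} GeV) = {alpha_s(q):.4f}")
```

Note that the coefficient B0 is fixed by pure mathematics (the gauge group and the flavor count); the one experimental input is alpha_s measured at the Z mass, which anchors the whole curve.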

The fact that this has held up experimentally (and to extreme precision in the case of electroweak measurements) means that the structure of the equations where renormalization is used is correct in all material respects.

By analogy, it is an accurate conversion tool in much the same way that an equation for the length of a geodesic across a spherical surface, given two coordinates on that surface, will reproduce reality with pure mathematics and one measured constant (in that case, the radius of the sphere), provided you are really on an object whose shape is very nearly spherical.
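
To make the analogy concrete, a minimal sketch of that geodesic formula (the Earth radius and the two sample coordinates are illustrative stand-ins, not anything from this thread):

```python
# Great-circle (geodesic) distance on a sphere via the haversine formula:
# pure mathematics plus one measured constant, the sphere's radius.
# (Earth values and sample cities are illustrative choices.)
import math

R_EARTH_KM = 6371.0  # the single measured constant in the analogy

def geodesic_km(lat1, lon1, lat2, lon2):
    """Geodesic distance in km between two (latitude, longitude) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R_EARTH_KM * math.asin(math.sqrt(a))

# New York to London: roughly 5,570 km on a perfectly spherical Earth.
print(f"{geodesic_km(40.71, -74.01, 51.51, -0.13):.0f} km")
```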