The essence of the argument is that different kinds of particles have different speed limits. The fastest would be the graviton, whose speed limit would be the "c" of special relativity (since gravitons are massless and have no strong, weak or electromagnetic interactions). Neutrinos and photons would have speed limits a hair smaller, due to their interactions, governed by each particle's coupling constant, with a "B" field of a particular strength present in space-time that is not entirely empty. At an appropriate B field strength, roughly 0.001 in the appropriate units, the peak speed of the photon would be reduced more than the peak speed of the neutrino, since in this theory a slightly modified Lagrangian has a term that reduces particle speed by a factor of the square of (2*pi * coupling constant * B field strength).
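For concreteness, here is a minimal sketch of that speed-limit formula as I read it. The exact normalization of the reduction term and the neutrino's effective coupling are my assumptions, not values from the paper; only the functional form and B = 0.001 come from the description above.

```python
import math

C = 299_792_458.0       # m/s, defined value of c
ALPHA = 1 / 137.036     # electromagnetic coupling constant
G_NU = 1.0e-5           # hypothetical, much weaker effective neutrino coupling
B = 0.001               # B field strength in the "appropriate units" above

def peak_speed(coupling, b_field):
    """Peak speed c * (1 - (2*pi*g*B)^2) -- one plausible reading of the text."""
    return C * (1 - (2 * math.pi * coupling * b_field) ** 2)

print(f"graviton peak speed: {C:.6f} m/s (no reduction)")
print(f"photon peak speed:   {peak_speed(ALPHA, B):.6f} m/s")
print(f"neutrino peak speed: {peak_speed(G_NU, B):.6f} m/s")
```

With these inputs the photon's peak speed comes out a fraction of a meter per second below c, and the neutrino's reduction is negligible by comparison, which is the ordering the argument needs.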
This is essentially a more sophisticated version of an idea that I explored previously: that the interactions of an average photon in open space with local electromagnetic fields from charged matter and other photons (the Earth's magnetic field, radio waves, etc.) could increase the length or duration of an average photon's travel over a distance and thereby reduce its effective peak speed. Since the neutrino interacts more weakly than the photon, it would have fewer interactions and a higher effective speed, despite having a mass.
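A toy model of that picture: effective speed is distance divided by vacuum travel time plus the time cost of interactions along the way. Every interaction count and per-interaction delay below is invented purely for illustration.

```python
C = 299_792_458.0
L = 7.30e5                 # m, roughly the CERN-to-Gran-Sasso baseline

def effective_speed(n_interactions, delay_each):
    """Average speed over L when each interaction costs delay_each seconds."""
    return L / (L / C + n_interactions * delay_each)

photon_v = effective_speed(n_interactions=1e6, delay_each=6e-17)   # invented
neutrino_v = effective_speed(n_interactions=10, delay_each=6e-17)  # invented
print(f"photon effective speed / c:   {photon_v / C:.9f}")
print(f"neutrino effective speed / c: {neutrino_v / C:.9f}")
```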
In the same vein, it is notable that the observed deviation is on the order of the square of the anomalous electron magnetic dipole moment, which is, to a first order approximation, the electromagnetic coupling constant (alpha) divided by 2*pi. Thus, (alpha/(2*pi))^2 would be just about the right factor.
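For concreteness, the arithmetic (the ~730 km CERN-to-Gran-Sasso baseline is a number I am supplying; the 18 meter figure appears in the material quoted below):

```python
import math

ALPHA = 1 / 137.036
a_e = ALPHA / (2 * math.pi)          # first-order anomalous magnetic moment
opera_deviation = 18.0 / 730_000.0   # (v - c)/c implied by 18 m over ~730 km

print(f"alpha/(2*pi)     = {a_e:.3e}")
print(f"(alpha/(2*pi))^2 = {a_e**2:.3e}")
print(f"OPERA (v-c)/c    = {opera_deviation:.3e}")
```

Readers can judge for themselves how tight the order-of-magnitude match is; the comparison is suggestive rather than exact.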
Of course, if the true value of c for special relativity and general relativity purposes is a bit higher than the measured speed of light due to this correction, then all of the horrible problems that tachyons could cause in physics go away.
I'm not convinced that non-commutative gravity itself is key, but the notion that a non-true vacuum (due to matter or fields) could slow down photons enough to make them slower than highly energetic neutrinos, at a rate independent of photon energy, does make sense.
OPERA is a pretty simple experiment. If they are wrong, they are wrong about the distance, wrong about the duration of the trip, using the wrong value of "c", or there are tachyons. Any problem with the first three has to be quite subtle. Any problem with the last is a big problem theoretically.
I'm inclined to think that the actual measurements of distance are right, with the possible exception of problems in the GPS distance formula, particularly the possibility that time dilation effects associated with distance from the center of the Earth are not properly considered. I'm inclined to think that similar time dilation effects could impact the clock synchronization, or that a variety of other effects could impact the timing of the movement of electricity through the equipment.
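For a sense of scale on those time dilation effects, here is a back-of-envelope calculation using standard textbook numbers for GPS orbits (the orbital radius and Earth's gravitational parameter are values I am supplying, not numbers from the OPERA paper):

```python
import math

C = 299_792_458.0        # m/s
GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6        # mean Earth radius, m
R_GPS = 2.656e7          # GPS orbit radius (~20,200 km altitude), m

# Gravitational (general relativistic) rate difference, satellite vs. ground.
grav = (GM / C**2) * (1 / R_EARTH - 1 / R_GPS)
# Velocity (special relativistic) time dilation of the orbiting clock.
v_orbit = math.sqrt(GM / R_GPS)
vel = v_orbit**2 / (2 * C**2)

DAY = 86_400
print(f"GR rate offset: {grav * DAY * 1e6:+.1f} microseconds/day")
print(f"SR rate offset: {-vel * DAY * 1e6:+.1f} microseconds/day")
print(f"net offset:     {(grav - vel) * DAY * 1e6:+.1f} microseconds/day")
```

This reproduces the familiar ~38 microseconds/day net GPS clock drift. Since the entire OPERA anomaly is 60 nanoseconds, relativistic clock effects hundreds of times larger per day have to be modeled essentially perfectly.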
Lubos sketches out in a post a list of candidates for problems, which I largely agree are the most likely suspects for conceptual or experimental problems in the result, although the speed of the signals within the electronic equipment at either end causing systematic issues with the timing is one he discounts more than I would:
•inconsistencies in the whole GPS methodology of measuring space and time coordinates . . .
•subtle old-fashioned physics issues neglected by GPS measurements: the index of refraction of the troposphere and (even more importantly) ionosphere that slows down and distorts the path of GPS signals; confusing spherical Earth and geoid; neglecting gravitational effects of the Alps; neglecting magnetic fields at CERN that distort things; and so on
•forgetting that 2 milliseconds isn't zero and things change (e.g. satellites move) during this short period, too [a quick sanity check on this item follows after the quoted list]
•subtle special relativistic effects neglected in the GPS calculations
•subtle general relativistic effects neglected in the GPS calculations
•wrong model of where and when the neutrinos are actually created on the Swiss side . . .
This is just a partial list but I feel that most people who have tried to find a mistake will prefer and focus on one of the categories above. Recall that to find a mistake in the Opera paper, you need to find a discrepancy comparable to their signal of 18 meters (60 nanoseconds times c). Some of the proposed mistakes lead to much too big effects relative to 18 meters and it's therefore clear that Opera hasn't made those errors; on the other hand, some errors and subtle effects are much smaller than 18 meters and may be ignored.
I have completely omitted the technicalities of their timing systems (their local, "lab" properties) because even if they're wrong about them, they're vastly more professional in dealing with them than all of us and we're unlikely to find a mistake here.
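The "things change during 2 milliseconds" item is easy to sanity-check. A rough calculation, with the baseline, satellite speed, and latitude as round numbers I am supplying:

```python
import math

C = 299_792_458.0
baseline = 7.30e5                      # CERN to Gran Sasso, m (approx.)
flight_time = baseline / C             # ~2.4 milliseconds
v_satellite = 3.9e3                    # GPS orbital speed, m/s (approx.)
v_ground = 465.0 * math.cos(math.radians(46))  # Earth rotation at ~46 N, m/s

print(f"flight time:            {flight_time * 1e3:.2f} ms")
print(f"satellite displacement: {v_satellite * flight_time:.1f} m")
print(f"ground displacement:    {v_ground * flight_time:.2f} m")
print(f"OPERA signal:           {60e-9 * C:.1f} m  (60 ns times c)")
```

A GPS satellite moves on the order of 9 meters during the flight, the same order as the 18 meter signal, so this is not a correction that can be waved away.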
He also appropriately notices that systematic errors in GPS which influence accuracy but not precision could be adapted to by ordinary people in a wide range of contexts, much as one might adapt to a slight redefinition of the units one is using.
But I think that there is a quite decent chance that the error is in the value of "c" used for special relativity limit purposes, an error that prior measurements of the speed of light failed to capture because the speed of light differs systematically from "c" in the places where we measure it. The canonical value of "c" is here.
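For scale, here is what that hypothesis would require numerically, again using the 18 meter over ~730 km figure from the quoted material:

```python
C = 299_792_458.0                 # m/s, the exact defined value since 1983
deviation = 18.0 / 730_000.0      # OPERA's implied (v - c)/c
print(f"required shift in c: {deviation * C / 1e3:.1f} km/s")
print(f"'true' c would be:   {C * (1 + deviation) / 1e3:.1f} km/s")
```

The "true" c would have to sit roughly 7 km/s above the canonical value; and since the meter has been defined in terms of c since 1983, such a shift would amount to a systematic error in our length and time standards rather than a mismeasured constant.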