[There] is a deep conflict between fundamental physical principles that form the foundation of our most basic framework for describing physics. These pillars are quantum mechanics, the principles of relativity (special or general), and locality. These pillars underlie local quantum field theory, which serves as the basis for our description of physical reality—from the shining sun to the creation of matter in the early Universe to the Higgs boson (if that's what it is).
These principles clash when pushed to the extreme—the sharpest version of the problem arises when we collide two particles at sufficient energy to form a black hole. Here, we encounter the famed black hole information problem: if the incoming particles start in a pure quantum state, Hawking's calculation predicts that the black hole evaporates into a mixed, thermal-like final state, with a massive loss of quantum information. This would violate—and thus doom—quantum mechanics.
While serious people still consider modifying quantum mechanics, so far proposals to do so create much bigger problems. . . . Quantum mechanics appears to be remarkably resistant to sensible modification. If quantum mechanics is sacred, apparently other principles must go: either those of relativistic invariance, or of locality, or both. The former likewise appears resistant to sensible modifications, but locality is a remarkably "soft" principle, in the context of a theory with quantum mechanics and gravity. So, that seems a reasonable concept to suspect.
The basic statement of locality is that quantum information cannot propagate faster than the speed of light. At least as early as 1992, modification of locality to solve the problem of getting missing quantum information out of a black hole was proposed. . . . In a context where one or more supposed bedrock principles must be discarded, we obviously need to be a little crazy—but not too crazy! . . .
Suffice it to say: while it appears that one of three basic pillars of physics must be modified, and that locality is the prime suspect, modification of locality is no small matter. Naive modifications of locality—as often proposed by physicists "on the fringe," generically lead to disastrous collapse of the entire framework of quantum field theory, which not only has been experimentally tested to a very high degree of accuracy, but underlies our entire physical picture of the world. If such modification must be made, it must be subtle indeed. It also appears that the basic picture of reality as underlain by the fabric of space and time may well be doomed. What could replace it is a framework where the mathematical structure of quantum mechanics comes to the fore. I would say more … but marching orders.
I will say I am deeply concerned about how we will arrive at a complete and consistent theory of gravity, and that we must, in order to describe not only black holes—which have been found to be ubiquitous in the universe—but also both the early inflationary and pre-inflationary evolution of our universe, as well as our seemingly dark-energy dominated future. The current problems at the foundations link to multiple big questions—and I fear it will be no small feat to resolve them.

From the answer of UC Santa Barbara physicist Steve Giddings to the 2013 "Edge" question, "What *Should* We Be Worried About?", entitled "Crisis At The Foundations of Physics."
Another pillar of fundamental physics that is closely related to and intimately intertwined with locality is the notion of causality.
The Example of Quantum Entanglement
A useful illustration of this concept is the phenomenon known as quantum entanglement (the drawing below is mine and not copied from another source).
When two quantum particles become "entangled" at a particular point in time and space (A), and then separate from each other at different points in time and space (B) and (C), their behavior remains correlated with each other.
Individually, a particular quantum mechanical property of (B) is entirely random and cannot be known until it is measured. But, if you measure that property in (B), you can know with certainty that its entangled partner, (C), will take a value of that property that is complementary to (B)'s.
So, if we measure (B) to be + then (C) will be -, no matter when we actually measure (B) and (C). For example, even if we measure (B) and (C) at precisely the same time while they are separated by ten light years, and the +/- values of (B) and (C) are indeterminate until that point, when our experimenters eventually compare their measurements, if we measured (B) to be +, we will have measured (C) to be minus (-).
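The statistics described above can be sketched in a short toy simulation (the function name is my own and purely illustrative; this captures only the same-basis measurement statistics, not any underlying physics):

```python
import random

def measure_entangled_pair():
    """Toy model of measuring one entangled pair in the same basis.

    Each outcome is individually random (50/50), but the pair is
    perfectly anti-correlated: if (B) is +, then (C) is -, and vice versa.
    """
    b = random.choice('+-')
    c = '-' if b == '+' else '+'
    return b, c

# No matter when or how far apart (B) and (C) are measured, comparing
# the records afterward always shows opposite values.
pairs = [measure_entangled_pair() for _ in range(10)]
assert all(b != c for b, c in pairs)
```

Note that this toy model is itself a trivial "hidden variables" story: the value of (C) is fixed the moment (B)'s value is drawn. That is exactly why same-basis correlations alone cannot distinguish between the explanations discussed below; the differences only bite in Bell-type experiments where the two particles are measured in different bases.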
That this is true is a matter of indisputable, empirically proven scientific fact, and it is implicit in the equations of quantum mechanics (both in the Standard Model and in all non-crackpot variants of it). But why this is possible, or how it comes to be, is an open question. Some physicists, under the slogan "shut up and calculate," are content to know that this is true without a fundamental mechanism, and doubt that the question of "how" is anything other than a category error.
But, lots of very respectable physicists and natural philosophers do ponder this question.
Alternative Explanations For Entanglement
There are three basic ways (at least) that this empirical reality can be resolved.
1. Lee Smolin, a physicist best known for his loop quantum gravity work, offers one of them in his Edge question answer: "hidden variables."
In other words, through a theory deeper than the stochastic quantum mechanical theory that we use today, a deterministic law has already resolved the question of whether each particular particle will be a + or a - when measured, even though it appears to us to be undetermined when we observe it.
He is in good company with Einstein, de Broglie, Schrödinger, and Bohm, all of whom stated at some point that they believed this to be the case.
But "hidden variables" theories are strongly disfavored by the physics community at this point, because a variety of experiments (most notably Bell tests) designed to reveal the more naive kinds of hidden variables have failed to find them. Bohm, the last of these titans of physics to weigh in on the question, formulated his hidden variables interpretation of quantum mechanics in the 1950s. The "Copenhagen interpretation," as it is called, which disavows hidden variables, rather than Bohmian quantum mechanics, is overwhelmingly the mainstream view of practicing physicists today.
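The Bell-test experiments mentioned above can be illustrated numerically (this sketch is my own, purely illustrative, and not tied to any particular experiment). For a pair of entangled spins in the singlet state, quantum mechanics predicts a CHSH correlation value of 2√2 ≈ 2.83 at the optimal measurement angles, while any local hidden variable model is bounded by 2. The toy local model below, a deterministic sign rule on a shared hidden angle (one of many possible choices), stays at or below that bound:

```python
import math
import random

def chsh_quantum(a, a2, b, b2):
    # Singlet-state correlation predicted by quantum mechanics:
    # E(x, y) = -cos(x - y) for measurement angles x and y.
    E = lambda x, y: -math.cos(x - y)
    return abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

def chsh_local_hidden_variable(a, a2, b, b2, trials=200_000):
    # Toy deterministic local model: each pair shares a hidden angle
    # lam; each side's outcome is sign(cos(lam - setting)), with the
    # pair anti-correlated. Bell's theorem guarantees |S| <= 2 for
    # every model of this kind.
    out = lambda setting, lam: 1 if math.cos(lam - setting) >= 0 else -1
    def E(x, y):
        total = 0
        for _ in range(trials):
            lam = random.uniform(0, 2 * math.pi)
            total += out(x, lam) * -out(y, lam)  # anti-correlated pair
        return total / trials
    return abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

# Measurement angles that maximize the quantum violation
a, a2, b, b2 = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4

print(chsh_quantum(a, a2, b, b2))                # 2*sqrt(2) ≈ 2.828, above the bound
print(chsh_local_hidden_variable(a, a2, b, b2))  # ≈ 2.0, never above the bound
```

Real Bell-test experiments measure the quantum value, not the local-hidden-variable one, which is why naive (local) hidden variables are ruled out; Bohmian mechanics survives only because it is explicitly non-local.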
2. A second solution is non-locality. The value of the quantum property truly is undetermined until (B) or (C) is measured, whichever happens first, but, when this happens, this is communicated instantly, rather than merely at the speed of light, from one particle to the other. The term "quantum teleportation" implicitly adopts this understanding. As Professor Giddings explains, there are multiple ways that a non-locality proposition could be implemented. I will offer here several of my own examples (rather than his examples which are a bit obscure to people not familiar with recent debates about black hole "firewalls"):
* The information truly "teleports" from (B) to (C) without passing through intermediate points (i.e. there is no speed of light limitation on information); or
* Space-time is fundamentally made up of a discrete grid of nodes with perhaps four connections each. While, on average, these nodes are arranged in a fairly smooth adjacent pattern that emergently gives rise to four-dimensional space-time, nothing prevents one node from being directly connected to another node that is far from the first node in classical space-time based on its other three connections (this is true in most loop quantum gravity theories); or
* Space-time as we know it is a four-dimensional space embedded in a larger eleven-dimensional universe, and certain kinds of quantum information can take shortcuts through one or more of the other seven dimensions, even though no known fundamental particles themselves (except perhaps the hypothetical graviton) can do so (this is true in many versions of string theory).
One reason to be skeptical of non-locality is that the phrase "whichever happens first" is inherently ambiguous in the context of relativity, because observers in relative motion can disagree about the order of events that are separated in space. There are definitional ways of resolving this ambiguity, but they aren't very elegant.
3. A third solution is backward causality. Assuming (without loss of generality) that (B) is measured first, a message goes backward in time from (B) to the point of entanglement (A) at the speed of light or some slower speed, and then goes forward in time to (C).
Importantly, because it illustrates the deep connection between locality and causality, backward causality is equivalent to stating that non-locality for quantum information is possible for entangled particles, but only within the light cone of the point of entanglement.
Which approach makes the most sense?
The trouble is that messing with any of the pillars of fundamental physics (the absence of hidden variables, locality, or causality) raises all sorts of difficult and troubling issues if not bounded in some exquisitely constrained way.
Each approach is appealing in its own way.
I tend to favor either a hidden variables or a causality violation interpretation over a non-locality interpretation, for the reason I state in my footnote. But I'm not at all certain that these interpretations aren't all mathematically and physically equivalent. Also, I recognize that my personal biases strongly influence how I weigh the possibilities. This is a situation where it is not at all clear which way Occam's Razor points.