What Is the EPR Paradox and the Uncertainty Principle?
The Uncertainty Principle
In the early 20th century, quantum mechanics was born as the double slit experiment demonstrated that particle/wave duality and collapse upon measurement were real, and physics was changed forever. In those early days, scientists banded into different camps, either defending the new theory or trying to find holes in it. One of those in the latter camp was Einstein, who felt quantum theory was not only incomplete but also not a true representation of reality. He created many famous thought experiments to try to defeat quantum mechanics, but many, like Bohr, were able to counter them. One of the largest issues was the Heisenberg uncertainty principle, which places limits on what information you can know about a particle at a given moment. According to it, I cannot give a 100% certain position and momentum for a particle at any moment. I know, it's wild, and Einstein came up with a doozy he felt defeated it. Along with Boris Podolsky and Nathan Rosen, he developed the EPR paradox (Darling 86; Baggett 167).
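For reference, the uncertainty principle is usually written (in standard textbook notation, not quoted from the sources above) as:

```latex
\Delta x \, \Delta p \geq \frac{\hbar}{2}
```

The more tightly you pin down the position spread \(\Delta x\), the more the momentum spread \(\Delta p\) must grow, and vice versa.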
The Main Idea
Two particles collide with each other. Particles 1 and 2 go off in their own directions, but I know where the collision happened by measuring that and that alone. I then find one of the particles some time later and measure its speed. By taking the distance between where the particle was then and where it is now and finding the velocity, I can find its momentum, and by conservation find the other particle's as well. I have found both the position and the momentum of a particle, violating the uncertainty principle. But it gets worse, because if I find the state of one particle, then to ensure the principle stands, the information for the other particle has to change instantly. No matter how far apart the particles are, the state must collapse. Doesn't that information transfer violate the speed of light? Did one particle need the other in order to have any properties? Are the two entangled? What is to be done about this "spooky action at a distance?" To resolve this, EPR predicts hidden variables that would restore the causality we are all familiar with, for distance should be a barrier to issues like these (Darling 87, 92-3; Blanton; Baggett 168-170; Harrison 61).
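The bookkeeping behind this argument is just conservation of momentum; as a sketch in standard notation:

```latex
\vec{p}_1 + \vec{p}_2 = \vec{p}_{\text{total}}
\quad \Longrightarrow \quad
\vec{p}_1 = \vec{p}_{\text{total}} - \vec{p}_2
```

So measuring particle 2's momentum hands you particle 1's for free, while the collision point supplies its position, all without ever disturbing particle 1.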
But Bohr developed a response. First, you would have to know the exact position of the collision, something that is impossible to do. Also, you would have to ensure that each particle contributes momentum equally, something that some particles, like photons, do not do. When you take it all into account, the uncertainty principle holds strong. But do experiments actually bear this out? Turns out, his solution wasn't entirely complete, as the following demonstrates (Darling 87-8).
The ESW Experiment
In 1991, Marlan Scully, Berthold-Georg Englert, and Herbert Walther developed a possible quantum tracking experiment involving a double slit setup, and in 1998 it was conducted. It involved creating variances in the energy state of the particles being fired, in this case rubidium atoms cooled to nearly absolute zero. This causes the wavelength to be huge and thus results in a clear interference pattern. The beam of atoms was split by a microwave laser as it entered the apparatus, and upon recombining it created an interference pattern. When the scientists looked at the different paths, they found that one had no energy change but the other had an increase caused by the microwaves hitting it. Tracking which atom came from where is therefore easy. Now, it should be noted that microwaves carry very little momentum, so the uncertainty principle should have minimal impact overall. But as it turns out, when you track this information, combining two quantum pieces of information…the interference pattern is gone! What is happening here? Did EPR predict this issue? (88)
Turns out, it's not so simple as that. Entanglement is goofing up this experiment and making it seem like the uncertainty principle is violated, when in fact the result is exactly what EPR said shouldn't happen. The particle has a wave component, and based on the slit interaction it creates an interference pattern on a wall after passing through. But when we use a microwave photon to tag which type of particle is going through the slit (energized or not), we have actually created a new entanglement. Only one level of entanglement can hold at any given point for a system, and the new entanglement destroys the old one between the energized and non-energized particles, thus destroying the interference pattern that would have arisen. The act of measurement doesn't violate uncertainty, nor does it validate EPR. Quantum mechanics holds true. This is but one example showing Bohr was right, but for the wrong reasons. Entanglement is what saves the principle, and it shows how physics does have non-locality and a superposition of properties (89-91, 94).
Bohm and Bell
This wasn't the first instance of testing the EPR experiment, by far. In 1952, David Bohm developed a spin version of the EPR experiment. Particles have either clockwise or counterclockwise spin, always at the same rate, so a particle can only be measured as spin up or spin down. So, take two particles with opposite spins and entangle them. The wave function for this system is the probability sum of the two opposite-spin arrangements, because the entanglement prevents them both from having the same spin. And as it turns out, experiment verified that the entanglement does hold and is nonlocal (95-6).
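In standard textbook notation (not quoted from Darling), the wave function for Bohm's pair is the spin singlet:

```latex
|\psi\rangle = \frac{1}{\sqrt{2}} \Big( |\uparrow\rangle_1 |\downarrow\rangle_2 - |\downarrow\rangle_1 |\uparrow\rangle_2 \Big)
```

Each term is one of the two opposite-spin arrangements; a same-spin term never appears, which is exactly the entanglement constraint described above.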
But what if hidden parameters were affecting the experiment before the measurements were taken? Or does entanglement itself perform the property distribution? In 1964, John Bell (CERN) decided to find out by modifying the spin experiment so that the object had x, y, and z spin components, all perpendicular to each other. This would be the case for particles A and B, which are entangled. By measuring the spin along just one direction (and no direction is preferred), that should be the only change to the partner particle. It is a built-in independence to ensure that nothing else is contaminating the experiment (such as information being transmitted at near c), and we can scale it up accordingly and search for hidden variables. This is Bell's Inequality: the number of pairs with x and y spins up should be no greater than the number of x/z ups plus the number of y/z ups. But if quantum mechanics is true, then upon entanglement the direction of the inequality should flip, depending on the degree of correlation. We know that if the Inequality is violated, then hidden variables would be impossible (Darling 96-8; Blanton; Baggett 171-2; Harrison 64).
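To see how quantum correlations can flip the inequality, here is a minimal sketch in Python using the standard quantum prediction for a spin-singlet pair. The angles chosen are a common textbook illustration, not values taken from the cited sources:

```python
import math

# Quantum prediction for a spin-singlet pair: the probability that both
# particles measure "up" along axes separated by angle theta is
#   P(theta) = (1/2) * sin^2(theta / 2)
def p_both_up(theta):
    return 0.5 * math.sin(theta / 2) ** 2

# Wigner's form of Bell's inequality for three measurement axes a, b, c:
#   P(a+, b+) <= P(a+, c+) + P(c+, b+)
# Any local hidden-variable model must satisfy this.
a = 0.0
b = math.radians(120)
c = math.radians(60)   # axis c sits halfway between a and b

lhs = p_both_up(b - a)                       # P(a+, b+) = 0.375
rhs = p_both_up(c - a) + p_both_up(b - c)    # 0.125 + 0.125 = 0.250

print(f"LHS = {lhs:.3f}, RHS = {rhs:.3f}")
print("Inequality violated:", lhs > rhs)     # True: quantum beats the bound
```

With these angles the "less than or equal" side comes out larger, so no assignment of pre-existing spin values can reproduce the quantum correlations.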
The Alain Aspect Experiment
Testing Bell's Inequality in reality is tough, given the number of variables one must control. In the Alain Aspect experiment, photons were chosen because they are not only easy to entangle but have relatively few properties that could goof up a setup. But wait, photons have no spin! Well, turns out they do, but only along one direction: the direction they are moving. So instead, polarization was employed, for the waves that are selected and not selected can be made analogous to the spin choices we had. Calcium atoms were hit with laser light, exciting electrons to a higher orbital and releasing photons as the electrons fell back. Those photons were then sent through a collimator, polarizing them. But this presents a potential problem: information could leak around the setup and goof up the experiment by creating new entanglement. To resolve this, the detectors were placed 6.6 meters out, ensuring that the time to set the polarization (10 ns) plus the time of travel (20 ns) would be shorter than the time (40 ns) it would take for entangled information to be communicated across the apparatus – too long to change anything. Scientists could then see how the polarization turned out. After all this, the experiment was run, and Bell's Inequality was violated, just as quantum mechanics predicted! A similar experiment was done in the late 1990s by Anton Zeilinger (University of Vienna), whose setup had the measurement angles chosen randomly and very close to the moment of measurement (to ensure it was too fast for hidden variables) (Darling 98-101; Baggett 172; Harrison 64).
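The locality argument here is just arithmetic, and it checks out. A quick sanity check in Python, using the 6.6 m figure and the round 10/20/40 ns timings quoted above (with c rounded to 3×10^8 m/s):

```python
# Sanity check of the timing argument in the Aspect setup.
# Figures are the round numbers quoted in the text, not exact lab values.
C = 3.0e8    # speed of light in m/s (rounded)
ARM = 6.6    # meters from the source to each polarizer

# Time for any light-speed signal to cross from one side to the other:
crossing_ns = (2 * ARM) / C * 1e9
print(f"signal crossing time: {crossing_ns:.0f} ns")  # ~44 ns, about the 40 ns quoted

# The measurement itself: polarization setting (10 ns) plus photon travel (20 ns):
measurement_ns = 10 + 20
print("measurement finishes before any signal could arrive:",
      measurement_ns < crossing_ns)
```

Since the measurement wraps up in about 30 ns and no signal can cross the apparatus in under roughly 40 ns, one side cannot "tell" the other what setting was chosen.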
Loophole Free Bell Test
However, an issue is present, and it's the photons. They are not reliable enough because of the rate of absorption/emission they undergo. We have to make the "fair sampling assumption," but what if the photons we lose actually contribute to the hidden variable scenario? That's why the loophole-free Bell test done by Hanson and his team at Delft University in 2015 is huge, because it switched from photons to electrons. Inside a diamond, two electrons were entangled and located in defect centers, places where a carbon atom should be but isn't. Each electron was put at a different location across the center. A fast number generator was used to decide the direction of the measurement, and that choice was stored on a hard drive right before the measurement data arrived. Photons were used in an informational capacity, exchanging information between the electrons to achieve entanglement across 1 kilometer. This way, the electrons were the driving force behind the experiment, and the results pointed to the Bell Inequality being violated by up to 20%, just as quantum theory predicted. In fact, the chance that hidden variables were behind the experiment's results was only 3.9% (Harrison 64).
Over the years, more and more experiments have been carried out, and they all point to the same thing: quantum mechanics is correct on the uncertainty principle. So, rest assured: reality is just as crazy as we all thought it was.
Baggett, Jim. Mass. Oxford University Press, 2017. Print. 167-172.
Blanton, John. “Does Bell’s Inequality rule out local theories of quantum mechanics?”
Darling, David. Teleportation: The Impossible Leap. John Wiley & Sons, Inc., New Jersey, 2005. Print. 86-101.
Harrison, Ronald. "Spooky Action." Scientific American. Dec. 2018. Print. 61, 64.
© 2018 Leonard Kelley