Understanding quantum mechanics is a running joke among scientists. No one fully agrees on the results or the implications of many of its experiments, yet a framework exists that offers solutions. Can quantum mechanics truly be understood? I think so, but it requires approaching the theory's problems from different perspectives. One can start with the experiments used to verify quantum behavior, and why they arose in the first place. We should begin with the fundamental challenge to quantum mechanics, known as local hidden variables, and see how pursuing it led to the realization of nonlocal behavior.
Fighting for Classical Understanding
Many scientists were unhappy with quantum mechanics in the early 20th century. It seemed to violate properties that classical mechanics had established experimentally over centuries, and yet quantum mechanics offered a probabilistic model that matched experimental results despite lacking a real foundational why. This made people wonder if quantum mechanics was just a tool describing something else going on: a fundamental property of reality unknown to scientists that, once uncovered, would remove the seemingly maddening consequences of quantum mechanics. Hence the hunt for local hidden variables (Ananthaswamy 61-3).
John Bell developed a famous inequality in his 1964 theorem (see my EPR article for more details on that interesting scientific quest) to find out if quantum mechanics was the only thing at play, but testing it required generating single particles. And even if you could do that, you had to somehow ensure the particle you created was the particle you detected and not just a happenstance from something else (Ibid).
Enter the Aspect and Grangier experiment. It makes use of a Mach-Zehnder interferometer, which we will see is a common tool in quantum studies. In this set-up, excited calcium atoms served as the light source, because they emit a distinct signal as they drop in orbital energy: a 551.3-nanometer (green) photon followed by a 422.7-nanometer (blue) one. The green photon scatters out and is detected separately, then the blue one is sent through the rig 1 nanosecond later (thus ensuring continuity of our signal and removing the randomness of the detection). This photon encounters a beam splitter, capable of sending it down one of two paths. Mirrors at the end of each of these paths redirect the photon to a second beam splitter, and a detector sits at the end of each of our routes. How many options are there for a photon to travel? (68-74)
Well, the first beam splitter allows one of two options: a reflected path or a transmitted path. Upon meeting the second beam splitter, the reflected beam can be reflected again (RR) or transmitted to the 1st detector (RT), while the transmitted beam can be reflected (TR) or transmitted through to the 2nd detector (TT). So it would seem we have 4 options here, and this is where the world of quantum mechanics will get to you and make you see things in a new light. The reflected and transmitted paths each have an equal chance of happening, so uncertainties arise from their patterns, and then the 2nd beam splitter causes another uncertainty (Ananthaswamy 74-9, Sweatman).
In this case, the TT and RR paths create destructive interference, destroying both and causing a null result, while RT and TR both go to the same detector. The reason is a subtle difference in path length (specifically, ¼ of a wavelength) for a reflected photon, which travels that extra distance in the splitter's material as it is reflected onto its new path. An RT photon suffers no further delay at the second splitter, but the complementary TR photon picks one up there, meaning the two are now in sync and interfere constructively. The RR photon, however, is now a half wavelength out of sync, and when it meets the TT wave they cancel each other out destructively. With this set-up, 100% of the photons go to a single detector (Ibid)!
This could only arise if light is a wave, exhibiting a duality of wave and particle behavior, and no matter the number of photons sent down, this pattern will hold. And if I remove the 2nd beam splitter, no interference arises, and so each detector gets half the signals, just as the probabilities would lead us to expect. That is the superposition of states at work, and it makes no claims about actual paths taken or not taken, only probabilities (Ibid).
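The path bookkeeping above can be sketched numerically. Here is a minimal toy model of my own (an illustration, not the actual experiment's analysis), using the common convention that each 50/50 beam splitter gives the reflected amplitude a 90-degree phase shift. That single rule reproduces both results: all photons at one detector with two splitters, and a 50/50 split with the second splitter removed.

```python
def beam_splitter(state):
    """50/50 splitter: reflection picks up a 90-degree (factor i) phase shift."""
    a, b = state                      # complex amplitudes in the two arms
    s = 1 / 2 ** 0.5                  # keeps total probability equal to 1
    return (s * (a + 1j * b), s * (1j * a + b))

def probabilities(state):
    """Detection probability at each output is the squared amplitude."""
    return [round(abs(amp) ** 2, 3) for amp in state]

photon = (1 + 0j, 0j)                 # photon enters entirely in arm 0

one_splitter = beam_splitter(photon)          # 2nd splitter removed
two_splitters = beam_splitter(one_splitter)   # full Mach-Zehnder

print(probabilities(two_splitters))   # [0.0, 1.0]: every photon at one detector
print(probabilities(one_splitter))    # [0.5, 0.5]: even, particle-like split
```

The quarter-wavelength delay from the article is exactly that factor of i (a 90-degree phase); two reflections give a half wavelength (a factor of -1), which is what cancels the RR path against TT.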
However (you knew it was coming), this all points to a subtle but troubling issue. We do all this work and develop quantum models for the behavior seen, but shouldn't the tools used to measure the phenomena also be subject to quantum mechanics? Traditionally, the Copenhagen interpretation of quantum physics has the act of measuring cause the collapse from probability to a definite state, but if our tools are also part of a quantum reality, then how can we be certain they don't impact our results? How do we know they themselves are not in a state of superposition? (Ananthaswamy 80-2)
We would need to measure to find out, but then the same problems stack up for our new measurements. As Ananthaswamy says, “Where’s the boundary between the classical and the quantum?” This is known as the measurement problem, and it gives real pause as we grapple with an infinite chain of reasoning in which nothing we measure with can be fully trusted to be in a strictly classical state. We are faced with a scenario in which “reality does not exist in any meaningful sense independent of measurement with a classical apparatus” (Ibid).
Delayed Choice Experiment
Maybe the resolution lies in the most basic unit of our quantum system: the photon. The particle/wave duality postulated by Bohr means that you find one behavior or the other at any given measurement, though the photon otherwise carries both properties. It is trying to find both at once that leads to interference patterns. But how does the quantum system decide which state to show us? John Wheeler captured this in his ‘great smoky dragon’ metaphor: going from the tail (where our photon is before entering the interferometer) to the head (where the detectors are located), the body is fuzzy, all concealed by smoke (like a probability cloud). We need some clarity if we want to know the structure of this process. And so Wheeler proposed what is known as the delayed choice experiment in an effort to reveal that body and show the state-selection process in action (86).
In this thought experiment, a Mach-Zehnder interferometer is used again, with the ability to remove or insert the 2nd beam splitter after the photon has passed through the first one. With the splitter absent, the photon goes through unaltered and our 50/50 chance of detection at either location is upheld. But it gets interesting if I insert the 2nd beam splitter after the photon has passed the first one but before it is detected, or instead remove the splitter in that same window. Without the splitter we have particle behavior, but with it we should get that wave interference pattern as a result of the two paths interfering with each other (Ananthaswamy 86-7, Choi).
You would think such considerations shouldn't matter, since the path has already been taken, but remember this is a quantum system with different states to it. Alterations like those we talked about do impact the photon's path, or its wave function, seemingly in violation of causality. Our photon appears to retroactively change the path it took: no splitter gives a particle detection, while the beam splitter hints at a wave detection. That is to say, our photon has a delayed choice in what behavior it displays, particle or wave, until the moment of detection. So, does reality match up to this thought experiment? (Ibid)
To test this out, we first have to overcome some hurdles. We need the ability to remove and insert a beam splitter within a tight time frame of a few nanoseconds for a roughly 20-foot-long interferometer, so longer interferometer arms mean more time to play with. Also, the calcium source from before wouldn't be sufficient at this length or easy to confine, so a narrow beam of single photons needed to be generated. Such technology didn't exist for years, but in 2005 a test was done with a 48-meter interferometer, and a few years later a space-based version spanned nearly 1,800 kilometers (Ananthaswamy 88-90, Choi).
Shockingly, the thought experiment was borne out in both instances. With no 2nd beam splitter the photon displayed particle behavior, and with it, wave behavior. No matter when the beam splitter was added or removed, the overall behavior remained the same. Only the actual act of measuring the system finally causes the collapse, so until you do that you can alter the set-up as much as you want, provided it doesn't create a measurement in the process (Ibid).
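That timing-independence falls right out of the amplitude bookkeeping. Here is a short Python sketch of my own (a toy model, with a hypothetical flag standing in for the experimenter's delayed decision): the detection statistics depend only on whether the second splitter is in place when the photon arrives, never on when the choice was made.

```python
def beam_splitter(a, b):
    """50/50 splitter; reflection gets a 90-degree (factor i) phase shift."""
    s = 1 / 2 ** 0.5
    return s * (a + 1j * b), s * (1j * a + b)

def run(insert_second_splitter):
    state = beam_splitter(1 + 0j, 0j)   # photon has already passed splitter 1
    # ...photon is in flight along both arms; the decision is made only now...
    if insert_second_splitter:          # the delayed choice
        state = beam_splitter(*state)   # recombine the two paths
    return [round(abs(amp) ** 2, 2) for amp in state]

print(run(True))    # [0.0, 1.0]: wave-like interference
print(run(False))   # [0.5, 0.5]: particle-like split
```

Nothing in the model records *when* `insert_second_splitter` was decided, which mirrors the experimental finding: only the final configuration at detection matters.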
But (again, surprised?) this seems to fly in the face of relativity, which would have the photon and beam splitter be space-like separated, so that any changes to one shouldn't impact the other, and yet they do. To many, this again hinted at local hidden variables awaiting detection by scientists, but as we will see, that was conclusively shown to be impossible (Ibid).
EPR Paradox and Bell’s Theorem
In 1927, Einstein developed a thought experiment involving photons passing through a hole in a screen. Our photon is described by a wave function, and once it passes through the hole, that wave function offers a hemisphere of probability as to the photon's location. If we measure the photon to be at one place, all the other possible locations the wave function predicted instantly collapse, no matter how far away they are. This flies in the face of relativity again, with a local action outpacing the speed of light and thus rendering a non-local phenomenon (Ananthaswamy 94-99).
This central idea is critical to the EPR paradox of 1935, in which two particles interact and become entangled, meaning their respective states are correlated with each other. Information about one instantly tells me the information of the other, no matter the distance. If nature is local, this shouldn't be possible unless a hidden variable is present that could explain the feature away, leaving quantum mechanics grasping for a solution. On that view, the collapse of a wave function is not a truly non-local event but merely a sign of an incomplete description of physical reality. If, however, no hidden variables exist, then reality is instead non-local (Ibid).
For years, this paradox gave classical realists comfort, because it was a way out of the confusion that quantum mechanics offered. In 1964, Bell came up with a theorem to see if such hidden variables could be tested for rather than just postulated. In his scheme, two photons whose polarization information is entangled go off to two different people. Each makes measurements at chosen angles without reporting them to the other person. If the two truly are entangled, the correlations between the results should be stronger than any arrangement of hidden variables could produce; if hidden variables are at work instead, the correlations must stay below a strict limit (103-7).
To ensure a proper test of this, we need the events to be space-like separated so that information cannot be transferred at or below light speed, and over the years experiments have shown that local hidden variables cannot exist, because of the correlations seen. Indeed, reality is non-local and strange (Ibid).
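That strict limit can be made concrete with the CHSH form of Bell's inequality. The sketch below is my own illustration, using the textbook quantum prediction E(a, b) = cos 2(a − b) for polarization-entangled photons and the conventional angle settings (neither is taken from the source): any local hidden-variable theory must keep |S| ≤ 2, while quantum mechanics predicts 2√2 ≈ 2.83, and experiments side with the quantum value.

```python
from math import cos, pi

def E(a, b):
    """Quantum-predicted correlation for polarization-entangled photons
    measured at polarizer angles a and b (radians)."""
    return cos(2 * (a - b))

# Conventional angle settings that maximize the CHSH combination
a1, a2 = 0.0, pi / 4          # Alice's two measurement angles
b1, b2 = pi / 8, 3 * pi / 8   # Bob's two measurement angles

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(round(S, 3))            # 2.828, i.e. 2*sqrt(2): above the classical bound of 2
```

The whole argument lives in that one number: no assignment of pre-existing (hidden) polarization values can push S past 2, so measuring anything near 2.83 rules local hidden variables out.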
And where does this take us? For more, check out my quantum eraser experiment article, as we watch causality take even further hits.
Ananthaswamy, Anil. Through Two Doors at Once. New York: Random House, 2018. Print. 61-63, 68-82, 86-90, 94-99, 103-07.
Choi, Charles Q. “Space-Based Test Proves Light’s Quantum Weirdness.” Scientificamerican.com. Scientific American, 25 Oct. 2017. Web. 18 Feb. 2020.
Sweatman, Will. “The Quantum Eraser.” Hackaday.com. Hackaday, 07 Sept. 2016. Web. 18 Feb. 2020.
This content is accurate and true to the best of the author’s knowledge and is not meant to substitute for formal and individualized advice from a qualified professional.
© 2021 Leonard Kelley