Entanglement

EXPERIMENTAL EVIDENCE FAVORING QUANTUM MECHANICS

In 1969, Clauser, Horne, Shimony, and Holt generalized Bell’s work so that realizable experiments could be designed. For the first time, the debate between Einstein and Bohr found its way out of the foggy world of epistemology and into the verifiable domain of experimental physics. Many experiments have since been conducted, and all of them have favored quantum mechanics: their results violate the limits imposed by a “local hidden variable” theory of the type favored by Einstein and his collaborators. In spite of this spectacular success, more work remained to be done, as the conclusions of these experiments are still the subject of considerable interest and debate. Every experimental test of entanglement appears to have one or more loopholes, raising the slim possibility that “some alternative theory, distinct from quantum mechanics and more aligned with Einstein’s intuitions,” could be in play.

EARLY EXPERIMENTS

The first experiments that put Einstein’s thought experiment into practice were carried out at the University of California, Berkeley and Harvard University in 1972, followed by Texas A&M University in 1976. Freedman and Clauser (Freedman & Clauser, 1972) were the first to measure the linear polarization correlation of the photons emitted in an atomic cascade of calcium. Earlier, Clauser, Horne, Shimony, and Holt’s generalization of Bell’s inequality had shown that the existence of local hidden variables imposes restrictions on this correlation that are in conflict with the predictions of quantum mechanics. Freedman and Clauser’s data agreed with quantum mechanics and violated these restrictions to high statistical accuracy, thus providing strong evidence against local hidden-variable theories. More experiments followed. Barring some initial inconsistencies, these experiments agreed with quantum mechanics, with violations of Bell’s inequalities reported by as much as 6 standard deviations. However, a few loopholes quickly surfaced, allowing the possibility of a local realist interpretation of these results.
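
For reference, here is the CHSH form of Bell’s inequality in its standard textbook statement (a, a’ and b, b’ denote two alternative analyzer settings on each side, and E(a, b) is the correlation coefficient, a number between −1 and +1, measured for a given pair of settings):

S = E(a, b) − E(a, b’) + E(a’, b) + E(a’, b’),  with |S| ≤ 2 for any local hidden-variable theory,

whereas quantum mechanics predicts values of |S| as large as 2√2 ≈ 2.83 for suitably chosen settings. It is this gap between 2 and 2.83 that the experiments described below set out to probe.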

ALAIN ASPECT

In the early 1980s, Alain Aspect and his colleagues began designing a series of experiments at Orsay, France, to test Bell’s inequalities. To create the entangled photons, calcium atoms were irradiated by a laser beam, causing them to transition to a high-energy state. The excited atoms decayed by emitting two photons in opposite directions with their polarizations8 entangled, meaning they were instantaneously connected over any distance in such a way that the measured property of one depended on the other, like a pair of tossed dice that always rolls doubles.

[Image: first-orsay-experiment]
Figure 6: Einstein-Podolsky-Rosen-Bohm Gedanken experiment with photons. The two photons ν1 and ν2, emitted in the state ψ(1,2), are analyzed by linear polarizers in orientations a and b. One can measure the probabilities of single or joint detections in the output channels of the polarizers.

Their first experiment used a single detector with a polarizer9 in front of it on each side of the atomic beam. Some photons passed through, while others were stopped by the polarizer. Each detector recorded the photons that passed through for various combinations of the polarizer settings (the polarizers were rotated to register photons of other polarization directions). The polarizations of pairs of photons were measured. Although each individual measurement gave random results, Aspect found that these results were in fact correlated (as if the photons “rolled doubles” more than a certain fraction of the time), clearly violating Bell’s inequalities for local hidden-variable theories. However, a loophole, called the detector efficiency loophole, was quickly spotted. The detectors could sometimes “miss” a photon, and the absence of a count in a particular detector was a significant piece of information; it could itself be a signal, indicating, for example, that the polarization was horizontal when the polarizer was vertical. This meant that one could explain the result using a local theory that just happened to “miss” photons in a particularly lucky pattern.

To overcome the detector efficiency loophole, the experiment was repeated with four detectors, two on each side, counting data only when one photon was detected on each side. Again, the measured correlations exceeded the classical limit, this time by an astonishing 40 times the statistical uncertainty in the measurement.

But the fundamental loophole in these experiments, according to Bell, was the “locality loophole”: the result of a measurement at one polarizer should not depend on the orientation of the other. Bell proposed a way to close this loophole. He argued that if the orientation of each polarizer was chosen while the photons were in flight, then relativistic causality (no influence can travel faster than light) would prevent a polarizer from “knowing” the orientation of the other during measurement. Thus, an ideal test of Bell’s inequalities would require randomly choosing the orientation of each polarizer.

[Image: orsay-experiment]
Figure 7: Aspect’s timing experiment with optical switches (C1 and C2).

The switch C1, followed by the two polarizers in orientations a and a’, is equivalent to a single polarizer switched between the orientations a and a’. Switching occurs about once every 10 ns. A similar setup is implemented on the second side. Source: Aspect, 2000.

Aspect subsequently came up with a design involving a switching device and two polarizers on each side: orientations a and a’ on side I, and b and b’ on side II. The optical switch C1 on side I rapidly redirected the photons to either polarizer a or polarizer a’, thus acting like a single polarizer whose orientation could be changed between a and a’. Side II had a similar setup, with a variable polarizer capable of switching between b and b’. The distance L between the two switches was 13 m, so that the time taken by a signal to travel between them, L/c = 43 ns, was significantly larger than the delay between two switching operations (about 10 ns) plus the time delay between the emission of the two successive photons (5 ns on average).
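
To make the timing argument concrete, the arithmetic with the numbers quoted above is simply:

L / c = 13 m / (3.0 × 10^8 m/s) ≈ 43 ns,  while  10 ns (switching interval) + 5 ns (average photon emission delay) = 15 ns.

Since 43 ns is far larger than 15 ns, no signal traveling at or below the speed of light could inform one side of the other side’s current setting before the measurement there was completed.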

Aspect’s clever design allowed the orientations of the polarizers to be changed rapidly while the photons were still in flight. It effectively closed the locality loophole, the possibility that one measurement device could somehow communicate its setting to the other in time to explain any coordination between the photons. Bell’s inequality was still violated, by 6 standard deviations.

In spite of the improved design, the polarizer orientations in Aspect’s experiment couldn’t be made fully random because of certain technical difficulties. Also, the distance between the source and the polarizers was such that communication between them at the speed of light couldn’t be excluded.

ANTON ZEILINGER

The definitive experiment was performed by Anton Zeilinger and his team at the University of Innsbruck in 1997. The experimenters used a much improved source of entangled photons and quantum random-number generators to set the detector orientations. Again, Bell’s inequality was violated, this time by several tens of standard deviations.

The method that Zeilinger’s team used to generate polarization-entangled photons is called spontaneous parametric down-conversion (SPDC). We won’t burden ourselves with the details of SPDC until the next section, but for now, suffice it to say that when a special kind of crystal is irradiated with a strong laser beam, a photon from the beam can be converted into two polarization-entangled daughter photons. These photons were subsequently used in the Innsbruck experiment.

The orientations of the polarizers were varied every nanosecond by placing a device called an electro-optical modulator in front of each polarizer (Figure 8). The effect was equivalent to rotating the polarizer by a specific angle. This was done on both the sender and receiver sides, independently and randomly, using a quantum random-number generator (QRNG).

[Image: innsbruck-experiment]
Figure 8: Setup of the Innsbruck experiment with two independent observers, Alice and Bob (Weihs, Jennewein, Simon, Weinfurter, & Zeilinger, 1998). Two entangled photons are dispatched by way of optical fiber cables to Alice and Bob, whose measuring stations are located about 400 m apart. The polarization measurement direction for the in-flight photons at each station is set by an independent QRNG, and events are registered independently on both sides so that coincidences can be identified long after the experiment is finished.

Random number sequences are needed in many theoretical and experimental scenarios, and usually complex algorithms are summoned to generate them. But the problem with such algorithmically generated (pseudo-random) sequences is that they are never strictly random: repetitive patterns eventually emerge (as the short sketch below illustrates). Quantum random numbers, on the other hand, are vastly superior because their unpredictability is intrinsic rather than merely a product of algorithmic complexity.
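
To see why an algorithmic generator must eventually repeat, here is a deliberately minimal Python sketch of a linear congruential generator. The tiny modulus is chosen purely so that the cycle becomes visible after a handful of draws; real pseudo-random generators have vastly longer, but still finite, periods.

# A deliberately tiny linear congruential generator (LCG).
# With modulus m = 16 the sequence must start repeating after at most 16 draws.
def lcg(seed, a=5, c=3, m=16):
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(seed=7)
print([next(gen) for _ in range(20)])
# The 20 printed values already contain a complete cycle: an algorithmic
# sequence, however sophisticated the generator, repeats eventually.

The Innsbruck team sidestepped this problem by harnessing the intrinsic randomness of the quantum world to build their QRNG. Here is how.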

There are many examples of quantum randomness in nature. Imagine the case in which a vertically oriented polarizer meets a photon polarized at 45 degrees. The photon has a 50-50 chance of passing through the polarizer, and the process is completely random: nobody can predict whether a specific photon will pass through or not. A similar kind of randomness is seen in double-slit experiments, where it is impossible to predict through which slit a particular photon will pass.
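
In quantum-mechanical terms, a photon polarized at an angle θ to the polarizer’s transmission axis passes with probability cos²θ (Malus’s law applied to single photons). For θ = 45°:

P(pass) = cos² 45° = (1/√2)² = 1/2,

and which outcome actually occurs for any particular photon is irreducibly random.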

The Innsbruck team used half-silvered mirrors, or beam splitters, to construct their QRNG. When a single photon is incident on a half-silvered mirror, it has a 50 percent chance of passing through (transmission) and a 50 percent chance of being reflected. If a photon detector is placed behind the half-silvered mirror to register the transmitted photons and a second detector is placed in the reflected beam to catch the reflected photons, both detectors will have equal probability of registering photons. If we associate 1 with the “click” of the detector for transmitted photons and 0 with that for reflected photons, a random sequence of zeros and ones emerges from a given stream of photons. A fraction of the huge sequence of random numbers obtained by the Innsbruck team is reproduced here (Zeilinger, 2010):

[Image: random-numbers]
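
The logic of such a beam-splitter QRNG is easy to mimic in a simulation. The following Python sketch is purely illustrative: it uses a classical pseudo-random draw to stand in for the genuinely indeterministic quantum outcome, which no classical program can truly reproduce.

import random

# Illustrative stand-in for a beam-splitter QRNG: each incident photon is
# transmitted (recorded as 1) or reflected (recorded as 0) with probability 1/2.
# NOTE: random.random() is a classical pseudo-random source used only to mimic
# the statistics; a real QRNG draws its randomness from the photons themselves.
def beam_splitter_bits(n_photons):
    return [1 if random.random() < 0.5 else 0 for _ in range(n_photons)]

print("".join(str(bit) for bit in beam_splitter_bits(64)))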

The photon pairs were generated in a building at the center of the campus, and the two photons were directed to two different measurement stations located about 400 meters from each other. The Innsbruck team illuminated a beam splitter with a weak light source, producing a new random bit in less time than a signal would need to reach the detector on the other side. In other words, the polarization settings on both sides were changed very rapidly and at the last moment by electro-optic modulators, which rotated the polarization by an angle proportional to the applied voltage and were themselves controlled by the QRNGs. Thus the Innsbruck experiment was able to eliminate the nagging communication loophole, because any coordination between the detectors on the two sides would have required signals traveling faster than light. Yet a violation of Bell’s inequality was observed by several tens of standard deviations (S = 2.73 ± 0.02), and quantum entanglement was clearly established with completely independent measuring stations.
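
For comparison with the measured S = 2.73, here is a short Python sketch of the ideal quantum-mechanical prediction for the CHSH quantity S introduced earlier, evaluated at the standard optimal analyzer settings. This is a sketch of the textbook calculation for a polarization-entangled pair of the (|HH⟩ + |VV⟩)/√2 type, not a reconstruction of the Innsbruck analysis itself.

import math

def E(a_deg, b_deg):
    # Correlation coefficient for polarization analyzers at angles a and b
    # (in degrees) acting on the entangled state (|HH> + |VV>)/sqrt(2):
    # E(a, b) = cos 2(a - b).
    return math.cos(math.radians(2 * (a_deg - b_deg)))

# Standard CHSH settings (degrees) that maximize the quantum violation.
a, a_prime, b, b_prime = 0.0, 45.0, 22.5, 67.5

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(f"S = {S:.3f}")   # about 2.828 = 2*sqrt(2), versus the local bound |S| <= 2

The ideal value 2√2 ≈ 2.83 is slightly higher than the measured 2.73 because real sources, optics, and detectors are imperfect; both comfortably exceed the local-hidden-variable bound of 2.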

FUTURE EXPERIMENTS

This could be the end of the story: all experimental tests conducted so far have irrefutably favored quantum mechanics. But curiously, a particularly subtle loophole, called the “free will” loophole, still lingered: the possibility that the settings of the detectors that measure the entangled photons could be correlated with “hidden” information in their shared causal past. In Bell’s own words: “It is the requirement of locality, or more precisely that the result of a measurement on one system be unaffected by operations on a distant system with which it has interacted in the past, that creates the essential difficulty.”

[Image: cosmic-bell]
Figure 9: Schematic of the “Cosmic Bell” experiment. A source sends entangled particles to two distant detectors. While the particles are in flight, light from distant quasars x and y is used to randomly choose the detector settings a and b on each side, which then measure experimental outcomes A and B. Many runs would be compared to confirm that the results always violate the Bell inequalities and are independent of the choices of which quasars to observe. Source: Gallicchio

In fact, according to a recent theoretical study, only a small correlation between the detector settings and any local hidden variables would be sufficient to mimic the predictions of quantum mechanics. Previous experiments were able to rule this out only for the last millisecond or so before each test began. The recently proposed “Cosmic Bell” experiment aims to eliminate this possibility “over almost the entire history of the universe, all the way back to the Big Bang 14 billion years ago.”

This may sound bizarre, but here is how the experiment will work. The detector settings would be determined not by the experimenters but by two distant cosmic sources (for example, light from distant quasars). A source will send entangled particles to two distant detectors in the usual way. While the particles are in flight, light from the two cosmic sources will be used to randomly choose the settings (e.g., polarizing filter angles) for each of the two detectors, which will then measure experimental outcomes A and B, as shown in Figure 9. The quasars will act as a special kind of random number generator, based on the random arrival times of light from them.
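
As a purely illustrative sketch of how arrival times might be turned into setting choices (the scheme actually adopted by the Cosmic Bell collaboration may differ in its details), one could take the parity of each photon’s arrival timestamp at some fine time resolution:

# Illustrative only: convert a stream of photon arrival times (in nanoseconds)
# into binary detector-setting choices by taking the parity of the timestamp
# at a chosen resolution. Hypothetical helper, not the collaboration's code.
def settings_from_arrival_times(arrival_times_ns, resolution_ns=1.0):
    return [int(t // resolution_ns) % 2 for t in arrival_times_ns]

# Example with made-up arrival times:
print(settings_from_arrival_times([12.4, 57.9, 103.2, 220.7, 221.5]))
# -> [0, 1, 1, 0, 1]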

Selecting sufficiently distant cosmic sources will ensure that they have never been in causal contact with each other. Because of the astronomical distance between them, “conspiring” hidden variables would not have had enough time to influence both quasars, as the light from these quasars was emitted billions of years ago.


8. Polarization is the property that describes the plane in which light oscillates. A polarization filter allows light to pass only if it oscillates in a specific plane.
9. A polarizer, or polarization filter, selects the direction of polarization of light (or of a photon).
