As we approach that magical time of the year again when Santa Claus will set off on his merry way, bringing gifts to children around the world, that familiar question arises:
How on Earth does he visit all those different places in a single night?
In many sceptics’ minds Santa seems to defy the laws of physics. But for quantum physicists there is no issue. The most modern theory, according to Professor John Goold and Dr Mark Mitchison (of the TCD QuSys research group in Trinity), is that Santa Claus is in fact exploiting quantum mechanics to deliver the gifts.
In a nutshell, quantum mechanics allows objects (and Santa, Rudolph and co.) to be in many places simultaneously. That is the key ingredient, which allows for his extraordinarily efficient delivery on Christmas Eve.
Quantum physics describes the basic building blocks of the stuff we can see around us. It explains almost everything we understand about the world: how the sun shines, why metal looks and feels different from plastic or wood, and very many other things. But quantum physics also makes some bizarre predictions, starting with the fact that objects can be in “superposition”, meaning that they exist in many different places at the same time!
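Superposition can be made concrete with a toy calculation. The sketch below (illustrative only: a generic quantum bit, nothing Santa-specific) shows the Born rule at work: squared amplitudes give the probabilities of finding the object in each place when you look.

```python
import numpy as np

# A hypothetical qubit in an equal superposition of "here" (0) and "there" (1).
state = np.array([1, 1]) / np.sqrt(2)  # amplitudes for the two locations

# Born rule: the probability of each outcome is the squared amplitude.
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5] -- each location is found half the time

# Simulating many observations: each one "collapses" the superposition
# to a single definite location.
rng = np.random.default_rng(seed=42)
outcomes = rng.choice([0, 1], size=10_000, p=probs)
print(outcomes.mean())  # close to 0.5
```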
Professor Goold, Assistant Professor and Royal Society URF in Physics at Trinity, said:
“Experiments show that these weird states describe tiny things – like atoms – but also much larger things too. In fact, an important part of our job as physicists is trying to put bigger and bigger objects into superpositions, which we think will help us to build ultra-fast computers and a more secure Internet in the future. But we still haven’t learned to do it as well as Santa can!
“There is now little doubt among quantum physicists that Santa is exploiting what we know as ‘macroscopic quantum coherence’, which is precisely the same resource used by cutting-edge quantum technologies to outperform technologies based on classical physics.”
Einstein vs Santa
Historically the idea that an object can be in a macroscopic superposition has led to significant controversy. In fact, it has led many scientists over the years to question if quantum physics can really be true. Probably the most famous critic was Albert Einstein, who helped to discover quantum physics over 100 years ago, but then spent the rest of his life arguing that it was incomplete.
However, an intriguing rumour that has been circulating since Einstein’s time is that he hated Christmas (basically, the rumour, which may or may not have originated at the North Pole, implies he was a Grinch who didn’t like Santa Claus). Even after sparking a revolution in physics and establishing himself as the smartest man in history, Einstein still wasn’t as famous as Santa…
Green with envy, some believe Einstein tried to discredit Santa by arguing that quantum superpositions were impossible so no one could possibly visit all the children in the world in one night. Nowadays, scientists don’t take Einstein’s ideas about quantum physics seriously and it is widely accepted that superpositions are real – along with Santa Claus.
Santa’s advanced tech
Even if we agree that Santa uses quantum physics to bring gifts to all the children in the world on the same night, we still don’t understand exactly how he does it.
Dr Mark Mitchison said:
“When we observe a quantum object, we only ever find it in one place at a time. This tells us that superpositions are very fragile. Just looking at them causes them to ‘collapse’, which means the object ends up in just one place and all the other possibilities vanish.
“We are pretty sure that Santa has developed some advanced technology to protect his quantum superposition and stop such a collapse from ruining Christmas. But – just in case – we advise children the world over to go to bed early on Christmas Eve and suggest they don’t try to catch a glimpse of him and risk collapsing his merry superposition!”
An especially counter-intuitive feature of quantum mechanics is that a single event can exist in a state of superposition – happening both here and there, or both today and tomorrow.
Such superpositions are hard to create, as they are destroyed if any kind of information about the place and time of the event leaks into the surroundings – even if nobody actually records this information. But when superpositions do occur, they lead to observations that are very different from those of classical physics, challenging our very understanding of space and time.
Scientists from EPFL, MIT, and CEA Saclay, publishing in Science Advances, demonstrate a state of vibration that exists simultaneously at two different times, and evidence this quantum superposition by measuring the strongest class of quantum correlations between light beams that interact with the vibration.
The researchers used a very short laser pulse to trigger a specific pattern of vibration inside a diamond crystal. Each pair of neighboring atoms oscillated like two masses linked by a spring, and this oscillation was synchronous across the entire illuminated region. To conserve energy during this process, light of a new color was emitted, shifted toward the red end of the spectrum.
This classical picture, however, is inconsistent with the experiments. Instead, both light and vibration should be described as particles, or quanta: light energy is quantized into discrete photons, while vibrational energy is quantized into discrete phonons (from the Greek photo, “light”, and phono, “sound”).
The process described above should therefore be seen as the fission of an incoming photon from the laser into a pair of photon and phonon – akin to nuclear fission of an atom into two smaller pieces.
But it is not the only shortcoming of classical physics. In quantum mechanics, particles can exist in a superposition state, like the famous Schrödinger cat being alive and dead at the same time.
Even more counterintuitive: two particles can become entangled, losing their individuality. The only information that can be collected about them concerns their common correlations. Because both particles are described by a common state (the wavefunction), these correlations are stronger than what is possible in classical physics. It can be demonstrated by performing appropriate measurements on the two particles. If the results violate a classical limit, one can be sure they were entangled.
In the new study, EPFL researchers managed to entangle the photon and the phonon (i.e., light and vibration) produced in the fission of an incoming laser photon inside the crystal. To do so, the scientists designed an experiment in which the photon-phonon pair could be created at two different instants. Classically, it would result in a situation where the pair is created at time t1 with 50% probability, or at a later time t2 with 50% probability.
But here comes the “trick” played by the researchers to generate an entangled state. By a precise arrangement of the experiment, they ensured that not even the faintest trace of the light-vibration pair’s creation time (t1 vs. t2) was left in the universe. In other words, they erased the information about t1 and t2. Quantum mechanics then predicts that the phonon-photon pair becomes entangled and exists in a superposition of the times t1 and t2. This prediction was beautifully confirmed by the measurements, which yielded results incompatible with the classical probabilistic theory.
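The “strongest class of quantum correlations” mentioned above are Bell correlations, usually quantified with the CHSH quantity S. The sketch below uses generic textbook polarization correlations, not the paper’s actual data: entangled pairs can reach S = 2√2 ≈ 2.83, above the classical limit of 2.

```python
import numpy as np

# CHSH test: for polarization-entangled pairs, quantum mechanics predicts
# correlations E(a, b) = cos(2(a - b)) between analyzers at angles a and b.
def E(a, b):
    return np.cos(2 * (a - b))

# The standard optimal analyzer angles (radians).
a, a2 = 0.0, np.pi / 4
b, b2 = np.pi / 8, 3 * np.pi / 8

# Any classical (local hidden-variable) theory obeys |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(S)  # ~2.828, violating the classical bound of 2
```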
By showing entanglement between light and vibration in a crystal that one could hold between their fingers during the experiment, the new study creates a bridge between our daily experience and the fascinating realm of quantum mechanics.
“Quantum technologies are heralded as the next technological revolution in computing, communication, and sensing,” says Christophe Galland, head of the Laboratory for Quantum and Nano-Optics at EPFL and one of the study’s main authors. “They are currently being developed by top universities and large companies worldwide, but the challenge is daunting. Such technologies rely on very fragile quantum effects surviving only at extremely cold temperatures or under high vacuum. Our study demonstrates that even a common material at ambient conditions can sustain the delicate quantum properties required for quantum technologies. There is a price to pay, though: the quantum correlations sustained by atomic vibrations in the crystal are lost after only 4 picoseconds — i.e., 0.000000000004 of a second! This short time scale is, however, also an opportunity for developing ultrafast quantum technologies. But much research lies ahead to transform our experiment into a useful device — a job for future quantum engineers.”
Reference: Santiago Tarrago Velez, Vivishek Sudhir, Nicolas Sangouard, and Christophe Galland, “Bell correlations between light and vibration at ambient conditions,” Science Advances 6, no. 51 (18 December 2020): eabb0260. DOI: 10.1126/sciadv.abb0260. https://advances.sciencemag.org/content/6/51/eabb0260.full
A team in Paris has made the most precise measurement yet of the fine-structure constant, killing hopes for a new force of nature.
As fundamental constants go, the speed of light, c, enjoys all the fame, yet c’s numerical value says nothing about nature; it differs depending on whether it’s measured in meters per second or miles per hour. The fine-structure constant, by contrast, has no dimensions or units. It’s a pure number that shapes the universe to an astonishing degree — “a magic number that comes to us with no understanding,” as Richard Feynman described it. Paul Dirac considered the origin of the number “the most fundamental unsolved problem of physics.”
Numerically, the fine-structure constant, denoted by the Greek letter α (alpha), comes very close to the ratio 1/137. It commonly appears in formulas governing light and matter. “It’s like in architecture, there’s the golden ratio,” said Eric Cornell, a Nobel Prize-winning physicist at the University of Colorado, Boulder and the National Institute of Standards and Technology. “In the physics of low-energy matter — atoms, molecules, chemistry, biology — there’s always a ratio” of bigger things to smaller things, he said. “Those ratios tend to be powers of the fine-structure constant.”
The constant is everywhere because it characterizes the strength of the electromagnetic force affecting charged particles such as electrons and protons. “In our everyday world, everything is either gravity or electromagnetism. And that’s why alpha is so important,” said Holger Müller, a physicist at the University of California, Berkeley. Because 1/137 is small, electromagnetism is weak; as a consequence, charged particles form airy atoms whose electrons orbit at a distance and easily hop away, enabling chemical bonds. On the other hand, the constant is also just big enough: Physicists have argued that if it were something like 1/138, stars would not be able to create carbon, and life as we know it wouldn’t exist.
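In SI units, alpha can be assembled from other fundamental constants. The quick sketch below uses standard CODATA values (these printed digits are the textbook values, not the new measurement described in this article):

```python
import math

# The fine-structure constant from other fundamental constants:
# alpha = e^2 / (4 * pi * eps0 * hbar * c)
e    = 1.602176634e-19     # elementary charge, C (exact since 2019)
eps0 = 8.8541878128e-12    # vacuum permittivity, F/m
hbar = 1.054571817e-34     # reduced Planck constant, J s
c    = 299792458.0         # speed of light, m/s (exact)

alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(alpha)      # ~0.0072973525...
print(1 / alpha)  # ~137.036 -- the famous "magic number"
```

Note that alpha itself carries no units: the coulombs, farads, joules, and meters all cancel, which is why its value is the same in any system of units.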
Physicists have more or less given up on a century-old obsession over where alpha’s particular value comes from; they now acknowledge that the fundamental constants could be random, decided in cosmic dice rolls during the universe’s birth. But a new goal has taken over.
Physicists want to measure the fine-structure constant as precisely as possible. Because it’s so ubiquitous, measuring it precisely allows them to test their theory of the interrelationships between elementary particles — the majestic set of equations known as the Standard Model of particle physics. Any discrepancy between ultra-precise measurements of related quantities could point to novel particles or effects not accounted for by the standard equations. Cornell calls these kinds of precision measurements a third way of experimentally discovering the fundamental workings of the universe, along with particle colliders and telescopes.
Today, in a new paper in the journal Nature, a team of four physicists led by Saïda Guellati-Khélifa at the Kastler Brossel Laboratory in Paris reported the most precise measurement yet of the fine-structure constant. The team measured the constant’s value to the 11th decimal place, reporting that α = 1/137.035999206.
With a margin of error of just 81 parts per trillion, the new measurement is nearly three times more precise than the previous best measurement in 2018 by Müller’s group at Berkeley, the main competition. (Guellati-Khélifa made the most precise measurement before Müller’s in 2011.) Müller said of his rival’s new measurement of alpha, “A factor of three is a big deal. Let’s not be shy about calling this a big accomplishment.”
Guellati-Khélifa has been improving her experiment for the past 22 years. She gauges the fine-structure constant by measuring how strongly rubidium atoms recoil when they absorb a photon. (Müller does the same with cesium atoms.) The recoil velocity reveals how heavy rubidium atoms are — the hardest factor to gauge in a simple formula for the fine-structure constant. “It’s always the least accurate measurement that’s the bottleneck, so any improvement in that leads to an improvement in the fine-structure constant,” Müller explained.
The Paris experimenters begin by cooling the rubidium atoms almost to absolute zero, then dropping them in a vacuum chamber. As the cloud of atoms falls, the researchers use laser pulses to put the atoms in a quantum superposition of two states — kicked by a photon and not kicked. The two possible versions of each atom travel on separate trajectories until more laser pulses bring the halves of the superposition back together. The more an atom recoils when kicked by light, the more out of phase it is with the unkicked version of itself. The researchers measure this difference to reveal the atoms’ recoil velocity. “From the recoil velocity, we extract the mass of the atom, and the mass of the atom is directly involved in the determination of the fine-structure constant,” Guellati-Khélifa said.
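The role of the recoil velocity can be sketched with textbook numbers. The values below (rubidium-87's mass and its 780 nm D2 laser line) are illustrative stand-ins, not the paper's measured data:

```python
import math

# A photon of wavenumber k kicks an atom of mass m with recoil velocity
# v_r = hbar * k / m. Measuring v_r therefore pins down the mass.
hbar = 1.054571817e-34         # reduced Planck constant, J s
u    = 1.66053906660e-27       # atomic mass unit, kg
m_rb = 86.909180 * u           # mass of rubidium-87, kg
wavelength = 780e-9            # Rb D2 line, m

k = 2 * math.pi / wavelength   # photon wavenumber, 1/m
v_recoil = hbar * k / m_rb
print(v_recoil)                # ~5.9e-3 m/s: a few millimetres per second

# Inverting the relation: a measured recoil velocity gives the atomic
# mass, the last ingredient in the formula for the fine-structure constant.
m_inferred = hbar * k / v_recoil
```

The tiny size of this velocity, millimetres per second, is why the atoms must be cooled almost to absolute zero before the kick can be resolved.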
In such precise experiments, every detail matters. Table 1 of the new paper is an “error budget” listing 16 sources of error and uncertainty that affect the final measurement. These include gravity and the Coriolis force created by Earth’s rotation — both painstakingly quantified and compensated for. Much of the error budget comes from foibles of the laser, which the researchers have spent years perfecting.
For Guellati-Khélifa, the hardest part is knowing when to stop and publish. She and her team stopped the week of February 17, 2020, just as the coronavirus was gaining a foothold in France. Asked whether deciding to publish is like an artist deciding that a painting is finished, Guellati-Khélifa said, “Exactly. Exactly. Exactly.”
Surprisingly, her new measurement differs from Müller’s 2018 result in the tenth digit, a bigger discrepancy than the margin of error of either measurement. This means — barring some fundamental difference between rubidium and cesium — that one or both of the measurements has an unaccounted-for error. The Paris group’s measurement is the more precise, so it takes precedence for now, but both groups will improve their setups and try again.
Though the two measurements differ, they closely match the value of alpha inferred from precise measurements of the electron’s g-factor, a constant related to its magnetic moment, or the torque that the electron experiences in a magnetic field. “You can connect the fine-structure constant to the g-factor with a hell of a lot of math,” said Cornell. “If there are any physical effects missing from the equations [of the Standard Model], we would be getting the answer wrong.”
Instead, the measurements match beautifully, largely ruling out some proposals for new particles. The agreement between the best g-factor measurements and Müller’s 2018 measurement was hailed as the Standard Model’s greatest triumph. Guellati-Khélifa’s new result is an even better match. “It’s the most precise agreement between theory and experiment,” she said.
And yet she and Müller have both set about making further improvements. The Berkeley team has switched to a new laser with a broader beam (allowing it to strike their cloud of cesium atoms more evenly), while the Paris team plans to replace their vacuum chamber, among other things.
What kind of person puts such a vast effort into such scant improvements? Guellati-Khélifa named three traits: “You have to be rigorous, passionate and honest with yourself.” Müller said in response to the same question, “I think it’s exciting because I love building shiny nice machines. And I love applying them to something important.” He noted that no one can single-handedly build a high-energy collider like Europe’s Large Hadron Collider. But by constructing an ultra-precise instrument rather than a super-energetic one, Müller said, “you can do measurements relevant to fundamental physics, but with three or four people.”
This article is republished from Quanta Magazine under a Creative Commons license; copyright belongs to Natalie Wolchover.
A phenomenon of quantum mechanics known as superposition can impact timekeeping in high-precision clocks, according to a theoretical study from Dartmouth College, Saint Anselm College and Santa Clara University.
Research describing the effect shows that superposition—the ability of an atom to exist in more than one state at the same time—leads to a correction in atomic clocks known as “quantum time dilation.”
The research, published in the journal Nature Communications, takes into account quantum effects beyond Albert Einstein’s theory of relativity to make a new prediction about the nature of time.
“Whenever we have developed better clocks, we’ve learned something new about the world,” said Alexander Smith, an assistant professor of physics at Saint Anselm College and adjunct assistant professor at Dartmouth College, who led the research as a junior fellow in Dartmouth’s Society of Fellows. “Quantum time dilation is a consequence of both quantum mechanics and Einstein’s relativity, and thus offers a new possibility to test fundamental physics at their intersection.”
In the early 1900s, Albert Einstein presented a revolutionary picture of space and time by showing that the time experienced by a clock depends on how fast it is moving—as the speed of a clock increases, the rate at which it ticks decreases. This was a radical departure from Sir Isaac Newton’s absolute notion of time.
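Einstein's effect is easy to put in numbers. The sketch below shows only the classical relativistic factor, the baseline that quantum time dilation slightly corrects; it contains nothing quantum:

```python
import math

# Special relativity: a clock moving at speed v ticks slower than a
# stationary one by the factor sqrt(1 - v^2 / c^2).
c = 299792458.0  # speed of light, m/s

def tick_rate(v):
    """Fraction of a stationary clock's rate at which a moving clock ticks."""
    return math.sqrt(1 - (v / c) ** 2)

print(tick_rate(0.0))      # 1.0 -- a clock at rest keeps normal time
print(tick_rate(0.1 * c))  # ~0.995 -- at 10% of light speed, ~0.5% slower
```

The quantum prediction concerns what happens when v itself is in a superposition of two values, so that no single tick rate applies.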
Quantum mechanics, the theory of motion governing the atomic realm, allows for a clock to move as if it were simultaneously traveling at two different speeds: a quantum “superposition” of speeds. The research paper takes this possibility into account and provides a probabilistic theory of timekeeping, which led to the prediction of quantum time dilation.
To develop the new theory, the team combined modern techniques from quantum information science with a theory developed in the 1980s that explains how time might emerge out of a quantum theory of gravity.
“Physicists have sought to accommodate the dynamical nature of time in quantum theory for decades,” said Mehdi Ahmadi, a lecturer at Santa Clara University who co-authored the study. “In our work, we predict corrections to relativistic time dilation which stem from the fact that the clocks used to measure this effect are quantum mechanical in nature.”
In the same way that carbon dating relies on decaying atoms to determine the age of organic objects, the lifetime of an excited atom acts as a clock. If such an atom moves in a superposition of different speeds, then its lifetime will either increase or decrease depending on the nature of the superposition relative to an atom moving at a definite speed.
The correction to the atom’s lifetime is so small that it would be impossible to measure in terms that make sense at the human scale. But the ability to account for this effect could enable a test of quantum time dilation using the most advanced atomic clocks.
Just as the utility of quantum mechanics for medical imaging, computing, and microscopy might have been difficult to predict when that theory was being developed in the early 1900s, it is too early to imagine the full practical implications of quantum time dilation.
Researchers from Trinity College Dublin have discovered a uniquely quantum effect in erasing information that may have significant implications for the design of quantum computing chips. Their surprising discovery brings back to life the paradoxical “Maxwell’s demon”, which has tormented physicists for over 150 years.
The thermodynamics of computation was brought to the fore in 1961 when Rolf Landauer, then at IBM, discovered a relationship between the dissipation of heat and logically irreversible operations. Landauer is known for the mantra “Information is Physical”, which reminds us that information is not abstract and is encoded on physical hardware.
The “bit” is the currency of information (it can be either 0 or 1) and Landauer discovered that when a bit is erased there is a minimum amount of heat released. This is known as Landauer’s bound and is the definitive link between information theory and thermodynamics.
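Landauer's bound is simple to evaluate: the minimum heat released per erased bit is k_B T ln 2, which at room temperature works out to a few zeptojoules.

```python
import math

# Landauer's bound: erasing one bit dissipates at least k_B * T * ln(2).
k_B = 1.380649e-23   # Boltzmann constant, J/K (exact since 2019)
T = 300.0            # room temperature, K

E_min = k_B * T * math.log(2)
print(E_min)         # ~2.87e-21 J per erased bit
```

Real computers dissipate many orders of magnitude more than this per operation, which is why the bound only becomes practically relevant as components shrink toward the atomic scale.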
Professor John Goold’s QuSys group at Trinity is analysing this topic with quantum computing in mind, where a quantum bit (a qubit, which can be 0 and 1 at the same time) is erased.
In just-published work in the journal Physical Review Letters, the group discovered that the quantum nature of the information to be erased can lead to large deviations in the heat dissipation, which are not present in conventional bit erasure.
Thermodynamics and Maxwell’s demon
One hundred years before Landauer’s discovery, the Viennese scientist Ludwig Boltzmann and the Scottish physicist James Clerk Maxwell were formulating the kinetic theory of gases, reviving an old idea of the ancient Greeks by thinking of matter as being made of atoms and deriving macroscopic thermodynamics from microscopic dynamics.
Professor Goold says:
“Statistical mechanics tells us that things like pressure and temperature, and even the laws of thermodynamics themselves, can be understood by the average behavior of the atomic constituents of matter. The second law of thermodynamics concerns something called entropy which, in a nutshell, is a measure of the disorder in a process. The second law tells us that in the absence of external intervention, all processes in the universe tend, on average, to increase their entropy and reach a state known as thermal equilibrium.
“It tells us that, when mixed, two gases at different temperatures will reach a new state of equilibrium at the average temperature of the two. It is the ultimate law in the sense that every dynamical system is subject to it. There is no escape: all things will reach equilibrium, even you!”
However, the founding fathers of statistical mechanics were trying to pick holes in the second law right from the beginning of the kinetic theory. Consider again the example of a gas in equilibrium: Maxwell imagined a hypothetical “neat-fingered” being with the ability to track and sort particles in a gas based on their speed.
Maxwell’s demon, as the being became known, could quickly open and shut a trap door in a box containing a gas, and let hot particles through to one side of the box but restrict cold ones to the other. This scenario seems to contradict the second law of thermodynamics as the overall entropy appears to decrease and perhaps physics’ most famous paradox was born.
But what about Landauer’s discovery of the heat cost of erasing information? Well, it took another 20 years until that was fully appreciated, the paradox solved, and Maxwell’s demon finally exorcised.
Landauer’s work inspired Charlie Bennett – also at IBM – to investigate the idea of reversible computing. In 1982 Bennett argued that the demon must have a memory, and that it is not the measurement but the erasure of the information in the demon’s memory which is the act that restores the second law in the paradox. And, as a result, computation thermodynamics was born.
Now, 40 years on, this is where the new work led by Professor Goold’s group comes to the fore, with the spotlight on quantum computation thermodynamics.
In the recent paper, published with collaborator Harry Miller at the University of Manchester and two postdoctoral fellows in the QuSys group at Trinity, Mark Mitchison and Giacomo Guarnieri, the team studied very carefully an experimentally realistic erasure process that allows for quantum superposition (the qubit can be in states 0 and 1 at the same time).
Professor Goold explains:
“In reality, computers function well away from Landauer’s bound for heat dissipation because they are not perfect systems. However, it is still important to think about the bound because as the miniaturisation of computing components continues, that bound becomes ever closer, and it is becoming more relevant for quantum computing machines. What is amazing is that with technology these days you can really study erasure approaching that limit.
“We asked: ‘what difference does this distinctly quantum feature make for the erasure protocol?’ And the answer was something we did not expect. We found that even in an ideal erasure protocol – due to quantum superposition – you get very rare events which dissipate heat far greater than the Landauer limit.
“In the paper we prove mathematically that these events exist and are a uniquely quantum feature. This is a highly unusual finding that could be really important for heat management on future quantum chips – although there is much more work to be done, in particular in analysing faster operations and the thermodynamics of other gate implementations.
“Even in 2020, Maxwell’s demon continues to pose fundamental questions about the laws of nature.”
There is an old saying among cocktail party physicists: “Time flies like an arrow; fruit flies like a banana.” This is half true. Quantum physics doesn’t concern the tastes of insects, but it does quibble with the idea of unidirectional time. Quantum physics asserts that time is simply another dimension, meaning it can be explored like physical space. And if it can be explored, it can also not be explored. One should be able to stand perfectly still on a timeline without falling forward into the future.
That’s where the quantum Zeno Effect — a.k.a. Turing’s Paradox — comes in. Taking its name from Zeno’s arrow paradox (a moving arrow can’t actually be seen moving in any single instant, which means it’s not really moving at all), this idea basically states that if you never stop observing a particle that undergoes decay, then that particle will never decay. If that particle never decays, you’ve basically stopped it from doing anything. You’ve stopped time.
That probably doesn’t make any bit of sense if you never took advanced physics in college, so let’s take this one step at a time. The study of quantum physics is limited by the actions of the “observer” on a particular system. The most famous example of this is probably Schrödinger’s Cat, a thought experiment that illustrates the paradox inherent in quantum mechanics. You can read up on Schrödinger and his damn cat elsewhere, but the basic conclusion is that before an observer can actually “measure” a certain system, she has to assume all outcomes are possible — and, therefore, before an observation is made, both of those outcomes exist simultaneously. They are “superimposed” on one another.
But what happens if you are constantly observing the system? Well, if it’s the real world and not some crazy thought experiment, then the cat dies or doesn’t. But that’s not the way subatomic particles work. Multiple studies illustrate how measuring particles with increased frequency will affect the rate of decay — potentially suppressing it completely. And that’s because if you keep making measurements, there’s no time for the particles to progress into a superimposed state — they will always exist in their original, undecayed state.
And if an unstable particle isn’t decaying, it’s basically frozen in time.
In a way, this demonstrates a possible way to stop time. Now, it’s nowhere near practical, of course — nobody has easy access to the type of high-tech scientific instruments you need to measure decaying atoms with such intense frequency. But the quantum Zeno effect does show that, at very, very tiny scales, you might be able to stop time. You just have to be a master at staring contests.
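The effect can be illustrated with a toy two-level model (a textbook idealization, not a real decaying nucleus): a system that would fully decay in time T survives each of n equally spaced measurements with probability cos²(π/2n), so watching it more often pins it in place.

```python
import numpy as np

# Quantum Zeno toy model: unwatched, the system rotates from "undecayed"
# to "decayed" over time T. Measured n times along the way, it survives
# each interval with probability cos^2(pi / (2n)), giving overall:
def survival(n):
    return np.cos(np.pi / (2 * n)) ** (2 * n)

for n in (1, 10, 100, 1000):
    print(n, survival(n))
# survival(1) is ~0: looking only at the end, the system has decayed.
# As n grows, survival approaches 1 -- constant watching freezes the decay.
```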
The double-slit experiment seems simple enough: Cut two slits in a sheet of metal and send light through them, first as a constant wave, then in individual particles. What happens, though, is anything but simple. In fact, it’s what started science down the bizarre road of quantum mechanics.
In the early 1800s, the majority of scientists believed that light was made up of particles, not waves. English scientist Thomas Young had a hunch that the particle theory wasn’t the end of the story and set out to prove that light was a wave. He knew that waves interacted in predictable ways, and if he could demonstrate those interactions with light, he could prove that light was indeed a wave. So he set up an experiment: He cut two slits in a sheet of metal and shone light through them onto a screen.
If light were indeed made of particles, the particles that hit the sheet would bounce off and those that passed through the slits would create the image of two slits on the screen, sort of like spraying paint on a stencil. But if light were a wave, it would do something very different: Once they passed through the slits, the light waves would spread out and interact with one another. Where the waves met crest-to-crest, they’d strengthen each other and leave a brighter spot on the screen. Where they met crest-to-trough, they would cancel each other out, leaving a dark spot on the screen. That would produce what’s called an “interference pattern” of one very bright slit shape surrounded by “echoes” of gradually darker slit shapes on either side. Sure enough, that’s what happened. Light traveled in waves. All’s well that ends well, right? Keep reading.
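The bright and dark fringes follow from one quantity: the path difference d·sin(θ) between the waves from the two slits. The sketch below uses illustrative numbers (green light, a 5 µm slit spacing), not Young's actual apparatus:

```python
import numpy as np

# Two-slit interference, ignoring the single-slit envelope: whether waves
# meet crest-to-crest or crest-to-trough depends on the path difference
# d * sin(theta) between the slits.
wavelength = 500e-9   # green light, m
d = 5e-6              # slit separation, m

def intensity(theta):
    phase = np.pi * d * np.sin(theta) / wavelength
    return np.cos(phase) ** 2   # 1 = bright fringe, 0 = dark fringe

# Path difference of one whole wavelength: crest-to-crest, bright.
print(intensity(np.arcsin(wavelength / d)))        # 1.0
# Path difference of half a wavelength: crest-to-trough, dark.
print(intensity(np.arcsin(wavelength / (2 * d))))  # ~0.0
```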
Around the turn of the 20th century, a few scientists began to refine this idea. Max Planck suggested that light and other types of radiation come in discrete amounts — it’s “quantized” — and Albert Einstein proposed the idea of the photon, a “quantum” of light that behaves like a particle. As a result, he said that light was both a particle and a wave. So back to the double-slit experiment: Remember when we said if light were a particle, it would create a sort of spray-paint stencil pattern instead of an interference pattern? By using a special tool, you actually can send light particles through the slits one by one. But when scientists did this, something strange happened.
The interference pattern still showed up.
This suggests something very, very weird is going on: The photons seem to “know” where they would go if they were in a wave. It’s as if a theater audience showed up without seat assignments, but each person still knew the exact seat to choose in order to fill the theater correctly. As Popular Mechanics puts it, this means that “all the possible paths of these particles can interfere with each other, even though only one of the possible paths actually happens.” All realities exist at once (a concept known as superposition) until the final result occurs.
Weirder still, when scientists placed detectors at each slit to determine which slit each photon was passing through, the interference pattern disappeared. That suggests that the very act of observing the photons “collapses” those many realities into one. Mind-blowing, right? It is for scientists too, which is why quantum mechanics is one of the most hotly debated areas of modern science.