Scientific first at CERN facility is a preview of upcoming three-year research campaign
The international Forward Search Experiment team, led by physicists at the University of California, Irvine, has achieved the first-ever detection of neutrino candidates produced by the Large Hadron Collider at the CERN facility near Geneva, Switzerland.
In a paper published today in the journal Physical Review D, the researchers describe how they observed six neutrino interactions during a pilot run of a compact emulsion detector installed at the LHC in 2018.
“Prior to this project, no sign of neutrinos has ever been seen at a particle collider,” said co-author Jonathan Feng, UCI Distinguished Professor of physics & astronomy and co-leader of the FASER Collaboration. “This significant breakthrough is a step toward developing a deeper understanding of these elusive particles and the role they play in the universe.”
He said the discovery made during the pilot gave his team two crucial pieces of information.
“First, it verified that the position forward of the ATLAS interaction point at the LHC is the right location for detecting collider neutrinos,” Feng said. “Second, our efforts demonstrated the effectiveness of using an emulsion detector to observe these kinds of neutrino interactions.”
The pilot instrument was made up of lead and tungsten plates alternated with layers of emulsion. During particle collisions at the LHC, some of the neutrinos produced smash into nuclei in the dense metals, creating particles that travel through the emulsion layers and create marks that are visible following processing. These etchings provide clues about the energies of the particles, their flavors – tau, muon or electron – and whether they’re neutrinos or antineutrinos.
According to Feng, the emulsion operates in a fashion similar to photography in the pre-digital camera era. When 35-millimeter film is exposed to light, photons leave tracks that are revealed as patterns when the film is developed. The FASER researchers were likewise able to see neutrino interactions after removing and developing the detector’s emulsion layers.
“Having verified the effectiveness of the emulsion detector approach for observing the interactions of neutrinos produced at a particle collider, the FASER team is now preparing a new series of experiments with a full instrument that’s much larger and significantly more sensitive,” Feng said.
Since 2019, he and his colleagues have been getting ready to conduct an experiment with FASER instruments to investigate dark matter at the LHC. They’re hoping to detect dark photons, which would give researchers a first glimpse into how dark matter interacts with normal atoms and the other matter in the universe through nongravitational forces.
With the success of their neutrino work over the past few years, the FASER team – consisting of 76 physicists from 21 institutions in nine countries – is combining a new emulsion detector with the FASER apparatus. While the pilot detector weighed about 64 pounds, the FASERnu instrument will be more than 2,400 pounds, and it will be much more reactive and able to differentiate among neutrino varieties.
“Given the power of our new detector and its prime location at CERN, we expect to be able to record more than 10,000 neutrino interactions in the next run of the LHC, beginning in 2022,” said co-author David Casper, FASER project co-leader and associate professor of physics & astronomy at UCI. “We will detect the highest-energy neutrinos that have ever been produced from a human-made source.”
What makes FASERnu unique, he said, is that while other experiments have been able to distinguish between one or two kinds of neutrinos, it will be able to observe all three flavors plus their antineutrino counterparts. Casper said that there have only been about 10 observations of tau neutrinos in all of human history but that he expects his team will be able to double or triple that number over the next three years.
“This is an incredibly nice tie-in to the tradition at the physics department here at UCI,” Feng said, “because it’s continuing on with the legacy of Frederick Reines, a UCI founding faculty member who won the Nobel Prize in physics for being the first to discover neutrinos.”
“We’ve produced a world-class experiment at the world’s premier particle physics laboratory in record time and with very untraditional sources,” Casper said. “We owe an enormous debt of gratitude to the Heising-Simons Foundation and the Simons Foundation, as well as the Japan Society for the Promotion of Science and CERN, which supported us generously.”
Savannah Shively and Jason Arakawa, UCI Ph.D. students in physics & astronomy, also contributed to the paper.
Featured image: The FASER particle detector that received CERN approval to be installed at the Large Hadron Collider in 2019 has recently been augmented with an instrument to detect neutrinos. The UCI-led FASER team used a smaller detector of the same type in 2018 to make the first observations of the elusive particles generated at a collider. The new instrument will be able to detect thousands of neutrino interactions over the next three years, the researchers say. Photo courtesy of CERN
In a first for particle physics, the CMS collaboration has observed three J/ψ particles emerging from a single collision between two protons
It’s a triple treat. By sifting through data from particle collisions at the Large Hadron Collider (LHC), the CMS collaboration has seen not one, not two but three J/ψ particles emerging from a single collision between two protons. In addition to being a first for particle physics, the observation opens a new window into how quarks and gluons are distributed inside the proton.
The J/ψ particle is a special particle. It was the first particle containing a charm quark to be discovered, winning Burton Richter and Samuel Ting a Nobel prize in physics and helping to establish the quark model of composite particles called hadrons.
Experiments including ATLAS, CMS and LHCb at the LHC have previously seen one or two J/ψ particles coming out of a single particle collision, but never before have they seen the simultaneous production of three J/ψ particles – until the new CMS analysis.
The trick? Analysing the vast amount of high-energy proton–proton collisions collected by the CMS detector during the second run of the LHC, and looking for the transformation of the J/ψ particles into pairs of muons, the heavier cousins of the electron.
From this analysis, the CMS team identified five instances of single proton–proton collision events in which three J/ψ particles were produced simultaneously. The result has a statistical significance of more than five standard deviations – the threshold used to claim the observation of a particle or process in particle physics.
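To put the five-sigma threshold in perspective, it corresponds to a one-sided Gaussian tail probability of roughly 3 in 10 million. A minimal sketch (the function name is ours, chosen for illustration):

```python
from math import erfc, sqrt

def one_sided_p_value(sigma: float) -> float:
    """One-sided Gaussian tail probability for a significance of `sigma` standard deviations."""
    return 0.5 * erfc(sigma / sqrt(2))

# The five-standard-deviation discovery threshold used in particle physics
p5 = one_sided_p_value(5.0)
print(f"{p5:.2e}")  # prints 2.87e-07: a chance of ~1 in 3.5 million that background alone mimics the signal
```

In other words, the triple-J/ψ signal is extremely unlikely to be a statistical fluctuation of the background.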
These three-J/ψ events are very rare. To get an idea, one-J/ψ events and two-J/ψ events are about 3.7 million and 1800 times more common, respectively. “But they are well worth investigating,” says CMS physicist Stefanos Leontsinis. “A larger sample of three-J/ψ events, which the LHC should be able to collect in the future, should allow us to improve our understanding of the internal structure of protons at small scales.”
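A back-of-envelope sketch of what those rate ratios imply, using only the numbers quoted above (this naive scaling ignores detection efficiencies and selection cuts, so it is illustrative only):

```python
triple_events = 5        # observed three-J/psi events (from the analysis)
single_factor = 3.7e6    # one-J/psi events are ~3.7 million times more common
double_factor = 1.8e3    # two-J/psi events are ~1800 times more common

# Naive scaling: implied event counts in a comparable dataset,
# ignoring efficiencies and selection cuts (illustrative only)
implied_singles = triple_events * single_factor
implied_doubles = triple_events * double_factor
print(f"~{implied_singles:.2e} single-J/psi and ~{implied_doubles:.0f} double-J/psi events")
# prints ~1.85e+07 single-J/psi and ~9000 double-J/psi events
```

So for every handful of triple-production events, the detector records millions of single-J/ψ events, which is why a much larger future dataset is needed.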
Using an ultracold quantum gas of dysprosium atoms, it was possible for the first time to obtain a two-dimensional “droplet” lattice that possesses both the properties of a solid and those of a superfluid. We discuss the result with Francesca Ferlaino of the University of Innsbruck, the physicist who led the team of experimentalists that conducted the experiment.
It’s called supersolidity, and it’s a state of matter that physicists have chased for decades. A quantum state with almost magical properties: a supersolid simultaneously possesses the properties of a solid – with its atoms arranged in an ordered, crystalline structure – and those of a superfluid – in particular, the absence of friction. A bit like an ice cube sliding on water, or water sliding over an ice cube, if one wants an analogy, but with each individual atom being simultaneously both water and ice. A state that is difficult even to imagine, let alone achieve. Yet in the laboratory of the Department of Experimental Physics of the University of Innsbruck, Austria, they have succeeded.
They succeeded by using an ultracold quantum gas – at just a hundred nanokelvin, on the edge of absolute zero – of dysprosium atoms: an element of the rare-earth group which, at low temperatures, is strongly magnetic. And what they managed to produce is an absolute novelty: not a single row of atoms – the one-dimensional supersolid already realized two years ago – but a matrix: a two-dimensional supersolid (see opening image). A lattice of what scientists call droplets: dense “drops” in a sea of ultracold quantum gas. With a purely quantum, completely counterintuitive peculiarity that establishes its supersolidity: despite the densification, the dysprosium atoms remain indistinguishable, since each of them is spread over the entire lattice. “Usually, you would think that each atom sits in a given droplet, with no way for it to move between them,” says the first author of the study published today in Nature, Matthew Norcia, of the Institute of Quantum Optics and Quantum Information (IQOQI) of the Austrian Academy of Sciences in Innsbruck. “In the supersolid state, by contrast, each particle is delocalized across all the droplets, existing simultaneously in each one. So the system forms a series of high-density regions – the droplets – that share the same delocalized atoms.”
But how did they get there? To understand, it is best to go back to the origins of the studies on supersolidity, explains to Media Inaf the scientist who led the group that carried out the experiment, Francesca Ferlaino, who, after a physics degree from Federico II in Naples and a doctorate from LENS in Florence, is today full professor at the University of Innsbruck, Austria, as well as managing and research director of IQOQI, also in Innsbruck.
«If we have to pick a year to start this story from, I would say 1957, when the great founding fathers of quantum mechanics began to wonder what the most paradoxical states this theory could support were, while superfluidity was becoming an increasingly well-known phenomenon. First Eugene Gross, then Andreev, Lifshitz and Chester asked whether it was possible to create a state of matter that on the one hand shows the rigidity of a solid, but on the other flows like a liquid».
And was it possible?
«From the point of view of the rules of quantum mechanics, yes: this type of state seemed to exist. Of course it is a paradoxical state; even just imagining a crystal that flows is quite difficult. Still, this opened up a whole series of theoretical debates in the 1960s and 1970s, with scientists like Oliver Penrose and Lars Onsager saying no: superfluidity – a fluid that flows without friction – and localization are two orders of matter that contradict each other, one preventing the formation of the other, so this supersolid state cannot exist».
On the other hand, there were those who said that yes, maybe it existed?
«Well, there were those who wondered: if it existed, in what system could it be seen? Which system in nature? Here a great contribution came from Nobel laureate Anthony Leggett, the theoretical physicist who in 1970 published an article – now considered a milestone on the subject – entitled “Can a solid be superfluid?”, in which he discusses the various possibilities».
«The experimental approaches – and the theoretical simulations – started from the idea of creating a solid that, if cooled enough, could acquire superfluid properties. A material with not two components – not a solid plus a superfluid – but a single, quantum-mechanically indistinguishable kind of particle that behaves at the same time as localized and delocalized».
Which material? Was there any candidate?
«From an experimental point of view, the search for this state began in solid helium. A great result was published in Nature in 2004 by two scientists at Penn State University – E. Kim and M. H. W. Chan – in an article entitled “Probable observation of a supersolid helium phase”. The result attracted the attention of the whole condensed-matter community: there were great debates, and many theoretical groups tried to reproduce it with calculations, numerical simulations and so on. The community began to split in two: those who believed the result and those who tended to be a little skeptical. After all, the very title of the paper – “probable observation…” – made it obvious that not everything was completely under control at the time. The debate was very heated, but it stayed within this community».
How did it turn out?
«There was intense scientific work: some experimental groups repeated Kim and Chan’s experiment and obtained different results, deepening the mystery around this state. Subsequently Kim and Chan themselves repeated their experiment, this time building on the calculations and the understanding of the system developed over the years, and ended with a comment on their own work, published in 2012 in Physical Review Letters, entitled “Absence of supersolidity in solid helium”».
A tombstone…
«Let’s say that in the condensed-matter community the question became: game over, is it all finished, or is this state still possible? Part of the research – above all theoretical, but also experimental – continues to treat these solids as the mother platform for the observation of supersolidity. Other theorists, however, shifted their attention to another kind of platform: that of “cold atoms”. Because in cold atoms various ingredients necessary for supersolidity are, let’s say, innate».
What is special about cold atoms?
«In an ultracold gas, such as a Bose-Einstein condensate, with long-range interactions – the atoms “see” each other from a distance – that are also anisotropic, not spherical – so that atoms seen along one direction attract while along another they repel – the system tends to be somewhat unstable. The system wants to crystallize: it is a gas, it never becomes a true crystalline structure, but it wants to organize itself spontaneously into a structure like a matter wave – a regular, crystalline wave. The problem is that this “desire” of the system is too strong: since in physics all systems seek a lower-energy state, left to itself the urge to crystallize would make the gas collapse. What was missing from our understanding – and this is where the experiments really mattered – was a mechanism that could stabilize the system, as if to say: you can crystallize up to a certain point, but then you have to stop. In this way the system undergoes a phase transition into a new state, which is precisely the minimum-energy state that is supersolid: the matter wave forms and remains».
And what is this mechanism capable of stabilizing the system?
«It was discovered, first by a group in Stuttgart and then confirmed quantitatively by our group, that in dipolar atoms the system is stabilized by its own quantum fluctuations. The system does create a crystal, but it cannot “explode” into infinitely high density peaks, because this stabilization mechanism tells it: you cannot build density peaks beyond a certain point. It is a kind of “quantum pressure”. All these ingredients together then allowed the observation that the system, by itself, spontaneously enters a new ground state: a completely coherent state in which each atom is identical to the others – they are indistinguishable – and all are both localized and delocalized. And this is precisely the supersolid state».
A state that, as we said, you have realized with an ultracold gas of dysprosium atoms. But how did you get there – why dysprosium? Do you take the table of elements and pick a few at random – elements which, for most of us, are absolutely exotic – until you find the one that produces the effect you are looking for? Or do you head straight there, already knowing more or less in which region of the periodic table to go fishing for such a state?
«The first condensates were made with simple atoms, the alkalis: atoms from the first column of the periodic table, which have only one valence electron. The search for cold atoms had stopped more or less there and in the next column, that of the alkaline earths. Then, as we began to understand these simpler systems, we found the courage to go further and ask ourselves: I want a certain atomic property – which atom can give it to me?»
And you made the leap from helium to dysprosium…
«Yes, but in a series of steps. We tried ytterbium, but it behaves very much like an alkaline earth. Then chromium, which already showed dipolar effects, but we needed an even higher magnetic moment. So we went looking for the most magnetic atoms in the periodic table, and we arrived at dysprosium».
Supersolids aside, of all the elements of the periodic table, which one gives you the most satisfaction? The one that, through long use in the laboratory, you have grown most fond of?
«I am very attached to erbium, because we were the first to condense it, and also to find a “recipe” that is now used in many laboratories. But it must be said that erbium and its lanthanide brothers are almost all the same: learn one and you have learned them all. So yes, I feel a certain sense of “motherhood” – so to speak – toward erbium, the firstborn, but also for the other children».
Featured image: Supersolid in 2D created by an ultracold gas. The colors represent the density, from black (low) to yellow (high). Source: Matthew A. Norcia et al., Nature, 2021
So-called “collisionless shocks” have been reproduced in the laboratory through plasma-physics experiments. The results show that the energies to which particles are accelerated by the shock surfing acceleration process are sufficient to trigger a further process capable of bringing them to the energy values typical of cosmic rays. Among the authors of the study, published today in Nature Physics, are Marco Miceli of the University of Palermo and Salvatore Orlando of INAF Palermo.
Our planet is constantly bombarded with highly energetic particles from outer space called cosmic rays. Years of studies and observations have shown that these particles can be accelerated to the observed energies by shock waves propagating in various astrophysical environments, mainly in supernova remnants expanding into the interstellar medium (a supernova remnant is a rapidly expanding nebula produced by a supernova explosion). Researchers have identified several mechanisms that, depending on the conditions of the shock and the surrounding environment, can accelerate particles to the energies typical of cosmic rays.
To study the onset of the acceleration process, a team of researchers led by physicists Weipeng Yao and Julien Fuchs (École Polytechnique, Sorbonne Université) reproduced in the laboratory the conditions of a shock expanding into an ambient gas immersed in a uniform magnetic field, and used magnetohydrodynamic simulations to study the evolution of the system. The ambient gas is low-density hydrogen (10^18 cm^-3), in which a solid Teflon target is immersed. The target is hit with a 1 ns high-power laser pulse, producing a hot plasma that expands into the ambient gas. The initial expansion speed of the resulting shock was 1500 km/s. The shock is initially characterized by ion-ion mean free paths and collision times typical of conditions in which collisions are unimportant during the expansion of the shock wave (a collisionless shock). The whole system is immersed in a uniform magnetic field aligned transversely to the direction of propagation of the laser. This configuration reproduces the conditions of shock-wave propagation in various astronomical environments, such as supernova remnants interacting with dense molecular clouds and the solar wind.
In addition to verifying that an expanding shock can be reproduced in the laboratory, the researchers measured, with two spectrometers, the speed of protons accelerated during the experiment. From these measurements it was possible to demonstrate the presence of particles accelerated by the shock to energies of several hundred keV. Both the experiment and the numerical simulations show that these particles were accelerated in the first 2-3 ns of the system’s evolution, when the shock propagated at a speed greater than 1000 km/s. The dominant acceleration mechanism in this initial phase is shock surfing acceleration, a process in which charged particles sitting ahead of the shock wave are accelerated by the electric field associated with the shock. “The shock surfing acceleration process allows the protons to be accelerated while ‘riding’ the shock-wave front, a bit like surfers, under the effect of the electric field in the shock region,” one of the co-authors of the study, Marco Miceli of the University of Palermo, explains to Media Inaf.
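The several-hundred-keV energy scale can be sanity-checked with simple nonrelativistic kinematics: a proton merely co-moving with the ~1000 km/s shock carries only a few keV, so the observed energies mean the surfing protons end up several times faster than the shock itself. A minimal sketch (the function name is ours, chosen for illustration):

```python
M_P = 1.6726e-27   # proton mass, kg
EV = 1.602e-19     # joules per electronvolt

def proton_energy_kev(v_km_s: float) -> float:
    """Nonrelativistic kinetic energy of a proton moving at v_km_s (km/s), in keV."""
    v = v_km_s * 1e3                      # convert to m/s
    return 0.5 * M_P * v**2 / EV / 1e3    # E = (1/2) m v^2, expressed in keV

# A proton riding along with the ~1000 km/s shock front
print(f"{proton_energy_kev(1000):.1f} keV")   # prints 5.2 keV

# Reaching ~300 keV requires several times the shock speed
print(f"{proton_energy_kev(7600):.0f} keV")   # ~300 keV
```

This also makes clear why a further mechanism (the diffusive shock acceleration discussed below) is needed to climb from the keV range up to cosmic-ray energies.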
The experiment demonstrated that the energies to which particles are accelerated by shock surfing acceleration are sufficient to trigger a further process – known as diffusive shock acceleration and not reproduced in the experiment – capable of bringing the particles to the energy values typical of cosmic rays. “With laboratory experiments,” adds another of the co-authors, astrophysicist Salvatore Orlando of INAF Palermo, “we were able to recreate, on a small scale, physical conditions extremely similar to those observed in some supernova remnants, showing that the surfing process can trigger the acceleration mechanism of cosmic rays in these sources.” Finally, both the experiment and the simulations showed the importance of the magnetic field, without which the expanding shock cannot form.
This study therefore identifies, for the first time, shock surfing acceleration as the process responsible for the first phase of cosmic-ray acceleration by shocks similar to those that characterize supernova remnants. More generally, there are several astrophysical environments where this process plays an important role in the acceleration of charged particles: during highly energetic phenomena on the Sun, such as the shock fronts driven by coronal mass ejections; in interplanetary space, in the region where the terrestrial magnetosphere interacts with the solar wind; and at the termination shock of the solar wind and, more generally, of stellar winds.
Physics researchers at the University of North Florida’s Atomic LEGO Lab discovered a new electronic phenomenon they call “asymmetric ferroelectricity”. The research, led by Dr. Maitri Warusawithana, UNF physics assistant professor, in collaboration with researchers at the University of Illinois and Arizona State University, demonstrated this phenomenon for the first time in engineered two-dimensional crystals.
This discovery of asymmetric ferroelectricity in engineered crystals comes exactly 100 years following the discovery of ferroelectricity in certain naturally occurring crystals. Ferroelectric crystals – crystals that show two equal bistable polarization states – are now used in many high-tech applications including solid-state memory, RFID cards, sensors and precision actuators.
Utilizing atomic-scale materials design, the team of researchers has demonstrated a qualitatively new phenomenon, asymmetric ferroelectricity, for the first time. These engineered crystals lead to an asymmetric bi-stability with two unequal stable polarization states in contrast to a natural ferroelectric.
Warusawithana hopes this first observation of asymmetric ferroelectricity achieved through materials-by-design will further research on tailored electronic properties and may find its way into interesting technological applications.
Physicists from the University of Southampton and ETH Zürich have reached a new threshold of light-matter coupling at the nanoscale.
The international research, published this week in Nature Photonics, combined theoretical and experimental findings to establish a fundamental limitation of our ability to confine and exploit light.
The collaboration focused on photonic nano-antennas fabricated in ever reducing sizes on the top of a two-dimensional electron gas. The setup is commonly used in laboratories all over the world to explore the effect of intense electromagnetic coupling, taking advantage of the antennas’ ability to trap and focus light close to electrons.
Professor Simone De Liberato, Director of the Quantum Theory and Technology group at the University of Southampton, says: “The fabrication of photonic resonators able to focus light in extremely small volumes is proving a key technology which is presently enabling advances in fields as different as material science, optoelectronics, chemistry, quantum technologies, and many others.
“In particular, the focussed light can be made to interact extremely strongly with matter, making electromagnetism non-perturbative. Light can then be used to modify the properties of the materials it interacts with, thus becoming a powerful tool for material science. Light can be effectively woven into novel materials.”
The scientists discovered that light could no longer be confined in the system below a critical dimension – of the order of 250 nm in the sample under study – at which point the experiment started exciting propagating plasmons. These sent waves of electrons away from the resonator, spilling out the photon’s energy.
Experiments performed in the group of Professors Jérôme Faist and Giacomo Scalari at ETH Zürich had obtained results that could not be interpreted with state-of-the-art understanding of light-matter coupling. The physicists approached Southampton’s School of Physics and Astronomy, where researchers led theoretical analysis and built a novel theory able to quantitatively reproduce the results.
Professor De Liberato believes the newfound limits could yet be exceeded by future experiments, unlocking dramatic technological advances that hinge on ultra-confined electromagnetic fields.
A team of researchers affiliated with UNIST has recently unveiled a hemolysis-free and highly efficient blood plasma separation platform. Published in the May 2021 issue of Small, this breakthrough was led by Professor Joo H. Kang and his research team in the Department of Biomedical Engineering at UNIST. The research team expects the new technology to greatly improve the accuracy of point-of-care blood tests, which have seen increased demand recently.
In their study, the research team used diamagnetic repulsion of blood cells to separate blood cells from blood plasma. When superparamagnetic iron oxide nanoparticles (SPIONs) are added to whole blood, they render the blood plasma paramagnetic, so that all blood cells are repelled by magnets. The research team thereby collected hemolysis-free plasma without loss of plasma proteins, platelets, or exosomes.
“Many efforts have been made to develop various blood plasma separation methods. However, there always have been limitations, such as dilution of blood, blood cell impurity in plasma, and hemolysis,” noted Professor Kang. “Our approach overcame these unmet challenges and we could provide a huge impact on in vitro diagnosis once this platform is translated into a commercial point-of-care device.”
The developed blood plasma separation method achieved 100% plasma purity and an 83.3% plasma volume recovery rate without noticeable hemolysis or loss of plasma proteins, which had been elusive with conventional plasma separation devices. Moreover, the method enabled greater recovery of bacterial DNA from infected blood than centrifugation, and allowed immunoassays in whole blood without prior plasma separation.
“We have overcome the limitations of filter-based blood plasma separation, which can induce hemolysis, and of microfluidic chip-based plasma separation, which suffers from poor plasma recovery rates and purity,” says Research Professor Seyong Kwon of the Department of Biomedical Engineering at UNIST, the first co-author of the study.
The research team also developed an ultra-compact, low-cost, high-precision diagnostic chip that can test blood directly without plasma separation. The diagnostic chip detected prostate-specific antigen (PSA) protein, a biomarker for prostate cancer diagnosis.
The developed blood plasma separation method also allowed them to collect platelet rich plasma (PRP). This capability is important because recent studies have revealed that platelets could be used as a biomarker for diagnosis of cancer or diabetes. “Unlike a complex process of the conventional centrifugation method to collect PRP, our method can simply collect PRP by just tuning flow rates,” says Jieung Oh, the first co-author of the study.
This study has been jointly carried out by Min Seok Lee in the Department of Biomedical Engineering at UNIST, together with Professor Joonwoo Jeong and Professor Eujin Um from the Department of Physics at UNIST. The findings of this research have been selected to make the back cover of the journal, Small. This work has been supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, as well as Young Researcher Program by the Ministry of Science and ICT (MSIT). It was made available online in May 2021 ahead of final publication in Small in June 2021.
All images credit: UNIST
Reference: Seyong Kwon, Jieung Oh, Min Seok Lee, “Enhanced Diamagnetic Repulsion of Blood Cells Enables Versatile Plasma Separation for Biomarker Analysis in Blood,” Small, (2021).
Precise theoretical predictions relevant for future neutrino experiments
A team in the PRISMA+ cluster of excellence at Johannes Gutenberg University Mainz (JGU) has succeeded in computing how atomic nuclei of calcium behave in collisions with electrons. The results agree very well with the available experimental data. For the first time, a calculation based on a fundamental theory correctly describes experiments for a nucleus as heavy as calcium. Of particular relevance is the potential such calculations could have in the future for interpreting neutrino experiments. The renowned journal Physical Review Letters reports on this milestone in its current issue.
The new publication stems from the group led by Prof. Sonia Bacca, professor of theoretical nuclear physics in the PRISMA+ cluster of excellence, in collaboration with Oak Ridge National Laboratory. Bacca has had great success predicting various properties of atomic nuclei by deriving them from the interactions among their constituents – the nucleons – which can be described within chiral effective field theory. Her research aims to provide a solid connection between experimental observations and the underlying theory of quantum chromodynamics. In physics, such a procedure is called an “ab initio calculation”, where “ab initio” is Latin for “from the beginning”.
Cross sections of atomic nuclei probed by external fields, for example through the interaction with electrons or other particles, can also be described within the same theory. This procedure is key to explaining existing data and interpreting future experiments, for example in neutrino physics, an important focus of the PRISMA+ research program.
Neutrinos are elusive particles that constantly stream through the Earth but are very difficult to detect and understand. With newly planned experiments, such as the DUNE experiment in the USA, scientists want to investigate their fundamental properties, for example the phenomenon in which one type of neutrino transforms into another, known in technical jargon as neutrino oscillation. To achieve that, they need input from theoretical calculations. Specifically, the relevant question is: how do neutrinos interact with the atomic nuclei in the detector?
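Neutrino oscillation can be made concrete with the standard two-flavor approximation, in which the probability of one flavor appearing as another depends on a mixing angle, a mass-squared splitting, the baseline, and the neutrino energy. The sketch below evaluates that textbook formula; the numerical parameter values are illustrative placeholders loosely inspired by a DUNE-like long-baseline setup, not the experiment's actual fit results:

```python
import math

def oscillation_probability(theta, delta_m2_ev2, L_km, E_gev):
    """Two-flavor appearance probability P(nu_a -> nu_b).

    theta        : mixing angle in radians
    delta_m2_ev2 : mass-squared splitting in eV^2
    L_km         : baseline (source-to-detector distance) in km
    E_gev        : neutrino energy in GeV
    The constant 1.27 absorbs the unit conversions (hbar, c).
    """
    return (math.sin(2 * theta) ** 2
            * math.sin(1.27 * delta_m2_ev2 * L_km / E_gev) ** 2)

# Illustrative numbers: ~1300 km baseline, GeV-scale beam energy.
p = oscillation_probability(theta=0.15, delta_m2_ev2=2.5e-3, L_km=1300, E_gev=2.5)
print(f"Appearance probability: {p:.3f}")
```

The formula shows why such experiments scan over energy: the oscillation phase grows with L/E, so the measured probability pattern constrains both the mixing angle and the mass splitting. Interpreting the measured event rates, however, requires exactly the neutrino-nucleus cross sections that calculations like the one described here aim to provide.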
Since experimental data on the scattering of neutrinos off atomic nuclei are scarce, the team of researchers first looked at the scattering of another lepton, the electron, for which experimental data are available. “Calcium-40 is our test system, so to speak,” explains Dr. Joanna Sobczyk, a postdoc in Mainz and first author of the study. “With our new ab initio method, we were able to calculate very precisely what happens in electron scattering and how the calcium nucleus behaves.”
This is a great success: until now, it was not possible to carry out such calculations for an element as heavy as calcium, which consists of 40 nucleons. “We are very pleased that we have succeeded in showing that our method works reliably,” says Sonia Bacca. “Now a new era begins in which ab initio methods can be used to describe the scattering of leptons, which include electrons and neutrinos, off nuclei, even for 40 nucleons.”
“One of the nicest features of our approach is that it allows us to rigorously quantify the uncertainties associated with our calculation. Uncertainty quantification is very time-consuming but extremely important for appropriately comparing theory against experiment,” comments Dr. Bijaya Acharya, a PRISMA+ postdoc and co-author of the study.
Having demonstrated the potential of their method for calcium, the research team next plans to study the element argon and its interaction with neutrinos. Argon will play an important role as a target in the planned DUNE experiment.
In a new study from Skoltech and the University of Kentucky, researchers found a new connection between quantum information and quantum field theory. This work attests to the growing role of quantum information theory across various areas of physics. The paper was published in the journal Physical Review Letters.
Quantum information plays an increasingly important role as an organizing principle connecting various branches of physics. In particular, the theory of quantum error correction, which describes how to protect and recover information in quantum computers and other complex interacting systems, has become one of the building blocks of the modern understanding of quantum gravity.
“Normally, information stored in physical systems is localized. Say, a computer file occupies a particular small area of the hard drive. By ‘error’ we mean any unforeseen or undesired interaction that scrambles information over an extended area. In our example, pieces of the computer file would be scattered over different areas of the hard drive. Error correcting codes are mathematical protocols that allow these pieces to be collected back together to recover the original information. They are heavily used in data storage and communication systems. Quantum error correcting codes play a similar role in cases where the quantum nature of the physical system is important,” explains Anatoly Dymarsky, Associate Professor at the Skoltech Center for Energy Science and Technology (CEST).
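The hard-drive analogy in the quote maps directly onto the simplest classical error correcting code, the three-bit repetition code: one bit of information is spread over three copies, and a majority vote recovers it even after one copy is flipped. This is only a classical toy illustration of the “collect the pieces back together” idea, not one of the quantum codes the paper studies (quantum codes cannot simply copy states, by the no-cloning theorem, and protect information in subtler ways):

```python
def encode(bit):
    """Classical 3-bit repetition code: spread one bit over three copies."""
    return [bit, bit, bit]

def correct(received):
    """Majority vote recovers the original bit if at most one copy flipped."""
    return 1 if sum(received) >= 2 else 0

codeword = encode(1)           # stored as [1, 1, 1]
codeword[0] ^= 1               # an "error" scrambles part of the stored data
assert correct(codeword) == 1  # majority vote recovers the original bit
```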
In a rather unexpected twist, scientists realized not long ago that quantum gravity, the theory describing the quantum dynamics of space and time, employs similar mathematical protocols to exchange information between different parts of space. “The locality of information within quantum gravity remains one of the few open fundamental problems in theoretical physics. That is why the appearance of well-studied mathematical structures such as quantum error correcting codes is intriguing,” Dymarsky notes. Yet the role of the codes had only been understood schematically, and the explicit mechanism behind the locality of information remained elusive.
In their new paper, he and his colleague, Alfred Shapere from the University of Kentucky Department of Physics and Astronomy, establish a novel connection between quantum error correcting codes and two-dimensional conformal field theories. The latter describe interactions of quantum particles and have become standard theoretical tools to describe many different phenomena, from fundamental elementary particles to quasi-particles emerging in quantum materials, such as graphene. Some of these conformal field theories also describe quantum gravity via holographic correspondence.
“Now we have a new playground to study the role of quantum error correcting codes in the context of quantum field theory. We hope this is a first step in understanding how locality of information actually works, and what hides behind all this beautiful mathematics,” Dymarsky concludes.