James Buchanan “Bucky” Barnes is a fictional superhero/supervillain character in the Marvel Cinematic Universe (MCU). Also known as the Winter Soldier, he is able to withstand the biological impairment of cryogenic freezing. But how? Ilja Voets and colleagues have now addressed this question. They suggest that, following his super-soldier experimentation, the Winter Soldier’s DNA has been modified to such an extent that he can naturally produce antifreeze (glyco)proteins (AF(G)Ps) when his body is subjected to freezing temperatures.
“It is possible that during these treatments the Winter Soldier’s DNA has been adequately modified to allow his body to naturally produce the winter flounder type I AFP.”
— they said.
They got this idea from Arctic and Antarctic fish species, which produce AF(G)Ps that help them survive in their cold, ice-laden habitats in two different ways. First, the proteins lower the freezing temperature of water below the melting temperature, creating a temperature gap known as the thermal hysteresis (TH) gap. Second, through ice recrystallization inhibition (IRI) activity: these fish ingest small ice crystals throughout their life span, and their AF(G)Ps block further growth of the internalized crystals, enabling the fish to survive despite the presence of small ice grains in their blood and in certain vital organs.
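The thermal hysteresis mechanism can be sketched in a few lines of code. The melting point and the size of the TH gap below are hypothetical illustrative values, not measurements from the paper:

```python
# Sketch of the thermal-hysteresis (TH) gap described above.
# Both constants are assumed illustrative values, not data from the study.
MELTING_POINT_C = -1.0   # seawater-like melting point (assumed)
TH_GAP_C = 1.5           # TH gap opened by AF(G)Ps (assumed)

def ice_grows(temperature_c: float) -> bool:
    """Ice crystals grow only below the depressed freezing point,
    i.e. below (melting point - TH gap)."""
    return temperature_c < MELTING_POINT_C - TH_GAP_C

# Inside the gap: below the melting point, yet ice does not grow.
assert not ice_grows(-2.0)
# Below the depressed freezing point, ice growth resumes.
assert ice_grows(-3.0)
```

The key point is that the fish’s body fluids can sit below the melting point without freezing, as long as the temperature stays inside the gap.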
“AF(G)Ps would be a more plausible way to improve cryopreservation given that they inhibit ice recrystallization in marine fish.” they said. “It is likely that the Winter Soldier is injected with some sort of serum or medication prior to being brought in cryostatus in the 2014 film Captain America: The Winter Soldier. This serum could contain synthetic cryoprotectants as well as an anaesthetic leading to loss of awareness and external sensation. However, we contend that following his super-soldier experimentation, the Winter Soldier’s DNA has been modified to such an extent that he can naturally produce AF(G)Ps when his body is subjected to freezing temperatures.”
They also hypothesized that advances in genetic engineering techniques such as CRISPR/Cas9 could be very important for the possible development of genetically enhanced humans like the Winter Soldier in future scientific laboratories. It may be possible to insert the wf-afp gene into human DNA using CRISPR/Cas9, thus providing the human body with the genetic code needed to potentially produce the wf-AFP protein. As a result, we would be able to replicate, in part, the Winter Soldier’s ability to produce proteins that combat the ice-crystal growth that could arise during cryopreservation.
“However, giving the human body the ability to produce antifreeze proteins when in cryostatus is only part of the story. Unlike the films of the Marvel Cinematic Universe (MCU), scientists in the real world have yet to develop techniques that can resuscitate a person from cryostatus.”
— they concluded.
Reference: Suris-Valls, R., Mehmedbasic, M., & Voets, I. K. (2018). Marine Fish Antifreeze Proteins: The Key Towards Cryopreserving The Winter Soldier. Superhero Science and Technology, 1(1). https://doi.org/10.24413/sst.2018.1.2105
Note for editors of other websites: To reuse this article fully or partially, kindly give credit to our author/editor S. Aman or provide a link to our article.
Mariam Bouhmadi-López and colleagues have constructed a symmetric wormhole solution in General Relativity (GR), supported by a 3-form field with a potential that contains a quartic self-interaction term. They hint at the possibility that 3-form wormholes could be potential black hole mimickers, as long as the coefficient of the quartic self-interaction term (Λ) is sufficiently large, which is precisely when the null energy condition (NEC) is only weakly violated. Their study recently appeared on arXiv.
General Relativity (GR) is a well-tested theory, so it would be interesting to find traversable wormhole solutions in GR with a “physically reasonable” matter field to support the throat. Such a matter field should preferably possess the correct sign for its kinetic energy, though it must necessarily still violate some energy conditions. It would be even better if such a matter field were in some sense natural (e.g., if it could also be applied to explain cosmological puzzles such as the accelerated expansion of the Universe). One natural candidate is the 3-form field, which is ubiquitous in string theory, within a cosmological framework and beyond.
Now, Mariam Bouhmadi-López and colleagues have numerically constructed a symmetric wormhole solution in pure Einstein gravity supported by a massive 3-form field with a potential that contains a quartic self-interaction term.
They found that the wormhole spacetimes have only a single throat and are everywhere regular and asymptotically flat. Furthermore, their mass and throat circumference increase almost linearly as the coefficient of the quartic self-interaction term Λ increases.
The amount of violation of the null energy condition (NEC) is proportional to the magnitude of the 3-form; thus the NEC is violated less as Λ increases, since the magnitude of the 3-form decreases with Λ.
In addition, they investigated the geodesics of null and timelike particles moving around the wormhole. They found that the unstable photon sphere, on which photons can undergo circular motion around the wormhole, lies exactly at the wormhole throat. They also found that the wormhole casts a shadow whose apparent size is smaller than that cast by the Schwarzschild black hole, but approaches it when Λ acquires a large value.
Moreover, they also discussed the behavior of the innermost stable circular orbit (ISCO) around this wormhole and found that the ISCO radius deviates from its Schwarzschild counterpart when Λ is small but approaches it for larger Λ. Thus, their wormholes can act as black hole mimickers when Λ is large, precisely when the NEC is less violated.
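For orientation, the Schwarzschild benchmarks the authors compare against follow from standard GR textbook formulas (not from the paper’s wormhole metric). A quick sketch for a one-solar-mass object:

```python
# Schwarzschild reference radii: photon sphere at 3GM/c^2, ISCO at 6GM/c^2.
# Standard textbook values, used here only as a benchmark illustration.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg

rg = G * M_sun / c**2          # gravitational radius GM/c^2, ~1477 m
photon_sphere = 3 * rg         # unstable photon orbit
isco = 6 * rg                  # innermost stable circular orbit
shadow = 3 * 3**0.5 * rg       # apparent shadow radius, 3*sqrt(3) GM/c^2

print(f"GM/c^2        ~ {rg:.0f} m")
print(f"photon sphere ~ {photon_sphere:.0f} m")
print(f"ISCO          ~ {isco:.0f} m")
```

A wormhole mimics a black hole when its photon orbit, shadow, and ISCO approach these values, which is what the study finds for large Λ.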
“Of course, most astrophysical black holes rotate, so it remains to be seen if this mimicry still holds when rotation is considered.”
— authors of the study.
“Future investigation will look into the radial perturbation on the background metric and the form field to check stability, among other considerations. Indeed, the wormhole solutions supported by the complex phantom scalar field are found to be unstable against linear perturbations. It would be interesting to check whether our wormholes suffer from the same instability, and if so, for which range of Λ.”, they concluded.
Reference: Mariam Bouhmadi-López, Che-Yu Chen, Xiao Yan Chew, Yen Chin Ong, Dong-han Yeom, “Traversable Wormhole in Einstein 3-Form Theory With Self-Interacting Potential”, arXiv preprint, pp. 1–12, 2021. arXiv:2108.07302
Twenty-five years ago, an enigmatic signal was discovered while analyzing the polarization of sunlight with a new instrument, the Zurich Imaging Polarimeter (ZIMPOL).
This mysterious linear polarization signal appears at the 5896 Å wavelength of the D1 line of neutral sodium where, according to the line’s quantum numbers, no linear polarization due to scattering processes should be present. This polarization signal was therefore totally unexpected, and its interpretation immediately opened an intense scientific debate. The mystery further increased two years later, when the journal Nature published a letter with an explanation, which required that the sublevels of the lower level of the D1 line are not equally populated.
In that theoretical work, the enigmatic polarization signal of the D1 line was reproduced remarkably well. However, the proposed explanation implied that the region of the solar atmosphere known as the chromosphere is completely unmagnetized, in apparent contradiction with established results, which instead indicate that the quiet regions (outside sunspots) of the solar chromosphere are permeated by magnetic fields in the gauss range. This opened a serious paradox, which has challenged solar physicists for many years, and even led some scientists to question the quantum theory of atom-photon interactions.
A first breakthrough towards the resolution of the paradox was achieved in 2013 at the IAC, when Luca Belluzzi and Javier Trujillo Bueno theoretically discovered a new mechanism through which linear polarization can be produced in the sodium D1 line without the need for population imbalances in its lower level. However, that important step applied only to the idealized case of a solar atmosphere model without magnetic fields.
In an article published today by Physical Review Letters, the prestigious scientific journal of the American Physical Society, Ernest Alsina Ballester, Luca Belluzzi, and Javier Trujillo Bueno show the solution to this intriguing paradox, which has puzzled solar physicists since 1998. As shown in figure 1, this team of researchers has been able to reproduce the enigmatic observations of the D1 line polarization, in the presence of magnetic fields in the gauss range.
To achieve this result, it was necessary to carry out the most complete theoretical modeling of this polarization signal ever attempted, accounting for the joint action of very complex physical mechanisms. This required three years of work, carried out through a close cooperation between the Istituto Ricerche Solari (IRSOL) in Locarno-Monti (affiliated to the Università della Svizzera italiana) and the POLMAG group of the Instituto de Astrofísica de Canarias (IAC) in Tenerife (see http://research.iac.es/proyecto/polmag/).
This result has very important consequences. Linear polarization signals, like the one observed in the D1 line of sodium, are extremely interesting because they encode unique information on the elusive magnetic fields present in the solar chromosphere. This key interface layer of the solar atmosphere, located between the underlying cooler photosphere and the overlying million-degree corona, is at the core of several enduring problems in solar physics, including the understanding and prediction of the eruptive phenomena that may strongly impact our technology-dependent society.
The magnetic field is known to be the main driver of the spectacular dynamical activity of the solar chromosphere, but our empirical knowledge of its intensity and geometry is still largely unsatisfactory. The solution of the long-standing paradox of solar D1 line polarization proves the validity of the present quantum theory of spectral line polarization, and opens up a new window to explore the magnetism of the solar atmosphere in the present new era of large-aperture solar telescopes.
This research has received funding from the Swiss National Science Foundation (SNSF) through Grant 200021-175997 and from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (Advanced Grant agreement No. 742265).
“We have never modeled the trajectory of an asteroid with this precision before,” explains Davide Farnocchia of NASA, first author of a new study that recalculated the orbit of the asteroid Bennu over the next two centuries using data obtained by the Osiris-Rex spacecraft. Until 2135 we can rest assured; after that, the asteroid has a small chance of passing through the wrong gravitational keyhole.
On September 24, 2023, an abundant handful of carbonaceous regolith, collected by NASA’s Osiris-Rex spacecraft directly from the surface of the asteroid Bennu in October 2020, is expected to arrive on Earth. Also on September 24, but many years later, precisely in 2182, Earth risks receiving the entire mass of the half-kilometer-wide asteroid instead. Let’s see why.
The asteroid (101955) Bennu, discovered in 1999 and repeatedly observed with optical telescopes and radar, is a Near-Earth Object whose orbit potentially intersects Earth’s. Indeed, on the Palermo scale – used by astronomers to assess the risk of impact – it is one of the two objects with the highest overall probability of causing damage to Earth.
A new study, published in the journal Icarus, has now considerably refined the predictions of a possible encounter between Bennu and our planet, based on the information gathered by the Osiris-Rex mission in the more than two years it spent at Bennu. Osiris-Rex closely investigated the asteroid’s size, shape and composition, studying its rotation and orbital trajectory in much more detail than any Earth-based telescope could. This has made it possible to estimate its path over the next couple of centuries far more accurately.
The new study determined an overall probability of impact by the year 2300 of 1 in 1750 (or 0.057 percent), while – as mentioned – September 24, 2182 is the single most significant date in terms of potential impact, with a probability of 1 in 2700 (about 0.037 percent).
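The two ways of quoting the same probabilities are easy to reconcile. A minimal check that the “1 in N” odds match the quoted percentages:

```python
# Convert the study's "1 in N" odds to percentages.
def odds_to_percent(n: int) -> float:
    return 100.0 / n

cumulative_by_2300 = odds_to_percent(1750)   # overall impact probability by 2300
single_date_2182 = odds_to_percent(2700)     # September 24, 2182 alone

print(f"{cumulative_by_2300:.3f}%")  # ~0.057%
print(f"{single_date_2182:.3f}%")    # ~0.037%
```

Note the cumulative figure is larger than any single date’s, since it sums the risk over many possible encounter dates.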
“The Osiris-Rex data give us much more precise information; we can test the limits of our models and calculate Bennu’s future trajectory with a very high degree of certainty up to 2135,” comments the study’s first author Davide Farnocchia, of NASA’s Center for Near-Earth Object Studies (CNEOS). “We have never modeled the trajectory of an asteroid with this precision before.”
In 2135, the asteroid Bennu will come very close to Earth. While it poses no risk at that time, it is essential for scientists to understand as accurately as possible how Earth’s gravity will alter the asteroid’s path around the Sun – in particular, whether the asteroid will travel along certain specific trajectories, passing through points known as gravitational keyholes, which would put it on a collision course with Earth during a future orbit.
To calculate exactly where Bennu will be during its close passage of 2135, and whether or not it may pass through a gravitational keyhole, Farnocchia and his team evaluated the various kinds of forces that, however small, can affect the asteroid’s path as it orbits the Sun.
Among these forces, the Sun’s own heat plays a crucial role through a phenomenon called the Yarkovsky effect: as the body rotates between “day” and “night”, the energy absorbed from the Sun is re-emitted into space, generating a small amount of thrust on the asteroid. With the forthcoming availability of a sample of Bennu’s surface, the asteroid’s thermal properties can be determined even more accurately.
“The Yarkovsky effect acts on asteroids of all sizes and, although it had been measured from afar for only a small fraction of the asteroid population, Osiris-Rex gave us the first opportunity to measure it in detail as Bennu traveled around the Sun,” explains Steve Chesley of NASA’s Jet Propulsion Laboratory, one of the authors of the new study. “The effect on Bennu is equivalent to the weight of three grapes constantly acting on the asteroid: tiny, yes, but significant when determining Bennu’s impact probabilities in the decades and centuries to come.”
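A back-of-envelope estimate shows why such a tiny push matters over centuries. The grape mass (~5 g each) and Bennu’s mass (~7.3×10¹⁰ kg, roughly the value reported from Osiris-Rex tracking) are assumptions for illustration, not figures from the study:

```python
# Rough scale of the "three grapes" Yarkovsky thrust quoted above.
# Assumed values: a grape ~5 g; Bennu's mass ~7.3e10 kg.
g = 9.81                         # m/s^2, Earth surface gravity
force = 3 * 0.005 * g            # weight of three grapes, ~0.15 N
m_bennu = 7.3e10                 # kg (assumed, approximate published value)

accel = force / m_bennu          # ~2e-12 m/s^2
century = 100 * 365.25 * 86400   # seconds in a century

# Displacement if that tiny acceleration acted unopposed for a century:
drift_km = 0.5 * accel * century**2 / 1000
print(f"acceleration ~ {accel:.1e} m/s^2")
print(f"drift over a century ~ {drift_km:.0f} km")
```

Even this crude estimate gives a drift of thousands of kilometers per century, comparable to Earth’s size, which is why the effect must be modeled precisely to predict keyhole passages.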
The research team also considered many other perturbing forces, including the gravity of the Sun, the planets, their moons, and more than 300 other asteroids, the drag caused by interplanetary dust, solar wind pressure, and particle-ejection events from Bennu itself.
The researchers also evaluated the force exerted by Osiris-Rex during its Touch-And-Go sample collection event on October 20, 2020, to see whether it could have slightly altered Bennu’s orbit. The theoretical estimates were confirmed: the sampler’s small touch had a negligible effect.
By favoring the release of hydrogen from the atmosphere of Mars, regional dust storms contribute significantly to the loss of water from the Red Planet: these are the results of the analysis of a Martian dust storm observed in 2019 by the Mars Reconnaissance Orbiter, Trace Gas Orbiter and Maven probes. Giancarlo Bellucci of INAF also participated in the study.
Mars, now an arid and dry planet, has not always been this way: three and a half billion years ago it was quite similar to Earth, with vast oceans of liquid water covering its surface and an atmosphere denser and more humid than today’s. Some of this water is currently found in the ice caps at the poles or in the planet’s crust, but a significant portion has been lost over billions of years.
In the 1970s, the Mariner 6 and 7 probes provided the first data for studying these “gas leaks”, which scientists had initially interpreted as a slow and constant phenomenon. However, more recent observations have shown a strong seasonal variability: during the summer, the atmosphere is warmer and water vapor tends to rise higher. At higher altitudes, the atmosphere is more tenuous and the molecules are more vulnerable to ultraviolet radiation from the Sun, which separates the hydrogen atoms from the oxygen ones and favors their escape.
On Mars, however, not all summers are the same: the peaks of hydrogen loss in the southern hemisphere summer – which occurs near perihelion, the point of the planet’s orbit closest to the Sun – are ten to one hundred times higher than the values measured in the northern hemisphere summer. In addition to seasonal variations, the role of dust storms in favoring this process has also been investigated for some time, in particular the global storms that envelop the planet every one to three Martian years: the dust heats the atmosphere, triggering similar hydrogen leaks.
A new study, published today in the journal Nature Astronomy, reveals the importance in this balance of a factor largely underestimated until now: dust storms on a regional scale. Smaller but more frequent than their global counterparts, these short-lived storms play a key role today in fostering the loss of hydrogen and, with it, the relentless drying out of the Red Planet. According to the new findings, Mars loses twice as much water during a regional dust storm as in an entire southern-hemisphere summer season without regional storms.
Until now, it was the lack of data that had prevented researchers from studying regional dust storms in detail. That all changed in early 2019, when a rare coincidence aligned the orbits of three space probes around Mars – NASA’s Mars Reconnaissance Orbiter (MRO) and Maven, and the ExoMars Trace Gas Orbiter (TGO), a collaboration between ESA and Roscosmos – precisely during one of these storms. It was thus possible to observe the region affected by the storm, which broke out in early January and lasted several weeks, with four instruments: Mars Climate Sounder on MRO, the Imaging Ultraviolet Spectrograph on Maven, and, on board TGO, both Nomad (Nadir and Occultation for Mars Discovery), a Belgian-Spanish-Italian-British-led instrument whose Italian contribution is financed by the Italian Space Agency, and the Russian-led instrument ACS (Atmospheric Chemistry Suite).
“We really captured the whole system in action,” comments lead author Michael S. Chaffin of the University of Colorado at Boulder.
By combining the data from the four instruments, the researchers were able to measure the temperature and the concentration of dust and water ice from the surface up to about one hundred kilometers of altitude with MRO, the concentration of water vapor and ice over the same altitude range with TGO, and the quantity of hydrogen – derived from water vapor molecules “broken” by solar radiation – up to a thousand kilometers of altitude with Maven. In particular, the Nomad instrument aboard TGO measures the abundance of gases present at various altitudes in the atmosphere of Mars with extreme precision through the solar occultation technique, observing sunlight passing through the atmosphere to detect very small amounts of gas molecules, including water vapor, at different altitudes.
“In the study published today, Nomad data were used to measure the abundance of water vapor before and during a regional dust storm,” explains Giancarlo Bellucci , INAF researcher in Rome, co-principal investigator of the Nomad instrument and co-principal author of the research.
Together with ACS, Nomad showed the presence of water vapor in the lower atmosphere before the start of the dust storm; then, as the dust rose, heating the atmosphere and triggering winds, the instruments revealed water vapor catapulted to higher altitudes. “It has been seen that the water, due to the dust storm, is pumped from the troposphere up to altitudes of 60 km. At these heights, water molecules are broken up by solar radiation and dispersed into space. Since regional storms are more frequent than global ones, this mechanism can explain how Mars has lost much of its water over time,” adds Bellucci.
After the storm began, the instruments aboard TGO measured ten times more water in the middle atmosphere. These observations coincide with what MRO detected: the radiometer on board the probe measured the increase in atmospheric temperatures as the dust accumulated above Mars.
“The instruments should all tell the same story, and they do,” points out co-author Geronimo Villanueva of the NASA Goddard Space Flight Center in Greenbelt, Maryland.
MRO also saw the water-ice clouds disappear, as expected, since the ice could no longer condense near the surface, which had warmed in the meantime. Further confirmation came from the ultraviolet spectrograph images aboard Maven: the ice clouds, visible before the storm over the volcanoes of the Martian Tharsis region, completely disappeared during the storm and reappeared soon after. Maven’s observations also captured an aurora in the upper atmosphere accompanied by a fifty percent increase in hydrogen during the storm, reflecting the breakdown of water vapor into hydrogen and oxygen by solar radiation; the increase in hydrogen corresponded to the water measured, a week earlier, at 60 km of altitude.
“This article helps us to virtually go back in time and say: ‘Okay, now we have another way to lose water that will help us relate the little water we have on Mars today to the huge amount of water we had in the past,’” concludes Villanueva.
Featured image: Diagram of the water cycle in the atmosphere of Mars. Credits: ESA
To know more:
Read in Nature Astronomy the article “Martian Water Loss to Space Enhanced by Regional Dust Storms” by M. S. Chaffin, D. M. Kass, S. Aoki, A. A. Fedorova, J. Deighan, K. Connour, N. G. Heavens, A. Kleinböhl, S. K. Jain, J.-Y. Chaufray, M. Mayyasi, J. T. Clarke, A. I. F. Stewart, J. S. Evans, M. H. Stevens, W. E. McClintock, M. Crismani, G. M. Holsclaw, F. Lefèvre, D. Y. Lo, F. Montmessin, N. M. Schneider, B. Jakosky, G. Villanueva, G. Liuzzi, F. Daerden, I. R. Thomas, J.-J. Lopez-Moreno, M. R. Patel, G. Bellucci, B. Ristic, J. T. Erwin, A. C. Vandaele, A. Trokhimovskiy, O. I. Korablev
An international team of astronomers, including researchers from INAF and the University of Bologna, has published the most detailed radio-band images ever seen of distant galaxies, revealing their inner workings in unprecedented detail. The images, with an angular resolution below one arcsecond, were obtained from data collected by the Low Frequency Array (LoFar), a network of over 70 thousand small antennas scattered across nine European countries.
What you see in the image opposite are jets of radio waves emitted by the supermassive black hole at the center of Hercules A, a galaxy with an active nucleus located in the constellation of Hercules, two billion light years away from us. It is not an artistic representation: those jets are “the real thing” – the image that emerges from the data. Not an artistic representation, we said, but not a traditional photograph either. Being invisible to optical telescopes, radio antennas are needed to “see” these jets. Special radio antennas, however, tuned to very ordinary frequencies: more or less the traditional 88–108 MHz range used by FM stations here on Earth. But there is more: to produce such detailed images of galaxies billions of light years away, a single radio antenna is not enough. A technique called interferometry is needed, which requires dozens of antennas – sometimes hundreds or thousands – and they must be far from each other. To reconstruct the image above – and the others you see in the animation below – over 70 thousand antennas spread over eight European countries were used: the antennas of the LoFar network. The result, presented in a series of ten articles put online today and to be published starting tomorrow in Astronomy & Astrophysics, is a record: a resolution of 300 milliarcseconds, the highest ever reached at these frequencies.
“These are the first low-frequency radio images ever obtained with a quality and angular resolution comparable to those produced by the Hubble telescope in the optical band. To do this,” the astrophysicist Etienne Bonnassieux, first author of one of the ten articles, postdoc researcher at the University of Bologna and INAF associate, explains to Media Inaf, “we had to understand how to correct the effects of the ionosphere, which cause the decorrelation of the radio signals received by the international LoFar stations: it is a disturbance similar to seeing for ground-based telescopes, but our stations are as much as many hundreds of kilometers apart.”
And what do these images tell us? Thanks to the fact that radio waves – unlike normal visible light – can easily pass through the clouds of gas and dust that envelop many astronomical objects, LoFar’s antennas allow astronomers to see what is happening in star-forming regions, for instance, or in active galactic nuclei, where supermassive black holes reside. These black holes devour the matter falling towards their event horizon, triggering the emission of the very powerful radio jets picked up by the LoFar antennas, allowing us to observe their internal structure in detail and to reveal aspects unknown until now. The resolution obtained is also a record for LoFar itself: these images are 20 times sharper than those the European radio telescope had previously managed to produce, thanks to the innovative way the team of scientists used the array – that is, the set of antennas. Usually, to produce standard-resolution images, data are acquired with all the antennas but only the signals from those located in the Netherlands are correlated, which yields, through interferometry, a “virtual telescope” with a collecting mirror 120 km in diameter – the maximum distance between the Dutch antennas. Instead, using the signals of all 70 thousand European antennas, as was done in this case, is as if the diameter of the virtual mirror had grown to almost 2000 km, increasing the resolution by about twenty times.
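The resolution gain from longer baselines follows from the standard diffraction limit, θ ≈ λ/D. A rough sketch, assuming an observing frequency of 140 MHz (within LoFar’s high band) and the ~2000 km maximum European baseline quoted above:

```python
# Rough diffraction-limited resolution of the "virtual telescope":
# theta ~ lambda / D, converted to milliarcseconds.
import math

c = 2.998e8            # speed of light, m/s
freq = 140e6           # Hz, assumed LoFar high-band frequency
baseline = 2.0e6       # m, ~2000 km maximum European baseline

wavelength = c / freq              # ~2.1 m
theta_rad = wavelength / baseline
mas = theta_rad * (180 / math.pi) * 3600 * 1000
print(f"~{mas:.0f} mas")
```

This gives a value of the same order as the 300-milliarcsecond record; the resolution actually achieved also depends on baseline coverage and how the data are weighted during imaging.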
However, the high number of antennas and the large extent of the area over which they are arranged are only two of the requirements for obtaining such sharp images. The 70 thousand antennas must also work as if they were a single antenna, so it is essential to correlate the signal acquired by each of them. Unlike conventional arrays, which correlate the signals from the various antennas in real time to produce images, the signals collected by each LoFar antenna are first digitized, then sent to the central processor, and finally combined to create an image. Each LoFar image is therefore the result of combining signals from over 70,000 antennas, which demands enormous computing power: we are talking, for a single image, of over 13 terabits of raw data per second – the equivalent of three hundred DVDs – to process.
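The “three hundred DVDs per second” comparison is easy to verify. Assuming a standard single-layer DVD capacity of 4.7 GB:

```python
# Checking the "300 DVDs" comparison for 13 terabits of raw data per second.
TBIT = 1e12                        # bits in a terabit
dvd_bytes = 4.7e9                  # single-layer DVD capacity in bytes (assumed)

bytes_per_second = 13 * TBIT / 8   # convert bits to bytes
dvds_per_second = bytes_per_second / dvd_bytes
print(f"~{dvds_per_second:.0f} DVDs per second")
```

The result is roughly 350 DVDs per second, consistent with the figure quoted in the article.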
And we are only at the beginning. “What we want to do now,” Gianfranco Brunetti, INAF astrophysicist in Bologna, another author of the articles published in A&A and Italian coordinator of the LoFar collaboration, tells Media Inaf, “is to map large areas of the sky with high angular resolution. In this case a difficulty is represented by the great computational demands: it will be necessary to distribute the calculations of the complex data-analysis algorithms effectively across supercomputers. This is the new challenge. Consider that today, mapping a single LoFar pointing at 140 MHz at high angular resolution – an area of the sky of 2.5 × 2.5 degrees – takes about seven thousand hours of computation on a node of a latest-generation cluster.”
Speaking of computation, it should finally be emphasized that, in addition to the scientific results, the LoFar team has also made its algorithms public – in particular the so-called data-analysis pipeline, described in detail in one of the articles published in A&A – so as to allow anyone to produce high-resolution images with relative ease. “Our goal,” concludes another scientist on the LoFar team, radio astronomer Leah Morabito of Durham University, “is to enable the scientific community to use the entire European network of LoFar telescopes to produce science, without being forced to take years to become experts.”
Featured image: Hercules A is powered by a supermassive black hole, located at its center, which feeds on the surrounding gas and channels part of it into extremely fast jets. The new high-resolution observations made with LoFar have shown that the intensity of this jet is modulated – from stronger to weaker, and vice versa – over a period of a few hundred thousand years. This variability is at the origin of the beautiful structures we can admire in the giant lobes, each of which is about the size of the entire Milky Way. Credits: R. Timmerman; LoFar & Hubble Space Telescope
Using multi-wavelength observations – from millimeter waves to gamma rays – of the star-forming region of Ophiuchus, astronomers have revealed a flow of the radionuclide aluminum-26 from a nearby star cluster. Their discovery indicates that supernovae in the star cluster are the most likely source of short-lived radionuclides in star-forming clouds. All the details are in Nature Astronomy.
An active star-forming region in the constellation Ophiuchus is providing astronomers with new insights into the conditions under which the solar system was born. In particular, a new study shows how the solar system may have been enriched with short-lived radionuclides – radioactive elements whose mean lifetime is less than 100 million years.
Evidence of this enrichment process dates back to the 1970s, when scientists studying mineral inclusions in meteorites concluded that they must be pristine remains of the early solar system and that they contained the decay products of short-lived radionuclides. These radioactive elements may have been “blown” into the forming solar system by a nearby supernova or by the strong stellar winds emitted by a type of massive star known as a Wolf-Rayet star.
The authors of the new study, published August 16 in Nature Astronomy, used multi-wavelength observations of the Ophiuchus star-forming region – including spectacular new infrared data – to reveal interactions between the star-forming gas clouds and the radionuclides produced in a nearby cluster of young stars. Their discovery indicates that supernovae in the star cluster are the most likely source of short-lived radionuclides in star-forming clouds.
“The solar system most likely formed in a giant molecular cloud together with a young star cluster, and one or more supernova events generated by massive stars belonging to the cluster contaminated the gas that turned into the Sun and its planetary system,” explains Douglas N. C. Lin, professor emeritus of astronomy and astrophysics at UC Santa Cruz. “Although this scenario has been suggested in the past, the strength of this paper is the use of multi-wavelength observations and a sophisticated statistical analysis to infer a quantitative measurement of the likelihood of the model.”
The data from gamma-ray space telescopes, explains John Forbes, first author of the study and researcher at the Flatiron Institute’s Center for Computational Astrophysics, allow the detection of gamma rays emitted by the radionuclide aluminum-26, a radioactive isotope of aluminum with a half-life of about 720,000 years. “These are challenging observations,” says the researcher. “We can only convincingly detect it in two star-forming regions, and the best data come from the Ophiuchus complex.”
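As a quick sanity check on the numbers, the half-life quoted for aluminum-26 translates directly into a decay constant and a mean lifetime. A minimal sketch (the 720,000-year figure is the one given in the article; everything else follows from the standard exponential-decay law):

```python
import math

# Aluminum-26 half-life quoted in the article: ~720,000 years.
T_HALF_YR = 7.2e5

# Decay constant and mean lifetime follow directly from the half-life.
decay_const = math.log(2) / T_HALF_YR   # decays per year
mean_life = 1.0 / decay_const           # ~1.04 million years

def fraction_remaining(t_years: float) -> float:
    """Fraction of an initial Al-26 sample left after t_years."""
    return math.exp(-decay_const * t_years)

# After exactly one half-life, half the sample remains.
print(round(fraction_remaining(T_HALF_YR), 3))  # 0.5
```

This short lifetime is the whole point: any aluminum-26 seen today in Ophiuchus must have been produced recently, on astronomical timescales, by the nearby cluster.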
The Ophiuchus cloud complex contains many dense protostellar cores in various stages of star formation and protoplanetary-disk development, representing the earliest stages in the formation of a planetary system. By combining imaging data at wavelengths ranging from millimeters to gamma rays, the researchers were able to visualize a flux of aluminum-26 from the nearby star cluster into the Ophiuchus star-forming region. “The enrichment process we are seeing in Ophiuchus is consistent with what happened during the formation of the solar system 5 billion years ago,” explains Forbes. “Once we saw this nice example of how the process could happen, we started trying to model the nearby star cluster that produced the radionuclides we see in gamma rays today.”
Forbes developed a model that takes into account every massive star that may have existed in this region – including its mass, age, and likelihood of exploding as a supernova – and incorporates the potential yields of aluminum-26 from stellar winds and supernovae. The model allowed him to determine the likelihood of different scenarios for the aluminum-26 production observed today. “We now have enough information to say that there is a 59 percent chance that it is due to supernovae and a 68 percent chance that it comes from multiple sources and not just one supernova,” continues Forbes. This type of statistical analysis assigns probabilities to scenarios that astronomers have been debating for the past 50 years, Lin notes. “This is the new direction for astronomy: quantifying likelihood,” he adds.
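The kind of probabilistic bookkeeping described above can be caricatured with a toy Monte Carlo. Every star, probability, and yield below is invented purely for illustration – the actual model in the paper is far more detailed – but the structure is the same: draw many possible histories of the cluster and count how often supernovae dominate the aluminum-26 budget.

```python
import random

def simulate(stars, n_trials=10_000, seed=0):
    """Return (fraction of trials where supernova yield beats wind yield,
    fraction of trials with more than one contributing supernova)."""
    rng = random.Random(seed)
    sn_dominates = multi_sn = 0
    for _ in range(n_trials):
        sn_yield = wind_yield = 0.0
        n_sn = 0
        for star in stars:
            wind_yield += star["yield_wind"]        # winds always contribute
            if rng.random() < star["p_supernova"]:  # did this star explode?
                sn_yield += star["yield_sn"]
                n_sn += 1
        sn_dominates += sn_yield > wind_yield
        multi_sn += n_sn > 1
    return sn_dominates / n_trials, multi_sn / n_trials

# Hypothetical cluster: three massive stars with made-up explosion
# probabilities and Al-26 yields (arbitrary units).
stars = [
    {"p_supernova": 0.6, "yield_sn": 1.0, "yield_wind": 0.2},
    {"p_supernova": 0.3, "yield_sn": 0.8, "yield_wind": 0.4},
    {"p_supernova": 0.5, "yield_sn": 1.2, "yield_wind": 0.1},
]
p_sn, p_multi = simulate(stars)
print(f"supernovae dominate in {p_sn:.0%} of trials, "
      f"multiple supernovae in {p_multi:.0%}")
```

The real analysis additionally marginalizes over each star’s mass and age; the point here is only how a model can turn per-star uncertainties into percentages like the 59 and 68 percent quoted in the article.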
The new findings also show that the amount of short-lived radionuclides incorporated into newly formed star systems can vary widely. “Many new star systems will be born with abundances of aluminum-26 in line with our solar system’s, but the variation is enormous – several orders of magnitude,” says Forbes. “This is important for the early evolution of planetary systems, since aluminum-26 is the main source of initial heating. More aluminum-26 probably means drier planets.”
The infrared data, which allowed the team to peer through the dusty clouds at the heart of the star-forming complex, were obtained by co-author João Alves of the University of Vienna as part of ESO’s VISIONS survey of nearby stellar nurseries, carried out with the VISTA telescope in Chile. “There is nothing special about Ophiuchus as a star-forming region,” concludes Alves. “It’s just a typical configuration of gas and massive young stars, so our results should be representative of the enrichment of short-lived radioactive elements in star and planet formation throughout the Milky Way.”
Featured image: deep near-infrared color composite of the L1688 cloud in the Ophiuchus star-forming complex from the ESO VISIONS survey, where blue, green and red are mapped to the near-infrared J (1.2 μm), H (1.6 μm) and Ks (2.2 μm) bands, respectively. Credits: João Alves / ESO VISIONS
Using an ultracold quantum gas of dysprosium atoms, it was possible for the first time to obtain a two-dimensional “droplet” lattice that possesses both the properties of a solid and those of a superfluid. We talk about it with Francesca Ferlaino of the University of Innsbruck, the scientist at the head of the team of experimental physicists who conducted the experiment.
It’s called supersolidity, and it’s a state of matter that physicists have chased for decades. A quantum state with almost magical properties: a supersolid simultaneously possesses the properties of a solid – with the atoms arranged in an ordered way, forming a crystalline structure – and those of a superfluid – in particular, the absence of friction. A bit like an ice cube sliding on water, or water sliding on an ice cube, if one wants an analogy, but with each individual atom being, at the same time, both water and ice. A state that is difficult even to imagine, let alone to achieve. Yet in the laboratory of the Department of Experimental Physics of the University of Innsbruck, Austria, they have succeeded.
They succeeded by using an ultracold quantum gas – we are talking about just a hundred nanokelvin, on the verge of absolute zero – of dysprosium atoms: an element of the rare-earth group which, at low temperatures, is strongly magnetic. And what they managed to produce is an absolute first: not a single row of atoms – that is, a one-dimensional supersolid, already achieved two years ago – but a matrix: a two-dimensional supersolid (see opening image). A lattice of what scientists call droplets: “drops” of higher density in a sea of ultracold quantum gas. With a purely quantum, completely counterintuitive peculiarity that establishes its supersolidity: despite the densification, the dysprosium atoms are indistinguishable, as each of them is spread over the entire lattice. “Usually, one would think that each atom is in a given droplet, and that there is no way for that atom to move between them,” says the first author of the study published today in Nature, Matthew Norcia, of the Institute for Quantum Optics and Quantum Information (IQOQI) of the Austrian Academy of Sciences in Innsbruck. “On the contrary, in the supersolid state each particle is delocalized over all the droplets, being simultaneously in each of them. The system thus forms a series of high-density regions (the droplets) that share the same delocalized atoms.”
But how did they get there? To understand, it is best to go back to the origins of the studies on supersolidity, explains to Media Inaf the scientist at the head of the group that carried out the experiment, Francesca Ferlaino, who, after a degree in physics from the Federico II University in Naples and a doctorate at LENS in Florence, is today full professor at the University of Innsbruck, Austria, as well as managing and research director of IQOQI, also in Innsbruck.
«If we have to choose a year from which to start this story, I would say 1957, when the great founding fathers of quantum mechanics started wondering what the most paradoxical states this theory could support were, and superfluidity was becoming an increasingly well-known phenomenon. First Eugene Gross, then Andreev, Lifshitz and Chester wondered whether it was possible to create a state of matter that on the one hand shows the rigidity of a solid, but on the other flows like a liquid».
And was it possible?
«From the point of view of the rules of quantum mechanics, yes: this type of state seemed to exist. Of course it is a paradoxical state – even just imagining a crystal flowing is quite difficult. This opened up a whole series of theoretical debates in the 1960s and 1970s, with scientists like Oliver Penrose and Lars Onsager saying no: superfluidity, with a fluid that flows without friction, and localization are two orders of matter that contradict each other – one prevents the formation of the other – so this supersolid state cannot exist».
On the other hand, were there those who said that yes, maybe it could exist?
«Well, there were those who wondered: if it existed, in what system could it be seen? Which system in nature? Here a great contribution came from Nobel laureate Anthony Leggett, the theoretical physicist who in 1970 published an article – now considered a milestone on the subject – entitled “Can a solid be superfluid?”. An article in which he discusses various possibilities».
«The experimental approaches – or theoretical simulations – started from the idea of creating a solid that at some point, if cooled enough, could acquire superfluid properties. A material in which there were not two components – a solid and a superfluid – but a single, quantum-mechanically indistinguishable kind of particle, behaving at the same time as localized and delocalized».
Which material? Was there any candidate?
«From an experimental point of view, the search for this state began in solid helium. A great result was published in Nature in 2004 by two scientists at Penn State University – E. Kim and M. H. W. Chan – in an article entitled “Probable observation of a supersolid in helium”. A result that attracted the attention of the whole condensed-matter community: there were great debates, and many theoretical groups tried to reproduce it with calculations, numerical simulations, and so on. The community began to split in two: those who believed in the result and those who tended to be a little skeptical. After all, even just from the title of the paper – “probable observation…” – it was obvious that not everything was completely under control at that time. The debate was very heated, but it remained within this community».
How did it turn out?
«There was intense scientific work: some experimental groups repeated the experiment of Kim and Chan, obtaining different results and thus deepening the mystery around this state. Subsequently Kim and Chan themselves repeated their experiment, this time on the basis of the calculations and understanding of the system developed over the years, ending with a comment on their own work, published in 2012 in Physical Review Letters, entitled “Absence of supersolidity in solid helium”».
A tombstone …
«Let’s say that in the condensed-matter community the question became: game over, is it all finished, or is this state still possible? Part of the research – above all theoretical, but also experimental – continues to develop, considering these solids the mother platform for the observation of supersolidity. Other theorists, however, shifted attention to another type of platform: that of “cold atoms”. Because in cold atoms, in fact, various ingredients necessary for supersolidity are, let’s say, innate».
What is special about cold atoms?
«In an ultracold gas, such as a Bose-Einstein condensate, with long-range interactions – so the atoms “see” each other from a distance – that are not spherical but anisotropic – such that atoms aligned along one direction attract each other, while along another they repel – the system tends to be somewhat unstable. The system wants to crystallize: it is a gas, it never becomes a true crystalline structure, but it wants to organize itself spontaneously into a structure like a matter wave – a regular, crystalline wave. The problem is that this “want” of the system is too strong: left to this desire alone – because we know that in physics all systems seek a lower-energy state – the drive to crystallize would make the gas collapse. What was lacking in our understanding – and this is where the experiments really mattered – was a mechanism that could stabilize the system, that is to say: you can crystallize up to a certain point, but then you have to stop. In this way the system undergoes a phase transition into a new state, which is precisely this minimum-energy state that is supersolid: the matter wave forms and remains».
And what is this mechanism capable of stabilizing the system?
«It was discovered, first by a group in Stuttgart and then quantitatively confirmed by our group, that in dipolar atoms the system is stabilized by its own quantum fluctuations. The system does create a crystal, but it cannot completely “explode” with infinitely high peaks, because there is this stabilization mechanism that tells it: you cannot create density peaks that high. It is a kind of “quantum pressure”. All these ingredients together then allowed the observation that the system, by itself, spontaneously enters a new ground state: a completely coherent state, in which each atom is identical to the others – hence indistinguishable – and all are both localized and delocalized. And this is precisely the supersolid state».
A state that, as we said, you achieved with an ultracold gas of dysprosium atoms. But how did you get there – why dysprosium? Do you take the table of elements and choose a few at random – elements, by the way, that to us are absolutely exotic – until you find the right one for the effect you are looking for? Or do you proceed unerringly, already knowing more or less which region of the periodic table to go fishing in to achieve such a state?
«The first condensate was made with simple atoms, the alkalis – that is, atoms from the first column of the periodic table, which have only one valence electron. The search for cold atoms had stopped more or less there and in the next column, that of the alkaline earths. Then, as we began to understand these simpler systems, we took the courage to go further and ask ourselves: I want a certain atomic property – which atom can give it to me?»
And you made the leap from helium to dysprosium …
«Yes, but through a series of steps. We tried ytterbium, but it behaves much like the alkaline earths. Then chromium, which already showed dipolar effects, but we needed an even higher magnetic moment. So we went looking for the most magnetic atoms of the periodic table, and we arrived at dysprosium».
Supersolids aside, among all the elements of the periodic table, which one gives you the most satisfaction? The one that, from using it so often in the laboratory, you have grown most fond of?
«I am very attached to erbium, because we were the first to condense it, and also to find a “recipe” that is now used in many laboratories. But it must be said that erbium and its lanthanide brothers are almost identical: you learn one and you have learned them all. So yes, I feel a certain sense of “motherhood” – so to speak – toward erbium, the firstborn, but also toward the other children».
Featured image: a supersolid in 2D created from an ultracold gas. The colors represent the density, from black (low) to yellow (high). Source: Matthew A. Norcia et al., Nature, 2021
Astronomers have identified a strange feature never before seen in our galaxy: a cluster of young stars and gas clouds protruding from the Sagittarius Arm like a splinter sticking out of a wooden plank. About three thousand light-years long, it is the first large structure identified with an orientation so drastically different from that of the arm. All the details are in A&A.
You may know the saying “the devil is in the details”. Its origin is very old and is thought to derive from the form le bon Dieu est dans le détail, generally attributed to Gustave Flaubert. The saying reminds us that the more you examine something – the more you look at the details – the more you appreciate its complexity and get closer to the truth. It also applies in its more negative, devilish sense: something may seem simple at first or on the surface, but when you look at it in detail, a complexity can emerge that requires far more time and effort to understand.
It seems that this is the case for the Milky Way, where scientists – studying its details with NASA’s Spitzer Space Telescope and ESA’s Gaia mission – have for the first time identified a strange feature: a “contingent” of young stars and gas clouds protruding from one of its spiral arms, like a splinter sticking out of a wooden plank. About 3000 light-years long, it is the first large structure identified with an orientation so drastically different from that of the arm.
Astronomers have a rough idea of the size and shape of the Milky Way’s arms, but much remains unknown, since it is not possible to see the entire structure from the Earth’s position inside it. It’s like trying to draw a map of Rome from inside the Colosseum: you cannot measure distances accurately enough to know whether two buildings are on the same block or a few streets apart, nor hope to see as far as the ring road, with so many buildings along the line of sight.
To learn more, the authors of the new study focused on a nearby portion of one of the galaxy’s arms, the Sagittarius Arm. Using the Spitzer Space Telescope – prior to its retirement in January 2020 – they searched for newborn stars, nestled in the gas and dust nebulae where they form. Spitzer detected infrared light that can penetrate those clouds, while visible light is blocked.
Young stars and nebulae are thought to be closely aligned with the shape of the arms in which they reside. To get a 3D view of the arm segment, the scientists used the latest data released by the Gaia mission to accurately measure the distances to the stars. The combined data revealed that the long, thin structure associated with Sagittarius’ arm is made up of young stars moving at almost the same speed and in the same direction through space.
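Gaia’s “accurate distances” are parallaxes, and to first order a star’s distance in parsecs is simply the reciprocal of its parallax in arcseconds. A minimal sketch of that conversion (the parallax value used below is illustrative, not from the study, and the simple inversion is only reliable when the parallax is well measured):

```python
def parallax_to_distance_pc(parallax_mas: float) -> float:
    """First-order distance estimate from a parallax in milliarcseconds.

    Gaia catalogs quote parallaxes in mas; 1000 / parallax_mas gives the
    distance in parsecs. Reliable only when the relative parallax error
    is small (naive inversion is biased for noisy parallaxes).
    """
    if parallax_mas <= 0:
        raise ValueError("parallax must be positive for this simple inverse")
    return 1000.0 / parallax_mas

# Illustrative value only: a parallax of ~0.8 mas corresponds to ~1250 pc,
# roughly the kiloparsec distance scale of the Sagittarius Arm region.
print(round(parallax_to_distance_pc(0.8)))  # 1250
```

With distances like these for thousands of young stars, the flat catalog of sky positions becomes the 3D map that revealed the structure’s orientation.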
“A key property of spiral arms is how tightly they wind around the galaxy,” says Michael Kuhn, Caltech astrophysicist and first author of the study. This property is measured by the arm’s pitch angle: the angle between the tangent to the spiral arm at a point and the tangent to the circle passing through the same point. A circle has a pitch angle of 0 degrees, and as the spiral becomes more open, the pitch angle increases. “Most models of the Milky Way suggest that the Sagittarius Arm forms a spiral with a pitch angle of about 12 degrees, but the structure we examined stands out at an angle of nearly 60 degrees.”
Similar structures – sometimes called spurs or feathers – have been found in the arms of other spiral galaxies. For decades, scientists have wondered if the spiral arms of our Milky Way also had them, and now one has been found.
The newly discovered feature contains four nebulae known for their breathtaking beauty: the Eagle Nebula (inside which are the famous Pillars of Creation ), the Omega Nebula , the Trifid Nebula and the Lagoon Nebula . In the 1950s, a team of astronomers made rough measurements of the distance of some of the stars in these nebulae and were able to deduce the existence of the Sagittarius arm. Their work provided some of the earliest evidence of our galaxy’s spiral structure.
“Distances are among the hardest things to measure in astronomy,” said co-author Alberto Krone-Martins, an astrophysicist and professor of computer science at the University of California, Irvine, and a member of the Gaia Data Processing and Analysis Consortium (DPAC). “It is only Gaia’s recent direct distance measurements that make the geometry of this new structure so obvious.”
In the new study, the researchers also drew on a catalog of more than 100,000 newborn stars discovered by Spitzer in a survey of the galaxy called the Galactic Legacy Infrared Mid-Plane Survey Extraordinaire (GLIMPSE). “When we put together the data from Gaia and Spitzer and finally see this detailed three-dimensional map, we can see that this region is more complex than previously thought,” reports Kuhn.
Astronomers have not yet fully understood what causes spiral arms to form in galaxies like ours. While we cannot see the complete structure of the Milky Way, the ability to measure the motion of individual stars is useful for understanding this phenomenon: the stars in the newly discovered structure probably formed around the same time, in the same area, and were shaped by the forces acting within the galaxy, including gravity and the shear due to the galaxy’s rotation.
“Ultimately, this reminds us that there are many uncertainties about the large-scale structure of the Milky Way, and we need to look at the details if we are to understand that bigger picture,” concludes one of the study’s co-authors, Robert Benjamin , an astrophysicist at the University of Wisconsin-Whitewater and principal investigator of the Glimpse survey. “This structure is a small piece of the Milky Way, but it could tell us something significant about the Galaxy as a whole.”
Featured image: a cluster of stars and star-forming clouds found protruding from the Sagittarius Arm of the Milky Way. The inset shows the size of the structure and its distance from the Sun. Each orange star indicates a star-forming region that can contain dozens to thousands of stars. Credits: NASA/JPL-Caltech