Two Planets Around A Red Dwarf (Planetary Science)

Red dwarfs are the coolest kind of star. As such, they potentially allow liquid water to exist on planets that are quite close to them. In the search for habitable worlds beyond the borders of our solar system, this is a big advantage: the distance between an exoplanet and its star is a crucial factor for its detection. The closer the two are, the higher the chance that astronomers can detect the planet from Earth.

The SAINT-EX Observatory is a fully robotic facility hosting a 1-metre telescope based in Mexico. ©Institute of Astronomy, UNAM / E. Cadena

“But these stars are rather small and emit little light compared to most other stars, such as our Sun,” explains Brice-Olivier Demory, lead author of the study and Professor of Astrophysics at the University of Bern. These factors make them challenging to observe in detail. Without the proper instruments, any planets that might orbit them could easily be overlooked, especially terrestrial planets that, like Earth, are comparatively small.

A dedicated telescope

One instrument with which it is possible to study red dwarfs and their planets closely is the Mexico-based SAINT-EX telescope, co-operated by the NCCR PlanetS. SAINT-EX is an acronym for Search And characterIsatioN of Transiting EXoplanets. The project has been named in honor of Antoine de Saint-Exupéry (Saint-Ex), the famous writer, poet and aviator.

The SAINT-EX Observatory is a fully robotic facility hosting a 1-metre telescope. It is equipped with instrumentation specifically suited to the high-precision detection of small planets orbiting cool stars. Now, this specialization pays off: earlier this year, the telescope detected two exoplanets orbiting the star TOI-1266, located around 120 light years from Earth. The research, recently published in the journal Astronomy & Astrophysics, provides a first impression of their characteristics.

A peculiar pair

Compared to the planets in our solar system, TOI-1266 b and c are much closer to their star – it takes them only 11 and 19 days respectively to orbit it. However, as their host star is much cooler than the Sun, their temperatures are not very extreme: the outer planet has approximately the temperature of Venus (although it is 7 times closer to its star than Venus is to the Sun). The two planets are of similar density, possibly corresponding to a composition of about half rocky and metallic material and half water. This makes them about half as rocky as Earth or Venus but also far rockier than Uranus or Neptune.
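The claim that a planet seven times closer than Venus can still be only Venus-warm follows from the inverse-square law of stellar flux. A minimal sketch of that scaling, assuming a typical M3-dwarf luminosity of about 2.5% of the Sun's (an illustrative value, not a measurement from the study):

```python
# Illustrative comparison of stellar irradiation (flux ~ L / d^2).
# The dwarf luminosity below is a typical value for an M3 dwarf,
# assumed for illustration, not a measured value for TOI-1266.
L_sun = 1.0               # solar luminosity (relative units)
d_venus = 0.72            # Venus's orbital distance, AU
L_m_dwarf = 0.025         # assumed M3-dwarf luminosity, in L_sun
d_planet_c = d_venus / 7  # "7 times closer than Venus", AU

flux_venus = L_sun / d_venus**2
flux_planet_c = L_m_dwarf / d_planet_c**2

print(f"Venus:      {flux_venus:.2f} (relative to Earth's insolation)")
print(f"TOI-1266 c: {flux_planet_c:.2f}")
```

With these assumed numbers the two fluxes come out within a factor of two of each other, which is why the outer planet can sit so close to its star yet stay only Venus-hot.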

In size, the planets clearly differ from each other. The inner planet, TOI-1266 b, measures a little under two-and-a-half times the Earth’s diameter, which makes it a so-called “sub-Neptune”. The outer planet, TOI-1266 c, is just over one-and-a-half times the size of our planet, placing it in the category of “super-Earths”.

This places the two planets at the edges of the so-called radius valley, as Brice-Olivier Demory explains: “Planets between about the radii of TOI-1266 b and c are quite rare, likely because of the effect of strong irradiation from the star, which can erode their atmospheres”. Yilen Gómez Maqueo Chew, SAINT-EX Project Coordinator and researcher at the National Autonomous University of Mexico, adds: “Being able to study two different types of planets in the same system is a great opportunity to better understand how these differently sized planets come to be”.

Good timing and help from the embassy

Having this opportunity, especially this year, is anything but a given. The scientists were fortunate to complete their observations shortly before the Covid-19-related lockdown in Mexico. Shortly afterwards, the observatory had to be closed due to the consequences of the pandemic, and it has remained closed to this day. The scientists hope to resume operations of SAINT-EX in the next few months and to target the next red dwarf and its potential planets. “Also, the Mexican Embassy in Bern was a great help in facilitating the discussions with the Mexican government and in providing continued support to the project”, says Demory.

SAINT-EX – Search and characterisation of exoplanets

SAINT-EX is an international collaboration which had its kick-off meeting at the National Astronomical Observatory in San Pedro Mártir, Mexico, in September 2016. The project’s principal investigator is Prof. Brice-Olivier Demory of the Center for Space and Habitability of the University of Bern in Switzerland and the National Center of Competence in Research PlanetS; the project’s coordinator and leader in Mexico is Dr. Yilen Gómez Maqueo Chew of the Instituto de Astronomía of the Universidad Nacional Autónoma de México (UNAM). Also part of the project are Prof. Willy Benz of the National Center of Competence in Research PlanetS, Prof. François Bouchy of the University of Geneva in Switzerland, Dr. Michaël Gillon of the University of Liège in Belgium, Prof. Kevin Heng of the University of Bern in Switzerland, Prof. Didier Queloz of the University of Geneva, Switzerland, and Cambridge in the UK, and Dr. Laurence Sabin, also of the Instituto de Astronomía of UNAM. SAINT-EX has been funded by the Swiss National Science Foundation and the Universities of Bern, Geneva, Liège and Cambridge, as well as UNAM. SAINT-EX also received support from the National Council for Science and Technology (CONACYT) through the National Laboratories call for proposals for the National Astronomical Observatory of San Pedro Mártir.

Bernese space exploration: With the world’s elite since the first moon landing

When the second man on the Moon, “Buzz” Aldrin, stepped out of the lunar module on July 21, 1969, his first task was to set up the Bernese Solar Wind Composition experiment (SWC), also known as the “solar wind sail”, by planting it in the lunar soil, even before the American flag. This experiment, which was planned, and its results analysed, by Prof. Dr. Johannes Geiss and his team from the Physics Institute of the University of Bern, was the first great highlight in the history of Bernese space exploration. Ever since, Bernese space exploration has been among the world’s elite. The numbers are impressive: instruments were flown into the upper atmosphere and ionosphere on rockets 25 times (1967-1993) and into the stratosphere on balloon flights 9 times (1991-2008), over 30 instruments have flown on space probes, and with CHEOPS the University of Bern shares responsibility with ESA for a whole mission. The successful work of the Department of Space Research and Planetary Sciences (WP) of the Physics Institute of the University of Bern was consolidated by the foundation of a university competence center, the Center for Space and Habitability (CSH). The Swiss National Science Foundation also awarded the University of Bern the National Center of Competence in Research (NCCR) PlanetS, which it manages together with the University of Geneva.

References: B.-O. Demory, F. J. Pozuelos, Y. Gómez Maqueo Chew, L. Sabin, R. Petrucci, U. Schroffenegger, S. L. Grimm, M. Sestovic, M. Gillon, J. McCormac, K. Barkaoui, W. Benz, A. Bieryla, F. Bouchy, A. Burdanov, K. A. Collins, J. de Wit, C. D. Dressing, L. J. Garcia, S. Giacalone, P. Guerra, J. Haldemann, K. Heng, E. Jehin, E. Jofré, S. R. Kane, J. Lillo-Box, V. Maigné, C. Mordasini, B. M. Morris, P. Niraula, D. Queloz, B. V. Rackham, A. B. Savel, A. Soubkiou, G. Srdoc, K. G. Stassun, A. H. M. J. Triaud, R. Zambelli, G. Ricker, D. W. Latham, S. Seager, J. N. Winn, J. M. Jenkins, T. Calvario-Velásquez, J. A. Franco Herrera, E. Colorado, E. O. Cadena Zepeda, L. Figueroa, A. M. Watson, E. E. Lugo-Ibarra, L. Carigi, G. Guisa, J. Herrera, G. Sierra Díaz, J. C. Suárez, D. Barrado, N. M. Batalha, Z. Benkhaldoun, A. Chontos, F. Dai, Z. Essack, M. Ghachoui, C. X. Huang, D. Huber, H. Isaacson, J. J. Lissauer, M. Morales-Calderón, P. Robertson, A. Roy, J. D. Twicken, A. Vanderburg and L. M. Weiss, “A super-Earth and a sub-Neptune orbiting the bright, quiet M3 dwarf TOI-1266”, A&A, 642 (2020) A49
DOI: https://doi.org/10.1051/0004-6361/202038616

Provided by University Of Bern

World’s Greatest Mass Extinction Triggered Switch To Warm-Bloodedness (Paleontology)

Mammals and birds today are warm-blooded, and this is often taken as the reason for their great success.

University of Bristol palaeontologist Professor Mike Benton argues in the journal Gondwana Research that the ancestors of both mammals and birds became warm-blooded at the same time, some 250 million years ago, when life was recovering from the greatest mass extinction of all time.

The origin of endothermy in synapsids, including the ancestors of mammals. The diagram shows the evolution of main groups through the Triassic, and the scale from blue to red is a measure of the degree of warm-bloodedness reconstructed based on different indicators of bone structure and anatomy. Credit: Mike Benton, University of Bristol. Animal images are by Nobu Tamura, Wikimedia.

The Permian-Triassic mass extinction killed as much as 95 per cent of life, and the very few survivors faced a turbulent world, repeatedly hit by global warming and ocean acidification crises. Two main groups of tetrapods survived, the synapsids and archosaurs, including ancestors of mammals and birds respectively.

Palaeontologists had identified indications of warm-bloodedness, or technically endothermy, in these Triassic survivors, including evidence for a diaphragm and possible whiskers in the synapsids.

More recently, similar evidence for an early origin of feathers in dinosaur and bird ancestors has come to light. In both synapsids and archosaurs of the Triassic, the bone structure shows characteristics of warm-bloodedness. That mammal ancestors had hair from the beginning of the Triassic has long been suspected, but the suggestion that archosaurs had feathers from 250 million years ago is new.

Posture shift at the end of the Permian, 252 million years ago. Before the crisis, most reptiles had sprawling posture; afterwards they walked upright. This may have been the first sign of a new pace of life in the Triassic. Credit: animal drawings by Jim Robins, University of Bristol.

But a strong hint of this sudden origin of warm-bloodedness in both synapsids and archosaurs, at exactly the time of the Permian-Triassic mass extinction, was found in 2009. Tai Kubo, then a student on the Masters in Palaeobiology degree at Bristol, and Professor Benton showed that all medium-sized and large tetrapods switched from sprawling to erect posture right at the Permian-Triassic boundary.

Their study was based on fossilised footprints. Looking at a sample of hundreds of fossil trackways, Kubo and Benton were surprised to see that the posture shift happened instantly, not strung out over tens of millions of years, as had been suggested. It also happened in all groups, not just the mammal ancestors or bird ancestors.

Professor Benton said: “Modern amphibians and reptiles are sprawlers, holding their limbs partly sideways.

“Birds and mammals have erect postures, with the limbs immediately below their bodies. This allows them to run faster, and especially further. There are great advantages in erect posture and warm-bloodedness, but the cost is that endotherms have to eat much more than cold-blooded animals just to fuel their inner temperature control.”

The evidence from posture change and from early origin of hair and feathers, all happening at the same time, suggested this was the beginning of a kind of ‘arms race’. In ecology, arms races occur when predators and prey have to compete with each other, and where there may be an escalation of adaptations. The lion evolves to run faster, but the wildebeest also evolves to run faster or twist and turn to escape.

Something like this happened in the Triassic, from 250 to 200 million years ago. Today, warm-blooded animals can live all over the Earth, even in cold areas, and they remain active at night. They also show intensive parental care, feeding their young and teaching them complex and smart behaviour. These adaptations gave birds and mammals the edge over amphibians and reptiles, and in the present cool world have allowed them to dominate more parts of the globe.

Professor Benton added: “The Triassic was a remarkable time in the history of life on Earth. You see birds and mammals everywhere on land today, whereas amphibians and reptiles are often quite hidden.

“This revolution in ecosystems was triggered by the independent origins of endothermy in birds and mammals, but until recently we didn’t realise that these two events might have been coordinated.

“That happened because only a tiny number of species survived the Permian-Triassic mass extinction – who survived depended on intense competition in a tough world. Because a few of the survivors were already endothermic in a primitive way, all the others had to become endothermic to survive in the new fast-paced world.”

References: Michael Benton, “The origin of endothermy in synapsids and archosaurs and arms races in the Triassic”, Gondwana Research, 2020. DOI: https://doi.org/10.1016/j.gr.2020.08.003 link: https://www.sciencedirect.com/science/article/pii/S1342937X20302252

Provided by University Of Bristol

Membranes For Capturing Carbon Dioxide From The Air (Chemistry)

Researchers at I2CNER, Kyushu University, suggest that advanced gas separation membranes have potential for CO2 extraction from ambient air.

Climate change caused by emissions of greenhouse gases into the atmosphere is one of the most important issues for our society. The acceleration of global warming results in catastrophic heatwaves, wildfires, storms and flooding. The anthropogenic nature of climate change necessitates the development of novel technological solutions to reverse the current CO2 trajectory.

Technological solutions for CO2 emissions into the atmosphere should include a variety of approaches, as there is no single “silver bullet”. In this work, researchers from I2CNER, Kyushu University and NanoMembrane Technologies Inc., Japan suggest using gas separation membranes as a tool for direct air capture. When combined with advanced technologies for CO2 conversion, the envisaged systems could be widely employed in a carbon-recycling sustainable society. ©Kyushu University

Direct capture of carbon dioxide (CO2) from the air (direct air capture, DAC) is one of a variety of negative emission technologies that are expected to keep global warming below 1.5 °C, as recommended by the Intergovernmental Panel on Climate Change (IPCC). Extensive deployment of DAC technologies is needed to mitigate and remove so-called legacy carbon, or historical emissions. Effective reduction of the CO2 content in the atmosphere can only be achieved by extracting amounts of CO2 comparable to current global emissions. Current DAC technologies are mainly sorbent-based systems, in which CO2 is trapped in solution or on the surface of porous solids coated with compounds with high CO2 affinity. These processes are currently rather expensive, although the cost is expected to fall as the technologies are developed and deployed at scale.

The ability of membranes to separate carbon dioxide is well documented, and their usefulness is established for industrial processes. Unfortunately, their efficiency has so far been less than satisfactory for practical DAC operation.

In a recent paper, researchers from the International Institute for Carbon-Neutral Energy Research (I2CNER), Kyushu University and NanoMembrane Technologies Inc. in Japan discussed the potential of membrane-based DAC (m-DAC), taking advantage of the state-of-the-art performance of organic polymer membranes. Based on process simulation, they showed that the targeted performance for m-DAC is achievable with competitive energy expenses. They show that a multi-stage separation process can preconcentrate atmospheric CO2 (0.04%) to 40%. This possibility, combined with advanced CO2 conversion, may offer a realistic route to a circular CO2 economy. Based on this finding, the Kyushu University team has initiated a Government-supported Moonshot Research and Development Program (Program Manager: Dr. Shigenori Fujikawa). In this program, direct CO2 capture from the atmosphere by membranes and the subsequent conversion to valuable materials is the major development target.
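To get a feel for how a multi-stage membrane cascade can lift CO2 from 0.04% to tens of percent, one can use the textbook expression for an idealised binary (CO2/N2) membrane stage at low stage cut, where the permeate CO2 fraction is y = αx / (1 + (α − 1)x). The selectivity α = 70 below is an assumed, optimistic value for a polymer membrane, not a figure from the paper:

```python
# Sketch of ideal-stage enrichment through a CO2-selective membrane.
# For a binary mixture at low stage cut, the permeate CO2 fraction is
#   y = a*x / (1 + (a - 1)*x),  where a is the CO2/N2 selectivity.
# a = 70 is an assumed selectivity, chosen for illustration only.
def permeate_fraction(x, a=70.0):
    return a * x / (1.0 + (a - 1.0) * x)

x = 0.0004   # ~0.04% CO2 in ambient air
stages = 0
while x < 0.40:          # target from the paper: 40% CO2
    x = permeate_fraction(x)
    stages += 1
print(f"{stages} ideal stages -> {x:.1%} CO2")
```

Under these idealised assumptions a couple of stages already exceed the 40% target; a real cascade needs more stages and energy because each stage also has to move permeate against a pressure difference.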

References: “A New Strategy of Membrane-Based Direct Air Capture” Shigenori Fujikawa, Roman Selyanchyn, and Toyoki Kunitake, Polymer Journal (2020), https://doi.org/10.1038/s41428-020-00429-z

Provided by Kyushu University

A New Strategy For siRNA Stabilization By An Artificial Cationic Oligosaccharide (Medicine)

RNA interference is a gene regulatory mechanism in which the expression of specific genes is downregulated by endogenous microRNAs or by small interfering RNAs (siRNAs). Although siRNAs have broad potential for gene-silencing therapy, their instability is one of the main difficulties in developing siRNA-based agents. To improve their stability, most siRNA-based drugs developed so far are chemically modified in their nucleotides or phosphodiester linkages. However, chemical modification is not a perfect strategy for siRNA stabilisation, because extensive modification may interrupt the gene-silencing activity of siRNAs and also induce cytotoxicity.

ODAGal4 strongly stabilises duplex structures of siRNAs, particularly nucleotides with phosphorothioate linkages, and prevents degradation of the RNAs. ©TMIMS

siRNAs consist of oligonucleotide duplexes of 21-23 bases and form an A-form helix structure in which the major grooves have highly negative potential; therefore, cationic molecules that can bind to the major grooves are expected to stabilise the RNA duplexes and protect them against cleavage in body fluids. Based on this idea, organic chemists at Tokyo University of Science recently synthesised an artificial cationic oligosaccharide, oligodiaminogalactose 4mer (ODAGal4), that preferentially binds to the major grooves of RNA duplexes for siRNA stabilisation (Iwata, RI. et al. Org. Biomol. Chem. https://doi.org/10.1039/c6ob02690g (2017)).

Now, Atsushi Irie and his colleagues at the Tokyo Metropolitan Institute of Medical Science, in collaboration with the team at Tokyo University of Science, have developed a new strategy for siRNA stabilisation using ODAGal4 combined with phosphorothioate modification of RNAs. In the study, published online on 9th September in Scientific Reports, the researchers show that ODAGal4 strongly enhances the biological and thermal stability of siRNAs in vitro.

The researchers show that ODAGal4 has several unique characteristics for stabilising siRNAs. Firstly, ODAGal4 can improve the stability of various siRNAs independently of nucleotide sequence, because it binds to the phosphodiester linkages of RNA duplexes, not to the nucleobases. Importantly, ODAGal4 does not compromise the gene-silencing activity of any siRNA. This property of ODAGal4 is in sharp contrast to known chemical modifications, which may interrupt the gene-silencing activity of siRNAs. ODAGal4 therefore has great potential for siRNA stabilisation and is widely applicable to various siRNA-based drugs.

Secondly, the effect of ODAGal4 on siRNA stabilisation is further enhanced by chemical modification of the siRNAs; in particular, ODAGal4 prominently improves the stability of RNAs with phosphorothioate linkages. This improvement in siRNA stability is superior to that observed for other chemical modifications (e.g., 2′-O-methyl, locked nucleic acid and 2′-deoxy-2′-fluoro nucleotides), suggesting that ODAGal4 combined with phosphorothioate modification is highly effective for stabilising siRNAs.

Lastly, another striking property of ODAGal4 is its binding specificity for RNA duplexes; ODAGal4 binds to the A-form RNA helix but not to the B-form DNA helix or to single-stranded RNA/DNA. Although various gene delivery systems consisting of cationic polymers have been developed to stabilise nucleotides, the molecular structures of these polymers are not designed to bind specifically to nucleotides. Their binding to nucleotides relies on ionic interactions, and the resulting polycation complexes are prone to induce cytotoxicity through nonspecific binding of the polymers to other biomolecules. In marked contrast, ODAGal4 avoids causing cytotoxicity because of its restricted, high-affinity binding to RNA duplexes.

“Our goal is the application of ODAGal4 to siRNA-based agents. Although in vivo experiments are still needed to confirm and expand our findings, we emphasise that ODAGal4 has a great advantage in improving siRNA stability and has the potential to reduce the total dose and frequency of administration of siRNA-based drugs in future applications,” concludes Atsushi Irie.

References: Irie, A., Sato, K., Hara, R.I. et al. An artificial cationic oligosaccharide combined with phosphorothioate linkages strongly improves siRNA stability. Sci Rep 10, 14845 (2020). https://doi.org/10.1038/s41598-020-71896-w

Provided by Tokyo Metropolitan Institute of Medical Science

Novel Mechanical Mechanism Of Metastatic Cancer Cells In Substrates Of Different Stiffness Revealed (Medicine)

During metastasis, cancer cells actively interact with microenvironments of new tissues. How metastatic cancer cells respond to new environments in the secondary tissues is a crucial question in cancer research but still remains elusive. Recently, researchers from the Hong Kong University of Science and Technology (HKUST), along with their international collaborators, discovered a novel mechanical mechanism of metastatic cancer cells in substrates of different stiffness, which could contribute to developing diagnostic tools for metastatic cancer cells and cancer therapeutics.

Experimental schematics of viscoelastic measurements using magnetic tweezers. ©HKUST

This study was published in the Journal of Physical Chemistry Letters on Sept 18, 2020.

In the study, the team of researchers, led by Prof. Hyokeun Park, assistant professor in the Department of Physics and Division of Life Science, HKUST, mimicked the mechanical stiffness of diverse tissues, from soft brain to bone, using polyacrylamide (PAA) substrates, and measured the mechanical responses of single metastatic breast cancer cells (MDA-MB-231 cells) to different stiffnesses using advanced imaging techniques and state-of-the-art magnetic tweezers that Prof. Park’s group built at HKUST.

Using single-molecule tension sensors, they found that metastatic breast cancer cells change the tension in their focal adhesions in response to substrate stiffness to adapt to new environments, whereas normal breast cells (MCF-10A cells) maintain similar tension regardless of stiffness. They also measured the viscoelasticity of single metastatic breast cancer cells using magnetic tweezers and found that metastatic cancer cells become more elastic on stiffer substrates, while the viscoelasticity of normal cells remains similar.

These results show that metastatic breast cancer cells have stronger capacity to adapt to the mechanical environments of diverse tissues.

“How metastatic breast cancer cells migrate and proliferate in secondary tissues of different stiffness, from soft to hard tissue interfaces like brain to bone, is a crucial question in cancer research,” said Prof. Park. “Our work addressed how metastatic breast cancer cells proliferate on substrates of varying stiffness from 1 kPa (similar to brain) to 50 GPa (similar to bone), and we discovered that metastatic cancer cells change their viscoelasticity depending on the physical environment, to adapt to and survive in their new surroundings. This is one of the big achievements in cancer physics and mechanobiology. The findings will contribute to developing diagnostic tools for metastatic cancer cells and, eventually, treatment of cancer.”

This work was done in collaboration with Prof. Ching-Hwa Kiang at Rice University, Prof. Jun Chu at the Shenzhen Institutes of Advanced Technology and Prof. Ophelia Tsui in the Department of Physics at HKUST.

The team is planning to develop a cancer diagnostic kit that uses this mechanism to measure the tension of living, potentially cancerous cells at different stiffnesses, which would be much simpler and more user-friendly than existing diagnostic tools for metastatic cancer cells. The mechanism could also be used to develop a drug-screening test for metastatic cancer, to see how the cells’ focal adhesions and viscoelasticity respond to different drugs and to find the most effective drug for them.

References: Fang Tian, Tsung-Cheng Lin, Liang Wang, Sidong Chen, Xingxiang Chen, Pak Man Yiu, Ophelia K. C. Tsui, Jun Chu, Ching-Hwa Kiang, and Hyokeun Park, “Mechanical Responses of Breast Cancer Cells to Substrates of Varying Stiffness Revealed by Single-Cell Measurements”, J. Phys. Chem. Lett. 2020, 11, 18, 7643–7649, 2020.
https://doi.org/10.1021/acs.jpclett.0c02065

Provided by Hong Kong University Of Science And Technology

Researchers Discover A Uniquely Quantum Effect In Erasing Information (Quantum)

Researchers from Trinity College Dublin have discovered a uniquely quantum effect in erasing information that may have significant implications for the design of quantum computing chips. Their surprising discovery brings back to life the paradoxical “Maxwell’s demon”, which has tormented physicists for over 150 years.

A bit of information can be encoded in the position of a particle (left or right). A demon can erase a classical bit (blue) by raising one side until the particle is definitely on the right. A quantum particle (red) can also tunnel under the barrier, which generates more heat. © Professor Goold, Trinity College Dublin

The thermodynamics of computation was brought to the fore in 1961 when Rolf Landauer, then at IBM, discovered a relationship between the dissipation of heat and logically irreversible operations. Landauer is known for the mantra “Information is Physical”, which reminds us that information is not abstract and is encoded on physical hardware.

The “bit” is the currency of information (it can be either 0 or 1) and Landauer discovered that when a bit is erased there is a minimum amount of heat released. This is known as Landauer’s bound and is the definitive link between information theory and thermodynamics.
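Landauer's bound is easy to evaluate: the minimum heat released when one bit is erased is k_B·T·ln 2, where k_B is the Boltzmann constant and T the temperature. A quick check at room temperature (300 K is an illustrative choice) shows just how tiny, yet non-zero, this limit is:

```python
# Landauer's bound: erasing one bit at temperature T dissipates at
# least k_B * T * ln(2) of heat, however efficient the hardware.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
T = 300.0            # room temperature, K (illustrative)

E_min = k_B * T * math.log(2)
print(f"Minimum heat per erased bit at {T:.0f} K: {E_min:.2e} J")
print(f"  = {E_min / 1.602176634e-19 * 1000:.1f} meV")
```

The result is on the order of 10^-21 joules per bit, which is why, as noted below, real computers operate far above this bound and only miniaturised (and especially quantum) hardware brings it within reach.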

Professor John Goold’s QuSys group at Trinity is analysing this topic with quantum computing in mind, where a quantum bit (a qubit, which can be 0 and 1 at the same time) is erased.

In just-published work in the journal Physical Review Letters, the group discovered that the quantum nature of the information to be erased can lead to large deviations in the heat dissipation that are not present in conventional bit erasure.

Thermodynamics and Maxwell’s demon

One hundred years before Landauer’s discovery, scientists such as the Viennese physicist Ludwig Boltzmann and the Scottish physicist James Clerk Maxwell were formulating the kinetic theory of gases, reviving an old idea of the ancient Greeks by thinking of matter as made of atoms and deriving macroscopic thermodynamics from microscopic dynamics.

Professor Goold says:

“Statistical mechanics tells us that things like pressure and temperature, and even the laws of thermodynamics themselves, can be understood by the average behavior of the atomic constituents of matter. The second law of thermodynamics concerns something called entropy which, in a nutshell, is a measure of the disorder in a process. The second law tells us that in the absence of external intervention, all processes in the universe tend, on average, to increase their entropy and reach a state known as thermal equilibrium.

“It tells us that, when mixed, two gases at different temperatures will reach a new state of equilibrium at the average temperature of the two. It is the ultimate law in the sense that every dynamical system is subject to it. There is no escape: all things will reach equilibrium, even you!”

However, the founding fathers of statistical mechanics were trying to pick holes in the second law right from the beginning of the kinetic theory. Consider again the example of a gas in equilibrium: Maxwell imagined a hypothetical “neat-fingered” being with the ability to track and sort particles in a gas based on their speed.

Maxwell’s demon, as the being became known, could quickly open and shut a trap door in a box containing a gas, and let hot particles through to one side of the box but restrict cold ones to the other. This scenario seems to contradict the second law of thermodynamics as the overall entropy appears to decrease and perhaps physics’ most famous paradox was born.

But what about Landauer’s discovery about the heat-dissipated cost of erasing information? Well, it took another 20 years until that was fully appreciated, the paradox solved, and Maxwell’s demon finally exorcised.

Landauer’s work inspired Charlie Bennett – also at IBM – to investigate the idea of reversible computing. In 1982 Bennett argued that the demon must have a memory, and that it is not the measurement but the erasure of the information in the demon’s memory which is the act that restores the second law in the paradox. And, as a result, computation thermodynamics was born.

New findings

Now, 40 years on, this is where the new work led by Professor Goold’s group comes to the fore, with the spotlight on quantum computation thermodynamics.

In the recent paper, published with collaborator Harry Miller at the University of Manchester and two postdoctoral fellows in the QuSys Group at Trinity, Mark Mitchison and Giacomo Guarnieri, the team studied very carefully an experimentally realistic erasure process that allows for quantum superposition (the qubit can be in state 0 and 1 at same time).

Professor Goold explains:

“In reality, computers function well away from Landauer’s bound for heat dissipation because they are not perfect systems. However, it is still important to think about the bound because as the miniaturisation of computing components continues, that bound becomes ever closer, and it is becoming more relevant for quantum computing machines. What is amazing is that with technology these days you can really study erasure approaching that limit.

“We asked: ‘what difference does this distinctly quantum feature make for the erasure protocol?’ And the answer was something we did not expect. We found that even in an ideal erasure protocol – due to quantum superposition – you get very rare events which dissipate heat far greater than the Landauer limit.

“In the paper we prove mathematically that these events exist and are a uniquely quantum feature. This is a highly unusual finding that could be really important for heat management on future quantum chips – although there is much more work to be done, in particular in analysing faster operations and the thermodynamics of other gate implementations.

“Even in 2020, Maxwell’s demon continues to pose fundamental questions about the laws of nature.”

References: Harry J. D. Miller, Giacomo Guarnieri, Mark T. Mitchison, and John Goold, “Quantum Fluctuations Hinder Finite-Time Information Erasure near the Landauer Limit”, Phys. Rev. Lett. 125, 160602 – Published 15 October 2020. https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.125.160602

Provided by Trinity College Dublin

New Feature Found In Energy Spectrum Of Universe’s Most Powerful Particles (Astronomy)

Particles smaller than an atom hurtle through the universe nearly at the speed of light, blasted into space from something, somewhere, in the cosmos.

University of Delaware researchers are part of a collaboration studying cosmic rays. In addition to Cherenkov detector tanks filled with water, the Pierre Auger Observatory in Argentina has a second kind of cosmic-ray catcher — fluorescence detectors. The charged particles in a cosmic-ray air shower interact with atmospheric nitrogen, causing it to emit ultraviolet light through a process called fluorescence. This light is invisible to the human eye — but not to this optical detector. Credit: University of Delaware

A scientific collaboration of the Pierre Auger Observatory, including researchers from the University of Delaware, has measured the most powerful of these particles—ultra-high-energy cosmic rays—with unprecedented precision. In doing so, they have found a “kink” in the energy spectrum that is shining more light on the possible origins of these subatomic space travelers.

The team’s findings are based on the analysis of 215,030 cosmic ray events with energies above 2.5 quintillion electron volts (eV), recorded over the past decade by the Pierre Auger Observatory in Argentina. It is the largest observatory in the world for studying cosmic rays.

The new spectral feature, a kink in the cosmic-ray energy spectrum at about 13 quintillion electron volts, represents more than points plotted on a graph. It brings humanity a step closer to solving the mysteries of the most energetic particles in nature, according to Frank Schroeder, assistant professor at the Bartol Research Institute in UD’s Department of Physics and Astronomy, who was involved in the study with the support of the University of Delaware Research Foundation. The research is published in Physical Review Letters and Physical Review D.
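The spectrum these features sit on is a steeply falling power law, with the flux scaling roughly as J(E) ∝ E^−γ; a feature like the kink is a change in the index γ. A minimal sketch of that idea — the break energy comes from the article, but the γ values are illustrative placeholders, not the collaboration's fitted numbers:

```python
# The cosmic-ray flux falls as a power law, J(E) ~ E**-gamma, and each
# spectral feature marks a change in the index gamma.  The kink energy
# matches the article (~1.3e19 eV); the gamma values are illustrative.
KINK_ENERGY_EV = 1.3e19
GAMMA_BELOW = 2.5   # between the "ankle" and the kink
GAMMA_ABOVE = 3.0   # above the kink the spectrum steepens

def spectral_index(energy_eV: float) -> float:
    """Local power-law index of the flux at the given energy."""
    return GAMMA_ABOVE if energy_eV >= KINK_ENERGY_EV else GAMMA_BELOW

def flux_ratio(e1: float, e2: float) -> float:
    """How much rarer cosmic rays at e2 are than at e1 (both energies
    assumed to lie within the same power-law segment)."""
    return (e2 / e1) ** -spectral_index(e1)

# Doubling the energy above the kink cuts the flux by 2**3 = 8x;
# below the kink the same doubling costs only 2**2.5, about 5.7x.
```

This is why higher-energy events are so scarce, and why a decade of data was needed to pin down a feature this subtle.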

In this pre-pandemic photo, UD Professor Frank Schroeder works with colleagues to install a radio antenna on one of the cosmic ray detector stations of the Pierre Auger Observatory, located near Malargüe, Argentina. Credit: University of Delaware

“Since cosmic rays were discovered 100 years ago, the longstanding question has been, what accelerates these particles?” Schroeder said. “The Pierre Auger Collaboration’s measurements provide important hints about what we can exclude as the source. From previous work, we know the accelerator is not in our galaxy. Through this latest analysis, we can further corroborate our earlier indications that ultra-high-energy cosmic rays are not just protons of hydrogen, but also a mix of nuclei from heavier elements, and this composition changes with energy.”

Between the “ankle” and the “toe”

Schroeder and UD postdoctoral researcher Alan Coleman, who contributed to the data analysis, have been members of the Pierre Auger Collaboration for several years. UD officially joined the collaboration as an institutional member in 2018. This team of more than 400 scientists from 17 countries operates the observatory, which occupies a 1,200-square-mile area, about the size of Rhode Island.

An array of cosmic ray detector stations of the Pierre Auger Observatory near Malargüe, Argentina. The University of Delaware is a member of the international collaboration that operates the observatory, which includes more than 400 scientists from 17 countries. Credit: University of Delaware

The observatory has more than 1,600 detectors called water-Cherenkov stations spread across the high plains of the Pampa Amarilla (Yellow Prairie), overlooked by 27 fluorescence telescopes. Collectively, these instruments measure the energy that an ultrahigh-energy cosmic ray particle releases in the atmosphere and provide an indirect evaluation of its mass. All of these data—energy, mass and the direction from which these extraordinary particles arrived—yield important clues about their origins.

Previously, scientists thought these ultra-high-energy cosmic ray particles were mostly protons of hydrogen, but this latest analysis confirms that they are a mix of nuclei—some heavier than helium or oxygen, such as silicon and iron.

Plotted on the curving graph representing the cosmic-ray energy spectrum, you can see the kink—a spot where the spectrum’s slope abruptly changes—between the area referred to by scientists as “the ankle” and the end of the graph, called “the toe.”

“We don’t have a specific name for it,” said Coleman, who was on the 20-person team that wrote the computer code and did the number crunching required for the extensive data analysis. “I guess we are running out of parts of the anatomy to call it,” he said, joking.

Directly involved in the finding, Coleman improved the reconstruction of the particle cascade that cosmic rays create when they strike the atmosphere, which is used to estimate their energy. He also performed detailed studies to ensure that this new inflection point was real and not an artifact of the detector. The data group’s work took more than two years.

“Obviously, it’s pretty slight,” Coleman said of the spectral kink. “But every time you see a bump like this, that signals the physics is changing and that’s very exciting.”

It’s very hard to determine the mass of incoming cosmic rays, Coleman said. But the collaboration’s measurement is so robust and precise that a number of other theoretical models for where ultra-high-energy cosmic rays are coming from can now be eliminated, while other pathways can be pursued with more vigor.

Scientists speculate that active galactic nuclei may be a source of ultra-high-energy cosmic rays. Active galactic nuclei are supermassive black holes in the center of galaxies, which feature gigantic jets of matter that escape falling into the black hole. Centaurus A, shown here, is an example of this galaxy class in our galactic neighborhood less than 20 million light-years from Earth. Credit: University of Delaware

The flux of cosmic rays is dependent on their energy. The higher the energy, the rarer the cosmic rays. However, the larger figure shows this relationship is not smooth. Several features indicate something is happening at different energies, referred to informally by scientists as the “knee,” the “ankle” and the “toe,” along with the new kink measured by the Pierre Auger Observatory Collaboration. The inset shows this new measurement in detail. Each feature can be interpreted as a change in the composition of cosmic rays at the respective energies. Credit: University of Delaware

Active galactic nuclei (AGN) and starburst galaxies are now in the running as potential sources. While their typical distance is some 100 million light years away, a few candidates are within 20 million light years.

“If we learned what the sources were, we could look into new details about what is going on,” Coleman said. “What’s happening that allows these incredibly high energies? These particles may be coming from something we don’t even know.”

Ongoing research by the UD team focuses on further increasing the measurement accuracy of ultra-high-energy cosmic rays and extending the precise measurement of the cosmic ray spectrum down to lower energies. That would create a better overlap with other experiments, Schroeder said, such as the cosmic ray measurements of IceCube at the South Pole—another unique astroparticle observatory with major involvement of the University of Delaware.

References: (1) A. Aab et al. Features of the Energy Spectrum of Cosmic Rays above 2.5×10¹⁸ eV Using the Pierre Auger Observatory, Physical Review Letters (2020). DOI: 10.1103/PhysRevLett.125.121106 (2) A. Aab et al. Measurement of the cosmic-ray energy spectrum above 2.5×10¹⁸ eV using the Pierre Auger Observatory, Physical Review D (2020). DOI: 10.1103/PhysRevD.102.062005

Provided by University of Delaware

Ultrafast Camera Films 3-D Movies At 100 Billion Frames Per Second (Science And Technology)

In his quest to bring ever-faster cameras to the world, Caltech’s Lihong Wang has developed technology that can reach blistering speeds of 70 trillion frames per second, fast enough to see light travel. Just like the camera in your cell phone, though, it can only produce flat images.

A three-dimensional video showing a pulse of laser light passing through a laser-scattering medium and bouncing off reflective surfaces. ©Caltech

Now, Wang’s lab has gone a step further to create a camera that not only records video at incredibly fast speeds but does so in three dimensions. Wang, Bren Professor of Medical Engineering and Electrical Engineering in the Andrew and Peggy Cherng Department of Medical Engineering, describes the device in a new paper in the journal Nature Communications.

The new camera, which uses the same underlying technology as Wang’s other compressed ultrafast photography (CUP) cameras, is capable of taking up to 100 billion frames per second. That is fast enough to take 10 billion pictures, more images than the entire human population of the world, in the time it takes you to blink your eye.
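The blink comparison is simple arithmetic, assuming a blink lasts roughly a tenth of a second (the frame rate is from the article; the blink duration is an assumption):

```python
FRAMES_PER_SECOND = 100e9   # the camera's top frame rate, from the article
BLINK_DURATION_S = 0.1      # a typical eye blink, ~100 ms (assumption)

frames_in_a_blink = FRAMES_PER_SECOND * BLINK_DURATION_S
print(f"{frames_in_a_blink:.0e}")  # 1e+10 -- ten billion frames
```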

Wang calls the new iteration “single-shot stereo-polarimetric compressed ultrafast photography,” or SP-CUP.

In CUP technology, all of the frames of a video are captured in one action without repeating the event. This makes a CUP camera extremely quick (a good cell-phone camera can take 60 frames per second). Wang added a third dimension to this ultrafast imagery by making the camera “see” more like humans do.

When a person looks at the world around them, they perceive that some objects are closer to them, and some objects are farther away. Such depth perception is possible because of our two eyes, each of which observes objects and their surroundings from a slightly different angle. The information from these two images is combined by the brain into a single 3-D image.

The SP-CUP camera works in essentially the same way, Wang says.

“The camera is stereo now,” he says. “We have one lens, but it functions as two halves that provide two views with an offset. Two channels mimic our eyes.”

Just as our brain does with the signals it receives from our eyes, the computer that runs the SP-CUP camera processes data from these two channels into one three-dimensional movie.
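The depth recovery rests on classic stereo geometry: the closer an object is, the larger the offset (disparity) between the two views. A toy sketch of the textbook pinhole-stereo relation follows — this illustrates the general principle only, not the paper's actual reconstruction algorithm, and the numbers are invented:

```python
def depth_from_disparity(focal_length_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Pinhole stereo relation: depth z = f * b / d, with the focal
    length f in pixels, baseline b in metres, and disparity d in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# A nearer object shifts more between the two views than a distant one:
near = depth_from_disparity(1000.0, 0.05, 50.0)   # 1.0 m away
far = depth_from_disparity(1000.0, 0.05, 5.0)     # 10.0 m away
```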

SP-CUP also features another innovation that no human possesses: the ability to see the polarization of light waves.

The polarization of light refers to the direction in which light waves vibrate as they travel. Consider a guitar string. If the string is pulled upwards (say, by a finger) and then released, the string will vibrate vertically. If the finger plucks it sideways, the string will vibrate horizontally. Ordinary light has waves that vibrate in all directions. Polarized light, however, has been altered so that its waves all vibrate in the same direction. This can occur through natural means, such as when light reflects off a surface, or as a result of artificial manipulation, as happens with polarizing filters.
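For an ideal polarizing filter, the transmitted intensity follows Malus's law, I = I₀·cos²θ, where θ is the angle between the filter's axis and the light's polarization direction. A small sketch (standard optics, not specific to SP-CUP):

```python
import math

def transmitted_intensity(i0: float, angle_deg: float) -> float:
    """Malus's law: intensity an ideal polarizer passes when its axis is
    at angle_deg to the incoming light's polarization direction."""
    theta = math.radians(angle_deg)
    return i0 * math.cos(theta) ** 2

# An aligned filter passes everything; crossed filters block (almost) all.
print(round(transmitted_intensity(1.0, 0.0), 6))   # 1.0
print(round(transmitted_intensity(1.0, 60.0), 6))  # 0.25
```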

Though our eyes cannot detect the polarization of light directly, the phenomenon has been exploited in a range of applications: from LCD screens to polarized sunglasses and camera lenses in optics to devices that detect hidden stress in materials and the three-dimensional configurations of molecules.

Wang says that the SP-CUP’s combination of high-speed three-dimensional imagery and the use of polarization information makes it a powerful tool that may be applicable to a wide variety of scientific problems. In particular, he hopes that it will help researchers better understand the physics of sonoluminescence, a phenomenon in which sound waves create tiny bubbles in water or other liquids. As the bubbles rapidly collapse after their formation, they emit a burst of light.

“Some people consider this one of the greatest mysteries in physics,” he says. “When a bubble collapses, its interior reaches such a high temperature that it generates light. The process that makes this happen is very mysterious because it all happens so fast, and we’re wondering if our camera can help us figure it out.”

Provided by Caltech

How Do Bacteria Adapt Their Machinery For Optimum Growth? (Biology)

The most important components for the functioning of a biological cell are its proteins. As a result, protein production is arguably the most important process for cell growth. The faster the bacterial growth rate, the faster protein synthesis needs to take place. Because protein synthesis is the most expensive cellular process in terms of cellular resource usage, it appears reasonable to assume that the cell increases its production capacity by hosting more copies of the complicated machinery in proportion to its growth rate. This would mean that in order for growth to double, twice as many copies of all components of the translation machinery would be needed.

It has been clear since the 1960s, however, that it’s not that simple. Instead, the composition of the ‘cocktail’ of individual components in the machinery, which itself is made from proteins and RNA, varies with the growth rate. A new, complex computer model developed in Düsseldorf shows what concentrations of the individual components are needed in order to produce different synthesis rates, explaining for the first time the reasons behind the observed variations across growth conditions.

Xiao-Pan Hu, a doctoral student in Prof. Dr. Martin Lercher’s Computational Cell Biology group at the HHU, developed the model. Hu used computer modelling to encode established biochemical principles at the cellular level. The resulting model can be used to calculate the speed with which a cell can produce its components and thus predicts cell growth based on a predefined composition of its machinery.

Theoretically, each production rate can be realised using a large number of different molecule concentrations. The question is: What does nature do? Which one of the many feasible compositions do real Escherichia coli (‘E. coli’) bacteria use and why? Hu and his colleagues have based their work on a simple assumption reflected everywhere in nature: an organism generally has an evolutionary advantage if it needs as few resources as possible for its development. Consequently, the team searched through the many possible compositions for the one that is ‘cheapest’ for the cell, i.e., the one that requires the smallest possible total mass of molecules.
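That "cheapest composition" principle can be sketched as a toy optimization: among all component mixes that reach a target synthesis rate, pick the one with the least total mass. All numbers below are invented for illustration, and the rate law is a deliberately simple stand-in for the actual Düsseldorf model:

```python
# Toy version of the "cheapest composition" idea: two machinery
# components, "ribosomes" (r, heavy) and an "elongation factor" (e, light),
# jointly set the protein synthesis rate.  Many (r, e) pairs reach the
# same rate; the cell is assumed to pick the one of least total mass.
K = 2.0                 # half-saturation constant for the factor (invented)
M_R, M_E = 10.0, 1.0    # relative masses of the two components (invented)

def synthesis_rate(r: float, e: float) -> float:
    """Each ribosome elongates at a speed set by factor availability."""
    return r * e / (e + K)

def cheapest_composition(target_rate: float):
    """Grid-search the minimal-mass (r, e) pair meeting the target rate."""
    best = None
    for ri in range(1, 201):
        for ei in range(1, 201):
            r, e = ri * 0.1, ei * 0.1
            if synthesis_rate(r, e) >= target_rate:
                mass = M_R * r + M_E * e
                if best is None or mass < best[0]:
                    best = (mass, r, e)
    return best

mass, r, e = cheapest_composition(5.0)
# The optimum is not "as few factors as possible": the cheap factor is
# overprovisioned (e = 10.0) so fewer heavy ribosomes (r = 6.0) suffice.
```

The point of the toy model is the same as in the paper: the economical composition is not uniform scaling of everything, but a specific mix that depends on the demanded rate.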

Comparisons with experimental data show that this assumption is correct and accurately predicts the concentrations measured in real E. coli bacteria colonies. This allowed the Düsseldorf-based research team not only to describe the data quantitatively but also to actually understand the reasons behind the data, namely that a principle found in many other areas of life also applies here.

In further analyses, the model also proved accurate for situations where the bacteria are exposed to antibiotics. In exceptional circumstances like these, the bacteria are particularly stressed and need a toolset that is arranged differently in order to grow.

The research group is currently investigating whether the findings for protein synthesis can also be applied to other cellular processes and other organisms. The models developed as part of this work should also help to design biotech procedures more efficiently. They make it possible to calculate the optimum concentrations of the individual components in the cell for the desired biological production.

References: Xiao-Pan Hu, Hugo Dourado, Peter Schubert, Martin J. Lercher, “The protein translation machinery is expressed for maximal efficiency in Escherichia coli”, Nature Communications (2020). DOI: 10.1038/s41467-020-18948-x

Provided by Heinrich-Heine University Duesseldorf