AI-based ‘OxyGAN’ is a robust, effective method to measure tissue oxygen levels (Engineering)

New AI-based algorithm processes tissue oxygenation data faster and more accurately than conventional techniques.

Tissue oxygenation is a measure of the oxygen level in biological tissue and is a useful clinical biomarker for tissue viability. Abnormal levels may indicate the presence of conditions such as sepsis, diabetes, viral infection, or pulmonary disease, and effective monitoring is important for surgical guidance as well as medical care.

Comparison of profile-corrected SFDI, SSOP, and OxyGAN; doi 10.1117/1.JBO.25.11.112907 ©SPIE

Several techniques exist for the measurement of tissue oxygenation, but they all have some limitations. For instance, pulse oximetry is robust and low-cost but cannot provide a localized measure of oxygenation. Near-infrared spectroscopy, on the other hand, is prone to noisy measurements due to pressure-sensitive contact probes. Spatial frequency domain imaging (SFDI) has emerged as a promising noncontact technique that maps tissue oxygen concentrations over a wide field of view. While simple to implement, SFDI has its own limitations: it requires a sequence of several images for its predictions to be accurate and is prone to errors when working with single snapshots.

In a new study published in the Journal of Biomedical Optics, researchers from Johns Hopkins University, Mason T. Chen and Nicholas J. Durr, have proposed OxyGAN, an end-to-end technique for accurately calculating tissue oxygenation from single snapshots. They built the approach on a conditional generative adversarial network (cGAN), a machine-learning framework that trains two neural networks — a generator and a discriminator — simultaneously on the same input data. The generator learns to produce realistic output images, while the discriminator learns to judge whether a given image pair is a correct reconstruction for a given input.
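
The adversarial setup can be sketched with toy numbers. This is not the authors' implementation — the discriminator scores below are made up, and real cGANs like OxyGAN use deep convolutional networks — but it shows how the two competing objectives are assembled from the same discriminator outputs:

```python
import numpy as np

def bce(pred, target):
    """Binary cross-entropy for probabilities in (0, 1)."""
    pred = np.clip(pred, 1e-7, 1 - 1e-7)
    return -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))

# Toy stand-ins: D(x, y) is the discriminator's probability that
# oxygenation map y is the real reconstruction for input snapshot x.
d_real = np.array([0.9, 0.8, 0.95])   # D(x, y_real): should be near 1
d_fake = np.array([0.2, 0.1, 0.3])    # D(x, G(x)):  should be near 0

# Discriminator objective: label real pairs 1, generated pairs 0.
d_loss = bce(d_real, np.ones_like(d_real)) + bce(d_fake, np.zeros_like(d_fake))

# Generator objective: fool the discriminator into scoring G(x) as real.
g_loss = bce(d_fake, np.ones_like(d_fake))

print(d_loss, g_loss)
```

With these scores the generator loss is large (the discriminator is not yet fooled), which is exactly the gradient signal that pushes the generator toward more realistic oxygenation maps.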

Using conventional SFDI, the researchers obtained oxygenation maps for the human esophagus (ex vivo), hands and feet (in vivo), and a pig colon (in vivo) under illumination with two different wavelengths (659 and 851 nm). They trained OxyGAN with the feet and esophagus samples and saved the hand and colon samples to later test its performance. Further, they compared its performance with a single-snapshot technique based on a physical model and a two-step hybrid technique that consisted of a deep-learning model to predict optical properties and a physical model to calculate tissue oxygenation.

OxyGAN produces tissue oxygenation maps directly from single-phase SFDI images with 659 nm and 851 nm illumination. ©SPIE

The researchers found that OxyGAN could measure oxygenation accurately, not only for the samples it had seen during training (human feet), but also for the samples it had not seen (human hand and pig colon), demonstrating the robustness of the model. It performed better than both the single-snapshot model and the hybrid model by 24.9% and 24.7%, respectively. Moreover, the scientists optimized OxyGAN to compute ~10 times faster than the hybrid model, enabling real-time mapping at a rate of 25 Hz. Frédéric Leblond, Associate Editor for the Journal of Biomedical Optics, comments, “Not only does this paper represent significant advances that can contribute to the practical clinical implementation of spatial frequency domain imaging, but it will also be part of a relatively small (although rapidly increasing in size) pool of robust published work using AI-type methods to deal with real biomedical optics data.”

While the algorithm of OxyGAN could be optimized further, this approach holds promise as a novel technique to measure tissue oxygenation.

Read the original research article by M. T. Chen and N. J. Durr, “Rapid tissue oxygenation mapping from snapshot structured-light images with adversarial deep learning,” J. Biomed. Opt. 25(11), 112907 (2020), doi: 10.1117/1.JBO.25.11.112907.


Provided by SPIE

Molecule That Regulates Muscle Adaptation to Exercise Is Discovered (Biology)

An article in Cell shows that the metabolite succinate is released by muscle cells during physical exercise and triggers a process of tissue remodeling that makes muscles stronger and enhances metabolic efficiency.

The onset of any physical exercise program causes muscle pain that can hinder movements as simple as getting up from a sofa. With time and a little persistence, the muscles become accustomed to the effort, developing more strength and endurance. Researchers affiliated with Harvard University in the United States and the University of São Paulo (USP) in Brazil describe the cellular mediator that makes this adaptation to exercise possible in the journal Cell.

The mediator is succinate, a metabolite hitherto known only for its participation in mitochondrial respiration. The authors of the article include Julio Cesar Batista Ferreira, a professor at USP’s Biomedical Sciences Institute (ICB) and a member of the Center for Research on Redox Processes in Biomedicine (Redoxome), one of the Research, Innovation and Dissemination Centers (RIDCs) funded by FAPESP (São Paulo Research Foundation), and postdoctoral fellow Luiz Henrique Bozi, who conducted the investigation while he was a research intern at Harvard with FAPESP’s support.

“Our results show that succinate leaves muscle cells during exercise and sends their neighbors signals that induce a process of muscle tissue remodeling,” Ferreira explained to Agência FAPESP. “The motor neurons create new ramifications, the muscle fibers become more uniform to gain strength on contracting, and blood sugar uptake increases in all cells to produce ATP [adenosine triphosphate, the cellular fuel]. There’s an increase in efficiency.”

The findings reported in the article are based on a large number of experiments with animals and human volunteers. The first entailed comparisons of more than 500 metabolites present in mouse leg muscles before and after the mice ran on a treadmill until they were exhausted.

“Besides muscle fibers, muscle tissue also contains immune, nerve, and endothelial cells. If each one were a house, the streets between houses would be the interstitium or interstitial space. We isolated and analyzed each of the houses as well as the streets to find out what changes in the neighborhood after exercise, and observed a significant increase in succinate only in muscle fibers and interstitial spaces,” Ferreira said.

A similar phenomenon was observed in healthy volunteers aged 25-35 during 60 minutes of intense exercise on a stationary bicycle. In this case, the researchers analyzed blood samples obtained via catheters in the femoral artery and vein and found that succinate levels rose substantially in venous blood exiting the muscle and fell rapidly during recovery.

At this point, the researchers were convinced that muscle cells released succinate in response to the stress caused by exercise, but they wanted to find out how, and above all why. Analysis of the volunteers’ blood offered a clue: another compound that increased with exercise, in both venous and arterial blood, was lactate (the ionized form of lactic acid), a sign that the cells had activated their emergency energy generation system.

“Succinate is a metabolite that is normally unable to cross the cell membrane and leave the cell. Inside the cell, it participates in the Krebs cycle, a series of chemical reactions that occur in the mitochondria and result in ATP formation,” Bozi explained. “But when energy demand increases sharply and the mitochondria can’t keep up, an anaerobic system is activated, causing excess lactate formation and cell acidification. We found that this change in pH causes a change in the chemical structure of succinate such that it’s able to get through the membrane and escape into the extracellular medium.”
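The pH-dependence Bozi describes follows standard acid-base chemistry. The sketch below uses the Henderson–Hasselbalch relationship with a textbook pKa for succinic acid's second carboxyl group (roughly 5.6) — values the study itself does not report — to show why acidification leaves a larger fraction of succinate in the protonated, monocarboxylate-like form that can exit the cell:

```python
def protonated_fraction(pH, pKa):
    """Henderson-Hasselbalch: fraction of an acid group still carrying
    its proton at a given pH, i.e. [HA] / ([HA] + [A-])."""
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

# Second carboxyl group of succinic acid; pKa2 ~ 5.6 is a textbook
# approximation, not a figure from the study.
PKA2 = 5.6

resting = protonated_fraction(7.4, PKA2)   # roughly neutral cellular pH
acidic = protonated_fraction(6.6, PKA2)    # lactate-driven acidification

print(f"protonated at pH 7.4: {resting:.3f}")
print(f"protonated at pH 6.6: {acidic:.3f}")
```

Dropping the pH by less than one unit raises the protonated fraction severalfold, consistent with the idea that exercise-induced acidification switches on succinate export.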

The transport protein that helps succinate exit the cell was identified by proteomics, an analysis of all the proteins in the membranes of mouse and human muscle cells. The results showed an increase in MCT1 in muscle tissue after exercise. MCT1 is a protein that specializes in transporting monocarboxylates out of the cell.

“The kind of molecule MCT1 transports is similar to succinate when it undergoes chemical modification in an acid medium. It ceases to be a dicarboxylate and becomes a monocarboxylate. We performed several in vitro experiments to confirm that this was the mechanism induced by exercise,” Bozi said.

One of the experiments consisted of submitting cultured muscle cells to hypoxia (oxygen deprivation) in order to activate the anaerobic energy production mechanism and produce lactate. This was seen to be sufficient to induce succinate release into the interstitial space.

Another experiment involved frog germ cells (oocytes) genetically modified to express human MCT1. The researchers found that the oocytes released succinate only when they were placed in an acid medium.

“By this stage, we knew acidity makes succinate undergo protonation, a chemical process that enables it to bind to MCT1 and pass through the membrane into the extracellular medium, but we had yet to discover the significance of this accumulation of succinate in the interstitial space during exercise,” Ferreira said.


The importance of communication between cells in the organism’s adaptation to any kind of stress is well-established in the scientific literature. Signals are exchanged by means of molecules released into the interstitial space to bind to proteins in the membranes of nearby cells. Activation of these membrane receptors triggers processes that lead to structural and functional tissue modifications.

“Our hypothesis was that succinate performed this role of regulation in muscles, by binding to a protein called SUCNR1 [succinate receptor 1] that’s highly expressed in the membranes of motor neurons, for example,” Bozi said.

To test the theory, they conducted experiments with mice that had been genetically modified not to express SUCNR1. The mice were allowed to run freely on a resistance wheel for three weeks, considered long enough for morphological and functional changes to occur in muscle tissue.

“The muscle fibers were expected to become more uniform and stronger, but they didn’t,” Ferreira said. “In addition, exercise didn’t promote motor neuron ramification, which is crucial to enhance contraction efficiency. We also observed that cellular glucose uptake didn’t increase and that insulin sensitivity was lower than in the wild-type mice that served as controls. In other words, exercise-induced remodeling didn’t happen without the succinate receptor.”

According to Ferreira, the study is the first to show the paracrine action of succinate in muscle tissue, i.e. its role in cell-to-cell signaling to alert nearby cells that they must modify their internal processes to adapt to a “new normal”.

“The next step is to find out whether this mechanism is disrupted in other diseases characterized by energy metabolism alterations and cell acidification, such as neurodegenerative diseases, in which astrocyte-neuron communication is critical to disease progression,” he said.


Provided by São Paulo Research Foundation (FAPESP)

About São Paulo Research Foundation (FAPESP)

The São Paulo Research Foundation (FAPESP) is a public institution with the mission of supporting scientific research in all fields of knowledge by awarding scholarships, fellowships and grants to investigators linked with higher education and research institutions in the State of São Paulo, Brazil. FAPESP is aware that the very best research can only be done by working with the best researchers internationally. Therefore, it has established partnerships with funding agencies, higher education, private companies, and research organizations in other countries known for the quality of their research and has been encouraging scientists funded by its grants to further develop their international collaboration.

Trees Can Help Slow Climate Change, But At A Cost (Earth Science)

This ‘green’ option is more expensive than originally thought, study shows.

Widespread forest management and protections against deforestation can help mitigate climate change – but will come with a steep cost if deployed as broadly as policymakers have discussed, new research suggests.

Planting or protecting trees around the world, especially in the tropics, could help deal with climate change — but at a cost. Photo by Lucian Dachma on Unsplash

The study, published today in the journal Nature Communications, found that planting and protecting trees, especially in the tropics, could reduce carbon dioxide emissions by as much as 6 gigatons a year from 2025 to 2055. That reduction, the researchers’ economic model showed, would cost as much as $393 billion a year over the same time period.

“There is a significant amount of carbon that can be sequestered through forests, but these costs aren’t zero,” said Brent Sohngen, co-author of the study and a professor of environmental economics at The Ohio State University.

A 6-gigaton reduction by 2055 would amount to about 10 percent of the total reduction needed to keep the climate from warming beyond 1.5 or 2 degrees Celsius. Policymakers and scientists agree that humans worldwide need to put strategies in place that will keep the climate from warming above that threshold.

Limiting warming by that amount would not mean climate change is solved. At a 2-degree increase, for example, extreme heatwaves would become widespread, droughts would plague many urban environments around the world, and heavy rainfalls – including those from tropical storms or hurricanes – would become more commonplace. Many insects and animals would also die.

Those levels, though, would allow for some adaptation, according to the Intergovernmental Panel on Climate Change, the United Nations body that is trying to help the world address climate change. The world is currently on pace to warm by about 4 degrees.

A number of recent studies have suggested that tree planting, management and conservation can solve a significant share of the world’s climate problem, but most studies have ignored the costs. This analysis updated the Global Timber Model, which considers potential climate change policies and the effects they would have on forest land use, management and trade. The model allowed the researchers to calculate the worldwide financial costs of sequestering carbon in the world’s forests.

The researchers found that protecting existing forests is cheaper than planting new ones, and that forest management, including changing how and when trees are harvested, provides low-cost options to store carbon in regions where timber management is an important economic activity.

One important reason why the costs in this study are higher than other estimates is because researchers captured carbon leakage, which occurs if carbon emissions shift to other sectors or geographic areas. For example, if planting trees in one location on agricultural land causes additional deforestation somewhere else, the net change in carbon will be smaller because of this leakage.
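The effect of leakage on cost is simple arithmetic. The numbers below are made up for illustration — they are not figures from the study — but they show why accounting for leakage pushes the cost per tonne of carbon actually mitigated above naive estimates:

```python
# Illustrative numbers only, not figures from the study: a forest
# program that grossly captures 1 GtCO2/yr, with 30% of the gain
# offset by deforestation shifting elsewhere (leakage).
gross_sequestration = 1.0    # GtCO2 per year captured by plantings
leakage_rate = 0.3           # share offset elsewhere
program_cost = 40.0          # hypothetical $ billions per year

net_sequestration = gross_sequestration * (1 - leakage_rate)

cost_naive = program_cost / gross_sequestration   # cost per unit, ignoring leakage
cost_true = program_cost / net_sequestration      # cost per unit actually mitigated

print(net_sequestration, cost_true)
```

Because only the net change matters for the climate, the effective cost rises in proportion to the leakage, which is one reason this study's estimates exceed earlier ones.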

The study illustrates the geography and timing of the costs of forestry actions to mitigate climate change. Specifically, the researchers found forests in Brazil, the Democratic Republic of Congo and Indonesia are likely to contribute the greatest possible carbon sequestration efforts at the lowest costs. Protecting tropical forest land in the short term gives way to planting and managing tropical and temperate forests, often with longer rotations, to maintain high levels of mitigation beyond 2050.

“Protecting, managing and restoring the world’s forests will be necessary for avoiding dangerous impacts of climate change, and have important co-benefits such as biodiversity conservation, ecosystem service enhancement and protection of livelihoods,” said Kemen Austin, lead author of the study and senior policy analyst with RTI International, a nonprofit research institute based in North Carolina. “Until now, there has been limited research investigating the costs of climate change mitigation from forests. Better understanding the costs of mitigation from global forests will help us to prioritize resources and inform the design of more efficient mitigation policies.”

The researchers also found that in the temperate and subarctic regions, the United States would likely be responsible for the greatest share of forest-related carbon mitigation efforts – about 24 percent of the possible mitigation in those climate regions would come from the United States, given its expanse of forested and deforested land. China and Canada have the potential to contribute significantly, too, the researchers found.

And managing the world’s forests would be only one piece of the broader puzzle. Energy sources that don’t rely on fossil fuels – things like solar and wind power – likely will play the largest role in any climate change mitigation strategy.

“What we see is that you should devote about a third of your effort to this stuff and two-thirds to the other stuff – to reducing coal, to investing in solar, to switching to electric,” Sohngen said. “If you want your total mitigation to be as cheap as possible, that’s what you would do.”

Written by: Laura Arenschield

References: Austin, K.G., Baker, J.S., Sohngen, B.L. et al. The economic costs of planting, preserving, and managing the world’s forests to mitigate climate change. Nat Commun 11, 5946 (2020).

Provided by Ohio State University

Study Identifies Novel Mechanisms that Cause Protein Clumping in Brain Diseases (Psychiatry)

A team of researchers at the Case Western Reserve University School of Medicine has taken a major step toward understanding the mechanisms involved in the formation of large clumps of tau protein, a hallmark of Alzheimer’s disease and several other neurodegenerative disorders.

© iStock/image_jungle

Their findings may help to better understand the pathological process and possibly lead to developing medications to treat such devastating brain diseases.

The study, “Regulatory mechanisms of tau protein fibrillation under the conditions of liquid-liquid phase separation,” was published this week in the journal Proceedings of the National Academy of Sciences.

The senior author of the study is Witold Surewicz, a professor of physiology and biophysics at the School of Medicine. Solomiia Boyko, a graduate student, and Krystyna Surewicz, a senior research associate, co-authored the study, which was supported by the National Institute on Aging.

Alzheimer’s disease is characterized by the death of nerve cells in the brain, resulting in progressive memory loss and cognitive decline. More than 5 million people in the United States suffer from Alzheimer’s, and this number is projected to triple by 2050, according to the Alzheimer’s Association. There is no cure for this devastating disease.

In Alzheimer’s disease, clumps of the tau protein begin to form inside nerve cells in the brain. Accumulations of these aggregates in the brain are known as “neurofibrillary tangles.”

Similar tangles of tau, which spread among nerve cells, are also associated with a host of other neurodegenerative diseases, collectively known as “tauopathies.” These include Pick’s disease, frontotemporal dementia, progressive supranuclear palsy and chronic traumatic encephalopathy.

Recent studies have demonstrated that, like some other proteins, tau can undergo liquid-liquid phase separation, a process resulting in formation of liquid-like droplets containing highly concentrated protein. This phenomenon, similar to how oil and water separate when mixed, is believed to be important for normal functions of cells. However, under certain conditions, this separation within cells may also have pathological consequences.

The new study establishes a critical link between these two phenomena–tau liquid-liquid phase separation and tangle formation–demonstrating that the environment of liquid droplets greatly facilitates aggregation of tau into fibrillar structures similar to those found in the brain of someone with Alzheimer’s disease.

The researchers also describe the mechanism by which this liquid-liquid phase separation regulates clumping when different variants of tau protein are present. In particular, the authors show that, because of the unique properties of liquid droplets, the presence of a shorter, slowly aggregating tau variant inhibits clumping of a longer, normally fast aggregating variant, slowing down the overall process of tangle formation.

This novel regulatory mechanism may play a major role in determining the clinical outcome of the disease, as the ratio of these two tau variants in brain varies substantially in different tauopathies.

For example, Alzheimer’s disease is usually characterized by an equal proportion of both tau isoforms, whereas fibrillary tangles in progressive supranuclear palsy and Pick’s disease consist largely of the longer and shorter variant, respectively.

“While the present results provide exciting new insights into formation of pathological clumps of tau protein,” Surewicz said, “our study was limited to experiments with purified proteins in the test tube. The next step is to verify these findings in cell and animal models of the disease.”

Provided by Case Western Reserve University

Telomere Shortening Protects Against Cancer (Medicine)

As time goes by, the tips of your chromosomes–called telomeres–become shorter. This process has long been viewed as an unwanted side-effect of aging, but a recent study shows it is in fact good for you.

Human telomeres (green) at the ends of chromosomes (blue). ©Laboratory of Cell Biology and Genetics at The Rockefeller University

“Telomeres protect the genetic material,” says Titia de Lange, Leon Hess Professor at Rockefeller. “The DNA in telomeres shortens when cells divide, eventually halting cell division when the telomere reserve is depleted.”

New results from de Lange’s lab provide the first evidence that telomere shortening helps prevent cancer in humans, likely because of its power to curtail cell division. Published in eLife, the findings were obtained by analyzing mutations in families with exceptional cancer histories, and they present the answer to a decades-old question about the relationship between telomeres and cancer.

A longstanding controversy

In stem cells, including those that generate eggs and sperm, telomeres are maintained by telomerase, an enzyme that adds telomeric DNA to the ends of chromosomes. Telomerase is not present in most normal human cells, however, which is why their telomeres wither away. This telomere shortening program limits the number of divisions of normal human cells to about 50.

The idea that telomere shortening could be part of the body’s defense against cancer was first proposed decades ago. Once an early-stage tumor cell has divided 50 times, scientists imagined, depletion of the telomere reserve would block further cancer development. Only those cancers that manage to activate telomerase would break through this barrier.

Clinical observations seemed to support this hypothesis. “Most clinically detectable cancers have re-activated telomerase, often through mutations,” de Lange says. Moreover, mouse experiments showed that shortening telomeres can indeed protect against cancer. Nonetheless, evidence for the telomere tumor suppressor system remained elusive for the past two decades, and its existence in humans remained controversial.

The solution to a decades-old problem

The telomere tumor suppressor pathway can only work if we are born with telomeres of the right length; if the telomeres are too long, the telomere reserve would not run out in time to stop cancer development. Longer telomeres will afford cancer cells additional divisions during which mutations can creep into the genetic code, including mutations that activate telomerase.
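This division-counting logic can be sketched in a few lines. The parameter values below are illustrative round numbers chosen to reproduce the article's ~50-division limit, not measurements from the study:

```python
def divisions_before_senescence(telomere_bp, loss_per_division=100,
                                critical_bp=5000):
    """Count cell divisions until the telomere reserve reaches the
    critical length that halts further division.  All parameter
    values are illustrative, not figures from the study."""
    count = 0
    while telomere_bp - loss_per_division >= critical_bp:
        telomere_bp -= loss_per_division
        count += 1
    return count

normal = divisions_before_senescence(10_000)   # typical starting reserve
long = divisions_before_senescence(15_000)     # over-long telomeres

print(normal, long)
```

With an over-long starting reserve the cell line gets roughly twice as many divisions before the brake engages — twice as many opportunities for mutations, including ones that reactivate telomerase, which is the core of the tumor suppressor argument.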

For decades, de Lange’s lab has been studying the complex process by which telomeres are regulated. She and others identified a set of proteins that can limit telomere length in cultured human cells, among them a protein called TIN2. When TIN2 is inhibited, telomerase runs wild and over-elongates telomeres. But it was not known whether TIN2 also regulated telomere length at birth.

The stalemate on the telomere tumor suppressor continued until physicians at the Radboud University Medical Center in the Netherlands reached out to de Lange about several cancer-prone families. The doctors found that these families had mutations in TINF2, the gene that encodes the TIN2 protein instrumental in controlling telomere length. That’s when they asked de Lange to step in.

Isabelle Schmutz, a Women&Science postdoctoral fellow in the de Lange lab, used CRISPR gene-editing technology to engineer cells with precisely the same mutations as those seen in the Dutch families and examined the resulting mutant cells. She found that the mutant cells had fully functional telomeres and no genomic instability. They were, for all intents and purposes, normal healthy cells.

But there was one thing wrong with the cells. “Their telomeres became too long,” de Lange says. Similarly, the patients’ telomeres were unusually long. “These patients have telomeres that are far above the 99th percentile,” de Lange says.

“The data show that if you’re born with long telomeres, you are at greater risk of getting cancer,” says de Lange. “We are seeing how the loss of the telomere tumor suppressor pathway in these families leads to breast cancer, colorectal cancer, melanoma, and thyroid cancers. These cancers would normally have been blocked by telomere shortening. The broad spectrum of cancers in these families shows the power of the telomere tumor suppressor pathway.”

The study is a demonstration of the power of basic science to transform our understanding of medicine. “How telomeres are regulated is a fundamental problem,” de Lange says. “And by working on a fundamental problem, we were eventually able to understand the origins of a human disease.”

References: Isabelle Schmutz, Arjen R. Mensenkamp, Kaori K. Takai, Maaike Haadsma, Liesbeth Spruijt, Richarda M. de Voer, Seunga Sara Choo, Franziska K. Lorbeer, Emma J. van Grinsven, Dirk Hockemeyer, Marjolijn C. J. Jongmans, and Titia de Lange, “TINF2 is a haploinsufficient tumor suppressor that limits telomere length,” eLife (2020).

Provided by Rockefeller University

What Will the Climate Be Like When Earth’s Next Supercontinent Forms? (Earth Science)

In roughly 200 million years, the continents will once again unite into a supercontinent; a new study explores how the next Pangea could affect the global climate.

Long ago, all the continents were crammed together into one large land mass called Pangea. Pangea broke apart about 200 million years ago, its pieces drifting away on the tectonic plates — but not permanently. The continents will reunite again in the deep future. And a new study, presented today during an online poster session at the meeting of the American Geophysical Union, suggests that the future arrangement of this supercontinent could dramatically impact the habitability and climate stability of Earth. The findings also have implications for searching for life on other planets.

How land could be distributed in the Aurica supercontinent (top) versus Amasia. The future land configurations are shown in gray, with modern-day outlines of the continents for comparison. © Way et al. 2020

The study, which has been submitted for publication, is the first to model the climate on a supercontinent in the deep future.

Scientists aren’t exactly sure what the next supercontinent will look like or where it will be located. One possibility is that, 200 million years from now, all the continents except Antarctica could join together around the north pole, forming the supercontinent “Amasia.” Another possibility is that “Aurica” could form from all the continents coming together around the equator in about 250 million years.

In the new study, researchers used a 3D global climate model to simulate how these two land mass arrangements would affect the global climate system. The research was led by Michael Way, a physicist at the NASA Goddard Institute for Space Studies, an affiliate of Columbia University’s Earth Institute.

The team found that, by changing atmospheric and ocean circulation, Amasia and Aurica would have profoundly different effects on the climate. The planet could end up being 3 degrees Celsius warmer if the continents all converge around the equator in the Aurica scenario.

In the Amasia scenario, with the land amassed around both poles, the lack of land in between disrupts the ocean conveyor belt that currently carries heat from the equator to the poles. As a result, the poles would be colder and covered in ice all year long. And all of that ice would reflect heat out into space.

With Amasia, “you get a lot more snowfall,” explained Way. “You get ice sheets, and you get this very effective ice-albedo feedback, which tends to lower the temperature of the planet.”
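The cooling side of the ice-albedo feedback can be seen even in a zero-dimensional energy-balance model. The sketch below is a drastic simplification of the 3D climate model the team actually used — it has no greenhouse effect and no dynamic feedback loop, and the albedo values are illustrative — but it shows how a brighter, icier planet settles at a lower equilibrium temperature:

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0              # solar constant, W m^-2

def equilibrium_temp(albedo):
    """Zero-dimensional energy balance: absorbed sunlight S0*(1-a)/4
    equals emitted blackbody radiation sigma*T^4 (no greenhouse term)."""
    return ((S0 * (1 - albedo)) / (4 * SIGMA)) ** 0.25

t_dark = equilibrium_temp(0.30)   # roughly Earth's present-day albedo
t_icy = equilibrium_temp(0.40)    # brighter, icier planet (illustrative)

print(t_dark - t_icy)   # higher albedo alone cools the planet by several K
```

The feedback arises because that cooling grows the ice sheets further, raising the albedo again — which is why polar land masses like Amasia's can lock in a much colder state.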

Distribution of snow and ice in winter and summer on Aurica (left) and Amasia. ©Way et al. 2020

In addition to cooler temperatures, Way suggested that sea level would probably be lower in the Amasia scenario, with more water tied up in the ice caps, and that the snowy conditions could mean that there wouldn’t be much land available for growing crops.

Aurica, by contrast, would probably be a bit beachier, he said. The land concentrated closer to the equator would absorb the stronger sunlight there, and there would be no polar ice caps to reflect heat out of Earth’s atmosphere — hence the higher global temperature.

Although Way likens Aurica’s shores to the paradisiacal beaches of Brazil, “the inland would probably be quite dry,” he warned. Whether or not much of the land would be farmable would depend on the distribution of lakes and what types of precipitation patterns it experiences — details that the current paper doesn’t delve into, but could be investigated in the future.

The simulations showed that temperatures were right for liquid water to exist on about 60% of Amasia’s land, as opposed to 99.8% of Aurica’s — a finding that could inform the search for life on other planets. One of the main factors that astronomers look for when scoping out potentially habitable worlds is whether or not liquid water could survive on the planet’s surface. When modeling these other worlds, they tend to simulate planets that are either completely covered in oceans, or else whose terrain looks like that of modern-day Earth. The new study, however, shows that it’s important to consider land mass arrangements while estimating whether temperatures fall in the ‘habitable’ zone between freezing and boiling.

Although it may be 10 or more years before scientists can ascertain the actual land and sea distribution on planets in other star systems, the researchers hope that having a larger library of land and sea arrangements for climate modeling could prove useful in estimating the potential habitability of neighboring worlds.

Hannah Davies and Joao Duarte from the University of Lisbon, and Mattias Green from Bangor University in Wales were co-authors on this research.


Provided by Earth Institute at Columbia University

Molecular ‘Barcode’ Helps Decide Which Sperm Will Reach an Egg (Biology)

A study in mice provides insights on the processes that determine which sperm will reach an egg to fertilise it, a discovery that may aid infertility research.

A protein called CatSper1 may act as a molecular ‘barcode’ that helps determine which sperm cells will make it to an egg and which are eliminated along the way.

Three-dimensional image showing the head (green) and tail (red) of sperm cells travelling towards the fertilisation site (to the left side of the image) in the reproductive tract (blue cells) of a female mouse. ©Lukas Ded (CC BY 4.0)

The findings in mice, published recently in eLife, have important implications for understanding the selection process that sperm cells undergo after they enter the female reproductive tract, a key step in reproduction. Learning more about these processes could lead to the development of new approaches to treating infertility.

“Male mammals ejaculate millions of sperm cells into the female’s reproductive tract, but only a few arrive at the egg,” explains senior author Jean-Ju Chung, Assistant Professor of Cellular & Molecular Physiology at Yale School of Medicine, New Haven, Connecticut, US. “This suggests that sperm cells are selected as they travel through the tract and excess cells are eliminated. But most of our knowledge about fertilisation in mammals has come from studying isolated sperm cells and eggs in a petri dish – an approach that doesn’t allow us to see what happens during the sperm selection and elimination processes.”

To address this challenge, Chung and colleagues, including lead author Lukas Ded, who was a postdoctoral fellow in the Chung laboratory when the study was carried out, devised a new molecular imaging strategy to observe the sperm selection process within the reproductive tract of mice. Using this technique, and combining it with more traditional molecular biology studies, the team revealed that a sperm protein called CatSper1 must be intact for a sperm cell to fertilise an egg.

The CatSper1 protein is one of four proteins that form a channel in the membrane surrounding the tail of the sperm, allowing calcium to flow in. This channel is essential for sperm movement and survival. If this protein is cleaved off in the reproductive tract, the sperm never makes it to the egg and dies. “This highlights CatSper1 as a kind of barcode for sperm selection and elimination in the female reproductive tract,” says Chung.

The findings, and the new imaging platform created by the team, may enable scientists to learn more about the steps in the fertilisation process and what happens afterwards, such as when the egg implants into the mother’s uterus.

“Our study opens up new horizons to visualise and analyse molecular events in single sperm cells during fertilisation and the earliest stages of pregnancy,” Chung concludes. “This and further studies could ultimately provide new insights to aid the development of novel infertility treatments.”


The paper ‘3D in situ imaging of female reproductive tract reveals molecular signatures of fertilizing spermatozoa in mice’ can be freely accessed online. Contents, including text, figures and data, are free to reuse under a CC BY 4.0 license.

Media contact

Emily Packer, Media Relations Manager
01223 855373

About eLife

eLife is a non-profit organisation created by funders and led by researchers. Our mission is to accelerate discovery by operating a platform for research communication that encourages and recognises the most responsible behaviours. We work across three major areas: publishing, technology and research culture. We aim to publish work of the highest standards and importance in all areas of biology and medicine, including Cell Biology and Structural Biology and Molecular Biophysics, while exploring creative new ways to improve how research is assessed and published. We also invest in open-source technology innovation to modernise the infrastructure for science publishing and improve online tools for sharing, using and interacting with new results. eLife receives financial support and strategic guidance from the Howard Hughes Medical Institute, the Knut and Alice Wallenberg Foundation, the Max Planck Society and Wellcome. Learn more at


Provided by eLife

Early Human Landscape Modifications Discovered in Amazonia (Archeology)

No evidence of extensive savannah formations during the current Holocene period.

In 2002, Professor Alceu Ranzi (Federal University of Acre) and Professor Martti Pärssinen (University of Helsinki) formed an international research team to study large geometric earthworks, called geoglyphs, in the Brazilian state of Acre in southwestern Amazonia. It soon became apparent that a pre-colonial civilization unknown to international scholars had built geometric ceremonial centers and sophisticated road systems there. This civilization flourished in the rainforest 2,000 years ago. The discovery supported Prof. William Balée’s (Tulane University) theory of early human impact on the composition of the present-day Amazonian tropical forest, radically altering the notion of a pristine Amazon rainforest.

Aerial view of a research site called Severino Calazans. ©Martti Pärssinen

Now, the team has published an article in Antiquity demonstrating that the earthwork-building civilization had a much longer human history behind it than expected. The team members show that humans regularly used fire to clear small open patches in the rainforest, starting soon after the last Ice Age ended, thousands of years before the first geoglyphs were constructed. Thanks to the charcoal left by humans in the Amazonian soil over the last 10,000 years, it was possible to systematically measure carbon-13 isotope values in many samples. Using these values, taken from archaeologically dated charcoal, the researchers could estimate past vegetation and precipitation. The results published in Antiquity indicate that the forest’s main vegetation and precipitation remained largely unchanged over the last ten thousand years, until the 20th century. No evidence of drier periods or of natural or artificial savannah formations was observed before modern colonization began to penetrate southwestern Amazonia from the turn of the 19th and 20th centuries onward. Hence, the authors argue that theories of extensive savannah formations in southwestern Amazonia during the current Holocene period rest on a false interpretation of the connection between charcoal accumulation and natural fires during drier climatic periods, an interpretation that does not take into account the millennial human presence in Amazonia.
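As an illustration of the isotope method, the standard delta notation compares a sample’s carbon-13 to carbon-12 ratio against the VPDB reference, and the resulting value differs systematically between closed-canopy (C3) forest plants and open-savannah (C4) grasses. The sketch below is a toy illustration, not the study’s analysis; the cutoff and the sample ratios are invented for demonstration, though the delta-13C formula and the approximate C3/C4 ranges are standard:

```python
# Illustrative sketch of how stable-carbon-isotope ratios distinguish
# vegetation types. delta-13C compares a sample's 13C/12C ratio with the
# VPDB standard; C3 forest plants typically yield values near -28 per mil,
# while C4 savannah grasses sit near -13 per mil. The cutoff and sample
# ratios below are illustrative, not taken from the study.

R_VPDB = 0.0112372  # 13C/12C ratio of the Vienna Pee Dee Belemnite standard

def delta13c(r_sample):
    """delta-13C in per mil relative to VPDB."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

def classify_vegetation(d13c, c3_cutoff=-20.0):
    """Crude classification: strongly negative values suggest C3 forest."""
    return "forest (C3)" if d13c < c3_cutoff else "savannah (C4)"

# Hypothetical 13C/12C ratios measured on dated charcoal samples.
for r in (0.010922, 0.011091):
    d = delta13c(r)
    print(f"d13C = {d:6.1f} per mil -> {classify_vegetation(d)}")
```

A long charcoal record dominated by strongly negative delta-13C values, as in this toy’s first sample, would indicate persistent forest rather than savannah.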

Alceu Ranzi says: “It is possible that the opened patches were meant to attract large mammals such as giant sloths and mastodons, until the megafauna disappeared forever. In addition, ash and charcoal fertilized the soil, and open areas were prepared for growing palm fruits, vegetables and root plants useful for human subsistence.” Martti Pärssinen adds: “It is probably not a coincidence that today southwestern Amazonia is considered one of the most important centers of domestication: cassava/manioc, squash, chili pepper and peach palm seem to have been domesticated there almost 10,000 years ago. In each case, the domestication processes left important fingerprints on Amazonian forest composition. Therefore there is no such thing as virgin rainforest.”

In general, the study shows that the indigenous peoples of the Amazon have been able to use their environment in a sustainable manner. Pärssinen says: “There is no indication that large areas of Holocene forest were deforested before the second half of the 20th century. Deforestation is a current phenomenon.”

Martti Pärssinen, William Balée and Alceu Ranzi are the authors of the current article; archaeologist Antonia Barbosa from the Superintendência do Instituto do Patrimônio Histórico e Artístico Nacional no Acre is the fourth author. The Academy of Finland financed the project, and the Finnish Cultural and Academic Institute in Madrid also contributed. In Brazil, the research was authorized by the Instituto do Patrimônio Histórico e Artístico Nacional (IPHAN).


Provided by Helsinki University

Breaking The Rules of Chemistry Unlocks New Reaction (Chemistry)

Scientists have broken the rules of enzyme engineering to unlock a new method for creating chemical reactions, one that could enable a wide range of new applications – from creating new drugs to food production.

In their paper published today in Nature Catalysis, Professor Francesca Paradisi and Dr Martina Contente of the University of Nottingham and the University of Bern present a new method for producing chemical molecules more efficiently through a one-step enzymatic reaction.

Professor Paradisi is Professor of Biocatalysis in the School of Chemistry at Nottingham and Professor of Pharmaceutical Chemistry at the University of Bern. She explains: “We have demonstrated how a very simple mutation in one of the key residues of a useful enzyme has dramatically expanded its synthetic scope, enabling the use of the mutant variant in the preparation of challenging chemical molecules, as well as natural metabolites that are vital to many biological processes in the body.”

Any textbook on enzymes will report that the catalytic amino acids in a given enzyme family are highly conserved; they are, in fact, a signature of the type of chemistry an enzyme can do. Variations do occur, and in some cases, when the replacing amino acid is similar, both versions are found in significant proportion in nature; others are much less common and occur only in a limited number of species.

“In this study we have explored an untouched area of enzyme engineering and modified a key catalytic residue in the active site of an enzyme,” adds Professor Paradisi. “Previously it was thought that doing this would cause a loss of enzyme activity, but we have found this is not the case when the biocatalyst is used in a synthetic direction. In fact, challenging but very useful molecules can now be made under mild conditions that could easily be scaled up and replicated commercially for use in a wide range of products.”

To change the substrate scope of an enzyme, the usual approach has been to mutate the residues involved in substrate recognition, whether through rational design or directed evolution, while always leaving the catalytic residues untouched.

The mutant variant of an acyl transferase enzyme was rapidly created, and while the native biocatalyst works with alcohols and linear amines, the mutant works with thiols and much more complex amines too. The research demonstrated that the new variant has indeed lost the ability to hydrolyse esters, but for synthetic applications, where an ester or another functional group (thioesters and amides) needs to be made rather than cleaved, this is in fact a major advantage.

Dr Martina Contente adds: “We have had fantastic feedback on this study from the scientific community, as it provides a new tool for chemistry that can be applied to a wide range of molecular reactions. The fact that it is a very stable reaction, created without the need for specific conditions, means it has the potential for low-cost commercial application in the production of new pharmaceuticals. We believe we have unlocked a new combination in the catalytic triads that nature seems to have disfavoured, possibly to tighten control over reactivity, but that for a chemist could be a real goldmine.”

Reference: Contente, M.L., Roura Padrosa, D., Molinari, F. et al. A strategic Ser/Cys exchange in the catalytic triad unlocks an acyltransferase-mediated synthesis of thioesters and tertiary amides. Nat. Catal. (2020).

Provided by University of Nottingham