Scientists Discover Plants’ Roadblock to Specialty Oil Production (Botany)

Research solves longstanding problem and opens path for growing crops to produce industrially important high-value fatty acids

Hundreds of naturally occurring specialty fatty acids (building blocks of oils) have potential for use as raw materials for making lubricants, plastics, pharmaceuticals, and more—if they could be produced at large scale by crop plants. But attempts to put genes for making these specialty building blocks into crops have had the opposite effect: Seeds from plants with genes added to make specialty fatty acids accumulated dramatically less oil. No one knew why.

Now two teams of biochemists working on separate aspects of oil synthesis at the U.S. Department of Energy’s Brookhaven National Laboratory have converged to discover the mechanism behind the oil-production slowdown. As described in the journal Plant Physiology, they crossbred model plants and conducted detailed biochemical-genetic analyses to demonstrate a strategy for reversing the roadblock and ramping up production. The work paves the way for making at least one industrially important specialty fatty acid in plants—and may work for many others.

“Since scientists discovered the genes responsible for making specialty fatty acids several decades ago, we’ve dreamed of putting them into crop plants to make abundant renewable sources of desired fatty acids,” said John Shanklin, chair of Brookhaven Lab’s biology department, who oversaw the project. “But we’ve been stymied from using them because we didn’t know why they dramatically slow fatty acid and oil synthesis. A number of research groups have been trying to figure out why this happens. We have now nailed down the mechanism and opened up the possibility of achieving that dream.”

Group leader and Biology Department Chair John Shanklin and fellow biochemists Jantana Keereetaweep and Yuanheng Cai collaborated with Xiao-Hong Yu (pictured above) on increasing specialty fatty acid synthesis in plants.

Two projects converge

This study grew out of two separate projects in Shanklin’s biochemistry lab. One, led by Xiao-Hong Yu and Yuanheng Cai, was focused on the challenges associated with specialized fatty acid production in plants. The other, led by Jantana Keereetaweep, was deciphering details of the biochemical feedback loop plants use to regulate ordinary fatty acid and oil production.

Through that second project, the team recently characterized a mechanism by which plants down-regulate oil synthesis when levels of a plant’s regular (endogenous) fatty acids get too high.

“This system operates like a thermostat,” Shanklin explained. “When heat gets above its set point, the furnace turns off.”

A plant enzyme called ACCase acts like a four-gear “machine” to crank out fatty acids, the building blocks of oil. High levels of ordinary fatty acids, or even small amounts of specialty hydroxy fatty acids, trigger a substitution in the machinery: BADC, which acts like a gear with no teeth, takes the place of BCCP, slowing production down. Plants bred to lack genes for making BADC could potentially produce large amounts of economically important fatty acids.

In the case of plant oils, the key machinery that controls production is an enzyme called ACCase. It has four parts, or subunits—you can think of them as gears. As long as endogenous fatty acids are below a certain level, the four “gears” mesh and the machine cranks out fatty acids for oil production. But feeding plants additional endogenous fatty acids triggers a substitution in the machinery. One of the ACCase subunits gets replaced by a version that isn’t functional. “It’s like a gear with no teeth,” Shanklin said. That toothless gear (known as BADC) slows the fatty acid-producing machinery until endogenous fatty acid levels fall.

In contrast, the shutdown mechanism triggered by the specialty fatty acids (ones being produced by genes artificially added to the plant) kicks in when even small amounts of the “foreign” fatty acids are present, and endogenous fatty acids aren’t in excess. “Because of this, they appeared to be two separate processes,” Shanklin said.

But as the two teams discussed their projects, they began to wonder if the specialty fatty acids were triggering the same off switch triggered by high levels of ordinary fatty acids. “Imagine working in the same lab on different projects and in a lab meeting one day, you look at each other and ask, ‘Is it possible we’re working on the same thing?’” Shanklin said.

This idea provided a way for the teams to combine efforts on a new experiment.

Testing the hypothesis

Through earlier studies, Shanklin’s group had created a strain of Arabidopsis (a model plant) that has two of its BADC genes deleted. In these plants, the off switch is disabled and the plants crank out high levels of endogenous fatty acids. They wondered what would happen if the BADC genes were disabled in plants engineered to produce specialty fatty acids.

To find out, Xiao-Hong Yu and Yuanheng Cai designed a strategy to crossbreed the defective off-switch plants with an Arabidopsis strain engineered to produce hydroxy fatty acids—one of the specialty types scientists would like to produce for industrial applications. This latter strain could make the hydroxy fatty acids, but its rate of oil synthesis was only half that of normal plants and it accumulated much less oil in its seeds.

When crossing four separate genetic factors, it takes several plant generations to produce plants with the desired combination of genes: both deleted BADC genes and two genes that drive the production of hydroxy fatty acids, with two identical copies of each genetic factor.

“We were fortunate to have two very dedicated students working as interns through Brookhaven’s Office of Educational Programs—Kenneth Wei, who was then at Mount Sinai High School and is now at MIT, and Elen Deng, an undergraduate at Stony Brook University,” Yu said. “They did fantastic work running polymerase chain reaction (PCR) tests—similar to those used to test for COVID-19—to analyze more than 600 plants in detail and find those with the desired genetic makeup.”
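A rough back-of-the-envelope calculation (an illustration of why such a large screen was needed, not the authors' actual breeding design) shows the scale of the task: if the four genetic factors segregate independently and each must end up homozygous, a given offspring plant carries the full combination with probability (1/4)^4 = 1/256, so a screen of 600 plants is expected to yield only a couple of hits.

```python
# Illustrative screening math: four unlinked loci, each segregating 1:2:1,
# so a given plant is homozygous for the desired allele at any one locus
# with probability 1/4.

def p_quadruple_homozygote(n_loci: int = 4, p_per_locus: float = 0.25) -> float:
    """Probability that a single plant is homozygous at every locus."""
    return p_per_locus ** n_loci

def expected_hits(n_plants: int, n_loci: int = 4) -> float:
    """Expected number of plants with the full desired genotype in a screen."""
    return n_plants * p_quadruple_homozygote(n_loci)

p = p_quadruple_homozygote()   # 1/256, about 0.4% per plant
hits = expected_hits(600)      # roughly 2 hits expected among 600 plants
```

Under these simplifying assumptions, screening hundreds of plants to recover a handful with the right genetic makeup is exactly what one would predict.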

Jantana Keereetaweep then worked with Yu to characterize those plants biochemically, to compare their rates of ACCase activity with those of the two Arabidopsis lines used to make the new genetic combinations.

The end result: Plants that had the combination of defective BADC genes and genes required for making hydroxy fatty acids produced normal levels of oil containing the specialty products. Compared with plants that had normal BADC genes, the new plants exhibited increases in the total amount of fatty acid per seed, the total seed oil content per plant, and the seed yield per plant.

“The BADC-defective plants are blind to the presence of hydroxy fatty acids and the usual response of turning off the ACCase—the oil-making machinery—is gone,” Keereetaweep said.

The results prove that BADC is the mechanism for reducing ACCase activity in both scenarios—the accumulation of excess endogenous fatty acids and the presence of hydroxy fatty acids.

“We are now testing to see if this mechanism is limited to hydroxy fatty acids, or, as we suspect, common to other ‘foreign’ fatty acids that also reduce ACCase activity,” Shanklin said. “If it’s a general mechanism, it opens the possibility of realizing the dream of making additional desired specialty fatty acids in the oil-rich seeds of crop plants.”

“This is a good example where a fundamental mechanistic understanding of biochemical regulation can be deployed to enable progress towards a viable, sustainable bioeconomy,” Shanklin said. “We can use this approach to make valuable renewable industrial starting materials at low cost in plants from carbon dioxide and sunlight, instead of relying on petrochemicals.”

This study was supported by the DOE Office of Science and the National Science Foundation.

Featured image: Biochemist Xiao-Hong Yu’s research on specialty fatty acid production in plants got a boost from collaborating with colleagues studying an off-switch that regulates ordinary fatty acid synthesis. © BNL


Reference: Xiao-Hong Yu, Yuanheng Cai, Jantana Keereetaweep, Kenneth Wei, Jin Chai, Elen Deng, Hui Liu, John Shanklin, “Biotin attachment domain-containing proteins mediate hydroxy fatty acid-dependent inhibition of acetyl CoA carboxylase,” Plant Physiology, 2021, kiaa109. https://doi.org/10.1093/plphys/kiaa109


Provided by Brookhaven National Laboratory

Extreme Blood Sugar Swings in Patients with Type 2 Diabetes Suspected Trigger for Increased Risk of Heart Disease (Medicine)

In patients with type 2 diabetes, big swings in blood sugar levels between doctors’ visits are associated with an increased risk of heart disease. 

The study, published in the journal Diabetes, Obesity & Metabolism, looked at more than 29,000 patients with type 2 diabetes over a two-year period. Patients who already had heart disease were excluded. 

The American Diabetes Association recommends adults with diabetes maintain an A1c, the average blood sugar level over the past two to three months, of less than 7 percent to reduce complications from diabetes, such as heart disease. However, studies, including this one, have shown that wide swings in blood sugar levels may be a better predictor of diabetic complications than the A1c reading at any single doctor’s office visit.

“The underlying mechanism for the relationship between wide variations in blood sugar levels between doctor’s appointments and high risk of heart disease in patients with type 2 diabetes is unclear,” said Gang Hu, MD, PhD, Associate Professor and Director, Chronic Disease Epidemiology Lab at Pennington Biomedical Research Center. “It’s possible that episodes of severely low blood sugar may be the connection.”

Research has shown that wide variations in blood sugar levels are associated with poor health outcomes and even death. A 2017 Johns Hopkins study found that one-third of people with diabetes hospitalized for a severe low blood sugar episode died within three years of the incident.

“We recommend that patients and their doctors implement therapies that can reduce wide swings in blood sugar levels and the associated episodes of severe low blood sugar,” Dr. Hu said. “Our findings suggest that measuring the swings in blood hemoglobin A1c levels over a specific time – six months to a year, for example – could serve as a supplemental blood sugar target,” he added. 
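The visit-to-visit variability Dr. Hu describes is commonly quantified as the standard deviation or coefficient of variation of a patient's serial HbA1c readings. A minimal sketch of that calculation (the readings below are invented illustrative values, not study data):

```python
import statistics

def hba1c_variability(readings):
    """Visit-to-visit variability of serial HbA1c readings (%).

    Returns (mean, standard deviation, coefficient of variation in %)."""
    mean = statistics.mean(readings)
    sd = statistics.stdev(readings)   # sample SD across visits
    cv = sd / mean * 100              # CV: SD as a percentage of the mean
    return mean, sd, cv

# Two hypothetical patients with the same mean HbA1c but very different swings:
steady = [6.9, 7.0, 7.1, 7.0]
swinging = [5.5, 8.5, 5.8, 8.2]
```

Both patients average 7.0%, right at the ADA target, yet the second patient's readings swing far more from visit to visit—exactly the kind of difference a single A1c measurement cannot capture.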

This work was funded by the Patient-Centered Outcomes Research Institute. PCORI is an independent, nonprofit organization authorized by Congress in 2010. Its mission is to fund research that will provide patients, their caregivers, and clinicians with the evidence-based information needed to make better-informed healthcare decisions. For more information about PCORI’s funding, visit www.pcori.org.

Several authors from Pennington Biomedical Research Center were partly supported by award U54GM104940 from the National Institute of General Medical Sciences of the National Institutes of Health. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.


Reference: Shen, Y, Zhou, J, Shi, L, et al. Association between visit‐to‐visit HbA1c variability and the risk of cardiovascular disease in patients with type 2 diabetes. Diabetes Obes Metab. 2021; 23: 125– 135. https://doi.org/10.1111/dom.14201


Provided by LSU Pennington Biomedical Research Center


About LSU’s Pennington Biomedical Research Center

LSU’s Pennington Biomedical Research Center is at the forefront of medical discovery as it relates to understanding the triggers of obesity, diabetes, cardiovascular disease, cancer and dementia. The center conducts basic, clinical and population research, and is affiliated with Louisiana State University. The research enterprise at Pennington Biomedical includes over 450 employees within a network of 40 clinics and research laboratories, and 13 highly specialized core service facilities. Its scientists and physician/scientists are supported by research trainees, lab technicians, nurses, dietitians and other support personnel. Pennington Biomedical is located in state-of-the-art research facilities on a 222-acre campus in Baton Rouge, Louisiana.

Moffitt Researchers Discover Mechanism that Regulates Anti-Tumor Activity of Immune Cells in Ovarian Cancer (Medicine)

The prognosis of ovarian cancer is poor, with an estimated five-year survival of only 40% for advanced disease, the stage at which most ovarian carcinomas are diagnosed. These poor outcomes are partly due to the lack of effective therapies for advanced disease and recurrence. Immunotherapies hold promise for many types of cancer; however, studies have shown that patients with ovarian cancer do not have strong responses to existing drugs. In a new article published in Nature, Moffitt Cancer Center researchers demonstrate why some ovarian cancer patients fare better than others and suggest possible approaches to improve patient outcomes.

Immunotherapeutic drugs activate T cells, a type of immune cell, to put up a defense against tumor cells. Immunotherapies are approved to treat several different types of cancer and have greatly changed the standard of care and improved patient outcomes.  However, in ovarian cancer, clinical studies using immunotherapies aimed at stimulating T cells resulted in modest response rates. Studies have suggested that cancer patients who have a higher presence of other immune cells, such as plasma and memory B cells, could respond better to immunotherapies, but how these cell types promote better outcomes is unclear. Moffitt researchers wanted to confirm whether antibodies produced by these cells are associated with better outcomes and assess how these cells contribute to the spontaneous anti-tumor immune response against ovarian cancer.

The researchers analyzed a panel of 534 samples from ovarian cancer patients and found that patients who had a higher infiltration of B cells or B cell-derived plasma cells had better outcomes. B cells are a type of immune cell that produce antibodies and express one of five types of B cell receptors on their surface: IgM, IgD, IgG, IgE or IgA. These isotypes regulate different B cell signaling pathways and control B cell processes.  

The surprise came when, upon further analysis of the samples, the Moffitt team discovered that the antibodies produced by B and plasma cells were predominantly of the IgA subtype, followed by IgG.

“We found that the presence of IgA regulated downstream signaling pathways of the ovarian cancer cells. Specifically, IgA resulted in inhibition of the RAS signaling pathway, which is known to contribute to ovarian cancer development,” said Jose Conejo-Garcia, M.D., Ph.D., chair of Moffitt’s Immunology Department.

This inhibition of RAS sensitized the tumor cells to T cell-mediated killing by both novel CAR T cells and tumor-infiltrating lymphocytes. The team also showed that IgA and IgG secreted by the B cells recognized specific ovarian tumor cell surface markers and stimulated other immune cells, called myeloid cells, to target ovarian cancer cells for destruction.

These data provide new insights into how components of the immune system regulate ovarian cancer progression and offer new opportunities to develop improved targeted agents. This includes a repertoire of tumor-derived antibodies that can be effectively used as novel immunotherapeutic agents. In addition, the study provides a mechanistic rationale for integrated antibody responses in the development of novel immunotherapies, which until now have been based on T cell-centric approaches.

“The findings indicate that immunotherapies that boost both coordinated B and T cell responses against ovarian cancer, an immunogenic disease currently resistant to checkpoint inhibitors, are likely to show superior therapeutic benefit,” said Subir Biswas, Ph.D., first author and postdoctoral fellow in the Conejo-Garcia lab.

Finally, the study paves the way for using antibodies other than IgG as immunotherapeutic agents, at least for tumors currently resistant to conventional immune checkpoint blockade.

This study was supported by the National Institutes of Health (P30CA076292, R01CA240434, R01CA157664, R01CA124515, T32CA009140, U01CA200495) and the American Cancer Society (PF-18-041-1-LIB).


Reference: Biswas, S., Mandal, G., Payne, K.K. et al. IgA transcytosis and antigen recognition govern ovarian cancer immunity. Nature (2021). https://doi.org/10.1038/s41586-020-03144-0


Provided by Moffitt Cancer Center & Research Institute

Two Studies Shed Light on How, Where Body Can Add New Fat Cells (Biology)

Gaining more fat cells is probably not what most people want, although that might be exactly what they need to fight off diabetes and other diseases. How and where the body can add fat cells has remained a mystery – but two new studies from UT Southwestern provide answers on the way this process works.

The studies, both published online today in Cell Stem Cell, describe two different processes that affect the generation of new fat cells. One reports how fat cell creation is impacted by the level of activity in tiny organelles inside cells called mitochondria. The other outlines a process that prevents new fat cells from developing in one fat storage area in mice – the area that correlates with the healthy subcutaneous fat just under the skin in humans. (Both studies were done in mice.)

In the second study, a commonly used cancer drug was able to jump-start healthy fat cell creation in mice, a finding that raises the possibility of future drug treatments for humans.

While fat isn’t popular, as long as people overeat they will need a place to store the excess calories, explains Philipp Scherer, Ph.D., director of the Touchstone Center for Diabetes Research at UT Southwestern and senior author of the first study focusing on mitochondria. There are two options, he says: squeezing more lipids (fat) into existing fat cells and ballooning their size, leading to problems such as inflammation and, eventually, diabetes; or creating new fat cells to help spread the load. Fat stored properly – in fat cell layers under the skin (subcutaneous fat) that aren’t overburdened instead of around organs (visceral fat) or even inside organs – is the healthy alternative, he says.

Philipp Scherer, Ph.D. © UTSMC

Problems follow if existing fat cells are left on their own to become engorged, adds Rana Gupta, Ph.D., associate professor of internal medicine and senior author of the second study. “When these cells are so overwhelmed that they can’t take it anymore, they eventually die or become dysfunctional, spilling lipids into places not intended to store fat.”

Those lipids may move into the liver, leading to fatty liver disease; to the pancreas, resulting in diabetes; or even to the heart, causing cardiovascular disease, Gupta says. Visceral, or belly, fat may surround the organs, creating inflammation.

The healthiest place to store fat is in subcutaneous fat, adds Gupta. Ironically, that is where mice in his study were least able to create new fat cells, despite the fact that stem-cell-like progenitor cells primed to become fat cells were present there as well, he says.

Gupta’s study identified a process that prevents progenitor cells from developing into fat cells in mouse subcutaneous inguinal fat.

The protein HIF-1a (short for hypoxia-inducible factor-1 alpha) is central to the process. It kicks off a series of cellular actions that ultimately inactivate a second protein called PPARgamma, the key driver of fat cell formation.

These proteins are found in both humans and mice. In fact, in a culture of human subcutaneous fat cell progenitors, HIF-1a also inhibited new fat cells from being created, according to Gupta.

In Gupta’s mouse study, researchers used a genetic approach to inhibit HIF-1a and found that the progenitor cells could then make subcutaneous inguinal fat cells, and that fewer of those cells were inflamed or fibrotic.

Rana Gupta, Ph.D. © UTSMC

Next, they tested the cancer drug imatinib (brand name Gleevec) and found it had the same effect. The cancer drug was tried because it was known to have beneficial effects against diabetes in cancer patients with both diseases, Gupta says.

In Scherer’s study, researchers manipulated a protein called MitoNEET in the outer membrane of the precursor cells’ mitochondria, organelles known as the cells’ power plants. The resulting mitochondrial dysfunction and drop in cell metabolism caused precursor cells to lose the ability to become new fat cells and increased inflammation.

“This study shows we can manipulate the precursor cells’ willingness to become fat cells,” Scherer says. “The ability to recruit new fat cells by tickling these pre-fat cells to become fat cells is very important and has profound beneficial effects on health, particularly in the obesity-prone environment that we all live in.” 

He says his goal is now to design a drug that could stimulate mitochondrial activity.

“Understanding the mechanism is an important first step,” Scherer says, referring to the findings from the two studies. “We will have to learn in the future how to manipulate these processes pharmacologically.”

Scherer holds the Gifford O. Touchstone, Jr. and Randolph G. Touchstone Distinguished Chair in Diabetes Research, and the Touchstone/West Distinguished Chair in Diabetes Research.

Other UT Southwestern researchers who participated in Scherer’s study include first author and postdoctoral researcher Nolwenn Joffin, plus Vivian A. Paschoal, Christy M. Gliniak, Clair Crewe, Abdallah Elnwasany, Luke I. Szweda, Qianbin Zhang, Christine M. Kusminski, Ruth Gordillo, Dayoung Oh, and Gupta.

UT Southwestern researchers participating in Gupta’s study include first author and assistant instructor Mengle Shao, Chelsea Hepler, Qianbin Zhang, Bo Shan, Lavanya Vishvanath, Gervaise H. Henry, Shangang Zhao, Yu An, and Douglas W. Strand.

Scherer’s study was supported by National Institutes of Health grants R01-DK55758, P01-DK088761, R01-DK099110, P01-AG051459, F32-DK113704, F31-DK113896, R01-DK104789, R01-DK119163, and RC2-DK118620. Joffin was supported by a postdoctoral fellowship from the Lipedema Foundation, LFA no. 18.

Gupta’s study was supported by National Institute of Diabetes and Digestive and Kidney Diseases grants F31DK113696, R01 DK104789, RC2 DK118620, R01 DK119163, R01 DK115477; American Diabetes Association grant 1-17-IBS-181; American Heart Association grants 16POST26420136 and 19CDA34670007; and support from the Japan Society for the Promotion of Science’s Grants-in-aid for Scientific Research (B) 18H02425.

Featured image: An image showing a blood vessel in fat tissue, surrounded by fat progenitor cells (in green). © UTSMC


Provided by UT Southwestern Medical Center

“Ghost Particle” ML Model Permits Full Quantum Description of the Solvated Electron (Physics)

Pinning down the nature of bulk hydrated electrons—extra electrons solvated in liquid water—has proven difficult experimentally because of their short lifetime and high reactivity. Theoretical exploration has been limited by the high level of electronic structure theory needed to achieve predictive accuracy. Now, joint work from teams at the University of Zurich and EPFL and colleagues has resulted in a highly accurate machine-learning (ML) model that is inexpensive enough to allow for a full quantum statistical and dynamical description, giving an accurate determination of the structure, diffusion mechanisms, and vibrational spectroscopy of the solvated electron. This new approach, outlined in the Nature Communications paper Simulating the Ghost: Quantum Dynamics of the Solvated Electron, could also be applied to excited states and quasiparticles such as polarons and would allow for high-accuracy simulations at a moderate price.

The behavior of the solvated electron e-aq has fundamental implications for electrochemistry, photochemistry, high-energy chemistry, as well as for biology—its nonequilibrium precursor is responsible for radiation damage to DNA—and it has understandably been the topic of experimental and theoretical investigation for more than 50 years.  

Though the hydrated electron appears to be simple—it is the smallest possible anion as well as the simplest reducing agent in chemistry—capturing its physics is hard. Hydrated electrons are short-lived and generated in small quantities, and so are impossible to concentrate and isolate. Their structure is therefore impossible to capture with direct experimental observation such as diffraction methods or NMR. Theoretical modelling has turned out to be just as challenging.

Density functional theory (DFT) is the electronic structure method most often used to study the solvated electron and water. Standard density functionals, however, suffer from delocalization error, making it impossible to model radicals accurately. Pure water complicates DFT approximations considerably, though choosing the right functionals can lead to acceptable results compared to high-level electronic structure benchmarks and values that can be observed through experiment. An accurate description of liquid water can also be achieved with many-body quantum chemistry methods, but they are extremely expensive.

A recent picosecond-scale molecular dynamics breakthrough—unprecedented in complexity and requiring computational resources at the limits of what is possible—provided a crucial argument in favor of a cavity structure for e-aq, but it did not yield other new insights or a complete statistical description. Comprehensive characterization of the system’s properties requires far longer timescales, yet simulating quantum nuclei at this level of electronic structure theory is currently beyond computational reach.

The modern way of working around this problem involves machine learning. Training an ML force field or potential energy surface (PES) on ab initio data allows for much longer MD simulations, because the cost of evaluating the learned energies and forces is almost negligible compared with that of electronic structure calculations. The problem is that the solvated electron is an atypical species: it has no atomistic formula, which is an obstacle because ML PESs work with atomistic representations.
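The force-field idea can be sketched in miniature. The toy example below illustrates the general ML-potential workflow only (it is not the descriptors or model used in the paper): an "expensive" reference function stands in for the ab initio method, is called only at a handful of training geometries, and a kernel ridge regression fitted to those points can then be evaluated almost for free along a long trajectory.

```python
import numpy as np

# Stand-in for an expensive ab initio call: a smooth Morse-like 1D potential.
def reference_energy(r):
    return (1.0 - np.exp(-1.5 * (r - 1.0))) ** 2

# "Ab initio" training set: a modest number of geometries.
r_train = np.linspace(0.6, 2.5, 30)
e_train = reference_energy(r_train)

def gaussian_kernel(a, b, sigma=0.3):
    """Gaussian similarity between two sets of 1D geometries."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * sigma ** 2))

# Fit kernel ridge regression: solve (K + lambda*I) alpha = E.
K = gaussian_kernel(r_train, r_train)
alpha = np.linalg.solve(K + 1e-6 * np.eye(len(r_train)), e_train)

def ml_energy(r):
    """Cheap surrogate evaluation of the fitted potential energy surface."""
    return gaussian_kernel(np.atleast_1d(r), r_train) @ alpha

# The surrogate reproduces the reference surface between training points.
r_test = np.linspace(0.7, 2.3, 50)
err = np.max(np.abs(ml_energy(r_test) - reference_energy(r_test)))
```

Once fitted, `ml_energy` can be called millions of times in an MD loop at negligible cost, which is what makes the hundred-picosecond and path-integral simulations described below affordable; production ML potentials replace the 1D coordinate here with many-body atomistic descriptors.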

In the paper Simulating the Ghost: Quantum Dynamics of the Solvated Electron, University of Zurich researcher Vladimir Rybkin, doctoral student Jinggang Lan and lecturer Marcella Iannuzzi combined their expertise in electronic structure and solvated electrons with the machine-learning and quantum-dynamics expertise of EPFL professor Michele Ceriotti and his former PhD students Venkat Kapil, now a researcher at Cambridge University, and Piero Gasparotto, now a researcher at Empa. That collaboration, with contributions from other colleagues, resulted in the application of the ML approach to data acquired with a many-body quantum chemistry method known as second-order Møller–Plesset perturbation theory (MP2), which gives an accurate description of water without any special treatment of the excess electron.

They were surprised to discover that the model was able to learn the presence of the solvated electron as a factor that distorts the structure of pure liquid water. The dynamics run with the resulting ML PES not only recovered the stable cavity but also traced the correct localization dynamics, starting from a delocalized excess electron added to the water. In the end, ML simulated the electron as a sort of “ghost particle” that was not explicitly present in the model.

This allowed the researchers to reach timescales of several hundred picoseconds, collect reliable statistics by running many computationally cheap classical trajectories, and compute vibrational spectra, structures, and diffusion. The ML approach also allowed them to simulate quantum rather than classical nuclei with path-integral molecular dynamics (PIMD). This technique is at least an order of magnitude more expensive computationally than classical MD and cannot be carried out at a high level of electronic structure theory without an ML PES.

Taking the nuclear quantum effects into account delivered accurate vibrational spectra, allowing the researchers to quantify the impact of these effects—already shown to be very important in the relaxation dynamics of the excess electron—on the hydrated electron. It also revealed transient diffusion, an unusual, rare event that is not present in the classical regime. While non-transient diffusion of the solvated electron is achieved by solvent exchange followed by gradual displacement of the “electron cloud” or spin density distribution, transient diffusion is rather a jump of the spin density from the stable cavity to the adjacent one.  

While the ghost particle approach was applied here to the solvated electron, it could also be applied to excited states and quasiparticles such as polarons, opening up new opportunities for uniting high-level electronic structure theory with machine learning to achieve highly accurate dynamics simulations at a moderate price.

Featured image: The dynamics run with the resulting ML PES was not only able to recover the stable cavity, but could also trace the correct localization dynamics © Vladimir Rybkin


Reference: J. Lan, V. Kapil, P. Gasparotto, M. Ceriotti, M. Iannuzzi, and V. Rybkin, “Simulating the ghost: quantum dynamics of the solvated electron,” Nat Commun 12, 766 (2021). https://doi.org/10.1038/s41467-021-20914-0


Provided by NCCR Marvel

An Innovative and Non-destructive Strategy to Analyse Material from Mars (Planetary Science)

The IBeA research group has proposed an innovative strategy that can be used to characterise samples from the Mars Sample Return mission

The UPV/EHU’s IBeA research group, which includes experts in Raman spectroscopy, is currently analysing meteorites with the aim of developing non-destructive analytical strategies for upcoming explorations of Mars materials by the Perseverance rover, shortly due to arrive at the red planet. The strategies will also be used to examine materials collected by the Rosalind Franklin rover and returned to Earth following the Mars Sample Return mission, scheduled to commence in 2026.

The IBeA research group from the University of the Basque Country’s Department of Analytical Chemistry, Faculty of Science and Technology, is participating in NASA’s Mars2020 space mission, which is scheduled to touch down on Mars in February this year. Specifically, the group has participated in constructing and verifying the chemical homogeneity of the templates included on the calibration card of the SuperCam instrument mounted on the Perseverance. ‘We made a set of pads perfectly characterised in accordance with the instruments we have here, in order to enable us to verify that the LIBS and Raman spectroscopy measurements taken by the SuperCam are correct,’ explains Doctor Cristina García-Florentino. ‘Raman spectroscopy is a technique for determining the molecular composition of unknown samples,’ she continues. ‘In other words, not only can we determine, for example, whether the sample contains calcium or iron, etc., we can also identify the molecular form in which they are present. Thus, we can see whether they contain calcite or gypsum, for example. We can determine the geochemical composition of the planet’.
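The kind of identification Dr García-Florentino describes can be sketched as matching an observed Raman band against a library of reference positions. In the minimal sketch below, the reference values are approximate literature positions for the strongest Raman bands of calcite and gypsum, not mission data, and the matching logic is purely illustrative:

```python
# Approximate literature positions of the strongest Raman band of each
# mineral, in wavenumbers (cm^-1). Real pipelines use full spectral libraries.
REFERENCE_BANDS = {
    "calcite": 1086.0,  # nu1 symmetric stretch of the carbonate ion
    "gypsum": 1008.0,   # nu1 symmetric stretch of the sulfate ion
}

def identify_phase(peak_cm1, tolerance=5.0):
    """Return the mineral whose reference band lies closest to the observed
    peak, or None if nothing matches within the given tolerance."""
    best = min(REFERENCE_BANDS, key=lambda m: abs(REFERENCE_BANDS[m] - peak_cm1))
    if abs(REFERENCE_BANDS[best] - peak_cm1) <= tolerance:
        return best
    return None
```

A band observed near 1086 cm⁻¹ would thus point to calcite, one near 1008 cm⁻¹ to gypsum, and an unmatched band would flag a phase outside the library—information that also constrains which elements are bound in which molecular form.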

Cristina García-Florentino. Photo: Tere Ormazabal. UPV/EHU

At the same time, the research group is also working on characterising meteorites, with a twofold objective: ‘Firstly, to get ready for the information that may be sent from Mars by the Perseverance rover; and secondly, to develop non-destructive analytical strategies for characterising Martian samples from the return mission (Mars Sample Return mission) when it reaches Earth’. To date, Martian meteorites have been the only Martian samples available for developing different analysis methods. In a recent study, the group has proposed an innovative non-destructive analytical strategy that could be added to the current arsenal of fast analysis techniques which can be used with future samples.

To demonstrate its capabilities, the group used its analytical proposal to ‘characterise the Martian meteorite Dar al Gani 735, with the aim of identifying the terrestrial and non-terrestrial alterations it has undergone, as a very valuable complement to the more traditional petrographic analyses,’ explains Dr García-Florentino.

Access may be uncertain

In the researcher’s opinion, ‘this study demonstrates the potential of Raman spectroscopy as a key technique in the new upcoming explorations of Mars materials by the Rosalind Franklin rover (the ESA’s Exomars2022 mission) and the Perseverance rover (NASA’s Mars2020 mission), on which Raman spectrometers will be mounted for the first time in an extra-terrestrial research mission in the field’. According to Dr García-Florentino, the technique is important ‘because, once we have samples brought back directly from Mars, we cannot destroy them to analyse them in the initial stages of study. It is therefore important to be ready for when the Martian samples arrive, in order to gain as much information from them as possible, with the fewest possible errors and trying to destroy them as little as possible’. Nevertheless, the researcher warns that access to the information and to the samples themselves will be difficult: ‘We still do not know whether we will be granted access to the samples, whether they will allow us to analyse them as we propose here with the techniques we have developed’. Meanwhile, the IBeA group will continue its work, ‘because each meteorite is a world unto itself; each meteorite is totally different from all others’.

Featured image: Image of the distribution of certain elements from one of the meteorites analysed by the group
© IBeA / UPV/EHU


Reference: C. García-Florentino, I. Torre-Fdez, P. Ruiz-Galende, J. Aramendia, K. Castro, G. Arana, M. Maguregui, S. Fdz. Ortiz de Vallejuelo, J. M. Madariaga, “Development of innovative non-destructive analytical strategies for Mars Sample Return tested on Dar al Gani 735 Martian Meteorite“, Talanta, 2021. DOI: 10.1016/j.talanta.2020.121863


Provided by University of Basque Country

Quantum Tunneling in Graphene Advances the Age of Terahertz Wireless Communications (Physics)

Scientists from MIPT, Moscow Pedagogical State University and the University of Manchester have created a highly sensitive terahertz detector based on the effect of quantum-mechanical tunneling in graphene. The sensitivity of the device is already superior to commercially available analogs based on semiconductors and superconductors, which opens up prospects for applications of the graphene detector in wireless communications, security systems, radio astronomy, and medical diagnostics. The results are published in the journal Nature Communications.

Information transfer in wireless networks is based on transforming a high-frequency continuous electromagnetic wave into a discrete sequence of bits. This technique is known as signal modulation. To transfer bits faster, one has to increase the modulation frequency, which in turn requires a synchronous increase in carrier frequency. A common FM radio station transmits at frequencies of around a hundred megahertz, a Wi-Fi receiver uses signals of roughly five gigahertz, while 5G mobile networks can transmit signals up to 20 gigahertz. This is far from the limit, and a further increase in carrier frequency permits a proportional increase in data transfer rates. Unfortunately, picking up signals at frequencies of a hundred gigahertz and higher is an increasingly challenging problem.
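The scaling described above can be sketched with the Shannon–Hartley limit; the 10% fractional bandwidth and 20 dB signal-to-noise ratio below are illustrative assumptions, not figures from the study:

```python
# If the usable modulation bandwidth is a fixed fraction of the carrier
# frequency, raising the carrier raises the achievable bit rate
# proportionally (Shannon-Hartley: C = B * log2(1 + SNR)).
from math import log2

def shannon_capacity_bps(carrier_hz, fractional_bw=0.10, snr_db=20.0):
    bandwidth_hz = fractional_bw * carrier_hz
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * log2(1 + snr_linear)

for name, f in [("FM radio", 100e6), ("Wi-Fi", 5e9),
                ("5G", 20e9), ("sub-THz", 300e9)]:
    print(f"{name:8s} carrier {f/1e9:6.1f} GHz -> "
          f"~{shannon_capacity_bps(f)/1e9:7.2f} Gbit/s")
```

Under these assumptions a sub-terahertz carrier supports data rates tens of times higher than 5G, which is exactly why detectors that work at such frequencies matter.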

A typical receiver used in wireless communications consists of a transistor-based amplifier of weak signals and a demodulator that recovers the bit sequence from the modulated signal. This scheme originated in the age of radio and television, but becomes inefficient at the frequencies of hundreds of gigahertz desirable for mobile systems: most existing transistors simply cannot recharge at such high frequencies.

An evolutionary way to solve this problem is simply to increase the maximum operating frequency of the transistor, and most specialists in nanoelectronics work hard in this direction. A revolutionary way was theoretically proposed in the early 1990s by physicists Michael Dyakonov and Michael Shur, and realized, among others, by this group of authors in 2018. It abandons both active amplification by the transistor and the separate demodulator. What is left in the circuit is a single transistor, but its role is now different: it transforms the modulated signal into a bit sequence or voice signal by itself, thanks to the non-linear relation between its current and voltage.

In the present work, the authors have shown that the detection of a terahertz signal is very efficient in a so-called tunneling field-effect transistor. To understand how it works, recall the principle of an electromechanical relay, where the passage of current through control contacts leads to a mechanical connection between two conductors and, hence, to the emergence of current. In a tunneling transistor, applying voltage to the control contact (termed the “gate”) aligns the energy levels of the source and channel, which likewise leads to the flow of current. A distinctive feature of a tunneling transistor is its very strong sensitivity to the control voltage: even a small “detuning” of the energy levels is enough to interrupt the subtle process of quantum-mechanical tunneling, and, similarly, a small voltage at the gate is able to “connect” the levels and initiate the tunneling current.

“The idea of a strong reaction of a tunneling transistor to low voltages has been known for about fifteen years,” says Dr. Dmitry Svintsov, one of the authors of the study and head of the laboratory for optoelectronics of two-dimensional materials at the MIPT center for photonics and 2D materials. “But it has been known only in the community of low-power electronics. No one realized before us that the same property of a tunneling transistor can be applied in the technology of terahertz detectors. Georgy Alymov (co-author of the study) and I were lucky to work in both areas. We realized: if the transistor is opened and closed at a low power of the control signal, then it should also be good at picking up weak signals from the ambient surroundings.”

The created device is based on bilayer graphene, a unique material in which the position of energy levels (more strictly, the band structure) can be controlled using an electric voltage. This allowed the authors to switch between classical transport and quantum tunneling transport within a single device, with just a change in the polarities of the voltage at the control contacts. This possibility is of extreme importance for an accurate comparison of the detecting ability of a classical and quantum tunneling transistor.

The experiment showed that the sensitivity of the device in the tunneling mode is a few orders of magnitude higher than in the classical transport mode. The minimum signal distinguishable by the detector against the noisy background already competes with that of commercially available superconducting and semiconductor bolometers. However, this is not the limit: the sensitivity can be further increased in “cleaner” devices with a low concentration of residual impurities. The detection theory the authors developed, tested against the experiment, shows that the sensitivity of an “optimal” detector could be a hundred times higher.

“The current characteristics give rise to great hopes for the creation of fast and sensitive detectors for wireless communications,” says Dr. Denis Bandurin, one of the authors of the work. “And this area is not limited to graphene, nor to tunnel transistors. We expect that, with the same success, a remarkable detector could be created based, for example, on an electrically controlled phase transition. Graphene turned out to be just a good launching pad here, a door behind which lies a whole world of exciting new research.”

The results presented in this paper are an example of a successful collaboration between several research groups. The authors note that it is this format of work that allows them to obtain world-class scientific results. For example, the same team of scientists earlier demonstrated how waves in the electron sea of graphene can contribute to the development of terahertz technology. “In an era of rapidly evolving technology, it is becoming increasingly difficult to achieve competitive results,” comments Dr. Georgy Fedorov, deputy head of the nanocarbon materials laboratory at MIPT. “Only by combining the efforts and expertise of several groups can we successfully tackle the most difficult tasks and achieve the most ambitious goals, which we will continue to do.”

Featured image: Quantum tunneling © Daria Sokol/MIPT Press Office


Reference: Gayduchenko, I., Xu, S.G., Alymov, G. et al. Tunnel field-effect transistors for sensitive terahertz detection. Nat Commun 12, 543 (2021). https://doi.org/10.1038/s41467-020-20721-z


Provided by Moscow Institute of Physics and Technology

True Identity of Mysterious Gamma-ray Source Revealed (Planetary Science)

An international research team including members from The University of Manchester has shown that a rapidly rotating neutron star is at the core of a celestial object now known as PSR J2039-5617.

The international collaboration used novel data analysis methods and the enormous computing power of the citizen science project Einstein@Home to track down the neutron star’s faint gamma-ray pulsations in data from NASA’s Fermi Space Telescope. Their results show that the pulsar is in orbit with a stellar companion about a sixth of the mass of our Sun. The pulsar is slowly but surely evaporating this star. The team also found that the companion’s orbit varies slightly and unpredictably over time. Using their search method, they expect to find more such systems with Einstein@Home in the future.

Searching for the so-called ‘spider’ pulsar systems, rapidly spinning neutron stars whose high-energy outflows are destroying their binary companion stars, required 10 years of precise data. The pulsars have been given the arachnid names ‘black widow’ or ‘redback’, after species of spider whose females have been seen to kill the smaller males after mating.

New research published in Monthly Notices of the Royal Astronomical Society details how researchers found a neutron star rotating 377 times a second in an exotic binary system using data from NASA’s Fermi Space Telescope.

The astronomers’ findings were uniquely boosted by the Einstein@Home project, a network of thousands of volunteers lending their home computing power to the analysis of Fermi Telescope data.

The group’s search required combing very finely through the data in order not to miss any possible signals, and the computing power required is enormous: the search would have taken 500 years to complete on a single computer core. Using a fraction of the Einstein@Home resources, it was done in two months.
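The speedup is quick arithmetic to check (the core-equivalent figure is our own estimate derived from the article's numbers, and the spin period follows from the 377 rotations per second quoted above):

```python
# 500 core-years of work finished in ~2 months implies roughly how many
# cores' worth of donated computing?
single_core_years = 500
elapsed_years = 2 / 12  # two months

equivalent_cores = single_core_years / elapsed_years
print(f"~{equivalent_cores:,.0f} core-equivalents")  # ~3,000

# 377 rotations per second corresponds to a spin period of
spin_period_ms = 1000 / 377
print(f"spin period ~{spin_period_ms:.2f} ms")  # ~2.65 ms
```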

With the computing power donated by the Einstein@Home volunteers, the team discovered gamma-ray pulsations from the rapidly rotating neutron star. This gamma-ray pulsar, now known as J2039-5617, rotates about 377 times each second.

“It had been suspected for years that there is a pulsar, a rapidly rotating neutron star, at the heart of the source we now know as PSR J2039-5617,” says Lars Nieder, a PhD student at the Max Planck Institute for Gravitational Physics (Albert Einstein Institute; AEI) in Hannover. “But it was only possible to lift the veil and discover the gamma-ray pulsations with the computing power donated by tens of thousands of volunteers to Einstein@Home,” he adds.

The celestial object has been known since 2014 as a source of X-rays, gamma rays, and light. All evidence obtained so far pointed at a rapidly rotating neutron star in orbit with a light-weight star being at the heart of the source. But clear proof was missing.

The first step to solving this riddle was new observations of the stellar companion with optical telescopes. They provided precise knowledge about the binary system, without which a gamma-ray pulsar search (even with Einstein@Home’s huge computing power) would be unfeasible.

The system’s brightness varies during an orbital period depending on which side of the neutron star’s companion is facing the Earth. “For J2039-5617, there are two main processes at work,” explains Dr. Colin Clark from Jodrell Bank Centre for Astrophysics, lead author of the study. “The pulsar heats up one side of the light-weight companion, which appears brighter and more bluish. Additionally, the companion is distorted by the pulsar’s gravitational pull, causing the apparent size of the star to vary over the orbit.” These observations allowed the team to get the most precise measurement possible of the binary star’s 5.5-hour orbital period, as well as other properties of the system.

With this information and the precise sky position from Gaia data, the team used the aggregated computing power of the distributed volunteer computing project Einstein@Home for a new search of about 10 years of archival observations of NASA’s Fermi Gamma-ray Space Telescope. Improving on earlier methods they had developed for this purpose, they enlisted the help of tens of thousands of volunteers to search Fermi data for periodic pulsations in the gamma-ray photons registered by the Large Area Telescope onboard the space telescope. The volunteers donated idle compute cycles on their computers’ CPUs and GPUs to Einstein@Home.

The new knowledge of the frequency of the gamma-ray pulsations also allowed collaborators to detect radio pulsations in archival data from the Parkes radio telescope. Their results, also published in Monthly Notices of the Royal Astronomical Society, show that the pulsar’s radio emission is often eclipsed by material that has been blown off the companion star by its nearby Redback pulsar.

Featured image: Artist’s impression of PSR J2039-5617 and its companion. The binary system consists of a rapidly rotating neutron star © Knispel/Clark/Max Planck Institute for Gravitational Physics/NASA GSFC


Reference: C J Clark, L Nieder, G Voisin, B Allen, C Aulbert, O Behnke, R P Breton, C Choquet, A Corongiu, V S Dhillon, H B Eggenstein, H Fehrmann, L Guillemot, A K Harding, M R Kennedy, B Machenschalk, T R Marsh, D Mata Sánchez, R P Mignani, J Stringer, Z Wadiasingh, J Wu, Einstein@Home discovery of the gamma-ray millisecond pulsar PSR J2039–5617 confirms its predicted redback nature, Monthly Notices of the Royal Astronomical Society, Volume 502, Issue 1, March 2021, Pages 915–934, https://doi.org/10.1093/mnras/staa3484


Provided by University of Manchester

Discoveries at the Edge of the Periodic Table: First Ever Measurements of Einsteinium (Chemistry)

Experiments by Berkeley Lab scientists on this highly radioactive element reveal some unexpected properties

Since element 99 – einsteinium – was discovered in 1952 at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) from the debris of the first hydrogen bomb, scientists have performed very few experiments with it because it is so hard to create and is exceptionally radioactive. A team of Berkeley Lab chemists has overcome these obstacles to report the first study characterizing some of its properties, opening the door to a better understanding of the remaining transuranic elements of the actinide series.

Published in the journal Nature, the study, “Structural and Spectroscopic Characterization of an Einsteinium Complex,” was co-led by Berkeley Lab scientist Rebecca Abergel and Los Alamos National Laboratory scientist Stosh Kozimor, and included scientists from the two laboratories, UC Berkeley, and Georgetown University, several of whom are graduate students and postdoctoral fellows. With less than 250 nanograms of the element, the team measured the first-ever einsteinium bond distance, a basic property of an element’s interactions with other atoms and molecules.

“There’s not much known about einsteinium,” said Abergel, who leads Berkeley Lab’s Heavy Element Chemistry group and is an assistant professor in UC Berkeley’s Nuclear Engineering department. “It’s a remarkable achievement that we were able to work with this small amount of material and do inorganic chemistry. It’s significant because the more we understand about its chemical behavior, the more we can apply this understanding for the development of new materials or new technologies, not necessarily just with einsteinium, but with the rest of the actinides too. And we can establish trends in the periodic table.”

Short-lived and hard to make

Abergel and her team used experimental facilities not available decades ago when einsteinium was first discovered – the Molecular Foundry at Berkeley Lab and the Stanford Synchrotron Radiation Lightsource (SSRL) at SLAC National Accelerator Laboratory, both DOE Office of Science user facilities – to conduct luminescence spectroscopy and X-ray absorption spectroscopy experiments.

But first, getting the sample in a usable form was almost half the battle. “This whole paper is a long series of unfortunate events,” she said wryly.

Berkeley Lab scientists Jennifer Wacker (from left), Leticia Arnedo-Sanchez, Korey Carter, and Katherine Shield work in the chemistry lab of Rebecca Abergel. © Marilyn Sargent/Berkeley Lab

The material was made at Oak Ridge National Laboratory’s High Flux Isotope Reactor, one of only a few places in the world that is capable of making einsteinium, which involves bombarding curium targets with neutrons to trigger a long chain of nuclear reactions. The first problem they encountered was that the sample was contaminated with a significant amount of californium, as making pure einsteinium in a usable quantity is extraordinarily challenging.

So they had to scrap their original plan to use X-ray crystallography – which is considered the gold standard for obtaining structural information on highly radioactive molecules but requires a pure sample of metal – and instead came up with a new way to make samples and leverage element-specific research techniques. Researchers at Los Alamos provided critical assistance in this step by designing a sample holder uniquely suited to the challenges intrinsic to einsteinium.

Then, contending with radioactive decay was another challenge. The Berkeley Lab team conducted their experiments with einsteinium-254, one of the more stable isotopes of the element. It has a half-life of 276 days, which is the time for half of the material to decay. Although the team was able to conduct many of the experiments before the coronavirus pandemic, their plans for follow-up experiments were interrupted by pandemic-related shutdowns. By the time they were able to get back into their lab last summer, most of the sample was gone.
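The decay arithmetic behind that loss follows directly from the 276-day half-life (the shutdown durations below are our illustrative choices, not figures from the article):

```python
# With half-life t_half, the surviving fraction after time t is
# 0.5 ** (t / t_half).
def fraction_remaining(days, half_life_days=276.0):
    return 0.5 ** (days / half_life_days)

for months in (3, 6, 12):
    days = months * 30
    print(f"after ~{months:2d} months: "
          f"{fraction_remaining(days) * 100:5.1f}% of the Es-254 remains")
```

A months-long interruption thus costs a substantial slice of an already sub-250-nanogram sample, which is why the shutdown was so damaging.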

Bond distance and beyond

Still, the researchers were able to measure a bond distance with einsteinium and also discovered some physical chemistry behavior that was different from what would be expected from the actinide series, which are the elements on the bottom row of the periodic table.

“Determining the bond distance may not sound interesting, but it’s the first thing you would want to know about how a metal binds to other molecules. What kind of chemical interaction is this element going to have with other atoms and molecules?” Abergel said.

Once scientists have this picture of the atomic arrangement of a molecule that incorporates einsteinium, they can try to find interesting chemical properties and improve understanding of periodic trends. “By getting this piece of data, we gain a better, broader understanding of how the whole actinide series behaves. And in that series, we have elements or isotopes that are useful for nuclear power production or radiopharmaceuticals,” she said.

Tantalizingly, this research also offers the possibility of exploring what is beyond the edge of the periodic table, and possibly discovering a new element. “We’re really starting to understand a little better what happens toward the end of the periodic table, and the next thing is, you could also envision an einsteinium target for discovering new elements,” Abergel said. “Similar to the latest elements that were discovered in the past 10 years, like tennessine, which used a berkelium target, if you were to be able to isolate enough pure einsteinium to make a target, you could start looking for other elements and get closer to the (theorized) island of stability,” where nuclear physicists have predicted isotopes may have half-lives of minutes or even days, instead of the microsecond or less half-lives that are common in the superheavy elements.

Featured image: Berkeley Lab scientists Leticia Arnedo-Sanchez (from left), Katherine Shield, Korey Carter, and Jennifer Wacker had to take precautions against radioactivity as well as coronavirus to conduct experiments with the rare element, einsteinium. © Marilyn Sargent/Berkeley Lab


Reference: Carter, K.P., Shield, K.M., Smith, K.F. et al. Structural and spectroscopic characterization of an einsteinium complex. Nature 590, 85–88 (2021). https://doi.org/10.1038/s41586-020-03179-3


Provided by Berkeley Lab