Getting To The Core of HIV Replication (Biology)

HIV-1 viral capsid simulations on the XSEDE-allocated Stampede2, Bridges, and Darwin systems uncover a nucleotide entry mechanism

Viruses lurk in the grey area between the living and the nonliving, according to scientists. Like living things, they replicate but they don’t do it on their own. The HIV-1 virus, like all viruses, needs to hijack a host cell through infection in order to make copies of itself.

Supercomputer simulations supported by the National Science Foundation-funded Extreme Science and Engineering Discovery Environment (XSEDE) have helped uncover the mechanism for how the HIV-1 virus imports into its core the nucleotides it needs to fuel DNA synthesis, a key step in its replication. It’s the first known example of a virus recruiting small molecules from the cellular environment into its core to conduct a process beneficial for its life cycle.

Cooperative binding of small molecules to the central hexamer cavity. Two-dimensional free energy landscapes of deoxyadenosine triphosphate translocation through the cavity, in the presence of an additional IP6. Pathways connecting the interior and exterior are shown as a dashed line, with representative structures corresponding to translocation events. Credit: Xu, et al.

The computational biophysics research, published December 2020 in PLOS Biology, challenges the prevailing view of the viral capsid, long considered to be just a static envelope housing the genetic material of the HIV-1 virus.

“To my knowledge, it’s the first piece of work that comprehensively shows an active role of the capsids in regulating a very specific lifecycle of the virus, not only computationally, but also in vitro assays and ultimately in the cells,” said study co-author Juan R. Perilla, a biophysical chemist at the University of Delaware.

The research team collaborated with several research groups, including experimental groups at the University of Pittsburgh School of Medicine and the Harvard Medical School. These groups validated the predictions from molecular dynamics (MD) simulations by using atomic force microscopy and transmission electron microscopy.

“For our part, we used MD simulations,” said lead author Chaoyi Xu, a graduate student in the Perilla Lab. “We studied how the HIV capsid allows permeability to small molecules, including nucleotides, IP6, and others.” IP6 is a metabolite that helps stabilize the HIV-1 capsid.

It’s rare for a computational paper to be in a biology journal, explained Perilla. “The reason this is possible is that we are discovering new biology,” he said. The biology relates to the ability of the virus to import small molecules that it needs for certain metabolic pathways. “In the context of HIV, it’s the fuel for the reverse transcription that occurs inside of the capsid.”

The enzyme reverse transcriptase generates complementary DNA, one strand of the DNA that pairs up in the cell to complete the full invading viral DNA. The viral DNA enters the host cell nucleus, integrates into the host cell DNA, and uses the cell’s machinery to crank out new viral DNA.

“In this series of experiments and computational predictions, what we have shown is that the capsid itself plays an active role in the infective cycle,” Perilla said. “It regulates the reverse transcription — how the viral DNA synthesizes inside of the capsid.” He explained that these processes are the result of millions of years of co-evolution between the virus and the target cell.

“Without supercomputers, the computational part of the study would have been impossible,” added Xu. The challenge was that the biological problem of nucleotide translocation would require a longer timescale than would be possible to sample using atomistic molecular dynamics simulations.

Chaoyi Xu (left) and Juan R. Perilla (bottom), Department of Chemistry & Biochemistry, University of Delaware. Jorge Salazar, TACC (right).

Instead, the researchers used a technique called umbrella sampling coupled with Hamiltonian replica exchange. “The advantage of using this technique is that we can separate the whole translocation process into small windows,” Xu said. In each small window, they ran individual small MD simulations in parallel on supercomputers.

“By using the resources provided from XSEDE, we were able to run and not only test the translocation processes, but also the effects of small molecules binding on the translocation process by comparing the free energy differences calculated from our results.”
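As a rough sketch of the windowed setup Xu describes (illustrative only, not the authors’ input files; the coordinate range, window count, and force constant below are assumed values), the translocation coordinate can be divided into evenly spaced windows, each held near its center by a harmonic bias and simulated as an independent, parallel MD job:

```python
# Minimal umbrella-sampling sketch (assumed values, not the authors' setup): split the
# translocation coordinate into overlapping windows, each with a harmonic bias, so that
# one short MD simulation per window can run in parallel on the cluster.
import numpy as np

z_min, z_max = -30.0, 30.0   # assumed range of the pore-axis coordinate, in angstroms
n_windows = 40               # number of umbrella windows (one independent MD job each)
k_bias = 2.5                 # assumed harmonic force constant, kcal/mol per angstrom^2

centers = np.linspace(z_min, z_max, n_windows)

def bias_energy(z, z0, k=k_bias):
    """Harmonic restraint added to the force field in the window centered at z0."""
    return 0.5 * k * (z - z0) ** 2

for i, z0 in enumerate(centers):
    # Neighboring windows can periodically swap biases (Hamiltonian replica exchange)
    # to improve sampling; the biased histograms are later combined (e.g. by WHAM)
    # into a free energy profile along the coordinate.
    print(f"window {i:02d}: center = {z0:6.2f} angstrom, "
          f"bias = 0.5 * {k_bias} * (z - {z0:.2f})^2")
```

Reweighting and combining the per-window histograms is what yields the free energy differences Xu mentions, without ever simulating the full translocation in a single long trajectory.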

XSEDE awarded Perilla and his lab access to two supercomputing systems used in the HIV capsid research: Stampede2 at the Texas Advanced Computing Center (TACC); and Bridges at the Pittsburgh Supercomputing Center (PSC).

“TACC and PSC have been extremely generous to us and very supportive,” Perilla said.

“When I transferred from Stampede1 to Stampede2, the hardware was a big improvement. At the time, we were fascinated with the Intel Xeon Skylake nodes. They were fantastic,” Perilla said.

“On Bridges, we took advantage of the high memory nodes. They have these massive memory machines with 3 and 12 terabytes of memory. They’re really good for analysis. Bridges provides a very unique service to the community,” he continued.

Molecular mechanism for nucleotide translocation through the HIV-1 CA hexamer. (a) Nucleotide diffuses between the capsid exterior and central cavity. (b) Nucleotide binds to Arg18 and Lys25. (c) A second nucleotide enters. (d) The phosphate group of the second nucleotide interacts with Arg18. (e) The second nucleotide enhances interactions between Lys25 and the first nucleotide. (f) Thermal fluctuations facilitate dissociation of the dNTP. (g) The second nucleotide occupies the canonical binding position (as in b) for a single nucleotide in the cavity. Credit: Xu, et al.

In related work, the Perilla Lab has also used the PSC Bridges-AI system through XSEDE, and the lab has been part of the early user science program for PSC’s Bridges-2 platform.

“We’ve enjoyed this early science period on Bridges-2,” Perilla said. “The experts at PSC want us to hammer the machine as much as we can, and we’re happy to do that. We have a lot of work that needs to be done.”

Perilla related that the XSEDE Campus Champion program has also helped in his mission of training the next generation of computational scientists. The program enlists 600+ faculty and staff at more than 300 universities to help students, faculty, and postdocs take full advantage of XSEDE’s cyberinfrastructure resources.

“We received an immense amount of help from our XSEDE Campus Champion, Anita Schwartz,” Perilla said. “She helped us with everything related to XSEDE. The younger members of our lab also took advantage of the training opportunities offered by XSEDE.”

Xu recalled finding them helpful for learning how to get started on XSEDE supercomputers, and also for learning the Simple Linux Utility for Resource Management (SLURM), the job scheduling and workload management system used on the supercomputers.

“By taking these courses, I familiarized myself with using these supercomputers, and also with using them to solve our research questions,” Xu said.

What’s more, in December 2020 the University of Delaware launched the Darwin supercomputer, a new XSEDE-allocated resource.

Stampede2 at TACC (left) and Bridges at PSC (right) are allocated through the NSF-funded Extreme Science and Engineering Discovery Environment.

“The students in the group have had the opportunity to train on these fantastic machines provided by XSEDE, and they’re now at the point that they’re expanding that knowledge to other researchers on campus and explaining the details of how to make the best use of the resource,” Perilla said. “And now that we have an XSEDE resource here on campus, it’s helping us create a local community that is as passionate about high performance computing as we are.”

Perilla sees this latest work on the HIV-1 capsid as providing a new target for therapeutic development. Because there is no cure for HIV and the virus keeps developing drug resistance, there’s a constant need to optimize anti-retroviral drugs.

Said Perilla: “We’re very enthusiastic about supercomputers and what they can do, the scientific questions they allow us to pose. We want to reproduce biology. That’s the ultimate goal of what we do and what supercomputers enable us to do.”

The study, “Permeability of the HIV-1 capsid to metabolites modulates viral DNA synthesis,” was published December 17, 2020 in the journal PLOS Biology. The authors are Chaoyi Xu, Brent Runge, Roman Zadorozhnyi, Tatyana Polenova, and Juan R. Perilla of the University of Delaware; Douglas K. Fischer, Jinwoo Ahn, and Zandrea Ambrose of the University of Pittsburgh School of Medicine; Wen Li and Alan N. Engelman of Harvard Medical School; Sanela Rankovic and Itay Rousso of Ben-Gurion University of the Negev; Robert A. Dick of Cornell University; and Christopher Aiken of Vanderbilt University Medical Center. This work was supported by the US National Institutes of Health grants P50AI1504817, P20GM104316, R01AI147890, R01AI070042, and R01AI107013.

Featured image: The HIV-1 virus has evolved a way to import into its core the nucleotides it needs to fuel DNA synthesis, according to research led by Juan R. Perilla at the University of Delaware. Using the TACC Stampede2 and PSC Bridges supercomputers, Perilla’s team has shown for the first time that a virus recruits small molecules from the cellular environment into its core to conduct a process beneficial for its life cycle. Credit: Xu, et al.


Provided by University of Texas

Decades of Hunting Detects Footprint of Cosmic Ray Superaccelerators in Our Galaxy (Astronomy)

Record-breaking gamma rays bathe the Milky Way in an energetic haze

An enormous telescope complex in Tibet has captured the first evidence of ultrahigh-energy gamma rays spread across the Milky Way. The findings offer proof that undetected starry accelerators churn out cosmic rays, which have floated around our galaxy for millions of years. The research is to be published in the journal Physical Review Letters on Monday, April 5.

“We found 23 ultrahigh-energy cosmic gamma rays along the Milky Way,” said Kazumasa Kawata, a coauthor from the University of Tokyo. “The highest energy among them amounts to a world record: nearly one petaelectron volt.”

That’s three orders of magnitude greater than any known cosmic-ray-induced gamma ray – or any particle humans have accelerated in state-of-the-art laboratories on Earth.

Since 1990, dozens of researchers from China and Japan have hunted for the elusive high-energy cosmic gamma rays. The Tibet ASγ Collaboration made its discovery using nearly 70,000 square meters of ground arrays and underground muon detectors on the Tibetan Plateau, sitting more than 14,000 feet above sea level.

“Scientists believe high energy gamma rays can be produced by the nuclear interaction between high energy cosmic rays escaping from the most powerful galactic sources and interstellar gas in the Milky Way galaxy,” said Huang Jing, a coauthor from Institute of High Energy Physics, Chinese Academy of Sciences.

Chen Ding of the National Astronomical Observatories, Chinese Academy of Sciences, another coauthor, added, “The detection of diffuse gamma rays above 100 teraelectron volts is a key to understanding the origin of very-high-energy cosmic rays, which has been a mystery since their discovery in 1912.”

Balloon experiments first identified cosmic rays, revealing they were a key source of radiation on Earth. Cosmic rays are highly energetic particles, mostly protons, that travel across space. Millions of these particles pass through your body every day. (They are believed to be harmless.)

But where do cosmic rays come from?

“We live together with cosmic-ray muons, though we are usually not sensitive to them,” said Kawata. “Isn’t it a fantasy to think of where and how these cosmic rays are produced and accelerated, traveling all the way to Earth?”

A popular theory argues that accelerators known as “PeVatrons” spew cosmic rays at energies up to one petaelectron volt (PeV). Possible PeVatrons include supernova explosions, star-forming regions, and the supermassive black hole at the center of our galaxy.

So far, no one has detected any such accelerators. If PeVatrons exist, their cosmic rays should leave trails of glowing gamma rays strewn across the galaxy. The new study reports the first evidence of this highly energetic haze.

“These gamma rays did not point back to the most powerful known high-energy gamma-ray sources, but spread out along the Milky Way,” said Masato Takita, a coauthor and colleague of Kawata. “Our discovery confirms evidence of the existence of PeVatrons.”

The researchers now want to determine if the probable PeVatrons are active or dead.

“From dead PeVatrons, which are extinct like dinosaurs, we can only see the footprint – the cosmic rays they produced over a few million years, spread over the galactic disk,” said Takita.

“But if we can locate real, active PeVatrons, we can study many more questions,” he said. “What type of star emits our sub-PeV gamma rays and related cosmic rays? How can a star accelerate cosmic rays up to PeV energies? How do the rays propagate inside our galactic disk?”

Other future directions include looking for PeVatron footprints in the southern hemisphere and confirming the gamma-ray results using neutrino detectors in Antarctica and beyond.

The research could also aid in the quest for dark matter. Underground detectors allowed the researchers to cut away cosmic-ray background noise, revealing the kind of pure, diffuse gamma rays predicted to emanate from dark matter.

“We can reduce the cosmic ray background by a factor of one million. Then we see a high-purity gamma ray sky,” said Takita.

The experimental achievement moves physicists significantly closer to discovering where cosmic rays are born.

“This pioneering work opens a new window for the exploration of the extreme universe,” said Huang. “The observational evidence marks an important milestone toward revealing cosmic ray origins, which have puzzled mankind for more than one century.”

Funding Information

The Tibet ASgamma Collaboration is composed of the following institutions: Hirosaki Univ., Nanjing Univ., IHEP / CAS, NAOC / CAS, Tibet Univ., Hebei Normal Univ., Univ. of Chinese Academy of Sciences, Shandong Univ., SouthWest Jiaotong Univ., Kanagawa Univ., Utsunomiya Univ., Shibaura Institute of Technology, Yokohama National Univ., Shinshu Univ., ICRR / Univ. of Tokyo, ISAS / JAXA, NCSW / CMA, Shandong Agriculture Univ., Univ. of Science and Technology of China, THCA, Tsinghua Univ., Joint Research Center for Astrophysics / Tsinghua Univ., Tsinghua Univ., NII, Shandong Management Univ., NICT, China Univ. of Petroleum, Tokyo Metropolitan College of Industrial Technology, Konan Univ., Nihon Univ., Shonan Institute of Technology, Waseda Univ., JAEA, and PMO / CAS.

This work was or is supported by Grants-in-Aid for Scientific Research on Priority Areas from the MEXT of Japan, Grants-in-Aid for Science Research from the JSPS of Japan, the National Key R&D Program of China, the National Natural Science Foundation of China, the Key Laboratory of Particle Astrophysics, IHEP, CAS, and the joint research program of the ICRR.

Featured image: Ultrahigh-energy diffuse gamma rays (yellow points) are distributed along the Milky Way galaxy. The background color contour shows the atomic hydrogen distribution in the galactic coordinates. The gray shaded area indicates what is outside of the field of view. © HEASARC / LAMBDA / NASA / GFSC


Reference: M. Amenomori et al. (The Tibet ASγ Collaboration), “First Detection of Sub-PeV Diffuse Gamma Rays from the Galactic Disk: Evidence for Ubiquitous Galactic Cosmic Rays beyond PeV Energies”, Phys. Rev. Lett., 2021. https://journals.aps.org/prl/accepted/2207cYd3La91536bf3509f3189e65322ea6e4b7e0


Provided by American Physical Society

Physicists Studied Particle Production from Oscillating Scalar Field & Consistency of Boltzmann Equation (Quantum / Maths)

The Boltzmann equation plays an important role in particle cosmology, where it is used to study the evolution of the distribution functions (also called occupation numbers) of various particles. The success of standard cosmology, the Λ cold dark matter (ΛCDM) model, relies essentially on the use of the Boltzmann equation in an expanding Universe. In cosmology, however, the effects of stimulated emission and Pauli blocking are usually omitted.

Scalar fields may form condensates; examples of such scalar fields include the inflaton, the curvaton, the axion, the Affleck-Dine field for baryogenesis, and so on. (Hereafter, such a scalar condensate is called φ.) If φ can decay into a pair of bosonic particles (called χ) as φ → χχ, the effects of stimulated emission can be important, since the daughter particles may become enormously populated. In fact, just today I wrote an article on a mechanism for producing bosonic dark matter from inflaton decay, in which stimulated emission of the bosonic dark matter plays an important role.

Such systems have been studied by employing the Boltzmann equation or in the context of parametric resonance. In particular, particle production from an oscillating scalar field has been intensively investigated, often using the Mathieu equation. In previous studies, the evolution equation for the expectation value of the field operator (which we call the wave function) of the final-state particle is converted to the Mathieu equation, from which the occupation number of the final-state particle is estimated. It has been shown that resonance bands exist and that the occupation numbers of modes in those bands grow exponentially. In the broad resonance regime, the particle production is known to be non-perturbative, and the process cannot be described by the Boltzmann equation. By contrast, it has sometimes been argued that the Boltzmann equation can provide a proper description in the narrow resonance regime.
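As a rough, self-contained illustration of this textbook picture (not the authors’ calculation; the parameter values and the specific form of the time-dependent frequency below are assumptions chosen to sit in the narrow resonance regime), one can numerically integrate a Mathieu-type mode equation and watch the occupation number of a mode near the lowest resonance band grow exponentially:

```python
# Illustrative sketch only (assumed parameters, not the paper's code): integrate the
# Mathieu-type mode equation  chi'' + [omega_k^2 + q*m_phi^2*cos(m_phi*t)] chi = 0
# and track the occupation number n_k(t) of the mode.
import numpy as np
from scipy.integrate import solve_ivp

m_phi = 1.0               # mass of the oscillating field (sets the time unit)
q = 0.05                  # q << 1: narrow resonance regime
omega_k = 0.5 * m_phi     # lowest resonance band lies near omega_k ~ m_phi/2

def rhs(t, y):
    # y = [Re chi, Im chi, Re chi', Im chi']
    chi_r, chi_i, dchi_r, dchi_i = y
    w2 = omega_k**2 + q * m_phi**2 * np.cos(m_phi * t)   # time-dependent frequency
    return [dchi_r, dchi_i, -w2 * chi_r, -w2 * chi_i]

# Positive-frequency (vacuum) initial conditions for the mode function
y0 = [1.0 / np.sqrt(2.0 * omega_k), 0.0, 0.0, -np.sqrt(omega_k / 2.0)]

t_end = 400.0 / m_phi     # long compared with (q*m_phi)^-1
sol = solve_ivp(rhs, (0.0, t_end), y0, max_step=0.05)

chi = sol.y[0] + 1j * sol.y[1]
dchi = sol.y[2] + 1j * sol.y[3]
# Occupation number: n_k = (|chi'|^2 + omega_k^2 |chi|^2) / (2 omega_k) - 1/2
n_k = (np.abs(dchi)**2 + omega_k**2 * np.abs(chi)**2) / (2.0 * omega_k) - 0.5

print(f"n_k at t = {t_end:.0f}: {n_k[-1]:.3e}")   # grows exponentially inside the band
```

Shifting the assumed ω_k away from m_φ/2 by more than an amount of order q m_φ takes the mode out of the band, and the exponential growth disappears, which is what the resonance-band language refers to.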

Now, in this paper, Moroi and Yin have studied particle production from an oscillating scalar field (φ) in the narrow resonance regime (i.e., assuming that the final-state particle (χ) is very weakly interacting). They have paid particular attention to the consistency of the results from the Boltzmann equation with those from the QFT calculation. They have concentrated on the case in which χ is produced via the process φ → χχ.

“We study the particle production including the possible enhancement due to a large occupation number of the final-state particle, known as the stimulated emission or the parametric resonance.”

— said Moroi, first author of the study.

First, they have considered particle production in flat spacetime. In this case, they have discussed the evolution of the occupation number of each mode (i.e., each mode with a fixed momentum k) separately in the narrow resonance regime, deriving the evolution equations for the occupation numbers from the QFT. A resonance band shows up at ω_k close to m_φ/2, which corresponds to the lowest resonance band in the context of parametric resonance. Modes within the resonance band can be effectively produced. For timescales much longer than (q m_φ)⁻¹, the occupation numbers of the modes in the resonance band grow exponentially; the growth rate obtained in their analysis is consistent with that given by the study of parametric resonance using the Mathieu equation. Then, comparing the occupation number obtained from the QFT calculation with that from the Boltzmann equation, they have found that the two do not agree well once the occupation number exceeds ~1. By contrast, when f_k ≪ 1, they have found good agreement between the two results. They have also discussed how their QFT-based evolution equation could be related to the ordinary Boltzmann equation; when the occupation number is larger than ~1, some of the approximations and assumptions necessary for that argument cannot be justified, which, they expect, causes the disagreement.

Then, they have studied particle production taking into account the effects of cosmic expansion. With the cosmic expansion, the physical momentum redshifts. The momentum of each mode stays in the resonance band for a finite amount of time and then exits the resonance band. The exponential growth of the occupation number occurs only in the resonance band. The growth factor has been studied numerically and analytically, adopting the evolution equations based on the QFT. The agreement between numerical and analytical results is excellent. They have also analyzed the system by using the conventional Boltzmann equation and found that the growth rate obtained by solving the Boltzmann equation is a factor of 2 larger than that based on the QFT. Thus, the occupation number from the Boltzmann equation may become exponentially larger than that from the QFT, and a naive use of the conventional Boltzmann equation may result in a significant overestimation of the number density of χ.

“In this paper, we have considered the production of a bosonic particle, concentrating on the lowest resonance band of the parametric resonance. Consideration of the production of fermionic particles and the study of the higher resonance bands, as well as the application of the QFT-based evolution equations to other phenomena, are left as future works.”

— concluded the authors of the study.

Reference: Takeo Moroi, Wen Yin, “Particle Production from Oscillating Scalar Field and Consistency of Boltzmann Equation”, arXiv, pp. 1-23, 2021. https://arxiv.org/abs/2011.12285


Copyright of this article belongs entirely to our author S. Aman. It may be reused only with proper credit given either to him or to us.

Quantum Material’s Subtle Spin Behavior Proves Theoretical Predictions (Quantum)

Using complementary computing calculations and neutron scattering techniques, researchers from the Department of Energy’s Oak Ridge and Lawrence Berkeley national laboratories and the University of California, Berkeley, discovered the existence of an elusive type of spin dynamics in a quantum mechanical system.

The team successfully simulated and measured how magnetic particles called spins can exhibit a type of motion known as Kardar-Parisi-Zhang, or KPZ, dynamics in solid materials at various temperatures. Until now, scientists had not found evidence of this particular phenomenon outside of soft matter and other classical materials.

These findings, which were published in Nature Physics, show that the KPZ scenario accurately describes the changes in time of spin chains — linear channels of spins that interact with one another but largely ignore the surrounding environment — in certain quantum materials, confirming a previously unproven hypothesis.

“Seeing this kind of behavior was surprising, because this is one of the oldest problems in the quantum physics community, and spin chains are one of the key foundations of quantum mechanics,” said Alan Tennant, who leads a project on quantum magnets at the Quantum Science Center, or QSC, headquartered at ORNL.

Observing this unconventional behavior provided the team with insights into the nuances of fluid properties and other underlying features of quantum systems that could eventually be harnessed for various applications. A better understanding of this phenomenon could inform the improvement of heat transport capabilities using spin chains or facilitate future efforts in the field of spintronics, which saves energy and reduces noise that can disrupt quantum processes by manipulating a material’s spin instead of its charge.

Typically, spins proceed from place to place through either ballistic transport, in which they travel freely through space, or diffusive transport, in which they bounce randomly off impurities in the material – or each other – and slowly spread out.

But fluid spins are unpredictable, sometimes displaying unusual hydrodynamical properties, such as KPZ dynamics, an intermediate category between the two standard forms of spin transport. In this case, special quasiparticles roam randomly throughout a material and affect every other particle they touch.

“The idea of KPZ is that, if you look at how the interface between two materials evolves over time, you see a certain kind of scaling akin to a growing pile of sand or snow, like a form of real-world Tetris where shapes build on each other unevenly instead of filling in the gaps,” said Joel Moore, a professor at UC Berkeley, senior faculty scientist at LBNL and chief scientist of the QSC.

Another everyday example of KPZ dynamics in action is the mark left on a table, coaster or other household surface by a hot cup of coffee. The shape of the coffee particles affects how they diffuse. Round particles pile up at the edge as the water evaporates, forming a ring-shaped stain. However, oval particles exhibit KPZ dynamics and prevent this movement by jamming together like Tetris blocks, resulting in a filled in circle.

KPZ behavior can be categorized as a universality class, meaning that it describes the commonalities between these seemingly unrelated systems based on the mathematical similarities of their structures in accordance with the KPZ equation, regardless of the microscopic details that make them unique.
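For reference (this is the standard textbook form, not an equation quoted from the paper), the KPZ equation describes the height h(x, t) of a growing interface as

$$\frac{\partial h}{\partial t} = \nu \, \nabla^{2} h + \frac{\lambda}{2}\,(\nabla h)^{2} + \eta(x,t),$$

where ν smooths the interface, λ sets the strength of the nonlinear growth term, and η is random noise. Very different systems, from growing piles to quantum spin chains, are said to belong to the KPZ universality class when their fluctuations scale the way solutions of this equation do.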

To prepare for their experiment, the researchers first completed simulations with resources from ORNL’s Compute and Data Environment for Science, as well as LBNL’s Lawrencium computational cluster and the National Energy Research Scientific Computing Center, a DOE Office of Science user facility located at LBNL. Using the Heisenberg model of isotropic spins, they simulated the KPZ dynamics demonstrated by a single 1D spin chain within potassium copper fluoride.
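The isotropic Heisenberg model referred to here is, in its standard one-dimensional form (again a textbook expression rather than anything taken from the study),

$$H = J \sum_{i} \mathbf{S}_{i} \cdot \mathbf{S}_{i+1},$$

where S_i is the spin-1/2 operator on site i of the chain and J is the exchange coupling between neighboring spins. Potassium copper fluoride is a good realization of this model because its copper spins couple strongly along one crystal axis and only weakly between chains, which gives the material its 1D character.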

“This material has been studied for almost 50 years because of its 1D behavior, and we chose to focus on it because previous theoretical simulations showed that this setting was likely to yield KPZ hydrodynamics,” said Allen Scheie, a postdoctoral research associate at ORNL.

The team simulated a single spin chain’s KPZ behavior, then observed the phenomenon experimentally in multiple spin chains. Credit: Michelle Lehman/ORNL, U.S. Dept. of Energy

The team then used the SEQUOIA spectrometer at the Spallation Neutron Source, a DOE Office of Science user facility located at ORNL, to examine a previously unexplored region within a physical crystal sample and to measure the collective KPZ activity of real, physical spin chains. Neutrons are an exceptional experimental tool for understanding complex magnetic behavior due to their neutral charge and magnetic moment and their ability to penetrate materials deeply in a nondestructive fashion.

Both methods revealed evidence of KPZ behavior at room temperature, a surprising accomplishment considering that quantum systems usually must be cooled to almost absolute zero to exhibit quantum mechanical effects. The researchers anticipate that these results would remain unchanged, regardless of variations in temperature.

“We’re seeing pretty subtle quantum effects surviving to high temperatures, and that’s an ideal scenario because it demonstrates that understanding and controlling magnetic networks can help us harness the power of quantum mechanical properties,” Tennant said.

This project began during the development of the QSC, one of five recently launched Quantum Information Science Research Centers competitively awarded to multi-institutional teams by DOE. The researchers had realized their combined interests and expertise perfectly positioned them to tackle this notoriously difficult research challenge.

Through the QSC and other avenues, they plan to complete related experiments to cultivate a better understanding of 1D spin chains under the influence of a magnetic field, as well as similar projects focused on 2D systems.

“We showed spin moving in a special quantum mechanical way, even at high temperatures, and that opens up possibilities for many new research directions,” Moore said.

This work was funded by the DOE Office of Science. Additional support was provided by the Quantum Science Center, a DOE Office of Science National Quantum Information Science Research Center, and the Simons Foundation’s Investigator program.

UT-Battelle manages Oak Ridge National Laboratory for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science. — Elizabeth Rosenthal


Reference: Scheie, A., Sherman, N.E., Dupont, M. et al. Detection of Kardar–Parisi–Zhang hydrodynamics in a quantum Heisenberg spin-1/2 chain. Nat. Phys. (2021). https://doi.org/10.1038/s41567-021-01191-6


Provided by Oak Ridge National Laboratory

Attention And Working Memory: Two Sides of the Same Neural Coin? (Neuroscience)

Princeton neuroscientists have demonstrated that attention and working memory are two sides of the same neural coin; what’s more, they have observed the coin as it flips inside the brain

In 1890, psychologist William James described attention as the spotlight we shine not only on the world around us, but also on the contents of our minds. Most cognitive scientists since then have drawn a sharp distinction between what James termed “sensorial attention” and “intellectual attention,” now usually called “attention” and “working memory,” but James saw them as two varieties of the same mental process.

New research by Princeton neuroscientists suggests that James was on to something, finding that attention to the outside world and attention to our own thoughts are actually two sides of the same neural coin. What’s more, they have observed the coin as it flips inside the brain.

A paper published in Nature on March 31 by Matthew Panichello, a postdoctoral research associate at the Princeton Neuroscience Institute, and Timothy Buschman, an assistant professor of psychology and neuroscience at Princeton, found that attention and working memory share the same neural mechanisms. Importantly, their work also reveals how neural representations of memories are transformed as they direct behavior.

“When we act on sensory inputs we call it ‘attention,'” said Buschman. “But there’s a similar mechanism that can act on the thoughts we hold in mind.”

In a pair of experiments with two rhesus macaque monkeys, the researchers found that neurons in the prefrontal cortices that focus attention on sensory stimuli are the very same ones that focus on an item in working memory. What’s more, Panichello and Buschman actually observed the neural representations of those memories realigning in the brain as the monkeys selected which memories to act upon.

In one experiment, each monkey was seated before a computer monitor and a camera that tracked their eye movements. The monitor displayed pairs of randomly selected colored squares, one above the other. Then the squares vanished, requiring the monkey to remember the color and location of the squares. After a brief pause, a symbol appeared, telling the monkey which square they should select from their working memory. Then, after another pause, they reported the color of the selected square by matching it to a color wheel.

To perform the task, each monkey needed to hold both colors in their working memory, select the target color from memory, and then report that color on the color wheel. After each response, the monkey was rewarded with droplets of juice. The closer their report was to the target color, the more droplets they earned.

Working memory and attention: two sides of the same neural coin? This image shows data from the first experiment: the spectrum of possible colors for the two blocks (upper and lower) is represented as a ring in the activity of neurons in the prefrontal cortex. When the animal is remembering both items (before selecting the target), these rings lie on separate "planes" within the brain, perpendicular to one another to keep the items separate. When one of the items is selected, the color rings rotate in order to align the colors for either item. This allows the brain to "read out" the color of the selected item, regardless of whether it was originally the upper or lower item. Credit: Courtesy of Timothy Buschman and Matthew Panichello / the Buschman Lab

In a second experiment, to compare the selection of items from working memory to a more classic attention task, the researchers indicated the direction to the monkeys before they saw the colored squares. This allowed the macaques to focus all their attention on the indicated square (and ignore the other one). As expected, the monkeys performed better on this task because they knew in advance which square to attend to and which to ignore.

The researchers recorded neural activity in the prefrontal cortex, parietal cortex and visual cortex. The prefrontal cortex is associated with a variety of executive function processes including attention, working memory, planning and inhibition. In this study, the researchers discovered that the same neurons in the prefrontal cortex that directed attention were also used to select an item from the monkey’s working memory.

This wasn’t true everywhere in the brain. In an area in the visual cortex associated with color recognition and in an area in the parietal lobe associated with visual and spatial analysis, the processes of attending to sensory input and selecting the target color from working memory involved distinct neural mechanisms.

“Attention allows you to focus your resources on a particular stimulus, while a similar selection process happens with items in working memory,” said Buschman. “Our results show the prefrontal cortex uses one representation to control both attention and working memory.”

The same neural recordings also showed how selecting an item changes memories so that they are either hidden away in working memory or used to make a response. This involves dynamically rotating the memory representation in the prefrontal cortex.

This can be likened to holding a piece of paper with text on it. If you hold the paper edge-on to your face, you can’t read it. This concealment, Buschman explained, prevents the brain from triggering the wrong response, or triggering a response too early.

“The brain is holding information in a way that the network can’t see it,” he said. Then, when it came time to respond at the end of the trial, the memory representation rotated. Just as rotating the paper allows you to read and act upon the text, rotating the neural representation allows the brain to direct behavior.

“This dynamic transformation just blew me away,” said Buschman. “It shows how the brain can manipulate items in working memory to guide your action.”

“It is an important paper,” said Massachusetts Institute of Technology neuroscientist Earl Miller, who was not involved in this research. “Attention and working memory have often been discussed as being two sides of the same coin, but that has mainly been lip service. This paper shows how true this is and also shows us the ‘coin’ — the coding and control mechanisms that they share.”

“Our goal is not to overwrite the word ‘attention,'” said Buschman. Instead, he hopes that findings from decades of research on attention can be generalized to shed light on other forms of executive function. “Attention has been well-studied as the cognitive control of sensory inputs. Our results begin to broaden these concepts to other behaviors.”

“Shared mechanisms underlie the control of working memory and attention,” by Matthew F. Panichello and Timothy J. Buschman, appears in the March 31 issue of the journal Nature (DOI: 10.1038/s41586-021-03390-w). This research was supported by the National Institute of Mental Health (R01MH115042 to TJB) and the Department of Defense (a National Defense Science and Engineering Graduate Fellowship to MFP).

Featured image: Princeton neuroscientists Timothy Buschman and Matthew Panichello have discovered that attention and working memory are much more closely connected than most modern cognitive scientists realized. They performed two experiments in which monkeys were shown two color blocks and a symbol that directed them to look at the top one (a circle or an upward slanted line) or the bottom one (a triangle or a downward slanted line). They then matched the selected color to its spot on the color wheel. In the first experiment (left), they saw the blocks first and then the directional signal. In the second (right), they saw the directional signal first and then the color blocks. Courtesy of Timothy Buschman and Matthew Panichello


Reference: Panichello, M.F., Buschman, T.J. Shared mechanisms underlie the control of working memory and attention. Nature (2021). https://doi.org/10.1038/s41586-021-03390-w


Provided by Princeton University

Century-old Problem Solved With First-ever 3D Atomic Imaging of an Amorphous Solid (Physics)

UCLA-led study captures the structure of metallic glass

Glass, rubber and plastics all belong to a class of matter called amorphous solids. And in spite of how common they are in our everyday lives, amorphous solids have long posed a challenge to scientists.

Since the 1910s, scientists have been able to map in 3D the atomic structures of crystals, the other major class of solids, which has led to myriad advances in physics, chemistry, biology, materials science, geology, nanoscience, drug discovery and more. But because amorphous solids aren’t assembled in rigid, repetitive atomic structures like crystals are, they have defied researchers’ ability to determine their atomic structure with the same level of precision.

Until now, that is.

A UCLA-led study in the journal Nature reports on the first-ever determination of the 3D atomic structure of an amorphous solid — in this case, a material called metallic glass.

“We know so much about crystals, yet most of the matter on Earth is non-crystalline and we know so little about their atomic structure,” said the study’s senior author, Jianwei “John” Miao, a UCLA professor of physics and astronomy and member of the California NanoSystems Institute at UCLA.

Observing the 3D atomic arrangement of an amorphous solid has been Miao’s dream since he was a graduate student. That dream has now been realized, after 22 years of relentless pursuit.

“This study just opened a new door,” he said.

Metallic glasses tend to be both stronger and more shapeable than standard crystalline metals, and they are used today in products ranging from electrical transformers to high-end golf clubs and the housings for Apple laptops and other electronic devices. Understanding the atomic structure of metallic glasses could help engineers design even better versions of these materials, for an even wider array of applications.

The researchers used a technique called atomic electron tomography, a type of 3D imaging pioneered by Miao and collaborators. The approach involves beaming electrons through a sample and collecting an image on the other side. The sample is rotated so that measurements can be taken from multiple angles, yielding data that is stitched together to produce a 3D image.
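As a loose analogy for how tomographic reconstruction works in general (a generic 2D filtered back-projection demo; the phantom and the scikit-image routines below are stand-ins, not the iterative 3D atomic-resolution reconstruction actually used in the study), projections collected at many rotation angles can be combined back into an image of the original object:

```python
# Generic tomography illustration (not the authors' pipeline): project a test image at
# many rotation angles, then reconstruct it from those projections alone.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

phantom = shepp_logan_phantom()                        # stand-in for the unknown object
angles = np.linspace(0.0, 180.0, 55, endpoint=False)   # 55 angles, echoing the 55 images in the study
sinogram = radon(phantom, theta=angles)                # the "images collected on the other side"
reconstruction = iradon(sinogram, theta=angles)        # filtered back-projection

rms_error = np.sqrt(np.mean((reconstruction - phantom) ** 2))
print(f"RMS reconstruction error from 55 projections: {rms_error:.4f}")
```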

“We combined state-of-the-art electron microscopy with powerful algorithms and analysis techniques to study structures down to the level of single atoms,” said co-author Peter Ercius, a staff scientist at Lawrence Berkeley National Laboratory’s Molecular Foundry, where the experiment was conducted. “Direct knowledge of amorphous structures at this level is a game changer for the physical sciences.”

The researchers examined a sample of metallic glass about 8 nanometers in diameter, made of eight different metals. (A nanometer is one-billionth of a meter.) Using 55 atomic electron tomography images, Miao and colleagues created a 3D map of the approximately 18,000 atoms that made up the nanoparticle.

Because amorphous solids have been so difficult to characterize, the researchers expected the atoms to be arranged chaotically. And although about 85% of the atoms were in a disordered arrangement, the researchers were able to identify pockets where a fraction of atoms coalesced into ordered superclusters. The finding demonstrated that even within an amorphous solid, the arrangement of atoms is not completely random.

Miao acknowledged one limitation of the research, born of the limits of electron microscopy itself. Some of the metal atoms were so similar in size that electron imaging couldn’t distinguish between them. For the purposes of the study, the researchers grouped the metals into three categories, uniting neighbors from the periodic table of elements: cobalt and nickel in the first category; ruthenium, rhodium, palladium and silver in the second; and iridium and platinum in the third.

The research was supported primarily by the STROBE National Science Foundation Science and Technology Center, of which Miao is deputy director, and in part by the U.S. Department of Energy.

“This groundbreaking result exemplifies the power of a transdisciplinary team,” said Charles Ying, the National Science Foundation program officer who oversees funding for the STROBE center. “It demonstrates the need for long-term support of a center to address this type of complex research project.”

The study’s co-first authors are graduate student Yao Yang, former assistant project scientist Jihan Zhou, former postdoctoral researcher Fan Zhu, and postdoctoral researcher Yakun Yuan, all current or former members of Miao’s research group at UCLA. Other UCLA co-authors are graduate students Dillan Chang and Arjun Rana; former postdoctoral scholars Dennis Kim and Xuezeng Tian; assistant adjunct professor of mathematics Minh Pham; and mathematics professor Stanley Osher.

Other co-authors are Yonggang Yao and Liangbing Hu of University of Maryland, College Park; and Andreas Schmid and Peter Ercius of Lawrence Berkeley National Laboratory.

“This work is a great illustration of how to address longstanding grand challenges by bringing together scientists with many different backgrounds in physics, mathematics, materials and imaging science, with strong partnerships between universities and national laboratories,” said Margaret Murnane, director of the STROBE center. “This is a spectacular team.”

Featured image: At left, an experimental 3D atomic model of a metallic glass nanoparticle, 8 nanometers in diameter. Right, the 3D atomic packing of a representative ordered supercluster in the metallic glass structure, with differently colored balls representing different types of atoms. © Yao Yang and Jianwei “John” Miao/UCLA


Reference: Yang, Y., Zhou, J., Zhu, F. et al. Determining the three-dimensional atomic structure of an amorphous solid. Nature 592, 60–64 (2021). https://doi.org/10.1038/s41586-021-03354-0


Provided by University of California – Los Angeles

Sugar Not So Nice For Your Child’s Brain Development (Food)

New research shows how high consumption affects learning, memory

Sugar practically screams from the shelves of your grocery store, especially those products marketed to kids.

Children are the highest consumers of added sugar, even as high-sugar diets have been linked to negative health effects such as obesity, heart disease and even impaired memory function.

However, less is known about how high sugar consumption during childhood affects the development of the brain, specifically a region known to be critically important for learning and memory called the hippocampus.

New research led by a University of Georgia faculty member in collaboration with a University of Southern California research group has shown in a rodent model that daily consumption of sugar-sweetened beverages during adolescence impairs performance on a learning and memory task during adulthood. The group further showed that changes in the bacteria in the gut may be the key to the sugar-induced memory impairment.

Emily Noble in her lab. (Photo by Cal Powell)

Supporting this possibility, they found that similar memory deficits were observed even when the bacteria, called Parabacteroides, were experimentally enriched in the guts of animals that had never consumed sugar.

“Early life sugar increased Parabacteroides levels, and the higher the levels of Parabacteroides, the worse the animals did in the task,” said Emily Noble, assistant professor in the UGA College of Family and Consumer Sciences who served as first author on the paper. “We found that the bacteria alone was sufficient to impair memory in the same way as sugar, but it also impaired other types of memory functions as well.”

Guidelines recommend limiting sugar

The Dietary Guidelines for Americans, a joint publication of the U.S. Departments of Agriculture and of Health and Human Services, recommends limiting added sugars to less than 10 percent of calories per day.

Data from the Centers for Disease Control and Prevention show that Americans between the ages of 9 and 18 exceed that recommendation, with the bulk of the calories coming from sugar-sweetened beverages.

Considering the role the hippocampus plays in a variety of cognitive functions and the fact the area is still developing into late adolescence, researchers sought to understand more about its vulnerability to a high-sugar diet via gut microbiota.

Juvenile rats were given their normal chow and an 11% sugar solution, which is comparable to commercially available sugar-sweetened beverages.

Researchers then had the rats perform a hippocampus-dependent memory task designed to measure episodic contextual memory, or remembering the context where they had seen a familiar object before.

“We found that rats that consumed sugar in early life had an impaired capacity to discriminate that an object was novel to a specific context, a task the rats that were not given sugar were able to do,” Noble said.

A second memory task measured basic recognition memory, a hippocampal-independent memory function that involves the animals’ ability to recognize something they had seen previously.

In this task, sugar had no effect on the animals’ recognition memory.

“Early life sugar consumption seems to selectively impair their hippocampal learning and memory,” Noble said.

Additional analyses determined that high sugar consumption led to elevated levels of Parabacteroides in the gut microbiome, the more than 100 trillion microorganisms in the gastrointestinal tract that play a role in human health and disease.

To better identify the mechanism by which the bacteria impacted memory and learning, researchers experimentally increased levels of Parabacteroides in the microbiome of rats that had never consumed sugar. Those animals showed impairments in both hippocampal dependent and hippocampal-independent memory tasks.

“(The bacteria) induced some cognitive deficits on its own,” Noble said.

Noble said future research is needed to better identify specific pathways by which this gut-brain signaling operates.

“The question now is how do these populations of bacteria in the gut alter the development of the brain?” Noble said. “Identifying how the bacteria in the gut are impacting brain development will tell us about what sort of internal environment the brain needs in order to grow in a healthy way.”

The article, “Gut microbial taxa elevated by dietary sugar disrupt memory function,” appears in Translational Psychiatry. Scott Kanoski, associate professor in USC Dornsife College of Letters, Arts and Science, is corresponding author on the paper.

Additional authors on the paper are Elizabeth Davis, Linda Tsan, Clarissa Liu, Andrea Suarez and Roshonda B. Jones from the University of Southern California; Christine Olson, Yen-Wei Chen, Xia Yang and Elaine Y. Hsiao from the University of California-Los Angeles; and Claire de La Serre and Ruth Schade from UGA.


Provided by University of Georgia

Uranus: First X-rays From Uranus Discovered (Planetary Science)

  • Astronomers have announced the first detection of X-rays from Uranus.
  • Uranus, the seventh planet from the Sun, is an ice giant planet in the outer Solar System.
  • Like Jupiter and Saturn, Uranus and its rings appear to mainly produce X-rays by scattering solar X-rays, but some may also come from auroras.
  • Chandra observations from 2002 and 2017 were used to make this discovery.

Astronomers have detected X-rays from Uranus for the first time, using NASA’s Chandra X-ray Observatory. This result may help scientists learn more about this enigmatic ice giant planet in our solar system.

Uranus is the seventh planet from the Sun and has two sets of rings around its equator. The planet, which has four times the diameter of Earth, rotates on its side, making it different from all other planets in the solar system. Since Voyager 2 was the only spacecraft to ever fly by Uranus, astronomers currently rely on telescopes much closer to Earth, like Chandra and the Hubble Space Telescope, to learn about this distant and cold planet that is made up almost entirely of hydrogen and helium.

In the new study, researchers used Chandra observations of Uranus taken in 2002 and then again in 2017. They saw a clear detection of X-rays from the first observation, just analyzed recently, and a possible flare of X-rays in those obtained fifteen years later. The main graphic shows a Chandra X-ray image of Uranus from 2002 (in pink) superimposed on an optical image from the Keck-I Telescope obtained in a separate study in 2004. The latter shows the planet at approximately the same orientation as it was during the 2002 Chandra observations.

2017 HRC Composite Image (Credit: X-ray: NASA/CXO/University College London/W. Dunn et al; Optical: W.M. Keck Observatory)

What could cause Uranus to emit X-rays? The answer: mainly the Sun. Astronomers have observed that both Jupiter and Saturn scatter X-ray light given off by the Sun, similar to how Earth’s atmosphere scatters the Sun’s light. While the authors of the new Uranus study initially expected that most of the X-rays detected would also be from scattering, there are tantalizing hints that at least one other source of X-rays is present. If further observations confirm this, it could have intriguing implications for understanding Uranus.

One possibility is that the rings of Uranus are producing X-rays themselves, which is the case for Saturn’s rings. Uranus is surrounded by charged particles such as electrons and protons in its nearby space environment. If these energetic particles collide with the rings, they could cause the rings to glow in X-rays. Another possibility is that at least some of the X-rays come from auroras on Uranus, a phenomenon that has previously been observed on this planet at other wavelengths.

On Earth, we can see colorful light shows in the sky called auroras, which happen when high-energy particles interact with the atmosphere. X-rays are emitted in Earth’s auroras, produced by energetic electrons after they travel down the planet’s magnetic field lines to its poles and are slowed down by the atmosphere. Jupiter has auroras, too. The X-rays from auroras on Jupiter come from two sources: electrons traveling down magnetic field lines, as on Earth, and positively charged atoms and molecules raining down at Jupiter’s polar regions. However, scientists are less certain about what causes auroras on Uranus. Chandra’s observations may help figure out this mystery.

Uranus is an especially interesting target for X-ray observations because of the unusual orientations of its spin axis and its magnetic field. While the rotation and magnetic field axes of the other planets of the solar system are almost perpendicular to the plane of their orbit, the rotation axis of Uranus is nearly parallel to its path around the Sun. Furthermore, while Uranus is tilted on its side, its magnetic field is tilted by a different amount, and offset from the planet’s center. This may cause its auroras to be unusually complex and variable. Determining the sources of the X-rays from Uranus could help astronomers better understand how more exotic objects in space, such as growing black holes and neutron stars, emit X-rays.

A paper describing these results appears in the most recent issue of the Journal of Geophysical Research and is available online at [insert link]. The authors are William Dunn (University College London, United Kingdom), Jan-Uwe Ness (University of Marseille, France), Laurent Lamy (Paris Observatory, France), Grant Tremblay (Center for Astrophysics | Harvard & Smithsonian), Graziella Branduardi-Raymont (University College London), Bradford Snios (CfA), Ralph Kraft (CfA), Z. Yao (Chinese Academy of Sciences, Beijing), and Affelia Wibisono (University College London).

NASA’s Marshall Space Flight Center manages the Chandra program. The Smithsonian Astrophysical Observatory’s Chandra X-ray Center controls science from Cambridge, Massachusetts, and flight operations from Burlington, Massachusetts.

Featured image: The first X-rays from Uranus have been captured by Chandra during observations obtained in 2002 and 2017, a discovery that may help scientists learn more about this ice giant planet. Researchers think most of the X-rays come from solar X-rays that scatter off Uranus’s atmosphere as well as its ring system. Some of the X-rays may also be from auroras on Uranus, a phenomenon that has previously been observed at other wavelengths. This Uranus image is a composite of optical light from the Keck telescope in Hawaii (blue and white) and X-ray data from Chandra (pink). © X-ray: NASA/CXO/University College London/W. Dunn et al; Optical: W.M. Keck Observatory


Reference: Dunn, W. R., Ness, J.‐U., Lamy, L., Tremblay, G. R., Branduardi‐Raymont, G., Snios, B., et al. (2021). A low signal detection of X‐rays from Uranus. Journal of Geophysical Research: Space Physics, 126, e2020JA028739. https://doi.org/10.1029/2020JA028739


Provided by CFA Harvard

Study: Doctors ‘Overusing’ Costly, Riskier Method For Clearing Clogged Or Blocked Vessels (Medicine)

In a recent study reviewing Medicare claims data from 2019 for nearly 59,000 patients with peripheral artery disease (PAD), a Johns Hopkins Medicine research team provides statistical evidence that one method for restoring blood flow to clogged or completely blocked vessels is being overused or inappropriately used in the United States. This occurs, the researchers say, even though the procedure has not been shown in clinical studies to be more effective than two other less-expensive, less-risky surgical methods.

The findings were reported March 22, 2021, in JACC: Cardiovascular Interventions.

“We wanted to characterize physician practice patterns in treating PAD and determine if the therapy in question, atherectomy, was being used appropriately when compared to the use of balloon angioplasty, stents or a combination of angioplasty and stents,” says study lead author Caitlin Hicks, M.D., associate professor of surgery at the Johns Hopkins University School of Medicine. “What we discovered is that although there is a wide distribution of practices across the nation for the use of atherectomies and only slightly more than half of PVIs [peripheral vascular interventions] performed in 2019 relied on the technique, atherectomy accounted for 90% of all Medicare PVI payments.”

Atherectomy is one of the methods that clinicians use to remove plaque (the buildup of fat, cholesterol, calcium and other substances found in the blood) from blood vessels that have become narrowed or blocked. Unlike balloon angioplasties and stents that push plaque into the vessel wall to open the passageway, the atherectomy cuts it out.

However, clinical studies have never demonstrated that atherectomies are any more effective in treating PAD than angioplasties, stents or a combination of the two. Other studies have suggested that atherectomy increases the risk of distal embolization, in which a piece of plaque breaks free of a vessel and travels to the legs, dangerously reducing blood flow to the feet.

For the period Jan. 1 – Dec. 31, 2019, the Johns Hopkins Medicine researchers reviewed Medicare fee-for-service claims for 58,552 U.S. patients who received elective PVI — atherectomy, angioplasty or stenting — for the first time. Patients were characterized for their demographics — including age, sex, race and ZIP code of residence — as well as their reason for getting PVI — including claudication (pain or cramping in the lower leg due to reduced blood flow) and chronic limb-threatening ischemia (inadequate blood supply to a leg that causes pain at rest or leads to gangrene). Histories of other conditions, such as kidney disease and diabetes, and lifestyle behaviors, such as smoking, also were noted.

Medicare claims for 1,627 doctors who performed PVIs on 10 or more patients during 2019 were reviewed in the study. The researchers documented a number of physician characteristics, including sex, years since graduation from medical school, primary specialty, census region of practice, population density of practice location, number of patients treated for PAD, and the type of medical facility in which care was primarily given.

Analysis of the data showed that during the study period, 31,476 (53.8%) of patients received atherectomies as their PVI. For age and sex, the numbers of atherectomies and non-atherectomies were approximately the same. However, the researchers found that atherectomies were performed more frequently on Black or Hispanic patients, those with claudication as their reason for treatment, and people living in urban settings and in the southern United States.

Physician use of atherectomies ranged from 0% (never used) to 100% (always used), with the latter being the case for 133 clinicians — nearly 10% of the total studied. Men were more likely to use atherectomies, as were doctors in practice for more than 20 years, cardiologists and radiologists, and those who practiced in regions with a higher median Medicare-allowed payment for PVIs per patient.
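As a purely hypothetical sketch of that kind of tabulation (the column names and rows below are invented for illustration and are not the study’s claims data or code), the per-physician atherectomy share can be computed by grouping claims by physician:

```python
# Hypothetical illustration only: compute each physician's atherectomy share of the
# peripheral vascular interventions (PVIs) attributed to them in a claims table.
import pandas as pd

claims = pd.DataFrame({
    "physician_id": ["A", "A", "A", "B", "B", "C"],
    "procedure":    ["atherectomy", "stent", "atherectomy",
                     "angioplasty", "stent", "atherectomy"],
})

per_physician = (
    claims.assign(is_atherectomy=claims["procedure"].eq("atherectomy"))
          .groupby("physician_id")["is_atherectomy"]
          .mean()                 # fraction of each physician's PVIs that were atherectomies
          .mul(100)
          .rename("atherectomy_use_pct")
)
print(per_physician)              # ranges from 0% ("never used") to 100% ("always used")
```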

Additionally, Hicks says, physicians who worked in ambulatory surgical centers or office-based laboratories used atherectomy seven times more often than physicians who worked primarily in controlled facilities such as hospitals.

Overall, nearly $267 million was reimbursed by Medicare for PVIs performed in 2019. Of this, approximately $241 million — a resounding 90.2% — was for atherectomies.

“We feel that these numbers — especially when there is no solid evidence that atherectomies treat PAD any more effectively than angioplasty, stents or a combination of angioplasty and stents — suggest there is potential overuse of atherectomies in certain situations,” says Hicks. “This poses a high health care burden and should be addressed with professional guidelines for more appropriate use of the procedure.”

Featured image: Research News Tip Sheet: Story Ideas From Johns Hopkins Medicine © Johns Hopkins Medicine


Reference: Caitlin W. Hicks, Courtenay M. Holscher, Peiqi Wang, Chen Dun, Christopher J. Abularrage, James H. Black, Kim J. Hodgson, Martin A. Makary, Use of Atherectomy During Index Peripheral Vascular Interventions, JACC: Cardiovascular Interventions, Volume 14, Issue 6, 2021, Pages 678-688, ISSN 1936-8798, https://doi.org/10.1016/j.jcin.2021.01.004. (https://www.sciencedirect.com/science/article/pii/S1936879821000807)


Provided by Johns Hopkins Medicine