The immune system’s attempt to eliminate Salmonella bacteria from the gastrointestinal (GI) tract instead facilitates colonization of the intestinal tract and fecal shedding, according to National Institutes of Health scientists. The study, published in Cell Host & Microbe, was conducted by National Institute of Allergy and Infectious Diseases (NIAID) scientists at Rocky Mountain Laboratories in Hamilton, Montana.
Salmonella Typhimurium bacteria (hereafter Salmonella) live in the gut and often cause gastroenteritis in people. The Centers for Disease Control and Prevention estimates Salmonella bacteria cause about 1.35 million infections, 26,500 hospitalizations and 420 deaths in the United States every year. Contaminated food is the source for most of these illnesses. Most people who get ill from Salmonella have diarrhea, fever and stomach cramps but recover without specific treatment. Antibiotics typically are used only to treat people who have severe illness or who are at risk for it.
Salmonella bacteria also can infect a wide variety of animals, including cattle, pigs and chickens. Although clinical disease usually resolves within a few days, the bacteria can persist in the GI tract for much longer. Fecal shedding of the bacteria facilitates transmission to new hosts, especially by so-called “super shedders” that release high numbers of bacteria in their feces.
NIAID scientists are studying how Salmonella bacteria establish and maintain a foothold in the GI tract of mammals. One of the first lines of defense in the GI tract is the physical barrier provided by a single layer of intestinal epithelial cells. These specialized cells absorb nutrients and form a critical barrier that prevents pathogens from spreading to deeper tissues. When bacteria invade these cells, the cells are ejected into the gut lumen (the hollow portion of the intestines). However, in previous studies, NIAID scientists had observed that some Salmonella replicate rapidly in the cytosol (the fluid portion) of intestinal epithelial cells. That prompted them to ask: does ejecting the infected cell amplify rather than eliminate the bacteria?
To address this question, the scientists genetically engineered Salmonella bacteria that self-destruct when exposed to the cytosol of epithelial cells but grow normally in other environments, including the lumen of the intestine. Then they infected laboratory mice with the self-destructing Salmonella bacteria and found that replication in the cytosol of mouse intestinal epithelial cells is important for colonization of the GI tract and fuels fecal shedding. The scientists hypothesize that, by hijacking the epithelial cell response, Salmonella amplify their ability to invade neighboring cells and seed the intestine for fecal shedding.
The researchers say this is an example of how the pressure exerted by the host immune response can drive the evolution of a pathogen, and vice versa. The new insights suggest avenues for developing interventions to reduce the burden of this important pathogen.
A Chong et al. Cytosolic replication in epithelial cells fuels intestinal expansion and chronic fecal shedding of Salmonella Typhimurium. Cell Host & Microbe DOI: 10.1016/j.chom.2021.04.017 (2021).
Olivia Steele-Mortimer, Ph.D., chief of NIAID’s Salmonella-Host Cell Interactions Section, and Audrey Chong, Ph.D., in the Salmonella-Host Cell Interactions Section, are available to comment.
The Dark Matter Particle Explorer (DAMPE) collaboration reported a precise measurement of the energy spectrum of cosmic ray helium nuclei from 70 GeV to 80 TeV on May 18, 2021.
For the first time, DAMPE reveals a softening structure at about 34 TeV in the helium spectrum with high significance (~4.3σ). Together with the softening energy of the DAMPE proton spectrum, the results are consistent with a charge-dependent softening energy for protons and helium nuclei.
The common softening is likely an imprint of a nearby cosmic ray source, e.g., a supernova remnant. The softening energy, which appears to be Z-dependent for protons and helium nuclei, corresponds to the acceleration upper limit of such a nearby source.
DAMPE, also known as “Wukong”, is a space satellite dedicated to high-energy cosmic ray and gamma-ray observations. In addition to probing the nature of dark matter particles, one of DAMPE’s main scientific goals is to precisely measure the energy spectra of cosmic ray particles.
DAMPE has excellent energy resolution (for electrons and gamma rays), very good particle identification capability, and a reasonably large acceptance, making it well suited to studies of fine spectral structures of cosmic rays. Cosmic rays (CRs) are energetic particles coming from outer space. They are mostly made up of nuclei of various elements, together with small amounts of electrons/positrons, gamma-ray photons, and neutrinos.
Cosmic rays are generally believed to originate from extreme astrophysical objects, e.g., supernova remnants (SNRs) and accretion onto black holes. CRs are therefore a unique probe of astrophysical laws under extreme conditions. The origin, acceleration, and propagation of CRs are fundamental questions in modern physics and astrophysics that remain unanswered after a century of observation and research.
The energy spectrum of CRs, which describes how particle flux varies with energy, is expected to take a power-law form according to the canonical shock acceleration of particles. Precise measurement of the energy spectrum of CRs is key to understanding these fundamental questions of cosmic ray physics.
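As a hedged sketch (the functional form, symbols, and smoothness parameter below are standard illustrative conventions, not taken from the DAMPE papers), the canonical expectation and the observed break can be contrasted as a single versus a smoothly broken power law:

```latex
% Single power law expected from canonical shock acceleration:
\Phi(E) = \Phi_0 \left(\frac{E}{E_0}\right)^{-\gamma}

% Observed spectra instead soften above a break energy E_{\mathrm{br}},
% steepening from index \gamma_1 to \gamma_2 with smoothness s:
\Phi(E) = \Phi_0 \left(\frac{E}{E_0}\right)^{-\gamma_1}
          \left[1 + \left(\frac{E}{E_{\mathrm{br}}}\right)^{s}\right]^{(\gamma_1-\gamma_2)/s}
```

If the break occurs at a common rigidity, the break energy scales with charge, $E_{\mathrm{br}}(Z) \approx Z \cdot E_{\mathrm{br}}(\mathrm{proton})$; with the ~14 TeV proton break this predicts a helium break near 28 TeV, broadly consistent within uncertainties with the ~34 TeV structure DAMPE observes.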
Protons and helium nuclei are the two most abundant components of cosmic rays, together accounting for more than 99% of all cosmic rays. DAMPE’s excellent charge resolution gives it a powerful capability to distinguish protons from helium nuclei and to measure their spectra separately and precisely. Fig. 1 shows DAMPE’s excellent charge measurement at two typical energies.
Since its launch at the end of 2015, the DAMPE detector has been operating stably in orbit for four years. Significant progress has been achieved in observations of cosmic ray electrons/positrons, protons, and helium nuclei. With continued operation and data collection, DAMPE’s growing store of high-quality data is expected to shed new light on fundamental questions of cosmic ray physics.
With the first 30 months of on-orbit data, the DAMPE collaboration obtained a precise measurement of the cosmic ray proton spectrum from 40 GeV to 100 TeV. The result shows that the proton spectrum is not compatible with a single power law over this wide energy range.
In particular, DAMPE discovered a spectral “softening” (a steepening drop in flux) at about 14 TeV. This break energy is thought to reflect the acceleration limit of a possible nearby cosmic ray source.
The DAMPE result significantly improves the measurement accuracy of the helium spectrum above 1 TeV. The CR helium spectrum shows a TeV structure very similar to that of the proton spectrum, suggesting a common origin.
Reference: F. Alemanno et al. (DAMPE Collaboration), “Measurement of the Cosmic Ray Helium Energy Spectrum from 70 GeV to 80 TeV with the DAMPE Space Mission,” Phys. Rev. Lett. 126, 201102 (2021).
Solar prominences or filaments are cool and dense plasma structures suspended in the hot and tenuous corona.
Recent high-resolution solar limb observations reveal that some dark “bubbles” with bright arch-like boundaries form below prominences. How these semi-circular voids form below dense prominences has been a puzzle.
Ph.D. student GUO Yilin from National Astronomical Observatories of Chinese Academy of Sciences (NAOC), together with Dr. HOU Yijun, Dr. LI Ting, and Prof. ZHANG Jun, found and investigated an on-disk bubble based on stereoscopic observations for the first time.
It is widely accepted that bubbles are closely related to the overlying prominence system and could eventually lead to a coronal mass ejection, which can seriously affect space weather.
“However, previous studies are all based on the solar limb observations or numerical simulations. If the bubble could be found on the solar disk, we could unveil the magnetic nature of the bubble,” said Dr. HOU Yijun, the corresponding author of the study.
Examining high-resolution images from the New Vacuum Solar Telescope (NVST), the researchers found an on-disk bubble with a sharp arch-like boundary around a filament barb.
“Fortunately, this bubble can be simultaneously observed by Spacecraft-A of the Solar TErrestrial RElations Observatory (STEREO-A). Therefore, based on stereoscopic observations, we reconstruct the 3D structure of bubble boundary,” said GUO Yilin, the first author of the study.
Then, based on photospheric vector field observations, the researchers reconstructed the 3D magnetic field and calculated the squashing factor (Q) map. The Q map depicts a distinct arch-shaped interface.
The interface agrees well with the 3D structure of the bubble boundary. Under the interface lies a set of magnetic loops, which are rooted on a surrounding photospheric magnetic patch (N).
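For context, the squashing factor Q is conventionally defined (following Titov and co-workers; the notation below is that standard convention, not reproduced from this study) from the Jacobian of the mapping between magnetic field line footpoints:

```latex
% Field-line mapping between footpoints: (x, y) -> (X(x,y), Y(x,y)),
% with Jacobian elements
%   a = \partial X / \partial x, \quad b = \partial X / \partial y,
%   c = \partial Y / \partial x, \quad d = \partial Y / \partial y.
Q = \frac{a^2 + b^2 + c^2 + d^2}{\left| a\,d - b\,c \right|}
```

Regions of large Q (quasi-separatrix layers) mark places where field-line connectivity changes sharply, which is why an arch of high Q traces the interface between the filament’s field and the underlying loops.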
These results indicate that the prominence dips (the filament barb) interact with the underlying magnetic loops at certain locations, forming an arch-shaped interface that corresponds to the bubble boundary. It is therefore reasonable to speculate that a bubble can form around a filament barb beneath which a photospheric magnetic patch lies.
“The on-disk bubble is probably not a rare structure. Further studies on on-disk bubble will hopefully answer the key question of whether the bubbles form from flux emergence below a pre-existing prominence, and are important for better understanding of the magnetic topology and dynamic evolution of prominences (filaments),” said Dr. HOU Yijun.
Featured image: Overview of the on-disk bubble. (Image by GUO Yilin)
Spinning disks and oddball stars and planets help astronomers test theories about the formation of planetary systems
Many of us remember those school-room models of our Solar System, with tiny wooden planets rotating at the ends of their wires around a bright-orange painted sun. But how accurate is the model? Do the planets really align in a plane, or do their orbits crisscross around the sun at different angles? It turns out the toy isn’t too far off, at least in this one aspect.
Our solar system is actually pretty flat, with most of its planets orbiting within three degrees of the plane of the Earth’s orbit around the sun, called the ecliptic. This flatness extends to the asteroid belt between Mars and Jupiter, though some members of the region of icy objects past Neptune called the Kuiper belt are more extreme, with inclinations up to 30 degrees.
This relative flatness, which is not an unusual feature of solar systems, results from how stars and planetary systems typically form. The process begins with a slowly rotating, roughly spherical cloud of gas and dust, about one light year across. Eventually, a portion of this material collapses toward the center, forming a star, and the spinning cloud begins to flatten into a disk due to its rotation. It’s out of this rotating protoplanetary disk of gas and dust that planets are then spun out, resulting in a relatively flat solar system. Eventually, when most of the gas has settled onto the star or planets or has dissipated, the system is left with a debris disk of planetary leftovers, like our own asteroid-strewn Kuiper belt.
Some astronomers at Penn State study protoplanetary and debris disks to get a better idea of how planetary systems form. But not all stars and planets form in exactly the same manner — and not all planetary systems are flat.
The interdisciplinary nature of astronomy research at Penn State, including at the University’s Center for Exoplanets and Habitable Worlds, allows its scientists to paint a bigger picture of the formation and evolution of planetary systems.
“It’s an exciting time, because so many planets have been discovered in other solar systems, for example by NASA’s Kepler space telescope and Transiting Exoplanet Survey Satellite (TESS), and a lot of them look very different from the planets in our solar system,” said Rebekah Dawson, Shaffer Career Development Professor in Science and assistant professor of astronomy and astrophysics. “So, we have to come up with new ways of thinking about planet formation that can account for the diversity of planets we now know about.”
In addition to studying disks, researchers like Dawson study the exceptions to the norm, unusual stars and planets that could support or make us rethink current theories. Together, these investigations are helping scientists improve our understanding of how and where different kinds of stars and planets form, and what makes a planet habitable.
Catching planetary formation in the act
While some researchers study mature systems and infer aspects of the planet formation process, Assistant Professor of Astronomy and Astrophysics Ian Czekala tries to catch planetary formation in the act.
“I study the protoplanetary disks that surround young stars for the first 10 million years of their lives,” he said. “That may sound like a long time period, but it’s actually very small compared to a star’s total lifetime. Our sun, for example, is about five billion years old; most of the exoplanet systems that people study are at least a billion years old.”
While Kepler and other survey missions have found thousands of mature solar systems, there are fewer nearby protoplanetary systems that easily lend themselves to detailed study. To investigate these early systems, Czekala uses the Atacama Large Millimeter/submillimeter Array (ALMA), one of the most complex astronomical observatories ever built. Located in Chile, ALMA uses a network of high-precision antennas working together to provide a high-resolution look at the universe, using wavelengths of light between the infrared and radio regions of the electromagnetic spectrum.
ALMA can directly detect the gas and dust in protoplanetary disks, which produce very cold thermal emission (20 to 30 kelvins, or about -424 to -406 degrees Fahrenheit) at millimeter wavelengths. Czekala uses gas in the disk, including carbon monoxide, as a tracer to determine how the disk is rotating. These data provide a glimpse of the disk’s dynamics, building a sort of three-dimensional picture as a function of velocity. Some disks show gaps that might be produced by a planet orbiting in that space.
“What’s interesting is not just that we see the disk rotating, but that we are starting to sense the ways in which the velocity of the field deviates from its expected rotation at a very subtle level,” said Czekala. “It’s like watching a river flow downstream. Sure, you see the bulk flow of the river, but when you look at the eddies and turbulent waves, you can infer that there might be a submerged rock in one area or even a large underwater cavern in another. It’s what lies underneath that gets me really excited.”
Where do planets form?
Dawson studies debris disks in some cases, but she is also very interested in understanding the formation of planets that look nothing like those in our solar system. In particular, she is studying how planetary orbits might have changed in an early solar system, which can tell us how planets came to be where they are today.
“When talking about theories of planet formation, there is some debate about where planets form, even if the processes involved in those different locations may have some similarities,” she said.
Some of Dawson’s research has focused on large gas giant planets called “hot Jupiters” that are similar in mass to our own Jupiter but are found unexpectedly close to their stars. Because of the proximity to their stars, these planets have a surprisingly short orbit of only three or four days.
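For scale, the three-to-four-day periods quoted above can be turned into orbital distances with Kepler’s third law. A minimal sketch, assuming a host star of one solar mass (the function name and numbers are illustrative, not from the article):

```python
import math

# Kepler's third law: a^3 = G * M * T^2 / (4 * pi^2)
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
AU = 1.496e11     # astronomical unit, m

def semimajor_axis_au(period_days, stellar_mass=M_SUN):
    """Orbital semi-major axis (in AU) for a given period around a star."""
    t = period_days * 86400.0  # convert days to seconds
    a = (G * stellar_mass * t**2 / (4.0 * math.pi**2)) ** (1.0 / 3.0)
    return a / AU

# A 3-day hot Jupiter orbits at roughly 0.04 AU, about a tenth of
# Mercury's distance from the sun (~0.39 AU).
print(round(semimajor_axis_au(3.0), 3))
```

That such massive planets sit only a few stellar radii from their stars is exactly what makes their formation pathway puzzling.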
There are several theories about how hot Jupiters formed. One suggests that they formed where they are currently located, close to their stars. Another suggests that they formed farther away, and that a disturbance of some sort exerted a gravitational force that made the hot Jupiter’s orbit so elliptical that it passed very close to its star. Eventually, tides raised on the planet by the star’s pull caused its orbit to shrink and become more circular.
“To try to disentangle these possible formation pathways, we sometimes use computer simulations of the process, which leads to other expectations for the properties of hot Jupiters,” said Dawson. “We can then compare observed properties of hot Jupiters based on visible and infrared observations of their stars with the simulations to see if they are consistent with a particular theory. We can also look for planets in the same system as the hot Jupiters for clues to their formation.”
If a hot Jupiter forms near its star, for example, it’s plausible that other planets formed nearby that could be observed. But if it formed through the highly elliptical pathway, any other planets between the hot Jupiter and its star would likely have been ejected or collided with it.
“What we see in the data is that most hot Jupiters don’t have other planets nearby, but there are a few exceptions,” said Dawson. “I’ve come to believe that none of these theories can explain all of the hot Jupiters that we see. There may be different ways to make a hot Jupiter, and that’s probably true of other types of planets that we see that don’t look like the planets in our solar system.”
Tiny star or giant planet?
Just as the study of unusual planets like hot Jupiters can help us understand fundamental processes, so too can the study of unusual stars. Stars exist across a wide range of masses, the heaviest being about 150 times the mass of our sun. The lightest, known as brown dwarfs, are less than one-tenth the mass of the sun and therefore can be cool and faint enough to look like a gas giant planet.
Kevin Luhman, professor of astronomy and astrophysics, has spent much of his career studying how brown dwarfs are like stars and how they are like planets. To determine if they are born more like stars or planets, he is trying to identify the smallest mass at which brown dwarfs exist.
“There are different theories about the formation of stars that make different predictions for the minimum mass at which brown dwarfs exist,” he said. “If you can measure that minimum mass, you can test theories of how stars are born.”
Because they are cool and faint, brown dwarfs can also be challenging to find; the first wasn’t discovered until 1995. However, when they are very young, brown dwarfs are relatively bright — almost as bright as other stars — making them easier to detect.
“We look for newborn brown dwarfs in nebulas of gas and dust that are already known to be giving birth to stars, like the nearby Orion Nebula,” said Luhman. “Much of my work has involved searching these nebulas, using very sensitive telescopes that are able to see them.”
Luhman has helped identify brown dwarfs as small as five times the mass of Jupiter, which overlaps with the masses of some planets. He hopes that the launch of the James Webb Space Telescope in 2021 will allow astronomers to determine the minimum mass of these unusual stars.
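To put the masses in this section on one scale, “one-tenth the mass of the sun” and “five times the mass of Jupiter” can be compared directly. A hedged sketch using standard mass constants (the framing is ours, not Luhman’s):

```python
M_SUN = 1.989e30  # solar mass, kg
M_JUP = 1.898e27  # Jupiter mass, kg

# Rough upper end of the brown dwarf range quoted in the article
# (one-tenth of a solar mass), expressed in Jupiter masses:
upper_mjup = 0.1 * M_SUN / M_JUP  # ~105 Jupiter masses

# Smallest brown dwarfs Luhman has helped identify:
lower_mjup = 5.0  # Jupiter masses

print(f"Brown dwarf range: ~{lower_mjup:.0f} to ~{upper_mjup:.0f} Jupiter masses")
```

The lower end of that range overlaps the masses of giant planets, which is precisely why pinning down the minimum brown dwarf mass is a test of star-formation theories.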
“Brown dwarfs tend to be brightest in the infrared, and James Webb will be the most powerful infrared telescope to date,” he said. “We also hope to answer whether and how often planets can form around brown dwarfs. There’s already good evidence of protoplanetary disks existing around brown dwarfs, meaning they have the building blocks for making planets around them.”
These questions will help inform the bigger picture about planet formation, including whether planets form around any kind of star or only stars like the sun — and, of course, whether it is possible for planets around brown dwarfs, if they exist, to harbor life.
While star and planet formation might be thought of as separate fields of research, the processes involved are all intrinsically linked. Protoplanetary disks not only spawn planets but continuously funnel gas and dust onto the young star. And the very act of planet formation changes the structure of the disk, which may affect the subsequent planets that form in the same disk. It’s somewhat of a chicken and egg scenario, according to Czekala.
In this way, it’s useful to have a variety of researchers using different tactics to study these processes all in one place.
“We’re trying to piece together the processes of star and planet formation, but we only glimpse clues here and there, which we need to use to guide us to a holistic theory,” said Czekala. “New observations always lead to a bloom of new theories, but at the end of the day the big picture needs to hang together, including the implications for the formation epoch that we’re studying with the protoplanetary disks. We have a unique opportunity to bring the different communities together.”
Featured image: A newly formed star is surrounded by a rotating disk of gas and dust, called a protoplanetary disk. This disk, illustrated here around a brown dwarf, provides the materials for planet formation. Image: NASA/JPL-Caltech
When enjoying a chocolate bar, most people don’t think about how the molecules within it are organized. But different arrangements of the fats in chocolate can influence its taste and texture. Now, researchers reporting in ACS’ Crystal Growth & Design have found that the side of a chocolate bar facing the mold has a more orderly crystalline structure than the side facing air, knowledge that might help chocolatiers produce tastier confections, the researchers say.
Chocolate is a mixture of cocoa solids, cocoa butter, sugar and other ingredients that interact with each other in complex ways. In particular, the fat molecules, or triacylglycerols, can remain liquid or crystallize into several phases with different melting points. The temperature at which a particular chocolate melts affects its taste and texture, with a melting point near body temperature being ideal. When chocolatiers make bars, they pour tempered chocolate into a mold and let it cool at temperatures that favor the most desirable crystal form. Fumitoshi Kaneko and colleagues wondered how the mold, which conveys heat more efficiently than air and also provides a physical barrier, affects fat crystallization.
To find out, the researchers analyzed the structure of fat components at three different positions in a chocolate bar using polarized Fourier transform infrared spectroscopy and attenuated total reflection sampling. They found that the mold-side face of the bar contained highly ordered, regularly packed fatty acid chains, whereas the air-side face had disordered, irregularly packed chains, and the midpoint showed intermediate characteristics. The mold side also contained more of the most desirable fat crystal form than the other locations. These results could be explained by the large difference in thermal conductivity between the mold’s material and air, which causes the chocolate to preferentially cool and crystallize on the mold-side face. The mold also provides a barrier that controls the direction of crystallization, yielding a more orderly arrangement. These results suggest that a chocolate bar’s structure is much less uniform than previously thought, and that improving the crystallization process might lead to better-tasting, melt-in-your-mouth and better-looking chocolate bars, the researchers say.
The authors do not acknowledge any funding sources for this study.
Engine made with single material outpowers conventional technology
Thermoelectric power generators that make electrical power from waste heat would be a useful tool to reduce greenhouse gas emissions if it weren’t for a most vexing problem: the need to make electrical contacts to their hot side, which is often just too hot for materials that can generate a current.
The heat causes devices to fail over time.
Devices known as transverse thermoelectrics avoid this problem by producing a current that runs perpendicular to the conducting device, requiring contacts only on the cold end of the generator. Though considered a promising technology, the materials known to create this sideways voltage are impractically inefficient – or so scientists thought.
Ohio State University researchers show in a new study that a single material, a layered crystal consisting of the elements rhenium and silicon, turns out to be the gold standard of transverse thermoelectric devices.
The scientists demonstrated that this single compound functions as a highly effective thermoelectric generator because of a rare property: simultaneously carrying both positive and negative charges that can move independently rather than running parallel to each other, which forces them to zig-zag their way to the contacts to generate an electrical current.
By building a thermoelectric generator with a crystal about two inches long, the researchers also determined that when the crystal is situated at a specific angle in the device, it can churn out an impressive amount of power.
“We showed that these materials are as effective as conventional thermoelectric generator technology, but overcome its major disadvantages,” said study co-author Joshua Goldberger, professor of chemistry and biochemistry at Ohio State.
“This is the first time this kind of device has ever been shown to be feasible. With efficiencies that are orders of magnitude higher than any previous transverse device, this compound is just as good as what you can buy commercially, but promises to be much simpler and more reliable.”
While 97% of energy is generated from heat, we throw most heat away, letting it escape from smokestacks, car exhaust pipes and the like.
“Waste heat is really important. Forever and ever there has been a quest to improve the efficiency of all engines that make power from heat – the amount of work you can get out of them that you can use,” said study co-author Joseph Heremans, professor of mechanical and aerospace engineering and Ohio Eminent Scholar in Nanotechnology at Ohio State.
“For a long time, we’ve dreamt of finding little engines that would not have moving parts that can take heat and make electricity.”
And now they have.
Most materials conduct only one type of charge, causing most thermoelectric devices to be composed of multiple compounds – yet the complexity of making contacts to them has hampered efforts to build an efficient and effective thermoelectric generator that is easy to construct and can withstand high temperatures.
Two years ago, this research team discovered unexpected properties in a different compound that allowed electrons and holes (the sources, respectively, of the negative and positive charges that generate an electrical current) to run along what might resemble a north-south highway for one charge and an east-west highway for the other.
After that discovery, the researchers combed through existing research on other crystals that had been found by other scientists to do the same thing.
“We got interested in this because at first, we didn’t realize it could exist. When we figured out it could exist, we’ve been really pushing to find these materials,” Goldberger said. To date, they’ve experimentally confirmed 15 materials with these properties – out of the over 110,000 crystal structures discovered and cataloged in an international database.
“A few had been discovered, but none was exploited for functionality. What we have found is that we can actually do something with it,” said Wolfgang Windl, a professor of materials science and engineering at Ohio State and co-author of the study.
“All we have to do is put wires to one end and orient the crystal a certain way and suddenly we have a power generator with no moving parts. And you make it warm with whatever waste heat you have in your home, car or rocket, and this will generate emission-free power all by itself and basically endlessly. It’s a little bit like black magic to me.”
Theoretically, a generator made with this compound could be put to use any place heat is generated – the size of the crystal can be variable, and in this study was dictated by the size of the furnace in which it was grown.
Heremans said the generator could produce enough electricity from car exhaust to propel the vehicle forward, but he favors the idea of using this technology on a smaller scale: “The smaller-scale applications are where complex solutions are not welcome because they’re too expensive,” he said. “That’s where a simple solution like this one is probably best.”
Co-authors include Michael Scudder, Bin He (now at the Max Planck Institute) and Yaxian Wang (now at Harvard University) of Ohio State and Akash Rai and David Cahill of the University of Illinois at Urbana-Champaign.
Featured image: Researchers say the transverse thermoelectric generator could produce enough electricity from car exhaust to propel a vehicle forward. Photo: Shutterstock.com
Using novel imaging methods for studying brain metabolism, University of Kentucky researchers have identified the reservoir for a necessary sugar in the brain. Glycogen serves as a storage depot for the sugar glucose. The laboratories of Ramon Sun, Ph.D., assistant professor of neuroscience, Markey Cancer Center at the University of Kentucky College of Medicine, and Matthew Gentry, Ph.D., professor of molecular and cellular biochemistry and director of the Lafora Epilepsy Cure Initiative at the University of Kentucky College of Medicine, discovered that glucose, the sugar used for cellular energy production, was not the only sugar contained in glycogen in the brain. Brain glycogen also contained another sugar called glucosamine. The full study was recently published in Cell Metabolism.
Some forms of glucosamine, such as glucosamine sulfate and glucosamine hydrochloride, are common supplements used to improve joint movement. However, within cells, glucosamine is an essential sugar needed for the complex carbohydrate chains that are attached to proteins in a process called glycosylation. These sugar chains decorate proteins and the sugar decorations are critical for the appropriate function of myriad proteins.
Discovering that glucosamine is a major component of brain glycogen provides key insight into neurological diseases caused by aberrant glycogen-like cellular aggregates called polyglucosan bodies (PGBs). Lafora disease is a rare, inherited childhood dementia caused by PGBs and this study demonstrates that the Lafora disease PGBs sequester glucosamine, leading to numerous cellular perturbations. PGBs also accumulate in the brain as people age and in people with other forms of dementia. Thus, the discovery that glycogen is also a storage cache for glucosamine has broad implications for understanding neurological changes associated with aging.
Using biochemical approaches, the researchers determined the sugar composition of glycogen in the muscle, liver, and brain of mice. Unlike muscle glycogen, which had only 1% glucosamine, and liver glycogen, which had less than 1% glucosamine, brain glycogen contained 25% glucosamine. “The discovery that brain glycogen is comprised of 25% glucosamine was stunning,” stated Sun.
Upon making this surprising discovery, they then identified the enzymes responsible for incorporating glucosamine into glycogen and for releasing glucosamine from glycogen. Again, the discovery was unexpected as these enzymes are the same ones used to incorporate glucose into and release glucose from glycogen.
To understand the implications of their findings for Lafora disease and neurological problems arising from PGBs, the researchers used their newly developed technique called matrix-assisted laser desorption/ionization traveling-wave ion-mobility high-resolution mass spectrometry (MALDI TW IMS) to measure and visualize the amount of glycogen in different regions of the brain. They also used this technique to quantify changes in the specific patterns of the sugar decorations on proteins in multiple regions of the brain.
The team applied MALDI TW IMS to analyze the brains of healthy mice and of two different mouse models of glycogen storage diseases: a model of Lafora disease and a model of glycogen storage disease (GSD) type III. Sun commented, “This new technique allows us to quantify the amount of these sugars with high accuracy while also maintaining the spatial distribution within the brain regarding where the sugars are located. It is crucial that the brain has the correct sugars in the right location within the brain.”
These studies revealed that without the ability to properly regulate brain glycogen metabolism, not only do PGBs form, which perturbs cell metabolism, but the sugar decoration of proteins is also altered. Excitingly, they could restore protein sugar decoration by injecting an antibody-enzyme fusion (VAL-0417) into the brains of Lafora disease mice to degrade the PGBs. Their findings show a direct connection between abnormal glycogen storage and defective protein function in the brain. Their findings have implications for many other GSDs and congenital disorders of glycosylation, which cause severe neurological symptoms, including epilepsy and dementia.
“Multiple neurological diseases have blockades in these metabolic pathways. I’m sure these pathways are going to be important in other neuro-centric diseases as well. Brain glycogen is comprised of glucose and glucosamine and brain metabolism has to balance both in order to stay healthy,” explained Gentry.
The Gentry and Sun laboratories collaborated with several others from the UK College of Medicine, including Drs. Craig Vander Kooi, professor of molecular and cellular biochemistry; Charles Waechter, professor of molecular and cellular biochemistry; Lance Johnson, assistant professor of physiology; and Christine Brainson, assistant professor of toxicology and cancer biology. They also worked with researchers from Indiana University School of Medicine, including Drs. Anna A. DePaoli-Roach, professor of biochemistry and molecular biology; Peter J. Roach, professor of biochemistry and molecular biology; and Thomas D. Hurley, professor of biochemistry and molecular biology. Richard Taylor, professor of chemistry and biochemistry from the University of Notre Dame, and Richard Drake, professor of cell and molecular pharmacology and experimental therapeutics from the Medical University of South Carolina, also contributed to this work.
“This type of transdisciplinary collaborative research takes place at UK because of strong leadership from College of Medicine Dean Robert DiPaola, Dr. Mark Evers, Vice President for Research Lisa Cassis, Ph.D. and others,” stated Sun.
Research reported in this publication was supported by the National Institute of Neurological Disorders and Stroke of the National Institutes of Health under Award Numbers R35NS116824 and P01NS097197, the National Institute on Aging of the National Institutes of Health under Award Numbers R01AG066653 and R01AG062550, the National Institute of Diabetes and Digestive and Kidney Diseases of the National Institutes of Health under Award Number R01DK27221, and the National Cancer Institute of the National Institutes of Health under Award Number P30CA177558. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. This research was also supported by a St. Baldrick’s Career Development Award, a V-Scholar Grant, a Rally Foundation Independent Investigator Grant, and the University of Notre Dame Reisenauer Family GSD Research Fund.
Featured image: Ramon Sun and Matt Gentry collaborated with other researchers to discover an additional type of sugar in the brain. Photo by Pete Comparoni | UKphoto
Reference: Ramon C. Sun, Lyndsay E.A. Young, Ronald C. Bruntz, Kia H. Markussen, Zhengqiu Zhou, Lindsey R. Conroy, Tara R. Hawkinson, Harrison A. Clarke, Alexandra E. Stanback, Jessica K.A. Macedo, Shane Emanuelle, M. Kathryn Brewer, Alberto L. Rondon, Annette Mestas, William C. Sanders, Krishna K. Mahalingan, Buyun Tang, Vimbai M. Chikwana, Dyann M. Segvich, Christopher J. Contreras, Elizabeth J. Allenger, Christine F. Brainson, Lance A. Johnson, Richard E. Taylor, Dustin D. Armstrong, Robert Shaffer, Charles J. Waechter, Craig W. Vander Kooi, Anna A. DePaoli-Roach, Peter J. Roach, Thomas D. Hurley, Richard R. Drake, Matthew S. Gentry, Brain glycogen serves as a critical glucosamine cache required for protein glycosylation, Cell Metabolism, 2021, ISSN 1550-4131, https://doi.org/10.1016/j.cmet.2021.05.003. (https://www.sciencedirect.com/science/article/pii/S1550413121002205)
Researchers from the Danish psychiatry research project iPSYCH have helped identify 33 new genetic variants which, as it turns out, play a role in bipolar disorder. To achieve this, they examined DNA profiles from 413,000 people.
A number of scientific working groups are currently attempting to identify the genetic architecture underlying heritable and severe psychiatric disorders such as schizophrenia, depression and bipolar disorder.
One of these working groups is iPSYCH, Denmark’s largest research project focusing on psychiatric disorders. Together with international colleagues, they have recently examined the genetic risk factors behind bipolar disorder. The research groups have examined DNA profiles from a total of 413,000 people of European descent.
These comprise 42,000 patients with bipolar disorder together with 371,000 individuals without the disorder. By comparing the results from these two groups, the researchers have succeeded in identifying 33 genetic variants which, as it turns out, play a role in the risk of developing bipolar disorder.
More knowledge about the disorder
This means that the number of mapped genetic variants – genetic risk factors – behind bipolar disorder has more than doubled, explains one of the Danish participants in the project, Associate Professor Thomas Damm Als from the Department of Biomedicine at Aarhus University:
“In the international collaboration that we’re part of and which is focused on mapping genetic risk factors behind bipolar disorder, we’ve carried out three studies. Before we began the third study, we had knowledge of 31 risk variants – so this is a really significant increase in our knowledge of the genetic architecture of the disease.”
The 413,000 DNA profiles that the research group has examined stem from a total of 57 European health databases – with around 7,000 profiles from iPSYCH, the Lundbeck Foundation’s Initiative for Integrated Psychiatric Research.
But what exactly are the researchers examining when they map out genetic variants in connection with a psychiatric disorder?
“On a general level we’re looking for certain patterns in the prevalence of large pieces of DNA with several variants. Over time a whole ‘library’ of these genetic variants, which can appear in different places in the genome – our DNA that is – has been built up,” says Thomas Damm Als.
Ideas for new treatment
By examining DNA from people who have a particular disease – and then comparing the results with DNA from people who are not affected by it – it is therefore possible to utilise the ‘library’ to see whether certain genetic variants are especially present in connection with the disease.
“This was how we found the 33 variants,” says Thomas Damm Als: “But before we got to this point, we had to take all 413,000 DNA samples and look for variants at eight million places in the genome. It was a huge analysis, and it was only possible because more than two hundred researchers participated in the work.”
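The per-variant comparison described above is, at its core, a simple association test repeated at millions of positions in the genome: for each variant, allele counts in cases are compared against allele counts in controls. A minimal sketch of such a test, using a 2x2 chi-square statistic with purely illustrative allele counts (not data from the study):

```python
# Minimal sketch of the per-variant test at the heart of a case-control
# genome-wide association study (GWAS). For each variant, allele counts in
# cases and controls form a 2x2 table; a Pearson chi-square statistic
# measures how strongly the allele frequency differs between the groups.
# Real pipelines run this (or a logistic-regression equivalent) at millions
# of positions and apply a genome-wide significance threshold.

def chi_square_2x2(case_alt, case_ref, ctrl_alt, ctrl_ref):
    """Pearson chi-square statistic for a 2x2 allele-count table."""
    table = [[case_alt, case_ref], [ctrl_alt, ctrl_ref]]
    total = case_alt + case_ref + ctrl_alt + ctrl_ref
    row_totals = [case_alt + case_ref, ctrl_alt + ctrl_ref]
    col_totals = [case_alt + ctrl_alt, case_ref + ctrl_ref]
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical variant: the alternate allele is slightly enriched in cases.
stat = chi_square_2x2(case_alt=520, case_ref=480, ctrl_alt=450, ctrl_ref=550)
print(round(stat, 2))  # → 9.81
```

A large statistic flags the variant as a candidate risk factor; whether it survives correction for the roughly eight million tests performed is a separate question, which is one reason such studies need hundreds of thousands of profiles.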
However, the genetic variants do not provide the complete explanation of how a person develops bipolar disorder.
“They are contributory factors; environmental factors also play a role. And a similar ‘cocktail’ of hereditary and environmental factors underlies other psychiatric disorders such as schizophrenia and depression,” explains the researcher.
In a broad sense, the identified genetic variants can be related to brain functions. Some of the variants are thus involved in genes expressed in the brain, while others influence the signalling between nerve cells in the brain.
This knowledge broadens our understanding of bipolar disorder and may also generate ideas for the development of new medical treatments. But as the researcher emphasizes, this is by no means a complete explanation of bipolar disorder:
“Today, we know that bipolar disorder appears to be more heritable than e.g. depression – but how these genetic factors and environmental factors interact is something we still need to understand. And we haven’t yet identified all of the relevant genetic variants.”
The road to a better understanding of the disorder involves even more DNA studies, as demonstrated by a similar mapping of depression currently being carried out by a large team of international researchers. In that study, it took a long time to find genetic variants specific to depression; only when the research group had analysed approximately 500,000 DNA profiles did a clearer picture begin to emerge.
“It’s true for all psychiatric disorders that you need to have a certain study size to have any hope of finding genetic variants that can be considered risk factors,” says Thomas Damm Als.
Note: Bipolar disorder (formerly called manic depression) is a psychiatric disorder that typically emerges in adolescence. It leads to periods of mania or hypomania. These periods, which are characterised by unnatural exhilaration with increased energy, activity and self-esteem, alternate with depressive periods. Manic and depressive symptoms may also occur simultaneously or with rapid shifts. 1.3 per cent of men and 1.8 per cent of women are either hospitalised or receive ambulatory treatment for bipolar disorder during their life.
Background for the results
Type of study: Case control and case cohort studies.
External funding: The Lundbeck Foundation and several international foundations, including the Stanley Foundation and NIH/NIMH. Partners: The result is a collaboration between 230 researchers, primarily from the Department of Genetics and Genomic Sciences and the Department of Psychiatry, Icahn School of Medicine at Mount Sinai, New York, NY, USA; the Division of Mental Health and Addiction, Oslo University Hospital, Oslo, Norway; and the Broad Institute of MIT and Harvard together with Massachusetts General Hospital in Boston; but also including a number of other international researchers and institutions organised in The Psychiatric Genomics Consortium. The scientific article can be read in Nature Genetics.
Reference: Mullins, N., Forstner, A.J., O’Connell, K.S. et al. Genome-wide association study of more than 40,000 bipolar disorder cases provides new insights into the underlying biology. Nat Genet (2021). https://doi.org/10.1038/s41588-021-00857-4
By applying mass spectrometry, scientists at the University of Copenhagen provide some of the most detailed data yet on how mitochondrial proteins cluster into supercomplexes – a process that makes mitochondria more efficient at producing energy. The findings, published in Cell Reports, are a valuable resource for the scientific community, especially those tackling mitochondrial adaptations to exercise training or mitochondrial diseases.
Mitochondria are the cell’s power plants and produce the majority of a cell’s energy needs through an electrochemical process called the electron transport chain, coupled to another process known as oxidative phosphorylation. A number of different proteins in mitochondria facilitate these processes, but it is not fully understood how these proteins are arranged inside mitochondria or what factors can influence their arrangement.
Now, scientists at the University of Copenhagen have used state-of-the-art proteomics technology to shine new light on how mitochondrial proteins gather into electron transport chain complexes, and further into so-called supercomplexes. The research, which is published in Cell Reports, also examined how this process is influenced by exercise training.
“This study has allowed for a comprehensive quantification of electron transport chain proteins within supercomplexes and how they respond to exercise training. These data have implications for how exercise improves the efficiency of energy production in muscle,” says Associate Professor Atul S. Deshmukh from the Novo Nordisk Foundation Center for Basic Metabolic Research (CBMR) at the University of Copenhagen.
Traditional methods provide too little detail
It is already well established that exercise training stimulates mitochondrial mass and affects the formation of supercomplexes, which allows mitochondria in skeletal muscle to produce energy more efficiently. But questions remain about which complexes cluster into supercomplexes and how.
To better understand supercomplex formation, particularly in response to exercise, the team of scientists studied two groups of mice. One group was active and given an exercise wheel for 25 days, while the second, sedentary group was not. After 25 days, the researchers measured the mitochondrial proteins in skeletal muscle from both groups to see how the supercomplexes had changed over time.
When scientists typically analyze how supercomplexes form, they use antibodies to measure one or two proteins per electron transport chain complex. But as there can be up to 44 proteins in a complex, this method is both time-consuming and provides limited information about what happens to the remainder of the proteins in each complex. As a result, there is a lack of detailed knowledge in the field.
Proteomics helps supercomplexes give up their secrets
To generate much more detailed data, the team applied a proteomic technology called mass spectrometry to measure the mitochondrial proteins. By applying proteomics instead of antibodies, the scientists were able to measure nearly all of the proteins in each complex. This provided unprecedented detail of mitochondrial supercomplexes in skeletal muscle and how exercise training influences their formation. Their approach demonstrated that not all of the proteins in each complex or a supercomplex respond to exercise in the same manner.
“Mitochondrial protein content is known to increase with exercise, thus understanding how these proteins assemble into supercomplexes is crucial to decipher how they work. Our research represents a valuable and precious resource for the scientific community, especially for those studying how the mitochondrial proteins organize to be better at what they do best: produce energy under demand,” explains Postdoc Alba Gonzalez-Franquesa.
The interdisciplinary project was a collaboration between the Deshmukh, Treebak and Zierath Groups at CBMR, and the Mann Group at the Novo Nordisk Foundation Center for Protein Research.