Escape From Mars: How Water Fled The Red Planet (Planetary Science)

New UArizona-led research updates our understanding of how water escapes Mars – not like a leaky faucet but with a sudden splash.

Mars once had oceans but is now bone-dry, leaving many to wonder how the water was lost. University of Arizona researchers have discovered a surprisingly large amount of water in the upper atmosphere of Mars, where it is rapidly destroyed, explaining part of this Martian mystery.

This artist’s concept depicts the early Martian environment (right) – believed to contain liquid water and a thicker atmosphere – versus the cold, dry environment seen at Mars today (left). NASA’s Goddard Space Flight Center

Shane Stone, a graduate student in the UArizona Lunar and Planetary Laboratory and lead author of a new paper published today in Science, describes himself as a planetary chemist. Once a laboratory chemist who helped to develop polymers that could be used to wrap and deliver therapeutic drugs more efficiently, he now studies the chemistry of planetary atmospheres.

Since 2014, he has worked on the NASA MAVEN mission, short for Mars Atmosphere and Volatile EvolutioN. The MAVEN spacecraft began orbiting Mars in 2014 and has been recording the composition of the upper atmosphere of Earth’s planetary neighbor ever since.

“We know that billions of years ago, there was liquid water on the surface of Mars,” Stone said. “There must have been a thicker atmosphere, so we know that Mars somehow lost the majority of its atmosphere to space. MAVEN is trying to characterize the processes responsible for this loss, and one portion of that is understanding exactly how Mars lost its water.”

Co-authors of the study include Roger Yelle, a UArizona planetary sciences professor and Stone’s research adviser, as well as researchers from NASA Goddard Space Flight Center and the Center for Research and Exploration in Space Science and Technology in Maryland.

Watching for Water

As MAVEN orbits Mars, it dips into the planet’s atmosphere every 4 1/2 hours. The onboard NGIMS instrument, short for Neutral Gas and Ion Mass Spectrometer, has been measuring the abundance of charged water molecules called ions in the upper Martian atmosphere, about 100 miles from the planet’s surface. From this information, scientists can infer how much water is present in the atmosphere.

Past observations using MAVEN and the Hubble Space Telescope showed that loss of water from the Martian upper atmosphere varies with the seasons. Compared to Earth, Mars takes a more oval-shaped path around the sun and is closest to it during summer in the Martian southern hemisphere.

Stone and his team found that when Mars is nearest the sun, the planet warms, and more water – found on the surface in the form of ice – moves from the surface to the upper atmosphere where it is lost to space. This happens once every Martian year or about every two Earth years. The regional dust storms that occur on Mars every Martian year and the global dust storms that occur across the planet about once every 10 years lead to further heating of the atmosphere and a surge in the upward movement of water.

The processes that drive this cyclical movement show that the classical picture of water escape from Mars is incomplete, Stone said. According to the classical process, water ice is converted to a gas and is destroyed by the sun’s rays in the lower atmosphere. This process, however, would play out as a slow, steady trickle, unaffected by the seasons or dust storms, which doesn’t mesh with current observations.

“This is important because we didn’t expect to see any water in the upper atmosphere of Mars at all,” Stone said. “If we compare Mars to Earth, water on Earth is confined close to the surface because of something called the hygropause. It’s just a layer in the atmosphere that’s cold enough to condense (and therefore stop) any water vapor traveling upward.”

The team argues that water is moving past what should be Mars’ hygropause, which is likely too warm to stop the water vapor. Once in the upper atmosphere, water molecules are broken apart by ions very quickly – within four hours, they calculate – and the byproducts are then lost to space.

“The loss of its atmosphere and water to space is a major reason Mars is cold and dry compared to warm and wet Earth. This new data from MAVEN reveals one process by which this loss is still occurring today,” Stone said.

A Dry and Dusty World

When the team extrapolated their findings back 1 billion years, they found that this process can account for the loss of a global ocean about 17 inches deep.

“If we took water and spread it evenly over the entire surface of Mars, that ocean of water lost to space due to the new process we describe would be over 17 inches deep,” Stone said. “An additional 6.7 inches would be lost due solely to the effects of global dust storms.”

During global dust storms, 20 times more water can be transported to the upper atmosphere. For example, one global dust storm lasting 45 days releases the same amount of water to space as Mars would lose during a calm Martian year, or 687 Earth days.
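A back-of-envelope check of the numbers above (the Mars surface area and unit conversions are standard values, not from the paper):

```python
# Global-equivalent-layer depth -> water volume, and the storm-time loss rate
MARS_SURFACE_AREA_M2 = 1.448e14   # approximate surface area of Mars
INCH_TO_M = 0.0254

ocean_depth_m = 17 * INCH_TO_M                          # ~0.43 m deep layer
ocean_volume_km3 = ocean_depth_m * MARS_SURFACE_AREA_M2 / 1e9
print(f"{ocean_volume_km3:,.0f} km^3")                  # ~62,500 km^3 of water

# A 45-day global dust storm sheds as much water as a 687-day calm year,
# so the average loss rate during the storm works out to:
rate_multiplier = 687 / 45
print(f"{rate_multiplier:.1f}x the calm-season rate")   # roughly 15x
```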

And while Stone and his team can’t extrapolate farther back than 1 billion years, he thinks that this process likely didn’t work the same before that, because Mars might have had a stronger hygropause long ago.

“Before the process we describe began to operate, there must have been a significant amount of atmospheric escape to space already,” Stone said. “We still need to nail down the impact of this process and when it began to operate.”

In the future, Stone would like to study the atmosphere of Saturn’s moon Titan.

“Titan has an interesting atmosphere in which organic chemistry plays a significant role,” Stone said. “As a former synthetic organic chemist, I’m eager to investigate these processes.”


Provided by University of Arizona

Skoltech Scientists Develop A Novel Bone Implant Manufacturing Method (Engineering)

Scientists from the Skoltech Center for Design, Manufacturing, and Materials (CDMM) have developed a method for designing and manufacturing complex-shaped ceramic bone implants with a controllable porous structure, which largely enhances tissue fusion efficiency. Their research was published in the journal Applied Sciences.

Bone Implant. ©Pavel Odinev / Skoltech

Ceramic materials are resistant to chemicals, mechanical stress, and wear, which makes them a perfect fit for bone implants that can be custom-made thanks to advanced 3D printing technology. Various porous structures are used to ensure effective cell growth around the implant. For tissue fusion to be more efficient, the pores should have a size of several hundred microns, while the implants can be larger than the pores by several orders of magnitude. In practice, an implant with a specific porous structure must be custom-designed in a very short time frame. Conventional geometric modeling, with the object representation limited to its surface, does not work here due to the complex internal structure of the implant.

Skoltech scientists led by Professor Alexander Safonov modeled the implants using a Functional Representation (FRep) method developed by another Skoltech Professor, Alexander Pasko. “FRep modeling of microstructures has a wealth of advantages,” comments Evgenii Maltsev, a Research Scientist at Skoltech and a co-author of the paper. “First, FRep modeling always guarantees that the resulting model is correct, as opposed to the traditional polygonal representation in CAD systems where models are likely to have cracks or disjointed facets. Second, it ensures complete parametrization of the resulting microstructures and, therefore, high flexibility in the fast generation of variable 3D models. Third, it offers a diversity of tools for modeling various mesh structures.”

In their research, the scientists used the FRep method to design cylindrical implants and a cubic diamond cell to model the cellular microstructure. CDMM’s Additive Manufacturing Lab 3D-printed ceramic implants based on their design and tested them under axial compression.
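As an illustration of the FRep idea, in which a solid is defined by an implicit inequality f(x, y, z) >= 0, here is a minimal sketch of a diamond-type (Schwarz D) lattice cell; the trigonometric approximation and the thickness threshold are textbook values, not the exact cell from the paper:

```python
import math

def schwarz_d(x, y, z, thickness=0.3):
    """Implicit (FRep-style) membership test for a diamond-type lattice:
    points where |f| <= thickness belong to the solid strut network."""
    f = (math.sin(x) * math.sin(y) * math.sin(z)
         + math.sin(x) * math.cos(y) * math.cos(z)
         + math.cos(x) * math.sin(y) * math.cos(z)
         + math.cos(x) * math.cos(y) * math.sin(z))
    return abs(f) <= thickness

# Scaling the coordinates rescales the unit cell (i.e. the pore size)
# without touching the rest of the model, which is the kind of complete
# parametrization the FRep approach provides.
print(schwarz_d(0.0, 0.0, 0.0))   # True: the origin lies on the surface
```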

Interestingly, the new method enables changing the porous structure so as to produce implants of different densities to accommodate the patients’ individual needs.

References: Alexander Safonov, Evgenii Maltsev, Svyatoslav Chugunov, Andrey Tikhonov, Stepan Konev, Stanislav Evlashin, Dmitry Popov, Alexander Pasko and Iskander Akhatov, “Design and Fabrication of Complex-Shaped Ceramic Bone Implants via 3D Printing Based on Laser Stereolithography”, Appl. Sci. 2020, 10(20), 7138.

Provided by Skoltech

In A Warming Climate, Can Birds Take The Heat? (Biology)

We don’t know precisely how hot things will get as climate change marches on, but there’s reason to believe animals in the tropics may not fare as well as their temperate relatives. Many scientists think tropical animals, because they’re accustomed to a more stable thermal environment, may be pushed beyond their limits quickly as temperatures soar. And that could lead to massive species loss.

In a new University of Illinois study, tropical birds such as the cocoa woodcreeper (pictured) showed less acute heat stress when exposed to high temperatures than expected. ©Henry Pollock, University of Illinois

Yet, in a first-of-its-kind study, University of Illinois researchers show both temperate and tropical birds can handle acute heat stress much better than expected.

“In terms of their thermal physiology, a lot of these birds, including tropical species, can tolerate temperatures that are a lot higher than what they experience in their daily lives. That was surprising because tropical ectotherms, such as insects, have been shown to be extremely vulnerable to climate warming,” says Henry Pollock, postdoctoral researcher at Illinois and first author on the study. “We’re just not seeing the same things in birds. It is somewhat encouraging.”

Although they observed some promising trends, the researchers caution against celebrating too soon.

“It’s not necessarily comforting news. If someone walked away from this thinking tropical birds are going to do fine because they’re not going to overheat, that would be a simplistic bottom line to take away from this paper,” says Jeff Brawn, professor in the Department of Natural Resources and Environmental Sciences at Illinois and co-author on the study. “Warming is likely to affect tropical birds indirectly, by impacting their resources, the structure of tropical forests. So they may not be flying around panting, suffering from heat exhaustion, but there may be more indirect effects.”

To test the assumption that tropical and temperate birds differ in their ability to cope with heat stress, Pollock brought 81 species from Panama and South Carolina into field labs to test their responses to rising temperatures. Using tiny sensors, he was able to detect internal body temperatures, as well as metabolic rates, when he exposed the birds to warmer and warmer environments.

Species from both temperate and tropical zones handled the rising temperatures just fine. Birds from South Carolina had a higher heat tolerance, on average, than Panamanian birds, but both groups exceeded Pollock and Brawn’s expectations. And among all the birds, doves and pigeons emerged as thermal superstars. Most birds cool down by panting, but doves and pigeons take advantage of their unique-among-birds ability to “sweat.” In fact, Pollock says, they exceeded the limits of his testing equipment.

Although the study provided the first-ever heat tolerance data for many bird species, the results take on more meaning when put into the context of warming projections.

“Both temperate and tropical birds were able to tolerate temperatures into the 40s [in degrees Celsius], but they only experience maximum temperatures of around 30 degrees Celsius in their everyday environments, so they have a substantial buffer,” Pollock says.

In other words, even if maximum air temperatures rise 3 to 4 degrees Celsius, as projected by some scientists, that’s well within the thermal safety margins of all the birds Pollock measured.
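The safety margin Pollock describes is simple arithmetic; with illustrative values drawn from the article (a tolerance in the low 40s Celsius, everyday maxima near 30 degrees, warming of up to 4 degrees):

```python
heat_tolerance_c = 42.0        # illustrative value for "into the 40s"
max_ambient_c = 30.0           # typical everyday maximum the birds face
projected_warming_c = 4.0      # upper end of the cited projection

buffer_today = heat_tolerance_c - max_ambient_c            # 12 C of headroom
buffer_after_warming = buffer_today - projected_warming_c  # 8 C remains
print(buffer_today, buffer_after_warming)
```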

It’s important to note the experiment, which measured acute heat stress, doesn’t exactly replicate what’s projected to happen during much more gradual climate warming. But few studies have examined the effects of chronic heat stress in birds, and having this baseline knowledge of their acute physiological limits is a good start.

“This is the first geographic comparison ever for birds. We need more data from more sites and studies of chronic heat stress over longer periods of time. But I think at the very least, what we can say is that they’re able to tolerate higher temperatures than I think anybody expected,” Pollock says.

Brawn adds, “We’re just starting to scratch the surface of what we need to do to really understand how climate change is going to affect birds. But this is an important first step.”

References: Henry Pollock, Jeff Brawn, and Zachary Cheviron, “Heat tolerances of temperate and tropical birds and their implications for susceptibility to climate warming,” Functional Ecology, 2020. DOI: 10.1111/1365-2435.13693

Provided by University of Illinois

Birth Of Magnetar From Colossal Collision Potentially Spotted For First Time (Astronomy)

Neutron star merger results in magnetar with brightest kilonova ever observed.

Long ago and far across the universe, an enormous burst of gamma rays unleashed more energy in a half-second than the sun will produce over its entire 10-billion-year lifetime.

After examining the incredibly bright burst with optical, X-ray, near-infrared and radio wavelengths, a Northwestern University-led astrophysics team believes it potentially spotted the birth of a magnetar.

Researchers believe the magnetar was formed by two neutron stars merging, which has never before been observed. The merger resulted in a brilliant kilonova — the brightest ever seen — whose light finally reached Earth on May 22, 2020. The light first came as a blast of gamma-rays, called a short gamma-ray burst.

“When two neutron stars merge, the most common predicted outcome is that they form a heavy neutron star that collapses into a black hole within milliseconds or less,” said Northwestern’s Wen-fai Fong, who led the study. “Our study shows that it’s possible that, for this particular short gamma-ray burst, the heavy object survived. Instead of collapsing into a black hole, it became a magnetar: A rapidly spinning neutron star that has large magnetic fields, dumping energy into its surrounding environment and creating the very bright glow that we see.”

The research has been accepted by The Astrophysical Journal and will be published online later this year.

Fong is an assistant professor of physics and astronomy in Northwestern’s Weinberg College of Arts and Sciences and a member of CIERA (Center for Interdisciplinary Exploration and Research in Astrophysics). The research involved two undergraduates, three graduate students and three postdoctoral fellows from Fong’s laboratory.

‘There was a new phenomenon happening’

After the light was first detected by NASA’s Neil Gehrels Swift Observatory, scientists quickly enlisted other telescopes — including NASA’s Hubble Space Telescope, the Very Large Array, the W.M. Keck Observatory and the Las Cumbres Observatory Global Telescope network — to study the explosion’s aftermath and its host galaxy.

Collision sequence for a magnetar-powered kilonova blast ©NASA/ESA

Fong’s team quickly realized that something didn’t add up.

Compared to X-ray and radio observations, the near-infrared emission detected with Hubble was much too bright. In fact, it was 10 times brighter than predicted.

“As the data were coming in, we were forming a picture of the mechanism that was producing the light we were seeing,” said the study’s co-investigator, Tanmoy Laskar of the University of Bath in the United Kingdom. “As we got the Hubble observations, we had to completely change our thought process, because the information that Hubble added made us realize that we had to discard our conventional thinking and that there was a new phenomenon going on. Then we had to figure out what that meant for the physics behind these extremely energetic explosions.”

Magnetic monster

Fong and her team have discussed several possibilities to explain the unusual brightness that Hubble saw. Researchers think short gamma-ray bursts are caused by the merger of two neutron stars, extremely dense objects about the mass of the sun compressed into the volume of a large city like Chicago. While most short gamma-ray bursts probably result in a black hole, the two neutron stars that merged in this case may have combined to form a magnetar, a supermassive neutron star with a very powerful magnetic field.

“You basically have these magnetic field lines that are anchored to the star that are whipping around at about 1,000 times a second, and this produces a magnetized wind,” Laskar explained. “These spinning field lines extract the rotational energy of the neutron star formed in the merger, and deposit that energy into the ejecta from the blast, causing the material to glow even brighter.”

“We know that magnetars exist because we see them in our galaxy,” Fong said. “We think most of them are formed in the explosive deaths of massive stars, leaving these highly magnetized neutron stars behind. However, it is possible that a small fraction form in neutron star mergers. We have never seen evidence of that before, let alone in infrared light, making this discovery special.”

Strangely bright kilonova

Kilonovae, which are typically 1,000 times brighter than a classic nova, are expected to accompany short gamma-ray bursts. Unique to the merger of two compact objects, kilonovae glow from the radioactive decay of heavy elements ejected during the merger, producing coveted elements like gold and uranium.

“We only have one confirmed and well-sampled kilonova to date,” said Jillian Rastinejad, a co-author of the paper and graduate student in Fong’s laboratory. “So it is especially exciting to find a new potential kilonova that looks so different. This discovery gave us the opportunity to explore the diversity of kilonovae and their remnant objects.”

If the unexpected brightness seen by Hubble came from a magnetar that deposited energy into the kilonova material, then, within a few years, the ejected material from the burst will produce light that shows up at radio wavelengths. Follow-up radio observations may ultimately prove that this was a magnetar, leading to an explanation of the origin of such objects.

“Now that we have one very bright candidate kilonova,” Rastinejad said, “I’m excited for the new surprises that short gamma-ray bursts and neutron star mergers have in store for us in the future.”

References: Tanmoy Laskar et al., “The broadband counterpart of the short GRB 200522A at z = 0.5536: A luminous kilonova or a collimated outflow with a reverse shock?”, arXiv preprint, 2020.

Provided by Northwestern University

NIST Designs a Prototype Fuel Gauge For Orbit (Engineering / Astronomy)

Liquids aren’t as well behaved in space as they are on Earth. Inside a spacecraft, microgravity allows liquids to freely slosh and float about.

This behavior has made fuel quantity in satellites difficult to pin down, but a new prototype fuel gauge engineered at the National Institute of Standards and Technology (NIST) could offer an ideal solution. The gauge, described in the Journal of Spacecraft and Rockets, can digitally recreate a fluid’s 3D shape based on its electrical properties. The design could potentially provide satellite operators with reliable measurements that would help prevent satellites from colliding and keep them operational for longer.

Many satellites perform highly important and lucrative tasks, but some may be decommissioned with fuel still in the tank due to the current methods of measuring fuel quantity. Fuel gauges with higher accuracy could help ensure that satellites stay operational for longer and more is made of their time in orbit. ©NASA Jet Propulsion Laboratory

“Every day that a satellite stays in orbit amounts to probably millions of dollars of revenue,” said Nick Dagalakis, a NIST mechanical engineer and co-author of the study. “The operators want to utilize every drop of fuel, but not so much that they empty the tank.”

Letting a satellite’s tank run dry could leave it stranded in its original orbit with no fuel to avoid smashing into other satellites and producing dangerous debris clouds.

To reduce the probability of collision, operators save the last few drops of fuel to eject satellites into a graveyard orbit, hundreds of kilometers away from functioning spacecraft. They may be wasting fuel in the process, however.

For decades, gauging fuel in space has not been an exact science. One of the most frequently relied upon methods entails estimating how much fuel is being burned with each thrust and subtracting that amount from the volume of fuel in the tank. This method is quite accurate at the start when a tank is close to full, but the error of each estimate carries on to the next, compounding with every thrust. By the time a tank is low, the estimates become more like rough guesses and can miss the mark by as much as 10%.
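The compounding-error problem with that bookkeeping approach can be sketched with a toy simulation (burn sizes and error bounds are made up for illustration):

```python
import random

random.seed(1)  # reproducible illustration

def bookkeeping_gauge(burns, per_burn_error=0.02, tank=100.0):
    """Dead-reckoning fuel gauge: subtract an *estimate* of each burn.
    Each estimate is off by up to +/-2%, and the errors accumulate."""
    actual = estimate = tank
    for burn in burns:
        actual -= burn
        estimate -= burn * (1 + random.uniform(-per_burn_error, per_burn_error))
    return actual, estimate

actual, estimate = bookkeeping_gauge([0.5] * 150)   # 150 small thrusts
# Near-empty tank: the estimate has drifted away from the true value
print(actual, round(estimate, 2))
```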

Without reliable measurements, operators may be sending satellites with fuel still in the tank into an early retirement, potentially leaving a considerable amount of money on the table.

The concept of the new gauge — originally devised by Manohar Deshpande, a technology transfer manager at NASA Goddard Space Flight Center — makes use of a low-cost 3D imaging technique known as electrical capacitance volume tomography (ECVT).

Like a CT scanner, ECVT can approximate an object’s shape by taking measurements at different angles. But instead of shooting X-rays, electrodes emit electric fields and measure the object’s ability to store electric charge, or capacitance.

Deshpande sought the expertise of Dagalakis and his colleagues at NIST — who had previous experience fabricating capacitance-based sensors — to help make his designs a reality.

In the NanoFab clean room at NIST’s Center for Nanoscale Science and Technology, the researchers produced sensor electrodes using a process called soft lithography, in which they printed patterns of ink over copper sheets with a flexible plastic backing. Then, a corrosive chemical carved out the exposed copper, leaving behind the desired strips of metal, Dagalakis said.

The interior of the prototype fuel tank is lined with flexible electrodes, each capable of emitting electric fields (yellow arrows) that weaken as they pass through the balloon filled with heat transfer fluid (HT-90). The electrodes pick up on the fields generated by the others, weakened or at full strength. By combining the measurements of every electrode pair, the gauge can estimate the location and volume of the balloon. ©NIST/N. Hanacek

The team lined the interior of an egg-shaped container modeled after one of NASA’s fuel tanks with the flexible sensors. Throughout the inside of the tank, electric fields emitted by each sensor can be received by the others. But how much of these fields end up being transmitted depends on the capacitance of whatever material is inside the tank.

“If you have no fuel, you have the highest transmission, and if you have fuel, you’re going to have a lower reading, because the fuel absorbs the electromagnetic wave,” Dagalakis said. “We measure the difference in transmission for every possible sensor pair, and by combining all these measurements, you can know where there is and isn’t fuel and create a 3D image.”
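The pairwise measurement scheme Dagalakis describes can be sketched as follows (the electrode count and threshold are illustrative, not from the prototype):

```python
from itertools import combinations

def measurement_pairs(n_electrodes):
    """Every unique transmit/receive electrode pair in one ECVT frame."""
    return list(combinations(range(n_electrodes), 2))

def fuel_paths(readings, empty_baseline, threshold=0.05):
    """Flag pairs whose transmission dropped versus the empty-tank baseline:
    fuel along that path absorbs the field and lowers the reading."""
    return [pair for pair in readings
            if (empty_baseline[pair] - readings[pair]) / empty_baseline[pair]
            > threshold]

pairs = measurement_pairs(8)
print(len(pairs))   # 8 electrodes yield 28 independent measurements
```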

To test out what the new system’s fuel gauging capabilities might look like in space, the researchers suspended a fluid-filled balloon in the tank, mimicking a liquid blob in microgravity.

Many liquids commonly used to propel satellites and spacecraft, such as liquid hydrogen and hydrazine, are highly flammable in Earth’s oxygen-rich atmosphere, so the researchers opted to test something more stable, Dagalakis said.

At Deshpande’s recommendation, they filled the balloons with a heat transfer fluid — normally used for storing or dissipating thermal energy in industrial processes — because it closely mimicked the electrical properties of space fuel.

The researchers activated the system and fed the capacitance data to a computer, which produced a series of 2D images mapping the location of fluid throughout the length of the tank. When compiled, the images gave rise to a 3D rendition of the balloon with a diameter within 6% of the actual balloon’s diameter.

“This is just an experimental prototype, but that is a good starting point,” Dagalakis said.

If further developed, the ECVT system could help engineers and researchers overcome several other challenges posed by the behavior of liquids in space.

“The technology could be used to continuously monitor fluid flow in the many pipes aboard the International Space Station and to study how the small forces of sloshing fluids can alter the trajectory of spacecraft and satellites,” Deshpande said.

Provided by NIST

Advanced Atomic Clock Makes A Better Dark Matter Detector (Cosmology / Astronomy)

JILA researchers have used a state-of-the-art atomic clock to narrow the search for elusive dark matter, an example of how continual improvements in clocks have value beyond timekeeping.

Cartoon depicting a clock looking for dark matter. ©Hanacek/NIST.

Older atomic clocks operating at microwave frequencies have hunted for dark matter before, but this is the first time a newer clock operating at higher optical frequencies, together with an ultra-stable oscillator to ensure steady light waves, has been harnessed to set more precise bounds on the search. The research is described in Physical Review Letters.

Astrophysical observations show that dark matter makes up most of the “stuff” in the universe, but so far it has eluded capture. Researchers around the world have been looking for it in various forms. The JILA team focused on ultralight dark matter, which in theory has a teeny mass (much less than a single electron) and a humongous wavelength – how far a particle spreads in space – that could be as large as the size of dwarf galaxies. This type of dark matter would be bound by gravity to galaxies and thus to ordinary matter.

Ultralight dark matter is expected to create tiny fluctuations in two fundamental physical “constants”: the electron’s mass and the fine-structure constant. The JILA team used a strontium lattice clock and a hydrogen maser (a microwave version of a laser) to compare their well-known optical and microwave frequencies, respectively, to the frequency of light resonating in an ultra-stable cavity made from a single crystal of pure silicon. The resulting frequency ratios are sensitive to variations over time in both constants. The relative fluctuations of the ratios and constants can be used as sensors to connect cosmological models of dark matter to accepted physics theories.

The JILA team established new limits on a floor for “normal” fluctuations, beyond which any unusual signals discovered later might be due to dark matter. The researchers constrained the coupling strength of ultralight dark matter to the electron mass and the fine-structure constant to be on the order of 10⁻⁵ (1 in 100,000) or less, the most precise measurement ever of this value.

JILA is jointly operated by the National Institute of Standards and Technology (NIST) and the University of Colorado Boulder.

“Nobody actually knows at what sensitivity level you will start to see dark matter in laboratory measurements,” NIST/JILA Fellow Jun Ye said. “The problem is that physics as we know it is not quite complete at this point. We know something is missing but we don’t quite know how to fix it yet.”

“We know dark matter exists from astrophysical observations, but we don’t know how the dark matter connects to ordinary matter and the values we measure,” Ye added. “Experiments like ours allow us to test various theory models people put together to try to explore the nature of dark matter. By setting better and better bounds, we hope to rule out some incorrect theory models and eventually make a discovery in the future.”

Scientists are not sure whether dark matter consists of particles or oscillating fields affecting local environments, Ye noted. The JILA experiments are intended to detect dark matter’s “pulling” effect on ordinary matter and electromagnetic fields, he said.

Atomic clocks are prime probes for dark matter because they can detect changes in fundamental constants and are rapidly improving in precision, stability and reliability. The cavity’s stability was also a crucial factor in the new measurements. The resonant frequency of light in the cavity depends on the length of the cavity, which can be traced back to the Bohr radius (a physical constant equal to the distance between the nucleus and the electron in a hydrogen atom). The Bohr radius is also related to the values of the fine-structure constant and electron mass. Therefore, changes in the resonant frequency as compared to transition frequencies in atoms can indicate fluctuations in these constants caused by dark matter.
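The chain of dependence described above can be written out schematically (these are standard textbook scalings, ignoring species-specific relativistic sensitivity factors):

```latex
a_0 = \frac{\hbar}{m_e c\,\alpha}, \qquad
\nu_{\mathrm{cav}} \propto \frac{c}{L} \propto \frac{c}{a_0}
  \propto \frac{m_e c^2 \alpha}{\hbar}

\nu_{\mathrm{atom}} \sim \frac{m_e c^2 \alpha^2}{2h}
\;\Longrightarrow\;
\frac{\nu_{\mathrm{atom}}}{\nu_{\mathrm{cav}}} \propto \alpha,
\qquad
\frac{\delta\left(\nu_{\mathrm{atom}}/\nu_{\mathrm{cav}}\right)}
     {\nu_{\mathrm{atom}}/\nu_{\mathrm{cav}}}
  \approx \frac{\delta\alpha}{\alpha}
```

In this leading-order picture the electron mass cancels in the optical clock/cavity ratio, which is why the hydrogen maser comparison is also needed to gain sensitivity to fluctuations in the electron mass.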

Researchers collected data on the strontium/cavity frequency ratio for 12 days with the clock running 30% of the time, resulting in a data set 978,041 seconds long. The hydrogen maser data spanned 33 days with the maser running 94% of the time, resulting in a 2,826,942-second record. The hydrogen/cavity frequency ratio provided useful sensitivity to the electron mass although the maser was less stable and produced noisier signals than the strontium clock.

JILA researchers collected the dark matter search data during their recent demonstration of an improved time scale — a system that incorporates data from multiple atomic clocks to produce a single, highly accurate timekeeping signal for distribution. As the performance of atomic clocks, optical cavities and time scales improves in the future, the frequency ratios can be re-examined with ever-higher resolution, further extending the reach of dark matter searches.

“Any time one is running an optical atomic time scale, there is a chance to set a new bound on or make a discovery of dark matter,” Ye said. “In the future, when we can put these new systems in orbit, it will be the biggest ‘telescope’ ever built for the search for dark matter.”

References : C.J. Kennedy, E. Oelker, J.M. Robinson, T. Bothwell, D. Kedar, W.R. Milner, G.E. Marti, A. Derevianko and J. Ye. Precision Metrology Meets Cosmology: Improved Constraints on Ultralight Dark Matter from Atom-Cavity Frequency Comparisons. Physical Review Letters. Published online Nov. 12, 2020. DOI: 10.1103/PhysRevLett.125.201302

Provided by NIST

Astrocytes Identified As Master ‘Conductors’ Of the Brain (Neuroscience)

Star-shaped ‘glue’ cells make it their business to govern connections between neurons.

In the orchestra of the brain, the firing of each neuron is controlled by two notes – excitatory and inhibitory – that come from two distinct forms of a cellular structure called synapses. Synapses are essentially the connections between neurons, transmitting information from one cell to the other. The synaptic harmonies come together to create the most exquisite music – at least most of the time.

Astrocytes are highly complex cells that tightly envelope synaptic structures in the brain. This picture shows 3D-printed forms of astrocytes. ©Katie King – Duke University

When the music becomes discordant and a person is diagnosed with a brain disease, scientists typically look to the synapses between neurons to determine what went wrong. But a new study from Duke University neuroscientists suggests that it would be more useful to look at the white-gloved conductor of the orchestra — the astrocyte.

Astrocytes are star-shaped cells that form the glue-like framework of the brain. They are a type of cell known as glia, from the Greek word for “glue.” Astrocytes were already known to help control excitatory synapses; now a team of Duke scientists has found that they also regulate inhibitory synapses by binding to neurons through an adhesion molecule called NrCAM. The astrocytes reach out thin, fine tentacles to the inhibitory synapse, and when they touch, NrCAM forms the adhesion. The findings were published in Nature on November 11.

“We really discovered that the astrocytes are the conductors that orchestrate the notes that make up the music of the brain,” said Scott Soderling, PhD, chair of the Department of Cell Biology in the School of Medicine and senior author on the paper.

Excitatory synapses, the brain’s accelerator, and inhibitory synapses, the brain’s brakes, were previously thought to be the most important instruments in the brain. Too much excitation can lead to epilepsy, too much inhibition can lead to schizophrenia, and an imbalance in either direction can lead to autism.

However, this study shows that astrocytes are running the show in overall brain function and could be important targets for brain therapies, said co-senior author Cagla Eroglu, PhD, associate professor of cell biology and neurobiology in the School of Medicine. Eroglu is a world expert in astrocytes; in 2017, her lab discovered how astrocytes send out their tentacles and connect to synapses.

“A lot of the time, studies that investigate molecular aspects of brain development and disease study gene function or molecular function in neurons, or they only consider neurons to be the primary cells that are affected,” said Eroglu. “However, here we were able to show that by simply changing the interaction between astrocytes and neurons — specifically by manipulating the astrocytes — we were able to dramatically alter the wiring of the neurons as well.”

Soderling and Eroglu are frequent scientific collaborators, and they hashed out the plan for the project over coffee and pastries: to apply a proteomic method developed in Soderling’s lab and refined by his postdoctoral associate Tetsuya Takano, the paper’s lead author.

Takano designed a new method that uses a virus to deliver an enzyme into the brain of a mouse, where it labels the proteins connecting astrocytes and neurons. Once the proteins were tagged, the scientists could pluck them from the brain tissue and use Duke’s mass spectrometry facility to identify the adhesion molecule NrCAM.

Then Takano teamed up with Katie Baldwin, a postdoctoral associate in Eroglu’s lab, to run assays to determine how NrCAM mediates the connection between astrocytes and inhibitory synapses. Together, the labs discovered that NrCAM was a missing link controlling how astrocytes influence inhibitory synapses, demonstrating that astrocytes influence all of the ‘notes’ of the brain.

“We were very lucky that we had really cooperative team members,” said Eroglu. “They worked very hard and they were open to crazy ideas. I would call this a crazy idea.”

References: Takano, T., Wallace, J.T., Baldwin, K.T. et al. Chemico-genetic discovery of astrocytic control of inhibition in vivo. Nature (2020).

Provided by Duke University

Image Release: Galaxies in The Perseus Cluster (Astronomy)

Images show effect of environment on galaxies.

For galaxies, as for people, living in a crowd is different from living alone. Recently, astronomers used the National Science Foundation’s Karl G. Jansky Very Large Array (VLA) to learn how a crowded environment affects galaxies in the Perseus Cluster, a collection of thousands of galaxies some 240 million light-years from Earth.

Galaxies in the Perseus Cluster, left to right: NGC 1275, NGC 1265, IC 310. ©M. Gendron-Marsolais et al.; S. Dagnello, NRAO/AUI/NSF; Sloan Digital Sky Survey.

Left: The giant galaxy NGC 1275, at the core of the cluster, is seen in new detail, including a newly-revealed wealth of complex, filamentary structure in its radio lobes.

Center: The galaxy NGC 1265 shows the effects of its motion through the tenuous material between the galaxies. Its radio jets are bent backward by that interaction, then merge into a single, broad “tail.” The tail is then further bent, possibly by motions within the intergalactic material.

Right: The jets of the galaxy IC 310 are bent backward, similarly to NGC 1265, but appear closer together because of the viewing angle from Earth. That angle also allows astronomers to directly observe energetic gamma rays generated near the supermassive black hole at the galaxy’s core.

Such images can help astronomers better understand the complex environment of galaxy clusters, which are the largest gravitationally-bound structures in the universe, and which harbor a variety of still poorly-understood phenomena.

“These images show us previously-unseen structures and details and that helps our effort to determine the nature of these objects,” said Marie-Lou Gendron-Marsolais, an ESO/ALMA Fellow in Santiago, Chile. She and a number of international collaborators are announcing their results in the Monthly Notices of the Royal Astronomical Society.

Provided by NRAO

Want to Lower Your Risk of Anxiety and Depression? Stay Fit (Psychiatry)

High fitness levels are associated with a lower risk of depression and anxiety.

Do you need a new, evidence-based source of motivation to get in shape or stay fit? If so, a recently published seven-year study of over 150,000 people reports that individuals with higher aerobic and muscular fitness levels are significantly less likely to experience depression and anxiety. These findings (Kandola et al., 2020) were published on Nov. 11 in the journal BMC Medicine.

Lead author Aaron Kandola of UCL’s Division of Psychiatry and colleagues found that people with low levels of cardiorespiratory and muscular fitness were almost twice as likely to experience depression as study participants with higher aerobic/muscular fitness. Low levels of fitness also predicted a significantly greater chance of generalized anxiety disorder. Karmel Choi of Harvard Medical School, whose previous research (Choi et al., 2019) found that physical activity keeps depression at bay, is a co-author of this study.

This prospective cohort study involved 152,978 participants aged 40 to 69 who were part of the larger UK Biobank study, which recruited 500,000 people from England, Scotland, and Wales between April 2007 and December 2010 for nationwide longitudinal research.

About seven years ago, Kandola et al. collected baseline measurements of participants’ cardiorespiratory fitness using stationary bicycles and conducted grip strength tests using a hydraulic hand dynamometer in each hand as a predictor of total muscle strength.

At baseline, the researchers accounted for confounding factors such as chronic illness, dietary habits, mental illness history, and socioeconomic status. Participants also filled out questionnaires that assessed depression and anxiety symptoms.

Seven years later, the researchers assessed participants’ mental health again and found that “high aerobic and muscular fitness at the start of the study was associated with better mental health seven years later.”

More specifically, the follow-up analysis showed that “people with the lowest combined aerobic and muscular fitness had 98% higher odds of depression, 60% higher odds of anxiety, and 81% higher odds of having either one of the common mental health disorders, compared to those with high levels of overall fitness.” (Note: This is an observational study that identifies correlation, not causation.)
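To make the “X% higher odds” phrasing concrete, the percentages above come from odds ratios: an odds ratio of 1.98 between the low- and high-fitness groups is reported as “98% higher odds.” The short sketch below uses entirely hypothetical counts (not figures from the study) to show the arithmetic.

```python
# Hypothetical counts (NOT from the study) illustrating how
# "X% higher odds" follows from an odds ratio.
low_fit_cases, low_fit_total = 300, 2000    # low-fitness group: cases / group size
high_fit_cases, high_fit_total = 170, 2000  # high-fitness group: cases / group size

# Odds = cases divided by non-cases within each group.
odds_low = low_fit_cases / (low_fit_total - low_fit_cases)
odds_high = high_fit_cases / (high_fit_total - high_fit_cases)

odds_ratio = odds_low / odds_high
pct_higher = (odds_ratio - 1) * 100

print(round(odds_ratio, 2), round(pct_higher))  # 1.9 90
```

With these made-up counts the low-fitness group has roughly 90% higher odds of depression; the study’s reported 98% corresponds to an odds ratio of 1.98. Note that odds ratios overstate relative risk when the outcome is common.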

Based on these findings, the authors speculate that aerobic and resistance training may offset the risk of experiencing depression, anxiety, and other common mental health disorders. “While broadly increasing physical activity will be beneficial, structured aerobic and resistance exercises with sufficient intensity to improve fitness may have a greater effect on risk reduction,” the authors note.

“Our findings suggest that encouraging people to exercise more could have extensive public health benefits, improving not only our physical health but our mental health too,” senior author Joseph Hayes of University College London said in a news release. “Improving fitness through a combination of cardio exercise and strength and resistance training appears to be more beneficial than just focusing on aerobic or muscular fitness.”

“Other studies have found that just a few weeks of regular intensive exercise can make substantial improvements to aerobic and muscular fitness, so we are hopeful that it may not take much time to make a big difference to your risk of mental illness,” Kandola added. “Reports that people are not as active as they used to be are worrying, and even more so now that global lockdowns have closed gyms and limited how much time people are spending out of the house. Physical activity is an important part of our lives and can play a key role in preventing mental health disorders.”

From a public health perspective, the researchers speculate that encouraging the general population to improve their physical fitness through a combination of cardio workouts and strength-training exercises could significantly reduce the incidence of depression, anxiety, and other common mental disorders while simultaneously improving physical health outcomes.

References:
(1) Aaron A. Kandola, David P. J. Osborn, Brendon Stubbs, Karmel W. Choi, and Joseph F. Hayes. “Individual and Combined Associations Between Cardiorespiratory Fitness and Grip Strength With Common Mental Disorders: A Prospective Cohort Study in the UK Biobank.” BMC Medicine (first published November 11, 2020). DOI: 10.1186/s12916-020-01782-9
(2) Karmel W. Choi, Chia-Yen Chen, Murray B. Stein, Yann C. Klimentidis, Min-Jung Wang, Karestan C. Koenen, and Jordan W. Smoller. “Assessment of Bidirectional Relationships Between Physical Activity and Depression Among Adults: A 2-Sample Mendelian Randomization Study.” JAMA Psychiatry (first published online January 23, 2019). DOI: 10.1001/jamapsychiatry.2018.4175

This article was originally written by Christopher Bergland and is republished here from Psychology Today.