Study Suggests Long-term Suppression of Hepatitis B in Patients Who Are HIV-coinfected May Lower Cancer Risk (Medicine)

Hepatitis B in the Blood Raises Risk of Hepatocellular Carcinoma Among HIV/Hepatitis B-Coinfected Patients

While the risk of hepatocellular carcinoma (HCC) – primary liver cancer – is higher among patients who have HIV, it’s even higher among patients who have HIV and detectable hepatitis B, according to research from the Perelman School of Medicine at the University of Pennsylvania. Among participants with HIV and hepatitis B, suppressing detectable hepatitis B infection with the use of antiretroviral therapy cut the risk of developing HCC by 58 percent. These findings suggest that the best care for individuals with HIV and detectable hepatitis B includes sustained hepatitis B suppression with antiretroviral therapy in order to cut the risk of developing HCC. The study is published in the journal Hepatology.
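As a note on how a figure like "58 percent" is read: relative risk reductions of this kind are typically reported as hazard ratios. The sketch below is illustrative arithmetic only; it assumes, hypothetically, that the reported reduction corresponds to an adjusted hazard ratio of about 0.42 (the study's exact estimate is not given in this article).

```python
# Illustrative only: converting a hazard ratio into a percent risk reduction.
# The 0.42 value is an assumption for illustration, not a figure from the study.
def percent_risk_reduction(hazard_ratio):
    """Percent reduction in risk implied by a hazard ratio below 1."""
    return round((1 - hazard_ratio) * 100, 1)

print(percent_risk_reduction(0.42))  # 58.0
```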

HCC affects approximately 25,000 people each year in the United States and is considered a very aggressive type of cancer. Internationally and in the United States, chronic hepatitis B is a leading cause of HCC through direct and indirect effects on the liver. Additionally, chronic hepatitis B is common among patients who have HIV.

In order to study the predictors of HCC among people co-infected with HIV and chronic hepatitis B, the researchers used data from the North American AIDS Cohort Collaboration on Research and Design, which contains health information spanning two decades. The study population included data from over 8,000 people co-infected with HIV and chronic hepatitis B. Those with detectable HIV and detectable hepatitis B had a higher likelihood of developing HCC compared to those who had both viruses suppressed. Antiretroviral treatment for chronic hepatitis B reduced the risk of developing HCC, and the risk dropped substantially when hepatitis B viremia was suppressed to undetectable levels and when the viral suppression lasted at least a year.

“Most HIV providers do not regularly monitor hepatitis B viral load in practice, even while on antiretroviral treatment,” said senior author Vincent Lo Re III, MD, MSCE, an associate professor of Medicine and Epidemiology at Penn Medicine. “Our data highlight the importance of regular assessment of hepatitis B viral load and achievement of hepatitis B suppression during antiretroviral therapy in people with HIV and chronic hepatitis B coinfection.” In addition, maintaining adherence can be a challenge for certain patients depending on their overall health and other factors. “This study highlights the importance of testing and regular care for HIV and chronic hepatitis B co-infected individuals as well as the value of programs and strategies that help co-infected individuals maximize antiretroviral adherence to achieve hepatitis B viral suppression.”

According to Dr. Lo Re and his team, including lead author H. Nina Kim, MD, MSc, an associate professor of medicine at University of Washington, heavy alcohol use and coinfection with hepatitis C were also associated with an increased risk of HCC among individuals with HIV and chronic hepatitis B co-infection. The study authors advise that reducing excessive drinking and using direct-acting antiviral therapy targeted to chronic hepatitis C infection could also help to lower the risk of liver cancer in dually infected people.

The study was supported by the National Institute of Allergy and Infectious Diseases (R21-AI124868).

Reference: Kim, H.N., Newcomb, C.W., Carbonari, D.M., Roy, J.A., Torgersen, J., Althoff, K.N., Kitahata, M.M., Rajender Reddy, K., Lim, J.K., Silverberg, M.J., Mayor, A.M., Horberg, M.A., Cachay, E.R., Kirk, G.D., Sun, J., Hull, M., Gill, M.J., Sterling, T.R., Kostman, J.R., Peters, M.G., Moore, R.D., Klein, M.B., Lo Re, V. III, for the North American AIDS Cohort Collaboration on Research and Design of IeDEA (2021), Risk of Hepatocellular Carcinoma with Hepatitis B Viremia among HIV/Hepatitis B Virus-Coinfected Persons in North America. Hepatology. Accepted Author Manuscript.

Provided by Penn Medicine

ExoMars: Good News From Parachute Tests (Astronomy)

The parachute system that will help land the ExoMars Rosalind Franklin rover on Mars has completed dynamic ground extraction tests of the first main parachute. Apart from a small problem, which ESA says can be solved in a few days, the tests went well, confirming the upgrades made after the earlier failed tests. High-altitude tests of both main parachutes are scheduled for the coming months.

Good news for the ExoMars 2022 mission: the first main parachute, essential for slowing the lander before it touches down on the Red Planet, has passed its ground extraction tests. This leaves the expected schedule unchanged, with the mission's launch planned for the window running from September 20 to October 1, 2022.

The descent module of the ESA-Roscosmos ExoMars 2022 mission – consisting of the Rosalind Franklin rover, designed to search for signs of life on the planet, and the Kazachok surface platform, which will monitor the landing site – requires two main parachutes, each with its own pilot extraction parachute, to slow it down as it plunges into the Martian atmosphere.

The first of the two main parachutes, 15 meters in diameter, will be deployed while the descent module is still traveling at supersonic speed. The second main parachute, 35 meters in diameter (the largest ever deployed on Mars), will open about 20 seconds later, when the module has slowed to about 400 km/h.

Tests carried out at NASA’s Jet Propulsion Laboratory in California involved the first of the main parachutes and its containment and release bag.

These are dynamic ground extraction tests: tests that reproduce the high speeds at which the parachute will be pulled from its bag during the descent to Mars. They made it possible to verify the changes made to the design by Airborne Systems and Arescosmo after the problems that emerged in the latest high-altitude tests.

The dynamic ground extraction test facility at NASA's Jet Propulsion Laboratory, California. Credits: NASA/JPL-Caltech

In these tests, the parachute assembly (the white circular structure in the image above), with one end tied to a special machine and the other anchored to a structure suspended on a long cable, is fired at high speed by a compressed-air system. When the extraction mechanism is activated, the parachute is pulled from its bag at the target speed, mimicking deployment on Mars. The tests allow extraction at speeds of more than 200 km/h.

The parachute was actually subjected to a double extraction test. In the first, on April 27, Arescosmo evaluated the modifications made to the prototype after the previous unsuccessful tests: the new design of the release bag and the new way of folding the parachute inside it, to avoid twisting and damage to the fabric during extraction. In the second, three days later, Airborne Systems validated the extraction process.

“Both prototypes performed very well in testing,” says Thierry Blancquaert, head of ExoMars’ Spacecraft Systems Engineering team. “The inspection showed that some small areas of the parachute had been subjected to friction during extraction from the bag, reducing the strength of the fabric at those points. Analysis of the video footage allowed the Airborne Systems team to pinpoint when the damage occurred and to make changes to the parachute bag and packaging. These changes,” Blancquaert continues, “could be made quickly, in just a couple of days, to soon reach a successful result.”

The first main parachute of the ExoMars 2022 mission on the ground after a dynamic extraction test at JPL's test facility. Credits: NASA/JPL-Caltech

The parachute was originally packed in the bag around the central structure that holds its pilot parachute, so that after extraction it opened through a full 360 degrees. Folding the canopy in two halves, so that one half unfolds first and then the other, has been shown to reduce the friction the canopy undergoes as it unwinds around the central structure prior to deployment.

Compared with the previous tests, then, this represents significant progress.

Once the changes are made, Airborne Systems will conduct the first high-altitude drop tests of this parachute, scheduled for early June in Kiruna, Sweden. In these tests, a test vehicle is carried to an altitude of about 29 km by a sounding balloon and released; the parachutes then open in sequence, subjecting them as realistically as possible to the stresses they will experience in the thin atmosphere of Mars.

Arescosmo will also conduct high-altitude tests in the coming months, but its focus will be on the second main parachute, the 35-meter one. The improvements made to this parachute and its bag, which include stronger canopy lines and reinforced fabric around the apex of the parachute, have already been implemented and tested in dynamic extraction tests conducted in December 2020.

The high-altitude tests will use a slightly smaller pilot parachute for the second main parachute: 3.7 meters in diameter instead of 4.5 meters – a decision taken to reduce the friction generated as the main parachute is extracted from its bag. The pilot parachute cannot be pre-tested in the ground extraction tests, as those focus only on pulling the main parachutes out of their bags.

But the tests that ESA and Roscosmos intend to conduct do not end there. Further dynamic ground extraction tests are planned for August, after which another pair of high-altitude drop tests will take place between October and November of this year in Oregon, in the United States. More high-altitude tests could follow in the first half of 2022, their configuration depending largely on the outcome of the upcoming tests in Kiruna.

“Our strategy of having two highly qualified teams working on the parachutes, together with the availability of the ground test facility, is already paying off,” concludes Blancquaert. “We are ready and can’t wait to carry out the next high-altitude drop tests. Landing safely on Mars is notoriously difficult. Investing our efforts in this test strategy is essential to ensure the success of the mission when we arrive on Mars in 2023.”

Featured image: The sequence of deployment of the two pilot parachutes and the two main parachutes (15 and 35 meters) of the second ExoMars mission. Credits: Esa

Provided by INAF

Waves Of Iron And Plutonium From Cosmic Catastrophes (Planetary Science)

The production of heavy metals in the cosmos occurs largely through rapid neutron capture processes in certain types of supernovae and in neutron star mergers. By analyzing the content of radioactive isotopes of iron and plutonium in the oceanic crust, a group of researchers investigated the relative contribution of these two phenomena to the enrichment of the Solar System – and with it, the Earth – over time. The study is published in Science.

The Sun formed from a second-hand cloud of gas and dust, “recycled” after a supernova explosion, and for roughly the last 5-10 million years it has been passing through a region called the Local Bubble, the remnant of another supernova explosion. This is why the Earth is rich in chemical elements, including metals such as iron and even heavier ones, which have made possible the richness of shapes, colors and living organisms that we see. And it does not end there, because other elements have probably been deposited over the course of history by similar stellar catastrophes occurring nearby. But what is the contribution of each of these contaminations? A study published today in Science, based on analysis of the chemical abundances of some heavy radioactive isotopes present in the oceanic crust, suggests a rather complex picture.

We are chemically children of the stars; this is now well established. Stars are large natural furnaces that produce chemical elements heavier than hydrogen. But we are not children of all stars in the same way. Stars like the Sun – or a little smaller, or even a little larger – cannot produce heavy metals during their lifetime. Not because they don't live long enough, but because the production of these elements does not happen by nuclear fusion – in astronomical jargon, stellar nucleosynthesis. Iron itself sits at the limit: it is produced by fusion but, unlike lighter elements such as helium, nitrogen, carbon and oxygen, its fusion does not return the favor by providing energy the star can use to counteract gravity and avoid collapsing under its own weight.

To form the heavier elements, truly catastrophic events are needed, such as the explosion of a supernova or the merger of two neutron stars. Half of the elements heavier than iron in the cosmos are produced by rapid neutron capture (r-processes). Supernovae, specifically, create many of the building blocks of human life, such as iron, potassium and iodine. Even heavier elements, such as gold, uranium and plutonium, are instead attributed to rarer events, such as the merger of two neutron stars. Along with the heavy elements, radioactive isotopes are also generated, which, being unstable, decay over time.

Stellar explosions, supernovae, neutron stars: all catastrophic events far away in time and space, none of which has made its extreme violence felt here – not directly, at least. Instead, they threw their products into space, and those products reached the Earth several times, creating deposits that have remained almost unchanged – subject only to natural decay – in the most remote places.

The authors of this study went looking for those deposits in the oceanic crust. They were searching for two isotopes in particular: iron-60 – produced mainly in massive stars and supernovae, with a half-life of 2.6 million years – and plutonium-244 – produced exclusively by r-processes, with a much longer half-life of 80.6 million years. These two isotopes directly testify to violent cosmic events that occurred in the vicinity of the Earth millions of years ago, and their ratio can tell us something more about the astronomical catastrophes that have supplied metals to the Solar System.
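The reasoning about which traces can survive follows directly from exponential decay. A minimal sketch of that arithmetic, using the half-lives quoted above and the time spans mentioned in the article:

```python
# Fraction of a radioactive isotope surviving after time t,
# given its half-life (both in the same units, here millions of years).
def surviving_fraction(t_myr, half_life_myr):
    return 0.5 ** (t_myr / half_life_myr)

FE60_HALF_LIFE = 2.6    # iron-60 half-life, million years
PU244_HALF_LIFE = 80.6  # plutonium-244 half-life, million years

# Over the ~10-million-year window of the two nearby supernovae:
print(surviving_fraction(10, FE60_HALF_LIFE))   # ~0.07: most iron-60 has decayed
print(surviving_fraction(10, PU244_HALF_LIFE))  # ~0.92: most plutonium-244 survives

# Anything present at the Earth's formation ~4,500 million years ago is long gone:
print(surviving_fraction(4500, PU244_HALF_LIFE) < 1e-15)  # True
```

This is why any iron-60 or plutonium-244 found today must come from relatively recent events rather than from the Earth's original material.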

What events are we talking about, exactly, and when did they happen?

“The story is complicated,” explains first author Anton Wallner, a professor at the Australian National University. “Maybe this plutonium-244 was produced in supernova explosions, or it could have remained from a much older event. But it could also have been produced in an even more spectacular way, such as the detonation of a neutron star.”

Any plutonium-244 and iron-60 that was present when the Earth formed from interstellar gas and dust more than four billion years ago has long since completely decayed. The traces found now, therefore, must have originated from more recent events. In particular, the dating of the sample points to at least two supernova explosions that occurred near the Earth in the last 10 million years. In both, the ratio between the iron isotope and the plutonium isotope is similar. The plutonium abundance, however, is lower than would be expected if supernovae alone were responsible for the r-processes. Something else is needed.

“Our data could be the first evidence that supernovae actually produce plutonium-244,” Wallner continues. “Or maybe it was already in the interstellar medium before the supernova exploded, and it was pushed through the Solar System along with the material ejected from the supernova.”

The data collected in this study, then, are on the one hand compatible with the scenario in which the Solar System's passage through the Local Bubble enriches interstellar space and our planet more frequently than the radioactive half-lives of the deposited elements. On the other hand, contaminations from rarer astrophysical sources – such as neutron star mergers – are less frequent, but indispensable to explain the isotope ratios found. The hypothesis of a rare nearby event that occurred before the formation of the Solar System also remains open. In short, exactly where that family gold chain, that bracelet you gave away, or that ring you never take off comes from, scientists still cannot say. In the meantime, you can appreciate them all the more, knowing that deep down they were forged in some distant cosmic catastrophe.

Featured image: Artist’s impression of the merger of two neutron stars. Credits: Robin Dienel – Carnegie Institution for Science


Provided by INAF

Mushrooms On Mars? Five Unproven Claims That Alien Life Exists (Astronomy)

A recent study claims to have found evidence for mushroom-like life forms on the surface of Mars. As it happens, these particular features are well known; they were discovered by cameras aboard NASA's Mars Exploration Rover Opportunity shortly after it landed in 2004.

They are not, in fact, living organisms at all, but “haematite concretions” – small sphere-shaped pieces of the mineral haematite – whose exact origin is still debated by scientists. Haematite is a compound of iron and oxygen and is commercially important on Earth. The spherical rocks on Mars may have been created by the gradual accumulation of the material in slowly evaporating liquid-water environments. They could also have been produced by volcanic activity.

Either way, mushrooms they are not. The area around Opportunity’s landing site is littered with them – they can be seen all over the surface and were also found buried beneath the soil and even embedded within rocks.

Fossilised worms

These space “mushrooms” were not the first claim of alien life. On August 7, 1996, the then US president Bill Clinton stood on the White House lawn and announced the possibility that scientists had discovered the ancient, fossilised remains of micro-organisms in a meteorite that had been recovered from Antarctica in 1984.

The meteorite, ALH 84001, is one of a handful of rocks we have from Mars. These were blasted off the surface of the planet by volcanic eruptions or meteorite impacts, drifted through space probably for millions of years, before ending up on Earth.

High-resolution scanning electron microscope image of the tube-like structures in the meteorite. Credit: NASA

The tiny structures discovered within, using powerful microscopes, resemble microscopic worm-like organisms and are likely to be billions of years old. Debate over the true origins of these structures continues today – many scientists have pointed out that well known inorganic processes are quite capable of producing structures which resemble living organisms. In other words, simply because something might look a bit like life (mushrooms or otherwise), that does not mean it is.

Mystery gases

In the 1970s Nasa’s Viking robotic landers carried a series of experiments designed to test the Martian soil for the presence of microorganisms.

The experiments chemically treated small samples of Martian soil in reaction chambers on board the landers. In one of them, nutrients containing radioactive carbon-14 were added to the soil samples. In theory, this should be absorbed by any growing and multiplying microbes. The carbon-14 would then increasingly be “breathed out” over time, showing a steady increase in concentration within the reaction chamber.

After the chemical analyses, each soil sample was steadily heated to hundreds of degrees to destroy any microbes, with the intention of seeing whether any such reactions in the soil ceased. Intriguingly, this particular experiment did show a steady increase in carbon-14 over time which was indeed terminated after heating to above the boiling point of water. Several inorganic chemical reactions have been proposed as an explanation. These results therefore remain inconclusive and are still debated today.

More recently, minute quantities of methane have been found in the Martian atmosphere. This is also intriguing, as living organisms on Earth are known to release methane. Once again, however, it must be stressed that this is not conclusive proof of life. Methane can also be produced by several inorganic processes, including by heated rocks.

The Wow! signal

In 1977, the Big Ear radio telescope in the US detected an unusual radio signal while scanning the sky. The signal lasted for just a couple of minutes, was very high powered and was detected over a narrow range of frequencies. These factors make it quite difficult to envisage a natural cause, as most natural radio sources can be detected across a wide range of frequencies.

The exact signal has not been detected again since, despite frequent radio surveys of the same part of the sky. The signal was so remarkable at the time that the astronomer on duty, Jerry Ehman, circled the print out of the signal with red pen and wrote “Wow!” next to it.

Various explanations have been proposed over the years including, recently, that the signal was generated by a passing comet, or transmissions from an Earth-orbiting satellite. The exact origin of the Wow! signal is still not fully agreed upon today, and remains an intriguing mystery.

Tabby’s Star

A key tool of planet hunting is the dimming method – observing light from a star to see if it periodically dips in a regular fashion as an orbiting planet passes in front of it. In 2015, professional astronomers working with citizen scientists from the Planet Hunters project announced the discovery of a nearby star displaying unusually strong and consistent dimming over time.
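As a toy illustration of the dimming method described above, the sketch below flags samples of a normalized light curve that dip below a threshold. This is purely illustrative; real transit searches on Kepler-style data use careful detrending and box-least-squares fitting rather than a fixed threshold.

```python
# Toy version of the dimming (transit) method: flag light-curve samples
# that dip below a brightness threshold.
def find_dips(flux, threshold=0.99):
    """Indices where the normalized flux drops below the threshold."""
    return [i for i, f in enumerate(flux) if f < threshold]

# Synthetic light curve: flat at 1.0 with two transit-like 1.5% dips
flux = [1.0] * 20
for i in (5, 6, 15, 16):  # the "planet" crosses the star twice
    flux[i] = 0.985

print(find_dips(flux))  # [5, 6, 15, 16]
```

For a true planet, such dips would repeat at regular intervals; Tabby's Star stood out precisely because its dips were deep and irregular.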

Artist's impression of an alien megastructure. Credit: Droneandy/Shutterstock

Tabby’s Star is named after astronomer Tabitha Boyajian who was lead author on the paper announcing the discovery. Data from the Kepler Space Telescope showed not just a regular dimming, as one might expect from a planetary orbit, but highly irregular dips in the light and, interestingly, a consistent decrease in light output over several years.

This highly unusual behaviour prompted numerous theories to explain the observations, including cometary dust or debris from a massive impact gradually spreading out to cover the face of the star. Some also speculated that these were signatures of an advanced alien species building a structure around the star. But further observations have found no corroborating evidence to support this possibility. For example, radio telescopes have failed to detect any unusual radio emissions from the star. Today, the scientists behind the discovery believe that the unusual dips in light are caused by clouds of cosmic dust passing across the face of the star.

As exciting as they are, it is important to treat claims of alien life with a healthy dose of scepticism, and this is indeed what scientists do. No conclusive evidence that extraterrestrial life exists has been found … yet.

Featured image: Mushroom-like structures on Mars. Credit: NASA

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Salts Could Be Important Piece of Martian Organic Puzzle, NASA Scientists Find (Astronomy)

A NASA team has found that organic salts are likely present on Mars. Like shards of ancient pottery, these salts are the chemical remnants of organic compounds, such as those previously detected by NASA’s Curiosity rover. Organic compounds and salts on Mars could have formed by geologic processes or be remnants of ancient microbial life.

Besides adding more evidence to the idea that there once was organic matter on Mars, directly detecting organic salts would also support modern-day Martian habitability, given that on Earth, some organisms can use organic salts, such as oxalates and acetates, for energy.

“If we determine that there are organic salts concentrated anywhere on Mars, we’ll want to investigate those regions further, and ideally drill deeper below the surface where organic matter could be better preserved,” said James M. T. Lewis, an organic geochemist who led the research, published on March 30 in the Journal of Geophysical Research: Planets. Lewis is based at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. 

Lewis’s lab experiments and analysis of data from the Sample Analysis at Mars (SAM), a portable chemistry lab inside Curiosity’s belly, indirectly point to the presence of organic salts. But directly identifying them on Mars is hard to do with instruments like SAM, which heats Martian soil and rocks to release gases that reveal the composition of these samples. The challenge is that heating organic salts produces only simple gases that could be released by other ingredients in Martian soil.

Video: What do you do if you have a sample from another planet, and you want to find out if it contains a certain molecule…maybe even one that will reveal whether the planet can sustain life? When scientists face a situation like this, they use an amazing tool: the mass spectrometer. It separates out materials, allowing scientists to look very closely at a sample and see what’s inside. Credits: NASA/Goddard Space Flight Center

However, Lewis and his team propose that another Curiosity instrument that uses a different technique to peer at Martian soil, the Chemistry and Mineralogy instrument, or CheMin for short, could detect certain organic salts if they are present in sufficient amounts. So far, CheMin has not detected organic salts.

Finding organic molecules, or their organic salt remnants, is essential in NASA’s search for life on other worlds. But this is a challenging task on the surface of Mars, where billions of years of radiation have erased or broken apart organic matter. Like an archeologist digging up pieces of pottery, Curiosity collects Martian soil and rocks, which may contain tiny chunks of organic compounds, and then SAM and other instruments identify their chemical structure.

Using data that Curiosity beams down to Earth, scientists like Lewis and his team try to piece together these broken organic pieces. Their goal is to infer what type of larger molecules they may once have belonged to and what those molecules could reveal about the ancient environment and potential biology on Mars.

“We’re trying to unravel billions of years of organic chemistry,” Lewis said, “and in that organic record there could be the ultimate prize: evidence that life once existed on the Red Planet.”

While some experts have predicted for decades that ancient organic compounds are preserved on Mars, it took experiments by Curiosity’s SAM to confirm this. For example, in 2018, NASA Goddard astrobiologist Jennifer L. Eigenbrode led an international team of Curiosity mission scientists who reported the detection of myriad molecules containing an essential element of life as we know it: carbon. Scientists identify most carbon-containing molecules as “organic.”

Video: Research scientist Dr. Jennifer Eigenbrode discusses the discovery of ancient organic molecules on Mars. Credits: NASA’s Goddard Space Flight Center/Dan Gallagher

“The fact that there’s organic matter preserved in 3-billion-year-old rocks, and we found it at the surface, is a very promising sign that we might be able to tap more information from better preserved samples below the surface,” Eigenbrode said. She worked with Lewis on this new study.

Analyzing Organic Salts in the Lab

Decades ago, scientists predicted that organic compounds on Mars could be breaking down into salts. These salts, they argued, would be more likely to persist on the Martian surface than big, complex molecules, such as the ones that are associated with the functioning of living things.

If there were organic salts present in Martian samples, Lewis and his team wanted to find out how getting heated in the SAM oven could affect what types of gases they would release. SAM works by heating samples to upwards of 1,800 degrees Fahrenheit (1,000 degrees Celsius). The heat breaks apart molecules, releasing some of them as gases. Different molecules release different gases at specific temperatures; thus, by looking at which temperatures release which gases, scientists can infer what the sample is made of. 
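The inference step described above — matching which gases come off at which temperatures — can be sketched as a simple lookup. Everything in this snippet is a hypothetical placeholder (the candidate names and release temperatures are invented for illustration, not actual SAM reference data):

```python
# Toy illustration of evolved-gas reasoning: match observed gas-release
# temperatures against a reference library of candidate compounds.
# All names and temperatures below are hypothetical placeholders.
REFERENCE_PEAKS = {
    "candidate_salt_A": {"CO2": 400},  # hypothetical: CO2 released near 400 °C
    "candidate_salt_B": {"CO2": 700},  # hypothetical: CO2 released near 700 °C
}

def matching_candidates(gas, observed_temp_c, tolerance_c=50):
    """Candidates whose reference release temperature for this gas lies
    within the tolerance of the observed release temperature."""
    return [name for name, peaks in REFERENCE_PEAKS.items()
            if gas in peaks and abs(peaks[gas] - observed_temp_c) <= tolerance_c]

print(matching_candidates("CO2", 420))  # ['candidate_salt_A']
```

The challenge Lewis describes is that several different minerals and salts can release the same simple gases at overlapping temperatures, which is why lab calibration work like this study is needed.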

“When heating Martian samples, there are many interactions that can happen between minerals and organic matter that could make it more difficult to draw conclusions from our experiments, so the work we’re doing is trying to pick apart those interactions so that scientists doing analyses on Mars can use this information,” Lewis said.

Lewis analyzed a range of organic salts mixed with an inert silica powder to replicate a Martian rock. He also investigated the impact of adding perchlorates to the silica mixtures. Perchlorates are salts containing chlorine and oxygen, and they are common on Mars. Scientists have long worried that they could interfere with experiments seeking signs of organic matter.

This is the first photo ever taken on the surface of Mars. It was taken by NASA’s Viking 1 spacecraft just minutes after it landed on the Red Planet on July 20, 1976. Credits: NASA/JPL

Indeed, researchers found that perchlorates did interfere with their experiments, and they pinpointed how. But they also found that the results they collected from perchlorate-containing samples better matched SAM data than when perchlorates were absent, bolstering the likelihood that organic salts are present on Mars.

Additionally, Lewis and his team reported that organic salts could be detected by Curiosity’s instrument CheMin. To determine the composition of a sample, CheMin shoots X-rays at it and measures the angle at which the X-rays are diffracted toward the detector.
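The geometry behind CheMin's measurement is Bragg's law, n·λ = 2·d·sin(θ): X-rays scatter constructively only at angles set by the crystal's lattice spacing, so the measured angle reveals the spacing. A small sketch of the arithmetic; the peak angle below is an invented example, and the wavelength is only approximate (CheMin uses a cobalt X-ray source, with a Kα wavelength near 1.79 Å):

```python
import math

# Approximate Co Kα wavelength of CheMin's X-ray source, in ångströms
CO_KALPHA_ANGSTROM = 1.79

def lattice_spacing(wavelength_angstrom, two_theta_deg, order=1):
    """Solve Bragg's law (n·λ = 2·d·sin θ) for d, given the detector's 2θ angle."""
    theta = math.radians(two_theta_deg / 2.0)
    return order * wavelength_angstrom / (2.0 * math.sin(theta))

# A hypothetical diffraction peak at 2θ = 30°:
print(round(lattice_spacing(CO_KALPHA_ANGSTROM, 30.0), 2))  # 3.46 (ångströms)
```

Each mineral (or crystalline organic salt) has its own characteristic set of spacings, which is how a diffraction pattern identifies the sample's composition.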

Curiosity’s SAM and CheMin teams will continue to search for signals of organic salts as the rover moves into a new region on Mount Sharp in Gale Crater.

Soon, scientists will also have an opportunity to study better-preserved soil below the Martian surface. The European Space Agency’s forthcoming ExoMars rover, which is equipped to drill down to 6.5 feet, or 2 meters, will carry a Goddard instrument that will analyze the chemistry of these deeper Martian layers. NASA’s Perseverance rover doesn’t have an instrument that can detect organic salts, but the rover is collecting samples for future return to Earth, where scientists can use sophisticated lab machines to look for organic compounds.

Banner image: This look back at a dune that NASA’s Curiosity Mars rover drove across was taken by the rover’s Mast Camera (Mastcam) on Feb. 9, 2014, or the 538th Martian day, or sol, of Curiosity’s mission. For scale, the distance between the parallel wheel tracks is about 9 feet (2.7 meters). The dune is about 3 feet (1 meter) tall in the middle of its span across an opening called “Dingo Gap.” This view is looking eastward. Credits: NASA/JPL-Caltech/MSSS.

Reference: Lewis, J. M. T., Eigenbrode, J. L., Wong, G. M., McAdam, A. C., Archer, P. D., Sutter, B., et al. (2021). Pyrolysis of oxalate, acetate, and perchlorate mixtures and the implications for organic salts on Mars. Journal of Geophysical Research: Planets, 126, e2020JE006803.

Provided by NASA

Computational Researchers Develop Advanced Model To Improve Safety Of Next-generation Reactors (Physics)

When one of the largest modern earthquakes struck Japan on March 11, 2011, the nuclear reactors at Fukushima-Daiichi automatically shut down, as designed. The emergency systems, which would have helped maintain the necessary cooling of the core, were destroyed by the subsequent tsunami. Because the reactor could no longer cool itself, the core overheated, resulting in a severe nuclear meltdown, the likes of which haven’t been seen since the Chernobyl disaster in 1986.  

Since then, reactors have improved markedly in terms of safety, sustainability and efficiency. Unlike the light-water reactors at Fukushima, which used liquid coolant and uranium fuel, the current generation of reactors offers a variety of coolant options, including molten-salt mixtures, supercritical water and even gases like helium.

Dr. Jean Ragusa and Dr. Mauricio Eduardo Tano Retamales from the Department of Nuclear Engineering at Texas A&M University have been studying a new fourth-generation reactor design: the pebble-bed reactor. Pebble-bed reactors use spherical fuel elements (known as pebbles) and a fluid coolant (usually a gas).

“There are about 40,000 fuel pebbles in such a reactor,” said Ragusa. “Think of the reactor as a really big bucket with 40,000 tennis balls inside.”

During an accident, as the gas in the reactor core begins to heat up, it rises and cooler gas from below flows in to replace it, a process known as natural convection cooling. Additionally, the fuel pebbles are made from pyrolytic carbon and tristructural isotropic (TRISO) fuel particles, making them resistant to temperatures as high as 3,000 degrees Fahrenheit. As a very-high-temperature reactor (VHTR), a pebble-bed reactor can be cooled by passive natural circulation, making it theoretically impossible for an accident like Fukushima to occur.

Pebble-bed reactors use passive natural circulation to cool down, making it theoretically impossible for a core meltdown to occur. © Dr. Jean Ragusa and Dr. Mauricio Eduardo Tano Retamales/Texas A&M University Engineering

However, during normal operation, a high-speed flow cools the pebbles. This flow creates movement around and between the fuel pebbles, similar to the way a gust of wind changes the trajectory of a tennis ball. How do you account for the friction between the pebbles, and for the influence of that friction on the cooling process?

This is the question that Ragusa and Tano aimed to answer in their most recent publication in the journal Nuclear Technology, titled “Coupled Computational Fluid Dynamics–Discrete Element Method Study of Bypass Flows in a Pebble-Bed Reactor.”

“We solved for the location of these ‘tennis balls’ using the Discrete Element Method, where we account for the flow-induced motion and friction between all the tennis balls,” said Tano. “The coupled model is then tested against thermal measurements in the SANA experiment.” 

The SANA experiment, conducted in the early 1990s, measured how the heat-transfer mechanisms in a reactor interact when moving heat from the center of the cylinder to its outer wall. This experiment gave Tano and Ragusa a benchmark against which to validate their models.

As a result, their teams developed a coupled Computational Fluid Dynamics-Discrete Element Method model for studying the flow over a pebble bed. This model can now be applied to all high-temperature pebble-bed reactors and is the first computational model of its kind to do so. High-accuracy tools such as this allow vendors to develop better reactors.
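The coupled approach described above can be sketched in miniature. The following is an illustrative discrete-element update step, assuming a simple linear spring-dashpot contact model between spherical pebbles and a uniform drag force standing in for the coolant flow; all parameter values and the drag term are hypothetical simplifications for illustration, not the researchers' actual CFD-DEM formulation.

```python
# Minimal sketch of a discrete-element update for spherical "pebbles":
# each pair in contact feels a linear spring-dashpot contact force, and
# a uniform background coolant flow adds a simple drag term (the "gust
# of wind" in the article's analogy). Parameter values are illustrative.
import numpy as np

def dem_step(pos, vel, radius=0.03, k=1e4, c=5.0,
             flow_velocity=np.array([0.0, 0.0, 1.0]),
             drag_coeff=0.5, mass=0.2, dt=1e-4):
    """Advance pebble positions and velocities by one explicit Euler step."""
    n = len(pos)
    force = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            d = pos[i] - pos[j]
            dist = np.linalg.norm(d)
            overlap = 2 * radius - dist
            if overlap > 0:  # pebbles in contact
                normal = d / dist
                rel_v = np.dot(vel[i] - vel[j], normal)
                fn = (k * overlap - c * rel_v) * normal  # spring-dashpot
                force[i] += fn
                force[j] -= fn
    # drag toward the coolant flow velocity
    force += drag_coeff * (flow_velocity - vel)
    vel = vel + force / mass * dt
    pos = pos + vel * dt
    return pos, vel

# Two overlapping pebbles are pushed apart by the contact force.
pos = np.array([[0.0, 0.0, 0.0], [0.05, 0.0, 0.0]])
vel = np.zeros_like(pos)
pos2, vel2 = dem_step(pos, vel)
```

A production CFD-DEM code would replace the uniform drag term with forces interpolated from a resolved flow field and use a more careful time integrator, but the core loop, pairwise contacts plus fluid coupling, has this shape.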

“The computational models we create help us more accurately assess different physical phenomena in the reactor,” said Tano. “As a result, reactors can operate at a higher margin, theoretically producing more power while increasing the safety of the reactor. We do the same thing with our models for molten-salt reactors for the Department of Energy.” 

As artificial intelligence continues to advance, its applications to computational modeling and simulation grow. “We’re in a very exciting time for the field,” said Ragusa. “And we encourage any prospective students who are interested in computational modeling to reach out, because this field will hopefully be around for a long time.”

Featured image: Pebble-bed reactors use passive natural circulation to cool down, making it theoretically impossible for a core meltdown to occur. | Image: Getty Images

Provided by Texas A&M University

Brain’s Memory Center Stays Active During ‘Infantile Amnesia’ (Neuroscience)

One trait shared by all humans is that they don’t remember specific life episodes that occurred before the age of 3 or 4.  Many scientists have attributed this so-called “infantile amnesia” to a lack of development in the hippocampus, an area of the brain located in the temporal lobe that is crucial to encoding memory.

However, a new brain imaging study by Yale scientists shows that infants as young as three months are already enlisting the hippocampus to recognize and learn patterns. The findings were published May 21 in the journal Current Biology.

“A fundamental mystery about human nature is that we remember almost nothing from birth through early childhood, yet we learn so much critical information during that time — our first language, how to walk, objects and foods, and social bonds,” said Nick Turk-Browne, a professor of psychology at Yale and senior author of the paper.

For the new study, the Yale team used a new functional magnetic resonance imaging (fMRI) technology to capture activity in the hippocampus of 17 babies, aged three months to two years, as they were presented with two sets of images on a screen. One set of images appeared as a structured sequence containing hidden patterns that could be learned. In the other, images appeared in a random order that offered no opportunity for learning. After the babies were shown these two sets of images several times, the hippocampus responded more strongly to the structured image set than to the random image set.
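The contrast between the two conditions can be illustrated with a small sketch. This assumes a standard statistical-learning design in which the structured stream is built from fixed image triplets whose internal order never varies; the image labels and groupings here are hypothetical, not the study's actual stimuli.

```python
# Illustrative sketch of the two stimulus conditions: a "structured"
# stream built from fixed image triplets (hidden patterns that can be
# learned) versus a "random" stream with no predictable order.
# Image labels and triplet groupings are hypothetical.
import random

def structured_stream(triplets, n_repeats, seed=0):
    """Concatenate the triplets in shuffled order; within each triplet
    the order never changes, so transitions are predictable."""
    rng = random.Random(seed)
    stream = []
    for _ in range(n_repeats):
        order = list(triplets)
        rng.shuffle(order)
        for triplet in order:
            stream.extend(triplet)
    return stream

def random_stream(images, length, seed=0):
    """Draw images uniformly at random: no transition is predictable."""
    rng = random.Random(seed)
    return [rng.choice(images) for _ in range(length)]

triplets = [("A", "B", "C"), ("D", "E", "F"), ("G", "H", "I")]
images = [img for t in triplets for img in t]
s = structured_stream(triplets, n_repeats=4)
r = random_stream(images, length=len(s))

# In the structured stream, "A" is always followed by "B".
follows_a = {s[i + 1] for i, x in enumerate(s[:-1]) if x == "A"}
```

The hippocampal response difference reported in the study is, in effect, sensitivity to exactly this kind of transition regularity: "A" always predicts "B" in the structured stream, while nothing predicts anything in the random one.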

What might be happening, Turk-Browne said, is that as a baby gains experience in the world, their brain searches for general patterns that help them understand and predict the surrounding environment. This happens even though the brain is not equipped to permanently store each individual experience about a specific moment in space and time – the hallmark of episodic memory that is also lost in adult amnesia.

The strategy makes sense because learning general knowledge — such as patterns of sounds that make up the words in a language — may be more important to a baby than remembering specific details, such as a single incident in which a particular word was uttered.

The size of the hippocampus doubles in the first two years of life and eventually develops connections necessary to store episodic memories, Turk-Browne said.

“As these circuit changes occur, we eventually obtain the ability to store memories,” he said. “But our research shows that even if we can’t remember infant experiences later on in life, they are being recorded nevertheless in a way that allows us to learn from them.”

Yale’s Cameron Ellis is first author of the study, and this research was included in his recently completed and award-winning PhD dissertation.


Reference: Cameron T. Ellis, Lena J. Skalaban, et al., “Evidence of hippocampal learning in human infants,” Current Biology, 2021.

Provided by Yale University

Itch Insight: Skin Itch Mechanisms Differ On Hairless Versus Hairy Skin (Medicine)

Research breakthrough could inform different treatments for chronic skin itch sufferers

Chronic skin itching drives more people to the dermatologist than any other condition. In fact, the latest science literature finds that 7% of U.S. adults, and between 10 and 20% of people in developed countries, suffer from dermatitis, a common skin inflammatory condition that causes itching.

“Itch is a significant clinical problem, often caused by underlying medical conditions in the skin, liver, or kidney. Due to our limited understanding of itch mechanisms, we don’t have effective treatment for the majority of patients,” said Liang Han, an assistant professor in the Georgia Institute of Technology’s School of Biological Sciences who is also a researcher in the Parker H. Petit Institute for Bioengineering and Bioscience.

Until recently, neuroscientists considered the mechanisms of skin itch to be the same everywhere on the body. But Han and her research team recently uncovered differences in itch between non-hairy and hairy areas of the skin, opening new areas for research.

Their research, published April 13 in the journal PNAS (Proceedings of the National Academy of Sciences of the United States of America), could lead to new, more effective treatments for patients suffering from persistent skin itching.

Itch Origins More Than Skin Deep

According to researchers, there are two different types of stimuli that trigger the itch sensation through sensory nerves in the skin: chemical and mechanical. In their study, Han and her team identified a specific neuron population that controls itching in ‘glabrous’ skin: the smoother, tougher skin found on the palms of the hands and the soles of the feet.

Georgia Tech researchers Liang Han (left) and Haley Steele (right) have uncovered differences in itch on hairy versus non-hairy skin that could lead to more effective treatments for patients with persistent skin itching. © Christopher Moore, Georgia Tech

Itching in those areas poses greater difficulty for sufferers and is surprisingly common. In the U.S., there are an estimated 200,000 cases a year of dyshidrosis, a skin condition causing itchy blisters to develop only on the palm and soles. Another chronic skin condition, palmoplantar pustulosis (a type of psoriasis that causes inflamed, scaly skin and intense itch on the palms and soles), affects as many as 1.6 million people in the U.S. each year.

“That’s actually one of the most debilitating places (to get an itch),” said first author Haley R. Steele, a graduate student in the School of Biological Sciences. “If your hands are itchy, it’s hard to grasp things, and if it’s your feet, it can be hard to walk. If there’s an itch on your arm, you can still type. You’ll be distracted, but you’ll be OK. But if it’s your hands and feet, it’s harder to do everyday things.”

Ability to Block, Activate Itch-causing Neurons in Lab Mice

Since many biological mechanisms underlying itch — such as receptors and nerve pathways — are similar in mice and people, most itch studies rely on mice testing. Using mice in their lab, Georgia Tech researchers were able to activate or block these neurons.

The research shows, for the first time, that the neurons that sense itch belong to distinct populations: one population senses itch in hairy skin but not in glabrous skin, while another senses itch in glabrous skin.

Sensory nerves in the glabrous skin (non-hairy skin of the mouse plantar hindpaw) are labeled by red fluorescence. © Christopher Moore, Georgia Tech

Why has an explanation so far eluded science? “I think one reason is because most of the people in the field kind of assumed it was the same mechanism that’s controlling the sensation. It’s technically challenging. It’s more difficult than working on hairy skin,” Han said.

To overcome this technical hurdle, the team used a new investigative procedure, or assay, modeled after human allergic contact dermatitis, Steele said.

The previous method would have involved injecting itch-causing chemicals into the mice’s skin, but most of a mouse’s skin is covered with hair. The team had to focus on the smooth glabrous skin of the mice’s tiny paws. Using genetically modified mice also helped identify the sensory neurons responsible for glabrous skin itch.

“We activated a particular set of neurons that causes itch, and we saw that biting behavior again,” said Steele, referring to how mice typically respond to itchy skin.

One set of study mice was given a chemical to selectively kill an entire line of neurons. Focusing on three neuron populations previously linked to itch sensation in hairy skin, the team found that two of them, MrgprA3+ and MrgprD+, did not play important roles in non-hairy skin itch, but the third, MrgprC11+, did. Removing the MrgprC11+ neurons reduced both acute and chronic itching in the soles and palms of test mice.

Potential to Drive New Treatments for Chronic Itch

Han’s team hopes that the research leads to treatments that will turn off those itch-inducing neurons, perhaps by blocking them in human skin.

“To date, most treatments for skin itch do not discriminate between hairy and glabrous skin except for potential medication potency due to the increased skin thickness in glabrous skin,” observed Ron Feldman, assistant professor in the Department of Dermatology in the Emory University School of Medicine. Georgia Tech’s findings “provide a rationale for developing therapies targeting chronic itching of the hands and feet that, if left untreated, can greatly affect patient quality of life,” he concluded.

What’s next for Han and her team? “We would like to investigate how these neurons transmit information to the spinal cord and brain,” said Han, who also wants to investigate the mechanisms of chronic itch conditions that mainly affect glabrous skin such as cholestatic itch, or itch due to reduced or blocked bile flow often seen in liver and biliary system diseases.

“I joined this lab because I love working with Liang Han,” added Steele, who selected glabrous skin itch research for her Ph.D. “because it was the most technically challenging and had the greatest potential for being really interesting and significant to the field.”

This work was supported by grants from the U.S. National Institutes of Health and the Pfizer Aspire Dermatology Award to Liang Han.

Featured image: These mouse hindpaw sections allow Georgia Tech researchers to visualize skin nerves. © Christopher Moore, Georgia Tech

Reference: H. Steele, et al., “MrgprC11+ sensory neurons mediate glabrous skin itch,” PNAS, April 13, 2021, 118(15): e2022874118.

Provided by Georgia Institute of Technology

Study Confirms Controlling Blood Pressure Critically Important in Preventing Heart Disease and Stroke (Medicine)

UH Cleveland Medical Center and Case Western Reserve University’s Jackson T. Wright Jr., MD, PhD, and Mahboob Rahman, MD, are authors on an NEJM study showing the persistent positive effects of a systolic blood pressure target below 120 mm Hg

Follow-up data from the landmark SPRINT study of the effect of high blood pressure on cardiovascular disease have confirmed that aggressive blood pressure management — lowering systolic blood pressure to less than 120 mm Hg — dramatically reduces the risk of heart disease, stroke, and death from these diseases, as well as death from all causes, compared to lowering systolic blood pressure to less than 140 mm Hg. Systolic blood pressure (SBP) is the upper number in the blood pressure measurement, 140/90, for example.

In findings published in the May 20, 2021 issue of the New England Journal of Medicine, investigators presented new evidence of the effectiveness of reducing SBP to a target range of less than 120 mm Hg.

Jackson T. Wright Jr., MD, PhD, and Mahboob Rahman, MD, investigators from University Hospitals Cleveland Medical Center and Case Western Reserve University School of Medicine, played a lead role in the design, conduct, analyses and publication of the SPRINT trial. UH and CWRU coordinated one of the five Clinical Center Networks (CCNs) across the country selected to conduct the trial, which recruited more than 9,300 participants.

“This final report of the findings from SPRINT, now including all cardiovascular and mortality trial events, confirms the benefit of more aggressive BP lowering compared with the previously recommended target of less than 140/90 mm Hg,” said Dr. Wright, Director of the Clinical Hypertension Program at UH and Professor Emeritus of Medicine at CWRU.

SPRINT was a randomized controlled clinical trial sponsored by the National Heart, Lung, and Blood Institute, part of the National Institutes of Health. Beginning in late 2009, it enrolled more than 9,000 participants at least 50 years old who had an SBP of 130 to 180 and increased cardiovascular disease risk. The NIH stopped the randomized treatment phase in 2015, when data presented to the Data and Safety Monitoring Board showed that treatment to an SBP of less than 120 decreased the rate of a composite cardiovascular disease (CVD) outcome by 25 percent and the rate of all-cause death by 27 percent.

Researchers reported these findings in 2015, but continued to collect data into July 2016. The current paper confirms and enhances the earlier findings.

SPRINT’s primary outcome was lower risk of having one of a composite of different types of cardiovascular disease outcomes related to blood pressure. These included heart attack, an acute coronary syndrome not resulting in a heart attack, stroke, acute heart failure, or death from cardiovascular disease. 

The final results showed the risk of the primary outcome of the trial was decreased 27 percent and death from all causes was decreased by 25 percent in the group treated to less than 120 mm Hg compared to the group treated to less than 140 mm Hg. 
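The percentages above are relative risk reductions. A quick sketch shows how such a figure is computed from event rates; the counts below are made-up numbers for illustration only, not SPRINT data, and the trial itself reports hazard ratios from time-to-event analyses, which this simple event-rate ratio only approximates.

```python
# How a "27 percent lower risk" figure is computed, using hypothetical
# event counts (NOT SPRINT's data). The trial reports hazard ratios
# from a time-to-event analysis; this crude ratio illustrates the idea.
def relative_risk_reduction(events_treated, n_treated,
                            events_control, n_control):
    rate_treated = events_treated / n_treated
    rate_control = events_control / n_control
    return 1 - rate_treated / rate_control

# e.g. 73 events per 1,000 in the intensive group vs 100 per 1,000
# in the standard group -> a 27% relative risk reduction
rrr = relative_risk_reduction(73, 1000, 100, 1000)
print(f"{rrr:.0%}")  # prints "27%"
```

Note that a relative reduction says nothing by itself about absolute risk: halving a rare outcome and halving a common one both yield a 50 percent relative reduction, which is why trial reports also give the underlying event rates.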

“One criticism of the original SPRINT findings was that, of the components of the primary outcome, only heart failure and death due to CVD were significantly lower in the intensively treated group,” said Cora E. Lewis, MD, Professor and Chair of the Department of Epidemiology in the University of Alabama at Birmingham School of Public Health and primary investigator of the study. “The final results found that the risk of heart attack, along with heart failure and death from CVD, was significantly lower in the group treated to less than 120, and the risk of the primary outcome excluding heart failure was still significantly lower in the more intensively treated group.”

SPRINT also collected data on safety of the interventions. The investigators anticipated that serious adverse events, including hospitalizations overall, as well as hospitalizations and emergency room visits for specific conditions of interest, might be related to more intensive treatment of blood pressure with medicines. The final paper reports that overall serious adverse events did not differ, but there were more cases of some of the conditions of interest in the group treated to SBP of less than 120, including low blood pressure, fainting and acute injury to the kidneys, which usually resolved within one year. Falls leading to injury did not differ.

Hypertension, or high blood pressure, is a hugely important risk factor for the leading cause of death worldwide: cardiovascular disease, or CVD, said Dr. Rahman. “CVD has been the number one killer in the U.S. for decades, even in 2020, when we were dealing with COVID-19, which was the number three killer that year in the U.S. Elevated blood pressure is the leading contributor to preventable deaths worldwide of the 67 risk factors studied (including tobacco).”

“The take-home message from SPRINT is to talk to your doctor about your blood pressure to determine a good goal for you based on your overall cardiovascular disease risk. Then work with your doctor to achieve that goal,” said Dr. Rahman.

Prior to the SPRINT trial, research had shown that treating high blood pressure helped decrease risk of CVD, but the optimum SBP goal was unknown. In 2007, a group of experts in high blood pressure research suggested that determining the appropriate goal of SBP to reduce the risk of heart disease was of the utmost importance in preventing complications from hypertension.

“We know a lot about how to prevent and treat hypertension and SPRINT continues to greatly expand this knowledge, including the benefits of treatment on the heart, kidney and brain,” said David Goff, M.D., Ph.D., director of the Division of Cardiovascular Sciences at NHLBI. “As we implement what we know, more research is still needed to develop more effective prevention strategies for hypertension, improve its monitoring and control, and reduce the large health disparities associated with this disorder. Research teams supported by the NIH are continuing to work on these challenges.”  

Nearly half of adults age 20 years and older in the United States have high blood pressure, which is defined as an SBP of 130 or more or a diastolic blood pressure (the lower number) of 80 or more.

World Hypertension Day was May 17, 2021, recognizing how important high blood pressure is to the health of the world’s population.

In addition to primary sponsorship by the NHLBI, SPRINT was co-sponsored by the NIH’s National Institute of Diabetes and Digestive and Kidney Diseases, the National Institute of Neurological Disorders and Stroke, and the National Institute on Aging.

Institutions involved with the SPRINT study: University of Alabama at Birmingham; University Hospitals Cleveland Medical Center and Case Western Reserve University; National Heart, Blood and Lung Institute; University of Utah School of Medicine; University of Tennessee Health Science Center; Wake Forest School of Medicine, and Tulane University School of Public Health and Tropical Medicine.

Reference: The SPRINT Research Group, “Final Report of a Trial of Intensive versus Standard Blood-Pressure Control,” N Engl J Med 2021; 384:1921-1930. DOI: 10.1056/NEJMoa1901281

Provided by University Hospitals Cleveland Medical Center