Astrofix: An Image Imputation Algorithm Based on Gaussian Process Regression (Astronomy)

The history of data artifacts is as long as the history of observational astronomy. Artifacts such as dead pixels, hot pixels, and cosmic ray hits are common in astronomical images. At best, they render the affected pixels’ data unusable; at worst, they compromise the entire image for downstream analyses.

In dealing with missing pixels, some astronomical procedures simply ignore them while others require imputing their values first. Optimal extraction of spectra and Point Spread Function (PSF) photometry ignore missing data, while box spectral extraction and aperture photometry do not. Aperture photometry and box extraction have the advantage of requiring little knowledge about the PSF or line-spread function (LSF). For this reason, aperture photometry has been used for ultra-precise Kepler photometry. Box extraction is currently standard for the SPHERE and GPI integral-field spectrographs.

In general, correcting corrupted data in an image involves two steps: identifying the bad pixels and imputing their values. Existing algorithms have emphasized bad pixel identification and rejection. For example, there are well-developed packages that detect cosmic rays (CRs) by comparing multiple exposures. When multiple exposures are not available, Rhoads rejects CRs by applying a PSF filter, van Dokkum by Laplacian edge detection (LACosmic), and Pych by iterative histogram analysis. Among these methods, LACosmic offers the best performance. Machine-learning approaches such as deepCR, a deep-learning algorithm, may offer further improvements.

In contrast, the literature on imputing missing data is sparse. Currently, the most common approach is median replacement, which replaces a bad pixel with the median of its neighbours; LACosmic is among the algorithms that use it. Other packages, such as astropy.convolution, take an average of the surrounding pixels weighted by a Gaussian kernel. An alternative is 1D linear interpolation, which is standard for the integral-field spectrographs GPI and SPHERE. deepCR, on the other hand, predicts the true pixel values with a trained neural network. None of these methods is statistically well-motivated, however, and they usually apply a fixed interpolation kernel to every image and to every location within an image. In reality, the optimal kernel can vary from image to image and even from pixel to pixel. Moreover, in a contiguous region of bad pixels or near the edge of an image, most existing imputation approaches either degrade in performance or must treat these regions as special cases; only deepCR handles them naturally with minimal performance differences.
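For concreteness, here is a minimal sketch of the two conventional approaches described above: median replacement and Gaussian-kernel interpolation via astropy.convolution (its Gaussian2DKernel and interpolate_replace_nans utilities). This is not astrofix itself, and the synthetic image, the bad-pixel fraction and the kernel width are illustrative assumptions only.

```python
# Minimal sketch of conventional bad-pixel imputation (not astrofix):
# (1) median replacement and (2) Gaussian-kernel interpolation with
# astropy.convolution. The synthetic frame and ~1% bad-pixel mask are
# placeholders for a real image and its bad-pixel mask.
import numpy as np
from astropy.convolution import Gaussian2DKernel, interpolate_replace_nans

rng = np.random.default_rng(0)
image = rng.normal(100.0, 5.0, size=(64, 64))   # synthetic frame
bad = rng.random(image.shape) < 0.01            # ~1% bad pixels (assumed mask)
image[bad] = np.nan                             # flag bad pixels as NaN

# (1) Median replacement: each bad pixel becomes the median of its 3x3 neighbours.
median_fixed = image.copy()
for y, x in zip(*np.where(bad)):
    y0, y1 = max(y - 1, 0), min(y + 2, image.shape[0])
    x0, x1 = max(x - 1, 0), min(x + 2, image.shape[1])
    median_fixed[y, x] = np.nanmedian(image[y0:y1, x0:x1])

# (2) Gaussian-kernel interpolation: NaNs are replaced by a convolution with a
#     Gaussian kernel that ignores the missing values.
kernel = Gaussian2DKernel(x_stddev=0.72)        # width chosen to echo Figure 1
gauss_fixed = interpolate_replace_nans(image, kernel)

print("remaining NaNs:", np.isnan(median_fixed).sum(), np.isnan(gauss_fixed).sum())
```

Both fixes apply one fixed kernel everywhere on the image, which is exactly the limitation astrofix addresses by optimizing its kernel for each image.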

In a recent paper, Zhang and Brandt presented astrofix, a robust and flexible image imputation algorithm based on Gaussian Process Regression (GPR). Through an optimization process, astrofix chooses and applies a different interpolation kernel to each image, using a training set extracted automatically from that image. It naturally handles clusters of bad pixels and image edges and adapts to various instruments and image types, including both imaging and spectroscopy. The mean absolute error of astrofix is several times smaller than that of median replacement and of interpolation with a Gaussian kernel (i.e. astropy.convolution).

Figure 1. Part of an image (top left) convolved with three different kernels: a median filter (second column), a Gaussian kernel with zero weight in the central pixel and renormalized to unit area (third column), and a GPR kernel (right column). The GPR kernel best restores the original image. It is constructed from the squared exponential covariance function with a = 3.02 and h = 0.72. The Gaussian kernel (third column) has standard deviation 0.72 to match the GPR kernel (right column). The bottom images show the interpolating kernels themselves. © Zhang and Brandt

astrofix accepts images with a bad pixel mask or images with bad pixels flagged as NaN, and it fixes any given image in three steps:

  1. Determine the training set of pixels that astrofix will attempt to reproduce.
  2. Find the optimal hyperparameters a and h (or a, hx and hy) given the training set from Step 1.
  3. Fix the image by imputing data for the bad pixels (sketched below).
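As an illustration of Step 3, the sketch below imputes each bad pixel by GPR with a squared exponential covariance. It is not the astrofix implementation: the hyperparameter values a = 3.02 and h = 0.72 are simply borrowed from the Figure 1 caption, and the constant mean, small noise term and 5 × 5 neighbourhood are simplifying assumptions; astrofix instead optimizes a and h on the training set obtained in Steps 1 and 2.

```python
# Illustrative GPR imputation with a squared-exponential covariance
# (a simplified stand-in for Step 3; NOT the astrofix implementation).
import numpy as np

def sq_exp_cov(x1, x2, a, h):
    """Squared-exponential covariance between two sets of 2D pixel positions."""
    d2 = ((x1[:, None, :] - x2[None, :, :]) ** 2).sum(-1)
    return a**2 * np.exp(-0.5 * d2 / h**2)

def gpr_impute(image, bad_mask, a=3.02, h=0.72, noise=1e-2, box=2):
    """Replace each bad pixel with the GPR prediction from its good neighbours."""
    fixed = image.copy()
    ny, nx = image.shape
    for y, x in zip(*np.where(bad_mask)):
        ys = slice(max(y - box, 0), min(y + box + 1, ny))
        xs = slice(max(x - box, 0), min(x + box + 1, nx))
        yy, xx = np.mgrid[ys, xs]
        good = ~bad_mask[ys, xs]
        X = np.column_stack([yy[good], xx[good]]).astype(float)  # neighbour coords
        v = image[ys, xs][good]                                  # neighbour values
        mu = v.mean()                                            # simple constant mean
        K = sq_exp_cov(X, X, a, h) + noise * np.eye(len(v))
        k_star = sq_exp_cov(np.array([[y, x]], float), X, a, h)
        fixed[y, x] = mu + (k_star @ np.linalg.solve(K, v - mu))[0]
    return fixed

# Usage on a synthetic frame with a two-pixel cluster and an isolated bad pixel.
rng = np.random.default_rng(1)
img = rng.normal(100.0, 5.0, (32, 32))
mask = np.zeros_like(img, bool)
mask[10, 10] = mask[10, 11] = mask[20, 5] = True
print(gpr_impute(img, mask)[10, 10])
```

Because only the good pixels inside the window enter the fit, clusters of bad pixels and image edges need no special treatment, which mirrors the behaviour described for astrofix.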

According to the authors, the actual performance of astrofix may depend on the initial guess for the optimization, the choice of the training set, and the resemblance of the covariance function to the instrumental PSF. Other covariance functions remain to be explored, and the use of sGPR should be considered carefully.

They also showed that astrofix has good potential for bad pixel detection: one could compare the GPR-imputed values with the measured counts and the expected noise at each pixel, and iterate this procedure to reject contiguous regions of bad pixels.

“astrofix has the potential to outperform conventional bad pixel detection algorithms because of its ability to train the imputation specifically for each image. This idea could be developed in future work.”

— concluded the authors of the study.

They demonstrated good performance of astrofix on both imaging and spectroscopic data, including data from the SBIG 6303 0.4 m telescope and the FLOYDS spectrograph at Las Cumbres Observatory, and from the CHARIS integral-field spectrograph on the Subaru Telescope.

The algorithm is implemented in the Python package astrofix, which is publicly available online.

Featured image: Corrections to the CHARIS Image by GPR, the 5 × 5 median filter, and astropy.convolution. The counts are plotted on a logarithmic scale. GPR best restores the structure of the bars, while the two other approaches produce fuzzier images. © Zhang and Brandt


Reference: Hengyue Zhang, Timothy D. Brandt, “Cleaning Images with Gaussian Process Regression”, arXiv preprint, 23 March 2021. https://arxiv.org/abs/2103.12250


Copyright of this article belongs entirely to our author, S. Aman. It may be reused only with proper credit given either to him or to us.

New Study Finds False Memories can be Reversed (Neuroscience)

Rich false memories of autobiographical events can be planted – and then reversed, a new paper has found.

The study highlights – for the first time – techniques that can correct false recollections without damaging true memories. It is published by researchers from the University of Portsmouth, UK, and the Universities of Hagen and Mainz, Germany.

There is plenty of psychological research which shows that memories are often reconstructed and therefore fallible and malleable. However, this is the first time research has shown that false memories of autobiographical events can be undone.

Studying how memories are created, identified and reversed could be a game changer in police and legal settings, where false memories given as evidence in a courtroom can lead to wrongful convictions.

According to Dr Hartmut Blank, co-author of the research from the University of Portsmouth’s Department of Psychology, “believing, or even remembering something that never happened may have severe consequences. In police interrogations or legal proceedings, for instance, it may lead to false confessions or false allegations, and it would be highly desirable, therefore, to reduce the risk of false memories in such settings.

“In this study, we made an important step in this direction by identifying interview techniques that can empower people to retract their false memories.”

The researchers recruited 52 participants for a study on ‘childhood memories’ and, with the help of the participants’ parents, implanted two false negative memories of events that definitely didn’t happen but were plausible, such as getting lost, running away or being involved in a car accident.

Along with two true events, which had actually happened, participants were persuaded by their parents that all four events were part of their autobiographical memory.

The participants were then asked to recall each event in multiple interview sessions. By the third session, most believed the false events had happened and – similar to previous research – about 40 per cent had developed actual false memories of them.

The researchers then attempted to undo the false memories by using two strategies.

The first involved reminding participants that memories may not always be based on people’s own experience, but also on other sources such as a photograph or a family member’s narrative. They were then asked about the source of each of the four events.

The second strategy involved explaining to them that being asked to repeatedly recall something can elicit false memories. They were asked to revisit their event memories with this in mind.

The result, according to Dr Blank, was that “by raising participants’ awareness of the possibility of false memories, urging them to critically reflect on their recollections and strengthening their trust in their own perspective, we were able to significantly reduce their false memories. Moreover, and importantly, this did not affect their ability to remember true events.

“We designed our techniques so that they can principally be applied in real-world situations. By empowering people to stay closer to their own truth, rather than rely on other sources, we showed we could help them realise what might be false or misremembered – something that could be very beneficial in forensic settings.”

The paper is published in the scientific journal Proceedings of the National Academy of Sciences.


Reference: Aileen Oeberst, Merle Madita Wachendörfer, Roland Imhoff, Hartmut Blank, “Rich false memories of autobiographical events can be reversed”, Proceedings of the National Academy of Sciences Mar 2021, 118 (13) e2026447118; DOI: 10.1073/pnas.2026447118


Provided by University of Portsmouth

How Do Activated T Cells Destroy the Liver? (Medicine)

Self-destructive immune cells cause fatty liver hepatitis

Fatty liver hepatitis can cause severe liver damage and liver cancer. A research team from the Technical University of Munich (TUM) has now discovered that self-destructive, so-called auto-aggressive cells of the immune system are responsible for the disease. This knowledge can help develop new forms of therapy to prevent the consequences of fatty liver hepatitis.

Severe obesity is often associated with fatty liver inflammation, but until now it was largely unknown how the disease develops. The team led by the immunologist Prof. Percy Knolle from TUM has traced this process step by step in mouse model systems, providing important insights into the development of fatty liver hepatitis in humans. “We were also able to observe in patients all the steps that we saw in the model systems,” emphasizes Knolle. The team published the results in the renowned journal Nature.

Auto-aggressive immune cells destroy liver tissue

The immune system protects us from bacteria and viruses and prevents malignant tumors from developing. So-called killer CD8 T cells are particularly important here: they specifically recognize infected body cells and eliminate them. In fatty liver hepatitis, the CD8 T cells have lost this ability to kill selectively. “We have discovered that the immune cells in fatty liver hepatitis are not activated by certain pathogens but by metabolic signals,” explains Michael Dudek, first author of the study. “The T cells activated in this way then indiscriminately destroy all cells in the liver.”

Gradual activation of T cells

Before they become destructive, the immune cells go through a unique, step-by-step activation that was previously unknown. Only when inflammatory signals and products of lipid metabolism act on the immune cells in the right order do they develop their destructive ability. “Similar to entering a security code to open a safe, the T cells are only ‘armed’ through the defined sequence of activation signals,” says Knolle, who is Professor of Molecular Immunology at TUM. The international research team identified an otherwise harmless metabolite as the trigger for the killing of tissue cells: the energy carrier ATP, which must be located outside the cells to have this effect. Auto-aggressive CD8 T cells in the liver that reacted to this extracellular ATP went on to destroy the surrounding liver tissue.

Auto-aggression, but not autoimmunity

The tissue destruction by auto-aggressive immune cells that the researchers discovered differs from conventional autoimmune diseases, in which cells of the immune system specifically target certain body cells. However, tissue-destroying auto-aggressive T cells may also play a role in autoimmune diseases, although this has not yet been established, according to the authors.

New therapies for fatty liver hepatitis

So far, fatty liver hepatitis can only be reversed if the triggering factors, obesity and a high-calorie diet, are eliminated, i.e. if patients change their lifestyle. The knowledge that fatty liver hepatitis is caused by activated immune cells is now providing impetus for the development of new therapies. “The destructive auto-aggressive form of the immune response can be clearly separated from the protective T-cell immune response against viruses and bacteria,” says Knolle. He is confident that targeted immunotherapies can be developed that prevent only the tissue destruction.

Featured image: Prof. Percy Knolle and an international research team have discovered that immune cells are stimulated by certain signals to attack healthy liver cells and thus trigger fatty liver hepatitis. Image: Andreas Heddergott


Publications:

M. Dudek, D. Pfister, S. Donakonda, P. Filpe, A. Schneider, M. Laschinger, D. Hartmann, N. Hüser, P. Meiser, F. Bayerl, D. Inverso, J. Wigger, M. Sebode, R. Öllinger, R. Rad, S. Hegenbarth, M. Anton, A. Guillot, A. Bowman, D. Heide, P. Ramadori, V. Leone, F. Müller, C. Garcia-Caceres, T. Gruber, G. Seifert, A. M. Kabat, J.-P. Malm, S. Reider, M. Effenberger, S. Roth, A. Billeter, B. Müller-Stich, E. J. Pearce, F. Koch-Nolte, R. Käser, H. Tilg, R. Thimme, T. Böttler, F. Tacke, J.-F. Dufour, D. Haller, P. J. Murray, R. Heeren, D. Zehn, J. P. Böttcher, M. Heikenwälder, P. A. Knolle. Auto-aggressive CXCR6+ CD8 T cells cause liver immune pathology in NASH. Nature (2021). DOI: 10.1038/s41586-021-03233-8


Provided by TUM

Is The Nearest Star Cluster to The Sun Being Destroyed? (Planetary Science)

Data from ESA’s Gaia star mapping satellite have revealed tantalising evidence that the nearest star cluster to the Sun is being disrupted by the gravitational influence of a massive but unseen structure in our galaxy.

If true, this might provide evidence for a suspected population of ‘dark matter sub-halos’. These invisible clouds of particles are thought to be relics from the formation of the Milky Way, and are now spread across the galaxy, making up an invisible substructure that exerts a noticeable gravitational influence on anything that drifts too close.

ESA Research Fellow Tereza Jerabkova and colleagues from ESA and the European Southern Observatory made the discovery while studying the way a nearby star cluster is merging into the general background of stars in our galaxy. The discovery was based on Gaia’s Early Data Release 3 (EDR3) and data from the second data release.

The Hyades and their tidal tails © ESA

The team chose the Hyades as their target because it is the nearest star cluster to the Sun. It is located just over 153 light years away, and is easily visible to skywatchers in both northern and southern hemispheres as a conspicuous ‘V’ shape of bright stars that marks the head of the bull in the constellation of Taurus. Beyond the easily visible bright stars, telescopes reveal a hundred or so fainter ones contained in a spherical region of space, roughly 60 light years across.

A star cluster will naturally lose stars because as those stars move within the cluster they tug at each other gravitationally. This constant tugging slightly changes the stars’ velocities, moving some to the edges of the cluster. From there, the stars can be swept out by the gravitational pull of the galaxy, forming two long tails.

One tail trails the star cluster, the other pulls out ahead of it. They are known as tidal tails, and have been widely studied in colliding galaxies but no one had ever seen them from a nearby open star cluster, until very recently.

Video: Locating the Hyades tidal tails: The Hyades is an easily recognisable star cluster in the night sky. The brightest handful of stars define the face of Taurus, the Bull. Telescopes show that the central cluster itself contains many hundreds of fainter stars in a spherical region roughly 60 light years across. Previous studies have shown that stars were ‘leaking’ out of the cluster to form two tails that stretch into space. Gaia has now allowed astronomers to discover the true extent of those tails by tracing former members of the Hyades across the whole sky. The animation was created using Gaia Sky. © ESA/Gaia/DPAC, CC BY-SA 3.0 IGO; acknowledgement: S. Jordan/T. Sagrista.

The key to detecting tidal tails is spotting which stars in the sky are moving in a similar way to the star cluster. Gaia makes this easy because it is precisely measuring the distance and movement of more than a billion stars in our galaxy. “These are the two most important quantities that we need to search for tidal tails from star clusters in the Milky Way,” says Tereza.
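As a rough illustration of that kinematic selection, the sketch below flags catalogue stars whose parallaxes and proper motions lie near a cluster’s reference values. The column names follow the Gaia archive convention (parallax, pmra, pmdec), but the input file, the tolerance values and the Hyades-like reference numbers in the commented usage are assumptions for illustration; the actual study goes further and models the orbits of escaped stars rather than applying a fixed cut.

```python
# Toy kinematic pre-selection of co-moving stars from a Gaia-style catalogue.
# This simple cut is only an illustration; it is not the method of
# Jerabkova et al., which traces escaped stars along simulated orbits.
import numpy as np
import pandas as pd

def select_comoving(catalog: pd.DataFrame,
                    plx0: float, pmra0: float, pmdec0: float,
                    plx_tol: float = 2.0, pm_tol: float = 5.0) -> pd.DataFrame:
    """Keep stars whose parallax [mas] and proper motion [mas/yr] are within
    the given tolerances of the cluster's reference values."""
    close_plx = np.abs(catalog["parallax"] - plx0) < plx_tol
    close_pm = np.hypot(catalog["pmra"] - pmra0,
                        catalog["pmdec"] - pmdec0) < pm_tol
    return catalog[close_plx & close_pm]

# Hypothetical usage (file name and reference values are illustrative only):
# gaia = pd.read_csv("gaia_edr3_field.csv")   # needs parallax, pmra, pmdec columns
# members = select_comoving(gaia, plx0=21.0, pmra0=100.0, pmdec0=-28.0)
# print(len(members), "co-moving candidates")
```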

Previous attempts by other teams had met with only limited success because the researchers had only looked for stars that closely matched the movement of the star cluster. This excluded members that left earlier in its 600–700 million year history and so are now travelling on different orbits.

To understand the range of orbits to look for, Tereza constructed a computer model that would simulate the various perturbations that escaping stars in the cluster might feel during their hundreds of millions of years in space. It was after running this code, and then comparing the simulations to the real data, that the true extent of the Hyades tidal tails was revealed. Tereza and colleagues found thousands of former members in the Gaia data. These stars now stretch for thousands of light years across the galaxy in two enormous tidal tails.

But the real surprise was that the trailing tidal tail seemed to be missing stars. This indicates that something much more brutal is taking place than the star cluster gently ‘dissolving’.

Video: Evolution of Hyades star cluster from ~ 650 million years ago until now © Jerabkova et al., A&A, 2021

Running the simulations again, Tereza showed that the data could be reproduced if that tail had collided with a cloud of matter containing about 10 million solar masses. “There must have been a close interaction with this really massive clump, and the Hyades just got smashed,” she says.

But what could that clump be? There are no observations of a gas cloud or star cluster that massive nearby. If no visible structure is detected even in future targeted searches, Tereza suggests the object could be a dark matter sub-halo. These are naturally occurring clumps of dark matter that are thought to have helped shape the galaxy during its formation. This new work shows how Gaia is helping astronomers map out this invisible dark matter framework of the galaxy.

“With Gaia, the way we see the Milky Way has completely changed. And with these discoveries, we will be able to map the Milky Way’s sub-structures much better than ever before,” says Tereza. And having proved the technique with the Hyades, Tereza and colleagues are now extending the work by looking for tidal tails from other, more distant star clusters.

Featured image: Core of Hyades Cluster © ESA/Gaia/DPAC, CC BY-SA 3.0 IGO; acknowledgement: S. Jordan/T. Sagrista.


Notes for editors

“The 800pc long tidal tails of the Hyades star cluster: Possible discovery of candidate epicyclic over-densities from an open star cluster” by Tereza Jerabkova et al. will be published online by Astronomy and Astrophysics on 24 March 2021. https://www.aanda.org/10.1051/0004-6361/202039949


Provided by ESA

A Better Treatment For Sickle Cell Disease (Medicine)

An organ-on-a-chip device designed by Texas A&M researchers could provide a more personalized approach to addressing the illness.

Sickle cell disease is the most prevalent inherited blood disorder in the world, affecting 70,000 to 100,000 Americans. However, it is considered an orphan disease, meaning it affects fewer than 200,000 people nationally, and it is therefore underrepresented in therapeutic research.

A team led by Abhishek Jain from the Department of Biomedical Engineering at Texas A&M University is working to address this disease.

“I’m trying to create these new types of disease models that can impact health care, with the long-term goal of applying these tools and technologies to lower health care costs,” said Jain, assistant professor in the department. “We strategically wanted to pick those disease systems which fall under the radar in the orphan disease category.”

Jain’s research is in organ-on-a-chip technology, in which human cells are grown on USB-sized devices to mimic the way the organ would work inside the body. This sort of system is ideal for testing new drug treatments, as candidate drugs cannot be tested directly on humans, and animal models have not been shown to be a good representation of how a patient and disease would interact with a treatment. For sickle cell disease patients, the organ-on-a-chip would also be beneficial because patients can present with mild to severe cases.

Jain works with Tanmay Mathur, a fourth-year doctoral student who trained as a chemical engineer in his undergraduate years. His research focused on microfabrication techniques and simulations, skills he said merged well into the organ-on-a-chip research he now performs in Jain’s lab. The team collaborates closely with the Texas Medical Center in Houston.

The work was recently published in the journal Bioengineering & Translational Medicine. Their paper builds off a 2019 publication in the journal Lab on a Chip, in which the team demonstrated that endothelial cells (cells that line the blood vessels) could be used to model the disease physiology of a patient without having to stimulate the model to perform differently than a healthy vessel.

“Traditionally these cells were not used for disease modeling, so in that way our approach is very novel,” Mathur said. “We are one of the first to harness these cells and employ them in disease modeling research.”

Mathur and Jain demonstrate that these models can be used to differentiate between patients. The first step: build a blood vessel that mimics a patient’s vessel. For that the team would need two components — patient blood and endothelial cells. Collecting the blood involved a simple blood draw. They faced a challenge with the endothelial cells, however. They would need to take a biopsy of the cells or use stem cells to grow their own, neither of which was ideal.

Then they found the answer was in the blood.

“What we learned is that within blood samples some endothelial cells are also circulating,” Jain said. “We call them blood outgrowth endothelial cells, and we can harness them very easily. That’s what is new about this work. You can get those cells, grow them so that there are enough in number and then you can make blood vessels.”

Now that they could build the vessels, the next step was to see if these models would show how the disease has various biological impacts in different patients. Again, the goal was to be able to test treatments on these models, so the closer they mimicked their human patient, the better.

“We’re able to differentiate a very severe sickle cell patient in terms of their phenotype from very mild patients,” Mathur said. “Moving forward, we can take a larger population of any sickle cell disease patients and assess them using our organ-chip technology and then categorize them into different groups based on symptoms.”

Their findings indicate that these organs-on-a-chip could lead to patient-centric, personalized treatment, improving how clinicians approach this and other cardiovascular diseases.

“When you take it to the field, now it can become a predictive device,” Jain said. “Now you do not have to know whether the patient is mild or severe, you can test for that. You can predict if a patient’s case is serious and can dictate their therapeutic needs.”

The next step is to continue to expand the patient cohort to collect more results. A long-term goal would be to use the patient information collected to develop a database to better predict disease progression.

“You take a history of a lot of these patients and their cardiovascular health with this device, and you can predict which patient might have a better chance of having a stroke and you start treating them early on,” Jain said.

Mathur said even with future challenges, he looks forward to continuing their research.

“I think even though it may take 10, 15 years, we will at least push forward some of the research that we’re doing and get it out in the clinical field,” he said. “We are one of the only groups in the world that have started this field of personalized treatment. I feel that our impact is pretty high, and I’m sure we will be able to expand the same treatment to other cardiovascular diseases and attract more attention and deeper insights into the biology that we are looking at.”

This work is funded by a Trailblazer Award Jain received from the National Institute of Biomedical Imaging and Bioengineering.

Featured image: Doctoral student Tanmay Mathur (left) and Abhishek Jain review photos of blood cells formed on the organ-on-a-chip in their lab. © Texas A&M Engineering


Reference: Mathur, T, Flanagan, JM, Jain, A. Tripartite collaboration of blood‐derived endothelial cells, next generation RNA sequencing and bioengineered vessel‐chip may distinguish vasculopathy and thrombosis among sickle cell disease patients. Bioeng Transl Med. 2021;e10211. https://doi.org/10.1002/btm2.10211


Provided by Texas A&M University

Even Small Levels Of Nitrate In Drinking Water Result In Smaller Babies (Medicine)

It appears that the weight of newborn babies decreases if even small amounts of nitrate are present in the drinking water that mothers drink before and during pregnancy. This is shown by a major new register-based study carried out by researchers at Aarhus University and the University of Illinois at Chicago, USA. The researchers now question whether the current threshold value is too high.

The more nitrate there is in mothers’ drinking water, the smaller the babies they give birth to. But alarmingly, the declining birth weight can also be registered when the women are exposed to nitrate levels below the EU’s threshold of 50 milligrams of nitrate per litre.

This is shown by a register-based study of more than 850,000 births in Denmark, carried out in a Danish-American partnership led by Professor Torben Sigsgaard from the Department of Public Health at Aarhus University and Vanessa Coffman, PhD, from the corresponding department at the University of Illinois, Chicago (UIC).

Both shorter and weighing ten grams less

On the basis of Danish registry data, the research group concluded that babies born to mothers whose drinking water contains between 25 and 50 milligrams of nitrate per litre – i.e. from half of the current threshold value up to the maximum limit – weigh on average ten grams less than babies born to mothers with smaller amounts of nitrate in their tap water. Not only did the babies weigh less, they were also slightly shorter, while their head size was unaffected by the amount of nitrate – the form of nitrogen runoff from the agricultural sector that most frequently appears in groundwater.

According to Professor Torben Sigsgaard from Aarhus University, it is difficult to say whether we should be concerned about public health in areas with high amounts of nitrate:

“The difference in body length and weight doesn’t sound like much at first as it’s on average only ten grams, but this is not insignificant if the newborn also begins life as underweight for other reasons. Birth weight is generally recognised as having a life-long impact on a person’s health and development,” says Torben Sigsgaard.

“There is no doubt that the results of the study challenge the threshold value that is in place throughout the Western world, and that any changes will be a bit like turning around a supertanker. But it’s important to discuss these results,” he adds with reference to the WHO, EU and American authorities who all view drinking water as harmful when the content of nitrates is higher than fifty milligrams per litre.

Blue children or small children?

The study was initiated because it has long been known that very high nitrate concentrations may lead to people being exposed to nitrite. This inhibits the body’s ability to absorb oxygen and can lead to the dangerous blue-baby syndrome, or methemoglobinemia to give it its medical name. Nitrate in drinking water is also suspected of causing other chronic diseases, including bowel cancer. Research has also documented how, depending on local geological and geochemical conditions in the earth, the fertiliser used in agriculture percolates down into the groundwater to a greater or lesser extent.

“With the study, we’ve established that there is a need to explore the effect of the low nitrate concentrations in the drinking water, if we’re to assess the adequacy of the current threshold values – and this is possible thanks to the unique Danish registers. It wouldn’t be possible to carry out corresponding studies on the basis of US data alone, because such data simply doesn’t exist,” says Torben Sigsgaard.


The research results – more information

  • The register-based study compares data from more than 850,000 births in Denmark during the period 1991-2011 including the weight, height and head size of the newborn babies – before in turn comparing this with the content of nitrate in drinking water via the parents’ place of residence and the GEUS Jupiter register, which contains information about the quality of drinking water in Danish households based on more than 300,000 nitrate samples from routine water quality monitoring. GEUS stands for Geological Survey of Denmark and Greenland.
  • Vanessa Coffman and Leslie Stayner from the University of Illinois, Chicago (UIC) and Jörg Schullehner and Birgitte Hansen from GEUS, were important partners on the project. Additional contributions came from colleagues at Aarhus University, including the “Drinking water group” under the Center for Integrated Register-based Research (CIRRAU), as well as researchers from the University of Copenhagen.
  • The study is financed by the National Institute of Environmental Health Sciences (NIEHS), USA.

Featured image: There is no doubt that the results of the study challenge the threshold value that is in place throughout the Western world, says professor Torben Sigsgaard. Photo: Greg McQueen Photography.


Reference:

Vanessa R. Coffman, Anja Søndergaard Jensen, Betina B. Trabjerg, Carsten B. Pedersen, Birgitte Hansen, Torben Sigsgaard, Jørn Olsen, Inger Schaumburg, Jörg Schullehner, Marie Pedersen, and Leslie T. Stayner, 2021, “Prenatal Exposure to Nitrate from Drinking Water and Markers of Fetal Growth Restriction: A Population-Based Study of Nearly One Million Danish-Born Children”, Environmental Health Perspectives 129:2 CID: 027002. https://doi.org/10.1289/EHP7331


Provided by Aarhus University

Maletic-Savatic Lab Discovers a Novel Marker of Adult Human Neural Stem Cells (Neuroscience)

The mammalian center for learning and memory, the hippocampus, has a remarkable capacity to generate new neurons throughout life. Newborn neurons are produced by neural stem cells (NSCs), and they are crucial for forming the neural circuits required for learning, memory, and mood control. During aging, the number of NSCs declines, leading to decreased neurogenesis and age-associated cognitive decline, anxiety, and depression. Thus, identifying the core molecular machinery responsible for NSC preservation is of fundamental importance if we are to use neurogenesis to halt or reverse hippocampal age-related pathology.

While an increasing number of tools is available to study NSCs and neurogenesis in mouse models, one of the major hurdles in exploring this fundamental biological process in the human brain is the lack of specific NSC markers amenable to advanced imaging and in vivo analysis. A team of researchers led by Dr. Mirjana Maletić-Savatić, associate professor at Baylor College of Medicine and investigator at the Jan and Dan Duncan Neurological Research Institute at Texas Children’s Hospital, and Dr. Louis Manganas, associate professor at Stony Brook University, decided to tackle this problem in a rather unusual way. They reasoned that if they could find proteins present on the surface of NSCs, then they could eventually make agents to “see” NSCs in the human brain.

“The ultimate goal of our research is to maintain neurogenesis throughout life at the same level as it is in the young brains, to prevent the decline in our cognitive capabilities and reduce the tendency towards mood disorders such as depression, as we age. To do that, however, we first need to better understand this elusive, yet fundamental process in humans. However, we do not have the tools to study this process in live humans and all the knowledge we have gathered so far comes from analyses of the postmortem brains. And we cannot develop tools to detect this process in people because existing NSC markers are present within cells and unreachable for in vivo visualization,” Maletić-Savatić said. “So, in collaboration with our colleagues from New York and Spain, we undertook this study to find surface markers and then develop tools such as ligands for positron emission tomography (PET) to visualize them using advanced real-time in vivo brain imaging.” 

Typically, antibodies are made against known antigens but the team set out to generate antibodies for unknown target proteins, which made their mission rather challenging. They solved this problem by relying on an age-old method of generating antibodies by injecting mice with whole-cell or membrane preparations. This resulted in 1648 clones out of which 39 reacted with NSCs. Upon closer examination, one potential candidate most strongly labeled NSCs. Mass spectrometric analysis of the human hippocampal tissue identified the target protein as the Brain-Abundant Signal Protein 1 (BASP-1), previously shown to be present in the neurons of the mouse brain but not in NSCs. Interestingly, the specific antibody that recognizes BASP-1 in NSCs did not label neurons or any other cells apart from NSCs, indicating that it could be used to visualize these cells in the live mammalian brain.  

“Using our new antibody, we found that BASP-1 is restricted to NSCs in neurogenic niches in the mammalian brains, including humans, during development in utero and after birth. Thus, our work identified membrane-bound BASP-1 protein as a possible biomarker of NSCs that would allow us to examine the mechanisms of adult human neurogenesis as well as to explore its role in the process,” Maletić-Savatić concluded. 

With this newly discovered biomarker, scientists can better understand the relevance and intricate mechanisms of neurogenesis, which may lead to new therapeutic approaches to treat and manage neurological and neuropsychiatric disorders associated with diminished neurogenesis. The study was published in the journal Scientific Reports.

Other authors involved in the study include Louis N. Manganas, Irene Durá, Sivan Osenberg, Fatih Semerci, Mehmet Tosun, Rachana Mishra, Luke Parkitny and Juan M. Encinas. They are affiliated with one or more of the following institutions: Baylor College of Medicine (BCM), Texas Children’s Hospital, Jan and Dan Duncan Neurological Research Institute, Achucarro Basque Center for Neuroscience, Stony Brook University Medical Center, and the Basque Foundation for Science. The study was funded by the grants from the National Institutes of Health, U.S. Army Medical Research, Cynthia and Antony Petrello Endowment, and Mark A. Wallace Endowment; the National Institute of Diabetes and Digestive and Kidney Diseases, MINECO, FPI MICINN predoctoral fellowship; the Proteomics Center at Stony Brook University, and the BCM IDDRC.

Featured image: Dr. Mirjana Maletic-Savatic © TCH


Reference: Manganas, L.N., Durá, I., Osenberg, S. et al. BASP1 labels neural stem cells in the neurogenic niches of mammalian brain. Sci Rep 11, 5546 (2021). https://doi.org/10.1038/s41598-021-85129-1


Provided by Texas Children’s Hospital

Searching For Hints Of New Physics In The Subatomic World (Particle Physics)

Particle physicists use lattice quantum chromodynamics and supercomputers to search for physics beyond the Standard Model

Peer deeper into the heart of the atom than any microscope allows and scientists hypothesize that you will find a rich world of particles popping in and out of the vacuum, decaying into other particles, and adding to the weirdness of the visible world. These subatomic particles are governed by the quantum nature of the Universe and find tangible, physical form in experimental results.

Some subatomic particles were first discovered over a century ago with relatively simple experiments. More recently, however, the endeavor to understand these particles has spawned the largest, most ambitious and complex experiments in the world, including those at particle physics laboratories such as the European Organization for Nuclear Research (CERN) in Europe, Fermilab in Illinois, and the High Energy Accelerator Research Organization (KEK) in Japan.

These experiments have a mission to expand our understanding of the Universe, characterized most harmoniously in the Standard Model of particle physics; and to look beyond the Standard Model for as-yet-unknown physics.

“The Standard Model explains so much of what we observe in elementary particle and nuclear physics, but it leaves many questions unanswered,” said Steven Gottlieb, distinguished professor of Physics at Indiana University. “We are trying to unravel the mystery of what lies beyond the Standard Model.”

Ever since the beginning of the study of particle physics, experimental and theoretical approaches have complemented each other in the attempt to understand nature. In the past four to five decades, advanced computing has become an important part of both approaches. Great progress has been made in understanding the behavior of the zoo of subatomic particles, including bosons (especially the long sought and recently discovered Higgs boson), various flavors of quarks, gluons, muons, neutrinos and many states made from combinations of quarks or anti-quarks bound together.

Quantum field theory is the theoretical framework from which the Standard Model of particle physics is constructed. It combines classical field theory, special relativity and quantum mechanics, developed with contributions from Einstein, Dirac, Fermi, Feynman, and others. Within the Standard Model, quantum chromodynamics, or QCD, is the theory of the strong interaction between quarks and gluons, the fundamental particles that make up some of the larger composite particles such as the proton, neutron and pion.

PEERING THROUGH THE LATTICE

Carleton DeTar and Steven Gottlieb are two of the leading contemporary scholars of QCD research and practitioners of an approach known as lattice QCD. Lattice QCD represents continuous space as a discrete set of spacetime points (called the lattice). It uses supercomputers to study the interactions of quarks, and importantly, to determine more precisely several parameters of the Standard Model, thereby reducing the uncertainties in its predictions. It’s a slow and resource-intensive approach, but it has proven to have wide applicability, giving insight into parts of the theory inaccessible by other means, in particular the explicit forces acting between quarks and antiquarks.

A plot of the Unitarity Triangle, a good test of the Standard Model, showing constraints in the (ρ̄, η̄) plane. The shaded areas are 95% confidence-level (CL) regions. [Credit: A. Ceccucci (CERN), Z. Ligeti (LBNL) and Y. Sakai (KEK)]

DeTar and Gottlieb are part of the MIMD Lattice Computation (MILC) Collaboration and work very closely with the Fermilab Lattice Collaboration on the vast majority of their work. They also work with the High Precision QCD (HPQCD) Collaboration for the study of the muon anomalous magnetic moment. As part of these efforts, they use the fastest supercomputers in the world.

Since 2019, they have used Frontera at the Texas Advanced Computing Center (TACC) — the fastest academic supercomputer in the world and the 9th fastest overall — to propel their work. They are among the largest users of that resource, which is funded by the National Science Foundation. The team also uses Summit at the Oak Ridge National Laboratory (the #2 fastest supercomputer in the world); Cori at the National Energy Research Scientific Computing Center (#20), and Stampede2 (#25) at TACC, for the lattice calculations.

The efforts of the lattice QCD community over decades have brought greater accuracy to particle predictions through a combination of faster computers and improved algorithms and methodologies.

“We can do calculations and make predictions with high precision for how strong interactions work,” said DeTar, professor of Physics and Astronomy at the University of Utah. “When I started as a graduate student in the late 1960s, some of our best estimates were within 20 percent of experimental results. Now we can get answers with sub-percent accuracy.”

In particle physics, physical experiment and theory travel in tandem, informing each other, but sometimes producing different results. These differences suggest areas of further exploration or improvement.

“There are some tensions in these tests,” said Gottlieb, distinguished professor of Physics at Indiana University. “The tensions are not large enough to say that there is a problem here — the usual requirement is at least five standard deviations. But it means either you make the theory and experiment more precise and find that the agreement is better; or you do it and you find out, ‘Wait a minute, what was the three sigma tension is now a five standard deviation tension, and maybe we really have evidence for new physics.'”

DeTar calls these small discrepancies between theory and experiment ‘tantalizing.’ “They might be telling us something.”

Over the last several years, DeTar, Gottlieb and their collaborators have followed the paths of quarks and antiquarks with ever-greater resolution as they move through a background cloud of gluons and virtual quark-antiquark pairs, as prescribed precisely by QCD. The results of the calculation are used to determine physically meaningful quantities such as particle masses and decays.

Results for the B → πℓν semileptonic form factor (a function that encapsulates the properties of a certain particle interaction without including all of the underlying physics). The results from the FNAL/MILC 15 collaboration are the only ones that achieved the highest quality rating (green star) from the Flavour Lattice Averaging Group (FLAG) for control of continuum extrapolation and finite volume effects. [Credit: Y. Aoki, D. Bečirević, M. Della Morte, S. Gottlieb, D. Lin, E. Lunghi, C. Pena]

One of the current state-of-the-art approaches that is applied by the researchers uses the so-called highly improved staggered quark (HISQ) formalism to simulate interactions of quarks with gluons. On Frontera, DeTar and Gottlieb are currently simulating at a lattice spacing of 0.06 femtometers (10⁻¹⁵ meters), but they are quickly approaching their ultimate goal of 0.03 femtometers, a distance where the lattice spacing is smaller than the wavelength of the heaviest quark, consequently removing a significant source of uncertainty from these calculations.

Each doubling of resolution, however, requires about two orders of magnitude more computing power, putting a 0.03 femtometers lattice spacing firmly in the quickly-approaching ‘exascale’ regime.
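To put that scaling in numbers: if halving the lattice spacing costs roughly a factor of 100 in compute, the cost grows like a^(-k) with k = log2(100) ≈ 6.6. The short sketch below just evaluates that relation; the exponent and the choice of 0.06 fm as the reference point are assumptions taken from the statement above, not from the collaborations’ actual budgeting.

```python
# Back-of-the-envelope cost scaling implied by "~100x more compute per halving
# of the lattice spacing": cost ~ a**(-k) with k = log2(100). The reference
# spacing of 0.06 fm and the exponent are illustrative assumptions only.
import math

k = math.log(100.0, 2.0)   # ~6.64

def relative_cost(a_fm: float, a_ref: float = 0.06) -> float:
    """Cost relative to a run at the reference lattice spacing a_ref (in fm)."""
    return (a_ref / a_fm) ** k

for a in (0.06, 0.042, 0.03):
    print(f"a = {a:0.3f} fm  ->  ~{relative_cost(a):,.0f}x the cost at 0.06 fm")
```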

“The costs of calculations keeps rising as you make the lattice spacing smaller,” DeTar said. “For smaller lattice spacing, we’re thinking of future Department of Energy machines and the Leadership Class Computing Facility [TACC’s future system in planning]. But we can make do with extrapolations now.”

THE ANOMALOUS MAGNETIC MOMENT OF THE MUON AND OTHER OUTSTANDING MYSTERIES

Among the phenomena that DeTar and Gottlieb are tackling is the anomalous magnetic moment of the muon (essentially a heavy electron) – which, in quantum field theory, arises from a weak cloud of elementary particles that surrounds the muon. The same sort of cloud affects particle decays. Theorists believe yet-undiscovered elementary particles could potentially be in that cloud.

A large international collaboration called the Muon g-2 Theory Initiative recently reviewed the present status of the Standard Model calculation of the muon’s anomalous magnetic moment. Their review appeared in Physics Reports in December 2020. DeTar, Gottlieb and several of their Fermilab Lattice, HPQCD and MILC collaborators are among the coauthors. They find a 3.7 standard deviation difference between experiment and theory.
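That “3.7 standard deviation difference” is simply the experiment-minus-theory gap divided by the combined uncertainty. The sketch below reproduces the arithmetic using the central values and uncertainties commonly quoted from that review (in units of 10⁻¹¹); treat them as illustrative inputs and consult the review itself for the definitive numbers.

```python
# Tension between the measured and predicted muon anomalous magnetic moment,
# expressed in standard deviations. The numbers below (in units of 1e-11) are
# the values commonly quoted from the 2020 Muon g-2 Theory Initiative review,
# reproduced here as illustrative inputs rather than authoritative figures.
import math

a_mu_exp, sigma_exp = 116_592_089.0, 63.0   # experimental world average
a_mu_sm,  sigma_sm  = 116_591_810.0, 43.0   # Standard Model prediction

gap = a_mu_exp - a_mu_sm                          # ~279 x 1e-11
sigma_combined = math.hypot(sigma_exp, sigma_sm)  # uncertainties in quadrature
print(f"tension = {gap / sigma_combined:.1f} standard deviations")   # ~3.7
```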

“… the processes that were important in the earliest instance of the Universe involve the same interactions that we’re working with here. So, the mysteries we’re trying to solve in the microcosm may very well provide answers to the mysteries on the cosmological scale as well.”

— Carleton DeTar, Professor of Physics, University of Utah

While some parts of the theoretical contributions can be calculated with extreme accuracy, the hadronic contributions (the class of subatomic particles that are composed of two or three quarks and participate in strong interactions) are the most difficult to calculate and are responsible for almost all of the theoretical uncertainty. Lattice QCD is one of two ways to calculate these contributions.

“The experimental uncertainty will soon be reduced by up to a factor of four by the new experiment currently running at Fermilab, and also by the future J-PARC experiment,” they wrote. “This and the prospects to further reduce the theoretical uncertainty in the near future… make this quantity one of the most promising places to look for evidence of new physics.”

Gottlieb, DeTar and collaborators have calculated the hadronic contribution to the anomalous magnetic moment with a precision of 2.2 percent. “This gives us confidence that our short-term goal of achieving a precision of 1 percent on the hadronic contribution to the muon anomalous magnetic moment is now a realistic one,” Gottlieb said. They hope to achieve a precision of 0.5 percent a few years later.

Other ‘tantalizing’ hints of new physics involve measurements of the decay of B mesons. There, various experimental methods arrive at different results. “The decay properties and mixings of the D and B mesons are critical to a more accurate determination of several of the least well-known parameters of the Standard Model,” Gottlieb said. “Our work is improving the determinations of the masses of the up, down, strange, charm and bottom quarks and how they mix under weak decays.” The mixing is described by the so-called CKM mixing matrix for which Kobayashi and Maskawa won the 2008 Nobel Prize in Physics.

The answers DeTar and Gottlieb seek are the most fundamental in science: What is matter made of? And where did it come from?

“The Universe is very connected in many ways,” said DeTar. “We want to understand how the Universe began. The current understanding is that it began with the Big Bang. And the processes that were important in the earliest instance of the Universe involve the same interactions that we’re working with here. So, the mysteries we’re trying to solve in the microcosm may very well provide answers to the mysteries on the cosmological scale as well.”

Recent Collaborations of DeTar and Gottlieb

Hadronic-vacuum-polarization contribution to the muon’s anomalous magnetic moment from four-flavor lattice QCD, C.T.H. Davies(Glasgow U.), C. DeTar(Utah U.), A.X. El-Khadra(Illinois U., Urbana and Fermilab), E. Gámiz(CAFPE, Granada and Granada U., Theor. Phys. Astrophys.), Steven Gottlieb(Indiana U.), D. Hatton(Glasgow U.), A.S. Kronfeld(Fermilab and TUM-IAS, Munich), J. Laiho(Syracuse U.), G.P. Lepage(Cornell U., LEPP), Yuzhi Liu(Indiana U.), P.B. Mackenzie(Fermilab), C. McNeile(Plymouth U.), E.T. Neil(Colorado U.), T. Primer(Arizona U.), J.N. Simone(Fermilab), D. Toussaint(Arizona U.), R.S. Van de Water(Fermilab), A. Vaquero(Utah U.) Published in: Phys.Rev.D 101 (2020) 3, 034512 e-Print: 1902.04223 [hep-lat] DOI: 10.1103/PhysRevD.101.034512

Up-, down-, strange-, charm-, and bottom-quark masses from four-flavor lattice QCD, A. Bazavov(Michigan State U.), C. Bernard(Washington U., St. Louis), N. Brambilla(Munich, Tech. U. and TUM-IAS, Munich), N. Brown(Washington U., St. Louis), C. DeTar(Utah U.), A.X. El-Khadra(Illinois U., Urbana and Fermilab), E. Gámiz(CAFPE, Granada and Granada U., Theor. Phys. Astrophys.), Steven Gottlieb(Indiana U.), U.M. Heller(APS, New York), J. Komijani(Munich, Tech. U. and TUM-IAS, Munich and Glasgow U.), A.S. Kronfeld(Fermilab and TUM-IAS, Munich), J. Laiho(Syracuse U.), P.B. Mackenzie(Fermilab), E.T. Neil(Colorado U. and RIKEN BNL), J.N. Simone(Fermilab), R.L. Sugar(UC, Santa Barbara), D. Toussaint(Arizona U.), A. Vairo(Munich, Tech. U.), R.S. Van de Water(Fermilab) Published in: Phys.Rev.D 98 (2018) 5, 054517 e-Print: 1802.04248 [hep-lat] DOI: 10.1103/PhysRevD.98.054517

B- and D-meson leptonic decay constants from four-flavor lattice QCD, A. Bazavov(Michigan State U.), C. Bernard(Washington U., St. Louis), N. Brown(Washington U., St. Louis), C. Detar(Utah U.), A.X. El-Khadra(Illinois U., Urbana and Fermilab), E. Gámiz(CAFPE, Granada and Granada U., Theor. Phys. Astrophys.), Steven Gottlieb(Indiana U.), U.M. Heller(APS, New York), J. Komijani(Munich, Tech. U. and TUM-IAS, Munich), A.S. Kronfeld(Fermilab and TUM-IAS, Munich), J. Laiho(Syracuse U.), P.B. Mackenzie(Fermilab), E.T. Neil(Colorado U. and RIKEN BNL), J.N. Simone(Fermilab), R.L. Sugar(UC, Santa Barbara), D. Toussaint(Arizona U., Astron. Dept. – Steward Observ. and Glasgow U.), R.S. Van De Water(Fermilab) Published in: Phys.Rev.D 98 (2018) 7, 074512 e-Print: 1712.09262 [hep-lat] DOI: 10.1103/PhysRevD.98.074512

Featured image: This plot shows how the decay properties of a meson made from a heavy quark and a light quark change when the lattice spacing and heavy quark mass are varied in the calculation. [Credit: A. Bazavov (Michigan State U.), C. Bernard (Washington U., St. Louis), N. Brown (Washington U., St. Louis), C. DeTar (Utah U.), A.X. El-Khadra (Illinois U., Urbana and Fermilab) et al.]


Provided by TACC

Three Common Antiviral Drugs Potentially Effective Against COVID-19 (Medicine)

A new study has found that three commonly used antiviral and antimalarial drugs are effective in vitro at preventing replication of SARS-CoV-2, the virus that causes COVID-19. The work also underscores the necessity of testing compounds against multiple cell lines to rule out false negative results.

The team, which included researchers from North Carolina State University and Collaborations Pharmaceuticals, looked at three antiviral drugs that have proven effective against Ebola and the Marburg virus: tilorone, quinacrine and pyronaridine.

“We were looking for compounds that could block the entry of the virus into the cell,” says Ana Puhl, senior scientist at Collaborations Pharmaceuticals and co-corresponding author of the research. “We chose these compounds because we know that other antivirals which successfully act against Ebola are also effective inhibitors of SARS-CoV-2.”

The compounds were tested in vitro against SARS-CoV-2, as well as against a common cold virus (HCoV 229E) and murine hepatitis virus (MHV). Researchers utilized a variety of cell lines that represented potential targets for SARS-CoV-2 infection in the human body. They infected the cell lines with the different viruses and then looked at how well the compounds prevented viral replication in the cells.

The results were mixed, with the compounds’ effectiveness depending upon whether they were used in human-derived cell lines versus monkey-derived cell lines, known as Vero cell lines.

“In the human-derived cell lines, we found that all three compounds worked similarly to remdesivir, which is currently being used to treat COVID-19,” says Frank Scholle, associate professor of biology at NC State and co-author of the research. “However, they were not at all effective in the Vero cells.”

“Researchers saw similar results when these compounds were initially tested against Ebola,” says Sean Ekins, CEO of Collaborations Pharmaceuticals and co-corresponding author of the research. “They were effective in human-derived cell lines, but not in Vero cells. This is important because Vero cells are one of the standard models used in this type of testing. In other words, different cell lines may have differing responses to a compound. It points to the necessity of testing compounds in many different cell lines to rule out false negatives.”

Next steps for the research include testing the compounds’ effectiveness in a mouse model and further work on understanding how they inhibit viral replication.

“One of the more interesting findings here is that these compounds don’t just prevent the virus from potentially binding to the cells, but that they may also inhibit viral activity because these compounds are acting on the lysosomes,” Puhl says. “Lysosomes, which are important for normal cell function, are hijacked by the virus for entry and exit out of the cell. So, if that mechanism is disrupted, it cannot infect other cells.”

“It’s also interesting that these compounds are effective not just against SARS-CoV-2, but against related coronaviruses,” Scholle says. “It could give us a head start on therapies as new coronaviruses emerge.”

The work appears in ACS Omega and was supported in part by NC State’s Comparative Medicine Institute and the National Institutes of Health. NC State undergraduates James Levi and Nicole Johnson, as well as Ralph Baric, from the University of North Carolina at Chapel Hill, contributed to the work. Other collaborating institutions included: Instituto Oswaldo Cruz and University of Campinas, both in Brazil; Utah State University; the University of Maryland; and SRI International.

Featured image: A graphical abstract shows the ultrastructural morphology exhibited by coronavirus © CDC


Reference: Ana Puhl, Sean Ekins, Collaborations Pharmaceuticals; Frank Scholle, James Levi, Nicole Johnson, NC State University; et al., “Repurposing the Ebola and Marburg Virus Inhibitors Tilorone, Quinacrine, and Pyronaridine: In Vitro Activity against SARS-CoV‑2 and Potential Mechanisms”, ACS Omega, March 12, 2021. DOI: 10.1021/acsomega.0c05996


Provided by NC State University