Study Finds Potential Therapeutic Target for Pediatric Acute Myeloid Leukemia (Medicine)

Novel gene variants may point the way to precision medicine for AML in children

Researchers have identified a gene expressed in children with acute myeloid leukemia (AML) that could serve as a new immunotherapy treatment target, according to a new study published today in Blood Advances, a journal of the American Society of Hematology. The study, co-authored by researchers with Nemours Children’s Health System, outlines the process and potential path for new immunotherapy drugs that improve survival and reduce treatment-related toxicity in children with AML.

Leukemia is the most common cancer in children and teens, and AML accounts for nearly one-fourth of those cases. AML is a fast-growing cancer that typically starts in immature bone marrow cells.

“Using genomic sequencing data, we identified novel targets for children’s cancer and worked with collaborators to engineer new therapies for children with AML, rather than repurpose drugs from the adult cancer realm that don’t work well in children,” said E. Anders Kolb, MD, director of Nemours’ Center for Childhood Cancer Research and a senior author of the study.

The researchers obtained genomic data from more than 2,000 pediatric patients with leukemia to identify associated gene variants. Through genomic sequencing, they found that the gene mesothelin (MSLN) is abnormally expressed in more than one-third of childhood and young adult AML cases but absent in normal bone marrow cells.

After this discovery, the researchers selected new immunotherapy drugs targeting MSLN and tested them in cell lines and animal models to gauge the pre-clinical effectiveness of leukemia therapies. Two experimental immunotherapy drugs were tested: anetumab ravtansine (Bayer), which is being tested in adult cancers, and a new compound, anti-MSLN-DGN462 (ImmunoGen). In lab testing and in mouse models, each drug produced potent destruction of leukemia cells. These drugs belong to a new class of cancer treatments known as antibody-drug conjugates (ADCs), which combine an antibody with a cancer-killing toxin. The antibody targets specific types of cancer cells and delivers the toxin directly to them, minimizing damage to healthy cells.

“We are working to show a proof of principle that we can create custom therapies for pediatric malignancies and turn the drugs we’re testing in the lab into clinical trials,” said Sonali P. Barwe, PhD, the study’s co-lead author and head of the Preclinical Leukemia Testing Laboratory in Nemours’ Center for Childhood Cancer Research.

The rapid evolution of genomic sequencing funded by the National Institutes of Health has led to the identification of new gene targets that are relevant for a significant number of patients. In addition, local organizations, such as the Leukemia Research Foundation of Delaware, have funded efforts like this study by Nemours to find new treatments.


Reference: Allison J. Kaeding, Sonali P. Barwe, Anilkumar Gopalakrishnapillai, Rhonda E. Ries, Todd A. Alonzo, Robert B. Gerbing, Colin Correnti, Michael R. Loken, Lisa Eidenschink Broderson, Laura Pardo, Quy H. Le, Thao Tang, Amanda R. Leonti, Jenny L. Smith, Cassie K. Chou, Min Xu, Tim Triche, Steven M. Kornblau, E. Anders Kolb, Katherine Tarlock, Soheil Meshinchi; Mesothelin is a novel cell surface disease marker and potential therapeutic target in acute myeloid leukemia. Blood Adv 2021; 5 (9): 2350–2361. doi: https://doi.org/10.1182/bloodadvances.2021004424


Provided by Nemours Children’s Health System

New Study Deconstructs Dunbar’s Number (Biology)

Yes, you can have more than 150 friends

An individual human can maintain stable social relationships with about 150 people. This is the proposition known as ‘Dunbar’s number’ – that the architecture of the human brain sets an upper limit on our social lives. A new study from Stockholm University indicates that a cognitive limit on human group sizes cannot be derived in this manner.

Dunbar’s number is named after the British anthropologist Robin Dunbar, who proposed the theory in the 1990s. The number 150 is based on an extrapolation of the correlation between the relative size of the neocortex and group sizes in non-human primates. Some empirical studies have found support for this number, while others have reported different group sizes.

“The theoretical foundation of Dunbar’s number is shaky. Other primates’ brains do not handle information exactly as human brains do, and primate sociality is primarily explained by other factors than the brain, such as what they eat and who their predators are. Furthermore, humans have a large variation in the size of their social networks,” says Patrik Lindenfors, Associate Professor of Zoological Ecology at Stockholm University and the Institute for Futures Studies, and one of the authors of the study.

When the Swedish researchers repeated Dunbar’s analyses using modern statistical methods and updated data on primate brains, the resulting estimates fell both far above and far below 150.

The average maximum group size often turned out to be lower than 150 persons. But the main problem was that the 95% confidence intervals for these estimates ranged from 2 to 520 people.
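Why extrapolation produces such wide intervals can be illustrated with a toy regression. The sketch below uses synthetic data and plain numpy (it is not the study's dataset or code): the half-width of a regression prediction interval grows with the distance of the prediction point from the mean of the observed predictors, so predicting human group size from primate data far outside the observed range inflates the uncertainty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for primate data: log group size vs. log relative
# neocortex size (illustrative values only, not the real measurements).
x = rng.uniform(0.0, 1.0, 30)                # log neocortex ratio
y = 1.0 + 2.0 * x + rng.normal(0, 0.4, 30)   # log group size + noise

n = len(x)
xbar = x.mean()
Sxx = ((x - xbar) ** 2).sum()
b = ((x - xbar) * (y - y.mean())).sum() / Sxx  # OLS slope
a = y.mean() - b * xbar                        # OLS intercept
resid = y - (a + b * x)
s = np.sqrt((resid ** 2).sum() / (n - 2))      # residual std. error

def pred_halfwidth(x0, z=1.96):
    """Approximate 95% prediction-interval half-width at x0
    (normal quantile used in place of the t quantile for brevity)."""
    return z * s * np.sqrt(1 + 1 / n + (x0 - xbar) ** 2 / Sxx)

inside = pred_halfwidth(xbar)   # predicting at the center of the data
outside = pred_halfwidth(3.0)   # extrapolating far beyond the data
print(inside, outside)
```

The interval at the extrapolated point is substantially wider than at the center of the data, and because such analyses work on a log scale, back-transforming a modestly wider log-interval yields the kind of huge spread (2 to 520 people) the authors report.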

“It is not possible to make an estimate for humans with any precision using available methods and data,” says Andreas Wartel, co-author of the study.

‘Dunbar’s number’ is often cited and has had a great impact in popular culture, not least after featuring prominently in Malcolm Gladwell’s book “The Tipping Point”. In 2007, Swedish media reported that the Swedish Tax Authority had reorganized its offices to stay within the 150-person limit.

“This reorganization would then be based on the implicit, but hopefully unintended, assumption that their employees have neither family nor friends outside of work,” says Patrik Lindenfors, and adds, “I think Dunbar’s number has spread so widely, also among researchers, because it’s so easy to understand. Our claim that it is not possible to calculate a number is not quite as entertaining.”

Ideas such as Dunbar’s number highlight questions about the long reach of the gene.

“Are human social interactions genetically limited via the genes’ influence on the brain’s architecture? New research on cultural evolution has revealed the importance of cultural inheritance for what humans do and how we think. Culture affects everything from the size of social networks to whether we can play chess or whether we like hiking. Just as someone can learn to remember an enormous number of decimals of pi, our brain can be trained to maintain more social contacts,” says Johan Lind, deputy director of the Centre for Cultural Evolution at Stockholm University and co-author of the study.

The article “‘Dunbar’s number’ deconstructed” was published on the open research repository Zenodo in 2021. https://zenodo.org/record/4638943#.YJIUSZlN2Nw

Featured image: Does the brain limit our social capacity, in monkeys as well as humans? © Johan Lind/N


Provided by Stockholm University

UNC Charlotte Researchers Analyzed the Host Origins of SARS-CoV-2 and Other Coronaviruses (Biology)

Coronavirus (CoV) infections in animals and humans are not new. The earliest papers on coronavirus infection in the scientific literature date to 1966. However, prior to SARS-CoV, MERS-CoV, and SARS-CoV-2, coronaviruses received very little attention.

Suddenly, coronaviruses changed everything we knew about personal and public health and about societal and economic well-being. The change led to rushed analyses to understand the origins of coronaviruses in humans. This rush has produced a thus far fruitless search for intermediate hosts (e.g., civets for SARS-CoV and pangolins for SARS-CoV-2) rather than a focus on the important work, which has always been surveillance of SARS-like viruses in bats.

To clarify the origins of coronavirus’ infections in humans, researchers from the Bioinformatics Research Center (BRC) at the University of North Carolina at Charlotte (UNC Charlotte) performed the largest and most comprehensive evolutionary analyses to date. The UNC Charlotte team analyzed over 2,000 genomes of diverse coronaviruses that infect humans or other animals.

“We wanted to conduct evolutionary analyses based on the most rigorous standards of the field,” said Denis Jacob Machado, the first author of the paper. “We’ve seen rushed analyses that had different problems. For example, many analyses had poor sampling of viral diversity or placed excessive emphasis on overall similarity rather than on the characteristics shared due to common evolutionary history. It was very important to us to avoid those mistakes to produce a sound evolutionary hypothesis that could offer reliable information for future research.”

The study’s major conclusions are:

  • Bats have been ancestral hosts of human coronaviruses in the case of SARS-CoV and SARS-CoV-2. Bats also were the ancestral hosts of MERS-CoV infections in dromedary camels that spread rapidly to humans.
  • Transmission of MERS-CoV among camels and their herders evolved after the transmission from bats to these hosts. Similarly, transmission of SARS-CoV among human vendors and their civets evolved after the bat-to-human transmission. These events parallel the transmission of SARS-CoV-2 from fur farmers to their minks. The evolutionary analysis in this study helps elucidate that these events occurred after the original human infection from bat-hosted coronavirus lineages. Therefore, these secondary transmissions to civets or minks did not play a role in the fundamental emergence of human coronaviruses.
  • The study corroborates the animal host origins of other human coronaviruses, such as HCoV-NL63 (from bat hosts), HCoV-229E (from camel hosts), HCoV-HKU1 (from rodent hosts) and HCoV-OC43 and HECV-4408 (from cow hosts).
  • Transmission of coronaviruses from animals to humans occurs episodically. From 1966 to 2020, the scientific community has described eight human-hosted lineages of coronaviruses. Although it is difficult to predict when a new human-hosted coronavirus could emerge, the data indicate that we should prepare for that possibility.

“As coronavirus transmission from animal to human host occurs episodically at unpredictable intervals, it is not wise to attempt to time when we will experience the next human coronavirus,” noted professor Daniel A. Janies, Carol Grotnes Belk Distinguished Professor of Bioinformatics and Genomics and team leader for the study. “We must conduct research on viruses that can be transferred from animals to humans on a continuous rather than reactionary basis.”

Featured image: This tree is a summary of the selected host transformations in the clade of Betacoronavirus associated with SARS-CoV, MERS-CoV, and SARS-CoV-2. Bats have been fundamental hosts of these human coronaviruses. The host transformations indicated by dotted lines are independent events that are not important to the origins of these human coronaviruses. © Denis Jacob Machado


Reference: Jacob Machado, D., Scott, R., Guirales, S. and Janies, D.A. (2021), Fundamental evolution of all Orthocoronavirinae including three deadly lineages descendent from Chiroptera‐hosted coronaviruses: SARS‐CoV, MERS‐CoV and SARS‐CoV‐2. Cladistics. https://doi.org/10.1111/cla.12454


Provided by University of North Carolina at Charlotte

Machine Learning Accelerates Cosmological Simulations (Astronomy)

Using neural networks, researchers can now simulate universes in a fraction of the time, advancing the future of physics research

A universe evolves over billions upon billions of years, but researchers have developed a way to create a complex simulated universe in less than a day. The technique, published in this week’s Proceedings of the National Academy of Sciences, brings together machine learning, high-performance computing and astrophysics and will help to usher in a new era of high-resolution cosmology simulations.

Cosmological simulations are an essential part of teasing out the many mysteries of the universe, including those of dark matter and dark energy. But until now, researchers faced the common conundrum of not being able to have it all — simulations could focus on a small area at high resolution, or they could encompass a large volume of the universe at low resolution.

Carnegie Mellon University Physics Professors Tiziana Di Matteo and Rupert Croft, Flatiron Institute Research Fellow Yin Li, Carnegie Mellon Ph.D. candidate Yueying Ni, University of California Riverside Professor of Physics and Astronomy Simeon Bird and University of California Berkeley’s Yu Feng surmounted this problem by teaching a machine learning algorithm based on neural networks to upgrade a simulation from low resolution to super resolution.

“Cosmological simulations need to cover a large volume for cosmological studies, while also requiring high resolution to resolve the small-scale galaxy formation physics, which would incur daunting computational challenges. Our technique can be used as a powerful and promising tool to match those two requirements simultaneously by modeling the small-scale galaxy formation physics in large cosmological volumes,” said Ni, who performed the training of the model, built the pipeline for testing and validation, analyzed the data and made the visualization from the data.

The trained code can take full-scale, low-resolution models and generate super-resolution simulations that contain up to 512 times as many particles. For a region in the universe roughly 500 million light-years across containing 134 million particles, existing methods would require 560 hours to churn out a high-resolution simulation using a single processing core. With the new approach, the researchers need only 36 minutes.
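The quoted figures imply a speedup of roughly three orders of magnitude, and the factor of 512 corresponds to an 8× refinement in each of the three spatial dimensions. A quick arithmetic check (not from the paper's code; note the timings are also on different hardware, single CPU core versus the new pipeline):

```python
# Speedup implied by the quoted timings.
conventional_minutes = 560 * 60   # 560 hours on a single processing core
new_minutes = 36                  # the new machine-learning approach
speedup = conventional_minutes / new_minutes
print(round(speedup))             # roughly 930x faster

# 512x more particles means 8x more per spatial dimension, since 8**3 = 512.
per_dimension = round(512 ** (1 / 3))
print(per_dimension)
```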

The results were even more dramatic when more particles were added to the simulation. For a universe 1,000 times as large with 134 billion particles, the researchers’ new method took 16 hours on a single graphics processing unit. Using current methods, a simulation of this size and resolution would take a dedicated supercomputer months to complete.

Reducing the time it takes to run cosmological simulations “holds the potential of providing major advances in numerical cosmology and astrophysics,” said Di Matteo. “Cosmological simulations follow the history and fate of the universe, all the way to the formation of all galaxies and their black holes.”

Scientists use cosmological simulations to predict how the universe would look in various scenarios, such as if the dark energy pulling the universe apart varied over time. Telescope observations then confirm whether the simulations’ predictions match reality.

“With our previous simulations, we showed that we could simulate the universe to discover new and interesting physics, but only at small or low-res scales,” said Croft. “By incorporating machine learning, the technology is able to catch up with our ideas.”

Di Matteo, Croft and Ni are part of Carnegie Mellon’s National Science Foundation (NSF) Planning Institute for Artificial Intelligence in Physics, which supported this work, and members of Carnegie Mellon’s McWilliams Center for Cosmology.

“The universe is the biggest data set there is — artificial intelligence is the key to understanding the universe and revealing new physics,” said Scott Dodelson, professor and head of the department of physics at Carnegie Mellon University and director of the NSF Planning Institute. “This research illustrates how the NSF Planning Institute for Artificial Intelligence will advance physics through artificial intelligence, machine learning, statistics and data science.”

“It’s clear that AI is having a big effect on many areas of science, including physics and astronomy,” said James Shank, a program director in NSF’s Division of Physics. “Our AI Planning Institute program is working to push AI to accelerate discovery. This new result is a good example of how AI is transforming cosmology.”

To create their new method, Ni and Li harnessed machine learning and high-performance computing to build a code that uses neural networks to predict how gravity moves dark matter around over time. The networks take training data, run calculations and compare the results to the expected outcome. With further training, the networks adapt and become more accurate.

The specific approach used by the researchers, called a generative adversarial network, pits two neural networks against each other. One network takes low-resolution simulations of the universe and uses them to generate high-resolution models. The other network tries to tell those simulations apart from ones made by conventional methods. Over time, both neural networks get better and better until, ultimately, the simulation generator wins out and creates fast simulations that look just like the slow conventional ones.
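The adversarial setup described above can be sketched in miniature. The toy below (plain numpy, 1-D data, hand-derived gradients; it is not the authors' model) shows the same structure: a one-parameter-pair "generator" learns to mimic samples drawn from a target distribution, while a "discriminator" is simultaneously trained to tell real samples from generated ones.

```python
import numpy as np

rng = np.random.default_rng(1)
REAL_MEAN = 3.0   # the "real" samples the generator must learn to imitate

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator G(z) = a*z + b and discriminator D(x) = sigmoid(w*x + c):
# the smallest possible pair of adversarial "networks".
a, b = 1.0, 0.0
w, c = 0.0, 0.0
lr, batch = 0.05, 64

for step in range(2000):
    real = rng.normal(REAL_MEAN, 1.0, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    grad_w = np.mean(-(1 - d_real) * real) + np.mean(d_fake * fake)
    grad_c = np.mean(-(1 - d_real)) + np.mean(d_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # Generator step (non-saturating loss): push D(fake) toward 1.
    d_fake = sigmoid(w * fake + c)
    grad_a = np.mean(-(1 - d_fake) * w * z)
    grad_b = np.mean(-(1 - d_fake) * w)
    a -= lr * grad_a
    b -= lr * grad_b

samples = a * rng.normal(0.0, 1.0, 1000) + b
print(np.mean(samples))   # drifts toward REAL_MEAN as G learns to fool D
```

In the paper's setting the generator upscales low-resolution simulations instead of shifting a scalar, and the discriminator compares them to conventionally computed high-resolution runs, but the alternating two-player training loop is the same.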

“We couldn’t get it to work for two years,” Li said, “and suddenly it started working. We got beautiful results that matched what we expected. We even did some blind tests ourselves, and most of us couldn’t tell which one was ‘real’ and which one was ‘fake.’”

Despite only being trained using small areas of space, the neural networks accurately replicated the large-scale structures that only appear in enormous simulations.

The simulations didn’t capture everything, though. Because they focused on dark matter and gravity, smaller-scale phenomena — such as star formation, supernovae and the effects of black holes — were left out. The researchers plan to extend their methods to include the forces responsible for such phenomena, and to run their neural networks ‘on the fly’ alongside conventional simulations to improve accuracy.

The research was powered by the Frontera supercomputer at the Texas Advanced Computing Center (TACC), the fastest academic supercomputer in the world. The team is one of the largest users of this massive computing resource, which is funded by the NSF Office of Advanced Cyberinfrastructure.

This research was funded by the NSF, the NSF AI Institute: Physics of the Future and NASA.

Featured image: The leftmost simulation ran at low resolution. Using machine learning, researchers upscaled the low-res model to create a high-resolution simulation (right). That simulation captures the same details as a conventional high-res model (middle) while requiring significantly fewer computational resources. Credit: Y. Li et al./Proceedings of the National Academy of Sciences 2021


Reference: Yin Li, Yueying Ni, Rupert A. C. Croft, Tiziana Di Matteo, Simeon Bird, Yu Feng, “AI-assisted superresolution cosmological simulations”, Proceedings of the National Academy of Sciences May 2021, 118 (19) e2022038118; DOI: https://doi.org/10.1073/pnas.2022038118


Provided by Carnegie Mellon University