Scientists discover a neuropeptide that reflects the current state of a fish’s social environment.
Have you recently wondered how social distancing and self-isolation may be affecting your brain? An international research team led by Erin Schuman from the Max Planck Institute for Brain Research discovered a brain molecule that functions as a “thermometer” for the presence of others in an animal’s environment. Zebrafish “feel” the presence of others via mechanosensation and water movements, which switch the brain hormone on.
Varying social conditions can cause long-lasting changes in animal behavior. Social isolation, for instance, can have devastating effects on humans and other animals, including zebrafish. The brain systems that sense the social environment, however, are not well understood. To probe whether neuronal genes respond to dramatic changes in the social environment, graduate student Lukas Anneser and colleagues raised zebrafish either alone or with their kin for different periods of time. The scientists used RNA sequencing to measure the expression levels of thousands of neuronal genes.
Tracking social density
“We found a consistent change in expression for a handful of genes in fish that were raised in social isolation. One of them was parathyroid hormone 2 (pth2), coding for a relatively unknown peptide in the brain. Curiously, pth2 expression tracked not just the presence of others, but also their density. Surprisingly, when zebrafish were isolated, pth2 disappeared in the brain, but its expression levels rapidly rose, like a thermometer reading, when other fish were added to the tank,” explains Anneser.
Thrilled by this discovery, the scientists tested if the effects of isolation could be reversed by putting the previously isolated fish into a social setting. “After just 30 minutes swimming with their kin, there was a significant recovery of the pth2 levels. After 12 hours with kin the pth2 levels were indistinguishable from those seen in socially-raised animals,” says Anneser. “This really strong and fast regulation was unexpected and indicated a very tight link between gene expression and the environment.”
So which sensory modality do the animals use to detect others and drive changes in gene expression? “It turned out that the sensory modality that controls pth2 expression was not vision, taste or smell, but rather mechanosensation – they actually ‘felt’ the physical movements of the swimming neighboring fish,” explains Schuman.
Sensing water movements
Fish perceive movement (“mechano-sense”) in their immediate vicinity via a sensory organ called the lateral line. To test the role of mechanosensation in driving pth2 expression, the team ablated the mechanosensitive cells within the fish’s lateral line. In previously isolated animals, the ablation of the lateral line cells prevented rescue of the neuro-hormone that was usually induced by the presence of other fish.
Just as we humans are sensitive to touch, zebrafish appear to be specifically tuned to the swimming motion of other fish. The scientists saw changes in pth2 levels caused by water movements that are triggered by conspecifics in the tank. “Zebrafish larvae swim in short bouts. We mimicked this water stimulation by programming a motor to create artificial fish movements. Intriguingly, in previously isolated fish the artificial movements rescued pth2 levels just like the real neighboring fish,” explains Anneser.
“Our data indicate a surprising role for a relatively unexplored neuropeptide, Pth2: it tracks and responds to the population density of an animal’s social environment. It is clear that the presence of others can have dramatic consequences for an animal’s access to resources and ultimate survival – it is thus likely that this neuro-hormone will regulate social brain and behavioral networks,” concludes Schuman.
References: Lukas Anneser, Ivan C. Alcantara, Anja Gemmer, Kristina Mirkes, Soojin Ryu, and Erin M. Schuman. “The Neuropeptide Pth2 Dynamically Senses Others via Mechanosensation.” Nature, December 2, 2020. https://www.nature.com/articles/s41586-020-2988-z
Diagnosing liver damage earlier could help to prevent liver failure in many patients.
About 25 percent of the U.S. population suffers from fatty liver disease, a condition that can lead to fibrosis of the liver and, eventually, liver failure.
Currently there is no easy way to diagnose either fatty liver disease or liver fibrosis. However, MIT engineers have now developed a diagnostic tool, based on nuclear magnetic resonance (NMR), that could be used to detect both of those conditions.
“Since it’s a noninvasive test, you could screen people even before they have obvious symptoms of compromised liver, and you would be able to say which of these patients had fibrosis,” says Michael Cima, the David H. Koch Professor of Engineering in MIT’s Department of Materials Science and Engineering, a member of MIT’s Koch Institute for Integrative Cancer Research, and the senior author of the study.
The device, which is small enough to fit on a table, uses NMR to measure how water diffuses through tissue, which can reveal how much fat is present in the tissue. This kind of diagnostic, which has thus far been tested on mice, could help doctors catch fatty liver disease before it progresses to fibrosis, the researchers say.
MIT PhD recipient Ashvin Bashyam and graduate student Chris Frangieh are the lead authors of the paper, which appears today in Nature Biomedical Engineering.
Fatty liver disease occurs when liver cells store too much fat. This leads to inflammation and eventually fibrosis, a buildup of scar tissue that can cause jaundice and liver cirrhosis, and eventually liver failure. Fibrosis is usually not diagnosed until the patient begins to experience symptoms that include not only jaundice but also fatigue and abdominal swelling. A biopsy is needed to confirm the diagnosis, but this is an invasive procedure and may not be accurate if the biopsy sample is taken from a part of the liver that is not fibrotic.
To create an easier way to check for this kind of liver disease, Cima and his colleagues had the idea of adapting a detector that they had previously developed to measure hydration levels before and after patients undergo dialysis. That detector measures fluid volume in patients’ skeletal muscle by using NMR to track changes in the magnetic properties of hydrogen atoms of water in the muscle tissue.
The researchers thought that a similar detector could be used for identifying liver disease because water diffuses more slowly when it encounters fatty tissue or fibrosis. Tracking how water moves through tissue over time can reveal how much fatty or scarred tissue is present.
“If you watch how the magnetization changes, you can model how fast the protons are moving,” Cima says. “Those cases where the magnetization doesn’t go away very fast would be ones where the diffusivity was low, and they would be the most fibrotic.”
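Cima’s description can be illustrated with a toy calculation. In diffusion NMR, signal decay is commonly modeled as S(b) = S0·exp(−b·D), where D is the apparent diffusion coefficient: slower decay means lower diffusivity, as in fatty or fibrotic tissue. The sketch below is illustrative only, not the MIT team’s actual analysis, and the D values are assumed round numbers:

```python
import numpy as np

def fit_adc(b_values, signals):
    """Estimate the apparent diffusion coefficient D from signal decay,
    using a log-linear least-squares fit: ln S = ln S0 - b * D."""
    slope, _ = np.polyfit(b_values, np.log(signals), 1)
    return -slope

# Simulated decay curves for two tissues (noise-free, for illustration)
b = np.linspace(0.0, 1000.0, 20)       # diffusion weighting, s/mm^2
healthy = np.exp(-b * 2.0e-3)          # fast diffusion, D ~ 2.0e-3 mm^2/s
fibrotic = np.exp(-b * 0.8e-3)         # restricted diffusion in scarred tissue

print(fit_adc(b, healthy))             # recovers ~2.0e-3
print(fit_adc(b, fibrotic))            # recovers ~0.8e-3: the "most fibrotic" case
```

The slowly decaying curve yields the lower fitted diffusivity, matching Cima’s point that persistent magnetization flags the most fibrotic tissue.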
In a study of mice, the researchers showed that their detector could identify fibrosis with 86 percent accuracy, and fatty liver disease with 92 percent accuracy. It takes about 10 minutes to obtain the results, but the researchers are now working on improving the signal-to-noise ratio of the detector, which could help to reduce the amount of time it takes.
The current version of the sensor can scan to a depth of about 6 millimeters below the skin, which is enough to monitor the mouse liver or human skeletal muscle. The researchers are now working on a new version that can penetrate deeper into the tissue, to allow them to test the liver diagnosis application in human patients.
If this type of NMR sensor could be developed for use in patients, it could help to identify people in danger of developing fibrosis, or in the early stages of fibrosis, so they could be treated earlier, Cima says. Fibrosis can’t be reversed, but it can be halted or slowed down through dietary changes and exercise. Having this type of diagnostic available could also aid in drug development efforts, because it could allow doctors to more easily identify patients with fibrosis and monitor their response to potential new treatments, Cima says.
Another potential application for this kind of sensor is to evaluate human livers for transplant. In this study, the researchers tested the monitor on human liver tissue and found that it could detect fibrosis with 93 percent accuracy.
The research was funded by the Koch Institute Support (core) Grant from the National Cancer Institute, the National Institutes of Health, a Fannie and John Hertz Foundation Graduate Fellowship, and a National Science Foundation Graduate Fellowship.
One Friday evening in 1992, a meteorite ended a more than 150-million-mile journey by smashing into the trunk of a red Chevrolet Malibu in Peekskill, New York. The car’s owner reported that the 30-pound remnant of the earliest days of our solar system was still warm and smelled of sulfur.
Nearly 30 years later, a new analysis of that same Peekskill meteorite and 17 others by researchers at The University of Texas at Austin and the University of Tennessee, Knoxville, has led to a new hypothesis about how asteroids formed during the early years of the solar system.
The meteorites studied in the research originated from asteroids and serve as natural samples of the space rocks. They indicate that the asteroids formed through violent bombardment and subsequent reassembly, a finding that runs counter to the prevailing idea that the young solar system was a peaceful place.
The research began when co-author Nick Dygert was a postdoctoral fellow at UT’s Jackson School of Geosciences studying terrestrial rocks using a method that could measure the cooling rates of rocks from very high temperatures, up to 1,400 degrees Celsius.
Dygert, now an assistant professor at the University of Tennessee, realized that this method — called a rare earth element (REE)-in-two-pyroxene thermometer — could work for space rocks, too.
“This is a really powerful new technique for using geochemistry to understand geophysical processes, and no one had used it to measure meteorites yet,” Dygert said.
Since the 1970s, scientists have been measuring minerals in meteorites to figure out how they formed. The work suggested that meteorites cooled very slowly from the outside inward in layers. This “onion shell model” is consistent with a relatively peaceful young solar system where chunks of rock orbited unhindered. But those studies were only capable of measuring cooling rates from temperatures near about 500 degrees Celsius.
When Dygert and Michael Lucas, a postdoctoral scholar at the University of Tennessee who led the work, applied the REE-in-two-pyroxene method, with its much higher sensitivity to peak temperature, they found unexpected results. From around 900 degrees Celsius down to 500 degrees Celsius, cooling rates were 1,000 to 1 million times faster than at lower temperatures.
How could these two very different cooling rates be reconciled?
The scientists proposed that asteroids formed in stages. If the early solar system was, much like the old Atari game “Asteroids,” rife with bombardment, large rocks would have been smashed to bits. Those smaller pieces would have cooled quickly. Afterward, when the small pieces reassembled into larger asteroids we see today, cooling rates would have slowed.
To test this rubble pile hypothesis, Jackson School Professor Marc Hesse and first-year doctoral student Jialong Ren built a computational model of a two-stage thermal history of rubble pile asteroids for the first time.
Because of the vast number of pieces in a rubble pile (10¹⁵, or a thousand trillion) and the vast array of their sizes, Ren had to develop new techniques to account for changes in mass and temperature before and after bombardment.
“This was an intellectually significant contribution,” Hesse said.
The resulting model supports the rubble pile hypothesis and provides other insights as well. One implication is that cooling slowed so much after reassembly not because the rock gave off heat in layers. Rather, it was that the rubble pile contained pores.
“The porosity reduces how fast you can conduct heat,” Hesse said. “You actually cool slower than you would have if you hadn’t fragmented because all of the rubble makes kind of a nice blanket. And that’s sort of unintuitive.”
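Hesse’s “blanket” intuition can be sanity-checked with a back-of-the-envelope estimate. The conductive cooling timescale of a body scales roughly as tau ≈ R²ρc/k, so cutting the effective thermal conductivity k, as pore space between rubble fragments does, lengthens cooling in proportion. All numbers below are illustrative assumptions, not values from the paper:

```python
def cooling_timescale_yr(radius_m, k_eff, rho=3300.0, heat_cap=800.0):
    """Conductive cooling timescale tau = R^2 * rho * c / k, in years.
    rho (kg/m^3) and heat_cap (J/kg/K) are typical rocky-body values."""
    tau_s = radius_m**2 * rho * heat_cap / k_eff
    return tau_s / (365.25 * 24 * 3600)

R = 50e3          # a 50 km reassembled asteroid (assumed size)
k_solid = 3.0     # W/(m K), intact rock (assumed)
k_rubble = 0.3    # effective conductivity cut ~10x by porosity (assumed)

print(cooling_timescale_yr(R, k_solid))
print(cooling_timescale_yr(R, k_rubble))   # ~10x longer: the porous "blanket"
```

A tenfold drop in conductivity gives a tenfold longer cooling time, which is why a reassembled rubble pile can cool more slowly than the intact parent body did.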
Tim Swindle of the Lunar and Planetary Laboratory at the University of Arizona, who studies meteorites but was not involved in the research, said that this work is a major step forward.
“This seems like a more complete model, and they’ve added data to part of the question that people haven’t been talking about, but should have been. The jury is still out, but this is a strong argument.”
The biggest implication of the new rubble pile hypothesis, Dygert said, is that these collisions characterized the early days of the solar system.
“They were violent, and they started early on,” he said.
The research was supported by NASA. The Smithsonian National Museum of Natural History supplied samples of meteorites for the study.
A specialist in spacecraft movement control analyzed the process of placing vehicle stages, boosters, and other space debris into the so-called disposal orbit and suggested cleaning up lower orbits with a spacecraft carrying modules with engine units on board. These modules would attach to space debris objects and move them away. As for the geostationary orbit, a preferable way to clean it up would be a towing spacecraft that transports space debris objects into the disposal orbit. The research was carried out in collaboration with a team from Bauman Moscow State Technical University, and its results were published in the journal Advances in Space Research.
Besides satellites and the International Space Station, thousands of out-of-service spacecraft, boosters, and other space debris objects move along different orbits around the Earth. Sometimes they collide and break up: for example, over 1,000 new observable fragments appeared in 2018 when eight objects fell to pieces in near-Earth space. The more debris is left in space, the higher the risk that it will damage satellites, leaving us without communication and surveillance systems. Prof. Andrei Baranov from RUDN University, together with his colleagues from Bauman Moscow State Technical University Dmitry Grishko and Grigory Shcheglov, studied the parameters of space debris in different orbits and came up with the most feasible ways of cleaning it up.
A total of 160 spent vehicle stages (1.1 to 9 tons each) are situated in low near-Earth orbits, i.e. at altitudes from 600 to 2,000 km. As for the geostationary orbit at an altitude of 35,786 km, the most potentially dangerous objects there are 87 boosters, each weighing 3.2 to 3.4 tons. The size, weight, and parameters of these objects are quite different; therefore, they require different equipment to collect them and move them to the so-called disposal orbit, where the debris is safe to store.
A spacecraft-collector suggested by the team to clean up the near-Earth orbits is 11.5 m long, 3 m in diameter, and weighs just over 4 tons. Such a collector can carry 8 to 12 modules with engine units on board. Moving a light vehicle stage will require 50 to 70 kg of fuel, while transporting a Zenit-2 stage that weighs 9 tons will take around 350 kg. The total weight of a spacecraft-collector at launch is expected to be 8 to 12 tons. Modern-day boosters can easily place a weight like this into any orbit up to 1,000 km high. After a collector runs out of modules, it will attach itself to the last booster stage, move into the upper atmosphere with it, and burn up.
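Fuel budgets of this kind follow from the Tsiolkovsky rocket equation, Δv = Isp·g0·ln(m0/mf). The sketch below is a rough plausibility check, not the authors’ calculation; the specific impulse and deorbit Δv are assumptions chosen for illustration:

```python
import math

def propellant_mass(final_mass_kg, delta_v, isp_s=300.0, g0=9.81):
    """Propellant needed to impart delta_v (m/s) to a stack whose final
    (post-burn) mass is final_mass_kg, via the Tsiolkovsky rocket equation."""
    mass_ratio = math.exp(delta_v / (isp_s * g0))
    return final_mass_kg * (mass_ratio - 1.0)

# A 9-ton Zenit-2 stage with an assumed ~120 m/s total deorbit delta-v
print(propellant_mass(9000.0, 120.0))   # a few hundred kg of propellant
```

With these assumed values the estimate lands in the same few-hundred-kilogram range as the figures quoted for the heaviest stages.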
As for the geostationary orbit, to clean it up the team suggested a spacecraft that is about 3.4 m long, 2.1 m wide, and weighs around 2 tons. According to their calculations, if loaded with modules, such a device would not be extremely efficient, and it would take 3-4 times more collectors to clean the orbit up. Therefore, in this case, the spacecraft-collector should work as a tow for space debris objects. Preliminary calculations suggest that it could operate for up to 15 years and transfer 40 to 45 space debris objects into the disposal orbit.
“Designing a spacecraft-collector for lower orbits is a more complicated task than creating one for the geostationary orbit. Best-case scenario, one spacecraft would be able to move only 8 to 12 objects from lower orbits, while in the geostationary orbit it could transport 40 to 45. Therefore, cleaning up lower orbits is much more difficult. This factor should be taken into consideration by businesses and space agencies that plan to launch groups of hundreds or thousands of satellites in this area of near-Earth space,” explained Prof. Andrei Baranov, a PhD in Physics and Mathematics from the Department of Mechanics and Mechatronics, RUDN University.
An international study has found a link between the brain’s network connections and grey matter atrophy caused by certain types of epilepsy, a major step forward in our understanding of the disease.
In neuroscience, it is becoming increasingly clear that the brain’s connectome is as important as its anatomy when studying human disease. The connectome is a map of neural connections that describes how brain regions interact and work together to perform certain tasks. While connectome research in epilepsy has moved forward in recent years, there is still a lot we do not know about its role in the disorder.
The study, led by researchers from The Neuro (Montreal Neurological Institute-Hospital), analyzed data from 1,021 individuals with epilepsy and 1,564 healthy controls across 19 sites around the world from the ENIGMA database, a collection of neuroimaging data available to researchers under Open Science principles. They used this data to map grey matter atrophy, a characteristic of epilepsy, in the patients.
They then collected data from another database called the Human Connectome Project, which provides connectome data from a large group of healthy controls. Their hypothesis was that grey matter atrophy would appear most often in parts of the brain where connectivity was highest, known as hubs.
“Hub regions are known to participate in brain signalling, have high plasticity, and high metabolic activity, making them a candidate for epilepsy-related atrophy,” says Sara Larivière, the study’s lead author and a PhD candidate at The Neuro.
The team found that areas of high atrophy in patients with both idiopathic generalized epilepsy and temporal lobe epilepsy also tended to be hub regions. Using further analyses, they were able to show their model could predict the damage the epilepsy did to the grey matter of individual patients over time.
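The hub-atrophy analysis can be sketched schematically: summarize each region’s “hubness” as its weighted degree in the connectome, then correlate that map with regional grey matter atrophy. Everything below is toy data for illustration, not the ENIGMA pipeline:

```python
import numpy as np

def degree_centrality(adjacency):
    """Weighted degree (hubness) of each region: row sums of the
    symmetric connectivity matrix."""
    return adjacency.sum(axis=1)

def pearson_r(x, y):
    """Pearson correlation between two regional maps."""
    xz = (x - x.mean()) / x.std()
    yz = (y - y.mean()) / y.std()
    return float(np.mean(xz * yz))

rng = np.random.default_rng(0)
n_regions = 50
w = rng.random((n_regions, n_regions))
w = (w + w.T) / 2.0                    # make connectivity symmetric
np.fill_diagonal(w, 0.0)

hubness = degree_centrality(w)
# Simulate atrophy that preferentially hits hub regions, plus noise
atrophy = 0.5 * hubness + rng.normal(0.0, 0.1, n_regions)

print(pearson_r(hubness, atrophy))     # strongly positive, as hypothesized
```

A strong positive correlation between hubness and atrophy is the signature the study reports for idiopathic generalized and temporal lobe epilepsy.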
“Our multi-site findings show that brain connectivity contributes to the effect that epilepsy has on whole brain structure,” says Boris Bernhardt, a researcher at The Neuro and the study’s senior author. “This will be important to understand common functional deficits in individual patients and to assess the effect of the disease over time.”
References: Sara Larivière, Raúl Rodríguez-Cruces, Jessica Royer, Maria Eugenia Caligiuri, Antonio Gambardella, Luis Concha, Simon S. Keller, Fernando Cendes, Clarissa Yasuda, Leonardo Bonilha, Ezequiel Gleichgerrcht, Niels K. Focke, Martin Domin, Felix von Podewills, Soenke Langner, Christian Rummel, Roland Wiest, Pascal Martin, Raviteja Kotikalapudi, Terence J. O’Brien, Benjamin Sinclair, Lucy Vivash, Patricia M. Desmond, Saud Alhusaini, Colin P. Doherty, Gianpiero L. Cavalleri, Norman Delanty, Reetta Kälviäinen, Graeme D. Jackson, Magdalena Kowalczyk, Mario Mascalchi, Mira Semmelroch, Rhys H. Thomas, Hamid Soltanian-Zadeh, Esmaeil Davoodi-Bojd, Junsong Zhang, Matteo Lenge, Renzo Guerrini, Emanuele Bartolini, Khalid Hamandi, Sonya Foley, Bernd Weber, Chantal Depondt, Julie Absil, Sarah J. A. Carr, Eugenio Abela, Mark P. Richardson, Orrin Devinsky, Mariasavina Severino, Pasquale Striano, Domenico Tortora, Sean N. Hatton, Sjoerd B. Vos, John S. Duncan, Christopher D. Whelan, Paul M. Thompson, Sanjay M. Sisodiya, Andrea Bernasconi, Angelo Labate, Carrie R. McDonald, Neda Bernasconi, Boris C. Bernhardt, “Network-based atrophy modeling in the common epilepsies: A worldwide ENIGMA study”, Science Advances 18 Nov 2020: Vol. 6, no. 47, eabc6457 DOI: 10.1126/sciadv.abc6457 https://advances.sciencemag.org/content/6/47/eabc6457
An automated AI measurement of visceral fat area on abdominal CT images predicts future heart attack or stroke risk better than overall weight or BMI.
The researchers studied 12,128 patients over 5 years.
Visceral fat area was independently associated with future heart attack and stroke. BMI was not associated with heart attack or stroke.
Automated deep learning analysis of abdominal CT images produces a more precise measurement of body composition and predicts major cardiovascular events, such as heart attack and stroke, better than overall weight or body mass index (BMI), according to a study presented today at the annual meeting of the Radiological Society of North America (RSNA).
“Established cardiovascular risk models rely on factors like weight and BMI that are crude surrogates of body composition,” said Kirti Magudia, M.D., Ph.D., an abdominal imaging and ultrasound fellow at the University of California San Francisco. “It’s well established that people with the same BMI can have markedly different proportions of muscle and fat. These differences are important for a variety of health outcomes.”
Unlike BMI, which is based on height and weight, a single axial CT slice of the abdomen visualizes the subcutaneous fat, visceral fat and skeletal muscle areas. However, manually measuring these individual areas is time intensive and costly.
As a radiology resident at Brigham and Women’s Hospital in Boston, Dr. Magudia was part of a multidisciplinary team of researchers, including radiologists, a data scientist and a biostatistician, who developed a fully automated method using deep learning, a type of artificial intelligence (AI), to determine body composition metrics from abdominal CT images.
“Abdominal CT scans that are routinely performed provide a more granular way of looking at body composition, but we’re not currently taking advantage of it,” Dr. Magudia said.
The study cohort was derived from the 33,182 abdominal CT outpatient exams performed on 23,136 patients at Partners Healthcare in Boston in 2012. The researchers identified 12,128 patients who were free of major cardiovascular and cancer diagnoses at the time of imaging. Mean age of the patients was 52 years, and 57% of patients were women.
The researchers selected the L3 CT slice (from the third lumbar spine vertebra) and calculated body composition areas for each patient. Patients were then divided into four quartiles based on the normalized values of subcutaneous fat area, visceral fat area and skeletal muscle area.
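The quartile split described above is straightforward to reproduce. The sketch below uses made-up values, not study data, to bin a body composition metric at its 25th, 50th and 75th percentiles:

```python
import numpy as np

def assign_quartiles(values):
    """Label each value 1-4 by which quartile of the cohort it falls in,
    splitting at the 25th, 50th and 75th percentiles."""
    cuts = np.percentile(values, [25, 50, 75])
    return np.searchsorted(cuts, values, side="right") + 1

# Hypothetical normalized visceral fat areas (cm^2) for eight patients
visceral_fat_area = np.array([80.0, 120.0, 150.0, 200.0, 95.0, 170.0, 60.0, 210.0])
print(assign_quartiles(visceral_fat_area))   # [1 2 3 4 2 3 1 4]
```

Each patient gets a label per metric (subcutaneous fat, visceral fat, skeletal muscle), and outcomes can then be compared across the four groups.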
In this retrospective study, the researchers determined which of these 12,128 patients had a myocardial infarction (heart attack) or stroke within 5 years after their index abdominal CT scan. They found that 1,560 myocardial infarctions and 938 strokes occurred in this study group.
Statistical analysis demonstrated that visceral fat area was independently associated with future heart attack and stroke. BMI was not associated with heart attack or stroke.
“The group of patients with the highest proportion of visceral fat area were more likely to have a heart attack, even when adjusted for known cardiovascular risk factors,” said Dr. Magudia. “The group of patients with the lowest amount of visceral fat area were protected against stroke in the years following the abdominal CT exam.”
“These results demonstrate that precise measures of body muscle and fat compartments achieved through CT outperform traditional biomarkers for predicting risk for cardiovascular outcomes,” she added.
According to Dr. Magudia, this work demonstrates that fully automated and normalized body composition analysis could now be applied to large-scale research projects.
“This work shows the promise of AI systems to add value to clinical care by extracting new information from existing imaging data,” Dr. Magudia said. “The deployment of AI systems would allow radiologists, cardiologists and primary care doctors to provide better care to patients at minimal incremental cost to the health care system.”
This paper is the recipient of an RSNA 2020 Trainee Research Prize.
Co-authors are Christopher P. Bridge, D.Phil., Camden P. Bay, Ph.D., Florian J. Fintelmann, M.D., Ana Babic, Ph.D., Katherine P. Andriole, Ph.D., Brian M. Wolpin, M.D., and Michael H. Rosenthal, M.D., Ph.D.
Harvard Medical School scientists have successfully restored vision in mice by turning back the clock on aged eye cells in the retina to recapture youthful gene function.
The team’s work, described Dec. 2 in Nature, represents the first demonstration that it may be possible to safely reprogram complex tissues, such as the nerve cells of the eye, to an earlier age.
In addition to resetting the cells’ aging clock, the researchers successfully reversed vision loss in animals with a condition mimicking human glaucoma, a leading cause of blindness around the world.
The achievement represents the first successful attempt to reverse glaucoma-induced vision loss, rather than merely stem its progression, the team said. If replicated through further studies, the approach could pave the way for therapies to promote tissue repair across various organs and reverse aging and age-related diseases in humans.
“Our study demonstrates that it’s possible to safely reverse the age of complex tissues such as the retina and restore its youthful biological function,” said senior author David Sinclair, professor of genetics in the Blavatnik Institute at Harvard Medical School, co-director of the Paul F. Glenn Center for Biology of Aging Research at HMS and an expert on aging.
Sinclair and colleagues caution that the findings remain to be replicated in further studies, including in different animal models, before any human experiments. Nonetheless, they add, the results offer a proof of concept and a pathway to designing treatments for a range of age-related human diseases.
“If affirmed through further studies, these findings could be transformative for the care of age-related vision diseases like glaucoma and to the fields of biology and medical therapeutics for disease at large,” Sinclair said.
For their work, the team used an adeno-associated virus (AAV) as a vehicle to deliver into the retinas of mice three youth-restoring genes—Oct4, Sox2 and Klf4—that are normally switched on during embryonic development. The three genes, together with a fourth one, which was not used in this work, are collectively known as Yamanaka factors.
The treatment had multiple beneficial effects on the eye. First, it promoted nerve regeneration following optic-nerve injury in mice with damaged optic nerves. Second, it reversed vision loss in animals with a condition mimicking human glaucoma. And third, it reversed vision loss in aging animals without glaucoma.
The team’s approach is based on a new theory about why we age. Most cells in the body contain the same DNA molecules but have widely diverse functions. To achieve this degree of specialization, these cells must read only genes specific to their type. This regulatory function is the purview of the epigenome, a system of turning genes on and off in specific patterns without altering the basic underlying DNA sequence of the gene.
This theory postulates that changes to the epigenome over time cause cells to read the wrong genes and malfunction—giving rise to diseases of aging. One of the most important changes to the epigenome is DNA methylation, a process by which methyl groups are tacked onto DNA. Patterns of DNA methylation are laid down during embryonic development to produce the various cell types. Over time, youthful patterns of DNA methylation are lost, and genes inside cells that should be switched on get turned off and vice versa, resulting in impaired cellular function. Some of these DNA methylation changes are predictable and have been used to determine the biologic age of a cell or tissue.
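The “biologic age” determination mentioned above is typically done with an epigenetic clock: a weighted linear model over methylation fractions at selected CpG sites. The sites, weights and profiles below are invented for illustration and are not any published clock’s coefficients:

```python
import numpy as np

def predict_biological_age(methylation, weights, intercept):
    """Epigenetic-clock-style estimate: age = intercept + weights . methylation,
    where methylation holds the fraction methylated (0-1) at each CpG site."""
    return intercept + float(np.dot(weights, methylation))

weights = np.array([55.0, -30.0, 25.0])        # illustrative coefficients
intercept = 20.0

young_profile = np.array([0.10, 0.75, 0.15])   # hypothetical young-tissue pattern
aged_profile = np.array([0.55, 0.30, 0.60])    # the same sites, drifted with age

print(predict_biological_age(young_profile, weights, intercept))
print(predict_biological_age(aged_profile, weights, intercept))   # higher
```

Because age-related methylation drift is directional and reproducible, such a weighted sum tracks biological age, which is what makes partial epigenetic reprogramming measurable.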
Yet, whether DNA methylation drives age-related changes inside cells has remained unclear. In the current study, the researchers hypothesized that if DNA methylation does, indeed, control aging, then erasing some of its footprints might reverse the age of cells inside living organisms and restore them to their earlier, more youthful state.
Past work had achieved this feat in cells grown in laboratory dishes but fell short of demonstrating the effect in living organisms.
The new findings demonstrate that the approach could be used in animals as well.
Overcoming an important hurdle
Lead study author Yuancheng Lu, a research fellow in genetics at HMS and a former doctoral student in Sinclair’s lab, developed a gene therapy that could safely reverse the age of cells in a living animal.
Lu’s work builds on the Nobel Prize-winning discovery of Shinya Yamanaka, who identified four transcription factors (Oct4, Sox2, Klf4 and c-Myc) that can erase epigenetic markers on cells and return them to a primitive embryonic state, from which they can develop into any other type of cell.
Subsequent studies, however, showed two important setbacks. First, when used in adult mice, the four Yamanaka factors could also induce tumor growth, rendering the approach unsafe. Second, the factors could reset the cellular state to the most primitive cell state, thus completely erasing a cell’s identity.
Lu and colleagues circumvented these hurdles by slightly modifying the approach. They dropped the gene c-Myc and delivered only the remaining three Yamanaka genes, Oct4, Sox2 and Klf4. The modified approach successfully reversed cellular aging without fueling tumor growth or erasing cellular identity.
Gene therapy applied to optic nerve regeneration
In the current study, the researchers targeted cells in the central nervous system because it is the first part of the body affected by aging. After birth, the ability of the central nervous system to regenerate declines rapidly.
To test whether the regenerative capacity of young animals could be imparted to adult mice, the researchers delivered the modified three-gene combination via an AAV into retinal ganglion cells of adult mice with optic nerve injury.
For the work, Lu and Sinclair partnered with Zhigang He, HMS professor of neurology and of ophthalmology at Boston Children’s Hospital, who studies optic nerve and spinal cord neuro-regeneration.
The treatment resulted in a two-fold increase in the number of surviving retinal ganglion cells after the injury and a five-fold increase in nerve regrowth.
“At the beginning of this project, many of our colleagues said our approach would fail or would be too dangerous to ever be used,” said Lu. “Our results suggest this method is safe and could potentially revolutionize the treatment of the eye and many other organs affected by aging.”
Reversal of glaucoma and age-related vision loss
Following the encouraging findings in mice with optic nerve injuries, the team partnered with colleagues at the Schepens Eye Research Institute of Massachusetts Eye and Ear: Bruce Ksander, HMS associate professor of ophthalmology, and Meredith Gregory-Ksander, HMS assistant professor of ophthalmology. They planned two sets of experiments: one to test whether the three-gene cocktail could restore vision lost to glaucoma, and another to see whether the approach could reverse vision loss stemming from normal aging.
In a mouse model of glaucoma, the treatment led to increased nerve cell electrical activity and a notable increase in visual acuity, as measured by the animals’ ability to see moving vertical lines on a screen. Remarkably, it did so after the glaucoma-induced vision loss had already occurred.
“Regaining visual function after the injury occurred has rarely been demonstrated by scientists,” Ksander said. “This new approach, which successfully reverses multiple causes of vision loss in mice without the need for a retinal transplant, represents a new treatment modality in regenerative medicine.”
The treatment worked similarly well in elderly, 12-month-old mice whose vision was diminishing due to normal aging. Following treatment of the elderly mice, the gene expression patterns and electrical signals of the optic nerve cells were similar to those of young mice, and vision was restored. When the researchers analyzed molecular changes in treated cells, they found reversed patterns of DNA methylation – an observation suggesting that DNA methylation is not a mere marker or bystander in the aging process, but rather an active agent driving it.
“What this tells us is the clock doesn’t just represent time—it is time,” said Sinclair. “If you wind the hands of the clock back, time also goes backward.”
The researchers said that if their findings are confirmed in further animal work, they could initiate clinical trials within two years to test the efficacy of the approach in people with glaucoma. Thus far, the findings are encouraging, researchers said. In the current study, a one-year, whole-body treatment of mice with the three-gene approach showed no negative side effects.
Scientists at the Centro Nacional de Investigaciones Cardiovasculares (CNIC) have identified a mitochondrial protein as a potential marker for the diagnosis of cardiovascular disease (CVD) and as a possible target for future treatments. The study is published today in the journal Nature.
Cardiovascular disease is the leading cause of death worldwide, with most CVD deaths caused by a heart attack or stroke. The leading underlying cause of the blood clots that trigger these events is atherosclerosis, a chronic inflammatory disease that produces plaques in blood vessel walls composed of cell debris, fats, and fibrous material. Atherosclerosis manifests clinically as thrombosis (a blood clot inside a blood vessel), the principal cause of acute myocardial infarction and stroke.
Atherosclerosis develops for many years without causing symptoms, so there is a pressing need for new tools for diagnosis and therapy. Study leader Dr. Almudena Ramiro of the CNIC explained: “We know that atherosclerosis includes an immunological component and that the innate and adaptive immune systems are both involved in the origin and progression of this disease.” However, little is known about the specific response of B cells in these processes or the repertoire of antibodies these cells produce during atherosclerosis.
The new study published in Nature now shows that the mitochondrial protein ALDH4A1 is an autoantigen involved in atherosclerosis. Autoantigens are molecules produced by the body that, through a variety of mechanisms, are recognized as foreign and trigger an immune response. “ALDH4A1 is recognized by the protective antibodies produced during atherosclerosis, making it a possible therapeutic target or diagnostic marker for this disease,” Ramiro said.
The study characterized the antibody response associated with atherosclerosis in mice lacking the low-density lipoprotein receptor (LDLR-/-) and fed a high-fat diet. During the study, the CNIC team collaborated with researchers at the German Cancer Research Center (DKFZ), the Spanish Cardiovascular Biomedical Research Network (CIBERCV), the Fundación Jiménez Díaz Institute for Medical Research, and the Universidad Autónoma de Madrid.
Describing the study, first author Cristina Lorenzo said, “We found that atherosclerosis is associated with the generation of specific antibodies in the germinal centers, where B cells diversify their antibodies and differentiate into high-affinity memory B cells and plasma cells.”
To study the repertoire of antibodies produced during atherosclerosis, the research team performed a high-throughput analysis based on isolating individual B cells and sequencing their antibody genes. “Analysis of the sequences of more than 1700 antibody genes showed that mice with atherosclerosis produced a distinct antibody repertoire. The production of these antibodies allowed us to study their targets (their antigen specificity) and their functional properties,” said Hedda Wardemann, of the DKFZ in Heidelberg.
Among the atherosclerosis-associated antibodies, the research team found that the antibody A12 was able to recognize plaques not only in the atherosclerosis-prone mice, but also in samples from patients with atherosclerosis in the carotid arteries. “Proteomics analysis showed that A12 specifically recognized a mitochondrial protein called aldehyde dehydrogenase 4 family, member A1 (ALDH4A1), identifying this protein as an autoantigen in the context of atherosclerosis,” said Lorenzo.
Ramiro adds, “The study shows that ALDH4A1 accumulates in plaques and that its plasma concentration is elevated in the atherosclerosis-prone mice and in human patients with carotid atherosclerosis, establishing ALDH4A1 as a possible biomarker of the disease.”
The team also found that infusion of A12 antibodies into the atherosclerosis-prone mice delayed plaque formation and reduced the circulating levels of free cholesterol and LDL, suggesting that anti-ALDH4A1 antibodies have therapeutic potential in the protection against atherosclerosis. “These results,” explained Ramiro, “broaden our knowledge of the humoral response during atherosclerosis and highlight the potential of ALDH4A1 as a new biomarker and of A12 as a therapeutic agent for this disease.”
The scientists conclude that their study opens the path to new diagnostic and therapeutic interventions in cardiovascular disease.
This fundamental finding could help improve the nutritional value of plant seeds and even reduce allergenicity of legume seeds in the future.
Thanks to this work, as the journal Trends in Plant Science recently noted, it is now known that functional amyloids occur in representatives of almost all major groups of living organisms: bacteria, archaea, animals, fungi, and plants.
The packaging of more than half of all snacks and sweets, even those that contain no nuts, carries the warning ‘May contain traces of peanuts.’ Some people are so allergic to this product that the smallest particles of peanut, or even its dust, cause an unpleasant and sometimes dangerous reaction: from a simple rash to severe swelling. Peanut seeds contain many proteins, some of which can cause allergies. One of the most potent allergens is vicilin, which is found in various legumes, including peanuts and peas.
The study was conducted by a team of researchers from St Petersburg University; the All-Russian Research Institute for Agricultural Microbiology; the Institute of Cytology of the Russian Academy of Sciences; the Institute of Theoretical and Experimental Biophysics of the Russian Academy of Sciences; Kazan Federal University; and the University of Burgundy (France). The findings are published in the journal PLOS Biology. The scientists demonstrated experimentally for the first time – having previously predicted it using bioinformatic algorithms – that the seeds of the garden pea contain amyloid-like aggregates of storage proteins: amyloid fibrils. Such fibrils had previously been found in bacteria, archaea, animals and fungi, but this is the first time they have been detected in plants. Interestingly, most of the amyloids in pea seeds are formed by the aforementioned protein vicilin.
‘Vicilin is one of the most important food allergens found in legumes. The mechanism of its allergenicity can potentially be associated with the amyloid properties of this protein that we have discovered. We have shown that storage proteins, which are the main reservoir of nutrients for the embryo, accumulate in seeds as amyloids. In the future, studying these mechanisms could help create less allergenic varieties of peas, peanuts, and other legumes,’ said Anton Nizhnikov, the corresponding author of the research, Associate Professor at the Department of Genetics and Biotechnology at St Petersburg University and Laboratory Head at the All-Russian Research Institute for Agricultural Microbiology.
‘Interestingly, according to our bioinformatic data, the seed storage proteins not only of peas but of a number of non-legume plants turned out to be rich in sites prone to forming amyloids, that is, compact and stable fibrillar aggregates. This explains the ability of seeds to survive various unfavourable conditions and germinate after many years,’ noted Kirill Antonets, the first author of the research, Associate Professor at the Department of Cytology and Histology at St Petersburg University and Senior Research Associate at the All-Russian Research Institute for Agricultural Microbiology.
Another potential application of this work is the creation of crops with highly nutritious seeds. The in vitro experiments performed by the scientists showed that mammals cannot completely digest plant amyloids: digestive enzymes fail to break them down. As Anton Nizhnikov explains, amyloids therefore significantly impair the nutritional value of seeds, so it is important to understand how amyloid formation in plant seeds can be reduced in order to obtain varieties with a larger proportion of ordinary, digestible proteins. Such crops could be particularly useful and nutritious for humans.
‘Today we are also studying the amyloids of root nodule bacteria. These are the microorganisms that can enter into symbiosis with legumes and bind atmospheric nitrogen so that plants can receive more nutrients,’ said Anton Nizhnikov. ‘There is an assumption that amyloids can also play an important role in a mutually beneficial symbiotic process. At least root nodule bacteria, as we have shown, also have amyloids. We hope that our findings will be of benefit to the development of plant biology and microbiology, as well as for agriculture.’
For reference: amyloids – a special type of fibrillar protein aggregate – first became known for their association with a number of diseases caused by abnormal protein aggregation, known as amyloidoses. In these severe diseases, soluble monomeric proteins are converted into polymeric fibrillar deposits that form amyloid ‘plaques’ in various tissues and organs. In total, more than 40 human diseases are associated with amyloids, and they are very difficult to treat or completely incurable.
However, as this research also confirms, in recent decades, scientists around the world have been finding more and more evidence that amyloids function in healthy organisms. This form of protein makes it possible to ‘conserve’ and stabilise various substances. Moreover, it acts as a kind of structural ‘template’. This happens not only in plants. For example, in humans and animals, some of the hormones are stored precisely in the form of amyloids, while other functional amyloids are involved in melanin biosynthesis and the formation of long-term memory.