Scientists probing a prehistoric crocodile group’s shadowy past have discovered a timeless truth – pore over anyone’s family tree long enough, and something surprising will emerge.
Despite 300 years of research, and a recent renaissance in the study of their biological make-up, the mysterious, marauding teleosauroids have remained enduringly elusive.
Scientific understanding of this distant cousin of present-day long-snouted gharials has been hampered by a poor grasp of their evolutionary journey – until now.
Researchers from the University of Edinburgh have identified one previously unknown species of teleosauroid and seven of its close relatives – part of a group that dominated Jurassic coastlines 190 to 120 million years ago.
Their analysis offers tantalising glimpses of how teleosauroids adapted to the momentous changes of the Jurassic period, as the temperature of the Earth’s seas shifted repeatedly.
“Our study just scratches the surface of teleosauroid evolution,” says study lead Dr Michela M. Johnson, of the University’s School of GeoSciences. “But the findings are remarkable, raising interesting questions about their behaviour and adaptability.
“These creatures represented some of the most successful prehistoric crocodylomorphs during the Jurassic period and there is so much more to learn about them.”
The study reveals that not all teleosauroids were engaged in cut and thrust lifestyles, snapping at other reptiles and fish from the seas and swamps near the coast.
Instead, they were a complex, diverse group that were able to exploit different habitats and seek out a variety of food sources. Their physical make-up is also more diverse than was previously understood, the scientists say.
Previous research had provided insights into the origins and evolution of the metriorhynchids, the whale-like relatives of these fossilised crocs, but less was known about teleosauroids.
To address this, the expert team of palaeontologists examined more than 500 fossils from more than 25 institutions around the world.
Cutting-edge computer software enabled the team to glean swathes of revealing data about anatomical similarities and differences by examining the entire skeleton, teeth and bony armour, which indicated whether species were closely related or not.
This information enabled the team to create an up-to-date family tree of the teleosauroids group from which emerged two new large groups, whose anatomy, abundance, habitat, geography and feeding styles differ from one another significantly.
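The idea of inferring relatedness from shared anatomical features can be sketched with a toy example. This is not the authors' actual software (which uses formal phylogenetic methods), and the taxa and traits below are entirely hypothetical: each species is scored for the presence or absence of a handful of characters, and taxa with the most similar scores group together.

```python
# Hypothetical 0/1 character matrix: each column is a trait
# (e.g. blunt teeth, heavy armour, broad snout), 1 = present.
characters = {
    "Teleosaurid_A":   [0, 0, 1, 0, 1],
    "Teleosaurid_B":   [0, 1, 1, 0, 1],
    "Machimosaurid_A": [1, 0, 0, 1, 0],
    "Machimosaurid_B": [1, 1, 0, 1, 0],
}

def distance(a, b):
    """Fraction of characters in which two taxa differ (Hamming distance)."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Pair each taxon with its most anatomically similar counterpart.
nearest = {
    t: min((u for u in characters if u != t),
           key=lambda u: distance(characters[t], characters[u]))
    for t in characters
}
print(nearest)
```

On this toy matrix, each teleosaurid pairs with the other teleosaurid and each machimosaurid with the other machimosaurid – the same kind of grouping that, at far larger scale and with rigorous tree-building methods, yields a family tree.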
The first group, the teleosaurids, were more flexible in terms of their habitat and feeding. The second group, known as machimosaurids – which included the fearsome turtle crushers Lemmysuchus and Machimosaurus – were more abundant and widespread.
Names given by the team to seven newly described fossils, found in both teleosaurids and machimosaurids, reflect a curious range of anatomical features – among them Proexochokefalos, meaning ‘large head with big tuberosities’ and Plagiophthalmosuchus, the ‘side-eyed crocodile’.
There are even hints of their diverse behavioural characteristics and unique locations – Charitomenosuchus, meaning ‘graceful crocodile’ and Andrianavoay, the ‘noble crocodile’ from Madagascar.
Researchers have named the newly discovered species Indosinosuchus kalasinensis, after the Kalasin Province in Thailand, where the fossil – now housed at Maha Sarakham University – was found.
The recognition of I. kalasinensis shows that at least two species were living in similar freshwater habitats during the Late Jurassic – an impressive feat as teleosauroids, with the exception of Machimosaurus, were becoming rare during this time.
Dr Steve Brusatte, Reader in Vertebrate Palaeontology at the School of GeoSciences, University of Edinburgh, said: “The same way family trees of our own ancestors and cousins tell us about our history, this huge new family tree of teleosauroids clarifies their evolution. They were some of the most diverse and important animals in the Jurassic oceans, and would have been familiar sights along the coastlines for tens of millions of years.”
Rochester Institute of Technology scientists perform first-ever simulation of large mass ratio black hole merger on Frontera.
Solving the equations of general relativity for colliding black holes is no simple matter.
Physicists began using supercomputers to obtain solutions to this famously hard problem back in the 1960s. In 2000, with no solutions in sight, Kip Thorne, 2017 Nobel Laureate and one of the designers of LIGO, famously bet that there would be an observation of gravitational waves before a numerical solution was reached.
He lost that bet when, in 2005, Carlos Lousto, then at The University of Texas at Brownsville, and his team generated a solution using the Lonestar supercomputer at the Texas Advanced Computing Center. (Concurrently, groups at NASA and Caltech derived independent solutions.)
In 2015, when the Laser Interferometer Gravitational-Wave Observatory (LIGO) first observed such waves, Lousto was in shock.
“It took us two weeks to realize this was really from nature and not from inputting our simulation as a test,” said Lousto, now a professor of mathematics at Rochester Institute of Technology (RIT). “The comparison with our simulations was so obvious. You could see with your bare eyes that it was the merger of two black holes.”
Lousto is back again with a new numerical relativity milestone, this time simulating merging black holes where the ratio of the mass of the larger black hole to the smaller one is 128 to 1 — a scientific problem at the very limit of what is computationally possible. His secret weapon: the Frontera supercomputer at TACC, the eighth most powerful supercomputer in the world and the fastest at any university.
His research with collaborator James Healy, supported by the National Science Foundation (NSF), was published in Physical Review Letters [https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.125.191102] this week. It may require decades to confirm the results experimentally, but nonetheless it serves as a computational achievement that will help drive the field of astrophysics forward.
“Modeling pairs of black holes with very different masses is very computationally demanding because of the need to maintain accuracy in a wide range of grid resolutions,” said Pedro Marronetti, program director for gravitational physics at NSF. “The RIT group has performed the world’s most advanced simulations in this area, and each of them takes us closer to understanding observations that gravitational-wave detectors will provide in the near future.”
LIGO is only able to detect gravitational waves caused by small and intermediate mass black holes of roughly equal size. It will take observatories 100 times more sensitive to detect the type of mergers Lousto and Healy have modeled. Their findings show not only what the gravitational waves caused by a 128:1 merger would look like to an observer on Earth, but also characteristics of the ultimate merged black hole including its final mass, spin, and recoil velocity. These led to some surprises.
“These merged black holes can have speeds much larger than previously known,” Lousto said. “They can travel at 5,000 kilometers per second. They kick out from a galaxy and wander around the universe. That’s another interesting prediction.”
The researchers also computed the gravitational waveforms — the signal that would be perceived near Earth — for such mergers, including their peak frequency, amplitude, and luminosity. Comparing those values with predictions from existing scientific models, their simulations were within 2 percent of the expected results.
Previously, the largest mass ratio that had ever been solved with high precision was 16 to 1 — eight times less extreme than Lousto’s simulation. The challenge of simulating larger mass ratios is that it requires resolving the dynamics of the interacting systems at additional scales.
Like modelers in many fields, Lousto uses a method called adaptive mesh refinement to get precise models of the dynamics of the interacting black holes. It involves putting the black holes, the space between them, and the distant observer (us) on a grid or mesh, and refining the areas of the mesh with greater detail where it is needed.
Lousto’s team approached the problem with a methodology that he compares to Zeno’s first paradox. By halving and halving the mass ratio while adding internal grid refinement levels, they were able to go from 32:1 black hole mass ratios to 128:1 binary systems that undergo 13 orbits before merger. On Frontera, it required seven months of constant computation.
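The core idea of adaptive mesh refinement can be illustrated with a hedged, one-dimensional sketch — this is not the team's numerical-relativity code, and the "field" below is just a sharply peaked stand-in for the region near the smaller black hole. Cells are split in half wherever the solution varies too rapidly, so resolution is spent only where the dynamics demand it.

```python
def needs_refinement(f, a, b, tol=0.5):
    """Refine a cell [a, b] if f changes by more than tol across it."""
    return abs(f(b) - f(a)) > tol

def refine(f, cells, max_levels=5):
    """Repeatedly split cells that fail the smoothness test."""
    for _ in range(max_levels):
        new_cells = []
        for a, b in cells:
            if needs_refinement(f, a, b):
                m = (a + b) / 2.0
                new_cells += [(a, m), (m, b)]  # split the cell in half
            else:
                new_cells.append((a, b))
        if new_cells == cells:  # nothing left to refine
            break
        cells = new_cells
    return cells

# A sharply peaked "field" stands in for the region near the smaller hole.
field = lambda x: 1.0 / (abs(x - 0.3) + 0.05)

coarse = [(i / 4.0, (i + 1) / 4.0) for i in range(4)]
fine = refine(field, coarse)

# Cells near the peak at x = 0.3 end up much narrower than cells far away.
widths = [b - a for a, b in fine]
print(len(fine), min(widths), max(widths))
```

In the real three-dimensional problem the same principle applies at every refinement level, which is why each halving of the mass ratio requires an extra level of grid resolution around the smaller black hole.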
“Frontera was the perfect tool for the job,” Lousto said. “Our problem requires high performance processors, communication, and memory, and Frontera has all three.”
The simulation isn’t the end of the road. Black holes can have a variety of spins and configurations, which impact the amplitude and frequency of the gravitational waves their merger produces. Lousto would like to solve the equations 11 more times to get a good first range of possible “templates” to compare with future detections.
The results will help the designers of future Earth- and space-based gravitational wave detectors plan their instruments. These include advanced, third-generation ground-based gravitational wave detectors and the Laser Interferometer Space Antenna (LISA), which is targeted for launch in the mid-2030s.
The research may also help answer fundamental mysteries about black holes, such as how some can grow so big — millions of times the mass of the Sun.
“Supercomputers help us answer these questions,” Lousto said. “And the problems inspire new research and pass the torch to the next generation of students.”
Researchers at the Max Planck Institute for Marine Microbiology in Bremen are developing a user-friendly method to reconstruct and analyze SSU rRNA from raw metagenome data.
First, the background: microbiologists traditionally determine which organisms they are dealing with using the small subunit ribosomal RNA gene, or SSU rRNA gene for short. This marker gene makes it possible to identify almost any living creature, be it a bacterium or an animal, and thus assign it to its place in the tree of life. Once the position in the tree of life is known, specific DNA probes can be designed to make the organisms visible in an approach called FISH (fluorescence in situ hybridization). FISH has many applications, for example to sort cells, or to microscopically record their morphology or spatial position. This approach – which leads from DNA to gene to tree and probe to image – is called the “full-cycle rRNA approach”.

To make the SSU rRNA measurable, it is usually amplified with polymerase chain reaction (PCR). Today, however, PCR is increasingly being replaced by so-called metagenomics, which records the entirety of all genes in a habitat. Rapid methodological advances now allow the fast and efficient production of large amounts of such metagenomic data. The analysis is performed on significantly shorter DNA sequence segments – much shorter than the SSU gene – which are then laboriously assembled and placed into so-called metagenome-assembled genomes (MAGs). The short gene snippets do not provide a complete SSU rRNA sequence, and in many assemblies and MAGs this important marker gene is missing altogether. This makes it hard to molecularly identify organisms in metagenomes, to compare them to existing databases, or even to visualize them specifically with FISH.
phyloFlash provides a remedy
Researchers at the Max Planck Institute for Marine Microbiology in Bremen now present a method that closes this gap and makes it possible to reconstruct and analyze SSU rRNA from raw metagenome data. “This software, called phyloFlash, which is freely available through GitHub, combines the full-cycle rRNA approach for identification and visualization of non-cultivated microorganisms with metagenomic analysis; both techniques are well established at the Max Planck Institute for Marine Microbiology in Bremen,” explains Harald Gruber-Vodicka, who chiefly developed the method. “phyloFlash comprises all necessary steps, from the preparation of the necessary reference database (in this case SILVA), through data extraction, taxonomic classification and assembly, to the linking of SSU rRNA sequences and MAGs.” In addition, the software is very user-friendly, and both installation and application are largely automated.
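A typical run follows the steps described above. The invocation below is a hedged usage sketch based on the phyloFlash documentation; the exact options may differ between versions, so consult the GitHub README for your installation.

```bash
# One-off step: build the SILVA-based reference database.
phyloFlash_makedb.pl --remote

# Screen a paired-end metagenome library for SSU rRNA reads,
# assemble them, and classify the results.
phyloFlash.pl -lib mysample \
    -read1 reads_R1.fastq.gz \
    -read2 reads_R2.fastq.gz \
    -CPUs 8
```

The tool then reports the taxa detected in the library together with the reconstructed full-length SSU rRNA sequences.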
Especially suitable for simple communities
Gruber-Vodicka and his colleague Brandon Seah – who are shared first authors of the publication now presenting phyloFlash in the journal mSystems – come from symbiosis research. The communities they are dealing with in this field of research are comparatively simple: Usually a host organism lives together with one or a handful of microbial symbionts. Such communities are particularly well suited for analysis with phyloFlash. “For example, we do a lot of research on the deep-sea mussel Bathymodiolus, which is home to several bacterial subtenants,” says Gruber-Vodicka. “With the help of this well-studied community, we were able to test whether and how reliably phyloFlash works”. And indeed, the new software reliably identified both the mussel and its various symbionts. Niko Leisch, also a symbiosis researcher at the Max Planck Institute for Marine Microbiology, tested phyloFlash on small marine roundworms. Analyses of various such nematodes showed that some of the species of these inconspicuous worms might be associated with symbionts. “These exciting glimpses underline the great potential of our simple and fast method”, Gruber-Vodicka points out.
Open source and all-purpose
phyloFlash is open-source software. Extensive documentation and a very active community ensure its continuous testing and further development. “phyloFlash is certainly not only interesting for microbiologists,” emphasizes Gruber-Vodicka. “Already now, numerous scientists from diverse fields of research make use of our software. The simple installation was certainly helpful in this respect, as it lowers the threshold for use.” This easy access and interactive character is also particularly important to Brandon Seah, who now works at the Max Planck Institute for Developmental Biology: “The most satisfying thing for me about this project is to see other people using our software to drive their own research forward,” says Seah. “From the beginning, we’ve added features and developed the software in response to user feedback. These users are not just colleagues down the hall, but also people from the other side of the world who have given it a try and gotten in touch with us online. It underlines how open source is more productive and beneficial both for software development and for science.”
Paleontologists describe skeleton of a juvenile Plateosaurus for the first time.
Long neck, small head and a live weight of several tons – with this description you could have tracked down the Plateosaurus in Central Europe about 220 million years ago. Paleontologists at the University of Bonn (Germany) have now described for the first time an almost complete skeleton of a juvenile Plateosaurus and discovered that it looked very similar to its parents even at a young age. The fact that Plateosaurus showed a largely fully developed morphology at an early age could have important implications for how the young animals lived and moved around. The young Plateosaurus, nicknamed “Fabian”, was discovered in 2015 at the Frick fossil site in Switzerland and is exhibited in the local dinosaur museum. The study was published in the journal “Acta Palaeontologica Polonica”.
In order to study the appearance of dinosaurs more closely, researchers today rely on a large number of skeletons in so-called bone beds, which are places where the animals sank into the mud in large numbers during their lifetime. However, juvenile animals had hardly been found in these until now. Researchers described fossils of still juvenile plateosaurs for the first time just a few years ago, but these were already almost as large as the adults. One possible reason: “The smaller individuals probably did not sink into the mud quite as easily and are therefore underrepresented at the bone beds,” suspects study leader Prof. Martin Sander of the University of Bonn.
He and his team used comparative anatomy to examine the new skeleton, which was immediately remarkable because of its small size. “Based on the length of the vertebrae, we estimate the total length of the individual to be about 7.5 feet (2.3 meters), with a weight of about 90 to 130 lbs. (40 to 60 kilograms),” explains Darius Nau, who was allowed to examine the find for his bachelor’s thesis. For comparison: Adult Plateosaurus specimens reached body lengths of 16 to 33 feet (five to ten meters) and could weigh more than four tons. Because of its small size alone, it was obvious to assume that “Fabian” was a juvenile animal. This assumption was confirmed by the fact that the bone sutures of the spinal column had not yet closed. Background: Similar to skull sutures in human babies, bone sutures only fuse over the course of life.
Young and old resembled each other anatomically and in their body proportions
Researchers found that the young dinosaur resembled its older relatives both in anatomical details, such as the pattern of the laminae on the vertebrae (bony lamellae connecting parts of the vertebrae, which are important anatomical features in many dinosaurs), and in the rough proportions of its body. “The hands and neck of the juveniles may be a little longer, the arm bones a little shorter and slimmer. But overall, the variations are relatively small compared to the variation within the species overall and also compared to other dinosaur species,” stresses Nau. The juveniles of the related Mussaurus for instance were still quadrupeds after hatching, but the adults were bipeds.
“The fact that the Plateosaurus juvenile already looked so similar to the adults is all the more remarkable considering that the adults were ten times heavier,” emphasizes paleontologist Dr. Jens Lallensack from the University of Bonn. It is, however, conceivable that morphological development differed greatly from animal to animal, depending on climatic conditions or the availability of food. Such differences are still seen in reptiles today.
The well-known descendants of Plateosaurus, the sauropods, are the subject of a current exhibition at the Zoological Research Museum Alexander Koenig in Bonn. The largest Plateosaurus skeleton ever found can be seen there.
Reference: Darius Nau, Jens N. Lallensack, Ursina Bachmann, P. Martin Sander: Postcranial Osteology of the First Early-Stage Juvenile Skeleton of Plateosaurus trossingensis (Norian, Frick, Switzerland). Acta Palaeontologica Polonica. DOI: 10.4202/app.00757.2020. http://app.pan.pl/article/item/app007572020.html
Scientists at the National Center for Tumor Diseases Dresden (NCT/UCC) and Dresden University Medicine, together with an international team of researchers, were able to demonstrate that certain white blood cells, so-called neutrophil granulocytes, can potentially – after completing a special training programme – be utilised for the treatment of tumours. In order to stimulate the training of this part of the innate immune system, the scientists used beta-glucan, a long-chain sugar molecule that occurs as a natural fibre mainly in the cell walls of fungi, oats or barley. The immune training already became effective at the level of blood formation in the bone marrow, in the precursor cells of the neutrophil granulocytes. Based on this newly described mechanism, it is possible that novel cancer immunotherapies which improve treatment for cancer patients will be developed in the future. The scientists published their results in the renowned specialist journal Cell.
The National Center for Tumor Diseases Dresden (NCT/UCC) is a joint institution of the German Cancer Research Center (DKFZ), the University Hospital Carl Gustav Carus Dresden, Carl Gustav Carus Faculty of Medicine at TU Dresden and the Helmholtz-Zentrum Dresden-Rossendorf (HZDR).
Tumour cells can evade the immune system in a variety of ways and in this manner nullify its protective effect. Immunotherapies aim at preventing these evasive manoeuvres and at redirecting the natural defence mechanisms in the patient’s body against the cancer cells.
Modern immunotherapies rely on the specialists of our defence system, such as T cells, dendritic cells or certain antibodies. As part of the specific immune system, these are able to recognise suitable structures on tumour or immune cells and initiate or execute a precisely tailored defence reaction. For the first time, scientists at the National Center for Tumor Diseases Dresden (NCT/UCC) and University Medicine Dresden were now able to demonstrate that even the non-specific immune response of our body can – through special training – be weaponised against tumours. “Based on the mechanism described, new forms of cancer immunotherapy are conceivable which could improve the chances for treatment for certain patients in the future,” says Prof. Triantafyllos Chavakis, Director of the Institute of Clinical Chemistry and Laboratory Medicine (IKL) of the University Hospital Carl Gustav Carus Dresden.
Training of neutrophil granulocytes inhibits tumour growth
At the centre of the described mechanism are special immune cells, so-called neutrophil granulocytes – or neutrophils for short. These form the most common subgroup of the white blood cells and are part of the innate, non-specific immune defence. In contrast to the specific part of our immune system – which first analyses foreign structures in the body in detail and then, with a time lag, activates tailor-made defence mechanisms – the non-specific part of the body’s own defence acts as a rapid response force: if pathogens enter the body or cells degenerate, it reacts very quickly and mostly stereotypically.
However, certain stimuli can also influence – or even train – the non-specific immune response. Training causes certain actors of the rapid response force to exhibit altered properties and perform their tasks better and over a longer period of time than before: the impact of the rapid response force increases. The researchers have now been able to demonstrate for the first time that this effect, which is already known to occur in infections, can also be used against tumours.
The neutrophil granulocytes play an important role in this process. In certain tumours, they accumulate in the environment of the tumour or migrate into it. These “tumour-associated neutrophils” – located directly at the tumour – can inhibit tumour growth, but some also have tumour-promoting properties. It is assumed that the tumour itself releases substances that turn the neutrophils into drivers of tumour growth. In experimental models, the scientists were able to partially reverse this process, which is detrimental to healing, by specially training the non-specific immune response. In order to stimulate the immune system, they used the long-chain sugar molecule (polysaccharide) beta-glucan. This is a natural fibre found mainly in the cell walls of fungi, oats or barley. Administering beta-glucan caused the proportion of neutrophils with tumour-inhibiting properties to increase significantly and tumour growth to decrease.
Change in blood formation ensures long-term effect
Of particular importance in this context was proving that the reprogramming of neutrophil granulocytes already begins in the bone marrow. Here, from stem cells, various precursor cells develop and it is from these that the different blood cells emerge. The administration of beta-glucan altered the gene activity of the myeloid precursor cells. The neutrophils later also develop from these. “This causes the properties of the short-lived neutrophils to change in the longer term, towards activity directed against the tumour. This is because the precursor cells form neutrophils with tumour-inhibiting properties over a longer period of time,” explains joint first author Lydia Kalafati from IKL and NCT/UCC.
As the next step, it would be conceivable to utilise the principle of neutrophil training in combination with already approved immunotherapies in cancer patients. “In doing so, we also want to investigate in which types of tumours the method works particularly well, in order to then use it in a very targeted manner in future,” says Prof. Martin Bornhäuser, member of the Managing Directorate of the NCT/UCC and Director of the Department of Medicine I of the University Hospital Dresden.
Scientists have found a way to generate electricity from nylon, raising hopes that the clothes on our backs will become an important source of energy.
Researchers have found a way to produce nylon fibres that are smart enough to produce electricity from simple body movement, paving the way for smart clothes that will monitor our health through miniaturised sensors and charge our devices without any external power source.
This discovery – a collaboration between the University of Bath, the Max Planck Institute for Polymer Research in Germany and the University of Coimbra in Portugal – is based on breakthrough work on solution-processed piezoelectric nylons led by Professor Kamal Asadi from the Department of Physics at Bath and his former PhD student Saleem Anwar.
Piezoelectricity describes the phenomenon where mechanical energy is transformed into electric energy. To put it simply, when you tap on or distort a piezoelectric material, it generates a charge. Add a circuit and the charge can be taken away, stored in a capacitor for instance and then put to use – for example, to power your mobile phone.
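The charge generated by a tap can be estimated with a back-of-envelope calculation: the charge Q is roughly the piezoelectric charge coefficient d times the applied force F. The coefficient used below (about 3 pC/N) is an assumed, illustrative figure in the range reported for piezoelectric nylon-11, not a value from this study.

```python
# Back-of-envelope piezoelectric estimate: Q = d * F.
# d is an assumed, illustrative coefficient (~3 pC/N), in the range
# reported for piezoelectric nylon-11 -- not a measured value here.

d = 3e-12           # piezoelectric charge coefficient, C/N (assumed)
force = 10.0        # a light tap, in newtons
charge = d * force  # charge generated, in coulombs

print(f"{charge:.1e} C")  # tens of picocoulombs per tap
```

Individually tiny, such charges add up over the many fibres of a garment and the many distortions of ordinary movement, which is what makes energy harvesting from clothing plausible.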
While wearing piezoelectric clothing, such as a shirt, even a simple movement like swinging your arms would cause sufficient distortions in the shirt’s fibres to generate electricity.
Professor Asadi said: “There’s growing demand for smart, electronic textiles, but finding cheap and readily available fibres of electronic materials that are suitable for modern-day garments is a challenge for the textile industry.
“Piezoelectric materials make good candidates for energy harvesting from mechanical vibrations, such as body motion, but most of these materials are ceramic and contain lead, which is toxic and makes their integration in wearable electronics or clothes challenging.”
Scientists have been aware of the piezoelectric properties of nylon since the 1980s, and the fact that this material is lead-free and non-toxic has made it particularly appealing. However, the silky, man-made fabric often associated with cheap T-shirts and women’s stockings is “a very difficult material to handle”, according to Professor Asadi.
“The challenge is to prepare nylon fibres that retain their piezoelectric properties,” he said.
In its raw polymer form, nylon is a white powder that can be blended with other materials (natural or man-made) and then moulded into myriad products, from clothes and toothbrush bristles to food packaging and car parts. It’s when nylon is reduced to a particular crystal form that it becomes piezoelectric. The established method for creating these nylon crystals is to melt, rapidly cool and then stretch the nylon. However this process results in thick slabs (known as ‘films’) that are piezoelectric but not suited to clothing. The nylon would need to be stretched into a thread to be woven into garments, or into a thin film to be used in wearable electronics.
The challenge of producing thin piezoelectric nylon films was thought to be insurmountable, and initial enthusiasm for creating piezoelectric nylon garments turned to apathy, resulting in research in this area virtually grinding to a halt in the 1990s.
On a whim, Professor Asadi and Mr Anwar – a textile engineer – took a completely new approach to producing piezoelectric nylon thin films. They dissolved the nylon powder in an acid solvent rather than melting it. However, they found that the finished film contained solvent molecules locked inside the material, preventing the formation of the piezoelectric phase.
“We needed to find a way to remove the acid to make the nylon useable,” said Professor Asadi, who started this research at the Max Planck Institute for Polymer Research in Germany before moving to Bath in September.
By chance, the pair discovered that by mixing the acid solution with acetone (a chemical best known as a paint thinner or nail varnish remover), they were able to dissolve the nylon and then extract the acid efficiently, leaving the nylon film in a piezoelectric phase.
“The acetone bonds very strongly to the acid molecules, so when the acetone is evaporated from nylon solution, it takes the acid with it. What you’re left with is nylon in its piezoelectric crystalline phase. The next step is to turn nylon into yarns and then integrate it into fabrics.”
Developing piezoelectric fibres is a major step towards being able to produce electronic textiles with clear applications in the field of wearable electronics. The goal is to integrate electronic elements, such as sensors, in a fabric, and to generate power while we’re on the move. Most likely, the electricity harvested from the fibres of piezoelectric clothing would be stored in a battery nestled in a pocket. This battery would then connect to a device either via a cable or wirelessly.
“In years to come, we could be using our T-shirts to power a device such as our mobile phone as we walk in the woods, or for monitoring our health,” said Professor Asadi.
In 1973, physicist and later Nobel laureate Philip W. Anderson proposed a bizarre state of matter: the quantum spin liquid (QSL). Unlike the everyday liquids we know, the QSL actually has to do with magnetism – and magnetism has to do with spin.
Disordered electron spin produces QSLs
What makes a magnet? It was a long-standing mystery, but today we finally know that magnetism arises from a peculiar property of sub-atomic particles, like electrons. That property is called “spin”, and the best – yet grossly insufficient – way to think of it is like a child’s spinning-top toy.
What is important for magnetism is that spin turns every one of a material’s billions of electrons into a tiny magnet with its own magnetic “direction” (think north and south pole of a magnet). But the electron spins aren’t isolated; they interact with each other in different ways until they stabilize to form various magnetic states, thereby granting the material they belong to magnetic properties.
In a conventional magnet, the interacting spins stabilize, and the magnetic directions of each electron align. This results in a stable formation.
But in what is known as a “frustrated” magnet, the electron spins can’t stabilize in the same direction. Instead, they constantly fluctuate like a liquid – hence the name “quantum spin liquid.”
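Why frustrated spins cannot settle down can be shown with a minimal toy model. The sketch below is an illustrative Ising triangle, not the actual EDT-BCO system: three antiferromagnetically coupled spins on a triangle cannot all be pairwise anti-aligned, so the lowest energy is shared by many configurations and no single ordered pattern wins.

```python
# Toy Ising model of frustration: three spins on a triangle with
# antiferromagnetic coupling (aligned neighbours cost energy).
from itertools import product

J = 1.0                            # antiferromagnetic coupling strength
bonds = [(0, 1), (0, 2), (1, 2)]   # the three edges of the triangle

def energy(spins):
    return J * sum(spins[i] * spins[j] for i, j in bonds)

states = list(product([-1, +1], repeat=3))
energies = {s: energy(s) for s in states}
ground = min(energies.values())
ground_states = [s for s, e in energies.items() if e == ground]

# Six of the eight configurations tie for the minimum energy -J:
# every ground state still contains one "frustrated" (aligned) pair.
print(ground, len(ground_states))
```

This massive degeneracy of the ground state is the toy-model analogue of the constant fluctuation described above: with so many equally good configurations, the spins keep "flowing" between them instead of freezing into one.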
Quantum Spin Liquids in future technologies
What is exciting about QSLs is that they can be used in a number of applications. Because they come in different varieties with different properties, QSLs can be used in quantum computing, telecommunications, superconductors, spintronics (a variation of electronics that uses electron spin instead of current), and a host of other quantum-based technologies.
But before exploiting them, we first have to gain a solid understanding of QSL states. To do this, scientists have to find ways to produce QSLs on demand – a task that has proven difficult so far, with only a few materials on offer as QSL candidates.
A complex material might solve a complex problem
Publishing in PNAS, scientists led by Péter Szirmai and Bálint Náfrádi at László Forró’s lab at EPFL’s School of Basic Sciences have successfully produced and studied a QSL in a highly original material known as EDT-BCO. The system was designed and synthesized by the group of Patrick Batail at Université d’Angers (CNRS).
The structure of EDT-BCO is what makes it possible to create a QSL. The electron spins in EDT-BCO form triangularly organized dimers, each of which has a spin-1/2 magnetic moment, which means that the electron must fully rotate twice to return to its initial configuration. The layers of spin-1/2 dimers are separated by a sublattice of carboxylate anions centred by a chiral bicyclooctane. The anions are called “rotors” because they have conformational and rotational degrees of freedom.
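The “rotate twice” description has a precise quantum-mechanical meaning: rotating a spin-1/2 state by an angle θ about its quantization axis multiplies it by a phase of half the rotation angle, so one full turn flips its sign and only a double turn restores it. A minimal sketch in standard notation (an illustration, not part of the original study):

```latex
R_z(\theta)\,\lvert{\uparrow}\rangle = e^{-i\theta/2}\,\lvert{\uparrow}\rangle,
\qquad
R_z(2\pi)\,\lvert{\uparrow}\rangle = -\,\lvert{\uparrow}\rangle,
\qquad
R_z(4\pi)\,\lvert{\uparrow}\rangle = +\,\lvert{\uparrow}\rangle .
```

The sign flip after a single 2π rotation is what distinguishes spin-1/2 particles from everyday rotating objects, which return to their starting configuration after one turn.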
The unique rotor component in a magnetic system makes the material special amongst QSL candidates, representing a new material family. “The subtle disorder provoked by the rotor components introduces a new handle upon the spin system,” says Szirmai.
The scientists and their collaborators employed an arsenal of methods to explore the EDT-BCO as a QSL material candidate: density functional theory calculations, high-frequency electron spin resonance measurements (a trademark of Forró’s lab), nuclear magnetic resonance, and muon spin spectroscopy. All of these techniques explore the magnetic properties of EDT-BCO from different angles.
All the techniques confirmed the absence of long-range magnetic order and the emergence of a QSL. In short, EDT-BCO officially joins the limited ranks of QSL materials and takes us a step further into the next generation of technologies. As Bálint Náfrádi puts it: “Beyond the superb demonstration of the QSL state, our work is highly relevant, because it provides a tool to obtain additional QSL materials via custom-designed functional rotor molecules.”
Reference: Péter Szirmai, Cécile Mézière, Guillaume Bastien, Pawel Wzietek, Patrick Batail, Edoardo Martino, Konstantins Mantulnikovs, Andrea Pisoni, Kira Riedl, Stephen Cottrell, Christopher Baines, László Forró, Bálint Náfrádi. Quantum spin-liquid states in an organic magnetic layer and molecular rotor hybrid. PNAS, 5 November 2020. DOI: 10.1073/pnas.2000188117
A new Singapore study suggests that patients who carry the gene mutation for neuronal intranuclear inclusion disease (NIID) may also present with symptoms and signs of Parkinson’s disease (PD), and respond to PD drugs.
A joint study by the National Neuroscience Institute (NNI) and Singapore General Hospital (SGH) revealed that patients who have been diagnosed with Parkinson’s disease might actually have NIID instead.
NIID is a disabling neurodegenerative condition due to a gene mutation and has no effective treatment. Symptoms of NIID include dementia, Parkinsonism, poor balance, as well as numbness and weakness in the limbs. A patient with NIID may or may not experience symptoms, depending on age and stage of disease. The severe form of NIID is usually seen in older patients, where the disease has progressed to an advanced stage.
The team studied more than 2,000 participants, comprising healthy individuals and those with Parkinson’s disease (PD), over more than a decade. They were surprised to find NIID-causing mutations in those diagnosed with PD. Dr Ma Dongrui, Senior Medical Laboratory Scientist, Department of Neurology, SGH, and first author of the study, explains: “To our knowledge, this is the first study reporting PD patients with NOTCH2NLC gene mutations as seen in NIID patients. Thankfully, they responded to PD medications better than most PD patients do. This suggests that there must be factors that can influence why some develop PD while many others develop the more severe form of NIID.”
While analysing the NIID gene, the team found a group of healthy participants who had a “milder” form of mutation. Such mutation in the NIID gene could indicate that they are at risk of developing NIID or PD.
Since NIID can go undetected, a high index of suspicion may be needed even in PD patients. “With what we know now, it might be beneficial for clinicians to be watchful of early cognitive impairment or imaging evidence that may suggest NIID in patients diagnosed with PD. As NIID is caused by a genetic mutation, it also may be worth looking out for family members of PD patients who may show signs of NIID,” suggests Professor Tan Eng King, Deputy Medical Director and Director of Research, NNI.
“Our findings suggest that many neurodegenerative diseases overlap and may share a common etiology. Finding a common link and uncovering the reason why a similar gene mutation leads to both mild PD and a severe form of NIID can help identify new drugs for these conditions,” continues Prof Tan, who is also the senior author of the study and a Senior Consultant at NNI’s Department of Neurology on SGH Campus.
Following this study, the team plans to conduct more studies to better understand the mechanism behind NIID and identify new drugs for this condition. More research is needed to understand if the broad clinical phenotype of NIID is related to the subtle genetic differences at the NOTCH2NLC gene locus, race or other factors. Long-term follow-up of carriers of the gene mutation with PD phenotype may provide additional clues.
The study findings were published online in JAMA Neurology on 24 August 2020.
Reference: Dongrui Ma, Yi Jayne Tan, Adeline S. L. Ng, Helen L. Ong, Weiying Sim, Weng Khong Lim, Jing Xian Teo, Ebonne Y. L. Ng, Ee-Chien Lim, Ee-Wei Lim, Ling-Ling Chan, Louis C. S. Tan, Zhao Yi, Eng-King Tan. Association of NOTCH2NLC Repeat Expansions with Parkinson Disease. JAMA Neurology, published online 24 August 2020. DOI: 10.1001/jamaneurol.2020.3023
New research presented at ACR Convergence, the American College of Rheumatology’s annual meeting, showed that patients with rheumatic diseases whose infliximab treatment was individually assessed and adjusted with a strategy called therapeutic drug monitoring did not achieve remission at higher rates than those who received standard care. (ABSTRACT #2029)
Tumor necrosis factor inhibitors (TNFi) are drugs approved for treatment of rheumatoid arthritis, psoriatic arthritis and other inflammatory conditions. They are part of a class of drugs called biologic disease-modifying anti-rheumatic drugs, or biologics. TNFi drugs may help patients lower their disease activity, relieve debilitating symptoms, and manage their condition long term.
Some patients experience a lack or loss of response to TNFi, possibly due to low serum drug levels and/or the formation of anti-drug antibodies. Therapeutic drug monitoring is one proposed strategy to prevent loss of response and to optimize TNFi effectiveness. It is an individualized strategy that includes regular assessments of a patient’s serum drug levels. This personalized monitoring can be time-consuming and costly, and it is unclear whether it actually improves clinical outcomes. To learn more, researchers in Norway launched the first open-label, multi-center, randomized, controlled trial to assess its effectiveness in achieving remission in patients with several inflammatory diseases.
“Based on observational data showing associations between serum drug levels and effectiveness, we believed that individual treatment, with optimizing drug levels and early identification of anti-drug antibodies, would optimize the efficacy, safety and cost effectiveness of TNFi therapy,” says the study’s principal author, Silje Watterdal Syversen, MD, PhD, a rheumatologist at Diakonhjemmet Hospital in Oslo, Norway.
Adults with rheumatoid arthritis, psoriatic arthritis, spondyloarthritis, ulcerative colitis, Crohn’s disease and psoriasis who were starting infliximab therapy were recruited for the study. Patients were randomly assigned to receive infliximab either with or without therapeutic drug monitoring, and were examined at each infusion visit. The study’s primary endpoint was remission at week 30. The study enrolled 411 patients from 21 medical centers between January 2017 and December 2018; the analysed patients included 80 with rheumatoid arthritis, 42 with psoriatic arthritis, 117 with spondyloarthritis, 80 with ulcerative colitis, 57 with Crohn’s disease, and 22 with psoriasis, with 198 patients in the therapeutic drug monitoring arm and 200 in the control group. Researchers also recorded any adverse events the patients experienced, such as infections or infusion reactions.
According to their results, therapeutic drug monitoring was not superior to standard treatment for achieving disease remission at 30 weeks in people with a range of rheumatic diseases. In the study’s therapeutic drug monitoring arm, 100 patients (53%) achieved remission, as did 106 (54%) in the standard treatment group. The study also found that 10% of patients who had therapeutic drug monitoring and 15% of patients receiving standard treatment developed significant levels of anti-drug antibodies. Adverse events were similar for both treatment groups as well, but infusion reactions were less frequent in patients who received therapeutic drug monitoring.
“Our study does not support therapeutic drug monitoring being applied as a general treatment strategy during induction of infliximab. Despite a lack of clinical trial data and diverging guidelines, proactive therapeutic drug monitoring has already been adopted in clinical practice across different specialities. This study highlights the need for thorough evaluation of monitoring tools and treatment strategies before their implementation in clinical care,” says Dr. Syversen. “We feel that our results put to rest a long-standing debate on the merits of using therapeutic drug monitoring in all patients starting TNFi.”
Future research should explore whether more targeted applications of therapeutic drug monitoring, such as assessment of serum drug levels in treatment failures, could be a useful clinical tool, she adds.
About ACR Convergence
ACR Convergence, the ACR’s annual meeting, is where rheumatology meets to collaborate, celebrate, congregate, and learn. Join ACR for an all-encompassing experience designed for the entire rheumatology community. ACR Convergence is not just another meeting – it’s where inspiration and opportunity unite to create an unmatched educational experience. For more information about the meeting, visit https://www.rheumatology.org/Annual-Meeting, or join the conversation on Twitter by following the official hashtag (#ACR20).
About the American College of Rheumatology
The American College of Rheumatology (ACR) is an international medical society representing over 7,700 rheumatologists and rheumatology health professionals with a mission to empower rheumatology professionals to excel in their specialty. In doing so, the ACR offers education, research, advocacy and practice management support to help its members continue their innovative work and provide quality patient care. Rheumatologists are experts in the diagnosis, management and treatment of more than 100 different types of arthritis and rheumatic diseases.
Therapeutic Drug Monitoring Compared to Standard Treatment of Patients Starting Infliximab: Results from a Multicenter Randomized Controlled Trial of 400 Patients
A lack or loss of response to TNFα inhibitors (TNFi) has been associated with low serum drug levels and formation of anti-drug antibodies (ADAb). Therapeutic drug monitoring (TDM), an individualized treatment strategy based on regular assessments of serum drug levels, has been suggested to optimize the efficacy of TNFi. It is still unclear whether TDM improves clinical outcomes, and the value of TDM has recently been placed on the research agenda across different specialties. This first randomized controlled trial of the effectiveness of TDM in a range of immune-mediated inflammatory diseases, including rheumatic diseases, the NORwegian DRUg Monitoring trial part A (NOR-DRUM A), focuses on the induction period of infliximab (INX) treatment and aims to assess whether TDM is superior to standard treatment in achieving remission.
In the investigator-initiated, randomized, open-label, multicenter NOR-DRUM (A) study, adult patients with rheumatoid arthritis (RA), psoriatic arthritis (PsA), spondyloarthritis (SpA), ulcerative colitis (UC), Crohn’s disease (CD) and psoriasis (Ps) starting INX therapy were randomly assigned to administration of INX according to a treatment strategy based on TDM (TDM arm) or to standard administration of INX without TDM (control arm). Study visits were conducted at each infusion. The primary endpoint was remission at week 30. In the TDM arm, the dose and interval were adjusted according to INX trough levels to reach the therapeutic range (Figure 1). If the patient developed significant levels of ADAb, INX was terminated. To guide the investigators, the TDM strategy was integrated in an interactive eCRF. The primary endpoint was analysed by mixed-effects logistic regression in the full analysis set (FAS), adjusting for diagnoses. Infections and infusion reactions were specified as adverse events (AEs) of special interest.
We enrolled 411 patients at 21 study centers between January 2017 and December 2018; 398 patients (RA 80, PsA 42, SpA 117, UC 80, CD 57, Ps 22) received the allocated strategy and were included in the FAS population. Demographic and baseline characteristics were comparable in both arms. TDM was not found to be superior to standard treatment with regard to the primary outcome. Remission at week 30 was reached in 100 (53%) and 106 (54%) of the patients in the TDM and control arms, respectively (adjusted difference, 1.5%; 95% confidence interval (CI), -8.2 to 11.1; p=0.78) (Figure 2). Consistent results were shown for all the secondary endpoints (Figure 3) and in the sensitivity analyses. Twenty patients (10%) in the TDM arm and 30 patients (15%) in the control arm developed significant levels of ADAb. The number of adverse events (AEs) was similar in both groups; however, infusion reactions were less frequent in the TDM arm (5 patients (2.5%) vs 16 patients (8.0%); difference 5.5%, 95% CI 1.1 to 9.8%).
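As a sanity check, the reported confidence interval for the difference in infusion reactions (5 of 198 patients in the TDM arm vs 16 of 200 in the control arm) can be reproduced with a standard Wald interval for a difference of two proportions. A minimal sketch, assuming a normal approximation and the randomized arm sizes as denominators (the trial’s own adjusted analyses used mixed-effects logistic regression):

```python
import math

def wald_diff_ci(x1, n1, x2, n2, z=1.96):
    """Wald 95% CI for the difference of two proportions, p2 - p1."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p2 - p1
    # Standard error of the difference under the normal approximation
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Infusion reactions: 5/198 (TDM arm) vs 16/200 (control arm)
diff, lo, hi = wald_diff_ci(5, 198, 16, 200)
print(f"difference {diff:.1%}, 95% CI ({lo:.1%}, {hi:.1%})")
# → difference 5.5%, 95% CI (1.1%, 9.8%), matching the reported figures
```

The recovered interval agrees with the abstract’s reported difference of 5.5% (95% CI 1.1 to 9.8%), suggesting the authors used a comparable two-proportion approximation for this safety outcome.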
NOR-DRUM (A) is the first randomized trial to address the effectiveness of TDM in rheumatic diseases. In this study, TDM was not superior to standard treatment in achieving remission. Although improved safety is indicated by a reduction in infusion reactions, implementation of TDM as a general strategy in the induction period of INX is not supported by the NOR-DRUM (A) study.