Self-Interacting Dark Matter Can Serve As A Seed For The Supermassive Black Hole (Astronomy)

Astrophysical observations of high-redshift quasars indicate that black holes of ∼10⁹ solar masses (M☉) already existed when the Universe was just 800 Myr old. The origin of these supermassive black holes (SMBHs) is still a mystery. In particular, it is extremely puzzling how they could become so massive in such a short time. A popular idea is that heavy seed black holes existed in the early Universe and grew massive by accreting baryons.


Assuming Eddington-limited accretion, the black hole mass M_BH is related to its seed mass M_seed by

M_BH = M_seed exp(∆t/τ),

where ∆t is the elapsed time and τ = 450 f_Edd⁻¹ [ε/(1−ε)] Myr is the e-folding time. Here ε is the radiative efficiency, commonly assumed to be 0.1, and f_Edd is the Eddington ratio (the ratio of the AGN bolometric luminosity to the Eddington luminosity), which characterizes the accretion efficiency.

Consider J1007+2115, the most massive known quasar at redshift z > 7.5, with a mass of 1.5 × 10⁹ M☉. Taking an Eddington ratio f_Edd ∼ 1, researchers estimated M_seed ∼ 10⁴ M☉ if the seed formed at z ∼ 30, i.e., ∆t = 597 Myr to its observed z = 7.51. Such a seed is too massive to be produced from collapsed Population III stars, but it could form through the direct collapse of pristine baryonic gas; the latter scenario predicts M_seed ∼ 10⁵–10⁶ M☉. However, observations show there is another population of high-redshift SMBHs with f_Edd much less than 1. For example, J1205−0000 is observed at redshift 6.7 with a black hole mass of 2.2 × 10⁹ M☉ and an Eddington ratio of 0.16. Eddington accretion then implies it grew from a seed of 2 × 10⁸ M☉ at z ∼ 30, too heavy to be produced via the direct collapse of gas.
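As a back-of-envelope check of these numbers (an illustrative sketch, not the authors' code), the seed mass implied by Eddington-limited growth can be computed directly from the relation above:

```python
import math

def seed_mass(m_bh, dt_myr, f_edd=1.0, eps=0.1):
    """Invert M_BH = M_seed * exp(dt / tau) for the seed mass.

    tau = 450 * (1 / f_edd) * eps / (1 - eps) Myr is the e-folding time.
    """
    tau = 450.0 / f_edd * eps / (1.0 - eps)
    return m_bh * math.exp(-dt_myr / tau)

# J1007+2115: M_BH = 1.5e9 Msun, growing from z ~ 30 to z = 7.51 (dt = 597 Myr)
m_seed = seed_mass(1.5e9, 597.0, f_edd=1.0)
print(f"{m_seed:.2e}")  # ~1e4 Msun, matching the quoted estimate
```

Note that lowering f_Edd to ≈ 0.16 stretches the e-folding time to ≈ 312 Myr, which is why a slowly accreting quasar like J1205−0000 requires a far heavier seed for the same final mass.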

In a recent study, Feng, Yu and Zhong proposed a scenario in which a self-interacting dark matter (SIDM) halo experiences gravothermal instability and its central region collapses into a seed black hole at high redshift. Dark matter self-interactions can transport heat within the halo over cosmological timescales. As a gravothermal system, the halo has negative heat capacity and is genuinely unstable: the central halo eventually becomes hot and collapses toward a singular state at late stages of the evolution. SIDM thus has a natural mechanism for triggering gravitational instability, a necessary condition to form a black hole. Recent studies have also shown that SIDM is favored for explaining the diverse dark matter distributions observed over a wide range of galactic systems.

“It is intriguing to explore an SIDM scenario that may explain the origin of the high-redshift SMBHs and observations of galaxies at z ∼ 0”, said Wei-Xiang Feng.

Gravothermal evolution of the dark matter density vs. enclosed mass in the presence of the baryonic potential (solid), as well as the fixed baryon profile (dash-dotted). Each dark matter profile is labeled with its corresponding evolution time, and the vertical dotted line indicates the mass of the central halo that would eventually collapse into a seed black hole. The inset panel illustrates the evolution of the averaged dark matter density of the central halo with (solid) and without (dashed) the baryons. ©Wei-Xiang Feng

Feng, Yu and Zhong adopted a typical baryon mass profile for high-redshift protogalaxies, as shown in the figure above. With the baryons included, the halo does not form a large density core; it quickly enters the collapse phase, and its central density keeps increasing, eventually growing super-exponentially. They showed the collapse time can be shortened by a factor of 100 compared to the SIDM-only case. Even for a self-scattering cross section per unit mass of σ/m ∼ 1 cm²/g, broadly consistent with the value used to explain galactic observations, the central halo can collapse fast enough to form a seed at redshift z ≥ 7. Depending on the halo mass, this scenario could explain both populations of high-redshift SMBHs, with f_Edd ∼ 1 and ∼ 0.1. It also has a built-in mechanism to dissipate the remnant angular momentum of the central halo, namely viscosity induced by the self-interactions. The host halo must lie on the high tail of density fluctuations, implying that high-redshift SMBHs should be rare in this scenario, and the predicted host mass broadly agrees with the dynamical mass inferred from observations. They further showed that when the 3D velocity dispersion of SIDM particles in the collapsed central region reaches 0.57c, general relativistic (GR) instability can be triggered.

Their results indicated that self-interacting dark matter can provide a unified explanation for diverse dark matter distributions in galaxies today and the origin of SMBHs at redshifts z ∼ 6–7.

References: Wei-Xiang Feng, Hai-Bo Yu, Yi-Ming Zhong, “Seeding Supermassive Black Holes with Self-Interacting Dark Matter”, arXiv, pp. 1-5, 2020. Link:

Copyright of this article belongs to Uncover Reality. It may be used only with proper credit to us and to the author of this article.

Brain Region Implicated In Predicting The Consequences Of Actions (Neuroscience)

Researchers monitor brain cells in mice that build mental models to simulate the future.

Our minds can help us make decisions by contemplating the future and predicting the consequences of our actions. Imagine, for instance, trying to find your way to a new restaurant near your home. Your brain can build a mental model of your neighborhood and plan the route you should take to get there.

Scientists have now found that a brain structure called the anterior cingulate cortex (ACC), known to be important for decision making, is involved in using such mental models to learn. A new study of mice published today in Neuron highlights sophisticated mental machinery that helps the brain simulate the results of different actions and make the best choice.

Brain cells switched off in the anterior cingulate cortex (green) prevent mice from learning flexibly. ©Thomas Akam / Rui Costa / Champalimaud Centre for the Unknown.

“The neurobiology of model-based learning is still poorly understood,” said Thomas Akam, PhD, a researcher at Oxford University and lead author on the new paper. “Here, we were able to identify a brain structure that is involved in this behavior and demonstrate that its activity encodes multiple aspects of the decision-making process.”

Deciphering how the brain builds mental models is essential to understanding how we adapt to change and make decisions flexibly: what we do when we discover that one of the roads on the way to that new restaurant is closed for construction, for example.

“These results were very exciting,” said senior author Rui Costa, DVM, PhD, Director and CEO of Columbia’s Zuckerman Institute, who started this research while an investigator at the Champalimaud Centre for the Unknown, where most of the data was collected. “These data identify the anterior cingulate cortex as a key brain region in model-based decision-making, more specifically in predicting what will happen in the world if we choose to do one particular action versus another.”

Model or model-free?

A big challenge in studying the neural basis of model-based learning is that it often operates in parallel with another approach called model-free learning. In model-free learning, the brain does not put a lot of effort into creating simulations. It simply relies on actions that have produced good outcomes in the past.

You might use a model-free mental approach when traveling to your favorite restaurant, for example. Because you’ve been there before, you don’t need to invest mental energy in plotting the route. You can simply follow your habitual path and let your mind focus on other things.

To isolate the contributions of these two cognitive schemes – model-based and model-free – the researchers set up a two-step puzzle for mice.

In this task, an animal first chooses one of two centrally located holes to poke its nose into. This action activates one of two other holes to the side, each of which has a certain probability of providing a drink of water.

“Just like in real life, the subject has to perform extended sequences of actions, with uncertain consequences, in order to obtain desired outcomes,” said Dr. Akam.

To do the task well, the mice had to figure out two key variables. The first was which hole on the side was more likely to provide a drink of water. The second was which of the holes in the center activated that side hole. Once the mice learned the task, they would opt for the action sequence that offered the best outcome. However, in addition to this model-based way of solving the puzzle, mice could also learn simple model-free predictions, e.g. “top is good,” based on which choice had generally led to reward in the past.

The researchers then changed up the experiment in ways that required the animals to be flexible. Every now and then, the side port more likely to provide a drink would switch – or the mapping between central and side ports would reverse.
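The structure of this two-step task can be sketched as a toy environment (an illustrative sketch with assumed reward probabilities, not the study's actual protocol or parameters):

```python
import random

class TwoStepTask:
    """Toy sketch of the two-step task: a first-step choice ('top'/'bottom')
    activates one of two side ports, and each side port pays out water with
    its own probability. Both the good side port and the action->port mapping
    can reverse between blocks, as in the experiment described above.
    (Illustrative only; the probabilities are assumptions.)"""

    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.mapping = {"top": "left", "bottom": "right"}
        self.p_reward = {"left": 0.8, "right": 0.2}

    def step(self, action):
        port = self.mapping[action]                      # central -> side port
        rewarded = self.rng.random() < self.p_reward[port]
        return port, rewarded

    def reverse_reward(self):
        """The side port more likely to give a drink switches."""
        self.p_reward = {"left": self.p_reward["right"],
                         "right": self.p_reward["left"]}

    def reverse_mapping(self):
        """The mapping between central and side ports reverses."""
        self.mapping = {"top": self.mapping["bottom"],
                        "bottom": self.mapping["top"]}
```

A model-based agent would learn the mapping and the port reward probabilities as separate quantities, letting it re-plan immediately after a reversal; a model-free agent would only track the past payoffs of "top" and "bottom", so the two strategies produce different choice patterns after each switch.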

The animals’ choices as things changed revealed what strategies they were using to learn.

“Model-free and model-based learning should generate different patterns of choices,” said Dr. Akam. “By looking at the subjects’ behavior, we were able to assess the contribution of either approach.”

When the team analyzed the results, about 230,000 individual decisions, they learned that the mice were using model-based and model-free approaches in parallel.

“This confirmed that the task was suitable for studying the neural basis of these mechanisms,” said Dr. Costa. “We then moved on to the next step: investigating the neural basis of this behavior.”

A neural map of model-based learning

The team focused on a brain region called anterior cingulate cortex (ACC).

“Previous studies established that ACC is involved in action selection and provided some evidence that it could be involved in model-based predictions,” Dr. Costa explained. “But no one had checked the activity of individual ACC neurons in a task designed to differentiate between these different types of learning.”

The researchers discovered a tight connection between the activity of ACC neurons and the behavior of the mice. Simply by looking at patterns of activity across groups of cells, the scientists could decode whether a mouse was about to choose one hole or another, for example, or whether it was receiving a drink of water.

In addition to representing the animal’s current location in the task, ACC neurons also encoded which state was likely to come next.

“This provided direct evidence that ACC is involved in making model-based predictions of the specific consequences of actions, not just whether they are good or bad,” said Dr. Akam.

Moreover, ACC neurons also represented whether the outcome of actions was expected or surprising, thereby potentially providing a mechanism for updating predictions when they turn out to be wrong.

The team also turned off ACC neurons while the animals were trying to make decisions. This prevented the animals from responding flexibly as the situation changed, an indicator that they were having trouble using model-based predictions.

Understanding how the brain controls complex behaviors like planning and sequential decision making is a big challenge for contemporary neuroscience.

“Our study is one of the first to demonstrate that it is possible to study these aspects of decision-making in mice,” said Dr. Akam. “These results will allow us and others to build mechanistic understanding of flexible decision making.”

References: Thomas Akam, Ines Rodrigues-Vaz, Ivo Marcelo, Rodrigo Freire Oliveira, Peter Dayan, Rui M. Costa, “The Anterior Cingulate Cortex Predicts Future States to Mediate Model-Based Action Selection”, Neuron, 2020. DOI:

Provided by Zuckerman Institute At Columbia University

Scientists Grow Carbon Nanotube Forest Much Longer Than Any Other (Material Science)

Novel technique yields a carbon nanotube forest of record length, potentially revolutionizing the future of many industries.

Today, a multitude of industries, including optics, electronics, water purification, and drug delivery, innovate at an unprecedented scale with nanometer-wide rolls of honeycomb-shaped graphite sheets called carbon nanotubes (CNTs). Features such as light weight, convenient structure, immense mechanical strength, superior thermal and electrical conductivities, and stability put CNTs a notch above other material alternatives. However, to supply their rising industrial demand, their production must be constantly scaled up, and therein lies the main challenge to using CNTs.

Scientists from Japan have proposed a way to ensure longer catalyst lifetime and higher growth rate, creating a CNT forest that is a record seven times longer than any existing CNT array. ©Waseda University.

While scientists have been able to grow individual CNTs approximately 50 cm in length, when they attempt arrays, or forests, they hit a ceiling at around 2 cm. This is because the catalyst, which is key to CNT growth occurring, deactivates and/or runs out before CNTs in a forest can grow any longer, driving up monetary and raw-material costs of CNT production and threatening to cap its industrial use.

Now, a ceiling-breaking strategy has been devised by a team of scientists from Japan. In their study published in Carbon, the team presents a novel approach to a conventional technique that yields CNT forests of record length: ~14 cm—7 times greater than the previous maximum. Hisashi Sugime, assistant professor at Waseda University, who led the team, explains, “In the conventional technique, the CNTs stop growing due to a gradual structural change in the catalyst, so we focused on developing a new technique that suppresses this structural change and allows the CNTs to grow for a longer period.”

To begin with, the team created a catalyst based on findings from a previous study: they added a gadolinium (Gd) layer to the conventional iron-aluminum oxide (Fe/Al2Ox) catalyst coated onto a silicon (Si) substrate. This Gd layer prevented deterioration of the catalyst to a certain extent, allowing the forest to grow to around 5 cm in length.

Video: Although carbon nanotube forests are hard to grow very long via conventional methods, a little tweak in technique can change things dramatically. (Video/photo credit: Hisashi Sugime, Waseda University)

To further prevent catalyst deterioration, the team placed the catalyst in a chamber of their own design, the cold-gas chemical vapor deposition (CVD) chamber. There, they heated it to 750°C and supplied it with small (parts-per-million) concentrations of room-temperature Fe and Al vapors. This kept the catalyst going strong for 26 hours, in which time a dense CNT forest could grow to 14 cm. Various analyses of the grown CNTs showed that they were of high purity and competitive strength.

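As a rough back-of-envelope calculation (my arithmetic from the figures above, not numbers reported by the team), the average growth rate and the improvement over the previous ceiling follow directly:

```python
length_cm = 14.0         # record forest length
growth_hours = 26.0      # catalyst lifetime in the cold-gas CVD chamber
previous_ceiling_cm = 2.0

rate_mm_per_h = length_cm * 10.0 / growth_hours   # average growth rate
factor = length_cm / previous_ceiling_cm          # vs. the ~2 cm ceiling

print(f"average growth rate ≈ {rate_mm_per_h:.1f} mm/h")  # ≈ 5.4 mm/h
print(f"improvement factor = {factor:.0f}x")              # 7x
```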

This achievement not only overcomes hurdles to the widespread industrial application of CNTs but also opens doors in nanoscience research. “This simple but novel method that drastically prolongs catalyst lifetime by supplying ppm-level vapor sources is insightful for catalyst engineering in other fields such as petrochemistry and nanomaterial crystal growth,” Sugime says. “The knowledge herein could be pivotal to making nanomaterials a ubiquitous reality.”

References: Hisashi Sugime, Suguru Noda et al., “Ultra-long carbon nanotube forest via in situ supplements of iron and aluminum vapor sources”, Carbon, 2020. link:

Provided by Waseda University

Coral Larvae Movement Is Paused In Reaction To Darkness (Biology)

Researchers find a new light-responding behavior that may affect where corals live.

Light is essential for the growth of reef-building corals. This is because corals grow by using the photosynthetic products of the algae living inside their cells as a source of nutrients. The light environment of coral habitats is therefore important for their survival.

A type of reef-building coral, Acropora tenuis. ©NIBB

A new study published in Scientific Reports shows that swimming coral larvae temporarily stop swimming in response to reduced light, especially blue light. Researchers think this behavior may play a role in determining where corals settle.

Corals can move freely only during the larval stage of their lives. Larvae that hatch from eggs swim by moving the cilia on the surface of their bodies. Once a larva settles on the seabed and transforms into a sedentary form (called a “polyp”), it becomes immobile.

How corals, whose growth requires light, select a suitable light environment for survival is a mystery. To solve it, a research team led by Dr. Yusuke Sakai and Professor Naoto Ueno of the National Institute for Basic Biology in Japan closely observed the response of coral larvae to light. They found that coral larvae temporarily stop swimming in response to a decrease in light intensity and subsequently resume swimming at their initial speed.

Upon light attenuation (at the 0 sec mark in the movie), the larvae temporarily stop swimming (from around the 20 sec mark). A certain period after the light attenuation, the larvae resume swimming (from around the 120 sec mark).

Corals mostly spawn once a year. “In collaboration with Andrew Negri, principal investigator at the Australian Institute of Marine Science, and Professor Andrew Baird and his colleagues at James Cook University, we tested corals not only in Japan but also in Australia’s Great Barrier Reef, where coral spawning occurs at a different time than here. This allowed us to repeat the experiment and thus validate our findings,” said Dr. Sakai.

A larva of Acropora tenuis. ©NIBB

The research team then conducted a detailed analysis of the wavelengths of light that coral larvae react to. The Okazaki Large Spectrograph, the world’s largest spectroscopic irradiator at the National Institute for Basic Biology, was used for this experiment. Experiments with coral larvae exposed to various light wavelengths revealed that coral larvae respond strongly to purple to blue light.

How does this pausing behavior in response to light attenuation affect where coral larvae end up? To answer this question, the researchers ran mathematical simulations. The results show that the pause caused by light attenuation, followed by the resumption of swimming, effectively resets the larva's swimming direction when it moves into a dark region, pointing it in a random new direction. The simulations suggest that this leads larvae to gather in brighter areas.
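A minimal random-walk sketch illustrates the idea (a toy model with an assumed one-dimensional geometry, not the study's actual simulation): a larva that pauses and reorients randomly on entering a dark region ends up spending more of its time in the bright region than one that ignores the light change.

```python
import random

def time_in_bright(steps, reset_on_dark_entry, seed=0):
    """1D toy model: a larva swims ballistically between reflecting walls
    at x = 0 and x = 99. The bright region is x < 50, the dark region
    x >= 50. If reset_on_dark_entry is True, the larva picks a random new
    direction when it crosses into the dark region (mimicking the
    step-down photophobic response). Returns the fraction of time spent
    in the bright region."""
    rng = random.Random(seed)
    x, d = 25, 1            # start in the bright region, swimming right
    bright = 0
    for _ in range(steps):
        was_in_bright = x < 50
        x += d
        if x < 0:
            x, d = 0, 1     # reflect off the left wall
        elif x > 99:
            x, d = 99, -1   # reflect off the right wall
        if reset_on_dark_entry and was_in_bright and x >= 50:
            d = rng.choice([-1, 1])  # pause + random reorientation
        if x < 50:
            bright += 1
    return bright / steps

print(time_in_bright(200_000, True))   # ≈ 0.66: accumulates in the bright half
print(time_in_bright(200_000, False))  # ≈ 0.50: no bias without the response
```

Removing the reorientation (the False case) recovers an unbiased 50/50 split, which is the control against which the simulated response can be compared.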

Dr. Sakai said “In cnidarians, including corals, the mechanism of light reception is largely unknown. We would like to clarify the molecular mechanism of light reception in coral larvae, which do not have an eye structure”.

“In the future, it will be important to elucidate not only this phenomenon but also the mysterious ecology of coral at the molecular and cellular levels, such as the mechanism for controlling the spawning time” Professor Naoto Ueno commented.

References: “A step-down photophobic response in coral larvae: implications for the light-dependent distribution of the common reef coral, Acropora tenuis” by Yusuke Sakai, Kagayaki Kato, Hiroshi Koyama, Alyson Kuba, Hiroki Takahashi, Toshihiko Fujimori, Masayuki Hatta, Andrew Negri, Andrew Baird, Naoto Ueno, Scientific Reports, 2020. DOI:

Provided by National Institute Of Natural Sciences

Microbial Space Travel On A Molecular Scale (Astronomy)

Since the dawn of space exploration, humankind has been fascinated by survival of terrestrial life in outer space. Outer space is a hostile environment for any form of life, but some extraordinarily resistant microorganisms can survive. Such extremophiles may migrate between planets and distribute life across the Universe, underlying the panspermia hypothesis or interplanetary transfer of life.

Space traveler Deinococcus radiodurans recovered after 1 year of exposure to low Earth orbit (LEO) outside the International Space Station during the Tanpopo space Mission. © Tetyana Milojevic

The extremophilic bacterium Deinococcus radiodurans withstands the drastic conditions of outer space: galactic cosmic and solar UV radiation, extreme vacuum, temperature fluctuations, desiccation, freezing, and microgravity. A recent study examined the influence of outer space on this unique microbe at the molecular level. After 1 year of exposure to low Earth orbit (LEO) outside the International Space Station during the Tanpopo space mission, researchers found that D. radiodurans escaped morphological damage and produced numerous outer membrane vesicles. Multifaceted proteomic and genomic responses were initiated to alleviate cell stress, helping the bacteria repair DNA damage and defend against reactive oxygen species. Processes underlying transport and energy status were altered in response to space exposure. D. radiodurans used the primordial stress molecule polyamine putrescine as a reactive oxygen species scavenger during regeneration from space exposure.

“These investigations help us to understand the mechanisms and processes through which life can exist beyond Earth, expanding our knowledge of how to survive and adapt in the hostile environment of outer space. The results suggest that survival of D. radiodurans in LEO for a longer period is possible due to its efficient molecular response system, and indicate that even longer, farther journeys are achievable for organisms with such capabilities,” says Tetyana Milojevic, head of the Space Biochemistry group at the University of Vienna and corresponding author of the study.

Together with colleagues from Tokyo University of Pharmacy and Life Sciences (Japan), the Research Group Astrobiology at the German Aerospace Center (DLR, Cologne), the Vienna Metabolomics Centre (ViMe) at the University of Vienna, and the Center for Microbiome Research at the Medical University of Graz, the researchers answered not only to what extent but also how extremophilic microbes can tolerate drastic space conditions.

References: D. Kölbl, E. Rabbow, P. Rettberg, M. Mora, C. Moissl-Eichinger, W. Weckwerth, A. Yamagishi, T. Milojevic “Molecular repertoire of Deinococcus radiodurans after 1 year of exposure outside the International Space Station within the Tanpopo mission.” Microbiome 8, 150 (2020). link:

Provided by University of Vienna

Luminescent Wood Could Light Up Homes Of The Future (Material Science)

The right indoor lighting can help set the mood, from a soft romantic glow to bright, stimulating colors. But some materials used for lighting, such as plastics, are not eco-friendly. Now, researchers reporting in ACS Nano have developed a bio-based, luminescent, water-resistant wood film that could someday be used as cover panels for lamps, displays and laser devices.

When exposed to UV light on the outside, a luminescent wood panel (right) lights up an indoor space (as seen through “windows;” red arrows), whereas a non-luminescent panel (left) does not. ©Adapted from ACS Nano 2020, DOI: 10.1021/acsnano.0c06110

Consumer demand for eco-friendly, renewable materials has driven researchers to investigate wood-based thin films for optical applications. However, many materials developed so far have drawbacks, such as poor mechanical properties, uneven lighting, a lack of water resistance or the need for a petroleum-based polymer matrix. Qiliang Fu, Ingo Burgert and colleagues wanted to develop a luminescent wood film that could overcome these limitations.

The researchers treated balsa wood with a solution to remove lignin and about half of the hemicelluloses, leaving behind a porous scaffold. The team then infused the delignified wood with a solution containing quantum dots — semiconductor nanoparticles that glow in a particular color when struck by ultraviolet (UV) light. After compressing and drying, the researchers applied a hydrophobic coating. The result was a dense, water-resistant wood film with excellent mechanical properties. Under UV light, the quantum dots in the wood emitted and scattered an orange light that spread evenly throughout the film’s surface. The team demonstrated the ability of a luminescent panel to light up the interior of a toy house. Different types of quantum dots could be incorporated into the wood film to create various colors of lighting products, the researchers say.

Provided by American Chemical Society (ACS)

UL Research Reveals Extreme Levels Of Uric Acid Can Significantly Reduce Patient Survival (Medicine)

Extreme values of serum uric acid levels in the blood can markedly reduce a patient’s chance of surviving and reduce their lifespans by up to 11 years, according to a new study by researchers at University of Limerick’s School of Medicine.

©University Of Limerick

In one of the largest studies and the first of its kind in Ireland, researchers found evidence of substantial reductions in patient survival associated with extreme concentrations of serum uric acid (SUA) for both men and women.

The study, which was seed funded by the Health Research Board (HRB), has just been published in the European Journal of Internal Medicine.

“This is the first study to yield detailed survival statistics for SUA concentrations among Irish men and women in the health system,” according to lead author, Dr Leonard Browne, PhD, Senior Research Fellow in Biostatistics at the UL School of Medicine.

“Our key question was to determine whether SUA, a routinely measured blood marker, could help us predict a patient’s lifespan, all else being equal,” Dr Browne added.

To answer this, the research team used data from the National Kidney Disease Surveillance System (NKDSS), based at UL, and created a large cohort of 26,525 patients who entered the Irish health system at University Hospital Limerick between January 1, 2006 and December 31, 2012, following them until December 31, 2013.

Dr Browne said the results were “quite astonishing”.

“For men, the message was quite clear. The median survival was reduced by an average of 9.5 years for men with low levels of SUA (less than 238μmol/L), and 11.7 years for men with elevated SUA levels (greater than 535 μmol/L) compared to patients with levels of 357-416 μmol/L,” he explained.

“Similarly, for women, we found that the median survival was reduced by almost 6 years for those with SUA levels greater than 416 μmol/L, compared to women with SUA in the normal range.”

The shape of the mortality curves was quite different for men and women, according to Dr Browne.

“For men the shape of the association was predominantly U-shaped with optimal survival between 304-454 μmol/L, whereas, for women, the pattern of association was J-shaped with elevated risk of mortality only present for women with SUA levels beyond 409 μmol/L,” he explained.

Professor Austin Stack, Foundation Chair of Medicine at UL's School of Medicine, senior author of the study, Principal Investigator for the NKDSS at UL, and Consultant Nephrologist at UL Hospitals, said there was good evidence that high levels of SUA are associated with a range of serious chronic medical conditions such as kidney failure, hypertension, heart disease, stroke and diabetes.

“These known associations might in part explain the high mortality that we observed for patients with elevated SUA levels in our study,” Professor Stack explained.

“Indeed, when we looked at the cause of death for these patients, we found on one hand that men and women with very high SUA levels died from cardiovascular causes of death.

“On the other hand, and quite surprisingly, we also found that very low levels of SUA were also associated with a higher risk of death primarily in men. This would of course suggest that very low levels of SUA are also detrimental to survival.

“We had speculated that patients with very low levels of SUA might reflect a subgroup that was generally sicker and had poorer nutritional status. However, even when we took these considerations into account in our analysis, low SUA levels still predicted higher death rates in men.

“Interestingly, men who died with low SUA levels had a higher proportion of deaths from cancer – unlike those with high SUA level who had a higher proportion of deaths from cardiovascular disease,” Professor Stack added.

Uric acid is a by-product of the body’s metabolism and is associated with conditions such as heart disease, high blood pressure, stroke, kidney disease, and gout.

Previous work by the research group at UL found that hyperuricaemia is very common and affects about 25% of adults in the health system with a pattern of increasing growth year-on-year.

This current study adds to the body of evidence on the importance of SUA as a major predictor of survival and a potential target for treatment.

“A key consideration is whether we should treat hyperuricaemia and lower SUA levels to a desired target level in order to extend patient survival,” said Professor Stack.

Prospective clinical trials are currently underway using uric acid lowering drugs in order to provide a definitive answer to this question.

Speaking about the results, Dr Mairead O’Driscoll, Chief Executive of the HRB, said: “This study demonstrates the enduring value of having robust datasets in place that have been collected over time. By researching the data, this team at UL and their partners are now making significant discoveries about uric acid on a frequent basis that will help shape treatments for people with conditions like heart disease, stroke and kidney disease.”

References: ‘Serum uric acid and mortality thresholds among men and women in the Irish health system: A cohort study’, by Browne LD, Jaouimaa F, Walsh C, Perez-Ruiz F, is published in the European Journal of Internal Medicine and is available online here: DOI:

Provided by University Of Limerick

Ants Are Skilled Farmers: They Have Solved A Problem That We Humans Have Yet To (Botany)

Fungus-farming ants are an insect lineage that relies on farmed fungus for their survival. In return for tending to their fungal crops–protecting them against pests and pathogens, providing them with stable growth conditions in underground nests, and provisioning them with nutritional ‘fertilizers’–the ants gain a stable food supply.

These fungus farming systems are an expression of striking collective organization honed over 60 million years of fungus crop domestication. The farming systems of humans thus pale in comparison, since they emerged only ca. 10,000 years ago.

A new study from the University of Copenhagen, funded by an ERC Starting Grant, demonstrates that these ants might be one up on us as far as farming skills go: long ago, they appear to have overcome key domestication challenges that we have yet to solve.

“Ants have managed to retain a farming lifestyle across 60 million years of climate change, and leafcutter ants appear able to grow a single cultivar species across diverse habitats, from grasslands to tropical rainforest,” explains Jonathan Z. Shik, one of the study’s authors and an assistant professor at the University of Copenhagen’s Department of Biology.

Through fieldwork in the rainforests of Panama, he and researchers from the Smithsonian Tropical Research Institute studied how fungus-farming ants use nutrition to manage a tradeoff between the cultivar's increasingly specialized production benefits and its rising vulnerability to environmental variation.

Ants as clever farmers

We humans have bred certain characteristics — whether a taste or texture — into our crops.

But these benefits of crop domestication can also result in greater sensitivity to environmental threats from weather and pests, requiring increasing pesticide use and irrigation. Simply put, we weaken plants in exchange for the right taste and yield. Jonathan Z. Shik explains:

“The ants appear to have faced a similar yield-vulnerability tradeoff as their crops became more specialized, but have also evolved plenty of clever ways to persist over millions of years. For example, they became impressive architects, often excavating sophisticated and climate-controlled subterranean growth chambers where they can protect their fungus from the elements,” he says.

Furthermore, these little creatures also appear able to carefully regulate the nutrients used to grow their crops.

To study how, Shik and his team spent over a hundred hours lying on the rainforest floor on trash bags next to ant nests. Armed only with forceps, they stole tiny pieces of leaves and other substrates from the jaws of ants as they returned from foraging trips.

They did this while snakes slithered through the leaf litter and monkeys peered down at them from the treetops.

“For instance, our nutritional analyses of the plant substrates foraged by leafcutter ants show that they collect leaves, fruit, and flowers from hundreds of different rainforest trees. These plant substrates contain a rich blend of protein, carbohydrates and other nutrients such as sodium, zinc and magnesium,” explains Shik. “This nutritional blend can target the specific nutritional requirements of their fungal crop.”

What can we learn from ants?

Over the years, the ants have adapted their leaf collecting to the needs of the fungus — a kind of organic farming, without the benefits of the technological advances that have helped human farmers over the millennia, one might say.

One might wonder, is it possible to simply copy their ingenious methods?

“Because our plant crops require sunlight and must thus be grown above ground, we can’t directly transfer the ants’ methods to our own agricultural practices. But it’s interesting that at some point in history, both humans and ants have gone from being hunter-gatherers to discovering the advantages of cultivation. It will be fascinating to see what farming systems of humans look like in 60 million years,” concludes Jonathan Z. Shik.

References: Shik, J.Z., Kooij, P.W., Donoso, D.A. et al. Nutritional niches reveal fundamental domestication trade-offs in fungus-farming ants. Nat Ecol Evol (2020). link:

Provided by University Of Copenhagen

Chikungunya May Affect Central Nervous System As Well As Joints And Lungs (Biology)

An investigation conducted by an international group of researchers showed that the chikungunya virus can cause neurological infections; the risk of death in the subacute phase is higher for patients with diabetes and significant for young adults.

A study conducted by an international team of researchers with FAPESP’s support shows that infection by chikungunya virus can produce even more severe manifestations than the typical symptoms of the disease, such as acute fever, headache, rash, and intense joint and muscle pain.

The investigation showed that the chikungunya virus can cause neurological infections. The risk of death in the subacute phase is higher for patients with diabetes and significant for young adults. ©William Marciel de Souza.

The analysis was performed by 38 researchers affiliated with the Federal University of Ceará (UFC), the University of São Paulo (USP) and the Ministry of Health in Brazil, and with Imperial College London and Oxford University in the United Kingdom.

Their main discovery was that chikungunya can infect the central nervous system and impair cognitive and motor functions.

“The study produced important new knowledge about the disease and the virus. We not only confirmed that the virus can infect the central nervous system but also found the disease to be more deadly for young adults, rather than children and the elderly as is usually predicted in outbreaks of the disease,” said William Marciel de Souza, co-author of an article on the study published in Clinical Infectious Diseases.

Souza is a researcher at the University of São Paulo’s Ribeirão Preto Medical School (FMRP-USP). “The study also showed that during the acute or subacute phase of the disease [20-90 days after infection] patients with diabetes appear to die seven times more frequently than non-diabetics,” he said.

The study was conducted under the auspices of the Brazil-UK Center for Arbovirus Discovery, Diagnosis, Genomics and Epidemiology (CADDE). It also derived from Souza’s postdoctoral research, part of which he pursued at Oxford University in the UK with FAPESP’s support via a Research Internship Abroad. Researchers affiliated with several different institutions collaborated on the project, which was also supported by Brazil’s National Council for Scientific and Technological Development (CNPq).

Worst outbreak in the Americas

The study was based on a retrospective analysis of clinical and epidemiological data as well as blood, cerebrospinal fluid, and tissue samples from patients who died during the 2017 outbreak in the state of Ceará, Brazil, the worst chikungunya outbreak in the Americas. Ceará notified 194 chikungunya-related deaths and 105,229 suspected cases (1,166 per 100,000 inhabitants) in 2017.
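The outbreak figures quoted above can be sanity-checked with a quick back-of-the-envelope calculation. The sketch below uses only the numbers reported in the article; the implied state population and the crude ratio of deaths to suspected cases are derived quantities, not figures from the study itself.

```python
# Back-of-the-envelope check of the 2017 Ceará outbreak figures.
# Inputs are the numbers quoted in the article; derived values are estimates only.
cases = 105_229          # suspected chikungunya cases notified in 2017
deaths = 194             # chikungunya-related deaths
rate_per_100k = 1_166    # reported incidence per 100,000 inhabitants

# Population implied by the incidence rate: cases / (rate per 100k) * 100,000
implied_population = cases / rate_per_100k * 100_000

# Crude ratio of deaths to suspected cases (not a formal case-fatality rate)
death_ratio = deaths / cases

print(f"Implied population of Ceará: {implied_population:,.0f}")  # ~9.0 million
print(f"Deaths per suspected case: {death_ratio:.2%}")            # ~0.18%
```

The implied population of roughly nine million is consistent with the reported per-capita incidence, which helps confirm the two figures were drawn from the same denominator.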

The researchers used documentation filed during the outbreak by the Ceará State Health Department’s Death Verification Service. To ascertain the cause of death in 100 cases, they analyzed blood serum and cerebrospinal fluid samples using the RT-PCR and MinION genome sequencing techniques, immunohistochemistry, and ELISA assays to detect antibodies against chikungunya.

The virus is transmitted by females of two mosquito species, Aedes aegypti and Aedes albopictus. Most chikungunya patients manifest acute symptoms such as high fever, headache, joint and muscle pain, nausea, fatigue, and rash for three weeks after being infected. Some then progress to the subacute phase, during which the symptoms persist. Joint pain may last more than three months, indicating a transition to the chronic phase, which can last years.

All the evidence gleaned from the laboratory tests and clinical records showed that in most cases of suspected death from chikungunya the patient had a central nervous system infection.

“Joint pain was the most frequent symptom, as evidenced by the name of the disease, which refers to contortion from pain [in the East-African Kimakonde language], but we also identified severe problems in the nervous system due to chikungunya,” Souza said.

Viral RNA was found in cerebrospinal fluid from 36 patients and in four brain tissue samples. “The presence of the virus in the brain tissue of those infected is clear evidence that it’s capable of crossing the blood-brain barrier that protects the central nervous system, and of causing infection in the brain and spinal cord,” Souza said.

Most vulnerable

Besides identifying new characteristics of infection by this virus, the researchers also discovered that the risk of death in the subacute phase was seven times greater for patients with diabetes than for patients without diabetes.

Their autopsy and histopathological analysis of fatal cases pointed to viral infection as the cause of bloodstream disorders and fluid imbalances in the brain, heart, lungs, kidneys, spleen, and liver.

“The study confirmed some previous clinical findings about death from chikungunya and also brought up novel aspects of the disease and its lethality. This new information, obtained in a painstaking analysis of the Ceará outbreak, will contribute to the recognition of the factors that cause severity and also to further research to develop better treatments in future,” said Luiz Tadeu Moraes Figueiredo, a professor at FMRP-USP and also a co-author of the article.

Figueiredo is engaged in research supported by São Paulo Research Foundation – FAPESP on high-throughput sequencing (HTS) to identify and characterize viruses without requiring viral isolation or cell culture. He is particularly interested in MinION genome sequencing, which is faster and more affordable than other approaches. The technology also reads RNA and DNA in real time and in a single stage.

Based on their analysis, the authors of the study concluded that older people and children were not at greater risk of dying from chikungunya than other age groups, in contrast with the profile typical of arbovirus epidemics. In the 2017 outbreak, most of the fatal victims were middle-aged.

“We normally associate arboviruses with hospitalizations and deaths for elderly patients and infected children, but our analysis of these 100 fatal cases showed that a majority [over 60%] of those with infection in the central nervous system were adults aged 40 or more,” Souza said, adding that the remaining fatal victims ranged in age from 3 days to 85 years.

The findings show that defective or suppressed immunity is not necessarily the main source of susceptibility to the disease in such outbreaks. “Many of the victims were healthy young adults under 40, and most had no comorbidities,” he said. “The analysis added another layer to our knowledge of the disease and can be extremely important to clinical practice. Even greater attention should be paid to this age group, which is also at great risk of dying.”

References: Shirlene Telmos Silva de Lima, William Marciel de Souza, John Washington Cavalcante, Darlan da Silva Candido, Marcilio Jorge Fumagalli, Jean-Paul Carrera, Leda Maria Simões Mello, Fernanda Montenegro De Carvalho Araújo, Izabel Letícia Cavalcante Ramalho, Francisca Kalline de Almeida Barreto, Deborah Nunes de Melo Braga, Adriana Rocha Simião, Mayara Jane Miranda da Silva, Rhaquel de Morais Alves Barbosa Oliveira, Clayton Pereira Silva Lima, Camila de Sousa Lins, Rafael Ribeiro Barata, Marcelo Nunes Pereira Melo, Michel Platini Caldas de Souza, Luciano Monteiro Franco, Fábio Rocha Fernandes Távora, Daniele Rocha Queiroz Lemos, Carlos Henrique Morais de Alencar, Ronaldo de Jesus, Vagner de Souza Fonseca, Leonardo Hermes Dutra, André Luiz de Abreu, Emerson Luiz Lima Araújo, André Ricardo Ribas Freitas, João Lídio da Silva Gonçalves Vianez Júnior, Oliver G Pybus, Luiz Tadeu Moraes Figueiredo, Nuno Rodrigues Faria, Márcio Roberto Teixeira Nunes, Luciano Pamplona de Góes Cavalcanti, Fabio Miyajima, Fatal Outcome of Chikungunya Virus Infection in Brazil, Clinical Infectious Diseases, ciaa1038, link:

Provided by São Paulo Research Foundation (FAPESP)