The Hunger Games: Uncovering the Secret of the Hunger Switch in the Brain (Neuroscience)

A 3D structure reveals how a unique molecular switch in our brain causes us to feel full – and may help develop improved anti-obesity drugs

Being constantly hungry, no matter how much you eat, is a daily struggle for people with genetic defects in the brain’s appetite controls, and it often ends in severe obesity. In a study published in Science on April 15, researchers at the Weizmann Institute of Science, together with colleagues from Queen Mary University of London and the Hebrew University of Jerusalem, have revealed the mechanism of action of the master switch for hunger in the brain: the melanocortin receptor 4, or MC4 receptor for short. They have also clarified how this switch is activated by setmelanotide (Imcivree), a drug recently approved for the treatment of severe obesity caused by certain genetic changes. These findings shed new light on the way hunger is regulated and may help develop improved anti-obesity medications.

The MC4 receptor is present in a brain region called the hypothalamus – within a cluster of neurons that compute the body’s energy balance by processing a variety of energy-related metabolic signals. When the MC4 is activated, or “on” – as it normally is – it sends out commands that cause us to feel full, which means that from the brain’s perspective, our default state is satiety. When our energy levels drop, the hypothalamic cluster produces a “time to eat” hormone that inactivates, or turns off, the MC4 receptor, sending out a “become hungry” signal. After we eat, a second, “I’m full” hormone is released. It binds to the same active site on the MC4, replacing the hunger hormone and turning the receptor back on – bringing us back to the satiety default. Mutations that inactivate the MC4 cause people to feel constantly hungry.

MC4 is a prime target for anti-obesity drugs, such as setmelanotide, precisely because it’s a master switch: turning it on can control hunger while bypassing all other energy-related signals. But until now it was unknown how exactly this hunger switch works.

Electron microscopy images representing different views of the MC4 receptor bound to setmelanotide, in complex with proteins activated by the binding © Weizmann Institute of Science

The new study began with the predicament of one family, in which at least eight members, plagued by persistent hunger, were severely obese – most of them with a body mass index of over 70, that is, about triple the norm. Their medical history came to the attention of Hadar Israeli, a medical student pursuing PhD studies into the mechanisms of obesity under the guidance of Dr. Danny Ben-Zvi at the Hebrew University of Jerusalem. Israeli was struck by the fact that the family’s plight was due to a single mutation that ran in the family: one affecting the MC4 receptor. She turned to Dr. Moran Shalev-Benami of Weizmann’s Chemical and Structural Biology Department, asking whether new advances in electron microscopy could help explain how this particular mutation could produce such a devastating effect.

MC4 is a prime target for anti-obesity drugs precisely because it’s a master switch: turning it on can control hunger while bypassing all other energy-related signals

Shalev-Benami decided to launch a study into the structure of the MC4 receptor, inviting Israeli to join her lab as a visiting scientist. Together with Dr. Oksana Degtjarik, a postdoctoral fellow in the lab, Israeli isolated large quantities of pure MC4 receptor from cell membranes, let it bind with setmelanotide and determined its 3D structure using cryogenic electron microscopy. The study was conducted in collaboration with the teams of Dr. Peter J. McCormick from Queen Mary University of London and of Prof. Masha Y. Niv from the Hebrew University of Jerusalem.

The 3D structure revealed that setmelanotide activates the MC4 receptor by entering its binding pocket – that is, by directly hitting the molecular switch that signals satiety, even more potently than the natural satiety hormone. It also turned out that the drug has a surprising helper: an ion of calcium that enters the pocket, enhancing the drug’s binding to the receptor. In biochemical and computational experiments, the scientists found that similarly to the drug, calcium also assists the natural satiety hormone.

McCormick: “Calcium helped the satiety hormone activate the MC4 receptor while interfering with the hunger hormone and reducing its activity.”

“This was a truly unexpected finding,” Shalev-Benami says. “Apparently, the satiety signal can successfully compete with the hunger signal because it benefits from the assistance of calcium, which helps the brain restore the ‘I’m full’ sensation after we eat.”

3D structure showing a setmelanotide molecule (pink) and a calcium ion (green) in the binding pocket of the MC4 receptor © Weizmann Institute of Science

MC4’s structure also revealed that the drug’s entry causes structural changes in the receptor; these changes appear to initiate the signals within the neurons that lead to the sensation of fullness. The study has explained how mutations in the MC4 receptor can interfere with this signaling, leading to never-ending hunger and ultimately obesity.

Moreover, the scientists have identified hotspots that crucially distinguish MC4 from similar receptors in the same family. This should make it possible to design drugs that will bind only to MC4, avoiding side effects that may be caused by interactions with other receptors.

3D structure of a complex formed by the MC4 receptor (blue) and several proteins that it activates, with a setmelanotide molecule (pink) and a calcium ion (green) in the MC4’s binding pocket © Weizmann Institute of Science

“Our findings can help develop improved and safer anti-obesity drugs that will target MC4 with greater precision,” Shalev-Benami says.

Study participants included Dr. Fabrizio Fierro of the Hebrew University of Jerusalem; Vidicha Chunilal, Amandeep Kaur Gill, Nicolas J. Roth, Dr. Joaquin Botta and Dr. Li F. Chan of Queen Mary University of London; Dr. Vadivel Prabahar from Weizmann’s Chemical and Structural Biology Department; and Dr. Yoav Peleg of Weizmann’s Life Sciences Core Facilities Department.

Dr. Moran Shalev-Benami’s research is supported by the Tauro Career Development Chair in Biomedical Research; the Ilse Katz Institute for Material Sciences and Magnetic Resonance Research; the Zuckerman STEM Leadership Program; the Joseph and Wolf Lebovic Lab; and the Abisch Frenkel Foundation for the Promotion of Life Sciences.

Featured image: (l-r) Dr. Oksana Degtjarik, Dr. Moran Shalev-Benami and Hadar Israeli © Weizmann Institute of Science


Reference: Hadar Israeli, Oksana Degtjarik, Fabrizio Fierro, Vidicha Chunilal, Amandeep Kaur Gill, Nicolas J. Roth, Joaquin Botta, Vadivel Prabahar, Yoav Peleg, Li F. Chan, Danny Ben-Zvi, Peter J. McCormick, Masha Y. Niv, Moran Shalev-Benami, “Structure reveals the activation mechanism of the MC4 receptor to initiate satiation signaling”, Science, 15 Apr 2021, eabf7958. DOI: 10.1126/science.abf7958


Provided by Weizmann Institute of Science

Slow Synchronization Keeps Heart Cells Beating in Time (Biology)

For a beating heart cell, noise is a problem. Researchers showed that single cells can regulate their inner noise

If you dance cheek-to-cheek with a partner, your rhythms soon synchronize, so you move smoothly across the floor. Heart cells that beat – cardiomyocytes – are the same. In fact, if you grow embryonic heart cells on a flexible material, you can not only observe this synchronization, you can get these cells to change the rhythm of their beats from “rhumba” to “waltz” or vice versa, just by jiggling them so they “feel” a new pace. But you’ll have to keep the jiggling up for tens of minutes to get them to make the transition, and once you stop, it will take the cells a similar amount of time to return to their normal pulse.

Video: Beating cardiomyocytes in the lab of Prof. Shelly Tzlil, the Technion

That finding arose from experiments done in the lab of Prof. Shelly Tzlil, of the Technion – Israel Institute of Technology. “This was puzzling,” says Prof. Sam Safran, a theoretical physicist at the Weizmann Institute of Science’s Chemical and Biological Physics Department. Together with his PhD student Ohad Cohen, Safran had previously shown that the spontaneous beating of cardiomyocytes – which occurs at a frequency of about once per second – could be described using an analogy to a mechanical system akin to a vibrating spring.

The puzzle was this: If, for the spring-like oscillations, a beating cycle takes place about every second, why is the time needed to get it to change its pace so much longer? Over an hour, the cell would contract 3,600 times – that is, if it were at a dance, a cell would keep waltzing while its partner jitterbugged until it was nearly time to leave. If the only characteristic of the spontaneously beating cardiomyocyte were that spring-like mechanism, the physics of such a system would dictate that it would react within the short timescale of the vibration, and new rhythms would be adopted within several seconds, not many minutes. “When such a mismatch in timing happens, you first examine the intrinsic properties of the system itself – even in the absence of forces that change its pace – and ask if a different characteristic timescale is involved,” says Safran.

Prof. Sam Safran (sitting) and Ohad Cohen © Weizmann Institute of Science

Safran and Cohen suspected that a possible longer timescale might have to do with the intrinsic response of a spontaneously beating cell to the noise – the “fuzziness” often seen in data from living systems – that was noted in the previous experiments. That is, cardiomyocytes beat about once a second – sometimes in nine-tenths of a second, sometimes waiting an extra tenth of a second to contract. Could an additional, slower mechanism be functioning in the cells to keep that noise close to the “ideal” one-second mark?

To test this idea, Tzlil grew cardiomyocytes in her lab and, working in collaboration with Cohen and Safran, timed the natural, unperturbed beating of individual cells for several hours. The researchers thus obtained “cardiograms” of these single cells. Viewing the data over the course of a minute or so, they plotted each contraction’s deviation – plus or minus – from the “ideal” average timing of one second. More challenging was the plotting of beats over hours, rather than minutes, but these plots revealed a new pattern: the pulses that were slower or faster than average appeared in bunches – many slower beats, followed by many faster beats. The timescale associated with these bunches was tens of minutes, a far cry from the natural one-second timing of an “average” beat.

It was as if a somewhat clumsy, invisible hand were continually adjusting the knob on a staticky radio station, turning the frequency up and down in an attempt to hit the exact right interval. This pattern of regulation occurred over a span of ten to thirty minutes. When the team then compared the regulation from different lab-grown cardiomyocytes, after first mathematically accounting for the differences in their ranges (so that a regulation cycle taking, say, fifteen minutes could be compared with one taking twenty-five), they found that the graphs looked very similar, implying that such a slow, noise-regulating mechanism is intrinsic to the functioning of all of these heart cells, no matter how fast or slow they beat.
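
A toy model makes the two timescales concrete. The sketch below is an illustration of the idea, not the team’s actual analysis; the regulation timescale and noise magnitudes are assumed values chosen only to show the effect. It superimposes fast beat-to-beat jitter on a slow drift that relaxes back toward the one-second target only over tens of minutes – producing the “bunches” of slow and fast beats described above:

```python
import random

random.seed(0)

TARGET = 1.0        # "ideal" beat interval, in seconds
TAU_SLOW = 1200.0   # assumed slow regulation timescale (~20 minutes)
FAST_NOISE = 0.05   # beat-to-beat jitter, seconds
SLOW_NOISE = 0.002  # per-beat kick driving the slow drift, seconds

drift = 0.0
intervals = []
for _ in range(10_000):  # roughly three hours of beats
    # Slow feedback: the drift relaxes toward zero over TAU_SLOW while
    # being randomly kicked, so slow and fast stretches come in bunches.
    drift += -drift / TAU_SLOW + random.gauss(0.0, SLOW_NOISE)
    intervals.append(TARGET + drift + random.gauss(0.0, FAST_NOISE))

# Minute-by-minute averages wander slowly around the one-second mark,
# while the beat-to-beat jitter largely averages out.
minute_means = [sum(intervals[i:i + 60]) / 60 for i in range(0, 9_960, 60)]
print(min(minute_means), max(minute_means))
```

Averaged minute by minute, the simulated intervals wander well beyond what beat-to-beat jitter alone would produce, mimicking the slow oscillation seen in the single-cell cardiograms.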

(l) Measured over a few minutes, cardiomyocyte beats deviate up and down from the one-second mark. (r) But seen over several hours, a longer oscillation around the one-second goal can be seen © Weizmann Institute of Science

How did the heart cell “know” that it was beating too quickly or slowly, and why did it take it so long to compensate for faster or slower beating by introducing the opposite pace? “Though we still don’t know the detailed biochemistry that accounts for this long regulation time,” says Safran, “its scale of tens of minutes suggests that gene transcription and protein translation are involved, possibly to modify the channels and pumps that are involved in the transport of calcium within the cell.” Further experiments may reveal whether the duration of the regulation time is an indicator of cellular “good health”; too long a regulation time might mean that the cell is not oscillating properly over longer intervals. Extending the research to heart tissue may reveal the functional importance of overly long regulation times to the integrity of the beating tissue and perhaps in the future, to the heart itself.

“We showed, once again, that even as heart cells are noisy – in the manner of all biological systems – the noise is interesting; using physics to analyze its patterns reveals how the cell attempts to deal with its imperfect behavior,” he adds.

Prof. Samuel Safran’s research is supported by the Benoziyo Endowment Fund for the Advancement of Science; the Henry Krenter Institute for Biomedical Imaging and Genomics; and the Harold Perlman Family. Prof. Safran is the incumbent of the Fern and Manfred Steinfeld Professorial Chair.


Provided by Weizmann Institute of Science

Researchers Hope AI Can Help Diagnose Depression More Accurately (Psychiatry)

Project taps into big data to determine the factors involved and develop a screening tool.

A new University of Alberta project aims to develop an AI-based screening tool to help doctors diagnose depression more precisely.

Depression affects millions of Canadians. It can affect quality of life, damage relationships, lower productivity and lead to suicide. A proper diagnosis is key to effective treatment, but making a precise diagnosis can be difficult because there are no biological tests and symptoms vary. 

“We don’t have a clear picture of exactly where depression emerges, although researchers have made substantial progress in understanding the biological underpinnings of depression,” said project leader Bo Cao, an assistant professor in the U of A’s Department of Psychiatry, Canada Research Chair in Computational Psychiatry and member of the Women and Children’s Health Research Institute.

“We know there are genetic and brain components but there could be other clinical, social and cognitive factors that can facilitate the precision diagnosis of depression.”

The project, backed by seed funding from a Precision Health Seed Fund Award, brings together scientists from Canada and the U.K. with expertise in computational psychiatry, artificial intelligence, psychology and cognitive neuroscience. 

Using data from the U.K. Biobank, a biomedical database that contains genetic and health information for half a million people in the United Kingdom, the researchers will be able to access health records, brain scans, social determinants and personal factors for more than 8,000 individuals diagnosed with major depressive disorder (MDD). Researchers will compare their profiles with a control group of more than 200,000 people who have not had a diagnosis of depression. This will help determine whether MDD can be identified through social, personal and health records, and when genetic and MRI data are necessary to improve the diagnosis. 
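
The workflow described here – training a model to find patterns that separate diagnosed individuals from controls, then testing it on held-out cases – can be sketched on synthetic data. Everything below is illustrative: the three numeric features are made-up stand-ins for health-record variables, and a simple nearest-centroid rule stands in for whatever models the project actually develops:

```python
import random

random.seed(1)

# Hypothetical toy features standing in for health-record variables;
# the real study draws on UK Biobank records, scans and genetics.
def make_person(depressed):
    base = 1.0 if depressed else 0.0
    return ([random.gauss(base, 1.0) for _ in range(3)], depressed)

data = [make_person(d) for d in [True] * 200 + [False] * 200]
train, test = data[::2], data[1::2]   # split cases into train and held-out sets

# Nearest-centroid "model": the simplest possible pattern-finder.
def centroid(rows):
    return [sum(col) / len(rows) for col in zip(*rows)]

pos = centroid([x for x, y in train if y])
neg = centroid([x for x, y in train if not y])

def predict(x):
    d_pos = sum((a - b) ** 2 for a, b in zip(x, pos))
    d_neg = sum((a - b) ** 2 for a, b in zip(x, neg))
    return d_pos < d_neg   # closer to the "depressed" centroid?

accuracy = sum(predict(x) == y for x, y in test) / len(test)
print(f"held-out accuracy: {accuracy:.2f}")
```

The held-out accuracy is the kind of yardstick the team will use over the next 18 months to judge whether social, personal and health records alone can identify MDD, or whether genetic and MRI data are needed.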

The team will develop and test a prototype of the machine learning tool over the next 18 months. If it proves effective, the model will be applied to Alberta health data to verify its effectiveness.

Computing science professor Russ Greiner is contributing his expertise in applying AI to scour data for patterns that could help identify mental health issues. (Photo: Supplied)

“Machine learning finds patterns in data,” explained collaborator Russ Greiner, professor in the Department of Computing Science and adjunct professor in the Department of Psychiatry, who was recently named as a Canada CIFAR AI Chair. In the last several years, his research has focused on using computational methods to help identify psychiatric problems, including attention deficit hyperactivity disorder, schizophrenia, autism and now depression.

Greiner says he is grateful to be in Alberta, where there is strong support for machine learning research. He helped start the Alberta Machine Intelligence Institute almost 20 years ago. It receives more than $2 million a year from the Alberta government for AI research.

Cao and Greiner, who are both members of the U of A’s Neuroscience and Mental Health Institute, are optimistic that advances in AI will lead to breakthroughs that help doctors diagnose mental illnesses and find the right treatment for each patient. The research is important—according to the Statistics Canada Community Health Survey on Mental Health, more than 11 per cent of Canadian adults will experience depression in their lifetimes.

“It will be a long journey,” said Cao. “Our goal is to provide precision medicine in mental health, but that’s going to take decades. However, we dare to work toward this goal now with the support of our university and other visionary philanthropists and agencies.”

Find out more about the innovative precision health research happening at the U of A.

Featured image: Psychiatry professor Bo Cao is leading a new project drawing on health data to develop an AI-based screening tool that could help doctors diagnose depression more precisely. (Photo: Ross Neitz)


Provided by University of Alberta

Why COVID-19 Could Be Causing Blood Clots—and What You Can Do to Lower Your Risk (Medicine)

As if the breathing complications associated with COVID-19 aren’t worrisome enough, doctors are discovering another risk posed by the coronavirus: blood clots that can lead to life-threatening strokes, heart attacks and pulmonary embolism. 

As COVID-19 traveled across Europe and hit hard in New York City, word began to spread of patients riddled with clots in their brains, hearts, lungs and legs—and sometimes all over. In Los Angeles, doctors had to amputate the right leg of a Broadway star because of severe clotting. Medical staff at The Ohio State University Wexner Medical Center began seeing blood clots in some of their COVID-19 patients too.

“It’s very scary for a patient and it’s alarming for a medical center too,” said Danielle Blais, PharmD, a specialty practice pharmacist in cardiology at the Ohio State Richard M. Ross Heart Hospital. “We called in experts from so many different disciplines to figure out how best to treat these patients, and we continue to learn more every day.”

Blood clots are a serious condition: Untreated, they can cause damage to your brain, heart and lungs. Death or long-term complications are a real concern.

While the health care community is still learning the ways COVID-19 attacks the body, it appears that a few factors are causing the increased risk of clots, said Matthew Exline, MD, medical director of the medical intensive care unit at the Ohio State Wexner Medical Center.

Blood clotting factors 

First, COVID-19 can cause severe inflammation, which can trigger your clotting system. 

“When you, say, fall and skin your knee, it turns your immune system on, and one of the ways your immune system reacts to an injury is by making your clotting system more active,” Exline said. “It kind of makes sense that your body would say, if I see an infection, I need to be ready to clot. But when the infection is as widespread and inflammatory as COVID-19, that tendency to clot can become dangerous.” 

And when you’re sick with COVID-19 or following stay-at-home or quarantine orders, you probably aren’t moving much.

“If you’re immobile, you have an increased risk factor for blood clots,” Exline said. 

Paired together, inflammation and immobility create a near perfect environment for blood clots in your legs and lungs, Exline said. Patients with severe cases of COVID-19 seem especially susceptible, as do those with other health risk factors such as cancer, obesity and a history of blood clots.

Blood clot treatment 

Knowing this, health care providers have changed the way they treat COVID-19 patients to specifically address the risk of clotting. It’s taken quick, widespread collaboration. At the Wexner Medical Center, specialty practice pharmacists along with critical care medicine, cardiology, hematology, emergency medicine and internal medicine doctors developed guidelines on how to manage these patients, Blais said. 

“We’ve done the amount of work that some people would take a year or two to put together in a matter of weeks,” she said.

Now, patients who are sick enough from COVID-19 to go to the hospital receive blood tests to gauge the activity of their clotting systems. Recent studies have demonstrated that patients with COVID-19 are prone to clotting, but patients in the ICU may also be at risk for bleeding.

“Health care providers must carefully weigh the risks and benefits of anticoagulation for each individual patient,” Exline says.

Those whose clotting systems aren’t particularly active receive treatments to prevent clots such as compression socks, inflatable cushions for their calves or small injections of blood thinners. Those with more active clotting systems receive full doses of blood thinners if they’re not at a high bleeding risk.

“We’re having to be thoughtful about our approach with treatment, especially because there is limited data in COVID-19 patients,” said Tiffany Ortman, PharmD, a specialty practice pharmacist in outpatient care at the Ross Heart Hospital. 

After patients leave the hospital, health care providers continue to monitor patients for clotting symptoms and lower their risk through medications. Some currently active research studies are attempting to understand how long patients should stay on anticoagulation medication as they recover from COVID-19.  

Blood clots and COVID-19 vaccination

Recent news from both Europe and the United States has raised concerns about blood clots after COVID-19 vaccination. It’s important to note that the most common vaccines in the U.S. — the Pfizer and Moderna vaccines — have not been found to have a high risk of blood clots.

Blood clots have been reported in Europe after the AstraZeneca vaccine and in the U.S. with the Johnson & Johnson vaccine, but these incidents have been extremely rare, Exline notes. 

“There may be an extremely low risk of blood clots with one type of COVID-19 vaccine, but you’re more at risk for injury driving to your vaccine appointment than from any side effect from the vaccine itself,” Exline says. “It’s important to remember that the risk of blood clots from a COVID-19 infection is much more likely than any side effect of a vaccine. If you want to protect yourself from blood clots, get vaccinated.”

Who’s most at risk for blood clots, and what to look for

While those with severe cases of COVID-19 appear to be more affected by blood clots, those who don’t come to the hospital could still be at risk, said Aaron Dush, PharmD, a specialty practice pharmacist at the James Cancer Hospital and Solove Research Institute. 

They, as everyone, should monitor for signs of clots and possible stroke or heart attack: 

  • facial drooping 
  • weakness of one arm or leg
  • difficulty speaking 
  • new swelling, tenderness, pain or discoloration in the arms or legs 
  • sudden shortness of breath 
  • chest pain or pain radiating to the neck, arms, jaw or back

Call 911 if you’re experiencing concerning symptoms.

And Dush’s biggest recommendation for those with COVID-19 at home: Keep moving. Stay hydrated. When you are seated, try to keep your legs elevated. 

Keep the blood, quite literally, flowing.


Provided by Ohio State University Wexner Medical Center

What Are Some Acute Stroke Symptoms in Women? (Medicine)

Stroke is a medical emergency. Stroke occurs when blood flow to the brain is blocked (causing tissue to be injured) or when a blood vessel bursts, causing bleeding. The lifetime risk of stroke for women between the ages of 55 and 75 in the United States is 1 in 5, according to a report by the American Heart Association. Traditional warning signs of stroke include the sudden onset of the following symptoms:

  • Weakness or numbness on one side of the body
  • Speech difficulty
  • Difficulty with vision
  • Trouble walking or loss of balance
  • Unexplained severe headache

An easy way to remember these signs is the acronym BE FAST (Balance, Eyes, Face, Arms, Speech, Time). If you or a loved one is experiencing stroke symptoms, call 911 and go to the hospital immediately. It’s important to know that women may present differently than men during a stroke. In fact, researchers have found that women are more likely than men to present with non-traditional symptoms, including:

  • Pain
  • Headache
  • Fatigue
  • Altered mental status or confusion

The fact that women may present with non-classic stroke symptoms is important to recognize because, overall, women have a higher lifetime risk of stroke than men. Why are women at higher risk of stroke than men? There are some potential factors that may place women at higher risk:

  • Age. Women tend to live longer than men overall, and increasing age is a risk factor for stroke.
  • Pregnancy. Women may have an increased risk of stroke during pregnancy and delivery. Pre-eclampsia, a condition associated with high blood pressure during pregnancy, is associated with an increased risk of stroke.
  • Birth control pills/Hormone replacement therapy. The use of hormones may increase the risk of stroke in women.
  • Atrial fibrillation. Atrial fibrillation risk increases with age, and because women tend to live longer than men, more women develop the condition.

Women not only have a higher risk of stroke than men, they also have a higher risk of death from stroke. In fact, women account for about 60% of stroke deaths, and stroke kills twice as many women as breast cancer.

Women are at higher risk for stroke than men overall, but the risk of stroke varies with race among women as well. According to a report by the Centers for Disease Control and Prevention, African American women have a higher risk and are more likely to die from stroke than white women.

Learn more about these indicators and risk factors by visiting the Comprehensive Stroke Center at The Ohio State University Wexner Medical Center. Watch this video below to learn more about BE FAST.

Vivien Lee is a neurologist and medical director of the Comprehensive Stroke Center at The Ohio State University Wexner Medical Center and a professor at The Ohio State University College of Medicine.

Featured image: Women and Stroke infographic © Benjamin et al.


Reference: Emelia J. Benjamin et al., “Heart Disease and Stroke Statistics—2019 Update: A Report From the American Heart Association”, https://doi.org/10.1161/CIR.0000000000000659 Circulation. 2019;139:e56–e528


Provided by Ohio State University Wexner Medical Center

Getting a COVID-19 Vaccine Isn’t Just About You — it Protects Others in Your Life (Medicine)

According to recent data, both the Pfizer and Moderna COVID-19 vaccines have been effective not just in preventing severe symptoms in people who get the vaccines — they also effectively prevent infection from COVID-19 in the first place.

This means that those who get the Pfizer and Moderna vaccines are unlikely to pass COVID-19 to others in the community. That’s important because it significantly cuts down on the spread of the virus, helping protect others who, for one reason or another, are more likely to develop deadly symptoms if they were to contract COVID-19.

How do we know COVID-19 vaccines can reduce virus spread?

Recent studies, including a study published March 29 in Morbidity and Mortality Weekly Report, show that Pfizer and Moderna vaccines (messenger RNA, or mRNA, vaccines) are 90% effective against COVID-19 infection in real-world conditions once you’re fully immunized (14 or more days after your second dose).

The mRNA vaccines are also 80% effective against COVID-19 infection when you’re partially immunized (less than 14 days after first dose and before the second dose).
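
For readers wondering where a figure like “90% effective” comes from: effectiveness compares the infection rate in vaccinated people with the rate in unvaccinated people. The counts below are made up purely to show the arithmetic; the real estimates come from the cohort data in the MMWR study:

```python
# Hypothetical counts, chosen only to illustrate the arithmetic.
vaccinated_infections, vaccinated_person_days = 3, 100_000
unvaccinated_infections, unvaccinated_person_days = 30, 100_000

rate_vaccinated = vaccinated_infections / vaccinated_person_days
rate_unvaccinated = unvaccinated_infections / unvaccinated_person_days

# Effectiveness = 1 - (infection rate in vaccinated / rate in unvaccinated)
effectiveness = 1 - rate_vaccinated / rate_unvaccinated
print(f"{effectiveness:.0%}")  # → 90%
```

In words: if vaccinated people get infected at one-tenth the rate of unvaccinated people, the vaccine is 90% effective against infection.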

Why this is important

We continue to see data to support that current, FDA-authorized mRNA COVID-19 vaccines are not only safe and effective at preventing COVID-19 symptoms, they are a vital key to controlling the pandemic and preventing spread among communities.

What it means for you

Getting vaccinated against COVID-19 is similar to getting vaccinated for other diseases in that it’s as much about protecting your community as it is about protecting yourself. 

Perhaps your immune system is strong and healthy enough that a measles or flu diagnosis is unlikely to threaten your life. But if you contracted one of those diseases and unintentionally passed it to your 90-year-old grandmother or to your newborn niece who’s too young to be vaccinated, would they be able to fight the virus as easily?

Most people 16 and older in the United States should get vaccinated against COVID-19 so that we can control the pandemic as soon as possible. If you’re unsure about whether it’s right for you or you have more questions about the vaccines, please talk with your primary care provider or visit our COVID-19 Vaccine FAQ page.

Iahn Gonsenhauser is the chief quality and patient safety officer at The Ohio State University Wexner Medical Center and an assistant professor in the Ohio State College of Medicine.


Provided by Ohio State University Wexner Medical Center

Stimulating the Immune System With Sponges Made of Silica (Medicine)

Silica nanoparticles developed by a team from the UNIGE and Ludwig-Maximilian University of Munich have significantly increased the effectiveness and precision of immunotherapies.

Immunotherapies are increasingly used to fight cancer and aim to stimulate the immune system to defend itself by destroying tumour cells. While these treatments are often effective, their significant impact on the body can generate severe side effects. In order to increase their precision and limit undesirable side effects, a team from the University of Geneva (UNIGE) and the Ludwig-Maximilian University of Munich (LMU) has developed silica nanoparticles with a very precise opening mechanism that can transport a drug exactly to where it should act. These microscopic vehicles could possibly be used not just for cancer treatment, but also to deliver other drugs at the very heart of our immune system, thus paving the way for entirely new therapeutic or preventive strategies. These results can be read in the journal ACS Nano.

In medicine, nanoparticles are used to encapsulate a drug in order to protect it: indeed, their nanosize allows them to be taken up by dendritic cells, the body’s first line of defence. “The function of dendritic cells is to phagocytose foreign elements to bring them to the lymph nodes and thus trigger the immune response”, explains Carole Bourquin, professor at the UNIGE Faculties of Medicine and Science, who led this research. “We are taking advantage of this mechanism to have these cells transport a drug encapsulated into nanoparticles, which thus reaches the lymph nodes directly, where the immune response is initiated.”  


Silica, a material with multiple properties

Although nanoparticles are already used in certain treatments — the most recent example being the messenger RNA vaccines against Covid-19 — the system can still be improved. “Medical nanoparticles are generally composed of polymers or lipids”, says Julia Wagner, a PhD student in Professor Bourquin’s laboratory and the first author of this work. “In some cases, however, the solubility of the substance to be transported is incompatible with the characteristics of the nanoparticles. This makes it impossible to load the particles with the drug.”

The scientists therefore turned to silica, a mineral that can be found naturally in the environment. “Silica nanoparticles are like little sponges with cavities that can easily be filled and whose properties can be modified to better match those of the drug”, explains Julia Wagner. “The anti-tumour drug we used, for example, had already been tested with other particles, but it often leaked out too quickly.”


A lid that only opens in the right place

To further improve the performance of their particles, the research team added a lid that covers the drug-laden cavities and prevents the drug from escaping during transport. “The lid reacts according to the pH of its environment: when the particles are circulating in the blood, which has a neutral pH of around 7.4, it remains firmly in place. But once the particles are taken up by the dendritic cells, they arrive in vesicles inside the cell whose pH is acidic. Then the lid comes off and the drug is released”, reports Carole Bourquin.
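The pH-gated behaviour described above can be sketched in a few lines of code. This is a toy illustration only: the opening threshold and release fractions below are made-up assumptions for clarity, not values from the study.

```python
# Minimal sketch of the pH-responsive lid: sealed at the neutral pH of blood,
# open in the acidic vesicles of a dendritic cell. All thresholds and release
# fractions here are illustrative assumptions, not measured values.

BLOOD_PH = 7.4          # neutral pH during circulation (from the article)
ENDOSOMAL_PH = 5.0      # typical acidic pH inside intracellular vesicles
LID_OPENS_BELOW = 6.0   # assumed pH threshold at which the lid detaches

def lid_is_open(ph: float) -> bool:
    """The lid stays closed at neutral pH and comes off in acidic conditions."""
    return ph < LID_OPENS_BELOW

def released_fraction(ph: float) -> float:
    """Fraction of cargo released: near zero while sealed, high once open."""
    return 0.9 if lid_is_open(ph) else 0.02  # illustrative numbers

print(lid_is_open(BLOOD_PH))      # False: drug stays sealed in the bloodstream
print(lid_is_open(ENDOSOMAL_PH))  # True: drug is released inside the cell
```

The point of the design is captured by the two calls at the end: the same particle behaves as a sealed container in the blood and as an open sponge inside the target cell.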

This technical prowess ensures the high precision of the treatment: the seal maintains the integrity of the drug, and therefore its duration of action, while preventing it from spreading in the body, thus reducing undesirable side effects. Indeed, some drugs stimulate the immune system extremely strongly, but disappear within a few hours, requiring repeated administration of high doses. “With our nanoparticles, the drug can take effect up to six times longer, which would make it possible to administer lower and better tolerated doses”, say the authors. Their work provides a proof of concept of the mechanism governing these nanoparticles, which could be used against cancer as well as against other diseases, or as part of preventive or therapeutic vaccines. “Our work will now continue in order to confirm these initial results and to validate them with a wider range of anti-tumour drugs.”

Featured image: Confocal microscope image of immune cells (in red). In green, silica nanoparticles. The nanoparticles, once absorbed by the immune cells, appear in yellow. © UNIGE, Carole Bourquin


Reference: Julia Wagner, Dorothée Gößl, Natasha Ustyanovska, Mengyao Xiong, Daniel Hauser, Olga Zhuzhgova, Sandra Hočevar, Betül Taskoparan, Laura Poller, Stefan Datz, Hanna Engelke, Youssef Daali, Thomas Bein, and Carole Bourquin, “Mesoporous Silica Nanoparticles as pH-Responsive Carrier for the Immune-Activating Drug Resiquimod Enhance the Local Immune Response in Mice”, ACS Nano 2021, 15, 3, 4450–4466.
https://doi.org/10.1021/acsnano.0c08384


Provided by University of Geneva

Electrifying Cement With Nanocarbon Black (Material Science)

A collaboration between MIT and CNRS has yielded a cement that conducts electricity and generates heat.

Since its invention several millennia ago, concrete has become instrumental to the advancement of civilization, finding use in countless construction applications — from bridges to buildings. And yet, despite centuries of innovation, its function has remained primarily structural.

A multiyear effort by MIT Concrete Sustainability Hub (CSHub) researchers, in collaboration with the French National Center for Scientific Research (CNRS), has aimed to change that. Their collaboration promises to make concrete more sustainable by adding novel functionalities — namely, electron conductivity. Electron conductivity would permit the use of concrete for a variety of new applications, ranging from self-heating to energy storage.

Their approach relies on the controlled introduction of highly conductive nanocarbon materials into the cement mixture. In a paper in Physical Review Materials, they validate this approach while presenting the parameters that dictate the conductivity of the material. 

Nancy Soliman, the paper’s lead author and a postdoc at the MIT CSHub, believes that this research has the potential to add an entirely new dimension to what is already a popular construction material.

“This is a first-order model of the conductive cement,” she explains. “And it will bring [the knowledge] needed to encourage the scale-up of these kinds of [multifunctional] materials.” 

From the nanoscale to the state-of-the-art

Over the past several decades, nanocarbon materials have proliferated due to their unique combination of properties, chief among them conductivity. Scientists and engineers have previously proposed the development of materials that can impart conductivity to cement and concrete if incorporated within.

By running current through this mortar sample made with nanocarbon-doped cement, Chanut and Soliman were able to warm it to 115 F (see thermometer display on the right) © Andrew Logan

For this new work, Soliman wanted to ensure the nanocarbon material they selected was affordable enough to be produced at scale. She and her colleagues settled on nanocarbon black — a cheap carbon material with excellent conductivity. They found that their predictions of conductivity were borne out.

“Concrete is naturally an insulative material,” says Soliman, “But when we add nanocarbon black particles, it moves from being an insulator to a conductive material.”

By incorporating nanocarbon black at just a 4 percent volume of their mixtures, Soliman and her colleagues found that they could reach the percolation threshold, the point at which their samples could carry a current.
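The percolation behaviour described above can be illustrated with the classic power-law model for composite conductivity. The 4 percent threshold comes from the article; the prefactor and exponent below are generic assumptions used only to sketch the shape of the transition.

```python
# Toy percolation model: below a critical filler volume fraction the composite
# is insulating; above it, conductivity follows sigma ~ (phi - phi_c)^t.
# phi_c = 0.04 (4% nanocarbon black) comes from the article; SIGMA_0 and
# T_EXP are illustrative assumptions, not fitted values from the paper.

PHI_C = 0.04   # percolation threshold: 4% by volume
T_EXP = 2.0    # conductivity exponent (assumed)
SIGMA_0 = 1.0  # conductivity prefactor, S/m (assumed)

def conductivity(phi: float) -> float:
    """Effective conductivity of the composite at filler volume fraction phi."""
    if phi <= PHI_C:
        return 0.0  # no connected carbon network: the sample is an insulator
    return SIGMA_0 * (phi - PHI_C) ** T_EXP

print(conductivity(0.02))  # 0.0 -> below threshold, no current path
print(conductivity(0.08))  # positive -> percolating network carries current
```

Below the threshold the carbon particles form isolated islands; just above it, a first connected network spans the sample and the material abruptly begins to conduct.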

They noticed that this current also had an interesting upshot: It could generate heat. This is due to what’s known as the Joule effect.

“Joule heating (or resistive heating) is caused by interactions between the moving electrons and atoms in the conductor,” explains Nicolas Chanut, a co-author on the paper and a postdoc at MIT CSHub. “The accelerated electrons in the electric field exchange kinetic energy each time they collide with an atom, inducing vibration of the atoms in the lattice, which manifests as heat and a rise of temperature in the material.”

In their experiments, they found that even a small voltage — as low as 5 volts — could increase the surface temperatures of their samples (approximately 5 cm in size) up to 41 degrees Celsius (around 106 degrees Fahrenheit). While a standard water heater might reach comparable temperatures, it’s important to consider how this material would be implemented when compared to conventional heating strategies.
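The Joule effect behind these numbers reduces to two textbook formulas: the dissipated power is P = V²/R, and the heat released over time is Q = P·t. The 5 V figure comes from the experiment; the sample resistance below is a placeholder assumption, since the paper’s value isn’t given here.

```python
# Joule (resistive) heating: power dissipated in a conductor is P = V^2 / R,
# and the heat released over an interval is Q = P * t. The 5 V value is from
# the article; the resistance R = 100 ohms is an illustrative assumption.

def joule_power(voltage: float, resistance: float) -> float:
    """Electrical power (W) dissipated as heat: P = V^2 / R."""
    return voltage ** 2 / resistance

def heat_energy(voltage: float, resistance: float, seconds: float) -> float:
    """Total heat (J) released over a time interval: Q = P * t."""
    return joule_power(voltage, resistance) * seconds

V = 5.0    # volts, as in the experiment
R = 100.0  # ohms -- placeholder, not the sample's measured resistance
print(joule_power(V, R))          # 0.25 W dissipated as heat
print(heat_energy(V, R, 3600.0))  # 900 J released over one hour
```

Because power scales with the square of the voltage, even modest increases in applied voltage (or decreases in the sample’s resistance via better percolation) raise the heating power substantially.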

“This technology could be ideal for radiant indoor floor heating,” explains Chanut. “Usually, indoor radiant heating is done by circulating heated water in pipes that run below the floor. But this system can be challenging to construct and maintain. When the cement itself becomes a heating element, however, the heating system becomes simpler to install and more reliable. Additionally, the cement offers more homogenous heat distribution due to the very good dispersion of the nanoparticles in the material.”

Nanocarbon cement could have various applications outdoors, as well. Chanut and Soliman believe that if implemented in concrete pavements, nanocarbon cement could mitigate durability, sustainability, and safety concerns. Many of those concerns stem from the use of salt for de-icing.

“In North America, we see lots of snow. To remove this snow from our roads requires the use of de-icing salts, which can damage the concrete, and contaminate groundwater,” notes Soliman. The heavy-duty trucks used to salt roads are also both heavy emitters and expensive to run.

By enabling radiant heating in pavements, nanocarbon cement could be used to de-ice pavements without road salt, potentially saving millions of dollars in repair and operations costs while remedying safety and environmental concerns. In certain applications where maintaining exceptional pavement conditions is paramount — such as airport runways — this technology could prove particularly advantageous.

Tangled wires

While this state-of-the-art cement offers elegant solutions to an array of problems, achieving multifunctionality posed a variety of technical challenges. For instance, without a way to align the nanoparticles into a functioning circuit — known as the volumetric wiring — within the cement, their conductivity would be impossible to exploit. To ensure an ideal volumetric wiring, researchers investigated a property known as tortuosity.

“Tortuosity is a concept we introduced by analogy from the field of diffusion,” explains Franz-Josef Ulm, a co-author on the paper, a professor in the MIT Department of Civil and Environmental Engineering, and the faculty advisor at CSHub. “In the past, it has described how ions flow. In this work, we use it to describe the flow of electrons through the volumetric wire.”

Ulm explains tortuosity with the example of a car traveling between two points in a city. While the distance between those two points as the crow flies might be two miles, the actual distance driven could be greater due to the circuity of the streets.

The same is true for the electrons traveling through cement. The path they must take within the sample is always longer than the length of the sample itself. The degree to which that path is longer is the tortuosity.
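The definition above boils down to a single ratio. The sketch below encodes it directly; the path lengths plugged in are illustrative, though the value of 2 echoes the optimized result reported later in the article.

```python
# Tortuosity as defined above: the ratio of the actual path an electron takes
# through the volumetric wiring to the straight-line length of the sample.
# The example values are illustrative; tau = 2 matches the paper's optimized mix.

def tortuosity(path_length: float, sample_length: float) -> float:
    """tau = L_path / L_sample; tau = 1 would be a perfectly straight wire."""
    return path_length / sample_length

print(tortuosity(2.0, 1.0))  # 2.0 -> electrons travel twice the sample length
print(tortuosity(1.0, 1.0))  # 1.0 -> ideal straight path (the lower bound)
```

A tortuosity of 1 is the unreachable ideal of a straight wire; real carbon networks always wind, so the design goal is to push the ratio as close to 1 as the mix economics allow.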

Achieving the optimal tortuosity means balancing the quantity and dispersion of carbon. If the carbon is too heavily dispersed, the volumetric wiring will become sparse, leading to high tortuosity. Similarly, without enough carbon in the sample, the tortuosity will be too great to form a direct, efficient wiring with high conductivity.

Even adding large amounts of carbon could prove counterproductive. At a certain point conductivity will cease to improve and, in theory, would only increase costs if implemented at scale. As a result of these intricacies, they sought to optimize their mixes.

“We found that by fine-tuning the volume of carbon we can reach a tortuosity value of 2,” says Ulm. “This means the path the electrons take is only twice the length of the sample.”

Quantifying such properties was vital to Ulm and his colleagues. The goal of their recent paper was not just to prove that multifunctional cement was possible, but that it was also viable for mass production.

“The key point is that in order for an engineer to pick up things, they need a quantitative model,” explains Ulm. “Before you mix materials together, you want to be able to expect certain repeatable properties. That’s exactly what this paper outlines; it separates what is due to boundary conditions — [extraneous] environmental conditions — from really what is due to the fundamental mechanisms within the material.”

By isolating and quantifying these mechanisms, Soliman, Chanut, and Ulm hope to provide engineers with exactly what they need to implement multifunctional cement on a broader scale. The path they’ve charted is a promising one — and, thanks to their work, shouldn’t prove too tortuous.

The research was supported through the Concrete Sustainability Hub by the Portland Cement Association and the Ready Mixed Concrete Research and Education Foundation.

Featured image: MIT CSHub postdocs Nicolas Chanut and Nancy Soliman hold two of their conductive cement samples © Andrew Logan


Reference: Nancy A. Soliman, Nicolas Chanut, Vincent Deman, Zoe Lallas, and Franz-Josef Ulm, “Electric energy dissipation and electric tortuosity in electron conductive cement-based materials”, Phys. Rev. Materials 4, 125401 – Published 9 December 2020. DOI: https://doi.org/10.1103/PhysRevMaterials.4.125401


Provided by MIT

New AI Tool Calculates Materials’ Stress and Strain Based on Photos (Material Science)

The advance could accelerate engineers’ design process by eliminating the need to solve complex equations.

Isaac Newton may have met his match.

For centuries, engineers have relied on physical laws — developed by Newton and others — to understand the stresses and strains on the materials they work with. But solving those equations can be a computational slog, especially for complex materials.

MIT researchers have developed a technique to quickly determine certain properties of a material, like stress and strain, based on an image of the material showing its internal structure. The approach could one day eliminate the need for arduous physics-based calculations, instead relying on computer vision and machine learning to generate estimates in real time.

The researchers say the advance could enable faster design prototyping and material inspections. “It’s a brand new approach,” says Zhenze Yang, adding that the algorithm “completes the whole process without any domain knowledge of physics.”

The research appears today in the journal Science Advances. Yang is the paper’s lead author and a PhD student in the Department of Materials Science and Engineering. Co-authors include former MIT postdoc Chi-Hua Yu and Markus Buehler, the McAfee Professor of Engineering and the director of the Laboratory for Atomistic and Molecular Mechanics.

Engineers spend lots of time solving equations. They help reveal a material’s internal forces, like stress and strain, which can cause that material to deform or break. Such calculations might suggest how a proposed bridge would hold up amid heavy traffic loads or high winds. Unlike Sir Isaac, engineers today don’t need pen and paper for the task. “Many generations of mathematicians and engineers have written down these equations and then figured out how to solve them on computers,” says Buehler. “But it’s still a tough problem. It’s very expensive — it can take days, weeks, or even months to run some simulations. So, we thought: Let’s teach an AI to do this problem for you.”

The researchers turned to a machine learning technique called a generative adversarial neural network. They trained the network with thousands of paired images — one depicting a material’s internal microstructure subject to mechanical forces, and the other depicting that same material’s color-coded stress and strain values. With these examples, the network uses principles of game theory to iteratively figure out the relationships between the geometry of a material and its resulting stresses.

“So, from a picture, the computer is able to predict all those forces: the deformations, the stresses, and so forth,” Buehler says. “That’s really the breakthrough — in the conventional way, you would need to code the equations and ask the computer to solve partial differential equations. We just go picture to picture.”

This visualization shows the deep-learning approach in predicting physical fields given different input geometries. The left figure shows a varying geometry of the composite in which the soft material is elongating, and the right figure shows the predicted mechanical field corresponding to the geometry in the left figure. © MIT

That image-based approach is especially advantageous for complex, composite materials. Forces on a material may operate differently at the atomic scale than at the macroscopic scale. “If you look at an airplane, you might have glue, a metal, and a polymer in between. So, you have all these different faces and different scales that determine the solution,” says Buehler. “If you go the hard way — the Newton way — you have to walk a huge detour to get to the answer.”

But the researchers’ network is adept at dealing with multiple scales. It processes information through a series of “convolutions,” which analyze the images at progressively larger scales. “That’s why these neural networks are a great fit for describing material properties,” says Buehler.
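The multi-scale idea can be illustrated without any deep-learning machinery: repeatedly smoothing and downsampling a microstructure “image” makes each successive stage summarize progressively larger regions, which is what stacked convolutions with pooling achieve. This sketch shows only that concept; the actual model in the paper is a trained generative adversarial network, not reproduced here.

```python
import numpy as np

# Multi-scale pyramid of a toy microstructure image: each 2x2 average-pool
# step halves the resolution, so every coarse pixel summarizes a larger
# region of the original -- the intuition behind the network's convolutions.

def downsample(img: np.ndarray) -> np.ndarray:
    """Average each non-overlapping 2x2 block into one coarse pixel."""
    h, w = img.shape
    img = img[:h - h % 2, :w - w % 2]  # trim odd edges if present
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

# A toy 8x8 "microstructure": soft phase = 0.0, stiff phase = 1.0.
rng = np.random.default_rng(0)
img = (rng.random((8, 8)) > 0.5).astype(float)

scales = [img]
while scales[-1].shape[0] > 1:
    scales.append(downsample(scales[-1]))

for s in scales:
    print(s.shape)  # (8, 8) -> (4, 4) -> (2, 2) -> (1, 1)
```

Note that average pooling preserves the overall phase fraction: the single pixel at the coarsest scale equals the mean of the full image, while the intermediate scales retain coarse spatial information about where the stiff phase sits.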

The fully trained network performed well in tests, successfully rendering stress and strain values given a series of close-up images of the microstructure of various soft composite materials. The network was even able to capture “singularities,” like cracks developing in a material. In these instances, forces and fields change rapidly across tiny distances. “As a material scientist, you would want to know if the model can recreate those singularities,” says Buehler. “And the answer is yes.”

This visualization shows the simulated failure in a complicated material by a machine-learning-based approach without solving governing equations of mechanics. The red represents a soft material, white represents a brittle material, and green represents a crack. © MIT

The advance could “significantly reduce the iterations needed to design products,” according to Suvranu De, a mechanical engineer at Rensselaer Polytechnic Institute who was not involved in the research. “The end-to-end approach proposed in this paper will have a significant impact on a variety of engineering applications — from composites used in the automotive and aircraft industries to natural and engineered biomaterials. It will also have significant applications in the realm of pure scientific inquiry, as force plays a critical role in a surprisingly wide range of applications from micro/nanoelectronics to the migration and differentiation of cells.”

In addition to saving engineers time and money, the new technique could give nonexperts access to state-of-the-art materials calculations. Architects or product designers, for example, could test the viability of their ideas before passing the project along to an engineering team. “They can just draw their proposal and find out,” says Buehler. “That’s a big deal.”

Once trained, the network runs almost instantaneously on consumer-grade computer processors. That could enable mechanics and inspectors to diagnose potential problems with machinery simply by taking a picture.

In the new paper, the researchers worked primarily with composite materials that included both soft and brittle components in a variety of random geometrical arrangements. In future work, the team plans to use a wider range of material types. “I really think this method is going to have a huge impact,” says Buehler. “Empowering engineers with AI is really what we’re trying to do here.”

Funding for this research was provided, in part, by the Army Research Office and the Office of Naval Research.

Featured image: MIT researchers have developed a machine-learning technique that uses an image of the material’s internal structure to estimate the stresses and strains acting on the material.


Paper: “Deep Learning Model to Predict Complex Stress and Strain Fields in Hierarchical Composites”


Provided by MIT