CMU’s MoonRanger Will Search for Water at Moon’s South Pole (Planetary Science)

MoonRanger, a small robotic rover being developed by Carnegie Mellon University and its spinoff Astrobotic, has completed its preliminary design review in preparation for a 2022 mission to search for signs of water at the moon’s south pole.

Whether buried ice exists in useful amounts is one of the most pressing questions in lunar exploration, and MoonRanger will be the first to seek evidence of it on the ground. If found in sufficient concentration at accessible locations, ice might be the most valuable resource in the solar system, said William “Red” Whittaker, University Founders Research Professor in the Robotics Institute.

“Water is key to human presence on and use of the moon,” explained Whittaker, who is leading development of MoonRanger. “Space agencies around the world are intent on investigating it.”

Whittaker and his team first approached NASA about using robots to search for lunar ice in 1996, and they will fulfill that vision a quarter century later by landing in 2022.

“This hasn’t been quick or easy,” Whittaker said. “It is stunning that after these many years we will have the first look.”

NASA will follow MoonRanger at a later date with its more capable Volatiles Investigating Polar Exploration Rover (VIPER), which will perform more rigorous and sustained exploration and scientific characterization of the ice.

MoonRanger’s lander will be Masten Space Systems’ XL-1, supported by the NASA Commercial Lunar Payload Services program. The rover will be one of eight science and technology payloads supported by the NASA Lunar Surface Instrument and Technology Payloads program.

The space agency said the payloads support its Artemis program, which aims to return U.S. astronauts to the moon in the coming years.

Last month, reviewers determined the viability of the design for the rover and its mission. Lydia Schweitzer, a master’s student in computational design who led the systems engineering team, said the two-day review involved more than 60 people — including veterans of the Apollo program and Mars rover project — who provided important suggestions and feedback.

Schweitzer said the project involved a dozen faculty and staff members, as well as at least 90 students, including three semesters of enrollees in Whittaker’s project course. Disciplines represented on the team comprise engineering, robotics, computer science, software engineering, human-computer interaction, architecture and design. The team also has taken advantage of a network of CMU alumni with expertise in space robotics to solve problems and optimize the rover’s design.

Even as MoonRanger takes shape, Whittaker and another student team continue to prepare for a 2021 mission in which a four-pound CMU rover called Iris and a CMU art package called MoonArk will travel to the moon on Astrobotic’s Peregrine lander.

MoonRanger features a number of technical innovations. About the size of a suitcase, it is designed to repeatedly explore at the rate of 1,000 meters per Earth day in both sunlit and dark conditions — unprecedented speed for a planetary rover. By contrast, a Chinese robot now on the far side of the moon has averaged less than a meter per Earth day.

Unlike other rovers, MoonRanger doesn’t carry isotope heating, so its battery and electronics will fail when night falls and cryogenic temperatures set in. Hence, the robot must accomplish its mission in less than the 14 sunlit Earth days of the lunar month. It is also too light to carry a large radio for communicating directly with Earth. It thus must return to the lander, with which it will establish short-range wireless communication so the lander’s radio can relay the robot’s findings to Earth.

“MoonRanger is going to be on its own for long periods of time,” said David Wettergreen, research professor of robotics and co-investigator for the rover project, noting the rover will be out of touch with controllers on Earth as it does its explorations.

The mission was originally designed to demonstrate the capability of the rover. But NASA expanded it this spring to include the search for ice by adding its Neutron Spectrometer System (NSS) to MoonRanger. The NSS, developed by NASA Ames Research Center, measures the amount of hydrogen in the upper layer of the moon’s soil, called regolith. Hydrogen abundance is correlated with the concentration of buried water ice. The NSS will be along for the ride, “ticking like a Geiger counter” when the rover passes over buried ice, then falling silent in bone-dry areas, Whittaker said.

The rover’s solar array is oriented vertically to capture the low sun angles experienced at the pole. The low sun also means that craters and dips cast deep, pitch-black shadows. The rover, therefore, will need to sense and navigate through darkness — another first. Since LIDAR sensors used commonly by Earth robots aren’t yet available for small space rovers, MoonRanger achieves night vision by projecting laser line stripes ahead of it to model the darkened terrain, much as stereo cameras do in sunlight.
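
The stripe-based night vision described above comes down to standard structured-light triangulation: the camera measures how far the projected laser line is displaced in the image, and the displacement encodes range. The sketch below illustrates only that geometry; the baseline and focal-length values are made up for illustration and are not MoonRanger specifications.

```python
# Toy laser-stripe triangulation: depth from the pixel offset ("disparity")
# of a projected laser line as seen by an offset camera.
# Parameters are hypothetical, not actual MoonRanger hardware values.

def stripe_depth(disparity_px: float, baseline_m: float = 0.2,
                 focal_px: float = 800.0) -> float:
    """Depth via the standard triangulation relation z = f * b / d.

    disparity_px: offset (pixels) between where the stripe appears and
                  where it would appear on infinitely distant terrain.
    """
    if disparity_px <= 0:
        raise ValueError("stripe not detected or beyond usable range")
    return focal_px * baseline_m / disparity_px

# With these assumed parameters, a stripe shifted 40 px maps to terrain 4 m away.
print(stripe_depth(40.0))   # -> 4.0
```

Nearby terrain produces larger stripe offsets, so range resolution is best exactly where obstacle avoidance needs it.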

Once it lands on the moon, MoonRanger will evaluate its driving, navigation and mapping capabilities in short jaunts near the lander. It will then attempt a series of distant treks to seek ice.

“If we could make a one-kilometer trek, we’d be very happy,” Wettergreen said. “If we could do it twice, that would be amazing.”

Uncertainty is inescapable for a mission as ambitious as MoonRanger, Whittaker said.

“In the face of that, there is only the question of whether to do it anyway,” he added. “This has all the elements of purpose, technology, exploration, science and fulfillment of vision. These leave no question about going for it and giving it our all.”

Provided by Carnegie Mellon University

Saving Lives Through Early Detection Of Gastric Cancer Cells (Oncology / Medicine)

A new method that identifies gastric cancer cells within minutes, and more accurately than traditional techniques, is under development at City University of Hong Kong (CityU).

Professor Li Wen Jung ©CityU

Led by Professor Li Wen Jung of the Department of Mechanical Engineering and Associate Provost (Institutional Initiatives), with collaborators from the Shenyang Institute of Automation (SIA) of the Chinese Academy of Sciences (CAS) and First Hospital of China Medical University (FHCMU), the research has recently been published in Science Advances.

“The aim is to reduce the number of deaths due to gastric cancer, one of the leading causes of cancer deaths worldwide,” said Professor Li, a co-contact author on the publication titled “Detection and isolation of free cancer cells from ascites and peritoneal lavages using optically induced electrokinetics (OEK)”.

The first author of the paper is Ms Zhang Yuzhao, Professor Li’s student at the SIA, a collaborating institution on the project where Professor Li is also an Affiliated Professor. Professor Li first set up the OEK system at CityU in 2012. With the support of the CAS-Croucher Funding Scheme, he later replicated the system at the Joint Laboratory co-established by CityU and SIA in Shenyang, where the experiments for this study were performed.

Around 800,000 deaths a year are recorded worldwide from gastric cancer, making it the third-leading cause of cancer death. The procedure developed by the joint team uses a novel OEK microfluidic method to isolate cancer cells from the stomach area.

The OEK method is a new technique that can identify gastric cancer cells within minutes and more accurately than traditional methods. ©CityU

Gastric cancer is often hard to diagnose because current approaches are not sensitive enough to spot malignant cells.

The OEK method, however, is a new technique that could be integrated with “lab-on-a-chip” systems, offering researchers the opportunity to manipulate objects within a micro- and nanoscale bioengineering environment.

The rationale for applying OEK to gastric cancer is that these cells are not the same size as other cells in the peritoneal region and, crucially, possess different electrical characteristics.
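
Sorting by electrical characteristics, in OEK and related dielectrophoresis (DEP) techniques, hinges on the sign of the Clausius-Mossotti factor: it determines whether a cell is pulled toward strong-field regions (positive DEP) or pushed away (negative DEP). The sketch below shows that selection principle only; the permittivity and conductivity values are illustrative placeholders, not measurements from this study.

```python
# Sign of the dielectrophoretic response for a spherical cell in a medium.
# All material values below are illustrative, not from the CityU paper.
import math

def clausius_mossotti(eps_p, sig_p, eps_m, sig_m, freq_hz):
    """Real part of the Clausius-Mossotti factor at frequency freq_hz.

    Uses complex permittivities eps* = eps - j*sigma/omega for the
    particle (p) and the suspending medium (m).
    """
    omega = 2 * math.pi * freq_hz
    ep = complex(eps_p, -sig_p / omega)
    em = complex(eps_m, -sig_m / omega)
    return ((ep - em) / (ep + 2 * em)).real

EPS0 = 8.854e-12  # vacuum permittivity, F/m
# Hypothetical cell more conductive than its medium, probed at 100 kHz:
k = clausius_mossotti(60 * EPS0, 0.5, 78 * EPS0, 0.01, 1e5)
print("positive DEP" if k > 0 else "negative DEP")
```

Because the factor depends on frequency as well as on cell properties, choosing the drive frequency is one way such devices tune which cell type is attracted to the optically defined electrodes.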

“When compared to traditional methods for spotting gastric cancer cells, our OEK microfluidic method is more sensitive when looking at electrical characteristics. Using this technique, we have been able to separate gastric cancer cells from other cells in six patients with ascites [abnormal buildup of abdominal fluid] with a purity of over 70%,” said Professor Li.

The new method is appealing because it is quick and non-invasive. In fact, within five minutes, it can separate out the gastric cancer cells on the OEK microfluidic chip.

“The study has benefited from working with doctors and patients at the FHCMU in Shenyang where medical staff have been impressed with the results,” Professor Li said. The principal collaborator from the FHCMU is Professor Wang Zhenning, who is also a co-contact author on the publication.

“Our hope is that our research will speed up the diagnosis of gastric cancer and save lives.”

References: Yuzhao Zhang, Junhua Zhao, Haibo Yu, Pan Li, Wenfeng Liang, Zhu Liu, Gwo-Bin Lee, Lianqing Liu, Wen Jung Li and Zhenning Wang, “Detection and isolation of free cancer cells from ascites and peritoneal lavages using optically induced electrokinetics (OEK)”, Science Advances, 2020, Vol. 6, no. 32, eaba9628. DOI: 10.1126/sciadv.aba9628

Provided by City University Of Hong Kong

There’s No Single Gene For Left-handedness: At Least 41 Regions Of DNA Are Involved (Biology)

Most people consistently use the same hand to do tasks that require skill and control such as writing or threading a needle. We know genetics plays a big part in which hand a person prefers, but it has been difficult to identify the exact genes responsible.

To find out more, researchers from the University of Queensland analysed the DNA of more than 1.7 million people and discovered 41 regions of the genome associated with being left-handed and another seven associated with being ambidextrous.

What makes people left-handed?

About 88% of people prefer to use their right hand for complex tasks, around 10% prefer their left hand, and the other 2% report they do not have a preference and can use either hand. Hand preference develops so early that it can be seen in the womb.

Handedness tends to stabilise around the time children are learning to draw. In the absence of injury or training it remains constant throughout life. Evidence from historic human populations suggests it has been this way for hundreds of thousands of years.

Research examining patterns of handedness in twins and families shows most of the variation is down to non-genetic factors, such as training and the environment in which children gain early motor skills. However, genetics does play a significant role.

There is no single gene for handedness

Since the mid-1980s more than 100 journal articles have explored the idea that a single gene might influence handedness. These theories suggested one variant of the gene would bias an individual towards right-handedness, while the alternate variant led to handedness being randomly determined.

While there have been many theories attempting to explain different human characteristics via single genes, researchers at the University of Queensland have found in recent years that the reality is often much more complicated. More recent research uses genome-wide association studies (GWAS) to look for a relationship between a trait of interest and the number of copies of a genetic variant someone has. These analyses are run for millions of variants located across the genome.

These genome-wide studies have shown that almost all human traits are influenced by many hundreds or thousands of genetic variants. Often these variants are located between genes whose purpose is not clearly identifiable, in what used to be called “junk DNA”.

GWAS has also shown most traits are influenced by large numbers of genes which each contribute a very small effect, rather than a single gene which has a large effect. To track these small effects, large collaborative studies with many participants are required in order to identify the individual genetic variants involved.
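
The one-regression-per-variant testing described above can be sketched on simulated data. In this toy example every number (cohort size, variant count, effect sizes, significance threshold) is invented for illustration; real GWAS involve millions of variants, covariates, and far stricter thresholds.

```python
# Toy GWAS sketch: regress a simulated trait on each variant's allele
# count (0/1/2) and flag variants with a strong association.
import numpy as np

rng = np.random.default_rng(0)
n_people, n_variants = 5000, 200

# Genotypes: allele counts per person per variant (allele frequency 0.3).
geno = rng.binomial(2, 0.3, size=(n_people, n_variants)).astype(float)

# Polygenic trait: the first 10 variants each add a tiny effect; rest is noise.
true_effects = np.zeros(n_variants)
true_effects[:10] = 0.15
trait = geno @ true_effects + rng.normal(size=n_people)

# One simple linear regression per variant; slope / standard error ~ z-score.
g = geno - geno.mean(axis=0)
y = trait - trait.mean()
slope = (g * y[:, None]).sum(axis=0) / (g ** 2).sum(axis=0)
resid_var = ((y[:, None] - g * slope) ** 2).sum(axis=0) / (n_people - 2)
se = np.sqrt(resid_var / (g ** 2).sum(axis=0))
z = slope / se

hits = np.flatnonzero(np.abs(z) > 4)  # crude threshold for the toy example
print(hits)  # typically recovers most of the first 10 causal variants
```

Even with 5,000 simulated participants, effects this small sit near the detection limit, which is why the real study needed 1.7 million people.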

What does GWAS reveal about handedness?

In 2009 the researchers started a project involving collaborators from around the world to hunt for genetic variants that influence handedness using GWAS. They did not recruit participants based on their handedness, so the number of left-handed people was relatively small. As a result, they have only recently gathered enough data to undertake robust analyses.

Their study brought together analyses of data from 1,766,671 people. Of these people, 194,198 were left-handed and 37,637 were ambidextrous. They found 41 regions of the genome associated with left-handedness and seven regions associated with ambidexterity.

Many of the regions of the genome associated with left-handedness contained genes that code for microtubule proteins. These proteins play important roles during development in the migration of neurons and in the ability of the brain to adapt to changes in the environment.

Interestingly, genes that influence other asymmetries in the body, such as which side of the body the heart is located on, were not associated with handedness in the study.

Another important finding was that there was little overlap between the regions of the genome associated with left-handedness and those associated with ambidexterity. This suggests that ambidexterity is more complicated than previously thought. The mechanisms that influence the direction of hand preference might be different from those that influence the degree of hand preference.

These findings give researchers promising new leads but more work is needed to identify further genetic variants that influence handedness. There is also a long way to go before we understand how these variants play a role in someone becoming right-handed, left-handed or ambidextrous.

References: Cuellar-Partida, G., Tung, J.Y., Eriksson, N. et al. Genome-wide association study identifies 48 common genetic variants associated with handedness. Nat Hum Behav (2020).

Provided by University Of Queensland

Living With Diabetes? Get Your Eyes Checked (Ophthalmology / Medicine)

A new University of Sydney study has found only half of people living with diabetes get the recommended diabetes eye checks, putting them at risk of significant vision loss and blindness.

Diabetes is the leading cause of blindness in working-age Australians. All people with diabetes are at risk of diabetes-related retinopathy, which causes damage to the back of the eye. Most people with diabetes need a diabetes eye check every two years, and some more frequently.

The study, published in Clinical and Experimental Ophthalmology, linked data from the Sax Institute’s 45 and Up study with Medicare Benefits data to examine how frequently almost 25,000 people in New South Wales living with diabetes had eye examinations.

Researchers found people who had been living with diabetes for 10 or more years were even less likely to get regular eye checks, with almost 80 percent not having the annual check recommended for this group.

Co-author and ophthalmologist Professor Mark Gillies from the University of Sydney Faculty of Medicine and Health and Save Sight Institute said the findings reinforced the need for more education.

“Ninety-eight percent of serious vision loss from diabetes can be prevented with regular eye examinations and early treatment,” said Professor Gillies.

“I encourage people to use services like KeepSight to keep on top of their appointments. It’s also important they understand the kind of eye check required, as only eye checks that include dilation of the pupil with eye drops (fundus dilation) are able to detect diabetes-related changes in the eye.”

Diabetes Australia’s KeepSight program, which commenced just over 12 months ago, is helping to ensure that the proportion of people with diabetes accessing eye checks increases in coming years and, ultimately, that every person with diabetes gets the necessary eye checks, helping to prevent vision loss and blindness.

KeepSight is an online eye check reminder program easily accessed from a mobile phone.

Diabetes Australia CEO Professor Greg Johnson said KeepSight has enrolled 100,000 people since it started last year.

“Having 100,000 Australians with diabetes registered with KeepSight is an important milestone for the program – but there are currently over 1.36 million Australians living with diabetes so we are encouraging every person with diabetes, and all health professionals, to register with KeepSight,” said Professor Johnson.

“KeepSight provides electronic alerts and reminders to help people with diabetes remember their diabetes eye checks. When it’s time for a diabetes eye check you get a reminder. It’s that simple. KeepSight can also help you find an optometrist if you don’t know one.”

The KeepSight program, which is run by Diabetes Australia in partnership with Vision 2020 Australia, Centre for Eye Research Australia and Oculo, has been co-funded by the Australian Government, Specsavers, Bayer, Novartis and Mylan. The program has widespread support from leading diabetes and eye health groups including the Royal Australian and New Zealand College of Ophthalmologists, Orthoptics Australia, Optometry Australia, the Australian Diabetes Society and the Australian Diabetes Educators Association.

The University of Sydney-led research is part of a series of population-based record linkage projects using the NSW 45 and Up study to evaluate the uptake and long-term health impact of government-funded services and programs implemented to support care and reduce complications in people with diabetes.

References: Gibson, AA, Humphries, J, Gillies, M, Nassar, N, Colagiuri, S. Adherence to eye examination guidelines among individuals with diabetes: An analysis of linked health data. Clin Experiment Ophthalmol. 2020;1-10.

Provided by University Of Sydney

Cells Communicate By Doing The ‘Wave’ (Biology)

Cells work around the clock to deliver, maintain, and control every aspect of life. And just as with humans, communication is a key to their success.

A ‘wave’ can start with just one person. Kyoto University scientists have found how a single cell can move an entire collective. (Kyoto University)

Every essential biological process requires some form of communication among cells, not only with their immediate neighbors but also with those significantly farther away. Current understanding is that this information exchange relies on the diffusion of signaling molecules or on cell-to-cell relays.

Publishing in the journal Developmental Cell, a research team at Kyoto University’s Graduate School of Medicine reports on a novel method of communication relying on ‘mechano-chemical’ signals to control cell movement. The research group focused on a fundamental pathway — MAPK/ERK, or the ERK pathway — and demonstrated how the movement of a single cell can trigger a cascading reaction resulting in the migration of a cell collective.

“Mechanical and biochemical signals in cells fundamentally control everything from homeostasis, development, to diseases,” explains Tsuyoshi Hirashima, leader of the study.

“We knew from past experiments how vital the ERK pathway is in cell activity, but the mechanism of how it can propagate in a collection of cells was incomplete.”

MAPK/ERK is so fundamental that it exists in all cells, controlling a wide range of actions from growth and development to eventual cell death. The pathway is activated when a receptor protein on the cell surface binds with a signaling molecule, resulting in a cascade of proteins and reactions spreading throughout the cell’s interior.

Employing a live imaging technique that can visualize an individual cell’s active ERK pathway, the team began observing the effects of cell movement. What they found was unexpected: when a cell began to extend itself, ERK activity increased, causing the cell to contract.

“Cells are tightly connected and packed together, so when one starts contracting from ERK activation, it pulls in its neighbors,” elaborates Hirashima. This then caused surrounding cells to extend, activating their ERK, resulting in contractions that lead to a kind of tug-of-war propagating into colony movement.

“Researchers had previously proposed that cells extend when ERK is activated, so our results came as quite a surprise.”

The team incorporated these observations into a mathematical model, combining mechano-chemical regulations with quantitative parameters. The output demonstrated consistency with experimental data.
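
The paper’s quantitative model is not reproduced here, but the extend-activate-contract-pull relay the article describes can be caricatured as a one-dimensional cellular automaton. This is purely an illustrative sketch of the wave logic, not the authors’ mathematical model:

```python
# Minimal 1-D caricature of the ERK mechano-chemical relay:
# a stretched cell switches its ERK "on"; an ERK-on cell contracts,
# which stretches its neighbors, propagating a wave of activation.

def propagate(n_cells: int, n_steps: int):
    """Return the set of cells whose ERK has fired after n_steps,
    starting from a single stretched cell in the middle."""
    stretched = {n_cells // 2}      # one cell extends initially
    fired = set()
    for _ in range(n_steps):
        firing = stretched - fired  # stretched cells activate ERK...
        fired |= firing
        stretched = set()
        for i in firing:            # ...then contract, pulling on neighbors
            for j in (i - 1, i + 1):
                if 0 <= j < n_cells and j not in fired:
                    stretched.add(j)
    return fired

# The activation front spreads one cell per step in each direction.
wave = propagate(n_cells=21, n_steps=5)
print(sorted(wave))   # -> [6, 7, 8, 9, 10, 11, 12, 13, 14]
```

Even this crude version reproduces the qualitative observation: a purely local extend/contract rule is enough to carry a signal far beyond the cell that started it.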

“Our work clearly shows that the ERK-mediated mechano-chemical feedback system generates complicated multicellular patterns,” concludes Hirashima.

“This will provide a new basis for understanding many biological processes, including tissue repair and tumor metastasis.”

References: Naoya Hino, Leone Rossetti, Ariadna Marín-Llauradó, Kazuhiro Aoki, Xavier Trepat, Michiyuki Matsuda, Tsuyoshi Hirashima (2020). ERK-Mediated Mechanochemical Waves Direct Collective Cell Polarization. Developmental Cell.

Provided by Kyoto University

How Your Brain Finds Good Objects (Neuroscience)

In the wild, it is essential for animals to pick out good or bad objects within their visual field. Whether it be food or predator, split-second recognition and action need to be made for survival.

Researchers uncover the neural circuit that connects how the brain perceives valuable information and how the eyes move toward it (Takada lab)

The underlying mechanisms that govern this behavior in the brain have been gradually uncovered by researchers. Now, a team from Kyoto University’s Primate Research Institute has revealed how the brain controls eye movements toward the ‘good objects’.

“The brain regions that control how we process ‘value information’ are called the basal ganglia. Over the years, we had uncovered that it also controls eye movement,” explains Hidetoshi Amita, first author of the paper published in Nature Communications.

“However, exactly how the basal ganglia convert ‘value information’ to eye movement was still unknown. So, we designed a way to explore this information processing pathway.”

The team artificially manipulated a specific neural circuit in the basal ganglia using a method called optogenetics, a way of activating selective neurons using light.

“Using macaques, we activated the pathway from the caudate nucleus to the superior colliculus,” continues Amita. “The caudate nucleus is a part of the brain’s reward system, and the superior colliculus triggers controlled eye movement.”

The team confirmed that neuronal activity in these brain regions increased when the macaques were presented with objects that would provide the most benefit, indicating that these regions convey information about high-value objects associated with a reward.

The team then stimulated the pathway with light to see how the macaques’ eyes moved toward these ‘high-value objects’, and found that activating the value-conveying pathway also produced a notable increase in eye movement.

“Our findings indicate that there is a direct neural circuit that connects how the brain perceives valuable information and how the eyes move toward it,” concludes Masahiko Takada, a collaborator in the study. “In fact, we were impressed that only 20 milliseconds of activation were sufficient to modulate gaze shift.”

The study establishes that a combination of task-related neuronal activity recording and optogenetic manipulation can be a powerful tool for analyzing neural networks in the primate brain. Takada hopes that this approach can be applied to study other neural circuits to better understand their functional roles.

References: Hidetoshi Amita, Hyoung F. Kim, Ken-ichi Inoue, Masahiko Takada & Okihide Hikosaka (2020). Optogenetic manipulation of a value-coding pathway from the primate caudate tail facilitates saccadic gaze shift. Nature Communications, 11:1876.

Breastfeeding Hormones Make Mothers Happier (Biology)

Oxytocin makes nursing mothers more sensitive to happier faces than angry ones.

Mothers with increased levels of oxytocin after breastfeeding showed reduced recognition of negative facial expressions and enhanced recognition of positive ones.

Oxytocin is one of the most important hormones between a mother and her baby. Researchers at Kyoto University and Azabu University in Japan report in a new study that the levels of oxytocin correspond to a mother’s sensitivity to happy and angry adults.

The findings, published in Biology Letters, give new insights on the behavioral effects of nurturing that apply well beyond the mother’s response to the baby.

Oxytocin is best recognized for its role in childbirth and childrearing. It causes labor contractions and promotes lactation in the mother.

Its effects are not just physical, however. Higher oxytocin levels strengthen the bond between the mother and child. It even has an effect on how we deal with other people, as it buffers negative emotions, such as stress and anxiety. It also enhances our recognition of positive facial expressions while dampening our recognition of negative ones.

However, much of our knowledge about the behavioral effects of oxytocin comes from studying populations who have been administered oxytocin, and therefore has not considered the effects of oxytocin levels produced naturally.

“Intranasal oxytocin studies are inconsistent. One reason is that individual differences in endogenous oxytocin concentrations and fluctuations are relatively ignored,” said Masako Myowa from Kyoto University’s Department of Education, who led the new study.

Rather than examining the effects of people receiving oxytocin, Myowa’s team was more interested in how a primiparous (first-time) mother’s natural oxytocin levels affect her behavior. To do so, the scientists investigated how mothers reacted before and after breastfeeding — when they would have different levels of oxytocin — to pictures of human faces.

The study found that there was a fair amount of variation in oxytocin between mothers, and that this variation correlated with their responses to adult faces showing positive or negative expressions. Namely, mothers with more oxytocin were better at recognizing positive expressions, and vice versa.

Breastfeeding is recommended worldwide, as it has a number of benefits for the baby, including stronger immunity against a number of diseases and infections and even indications that it can help prevent obesity.

There are also a number of benefits to the mother, including a faster recovery from the birthing process.

However, there are also many psychological problems that can emerge after birth, such as postpartum depression. Understanding hormonal changes could help identify mothers more likely to suffer from these problems.

“Our goal is to understand the perceptual and psychological changes that occur in mothers. Our work suggests natural oxytocin levels could be an important factor,” says Myowa.

References: Michiko Matsunaga, Takefumi Kikusui, Kazutaka Mogi, Miho Nagasawa, Rumi Ooyama and Masako Myowa (2020). Breastfeeding dynamically changes endogenous oxytocin levels and emotion recognition in mothers. Biology Letters, 16(6):20200139.

Provided by Kyoto University

Babies’ Random Choices Become Their Preferences (Psychology)

We assume we choose things that we like, but research suggests that’s sometimes backwards: We like things because we choose them, and we dislike things that we don’t choose.

When a baby reaches for one stuffed animal in a room filled with others just like it, that seemingly random choice is very bad news for those unpicked toys: the baby has likely just decided she doesn’t like what she didn’t choose.

Though researchers have long known that adults build unconscious biases over a lifetime of making choices between things that are essentially the same, findings from Johns Hopkins University indicate that even babies engage in this phenomenon, suggesting that this way of justifying choice is intuitive and somehow fundamental to the human experience.

“The act of making a choice changes how we feel about our options,” said co-author Alex Silver, a former Johns Hopkins undergraduate who’s now a graduate student in cognitive psychology at the University of Pittsburgh. “Even infants who are really just at the start of making choices for themselves have this bias.”

The findings are published today in the journal Psychological Science.

People assume they choose things that they like, but the new research suggests that’s sometimes backwards: We like things because we choose them, and we dislike things that we don’t choose.

“I chose this, so I must like it. I didn’t choose this other thing, so it must not be so good. Adults make these inferences unconsciously,” said co-author Lisa Feigenson, a Johns Hopkins cognitive scientist specializing in child development. “We justify our choice after the fact.”

This makes sense for adults in a consumer culture who must make arbitrary choices every day, between everything from toothpaste brands to makes of cars to styles of jeans. The question, for Feigenson and Silver, was when exactly people start doing this. So they turned to babies, who don’t get many choices so are “a perfect window into the origin of this tendency,” Feigenson says.

The team brought 10- to 20-month-old babies into the lab and gave them a choice of objects to play with: two equally bright and colorful soft blocks.

They set each block far apart, so the babies had to crawl to one or the other—a random choice.

After the baby chose one of the toys, the researchers took it away and came back with a new option. The babies could then pick from the toy they didn’t play with the first time, or a brand new toy.

“The babies reliably chose to play with the new object rather than the one they had previously not chosen, as if they were saying, ‘Hmm, I didn’t choose that object last time, I guess I didn’t like it very much,'” Feigenson said. “That is the core phenomenon. Adults will like less the thing they didn’t choose, even if they had no real preference in the first place. And babies, just the same, dis-prefer the unchosen object.”

In follow-up experiments, when the researchers instead chose which toy the baby would play with, the phenomenon disappeared entirely. If you take the element of choice away, Feigenson said, the phenomenon goes away.

“They are really not choosing based on novelty or intrinsic preference,” Silver said. “I think it’s really surprising. We wouldn’t expect infants to be making such methodical choices.”

To continue studying the evolution of choice in babies, the lab will next look at the idea of “choice overload.” For adults, choice is good, but too many choices can be a problem, so the lab will try to determine if that is also true for babies.

References: Silver AM, Stahl AE, Loiotile R, Smith-Flores AS, Feigenson L. When Not Choosing Leads to Not Liking: Choice-Induced Preference in Infancy. Psychological Science. October 2020. doi:10.1177/0956797620954491

Provided by Johns Hopkins University

Robotic Fabric: A Breakthrough with Many Uses (Engineering / Material Science)

Researchers at Yale have developed a robotic fabric, a breakthrough that could lead to such innovations as adaptive clothing, self-deploying shelters, or lightweight shape-changing machinery.

The lab of Prof. Rebecca Kramer-Bottiglio has created a robotic fabric that includes actuation, sensing, and variable stiffness fibers while retaining all the qualities that make fabric so useful – flexibility, breathability, small storage footprint, and low weight. They demonstrated their robotic fabric going from a flat, ordinary fabric to a standing, load-bearing structure. They also showed a wearable robotic tourniquet and a small airplane with stowable/deployable fabric wings. The results are published this week in Proceedings of the National Academy of Sciences.

The researchers focused on processing functional materials into fiber form so they could be integrated into fabrics while retaining the fabric’s advantageous properties. For example, they made variable stiffness fibers out of an epoxy embedded with particles of Field’s metal, an alloy that liquefies at relatively low temperatures. When cool, the particles are solid metal and make the material stiffer; when warm, the particles melt into liquid and make the material softer.

“Our Field’s metal-epoxy composite can become as flexible as latex rubber or as stiff as hard acrylic, over 1,000 times more rigid, just by heating it up or cooling it down,” said Trevor Buckner, a graduate student in Kramer-Bottiglio’s lab and lead author on the paper. “Long fibers of this material can be sewn onto a fabric to give it a supportive skeleton that we can turn on and off.” These on-demand support fibers allow a robotic fabric to be bent or twisted and then locked into shape, or hold loads that would otherwise collapse a typical fabric.
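
As a rough sanity check on that figure, typical handbook Young’s moduli for the two reference materials give a ratio in the stated range. The values below are generic textbook numbers assumed for illustration, not measurements from the paper:

```python
# Back-of-the-envelope check of the "over 1,000 times more rigid" claim,
# using typical handbook Young's moduli (assumed values, not from the paper).
latex_rubber_pa = 1.5e6    # latex rubber: roughly 1-2 MPa
acrylic_pa = 3.0e9         # hard acrylic (PMMA): roughly 3 GPa

ratio = acrylic_pa / latex_rubber_pa
print(f"stiffness ratio ~{ratio:.0f}x")   # -> stiffness ratio ~2000x
```

A roughly three-orders-of-magnitude swing is what lets one fiber serve as both a compliant thread and a load-bearing skeleton.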

To create sensors that detect internal or environmental changes and allow the fabric to respond appropriately, the researchers developed a conductive ink based on a Pickering emulsion, which lowers the ink viscosity and also enables the use of non-toxic solvents. With this ink, the researchers can paint the sensors directly onto the fabric.

“The conductive composite self-coagulates around the individual fibers and does not notably change the porosity of the fabric,” said Kramer-Bottiglio, the John J. Lee Assistant Professor of Mechanical Engineering & Materials Science. “The sensors are visible, but don’t change the texture or breathability of the fabric, which is important for comfort in wearable applications.”

To make the fabric move, the researchers used shape-memory alloy (SMA) wire, which can return to a programmed shape after being deformed. SMA wire is usually programmed into coils or meshes to generate contracting motion, but this approach was not desirable as it caused the fabric to bunch up unpredictably.

“Instead of using the coil technique, we flattened the wires out into ribbons to give them a geometry much more suited to smooth bending motion, which is perfect for robotic fabrics,” said Buckner.

As the project was funded by the Air Force Office of Scientific Research, the researchers envision applications such as deployable and adaptive structures, active compression garments, smart cargo webbing, and reconfigurable RF antennas. “We believe this technology can be leveraged to create self-deploying tents, robotic parachutes, and assistive clothing,” says Kramer-Bottiglio. “Fabrics are a ubiquitous material used in a wide range of products, and the ability to ‘roboticize’ some of these products opens up many possibilities.”

References: Trevor L. Buckner, R. Adam Bilodeau, Sang Yup Kim, Rebecca Kramer-Bottiglio, “Roboticizing fabric by integrating functional fibers”, Proceedings of the National Academy of Sciences Sep 2020, 202006211; DOI: 10.1073/pnas.2006211117

Provided by Yale University