Australian Stinging Trees Make Spider- and Cone Snail-Like Venom (Botany)

A team of scientists from the University of Queensland and King’s College London has found that the venom of Australian Dendrocnide trees contains previously unidentified neurotoxic peptides and that the 3D structure of these pain-inducing peptides is reminiscent of spider and cone snail venoms targeting the same pain receptors, thus representing a remarkable case of inter-kingdom convergent evolution of animal and plant venoms.

Stinging nettles of the genus Dendrocnide produce potent neurotoxins: (A) sign at a North Queensland National Park advising caution around stinging trees; (B) Dendrocnide excelsa petioles are covered in stinging hairs; (C) scanning electron micrograph of trichome structure on the leaf of Dendrocnide moroides; (D-G) cutaneous reaction resulting from an accidental sting with Dendrocnide moroides documented with an iPhone XR and NEC G120W2 thermal imager, illustrating almost immediate local piloerection (arrowheads in D), development of wheals where stinging hairs penetrate the skin (arrows in E), as well as a long-lasting axon reflex erythema (arrows in F and G) and associated local increase in skin temperature (degrees Celsius); (H) HPLC chromatogram of trichome extract from Dendrocnide excelsa; diamonds indicate nocifensive responses elicited by intraplantar administration of individual fractions in vivo in C57BL6/J mice, with a single late-eluting peak identified as the main pain-causing fraction. Image credit: Irina Vetter, Thomas Durek & Darren Brown, University of Queensland.

Australia notoriously harbors some of the world’s most venomous animals, but although less well known, its venomous flora is equally remarkable.

The giant stinging tree (Dendrocnide excelsa) reigns superlative in size, with some specimens growing to 35 m (115 feet) tall along the slopes and gullies of eastern Australian rainforests. However, these members of the family Urticaceae are far more than oversized nettles.

Of the six species in the genus Dendrocnide native to the subtropical and tropical forests of Eastern Australia, the giant stinging tree and the mulberry-like stinging tree (Dendrocnide moroides) are particularly notorious for producing painful stings, which can cause symptoms that last for days or weeks in extreme cases.

Like other stinging plants such as nettles, the giant stinging tree is covered in needle-like appendages called trichomes that are around five millimeters in length — the trichomes look like fine hairs, but actually act like hypodermic needles that inject toxins when they make contact with skin.

Small molecules in the trichomes such as histamine, acetylcholine and formic acid have been previously tested, but injecting these did not cause the severe and long-lasting pain of the stinging tree, suggesting that there was an unidentified neurotoxin to be found.

The scientists found a completely new class of neurotoxin miniproteins that they termed ‘gympietides,’ after the Indigenous name for the plant.

Although they come from a plant, the gympietides are similar to spider and cone snail toxins in the way they fold into their 3D molecular structures and target the same pain receptors — this arguably makes the Gympie-Gympie tree a truly ‘venomous’ plant.

The long-lasting pain from the stinging tree may be explained by the gympietides permanently changing the sodium channels in the sensory neurons, not due to the fine hairs getting stuck in the skin.

By understanding how this toxin works, the researchers hope to provide better treatment to those who have been stung by the plant, to ease or eliminate the pain. They can also potentially use the gympietides as scaffolds for new therapeutics for pain relief.

References: Edward K. Gilding et al. 2020. Neurotoxic peptides from the venom of the giant Australian stinging tree. Science Advances 6 (38): eabb8828; doi: 10.1126/sciadv.abb8828 link: https://advances.sciencemag.org/content/6/38/eabb8828

Supercooled Water is Two Liquids in One (Chemistry)

The first-ever measurements of liquid water at temperatures between 135 K (minus 138.15 degrees Celsius, or minus 216.7 degrees Fahrenheit) and 235 K (minus 38.15 degrees Celsius, or minus 36.7 degrees Fahrenheit) provide evidence that it exists in two distinct structures that co-exist and vary in proportion dependent on temperature.
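The kelvin-to-Celsius and kelvin-to-Fahrenheit conversions quoted throughout this piece follow the standard formulas (°C = K − 273.15, °F = K × 9/5 − 459.67). As a quick sanity check, here is a minimal sketch; the function names are ours, for illustration only, not from the study:

```python
def kelvin_to_celsius(k):
    """Convert kelvin to degrees Celsius: C = K - 273.15."""
    return k - 273.15

def kelvin_to_fahrenheit(k):
    """Convert kelvin to degrees Fahrenheit: F = K * 9/5 - 459.67."""
    return k * 9 / 5 - 459.67

# The bounds of the measured range, 135 K to 235 K:
print(round(kelvin_to_celsius(135), 2), round(kelvin_to_fahrenheit(135), 2))
# -138.15 -216.67
print(round(kelvin_to_celsius(235), 2), round(kelvin_to_fahrenheit(235), 2))
# -38.15 -36.67
```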

Kringle et al. captured reversible changes in the structure of supercooled water using pulsed laser heating and infrared spectroscopy. Image credit: Timothy Holland, Pacific Northwest National Laboratory.

Liquid water at the most extreme possible temperatures has long been the subject of competing theories and conjecture.

Some scientists have asked whether it is even possible for water to truly exist as a liquid at temperatures as low as 190 K (minus 83.15 degrees Celsius, or minus 117.7 degrees Fahrenheit) or whether the odd behavior is just water rearranging on its inevitable path to a solid.

The researchers showed that liquid water at extremely cold temperatures is not only relatively stable, it exists in two structural motifs. Their findings resolve a long-standing controversy over whether or not deeply supercooled water always crystallizes before it can equilibrate. The answer is: no.

The new data, obtained using a sort of stop-motion snapshot of supercooled water, show that it can condense into a high-density, liquid-like structure.

This higher density form co-exists with a lower-density structure that is more in line with the typical bonding expected for water.

The proportion of high-density liquid decreases rapidly as the temperature goes from 245 K (minus 28.15 degrees Celsius, or minus 18.7 degrees Fahrenheit) to 190 K, supporting predictions of mixture models for supercooled water.

Dr. Kimmel and colleagues used infrared spectroscopy to observe water molecules trapped in a kind of stop motion when a thin film of ice got zapped with a laser, creating supercooled liquid water for a few fleeting nanoseconds.

According to the team, this research may help explain graupel, the fluffy pellets that sometimes fall during cool weather storms.

Graupel forms when a snowflake interacts with supercooled liquid water in the upper atmosphere.

Liquid water in the upper atmosphere is deeply cooled. When it encounters a snowflake it rapidly freezes and then in the right conditions, falls to Earth. It’s really the only time most people will experience the effects of supercooled water.

These studies may also help understand how liquid water can exist on very cold planets — Jupiter, Saturn, Uranus and Neptune — in our Solar System, and beyond. Supercooled water vapor also creates the beautiful tails that trail behind comets.

References: Loni Kringle et al. 2020. Reversible structural transformations in supercooled liquid water from 135 to 245 K. Science 369 (6510): 1490-1492; doi: 10.1126/science.abb7542

Paleontologists Discover New Species of Ornithopod Dinosaur (Paleontology)

A new genus and species of an early ornithopod dinosaur has been identified from two nearly complete skeletons found in China’s Liaoning Province.

Changmiania liaoningensis, an anterior part of the holotype in caudolateral view; red arrow indicates the emplacement of the gastrolith clusters. Image credit: Yang et al., doi: 10.7717/peerj.9832.

The newly-discovered dinosaur roamed Earth approximately 123 million years ago during the Early Cretaceous epoch.

The ancient creature belongs to Ornithopoda, a large group of mainly herbivorous bird-hipped dinosaurs.

Scientifically named Changmiania liaoningensis, the new species is the most basal member of the group described so far.

The two nearly complete and articulated skeletons of Changmiania liaoningensis were found by local farmers in the Lujiatun Beds of the Yixian Formation close to Lujiatun Village in western Liaoning Province.

Both individuals were likely entrapped in a collapsed underground burrow while they were resting, which would explain their perfect lifelike postures and the complete absence of weathering and scavenging traces.

The paleontologists hypothesize that the dinosaurs were killed catastrophically by lahar (volcanic mudflow) from a nearby shield volcano.

The holotype (top) and the referred specimen (bottom) of Changmiania liaoningensis in dorsal view; red arrows indicate the emplacement of the gastrolith clusters. Image credit: Yang et al., doi: 10.7717/peerj.9832.

The analysis of the specimens shows that Changmiania liaoningensis was an efficient cursorial (adapted to running) dinosaur and had adaptations to a fossorial (burrowing) behavior.

Some extant fossorial vertebrates dig with their head to some degree, using the top of their broad, firm heads to move, loosen, or compact soil.

The fused premaxillae and the spatulate shape of the dorsal surface of the snout in Changmiania liaoningensis could represent such an implement.

Its postcranial skeleton shares a series of morphological characteristics with extant scratch-digging mammals, including a shortened neck (six cervical vertebrae), a radius only 70% as long as the humerus (upper arm bone), and short hands.

The hip of Changmiania liaoningensis exhibits some features that might also tentatively be related to a digging behavior.

Extant mammals that dig with the forefeet usually brace with their hindfeet, often supplemented by the tail serving as a prop.

The leg of Changmiania liaoningensis is about twice as long as its hand and its tibia (shin bone) is significantly longer than its femur (thigh bone), as in most other small basal ornithopods except Koreanosaurus.

Those leg proportions suggest that Changmiania liaoningensis basically remained an efficient cursorial dinosaur.

Moreover, the hand and skull modifications remain rather modest, so that Changmiania liaoningensis was obviously not a true subterranean animal, but more likely a facultative digger.

References: Y. Yang et al. 2020. A new basal ornithopod dinosaur from the Lower Cretaceous of China. PeerJ 8: e9832; doi: 10.7717/peerj.9832 link: https://peerj.com/articles/9832/

Are Your Friends Cooler Than You Are? Blame The Friendship Paradox (Psychology)

Have you ever felt like everyone else has so much more to be thankful for? Check your Facebook or Instagram feed: Your friends seem to dine at finer restaurants, take more exotic vacations, and have more accomplished children. They even have cuter pets!

Rest assured, it’s an illusion, one that’s rooted in a property of social networks known as the friendship paradox. The paradox, first formulated by sociologist Scott Feld, states that “your friends are more popular than you are, on average.” This property combines with other peculiarities of social networks to create an illusion.

What the friendship paradox means is this: If I asked you who your friends are, and then I met them, on the whole, I would find them to be better socially connected than you. Of course, if you are an exceptionally gregarious person, the paradox won’t apply to you. But for most of us, it is likely to hold.

While this paradox can occur in any social network, it is rampant online. One study found that 98 percent of Twitter users subscribe to accounts that have more followers than they themselves do.

Although it sounds strange, the friendship paradox has a simple mathematical explanation.

Each person’s social circle of friends is different. Most of us have some friends, and then there are well-connected people like David Rockefeller, the onetime CEO of Chase Manhattan Bank, whose address book included more than 100,000 people!

On social media, celebrities like Justin Bieber can have more than 100 million followers. It’s this small group of hyperconnected people — people with many friends, who are part of your social circle — that increases the average popularity of your friends.

This is the mathematical double whammy at the heart of the friendship paradox. Not only does the extraordinary popularity of people like Justin Bieber skew the average popularity of friends for anyone they are connected to, but even though people like him are rare, they also appear in an extraordinary number of social circles.
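The double whammy described above is easy to see in a quick simulation. The toy network below, one hyperconnected hub plus a ring of ordinary people, is our own illustrative construction, not data from Feld's study:

```python
def average_degrees(adjacency):
    """Return (average degree, average degree of a randomly chosen friend).

    Friends are sampled by following a random friendship endpoint, so a
    popular person is counted once for every social circle they appear in."""
    degrees = {v: len(nbrs) for v, nbrs in adjacency.items()}
    avg_degree = sum(degrees.values()) / len(degrees)
    friend_degrees = [degrees[f] for nbrs in adjacency.values() for f in nbrs]
    avg_friend_degree = sum(friend_degrees) / len(friend_degrees)
    return avg_degree, avg_friend_degree

# Toy network: person 0 is a hyperconnected hub who knows everyone;
# the other 99 people also form a ring, each knowing two neighbours.
n = 100
adjacency = {v: set() for v in range(n)}
for v in range(1, n):
    adjacency[0].add(v)
    adjacency[v].add(0)
    w = v % (n - 1) + 1  # next person around the ring
    adjacency[v].add(w)
    adjacency[w].add(v)

avg, avg_friend = average_degrees(adjacency)
print(avg, avg_friend)  # 3.96 27.0
```

One hub out of a hundred people is enough to make the average friend roughly seven times better connected than the average person, exactly the skew the paradox describes.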

And the friendship paradox is not a mere mathematical curiosity. It has useful applications in forecasting trends and monitoring disease. Researchers have used it to predict trending topics on Twitter weeks before they became popular and to spot flu outbreaks in their early stages and devise efficient strategies to manage the disease.

Here’s how it can work: Imagine, for example, that you arrive in an African village with only five doses of Ebola vaccine. The best strategy is not to vaccinate the first five people you happen to meet but to ask those people who their friends are and vaccinate those five friends. If you do this, you are likely to pick people who have wider social circles and thus would infect more people were they to get sick. Vaccinating friends would be more effective at stopping the spread of Ebola than inoculating random people who may be on the periphery of a social network and not connected to many others.
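The dose-allocation strategy in this passage is sometimes called acquaintance immunization. A rough simulation might look like the following; the village network, seed, and dose count are illustrative assumptions, not from any cited study:

```python
import random

def vaccinate_friends(adjacency, doses, seed=0):
    """Acquaintance immunization sketch: repeatedly pick a random person,
    then vaccinate a random *friend* of that person. Named friends tend to
    be better connected than uniformly sampled people.

    Assumes the network contains at least `doses` distinct nameable friends."""
    rng = random.Random(seed)
    people = list(adjacency)
    vaccinated = set()
    while len(vaccinated) < doses:
        person = rng.choice(people)
        friends = list(adjacency[person])
        if friends:
            vaccinated.add(rng.choice(friends))
    return vaccinated

# Toy village: one person everyone knows, plus 20 villagers who
# each know only that hub.
village = {'hub': [f'p{i}' for i in range(20)]}
village.update({f'p{i}': ['hub'] for i in range(20)})

print(vaccinate_friends(village, doses=5))
# the well-connected 'hub' is almost always among the five
```

Because a named friend is sampled in proportion to how many circles they appear in, the hub is vaccinated far more reliably than it would be by dosing five people at random.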

There’s more. Remarkably, a stronger version of the friendship paradox holds for many people: Most of your friends have more friends than you do. Let that sink in. I’m no longer talking about averages, where a single exceptionally popular friend could skew the average popularity of your friends.

What this means is that the majority of your friends are better socially connected than you are. Go ahead and try it for yourself. Click on the name of each friend on Twitter and see how many followers they have and how many accounts they are following. I am willing to bet that most numbers are bigger than yours.

Stranger still, this paradox holds not just for popularity but for other traits as well, like enthusiasm for using social media, dining at fine restaurants, or taking exotic vacations. As a concrete example, consider how frequently someone posts updates on Twitter.

It is true that most of the people you follow post more status updates than you do. Also, most of the people you follow receive more novel and diverse information than you do. And most of the people you follow receive more viral information that ends up spreading much farther than what you see in your feed.

This stronger version of the friendship paradox can lead to a “majority illusion,” in which a trait that is rare in a network as a whole appears to be common within many social circles. Imagine that few people, in general, are redheads, yet it appears to many people that most of their friends have red hair. All it takes for the illusion that “red hair is common” to take hold is for a few hyperconnected influencers to be redheads.
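The redhead example can be made concrete with a toy star network, in which a single red-haired hub makes red hair look locally universal even though it is globally rare; the network and names here are purely illustrative:

```python
def local_redhead_fraction(adjacency, redheads):
    """For each person, the fraction of their friends who are redheads."""
    return {person: sum(friend in redheads for friend in friends) / len(friends)
            for person, friends in adjacency.items() if friends}

# Star network: one red-haired hub, 50 followers who each know only the hub.
n = 50
adjacency = {'hub': [f'p{i}' for i in range(n)]}
adjacency.update({f'p{i}': ['hub'] for i in range(n)})
redheads = {'hub'}

fractions = local_redhead_fraction(adjacency, redheads)
global_rate = len(redheads) / len(adjacency)        # ~2% of the network
sees_majority = sum(frac > 0.5 for frac in fractions.values())
print(round(global_rate, 3), sees_majority)  # 0.02 50
```

Only about 2% of this network has red hair, yet all 50 ordinary people see a friend group that is 100% redheaded: the majority illusion in miniature.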

The majority illusion can explain why you may notice that your friends seem to be doing more exciting things: People who are more socially connected disproportionately influence what we see and learn on social media. This helps explain why adolescents overestimate the prevalence of binge drinking on college campuses and why some topics appear to be more popular on Twitter than they really are.

The majority illusion can distort your perceptions of the lives of others. People who are better socially connected than the rest of us may also do more notable things, like dining at Michelin-starred restaurants or vacationing in Bora Bora. They are also more active on social media and more likely to Instagram their lives, distorting our perceptions of how common those things are. A good way to mitigate the illusion is to stop comparing yourself to friends and be thankful for what you have.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

This Is Why You Should Sleep In A Cold Bedroom (Biology)

Whether you like to bundle up with several layers of blankets or sleep on a bare mattress exposed to the elements, there’s no one right way to sleep. But if you’re trying to maintain a healthy weight, there might be: Research shows that sleeping in a cold room could help boost your metabolism and make you burn more calories, even during the day.

When you were a baby, you had two types of fat. White fat is what you usually imagine when you think about fat. It stores calories; that’s pretty much all it does. Brown fat, conversely, is what you’d call metabolically active; it burns calories to generate heat. Babies haven’t yet developed the ability to shiver, so they need another way to stay warm — brown fat to the rescue. Once you got older and found other ways to maintain your body temperature, you lost most of your brown fat. Another sad truth about being human.

But a 2014 study published in the journal Diabetes suggested that you might be able to boost your body’s levels of brown fat by sleeping in a chilly environment. The researchers recruited five healthy male volunteers to sleep in climate-controlled rooms at the National Institutes of Health for four months (hospital scrubs and light sheets were provided — what luxury!). The researchers kept the men’s calorie intake controlled by providing all of their meals. For the first month, the men slept with the thermostat set to a neutral 75 degrees Fahrenheit (24 degrees Celsius). The next month, it was set to a chilly 66 degrees F (19 degrees C), then was reset to neutral for a month. They spent a final month sleeping in a toasty 81 degrees F (27 degrees C).

The cold temperatures had a big effect on the men’s bodies. After a month sleeping in the cold, the men had almost doubled their stores of brown fat, which helped improve their insulin sensitivity — a measure that’s affected by shifts in blood sugar, and is generally used as a sign of metabolic health. They even burned more calories during the day. But as quickly as the improvements came about, they were easily undone; the month of sleeping in warm temperatures actually reduced their brown fat to pre-experiment levels. Even still, that’s good news. To supercharge your metabolism, it may take just a month of chilly slumber.

Sleep Helps Your Brain Forget Unnecessary Memories (Neuroscience)

For something so simple and so universal, sleep has long baffled scientists. We still don’t really know why virtually every higher-order animal spends about a third of their day curled up with their eyes closed instead of doing something more productive, like hunting or running an Etsy shop. But researchers have recently discovered that sleep is necessary for the brain to perform an essential task: forgetting everything it learned during the day that it doesn’t need to remember for tomorrow.

All day long, your brain is busy gathering data and growing synapses in order to access that data. And we do mean all day long. Everything you experience gets logged and archived, from the important project your boss assigned you to what sandwich is on special at the local deli. When that data gets logged, your brain sprouts a new synapse to the neuron cluster that contains that information. But just because that information is there, that doesn't mean it's doing anything useful. It might just be taking up space—or worse, interfering with other memories and causing you to call Tonya "Tawny" by accident. That's why it's important for your brain to evaluate the important memories and clear out the rest.

According to a pair of studies published in Science, sleep is when the majority of this pruning takes place. In fact, it might be one of the main reasons we do it at all. The first study found that, during sleep, the synapses of mice would undergo numerous changes, including some that left them weakened and inaccessible. The second study went a bit further, noting that the targeted synapses were the smaller, weaker ones, and suggesting that this process might be a core function of sleep. It all adds up to a brain that won't accidentally prioritize a reuben on rye over the big work project—assuming that's where your priorities lie.

Men And Women Remember Pain Differently (Psychology)

There may be variations, based on sex, in the way that both mice and humans remember pain, according to new research.

Scientists increasingly believe that one of the driving forces in chronic pain — the number one health problem in both prevalence and burden — appears to be the memory of earlier pain.

The researchers found that men (and male mice) remembered earlier painful experiences clearly. As a result, they felt stress and were hypersensitive to later pain when they returned to the location where they’d experienced it. Earlier experiences of pain didn’t seem to stress women (and female mice).

The researchers believe that the robust translational nature of the results, from mice to humans, will help scientists move forward in their search for future treatments of chronic pain. It was a discovery that came as a total surprise.

“We set out to do an experiment looking at pain hypersensitivity in mice and found these surprising differences in stress levels between male and female mice,” explains senior author Jeffrey Mogil, professor of pain studies at McGill University’s psychology department and the Alan Edwards Centre for Research on Pain.

“So we decided to extend the experiment to humans to see whether the results would be similar. We were blown away when we saw that there seemed to be the same differences between men and women as we had seen in mice.”

“What was even more surprising was that the men reacted, because it is well known that women are both more sensitive to pain than men and that they are also generally more stressed out,” says first author Loren Martin, an assistant professor of psychology at the University of Toronto Mississauga.

In experiments with both humans and mice, researchers took the subjects (41 men and 38 women between the ages of 18 and 40) to a specific room (or put them in a testing container of a certain shape — in the case of the mice) where they experienced low levels of pain from heat on their hind paw or forearm. Humans rated the level of pain on a 100-point scale and mice “rated” the pain by how quickly they moved away from the heat source.

Immediately following this initial experience of low-level pain, subjects experienced more intense pain designed to act as Pavlovian conditioning stimuli. Researchers asked the human subjects to wear a tightly inflated blood pressure cuff and exercise their arms for 20 minutes. This is excruciating and only seven of the 80 subjects rated it at less than 50 on a 100-point scale. Each mouse received a diluted injection of vinegar designed to cause a stomach ache for about 30 minutes.

In order to look at the role that memory plays in the experience of pain, the following day human subjects returned to either the same or a different room, and researchers put mice in the same or a different testing container. Researchers again applied heat to their arms or hind paws.

When (and only when) they went into the same room as in the previous test, men rated the heat pain higher than they did the day before, and higher than the women did. Similarly, male, but not female mice returning to the same environment exhibited a heightened heat pain response, while mice placed in a new and neutral environment did not.

“We believe that the mice and the men were anticipating the cuff, or the vinegar, and, for the males, the stress of that anticipation caused greater pain sensitivity,” says Mogil. “There was some reason to expect that we would see increased sensitivity to pain on the second day, but there was no reason to expect it would be specific to males. That came as a complete surprise.”

In order to confirm that pain increased due to memories of previous pain, the researchers interfered with memory by injecting the brains of male mice with a drug called ZIP that blocks memory. When the researchers then ran the pain memory experiment, these mice showed no signs of remembered pain.

“This is an important finding because increasing evidence suggests that chronic pain is a problem to the extent that you remember it, and this study is the first time such remembered pain has been shown using a translational — both rodent and human subject — approach,” says Martin.

“If remembered pain is a driving force for chronic pain and we understand how pain is remembered, we may be able help some sufferers by treating the mechanisms behind the memories directly.”

Mogil echoes this optimism, “This research supports the idea that the memory of pain can affect later pain.”

“I think it is appropriate to say that further study of this extremely robust phenomenon might give us insights that may be useful for future treatment of chronic pain, and I don’t often say that! One thing is for sure, after running this study, I’m not very proud of my gender,” he adds.

The Canadian Institutes for Health Research, the Natural Sciences and Engineering Research Council of Canada, the Canadian Pain Society/Pfizer Early Career Investigator Pain Research Grant, the Louise and Alan Edwards Foundation, Brain Canada, and the Canada Research Chairs Program funded the research.

This article is republished from Futurity under a Creative Commons license. Read the original article.

There’s Such A Thing As An End-Of-Decade Crisis (Psychology)

When you turned 29 (or 19, or 49, or 89 for that matter), we bet that wasn’t the number on your mind. You were probably thinking how close that brought you to 30—a whole new decade. Did you look back on your life and assess how well you’d lived it until that point? Did that make you want to do something out of the ordinary—even dangerous? You’re not alone. Studies show that when you approach a new decade in age, you start to search for meaning. In essence, you have an end-of-decade crisis.

The question that comes up when you’re assessing your life until that point is often: what did it all mean? Has your life had purpose? Value? Is the world better off with you in it? According to a 2014 paper published in PNAS by Adam L. Alter and Hal E. Hershfield, when people feel that some part of their life lacks meaning, they respond in one of two ways: adaptively, by doing something that helps them find meaning; or maladaptively, by doing something that submerges them further into meaninglessness.

The data bears this out. On the adaptive side, Alter and Hershfield found that people whose ages end in 9 (which they referred to as “9-enders”) are more likely to reach for an athletic milestone: they’re overrepresented among first-time marathon runners. Among those who have already run a marathon, 9-enders are also more likely to achieve better times than people of other ages, suggesting they trained harder. On the maladaptive side, things aren’t so pretty. The 9-enders are also more likely to self-sabotage: they’re overrepresented on dating sites that specialize in extramarital affairs, and they have a higher rate of suicide than people of any other age.

So you’ve turned the big X-9 and find that life lacks meaning. How do you make sure you don’t cheat on your partner or deal with suicidal thoughts? According to psychologist Dr. Margaret Rutherford, it helps to give your new decade a “theme.” Do you regret not going back to school in your 20s? Make your 30s all about improving your education and skill set. Are you leaving your 30s without many close friends or a lasting romantic relationship? Make your 40s all about connecting with others. “Giving a new decade an initial desired focus helps you enter it with a sense of creativity and positive expectation,” says Dr. Rutherford.

Your Smartphone Is Designed To Hack Your Brain (Psychology)

The word “hack” gets thrown around a lot these days. “Life hacks” include everything from life-changing study techniques to using a shoe-organizer to organize things besides shoes. And then there are “brain hacks”, which supposedly teach us to access powers we didn’t know our brains possessed. But there’s a more insidious form of brain-hacking — when your brain is the thing being hacked. And smartphone developers are doing it to you all the time.

“This thing is a slot machine,” says former Google Design Ethicist Tristan Harris, holding up his phone. “Every time I check my phone, I’m playing the slot machine to see, ‘What did I get?'” It’s incredibly addictive, especially since you don’t have to pay a single cent to pull the lever. And smartphone developers know that, so they’ve designed their software to tickle your rewards center.

One example? When you get “likes” on Instagram, you don’t necessarily find out when they happen. Instead, the ‘gram sometimes saves up your notifications and delivers them all in one big burst. That kind of a windfall can feel like a rush, even if what you won is essentially valueless. And it keeps you coming back for more.

It’s all about “intermittent variable rewards”, which encourage you to keep on checking your smartphone over and over because it might pay off: maybe that guy you’re fighting on Twitter has posted an asinine reply, maybe your friends have responded to the picture you uploaded, maybe something hilarious and exciting is going down and you’re the last to know about it. Whatever the reward, there’s a chance that it’s waiting for you on your phone — and there’s a chance it’s not, as well. The only way to find out is to check it, and check it, and check it, ad infinitum.

Of course, your reward center isn’t the only primal heartstring your phone knows how to tug. When you upload a group picture, Facebook tries to guess who’s in it and encourages you to tag them. That makes you feel connected to your friends and family, and when you follow through on the suggestion, it draws them back in as well. Snapchat makes a game out of users’ habits by tracking how many days in a row they’ve snapped something — you gotta keep that streak going.

The only question left to answer is “why” — if you’re not gambling with money when you hit that slot machine, then what do the companies get out of your addictive use? The answer, ominously, is you. The more they can encourage users to stay logged in, to keep returning to the well, the more they can charge their advertisers. It’s like they say: “If you’re not paying, you’re the product.”

Now, we’re not saying that you have to give up your social media entirely. But it’s worthwhile to take a minute to recognize what you’re getting out of the accounts that you’ve signed up for. And once you realize that, you might figure out a better, healthier way to scratch that itch.

To Harris, the problem starts with tech companies assuming, and behaving as if, their technology is neutral. And the only solution comes in a redesign from the technology out. In other words, the attention economy is inherently flawed because it will inevitably lead to more and more powerful hacks meant to hijack your brain and direct it in the most profitable direction.

But there are ways to start dealing with a personal smartphone addiction at home — and they aren’t much different from breaking any other addiction. The Week provides a set of five suggestions that would be pretty useful for any narcotic:

• First, say “I don’t,” not “I can’t.” That takes some of the pressure off and reminds you that it’s not that you can’t, it’s that it’s not who you want to be.
• Next, try making your phone inaccessible. That could be as simple as leaving it in one room to charge while you stay in another. But the longer it’s in your pocket, the more it preys on your mind.
• Try setting a stopping rule. That might be something like, “I don’t go past the first page of Reddit.” Voila — you’re no longer losing hours to the internet.
• The next tip is to replace your bad habits with habits you want to encourage instead. That’s as simple as picking up a book.
• And finally, be ready for pushback. Your brain doesn’t react well to losing its addictions.

And it’s important to forgive yourself for relapsing. But stick to all of these methods, and you’ll have your smartphone habit well in hand in no time.