
Bedside EEG Test Can Aid Prognosis in Unresponsive Brain Injury Patients (Neuroscience)

Assessing the ability of unresponsive patients with severe brain injury to understand what is being said to them could yield important insights into how they might recover, according to new research.

A team at the University of Birmingham has shown that responses to speech can be measured using electroencephalography (EEG), a non-invasive technique used to record electrical signals in the brain. The strength of these responses can be used to provide an accurate prognosis that can help clinicians make the most effective treatment decisions.

Significantly, the assessments can be made while the patient is still in intensive care and do not require any conscious response from the patient – they do not have to ‘do’ anything.

In the study, published in Annals of Neurology, the team assessed 28 patients with acute traumatic brain injury (TBI) who were not under sedation, and who failed to obey commands. The patients were assessed within just a few days of their injury. They were played streams of sentences and phrases made up of monosyllabic words while their brain activity was monitored using EEG.

In healthy individuals, EEG activity only synchronises with the rhythm of phrases and sentences when listeners consciously comprehend the speech. The researchers assessed the unresponsive patients’ level of comprehension by measuring the strength of this synchronisation.
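
The analysis in the paper is more sophisticated, but the core idea, quantifying how consistently the EEG phase follows the slow rhythm of phrases, can be sketched briefly. The Python snippet below is a hypothetical illustration rather than the authors' pipeline: it band-pass filters single-channel EEG epochs around an assumed phrase rate of about 1 Hz and computes inter-trial phase coherence as a tracking-strength score. The sampling rate, phrase rate, bandwidth and data shapes are all assumptions made for the example.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def phrase_rate_tracking(epochs, fs=250.0, phrase_rate=1.0, bandwidth=0.5):
    """Inter-trial phase coherence (ITPC) around an assumed phrase rate.

    epochs: array of shape (n_trials, n_samples) for one EEG channel.
    Values near 1 mean the EEG phase locks consistently to the phrase
    rhythm across trials; values near 0 mean no consistent tracking.
    """
    # Band-pass filter around the assumed phrase rate (e.g. 0.5-1.5 Hz).
    sos = butter(3, [phrase_rate - bandwidth, phrase_rate + bandwidth],
                 btype="band", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, epochs, axis=1)

    # Instantaneous phase from the analytic signal.
    phase = np.angle(hilbert(filtered, axis=1))

    # ITPC: length of the mean unit phase vector across trials, averaged over time.
    itpc = np.abs(np.mean(np.exp(1j * phase), axis=0))
    return itpc.mean()

# Synthetic example: 30 ten-second trials at 250 Hz with a weak 1 Hz rhythm in noise.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 250.0)
epochs = 0.5 * np.sin(2 * np.pi * 1.0 * t) + rng.standard_normal((30, t.size))
print(f"tracking strength: {phrase_rate_tracking(epochs):.2f}")
```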

The researchers were able to follow up 17 of the patients three months following their injury, and 16 of the patients after six months. They found the outcomes significantly correlated with the strength of the patients’ response to speech measured by the EEG.
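
As a purely hypothetical illustration of how such a prognostic association could be tested (the numbers below are synthetic and not the study's data), a rank correlation between the EEG tracking measure and a follow-up outcome score might be computed like this:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical, synthetic values: EEG tracking strength at the bedside and a
# later outcome score (higher = better recovery) for a handful of patients.
tracking = np.array([0.12, 0.31, 0.05, 0.44, 0.27, 0.09, 0.38])
outcome = np.array([2, 5, 1, 7, 4, 2, 6])

rho, p = spearmanr(tracking, outcome)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```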

Patients with traumatic brain injury are commonly assessed by their behaviour or by a CT scan, but those who remain unresponsive pose a significant challenge. Recent studies have shown that some TBI patients can ‘imagine’ themselves following commands, and this activity can also be tracked using EEG. However, that approach requires a fairly sophisticated response from the patient, so patients with lower cognitive capacity may be overlooked.

Lead author Dr Damian Cruse is based at the University of Birmingham’s School of Psychology and Centre for Human Brain Health. He explains: “The strength of our approach is that we can measure a scale of comprehension without needing any other sort of response from the patient. This insight could significantly reduce prognostic uncertainty at a critical point. It could help clinicians make more appropriate decisions about whether or not to continue life-sustaining therapy – and also ensure rehabilitation resources are allocated to patients who are most likely to benefit.”

Reference: Cruse et al. (2020). ‘Covert speech comprehension predicts recovery from acute post-traumatic unresponsive states.’ Annals of Neurology.

Provided by University of Birmingham

Using a Video Game to Understand The Origin of Emotions (Neuroscience)

Emotions are complex phenomena that influence our minds, bodies and behavior. A number of studies have sought to connect given emotions, such as fear or pleasure, to specific areas of the brain, but without success. Some theoretical models suggest that emotions emerge through the coordination of multiple mental processes triggered by an event. These models involve the brain orchestrating adapted emotional responses via the synchronization of motivational, expressive and visceral mechanisms.

The transient synchronization between the different emotional components corresponds to an emotional state. Credit: UNIGE/LEITAO

To investigate this hypothesis, a research team from the University of Geneva (UNIGE) studied brain activity using functional MRI. They analyzed the feelings, expressions and physiological responses of volunteers while they were playing a video game that had been specially developed to arouse different emotions depending on the progress of the game. The results, published in the journal PLOS Biology, show that different emotional components recruit several neural networks in parallel distributed throughout the brain, and that their transient synchronization generates an emotional state. The somatosensory and motor pathways are two of the areas involved in this synchronization, thereby validating the idea that emotion is grounded in action-oriented functions in order to allow an adapted response to events.

Most studies use passive stimulation to understand the emergence of emotions: they typically present volunteers with photos, videos or images evoking fear, anger, joy or sadness while recording the cerebral response using electroencephalography or imaging. The goal is to pinpoint the specific neural networks for each emotion. “The problem is, these regions overlap for different emotions, so they’re not specific,” begins Joana Leitão, a post-doctoral fellow in the Department of Fundamental Neurosciences (NEUFO) in UNIGE’s Faculty of Medicine and at the Swiss Centre for Affective Sciences (CISA). “What’s more, it’s likely that, although these images represent emotions well, they don’t evoke them.”

A question of perspective

Several neuroscientific theories have attempted to model the emergence of an emotion, although none has so far been proven experimentally. The UNIGE research team subscribe to the postulate that emotions are “subjective”: two individuals faced with the same situation may experience a different emotion. “A given event is not assessed in the same way by each person because the perspectives are different,” continues Dr. Leitão.

In a theoretical model known as the component process model (CPM) – devised by Professor Klaus Scherer, the retired founding director of CISA – an event will generate multiple responses in the organism. These relate to components of cognitive assessment (novelty or concordance with a goal or norms), motivation, physiological processes (sweating or heart rate), and expression (smiling or shouting). In a situation that sets off an emotional response, these different components influence each other dynamically. It is their transitory synchronization that might correspond to an emotional state.

Emotional about Pacman

The Geneva neuroscientists devised a video game to evaluate the applicability of this model. “The aim is to evoke emotions that correspond to different forms of evaluation,” explains Dr. Leitão. “Rather than viewing simple images, participants play a video game that puts them in situations they’ll have to evaluate so they can advance and win rewards.” The game is an arcade game similar to the famous Pacman: players have to grab coins, touch the “nice monsters,” ignore the “neutral monsters” and avoid the “bad guys” to win points and pass to the next level.

The scenario involves situations that trigger the four components of the CPM differently. At the same time, the researchers were able to measure brain activity via imaging; facial expression by analyzing the zygomatic muscles; feelings via questions; and physiology by skin and cardiorespiratory measurements. “All of these components involve different circuits distributed throughout the brain,” says the Geneva-based researcher. “By cross-referencing the imaging data with computational modeling, we were able to determine how these components interact over time and at what point they synchronize to generate an emotion.”
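
The study itself combined fMRI with computational modeling; the sketch below is only a loose, hypothetical illustration of the general notion of transient synchronization, not the authors' method. It takes several component time series (for example appraisal, motivation, physiology and expression, here simulated) and computes a sliding-window index of how strongly they co-vary, flagging the window where synchronization peaks. Function names, window sizes and the synthetic data are all assumptions.

```python
import numpy as np

# Illustrative sketch, not the study's model: detect windows in which several
# emotion-component time series (rows) become transiently synchronized.

def synchronization_index(components, win=20, step=5):
    """Mean absolute pairwise correlation in each sliding window.

    components: array of shape (n_components, n_timepoints), e.g. rows for
    appraisal, motivation, physiology and expression (hypothetical signals).
    Returns window start indices and a 0-1 synchronization index per window.
    """
    n_comp, n_time = components.shape
    starts = np.arange(0, n_time - win + 1, step)
    sync = []
    for s in starts:
        window = components[:, s:s + win]
        corr = np.corrcoef(window)                # pairwise correlations in window
        upper = corr[np.triu_indices(n_comp, k=1)]
        sync.append(np.abs(upper).mean())         # how strongly components co-vary
    return starts, np.array(sync)

# Synthetic example: four components that align only during a brief "event".
rng = np.random.default_rng(1)
signals = rng.standard_normal((4, 300))
event = np.sin(np.linspace(0, 4 * np.pi, 60))
signals[:, 120:180] += 2 * event                  # shared burst = transient synchrony

starts, sync = synchronization_index(signals)
print("most synchronized window starts at t =", starts[np.argmax(sync)])
```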

A made-to-measure emotional response

The results also indicate that a region deep in the brain called the basal ganglia is involved in this synchronization. This structure is known to act as a convergence point between multiple cortical regions, each equipped with specialized affective, cognitive or sensorimotor processes. The other regions involved include the sensorimotor network, the posterior insula and the prefrontal cortex. “The involvement of the somatosensory and motor zones accords with theories that consider emotion a preparatory mechanism for action, enabling the body to respond adaptively to events,” concludes Patrik Vuilleumier, full professor at NEUFO and senior author of the study.

Reference: Leitão et al. (2020). ‘Computational imaging during video game playing shows dynamic synchronization of cortical and subcortical networks of emotions.’ PLOS Biology. DOI: 10.1371/journal.pbio.3000900. https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3000900

Provided by University of Geneva