Emotion Recognition in Dementia

Advancing technology for multimodal analysis of emotion expression in everyday life

Emotional expression plays a crucial role in everyday functioning. It is a continuous process involving many features of behavioral, facial, vocal, and verbal modalities. Given this complexity, few psychological studies have addressed emotion recognition in an everyday context. Recent technological innovations in affective computing could result in a scientific breakthrough, as they open up new possibilities for the ecological assessment of emotions. However, existing technologies still pose major challenges in the field of Big Data Analytics.

Little is known about how these lab-based technologies generalize to real-world problems. Rather than offering a one-size-fits-all solution, existing tools need to be adapted to specific user groups in more natural settings, and they need to take large individual differences into account. We take up these challenges by studying emotional expression in dementia. In this context, emotional functioning is strongly at risk, yet highly important for maintaining quality of life in person-centered care. Our domain-specific goal is to gain a better understanding of how dementia affects emotion expression.

We carry out a pilot study, a comparative study between Alzheimer's patients and matched healthy older adults, and a longitudinal study on the development of emotion expression in Alzheimer's patients over time. We develop a unique corpus, use state-of-the-art machine learning techniques to advance technologies for multimodal emotion recognition, and develop visualization and statistical models to assess multimodal patterns of emotion expression. We test their usability in a workshop for international researchers and make them available through the eScience Technology Platform.
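To give a concrete impression of what such a multimodal pipeline can involve, the sketch below combines per-time-window facial, vocal, and verbal features through simple feature-level fusion and classifies them with scikit-learn. All feature names, dimensions, and labels are hypothetical placeholders for illustration only; they do not describe the project's actual corpus, features, or models.

```python
# Hypothetical sketch of feature-level fusion for multimodal emotion
# classification; all inputs below are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic per-time-window features for three modalities (stand-ins for,
# e.g., facial action unit intensities, acoustic descriptors, and verbal
# sentiment scores).
n_windows = 200
facial = rng.normal(size=(n_windows, 17))
vocal = rng.normal(size=(n_windows, 12))
verbal = rng.normal(size=(n_windows, 5))
labels = rng.integers(0, 3, size=n_windows)  # e.g. negative / neutral / positive

# Feature-level (early) fusion: concatenate modalities per window, then classify.
X = np.hstack([facial, vocal, verbal])
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, labels, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```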

Participating organisations

Netherlands eScience Center
University Medical Center Groningen
University of Twente
Life Sciences

Team

Peter Kok
eScience Research Engineer
Netherlands eScience Center
Gerben Westerhof
Principal investigator
University of Twente
Ben de Vries
eScience Research Engineer
Netherlands eScience Center
Vincent van Hees
Senior eScience Research Engineer
Netherlands eScience Center

Related projects

MEXCA

Multimodal Emotion Expression Capture Amsterdam

Rethinking risk of falls in stroke survivors

Deviations in gait kinematics using machine learning

Automated video-based movement assessment

Automated video-based movement assessment using machine learning to support personalized treatment of movement disorders

ePODIUM

Early prediction of dyslexia in infants using machine learning

Understanding visually grounded spoken language via multi-tasking

An alternative approach for intelligent systems to understand human speech

Genetics of sleep patterns

Detecting human sleep from wearable accelerometer data without the aid of sleep diaries

What Works When for Whom?

Advancing therapy change process research

From Sentiment Mining to Mining Embodied Emotions

Emotional styles on the Dutch stage between 1600 and 1800

Dr. Watson

Medical experts helping machines diagnose

Related software

Emo-spectre

Interactive visualization and exploration of multimodal time-series data and video

emvoice

A Python package for computing emotion expression-related features from speech signals.

Xenon

If you are using remote machines to do your computations, and don’t feel like learning and implementing many different APIs, Xenon is the tool for you.
