Abstract
Purpose: The purpose of this paper is to propose a novel emotion recognition algorithm from multimodal physiological signals for emotion-aware healthcare systems. In this work, physiological signals are collected from respiratory belt (RB), photoplethysmography (PPG), and fingertip temperature (FTT) sensors. These signals are used because their collection has become easy with advances in ergonomic wearable technologies. Methods: Arousal and valence levels are recognized from the fused physiological signals by exploiting the relationship between physiological signals and emotions. This recognition is performed using various machine learning methods, such as random forest, support vector machine, and logistic regression, and the performance of these methods is studied. Results: Using decision-level fusion, the accuracy improved from 69.86% to 73.08% for arousal, and from 69.53% to 72.18% for valence. These results indicate that using multiple sources of physiological signals and fusing them increases the accuracy of emotion recognition. Conclusion: This study demonstrated a framework for emotion recognition using multimodal physiological signals from respiratory belt, photoplethysmography, and fingertip temperature sensors. It is shown that decision-level fusion of multiple classifiers (one per signal source) improved the accuracy of emotion recognition for both the arousal and valence dimensions.
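The decision-level fusion described in the abstract can be sketched as follows: one classifier is trained per signal source (RB, PPG, FTT), and their individual predictions are combined by majority vote. This is a minimal illustrative sketch, not the authors' implementation; the feature matrices are synthetic random data (the abstract does not describe feature extraction), the classifier settings are assumptions, and the binary label stands in for a high/low arousal or valence level.

```python
# Hedged sketch of decision-level fusion across three physiological modalities.
# All data is synthetic; feature extraction from RB/PPG/FTT signals is not
# described in the abstract, so each modality is a random feature matrix.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200
# One synthetic feature matrix per modality (RB, PPG, FTT) -- assumed shapes.
X_rb, X_ppg, X_ftt = (rng.normal(size=(n, 8)) for _ in range(3))
y = rng.integers(0, 2, size=n)  # binary label, e.g. high vs. low arousal

# One classifier per signal source, as in the paper's decision-level fusion;
# the pairing of model to modality here is arbitrary.
clf_rb = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_rb, y)
clf_ppg = SVC(random_state=0).fit(X_ppg, y)
clf_ftt = LogisticRegression(max_iter=1000).fit(X_ftt, y)

# Decision-level fusion: majority vote over the three per-modality decisions.
preds = np.stack([
    clf_rb.predict(X_rb),
    clf_ppg.predict(X_ppg),
    clf_ftt.predict(X_ftt),
])
fused = (preds.sum(axis=0) >= 2).astype(int)  # at least 2 of 3 votes
print(fused.shape)  # (200,)
```

The same pattern extends to feature-level fusion by concatenating the per-modality feature matrices and training a single classifier; the abstract's reported gains come from fusing at the decision level instead.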
Original language | English
---|---
Pages (from-to) | 149-157
Number of pages | 9
Journal | Journal of Medical and Biological Engineering
Volume | 40
Issue number | 2
DOIs |
Publication status | Published - 1 Apr 2020
Bibliographical note
Publisher Copyright: © 2020, The Author(s).
Keywords
- Emotion recognition
- Multi-sensor data fusion
- Physiological data