TY - JOUR
T1 - Emotion recognition in Virtual Reality using sensor fusion with eye tracking
AU - Kuyucu, Meral
AU - Sarikaya, Mehmet Ali
AU - Karakaş, Tülay
AU - Özkan, Dilek Yıldız
AU - Demir, Yüksel
AU - Bilen, Ömer
AU - Ince, Gökhan
N1 - Publisher Copyright:
© 2025 Elsevier Ltd
PY - 2025/10
Y1 - 2025/10
N2 - Emotion recognition is an emerging field with applications in healthcare, education, and entertainment. This study integrates Virtual Reality (VR) with multi-sensor fusion to enhance emotion recognition. The research comprises two phases: data collection, followed by analysis and evaluation. Ninety-five participants were exposed to curated audiovisual stimuli designed to elicit a wide range of emotions within an immersive VR environment. VR was chosen for its ability to provide controlled experimental conditions and to overcome the limitations of current mobile sensor technologies. Physiological data streams from multiple sensors were integrated for comprehensive emotional analysis. Electroencephalography (EEG) data revealed brain activity linked to emotional states, while eye-tracking data provided insights into gaze direction, pupil dilation, and eye movement, all of which correlate with cognitive and emotional processes. Peripheral signals, including heart rate variability, electrodermal activity (EDA), and body temperature, were captured via wearable sensors to enrich the dataset. Machine learning models, including XGBoost, CatBoost, Multilayer Perceptron, Gradient Boosting, and LightGBM, were employed to predict participants’ emotional states. Evaluation metrics, including accuracy, precision, recall, and F1 score, demonstrated the robustness and effectiveness of the proposed VR-based multi-sensor fusion approach. This research presents a novel approach to emotion recognition, bridging gaps in traditional methods by integrating VR, multi-sensor fusion, and machine learning.
KW - EEG
KW - Emotion recognition
KW - Eye tracking
KW - Physiological signals
KW - Sensor fusion
KW - Virtual reality
UR - https://www.scopus.com/pages/publications/105016084903
U2 - 10.1016/j.compbiomed.2025.111070
DO - 10.1016/j.compbiomed.2025.111070
M3 - Article
AN - SCOPUS:105016084903
SN - 0010-4825
VL - 197
JO - Computers in Biology and Medicine
JF - Computers in Biology and Medicine
M1 - 111070
ER -