Experience with an Affective Robot Assistant for Children with Hearing Disabilities

Pinar Uluer*, Hatice Kose, Elif Gumuslu, Duygun Erol Barkana

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

11 Citations (Scopus)

Abstract

This study presents an assistive robotic system enhanced with emotion recognition capabilities for children with hearing disabilities. The system is designed and developed for the audiometry tests and rehabilitation of children in a clinical setting and includes a social humanoid robot (Pepper), an interactive interface, gamified audiometry tests, a sensory setup, and a machine/deep learning based emotion recognition module. Three scenarios, a conventional setup, a tablet setup, and a robot+tablet setup, are evaluated with 16 children with cochlear implants or hearing aids. Several machine learning techniques and deep learning models are used to classify the three test setups and to classify the children's emotions (pleasant, neutral, unpleasant) from physiological signals recorded by an E4 wristband. The results show that the signals collected during the tests can be separated successfully and that the positive and negative emotions of the children can be better distinguished when they interact with the robot than in the other two setups. In addition, the children's objective and subjective evaluations, as well as their impressions of the robot and its emotional behaviors, are analyzed and discussed extensively.
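The abstract describes classifying children's emotions (pleasant, neutral, unpleasant) from wristband physiological signals with machine learning. The paper's actual features, preprocessing, and models are not given in this record, so the following is only a minimal illustrative sketch using scikit-learn, with synthetic stand-ins for the kinds of features an E4 wristband provides (heart rate from BVP, electrodermal activity, skin temperature); none of it should be read as the authors' pipeline.

```python
# Hypothetical sketch of three-class emotion classification from
# physiological features. Feature choice, classifier, and data are
# illustrative assumptions, not the method used in the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-ins for per-window features an E4 wristband could yield:
# [mean heart rate (from BVP), mean EDA level, mean skin temperature]
n_windows = 300
X = rng.normal(size=(n_windows, 3))
y = rng.choice(["unpleasant", "neutral", "pleasant"], size=n_windows)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

preds = clf.predict(X_test)          # one of the three emotion labels
accuracy = clf.score(X_test, y_test)  # chance level here, since data is random
```

With real labeled recordings in place of the random arrays, the same fit/predict pattern would apply; the paper additionally compares deep learning models, which this sketch does not cover.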

Original language: English
Pages (from-to): 643-660
Number of pages: 18
Journal: International Journal of Social Robotics
Volume: 15
Issue number: 4
DOIs
Publication status: Published - Apr 2023

Bibliographical note

Publisher Copyright:
© 2021, The Author(s), under exclusive licence to Springer Nature B.V.

Funding

We would like to thank collaborating audiologists Dr. Selma Yilar, Talha Cogen and Busra Gokce from Istanbul University Cerrahpasa Medical Faculty for their contributions to this study. This study is supported by The Scientific and Technological Research Council of Turkey (TÜBİTAK) under grant number 118E214. This work is also supported by the Turkish Academy of Sciences under the Outstanding Young Scientist Award scheme (TÜBA-GEBİP).

Funders (funder number):
TÜBA-GEBİP
TÜBİTAK
Istanbul Üniversitesi
Türkiye Bilimsel ve Teknolojik Araştırma Kurumu (118E214)
Türkiye Bilimler Akademisi

Keywords

• Deep learning
• Emotion recognition
• Human-robot interaction
• Machine learning
• Physiological signals
• Social robots
