Sentiment analysis in sign language

Şeyma Takır*, Barış Bilen, Doğukan Arslan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

In sign languages, communication relies on hand gestures, facial expressions, and body language, with signs varying significantly according to the position and movement of different body parts. These variations present challenges for tasks such as sentiment analysis, where direct translation of hand gestures alone is insufficient. In this study, we introduce a novel approach to sentiment analysis in Turkish Sign Language (TİD); to the best of our knowledge, this is the first work in the literature to incorporate both hand gestures and facial expressions for this purpose. We developed and fine-tuned customized models for emotion extraction from facial expressions using the RAF-DB dataset, and for sentiment analysis from hand gestures using the AUTSL dataset. Additionally, we compiled a test dataset of sign language videos enriched with facial expressions. Our findings indicate that facial expressions are more informative for sentiment analysis in sign language than hand gestures alone; integrating both modalities, however, yields further performance gains.
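The abstract does not specify how the two modalities are combined. The sketch below illustrates one plausible reading: a late-fusion classifier over pooled per-video face and hand features. Everything here is assumed for illustration; the LateFusionSentiment class, the feature dimensions, the three-class sentiment labels, and the fusion architecture are not taken from the paper, and in practice the input features would come from the fine-tuned RAF-DB and AUTSL models described above.

```python
# Hypothetical late-fusion sketch; architecture and names are assumptions,
# not the authors' published method.
import torch
import torch.nn as nn

NUM_SENTIMENTS = 3  # assumed label set: negative / neutral / positive


class LateFusionSentiment(nn.Module):
    """Fuse facial-emotion features with hand-gesture features and
    classify the overall sentiment of a signed utterance."""

    def __init__(self, face_dim: int, hand_dim: int, hidden: int = 128):
        super().__init__()
        self.face_head = nn.Linear(face_dim, hidden)
        self.hand_head = nn.Linear(hand_dim, hidden)
        self.classifier = nn.Sequential(
            nn.ReLU(),
            nn.Linear(2 * hidden, NUM_SENTIMENTS),
        )

    def forward(self, face_feats: torch.Tensor, hand_feats: torch.Tensor) -> torch.Tensor:
        # face_feats: (batch, face_dim), pooled over the video's frames
        # hand_feats: (batch, hand_dim), pooled over the video's frames
        fused = torch.cat(
            [self.face_head(face_feats), self.hand_head(hand_feats)], dim=-1
        )
        return self.classifier(fused)


# Usage with random stand-in features (real features would be produced by
# the face and hand backbones):
model = LateFusionSentiment(face_dim=512, hand_dim=512)
logits = model(torch.randn(4, 512), torch.randn(4, 512))
print(logits.shape)  # torch.Size([4, 3])
```

A late-fusion design like this keeps the two backbones independent, which matches the abstract's ablation-style finding: either modality can be evaluated alone by masking the other branch, and the combined model can then be compared against both.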

Original language: English
Article number: 223
Journal: Signal, Image and Video Processing
Volume: 19
Issue number: 3
DOIs
Publication status: Published - Mar 2025

Bibliographical note

Publisher Copyright:
© The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2025.

Keywords

  • Computer vision
  • Emotion extraction
  • Natural language processing
  • Sentiment analysis
  • Turkish sign language
