Abstract
In sign languages, communication relies on hand gestures, facial expressions, and body language, with signs varying significantly based on the position and movement of different body parts. These variations present challenges to tasks like sentiment analysis, where the direct translation of hand gestures alone is insufficient. In this study, we introduce a novel approach to sentiment analysis in Turkish Sign Language (TİD), marking the first time in the literature that both hand gestures and facial expressions have been incorporated for this purpose. We developed and fine-tuned customized models for emotion extraction from facial expressions using the RAF-DB dataset, and for sentiment analysis from hand gestures using the AUTSL dataset. Additionally, we compiled a dataset of sign language videos enhanced with facial expressions for testing. Our findings indicate that facial expressions are more critical for sentiment analysis in sign language than hand gestures alone. However, integrating both modalities resulted in even greater performance enhancements.
| Original language | English |
|---|---|
| Article number | 223 |
| Journal | Signal, Image and Video Processing |
| Volume | 19 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - Mar 2025 |
Bibliographic note
Publisher Copyright: © The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2025.