Abstract
Semantic role labeling (SRL) is the task of identifying the argument structures of verbs in a sentence. Previous studies on Turkish SRL have focused mostly on syntactic features; context-oriented approaches have not yet been explored in this area. In this paper, we investigate the impact of pre-trained neural language models, which provide strong contextual representations, on Turkish semantic role labeling. The BERT, ConvBERT and ELECTRA language models are adapted to Turkish SRL with parameter tuning. We report an improvement of 10 percentage points over the morphology-focused results, which rely on gold-standard morphological tags and therefore do not contain errors propagated from a preceding morphological analysis layer. Since our model has no such dependency, the performance gain would be even larger in a realistic setting.
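The abstract describes adapting pre-trained encoders to Turkish SRL with parameter tuning. Below is a minimal sketch of one common way to set this up: SRL cast as token-level (BIO) classification over a fine-tuned Turkish encoder using the Hugging Face transformers API. The checkpoint name, label set, and alignment scheme are illustrative assumptions, not the paper's exact configuration; ConvBERT or ELECTRA checkpoints can be swapped in the same way.

```python
# Minimal sketch (assumed setup): Turkish SRL as token classification with a
# pre-trained encoder fine-tuned end to end. Checkpoint, tag set, and example
# sentence are illustrative, not the paper's exact configuration.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Hypothetical BIO tag set over PropBank-style argument roles.
labels = ["O", "B-A0", "I-A0", "B-A1", "I-A1", "B-AM-TMP", "I-AM-TMP"]
label2id = {l: i for i, l in enumerate(labels)}

checkpoint = "dbmdz/bert-base-turkish-cased"  # assumed Turkish BERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(
    checkpoint,
    num_labels=len(labels),
    id2label={i: l for l, i in label2id.items()},
    label2id=label2id,
)

# Toy sentence with word-level role tags (before subword alignment).
words = ["Ali", "kitabı", "dün", "okudu", "."]
word_tags = ["B-A0", "B-A1", "B-AM-TMP", "O", "O"]

enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Align word-level tags to subword tokens; special tokens get -100 so the
# cross-entropy loss ignores them.
aligned = [
    -100 if wid is None else label2id[word_tags[wid]]
    for wid in enc.word_ids(batch_index=0)
]

out = model(**enc, labels=torch.tensor([aligned]))
out.loss.backward()  # gradients for one step; wrap in an optimizer loop to fine-tune
```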
| Title of the translated contribution | The Impact of Pre-trained Language Models on Turkish Semantic Role Labelling |
| --- | --- |
| Original language | Turkish |
| Host publication title | 2022 30th Signal Processing and Communications Applications Conference, SIU 2022 |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| ISBN (Electronic) | 9781665450928 |
| DOIs | |
| Publication status | Published - 2022 |
| Event | 30th Signal Processing and Communications Applications Conference, SIU 2022 - Safranbolu, Turkey. Duration: 15 May 2022 → 18 May 2022 |

Publication series

| Name | 2022 30th Signal Processing and Communications Applications Conference, SIU 2022 |
| --- | --- |

Conference

| Conference | 30th Signal Processing and Communications Applications Conference, SIU 2022 |
| --- | --- |
| Country/Territory | Turkey |
| City | Safranbolu |
| Period | 15/05/22 → 18/05/22 |
Bibliographical note

Publisher Copyright: © 2022 IEEE.
Keywords
- language models
- semantic role labeling