Abstract
Semantic role labeling (SRL) is the task of identifying the argument structures of verbs in a sentence. Previous studies on Turkish SRL have focused mostly on syntactic features; context-oriented approaches have not yet been explored in this area. In this paper, we investigate the impact of pre-trained neural language models, which provide strong context representations, on Turkish semantic role labeling. The BERT, ConvBERT and ELECTRA language models are adapted to Turkish SRL with parameter tuning. We report an improvement of 10 percentage points over the morphology-focused results, which rely on gold-standard morphological tags and are therefore free of the errors that a preceding morphological analysis layer would propagate. Since our model has no such dependency, the performance gain would be even larger in a realistic setting.
| Translated title of the contribution | The Impact of Pre-trained Language Models on Turkish Semantic Role Labelling |
|---|---|
| Original language | Turkish |
| Title of host publication | 2022 30th Signal Processing and Communications Applications Conference, SIU 2022 |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| ISBN (Electronic) | 9781665450928 |
| Publication status | Published - 2022 |
| Event | 30th Signal Processing and Communications Applications Conference, SIU 2022 - Safranbolu, Turkey. Duration: 15 May 2022 → 18 May 2022 |
Publication series

| Name | 2022 30th Signal Processing and Communications Applications Conference, SIU 2022 |
|---|---|
Conference

| Conference | 30th Signal Processing and Communications Applications Conference, SIU 2022 |
|---|---|
| Country/Territory | Turkey |
| City | Safranbolu |
| Period | 15/05/22 → 18/05/22 |
Bibliographical note
Publisher Copyright: © 2022 IEEE.