BERT2D: Two Dimensional Positional Embeddings for Efficient Turkish NLP

Yigit Bekir Kaya, A. Cuneyd Tantug

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

This study addresses the challenge of improving the downstream performance of pretrained language models for morphologically rich languages, with a focus on Turkish. Traditional BERT models use one-dimensional absolute positional embeddings, which, while effective, have limitations when dealing with morphologically complex languages. We propose BERT2D, a novel BERT-based model with a modified positional embedding system: a dual embedding scheme that encodes both whole words and their subwords. Remarkably, this modification, coupled with whole word masking, yielded a significant performance gain with only a negligible increase in the number of parameters. Our experiments showed that BERT2D consistently outperformed the leading Turkish-focused BERT model, BERTurk, across various performance metrics on text classification, token classification, and question-answering downstream tasks. For a fair comparison, we pretrained our BERT2D language model on the same dataset as BERTurk. The results demonstrate that two-dimensional positional embeddings can significantly improve the performance of encoder-only models in Turkish and other morphologically rich languages, suggesting a promising direction for future research in natural language processing.
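The abstract describes a dual positional embedding that targets whole words and their subwords. As a rough illustration only, the sketch below shows one way such a (word index, within-word subword index) scheme could replace BERT's single absolute position embedding. The module name, embedding sizes, and indexing convention are assumptions made for this sketch, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class TwoDPositionalEmbeddings(nn.Module):
    """Hypothetical sketch of a dual (word, subword) positional embedding.

    Assumes each token receives (a) the index of the whole word it belongs to
    and (b) its position inside that word; both are embedded and summed in
    place of BERT's single absolute position embedding. Sizes are illustrative.
    """

    def __init__(self, hidden_size: int = 768, max_words: int = 512, max_subwords: int = 16):
        super().__init__()
        self.word_pos = nn.Embedding(max_words, hidden_size)        # position of the word in the sequence
        self.subword_pos = nn.Embedding(max_subwords, hidden_size)  # position of the subword within its word

    def forward(self, word_ids: torch.LongTensor, subword_ids: torch.LongTensor) -> torch.Tensor:
        # word_ids, subword_ids: (batch, seq_len) integer tensors
        return self.word_pos(word_ids) + self.subword_pos(subword_ids)


def subword_positions(word_ids):
    """Derive within-word subword positions from per-token word indices.

    Example: word_ids [0, 0, 1, 2, 2, 2] -> subword positions [0, 1, 0, 0, 1, 2].
    """
    positions, prev, count = [], None, 0
    for w in word_ids:
        count = count + 1 if w == prev else 0
        positions.append(count)
        prev = w
    return positions
```

Under this assumed scheme, all subwords of one word share the same word-level position, while the second index distinguishes them, which is one plausible reading of the "two-dimensional" embedding named in the title.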

Original language: English
Pages (from-to): 1
Number of pages: 1
Journal: IEEE Access
DOIs
Publication status: Accepted/In press - 2024

Bibliographic note

Publisher Copyright:
Authors
