BERT2D: Two Dimensional Positional Embeddings for Efficient Turkish NLP

Yiğit Bekir Kaya, A. Cüneyd Tantuğ

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

This study addresses the challenge of improving the downstream performance of pretrained language models for morphologically rich languages, with a focus on Turkish. Traditional BERT models use one-dimensional absolute positional embeddings, which, while effective, are limited when modeling morphologically complex languages. We propose BERT2D, a novel BERT-based model with a two-dimensional positional embedding scheme: a dual embedding system that encodes both the position of each word and the position of each subword within its word. Coupled with whole word masking, this modification yields a significant performance gain at a negligible increase in parameter count. In our experiments, BERT2D consistently outperformed the leading Turkish-focused BERT model, BERTurk, across text classification, token classification, and question-answering downstream tasks. For a fair comparison, we pretrained BERT2D on the same dataset used for BERTurk. The results demonstrate that two-dimensional positional embeddings can significantly improve the performance of encoder-only models in Turkish and other morphologically rich languages, suggesting a promising direction for future research in natural language processing.
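To make the described scheme concrete, the sketch below shows how a dual positional embedding layer of this kind could be implemented in PyTorch: each token receives a word-position embedding (which word it belongs to) and a subword-position embedding (its index within that word), both summed with the token embedding. This is a minimal illustration based only on the abstract's description, not the authors' released implementation; the class name, the `word_ids`/`subword_ids` inputs, and the dimension defaults are illustrative assumptions.

```python
# Minimal sketch of a two-dimensional positional embedding layer,
# assuming the BERT2D scheme sums a word-level position embedding and a
# subword-within-word position embedding with the token embedding.
# All names and sizes here are hypothetical, not from the paper's code.
import torch
import torch.nn as nn

class TwoDPositionalEmbeddings(nn.Module):
    def __init__(self, vocab_size=32000, hidden_size=768,
                 max_words=512, max_subwords=16):
        super().__init__()
        self.token_embeddings = nn.Embedding(vocab_size, hidden_size)
        # First axis: index of the word the subword belongs to.
        self.word_position_embeddings = nn.Embedding(max_words, hidden_size)
        # Second axis: index of the subword inside its word.
        self.subword_position_embeddings = nn.Embedding(max_subwords, hidden_size)
        self.layer_norm = nn.LayerNorm(hidden_size)

    def forward(self, input_ids, word_ids, subword_ids):
        # input_ids, word_ids, subword_ids: (batch, seq_len) LongTensors.
        x = (self.token_embeddings(input_ids)
             + self.word_position_embeddings(word_ids)
             + self.subword_position_embeddings(subword_ids))
        return self.layer_norm(x)

# Example: a Turkish word like "evlerimizden" tokenized into four
# subwords ["ev", "##ler", "##imiz", "##den"], followed by a two-subword
# word (token ids below are dummy values):
input_ids   = torch.tensor([[101, 102, 103, 104, 105, 106]])
word_ids    = torch.tensor([[0, 0, 0, 0, 1, 1]])   # which word
subword_ids = torch.tensor([[0, 1, 2, 3, 0, 1]])   # position within word
emb = TwoDPositionalEmbeddings()
print(emb(input_ids, word_ids, subword_ids).shape)  # torch.Size([1, 6, 768])
```

Note that because subword indices stay small even for long agglutinated Turkish words, the second embedding table is tiny, which is consistent with the abstract's claim of a negligible parameter increase.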

Original language: English
Pages (from-to): 1
Number of pages: 1
Journal: IEEE Access
Publication status: Accepted/In press, 2024

Keywords

  • BERT
  • BERT2D
  • Bidirectional control
  • Encoding
  • named entity recognition
  • NLP
  • positional embeddings
  • positional encoding
  • question answering
  • sentiment analysis
  • Task analysis
  • Tokenization
  • transformer models
  • Transformers
  • Turkish
  • Vectors
  • Vocabulary
