
When fractional calculus meets robust learning: Adaptive robust loss functions

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)

Abstract

In deep learning, robust loss functions are crucial for addressing challenges like outliers and noise. This paper introduces a novel family of adaptive robust loss functions, Fractional Loss Functions (FLFs), generated by applying the fractional derivative operator to conventional loss functions. We demonstrate that adjusting the fractional derivative order α generates a diverse spectrum of FLFs while preserving the essential properties necessary for gradient-based learning. We show that tuning α morphs the loss landscape to reduce the influence of large residuals; thus, α serves as an interpretable hyperparameter defining the robustness level of an FLF. However, determining α prior to training requires manual exploration to pinpoint an FLF that aligns with the learning task. To overcome this issue, we reveal that tuning α lets FLFs balance robustness against outliers against increased penalization of inliers. This inherent trade-off makes it feasible to treat α as an adaptive parameter learned in a balanced manner during training. FLFs can therefore dynamically adapt their loss landscape, facilitating error minimization while providing robustness throughout training. We performed experiments across diverse tasks and showed that FLFs significantly enhance performance. Our source code is available at https://github.com/mertcankurucu/Fractional-Loss-Functions.
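The exact FLF construction is given in the paper and its repository; as a rough intuition only, the sketch below applies the Caputo fractional derivative to the squared residual r², which for order α gives a gradient proportional to |r|^(2−α)/Γ(3−α). The function names and this particular formulation are illustrative assumptions, not the authors' definition; α = 1 recovers the ordinary squared-error loss, while α > 1 yields a sub-linear gradient in |r|, damping the influence of large residuals as the abstract describes.

```python
import math

def flf_loss(r, alpha):
    """Illustrative fractional squared-error loss (NOT the paper's exact FLF).

    Obtained by integrating the Caputo fractional derivative of r**2:
    L(r) = 2*|r|**(3 - alpha) / Gamma(4 - alpha), valid for 0 < alpha < 2.
    alpha = 1 recovers the plain squared error r**2.
    """
    return 2.0 * abs(r) ** (3.0 - alpha) / math.gamma(4.0 - alpha)

def flf_grad(r, alpha):
    """Gradient of flf_loss w.r.t. the residual r.

    Equals 2*sign(r)*|r|**(2 - alpha) / Gamma(3 - alpha).
    For alpha > 1 the exponent drops below 1, so large residuals
    contribute a smaller gradient than under MSE (robustness);
    for alpha < 1 they are penalized more heavily.
    """
    return 2.0 * math.copysign(abs(r) ** (2.0 - alpha), r) / math.gamma(3.0 - alpha)

# alpha = 1 reproduces MSE behaviour; alpha = 1.5 damps a large residual.
print(flf_loss(3.0, 1.0))                        # squared error: 9.0
print(flf_grad(2.0, 1.0))                        # MSE gradient 2r: 4.0
print(flf_grad(100.0, 1.5) < flf_grad(100.0, 1.0))  # robust: smaller gradient
```

In this toy form, making α learnable (e.g. as an extra trainable scalar) would let the loss landscape adapt during training, which is the trade-off the abstract exploits.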

Original language: English
Article number: 113136
Journal: Knowledge-Based Systems
Volume: 312
DOIs
Publication status: Published - 15 Mar 2025

Bibliographical note

Publisher Copyright:
© 2025 Elsevier B.V.

