
AVERAGER STUDENT: DISTILLATION FROM UNDISTILLABLE TEACHER

Research output: Contribution to conference › Paper › peer-review

1 Citation (Scopus)

Abstract

Today, some companies release their black-box models as a service, where users can see the model's output corresponding to their input. However, these models can be stolen via knowledge distillation by malicious users. Recently, the undistillable teacher (Ma et al., 2021) was introduced to prevent this knowledge leakage. In this study, with the aim of contributing to solutions for model intellectual property (IP) protection, we propose a novel method that improves distillation from an undistillable teacher, i.e., a teacher whose goal is to make distillation difficult for students. The code is released at https://github.com/rkevser/AveragerStudent.
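As background to the abstract: the model-stealing attack it refers to is standard knowledge distillation, where a student is trained to match the teacher's temperature-softened output distribution. A minimal NumPy sketch of that objective (the classic Hinton et al., 2015 formulation, not the paper's Averager Student method, whose details are not given here; all function and variable names are illustrative) is:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T gives softer distributions.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened outputs,
    # the standard distillation objective, scaled by T^2.
    p = softmax(teacher_logits, T)   # teacher soft targets
    q = softmax(student_logits, T)   # student predictions
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(kl.mean() * T * T)

# A malicious user needs only teacher_logits (the black-box output)
# to train a student by minimizing this loss; an undistillable
# teacher shapes its outputs so that doing so transfers little
# useful knowledge.
teacher = np.array([[2.0, 0.5, -1.0]])
student = np.array([[1.0, 0.2, -0.5]])
loss = kd_loss(student, teacher)
```

When the student's logits equal the teacher's, the loss is zero; otherwise it is positive, and gradient descent on it pulls the student's soft predictions toward the teacher's.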

Original language: English
Publication status: Published - 2023
Event: 1st Tiny Papers at 11th International Conference on Learning Representations, Tiny Papers @ ICLR 2023 - Kigali, Rwanda
Duration: 5 May 2023 → 5 May 2023

Conference: 1st Tiny Papers at 11th International Conference on Learning Representations, Tiny Papers @ ICLR 2023
Country/Region: Rwanda
City: Kigali
Period: 5/05/23 → 5/05/23

Bibliographic note

Publisher Copyright:
© 2023 1st Tiny Papers Track at ICLR 2023 - Tiny Papers @ ICLR 2023. All rights reserved.
