Abstract
Today, some companies release their black-box models as a service, where users can see only the model's output for a given input. However, these models can be stolen by malicious users via knowledge distillation. Recently, the undistillable teacher (Ma et al., 2021) was introduced to prevent such knowledge leakage: a teacher trained so that distillation by student models becomes difficult. In this study, aiming to contribute to solutions for model intellectual property (IP) protection, we propose a novel method that improves distillation from an undistillable teacher. The code is released at https://github.com/rkevser/AveragerStudent.
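For context, the distillation attack the abstract refers to typically minimizes a KL divergence between temperature-softened teacher and student outputs (standard knowledge distillation in the style of Hinton et al.); an undistillable teacher is trained so that this objective transfers little useful knowledge. The sketch below illustrates only this standard distillation loss, not the paper's Averager Student method; the function names and the temperature value are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher temperature softens
    # the distribution, exposing more of the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) on temperature-softened outputs,
    # scaled by T^2 as is conventional in knowledge distillation.
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student whose logits match the teacher's incurs zero loss;
# mismatched logits incur a positive loss.
teacher = [2.0, 0.5, -1.0]
matched = distillation_loss(teacher, teacher)
mismatched = distillation_loss([0.0, 0.0, 0.0], teacher)
```

An undistillable teacher perturbs its output distribution so that minimizing this loss does not let the student recover the teacher's decision behavior, which is the obstacle the proposed method addresses.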
| Original language | English |
|---|---|
| Publication status | Published - 2023 |
| Event | 1st Tiny Papers at 11th International Conference on Learning Representations, Tiny Papers @ ICLR 2023 - Kigali, Rwanda. Duration: 5 May 2023 → 5 May 2023 |
| Conference | 1st Tiny Papers at 11th International Conference on Learning Representations, Tiny Papers @ ICLR 2023 |
|---|---|
| Country/Territory | Rwanda |
| City | Kigali |
| Period | 5/05/23 → 5/05/23 |
Bibliographic note
Publisher Copyright: © 2023 1st Tiny Papers Track at ICLR 2023 - Tiny Papers @ ICLR 2023. All rights reserved.