AVERAGER STUDENT: DISTILLATION FROM UNDISTILLABLE TEACHER

Research output: Contribution to conference › Paper › peer-review

1 Citation (Scopus)

Abstract

Today, some companies release their black-box models as a service, where users can see the model’s output for a given input. However, these models can be stolen by malicious users via knowledge distillation. Recently, the undistillable teacher (Ma et al., 2021) was introduced to prevent such knowledge leakage: a teacher trained so that distillation into a student model becomes difficult. In this study, aiming to contribute to solutions for model intellectual property (IP) protection, we propose a novel method that improves distillation from an undistillable teacher. The code is released at https://github.com/rkevser/AveragerStudent.
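For context, the sketch below shows the standard soft-label knowledge distillation objective (Hinton et al., 2015) that a malicious user would minimize against a black-box teacher's outputs. It illustrates the generic distillation attack that the undistillable teacher is designed to resist, not the Averager Student method itself, whose implementation is in the repository above; function and variable names here are illustrative.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 4.0) -> torch.Tensor:
    """Soft-label KD loss: KL divergence between temperature-softened
    teacher and student distributions, scaled by T^2 so gradient
    magnitudes stay comparable across temperatures."""
    t = temperature
    soft_teacher = F.softmax(teacher_logits / t, dim=-1)     # teacher "soft labels"
    log_student = F.log_softmax(student_logits / t, dim=-1)  # student log-probs
    return F.kl_div(log_student, soft_teacher, reduction="batchmean") * (t * t)

# Toy usage: a batch of 8 examples over 10 classes.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)  # e.g., responses queried from the teacher's API
loss = distillation_loss(student_logits, teacher_logits)
loss.backward()  # gradients flow only into the student
```

An undistillable teacher perturbs the soft labels it exposes so that minimizing this objective no longer transfers its knowledge faithfully, which is the defense the proposed method targets.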

Original language: English
Publication status: Published - 2023
Event: 1st Tiny Papers at 11th International Conference on Learning Representations, Tiny Papers @ ICLR 2023 - Kigali, Rwanda
Duration: 5 May 2023 – 5 May 2023

Conference

Conference: 1st Tiny Papers at 11th International Conference on Learning Representations, Tiny Papers @ ICLR 2023
Country/Territory: Rwanda
City: Kigali
Period: 5/05/23 – 5/05/23

Bibliographical note

Publisher Copyright:
© 2023 1st Tiny Papers Track at ICLR 2023 - Tiny Papers @ ICLR 2023. All rights reserved.
