Preprint

Modeling Teacher-Student Techniques in Deep Neural Networks for Knowledge Distillation

Abstract

Knowledge distillation (KD) is a method for transferring the knowledge of one network under training to another. The typical application of KD is in …
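The abstract is truncated, but the standard formulation of knowledge distillation it refers to can be sketched: the student is trained to match the teacher's temperature-softened output distribution. Below is a minimal NumPy sketch of the soft-label distillation loss (KL divergence between teacher and student softmaxes, scaled by T², following Hinton et al., 2015); the function names and the choice of T are illustrative, not taken from this paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence from teacher soft targets to student predictions,
    # averaged over the batch and scaled by T^2 so gradient magnitudes
    # stay comparable across temperatures.
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student predictions
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return (T ** 2) * kl.mean()
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient.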

Authors

Abbasi S; Hajabdollahi M; Karimi N; Samavi S

Publication date

December 31, 2019

DOI

10.48550/arXiv.1912.13179

Preprint server

arXiv