Preprint
Classification of Diabetic Retinopathy Using Unlabeled Data and Knowledge Distillation
Abstract
Knowledge distillation allows transferring knowledge from a pre-trained model
to another. However, it suffers from limitations, and constraints related to
Authors
Abbasi S; Hajabdollahi M; Khadivi P; Karimi N; Roshandel R; Shirani S; Samavi S
Publication date
September 1, 2020
DOI
10.48550/arXiv.2009.00982
Preprint server
arXiv
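The abstract above refers to knowledge distillation, i.e., training a smaller student model to mimic a pre-trained teacher. The sketch below is a generic, minimal illustration of the standard soft-target distillation loss (Hinton-style, temperature-softened KL divergence), not the method proposed in this preprint; the function name, temperature value, and framework (PyTorch) are assumptions for illustration only.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 4.0) -> torch.Tensor:
    """Generic soft-target distillation loss (hypothetical example).

    Computes the KL divergence between the temperature-softened
    teacher and student output distributions.
    """
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_soft_student, soft_teacher,
                    reduction="batchmean") * temperature ** 2

# Usage sketch: teacher predictions on a batch guide the student update.
if __name__ == "__main__":
    student_logits = torch.randn(8, 5, requires_grad=True)  # 8 images, 5 classes
    teacher_logits = torch.randn(8, 5)                       # pre-trained teacher outputs
    loss = distillation_loss(student_logits, teacher_logits)
    loss.backward()
    print(float(loss))
```

Because this loss depends only on the teacher's outputs, it can in principle be computed on unlabeled images, which is the general setting the preprint's title points to.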