Classification of diabetic retinopathy using unlabeled data and knowledge distillation (Journal Article)

abstract

  • Over the last decade, advances in Machine Learning and Artificial Intelligence have highlighted their potential as diagnostic tools in the healthcare domain. Despite the widespread availability of medical images, their usefulness is severely hampered by a lack of access to labeled data. For example, while Convolutional Neural Networks (CNNs) have emerged as an essential analytical tool in image processing, their impact is curtailed because training them requires large amounts of labeled data. Transfer learning enables a model developed for one task to be reused for a second task; however, it suffers from the limitation that the two models need to be architecturally similar. Knowledge distillation, which transfers knowledge from a pre-trained model to another, addresses some of the shortcomings of transfer learning by generalizing a complex model into a lighter one; however, some parts of the knowledge may not be distilled sufficiently. In this paper, a novel knowledge distillation approach using transfer learning is proposed. The proposed approach transfers the complete knowledge of a model to a new, smaller one. Unlabeled data are used in an unsupervised manner to transfer the maximum amount of knowledge to the new, smaller model. The proposed method can be beneficial in medical image analysis, where labeled data are typically scarce. The approach is evaluated on the task of classifying images for Diabetic Retinopathy diagnosis using two publicly available datasets, Messidor and EyePACS. Simulation results demonstrate that the approach effectively transfers knowledge from a complex model to a lighter one. Furthermore, experimental results show that the performance of several small models improves significantly when unlabeled data and knowledge distillation are used.
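To make the unsupervised distillation step concrete, the sketch below trains a lighter student network to mimic a larger teacher's softened predictions on unlabeled images, so no ground-truth labels are needed during this step. The choices here (ResNet-50 teacher, ResNet-18 student, a five-class output head, the temperature, and the optimizer) are illustrative assumptions for a minimal PyTorch example, not the configuration reported in the paper.

```python
# Minimal sketch of knowledge distillation with unlabeled data (PyTorch).
# Architectures, temperature, and class count are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

NUM_CLASSES = 5  # e.g., diabetic retinopathy grades 0-4 (assumed)

# Assumed: a complex teacher (in practice, already trained on the labeled data)
# and a lighter student to be trained on unlabeled images.
teacher = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
teacher.fc = nn.Linear(teacher.fc.in_features, NUM_CLASSES)
teacher.eval()

student = models.resnet18(weights=None)
student.fc = nn.Linear(student.fc.in_features, NUM_CLASSES)

T = 4.0  # softening temperature (assumed value)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)

def distillation_step(unlabeled_images):
    """One unsupervised step: the student matches the teacher's soft
    predictions on unlabeled images, so no ground-truth labels are used."""
    with torch.no_grad():
        teacher_logits = teacher(unlabeled_images)
    student_logits = student(unlabeled_images)
    # KL divergence between temperature-softened distributions
    # (the standard distillation loss).
    loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example usage with a random batch standing in for unlabeled fundus images.
dummy_batch = torch.randn(8, 3, 224, 224)
print(distillation_step(dummy_batch))
```

In a full pipeline, such unsupervised distillation steps would be combined with whatever labeled data are available; the snippet only illustrates how unlabeled images can carry knowledge from the complex model to the smaller one.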

publication date

  • November 2021