KBStyle: Fast Style Transfer Using a 200 KB Network With Symmetric Knowledge Distillation


abstract

  • Convolutional Neural Networks (CNNs) have achieved remarkable progress in arbitrary artistic style transfer. However, existing state-of-the-art (SOTA) style transfer models are immense, incurring enormous computational cost and memory demand. This hinders real-time, high-resolution inference on GPUs with limited memory and restricts deployment on mobile devices. This paper proposes a novel arbitrary artistic style transfer algorithm, KBStyle, whose model size is only 200 KB. First, we design a style transfer network in which the style encoder, content encoder, and corresponding decoder are custom designed to guarantee low computational cost and high shape retention. In addition, a weighted style loss function is presented to improve stylization quality. We then propose a novel knowledge distillation method, Symmetric Knowledge Distillation (SKD), for encoder-decoder-based style transfer models, which redefines the distilled knowledge and compresses the encoder and decoder symmetrically. With SKD, the proposed style transfer network is further compressed by a factor of 14 to yield KBStyle. Experimental results demonstrate that SKD achieves results comparable to other SOTA knowledge distillation algorithms for style transfer, and that KBStyle produces high-quality stylized images. The inference time of KBStyle on an Nvidia TITAN RTX GPU is only 20 ms when both the content and style images are at 2K resolution (2048×1080). Moreover, the 200 KB model size of KBStyle is far smaller than that of SOTA models and facilitates style transfer on mobile devices.
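The SKD idea summarized in the abstract, training a compact student encoder and decoder to match a larger teacher on both sides of the network, can be sketched as a pair of feature-matching losses combined symmetrically. The sketch below is purely illustrative and not the paper's actual formulation: the function names, the MSE choice, the `alpha` weight, and the toy feature vectors are all assumptions.

```python
def mse(a, b):
    """Mean squared error between two equal-length feature vectors."""
    assert len(a) == len(b)
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def symmetric_distillation_loss(t_enc, s_enc, t_dec, s_dec, alpha=0.5):
    """Illustrative symmetric distillation objective (hypothetical).

    t_enc / s_enc: teacher / student encoder features for the same input.
    t_dec / s_dec: teacher / student decoder outputs.
    alpha balances the encoder-side and decoder-side terms
    (0.5 weights both sides equally, i.e. "symmetrically").
    """
    return alpha * mse(t_enc, s_enc) + (1 - alpha) * mse(t_dec, s_dec)

# Toy example: the student matches the teacher's decoder output exactly
# but diverges on the encoder features, so only the first term contributes.
loss = symmetric_distillation_loss([1.0, 2.0], [1.0, 1.0],
                                   [0.5, 0.5], [0.5, 0.5])
print(loss)  # 0.25
```

In an actual training loop the student's parameters would be updated by gradient descent on such a loss while the teacher stays frozen; here plain lists stand in for feature tensors to keep the sketch self-contained.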

authors

  • Chen, Wenshu
  • Huang, Yujie
  • Wang, Mingyu
  • Wu, Xiaolin
  • Zeng, Xiaoyang

publication date

  • 2024