Journal article

No-reference stereo image quality assessment based on discriminative sparse representation

Abstract

The quality of images can be degraded by processing, compression, and transmission. This has created a fundamental need for perceptual quality assessment methods in multimedia services. The use of 3D stereo imaging is rapidly increasing, and the quality assessment of stereo images differs from that of their 2D counterparts. Various algorithms have been devised in this field, mainly based on feature extraction. However, their objective quality scores are not sufficiently correlated with human judgments, or they are not fast enough because they rely on stereo matching and fusion techniques. In this paper, a fast and efficient algorithm is proposed for blind quality assessment of stereoscopic images. A supervised dictionary learning approach for discriminative sparse representation is applied as an automatic feature discovery framework. Within this framework, distortion-specific bases are learned from structural features of stereoscopic images through discriminative dictionary learning. The resulting over-complete bases are suitable for sparse representation of samples from the same distortion class. Given these features, an SVR is trained for no-reference quality assessment of stereoscopic images. The experimental results show that the proposed method achieves an overall correlation of 97% with subjective scores on common datasets, outperforming state-of-the-art methods by up to 2%. Furthermore, the method is computationally efficient and evaluates the quality of stereo images much faster than competing methods, making it suitable for real-time applications.
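To make the pipeline described in the abstract concrete, the following is a minimal sketch of the dictionary-learning-plus-SVR stages, not the authors' implementation. It assumes synthetic stand-ins for the structural features and subjective scores, and it substitutes scikit-learn's unsupervised MiniBatchDictionaryLearning for the supervised, distortion-specific discriminative dictionary learning used in the paper.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Hypothetical stand-ins: random "structural features" and MOS-like labels.
# In the paper, features come from the left/right views of stereo images and
# the dictionary is learned discriminatively per distortion class; here a
# plain unsupervised dictionary learner is used purely for illustration.
rng = np.random.default_rng(0)
n_images, feat_dim, n_atoms = 200, 64, 128
features = rng.standard_normal((n_images, feat_dim))
mos = rng.uniform(0, 100, size=n_images)          # subjective quality scores

# 1) Learn an over-complete dictionary on the structural features and
#    sparse-code each image against it (OMP with a fixed sparsity level).
dico = MiniBatchDictionaryLearning(n_components=n_atoms,
                                   transform_algorithm="omp",
                                   transform_n_nonzero_coefs=8,
                                   random_state=0)
codes = dico.fit(features).transform(features)    # sparse codes per image

# 2) Regress quality scores from the sparse codes with an SVR.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(codes, mos)

# Predict the quality of a new (here, synthetic) image's features.
new_feat = rng.standard_normal((1, feat_dim))
print(model.predict(dico.transform(new_feat)))
```

In the actual method, the discriminative dictionary makes atoms from the correct distortion class yield markedly sparser reconstructions, so the sparse codes themselves carry distortion-type information before the SVR stage.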

Authors

Karimi M; Nejati M; Khadivi P; Karimi N; Samavi S

Journal

Multimedia Tools and Applications, Vol. 80, No. 21-23, pp. 33033–33053

Publisher

Springer Nature

Publication Date

September 1, 2021

DOI

10.1007/s11042-021-11322-z

ISSN

1380-7501
