abstract
- A near real-time no-reference video quality assessment (NR VQA) method is proposed for videos encoded with the H.264/AVC codec. A fully connected neural network is trained on features extracted from both the bit-stream and pixel domains, along with the corresponding subjective quality scores. The feature-selection procedure is designed to capture both spatial and temporal artifacts of the encoded sequences while minimizing the overall run-time, making the method suitable for live-streaming applications. The performance of the method is verified on H.264-encoded sequences from the LIVE video dataset, and the correlation with the differential mean opinion scores (DMOS) from the subjective tests is reported. Our framework outperforms widely used NR VQA methods as well as a number of state-of-the-art full-reference VQA methods.