
%0 Conference Proceedings
%4 sid.inpe.br/sibgrapi/2018/10.17.19.12
%2 sid.inpe.br/sibgrapi/2018/10.17.19.12.01
%A Freitas, Pedro Garcia,
%A Farias, Mylène C. Q.,
%@affiliation University of Brasília
%@affiliation University of Brasília
%T Using Texture Measures for Visual Quality Assessment
%B Conference on Graphics, Patterns and Images, 31 (SIBGRAPI)
%D 2018
%E Ross, Arun,
%E Gastal, Eduardo S. L.,
%E Jorge, Joaquim A.,
%E Queiroz, Ricardo L. de,
%E Minetto, Rodrigo,
%E Sarkar, Sudeep,
%E Papa, João Paulo,
%E Oliveira, Manuel M.,
%E Arbeláez, Pablo,
%E Mery, Domingo,
%E Oliveira, Maria Cristina Ferreira de,
%E Spina, Thiago Vallin,
%E Mendes, Caroline Mazetto,
%E Costa, Henrique Sérgio Gutierrez,
%E Mejail, Marta Estela,
%E Geus, Klaus de,
%E Scheer, Sergio,
%S Proceedings
%8 Oct. 29 - Nov. 1, 2018
%J Porto Alegre
%I Sociedade Brasileira de Computação
%C Foz do Iguaçu, PR, Brazil
%K Visual quality, objective metrics, no-reference image quality assessment, video quality assessment.
%X The automatic quality assessment of images and videos is a crucial problem for a wide range of applications in the fields of computer vision and multimedia processing. For instance, many computer vision applications, such as biometric identification, content retrieval, and object recognition, rely on input images with a specific range of quality. Therefore, a great research effort has been made to develop visual quality assessment (VQA) methods that are able to automatically estimate quality. However, VQA still faces several challenges. In the case of images, most of the proposed methods are complex and require a reference (pristine) image to estimate quality, which limits their use in several multimedia applications. For videos, the current state-of-the-art methods still perform worse than the methods designed for images, both in terms of prediction accuracy and computational complexity. In this work, we propose a set of methods to estimate visual quality using texture descriptors and machine learning. Starting from the premise that visual impairments alter image and video texture statistics, we propose a framework that uses these descriptors to produce new quality assessment methods, including no-reference (blind) and full-reference quality metrics. Experimental results indicate that the proposed metrics perform well when tested on several benchmark image and video quality databases, outperforming current state-of-the-art metrics.
%@language en
%3 wtd-manuscript-CR.pdf