%0 Conference Proceedings
%T Video pornography detection through deep learning techniques and motion information
%D 2017
%8 Oct. 17-20, 2017
%A Perez, Mauricio Lisboa
%A Testoni, Vanessa
%A Rocha, Anderson
%@affiliation EEE - NTU
%@affiliation Samsung Research Institute Brazil
%@affiliation IC - UNICAMP
%E Torchelsen, Rafael Piccin
%E Nascimento, Erickson Rangel do
%E Panozzo, Daniele
%E Liu, Zicheng
%E Farias, Mylène
%E Viera, Thales
%E Sacht, Leonardo
%E Ferreira, Nivan
%E Comba, João Luiz Dihl
%E Hirata, Nina
%E Schiavon Porto, Marcelo
%E Vital, Creto
%E Pagot, Christian Azambuja
%E Petronetto, Fabiano
%E Clua, Esteban
%E Cardeal, Flávio
%B Conference on Graphics, Patterns and Images, 30 (SIBGRAPI)
%C Niterói, RJ
%S Proceedings
%I Sociedade Brasileira de Computação
%J Porto Alegre
%K Pornography classification, Deep learning and motion information, Optical flow, MPEG motion vectors, Sensitive video classification
%X Recent literature has explored automated pornography detection - a bold move to replace humans in the tedious task of moderating online content. Unfortunately, in scenes with high skin exposure, such as people sunbathing or wrestling, the state of the art can raise many false alarms. This paper is based on the premise that incorporating motion information into the models can alleviate the problem of mapping skin exposure to pornographic content, and it raises the bar on automated pornography detection through the use of motion information and deep learning architectures. Deep learning, especially in the form of convolutional neural networks, has achieved striking results in computer vision, but its potential for pornography detection has yet to be fully explored through the use of motion information. We propose novel ways of combining static (picture) and dynamic (motion) information using optical flow and MPEG motion vectors. We show that both methods provide equivalent accuracies, but that MPEG motion vectors allow a more efficient implementation. The best proposed method yields a classification accuracy of 97.9% - an error reduction of 64.4% compared to the state of the art - on a dataset of 800 challenging test cases. Finally, we present and discuss results on a larger, more challenging dataset.
%@language en
%3 wtd-sibgrapi.pdf