Identity statement area
Reference Type: Conference Paper (Conference Proceedings)
Last Update: 2021 (UTC)
Metadata Last Update: 2021 (UTC)
Citation Key: MoraesEvanFernMart:2021:GeCoOu
Title: GCOOD: A Generic Coupled Out-of-Distribution Detector for Robust Classification
Access Date: 2021, Sep. 24
Number of Files: 1
Size: 843 KiB
Context area
Author: 1 Moraes, Rogério Ferreira de
        2 Evangelista, Raphael dos S.
        3 Fernandes, Leandro A. F.
        4 Martí, Luis
Affiliation: 1 Universidade Federal Fluminense (UFF), Niterói, Brazil
             2 Universidade Federal Fluminense (UFF), Niterói, Brazil
             3 Universidade Federal Fluminense (UFF), Niterói, Brazil
             4 Inria Chile Research Center, Las Condes, Chile
Editor: Paiva, Afonso
        Menotti, David
        Baranoski, Gladimir V. G.
        Proença, Hugo Pedro
        Junior, Antonio Lopes Apolinario
        Papa, João Paulo
        Pagliosa, Paulo
        dos Santos, Thiago Oliveira
        e Sá, Asla Medeiros
        da Silveira, Thiago Lopes Trugillo
        Brazil, Emilio Vital
        Ponti, Moacir A.
        Fernandes, Leandro A. F.
        Avila, Sandra
Conference Name: Conference on Graphics, Patterns and Images, 34 (SIBGRAPI)
Conference Location: Gramado (Virtual), Brazil
Date: October 18th to October 22nd, 2021
Publisher: IEEE Computer Society
Publisher City: Los Alamitos
Book Title: Proceedings
Tertiary Type: Full Paper
Content and structure area
Is the master or a copy?: is the master
Content Stage: completed
Content Type: External Contribution
Keywords: Voronoi diagrams
Abstract: Neural networks have achieved high accuracy in classification tasks. However, when an out-of-distribution (OOD) sample (i.e., an entry from an unknown class) is submitted to the classification process, the result is the association of the sample with one or more of the trained classes with different degrees of confidence. If any of these confidence values is greater than the user-defined threshold, the network will mislabel the sample, undermining the model's credibility. Defining the acceptance threshold is itself a sensitive issue in the face of the classifier's overconfidence. This paper presents the Generic Coupled OOD Detector (GCOOD), a novel Convolutional Neural Network (CNN) tailored to detect whether an entry submitted to a trained classification model is an OOD sample for that model. By analyzing the Softmax output of any classifier, our approach can indicate whether the resulting classification should or should not be treated as a sample of one of the trained classes. To train our CNN, we developed a novel training strategy based on Voronoi diagrams of the locations of representative entries in the latent space of the classification model and on graph coloring. We evaluated our approach using ResNet, VGG, DenseNet, and SqueezeNet classifiers with images from the CIFAR-10 dataset.
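The abstract describes reading a classifier's Softmax output to decide whether an input is OOD. As a point of reference only (this is the standard maximum-softmax-probability thresholding baseline, not the paper's GCOOD detector, and the function names are illustrative), the thresholding idea can be sketched as:

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def msp_ood_flag(logits, threshold=0.9):
    """Flag entries whose maximum softmax probability falls below
    the user-defined threshold as likely out-of-distribution."""
    probs = softmax(np.asarray(logits, dtype=float))
    return probs.max(axis=-1) < threshold

# One confidently classified entry vs. one near-uniform (suspect) entry.
flags = msp_ood_flag([[10.0, 0.0, 0.0], [1.0, 1.1, 0.9]], threshold=0.9)
```

GCOOD replaces this single hand-picked threshold with a CNN trained on the full Softmax vector, which is precisely what makes the baseline's threshold-selection problem, noted in the abstract, disappear.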
doc Directory Content: access
source Directory Content: there are no files
agreement Directory Content:
agreement.html 14/09/2021 21:27 1.3 KiB
Conditions of access and use area
data URL:
zipped data URL:
Target File: 2021___Moraes_et_al____SIBGRAPI.pdf
Allied materials area
Notes area
Empty Fields: accessionnumber archivingpolicy archivist area callnumber copyholder copyright creatorhistory descriptionlevel dissemination documentstage doi edition electronicmailaddress group holdercode isbn issn label lineage mark nextedition nexthigherunit notes numberofvolumes orcid organization pages parameterlist parentrepositories previousedition previouslowerunit progress project readergroup readpermission resumeid rightsholder secondarydate secondarykey secondarymark secondarytype serieseditor session shorttitle sponsor subject tertiarymark type url versiontype volume
Description control area
e-Mail (login)