Identity statement area
Reference Type: Conference Paper (Conference Proceedings)
Last Update: 2017:
Metadata Last Update: 2020: administrator
Citation Key: Mesquita:2017:ViSeOb
Title: Visual Search for Object Instances Guided by Visual Attention Algorithms
Date: Oct. 17-20, 2017
Access Date: 2021, Jan. 21
Number of Files: 1
Size: 1905 KiB
Context area
Author: Mesquita, Rafael Galvão de
Affiliation: Universidade Federal de Pernambuco
Editor: Torchelsen, Rafael Piccin
Nascimento, Erickson Rangel do
Panozzo, Daniele
Liu, Zicheng
Farias, Mylène
Viera, Thales
Sacht, Leonardo
Ferreira, Nivan
Comba, João Luiz Dihl
Hirata, Nina
Schiavon Porto, Marcelo
Vital, Creto
Pagot, Christian Azambuja
Petronetto, Fabiano
Clua, Esteban
Cardeal, Flávio
Conference Name: Conference on Graphics, Patterns and Images, 30 (SIBGRAPI)
Conference Location: Niterói, RJ
Book Title: Proceedings
Publisher: Sociedade Brasileira de Computação
Publisher City: Porto Alegre
Tertiary Type: Master's or Doctoral Work
History: 2017-09-05 16:05:38 :: -> administrator ::
2020-02-20 22:06:47 :: administrator -> :: 2017
Content and structure area
Is the master or a copy? is the master
Content Stage: completed
Keywords: visual search, saliency detection, visual attention, object recognition, local feature detectors/descriptors, matching
Abstract: Visual attention is the process by which the human brain prioritizes and controls visual stimuli; among other characteristics of the visual system, it is responsible for how quickly human beings interact with the environment, even given the large amount of information to be processed. Visual attention can be driven by a bottom-up mechanism, in which low-level stimuli of the analysed scene, such as color, guide the focus to salient regions (regions that are distinguished from their neighborhood or from the whole scene), or by a top-down mechanism, in which cognitive factors, such as expectations or the goal of completing a certain task, define the attended location. This Thesis investigates the use of visual attention algorithms to guide (and to accelerate) the search for objects in digital images. Inspired by the bottom-up mechanism, a saliency detector based on an estimate of the scene's background combined with the result of a Laplacian-based operator, referred to as BLS (Background Laplacian Saliency), is proposed. Moreover, a modification of the SURF (Speeded-Up Robust Features) local feature detector/descriptor, named patch-based SURF, is designed so that recognition occurs iteratively at each focused location of the scene, instead of the classical recognition (classic search), in which the whole scene is analysed at once. The search mode in which patch-based SURF is applied and the order of the image regions to be analysed is defined by a saliency detection algorithm is called BGMS. BLS and nine other state-of-the-art saliency detection algorithms are evaluated in the BGMS. Results indicate, on average, a reduction to (i) 73% of the classic search processing time just by applying patch-based SURF in a random search, and (ii) 53% of this time when the search is guided by BLS.
When other state-of-the-art saliency detection algorithms are used, between 55% and 133% of the classic search processing time is needed to perform recognition. Moreover, inspired by the top-down mechanism, the BGCO is proposed, in which the visual search prioritizes scene descriptors according to their Hamming distance to the descriptors of a given target object. The BGCO uses Bloom filters to represent feature vectors that are similar to the descriptors of the searched object, and it has constant space and time complexity with respect to the number of elements in the set of descriptors of the target. Experiments showed a reduction in processing time to 80% of that required by the classic search. Finally, by using the BGMS and the BGCO in an integrated way, the search processing time was reduced to 44% of the execution time required by the classic search.
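The abstract's BGCO idea — a constant-space membership structure that flags scene descriptors similar to the target's so they can be matched first — can be illustrated with a minimal Bloom filter. The class below and the toy byte-string descriptors are illustrative assumptions only, not the Thesis's implementation (which operates on SURF-derived descriptors and a Hamming-distance criterion):

```python
from hashlib import sha256

class BloomFilter:
    """Minimal Bloom filter over byte strings (illustrative sketch only)."""

    def __init__(self, size_bits=1024, n_hashes=3):
        self.size = size_bits
        self.n_hashes = n_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item):
        # Derive n_hashes bit positions from salted SHA-256 digests.
        for salt in range(self.n_hashes):
            h = int.from_bytes(sha256(bytes([salt]) + item).digest()[:8], "big")
            yield h % self.size

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item):
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(item))

# Hypothetical binary descriptors, stand-ins for feature vectors
# quantized to bit strings.
target_like = [b"\x0f\xaa", b"\x10\xbb"]        # vectors near the target's
scene = [b"\xff\x00", b"\x10\xbb", b"\x0f\xaa"]  # descriptors found in the scene

bf = BloomFilter()
for d in target_like:
    bf.add(d)

# Prioritize scene descriptors that hit the filter: probable matches are
# examined first, descriptors unlike the target's are deferred.
prioritized = sorted(scene, key=lambda d: d not in bf)
```

Because the filter is a fixed-size bit array, both insertion and lookup cost a fixed number of hash evaluations, which matches the constant space and time complexity claimed for the BGCO (at the price of a small false-positive rate).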
source Directory Content: there are no files
agreement Directory Content:
agreement.html 05/09/2017 13:05 1.2 KiB 
Conditions of access and use area
data URL
zipped data URL
Target File: MesquitaMello_final.pdf
Update Permission: not transferred
Allied materials area
Next Higher Units: 8JMKD3MGPAW/3PJT9LS
Notes area
Empty Fields: accessionnumber archivingpolicy archivist area callnumber contenttype copyholder copyright creatorhistory descriptionlevel dissemination doi edition electronicmailaddress group holdercode isbn issn label lineage mark nextedition notes numberofvolumes orcid organization pages parameterlist parentrepositories previousedition previouslowerunit progress project readergroup readpermission resumeid rightsholder secondarydate secondarykey secondarymark secondarytype serieseditor session shorttitle sponsor subject tertiarymark type url versiontype volume