Metadata

Identity statement area
Reference Type: Conference Paper (Conference Proceedings)
Site: sibgrapi.sid.inpe.br
Identifier: 6qtX3pFwXQZG2LgkFdY/LMmuu
Repository: sid.inpe.br/sibgrapi@80/2006/07.17.10.15
Last Update: 2006:07.17.10.15.54 administrator
Metadata: sid.inpe.br/sibgrapi@80/2006/07.17.10.15.55
Metadata Last Update: 2020:02.19.03.17.39 administrator
Citation Key: CamposMayoMurr:2006:DiAtWe
Title: Directing the attention of a wearable camera by pointing gestures
Format: On-line
Year: 2006
Date: 8-11 Oct. 2006
Access Date: 2021, Jan. 19
Number of Files: 1
Size: 334 KiB
Context area
Author: 1 de Campos, Teofilo
2 Mayol Cuevas, Walterio W.
3 Murray, David W.
Affiliation: 1 Department of Engineering Science, University of Oxford
2 Department of Computer Science, University of Bristol
3 Department of Engineering Science, University of Oxford
Editor: Oliveira Neto, Manuel Menezes de
Carceroni, Rodrigo Lima
e-Mail Address: teo@robots.ox.ac.uk
Conference Name: Brazilian Symposium on Computer Graphics and Image Processing, 19 (SIBGRAPI)
Conference Location: Manaus
Book Title: Proceedings
Publisher: IEEE Computer Society
Publisher City: Los Alamitos
Tertiary Type: Full Paper
History: 2006-07-17 10:15:55 :: teo.decampos -> banon ::
2006-08-30 21:48:03 :: banon -> teo.decampos ::
2008-07-17 14:11:03 :: teo.decampos -> administrator ::
2009-08-13 20:38:06 :: administrator -> banon ::
2010-08-28 20:02:23 :: banon -> administrator ::
2020-02-19 03:17:39 :: administrator -> :: 2006
Content and structure area
Is the master or a copy?: is the master
Content Stage: completed
Transferable: 1
Content Type: External Contribution
Keywords: 3D hand tracking, hand detection, wearable robots.
Abstract: Wearable visual sensors provide views of the environment which are rich in information about the wearer's location, interactions and intentions. In the wearable domain, hand gesture recognition is the natural replacement for keyboard input. We describe a framework combining a coarse-to-fine method for shape detection and a 3D tracking method that can identify pointing gestures and estimate their direction. The low computational complexity of both methods allows a real-time implementation that is applied to estimate the user's focus of attention and to control fast redirections of gaze of a wearable active camera. Experiments demonstrate the robustness of this system in long and noisy image sequences.
source Directory Content: there are no files
agreement Directory Content: there are no files
Conditions of access and use area
data URL: http://urlib.net/rep/6qtX3pFwXQZG2LgkFdY/LMmuu
zipped data URL: http://urlib.net/zip/6qtX3pFwXQZG2LgkFdY/LMmuu
Language: en
Target File: deCamposT-Gestures.pdf
User Group: teo.decampos
administrator
Visibility: shown
Allied materials area
Host Collection: sid.inpe.br/banon/2001/03.30.15.38
Notes area
Empty Fields: accessionnumber archivingpolicy archivist area callnumber copyholder copyright creatorhistory descriptionlevel dissemination documentstage doi edition electronicmailaddress group holdercode isbn issn label lineage mark mirrorrepository nextedition nexthigherunit notes numberofvolumes orcid organization pages parameterlist parentrepositories previousedition previouslowerunit progress project readergroup readpermission resumeid rightsholder secondarydate secondarykey secondarymark secondarytype serieseditor session shorttitle sponsor subject tertiarymark type url versiontype volume
