Metadata

Identity statement area
Reference Type: Conference Paper (Conference Proceedings)
Site: sibgrapi.sid.inpe.br
Identifier: 8JMKD3MGPBW34M/3C8G5FH
Repository: sid.inpe.br/sibgrapi/2012/07.06.21.52
Last Update: 2012:07.06.21.52.50 thalesv@gmail.com
Metadata: sid.inpe.br/sibgrapi/2012/07.06.21.52.50
Metadata Last Update: 2020:02.19.02.18.27 administrator
Citation Key: MirandaViMaLeViCa:2012:ReGeRe
Title: Real-time gesture recognition from depth data through key poses learning and decision forests
Format: DVD, On-line.
Year: 2012
Access Date: 2021, Jan. 24
Number of Files: 1
Size: 1840 KiB
Context area
Author: 1 Miranda, Leandro
2 Vieira, Thales
3 Martinez, Dimas
4 Lewiner, Thomas
5 Vieira, Antônio W.
6 Campos, Mario F. M.
Affiliation: 1 Mathematics, UFAL
2 Mathematics, UFAL
3 Mathematics, UFAL
4 Mathematics, PUC-Rio
5 Computer Science, UFMG
6 Computer Science, UFMG
Editor: Freitas, Carla Maria Dal Sasso
Sarkar, Sudeep
Scopigno, Roberto
Silva, Luciano
e-Mail Address: thalesv@gmail.com
Conference Name: Conference on Graphics, Patterns and Images, 25 (SIBGRAPI)
Conference Location: Ouro Preto
Date: Aug. 22-25, 2012
Book Title: Proceedings
Publisher: IEEE Computer Society
Publisher City: Los Alamitos
Tertiary Type: Full Paper
History: 2012-09-20 16:45:34 :: thalesv@gmail.com -> administrator :: 2012
2020-02-19 02:18:27 :: administrator -> :: 2012
Content and structure area
Is the master or a copy?: is the master
Content Stage: completed
Transferable: 1
Content Type: External Contribution
Keywords: Gesture recognition, Pose identification, Depth sensors, 3D motion, Natural user interface.
Abstract: Human gesture recognition is a challenging task with many applications. The popularization of real-time depth sensors further diversifies potential applications toward end-user natural user interfaces (NUI). The quality of such an NUI depends heavily on the robustness and execution speed of the gesture recognition. This work introduces a method for real-time gesture recognition from a noisy skeleton stream, such as the ones extracted from Kinect depth sensors. Each pose is described using a tailored angular representation of the skeleton joints. These descriptors serve to identify key poses through a multi-class classifier derived from Support Vector learning machines. The gesture is labeled on the fly from the key pose sequence through a decision forest that naturally performs the gesture time warping and avoids the requirement for an initial or neutral pose. The proposed method runs in real time and shows robustness in several experiments.
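The pose-description step summarized in the abstract can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the joint names and angle triples are assumptions, and a nearest-neighbor match stands in for the paper's SVM-derived multi-class key-pose classifier (the decision-forest gesture labeling over the key-pose sequence is not shown).

```python
import math

def joint_angle(a, b, c):
    """Angle (radians) at joint b, formed by the segments b->a and b->c."""
    u = [a[i] - b[i] for i in range(3)]
    v = [c[i] - b[i] for i in range(3)]
    dot = sum(u[i] * v[i] for i in range(3))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def pose_descriptor(joints, triples):
    """Concatenate joint angles into a translation/scale-invariant descriptor.

    joints: dict mapping joint name -> (x, y, z); triples: (a, b, c) names.
    """
    return [joint_angle(joints[a], joints[b], joints[c]) for a, b, c in triples]

def classify_key_pose(desc, key_poses):
    """Nearest key pose by Euclidean distance (stand-in for the SVM step)."""
    return min(key_poses, key=lambda label: math.dist(desc, key_poses[label]))
```

For example, a right-angle elbow yields a descriptor entry near pi/2, which a nearest-neighbor lookup would match to a stored "bent arm" key pose rather than a "straight arm" one.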
source Directory Content: there are no files
agreement Directory Content:
agreement.html 06/07/2012 18:52 0.7 KiB
Conditions of access and use area
data URL: http://urlib.net/rep/8JMKD3MGPBW34M/3C8G5FH
zipped data URL: http://urlib.net/zip/8JMKD3MGPBW34M/3C8G5FH
Language: en
Target File: gesture_learning_sibgrapi_certified.pdf
User Group: thalesv@gmail.com
Visibility: shown
Allied materials area
Mirror Repository: sid.inpe.br/banon/2001/03.30.15.38.24
Host Collection: sid.inpe.br/banon/2001/03.30.15.38