Identity statement area | |
Reference Type | Conference Paper (Conference Proceedings) |
Site | sibgrapi.sid.inpe.br |
Identifier | 8JMKD3MGPBW34M/3C8G5FH |
Repository | sid.inpe.br/sibgrapi/2012/07.06.21.52 |
Last Update | 2012:07.06.21.52.50 thalesv@gmail.com |
Metadata | sid.inpe.br/sibgrapi/2012/07.06.21.52.50 |
Metadata Last Update | 2020:02.19.02.18.27 administrator |
Citation Key | MirandaViMaLeViCa:2012:ReGeRe |
Title | Real-time gesture recognition from depth data through key poses learning and decision forests  |
Format | DVD, On-line. |
Year | 2012 |
Access Date | 2021, Jan. 24 |
Number of Files | 1 |
Size | 1840 KiB |
Context area | |
Author | 1. Miranda, Leandro; 2. Vieira, Thales; 3. Martinez, Dimas; 4. Lewiner, Thomas; 5. Vieira, Antônio W.; 6. Campos, Mario F. M.
Affiliation | 1. Mathematics, UFAL; 2. Mathematics, UFAL; 3. Mathematics, UFAL; 4. Mathematics, PUC-Rio; 5. Computer Science, UFMG; 6. Computer Science, UFMG
Editor | Freitas, Carla Maria Dal Sasso; Sarkar, Sudeep; Scopigno, Roberto; Silva, Luciano
e-Mail Address | thalesv@gmail.com |
Conference Name | Conference on Graphics, Patterns and Images, 25 (SIBGRAPI) |
Conference Location | Ouro Preto |
Date | Aug. 22-25, 2012 |
Book Title | Proceedings |
Publisher | IEEE Computer Society |
Publisher City | Los Alamitos |
Tertiary Type | Full Paper |
History | 2012-09-20 16:45:34 :: thalesv@gmail.com -> administrator :: 2012; 2020-02-19 02:18:27 :: administrator -> :: 2012
Content and structure area | |
Is the master or a copy? | is the master |
Content Stage | completed |
Transferable | 1 |
Content Type | External Contribution |
Keywords | Gesture recognition, Pose identification, Depth sensors, 3D motion, Natural user interface.
Abstract | Human gesture recognition is a challenging task with many applications. The popularization of real-time depth sensors further diversifies potential applications to end-user natural user interfaces (NUI). The quality of such an NUI depends heavily on the robustness and execution speed of the gesture recognition. This work introduces a method for real-time gesture recognition from a noisy skeleton stream, such as the ones extracted from Kinect depth sensors. Each pose is described using a tailored angular representation of the skeleton joints. Those descriptors serve to identify key poses through a multi-class classifier derived from Support Vector learning machines. The gesture is labeled on-the-fly from the key pose sequence through a decision forest that naturally performs the gesture time warping and avoids the requirement for an initial or neutral pose. The proposed method runs in real time and shows robustness in several experiments.
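
The abstract outlines a pipeline of angular pose descriptors, key-pose classification, and gesture labeling from the key-pose sequence. The sketch below is only an illustration of the first two stages, not the authors' code: it assumes hypothetical Kinect-style joint names, uses a torso-anchored frame and spherical angles as a plausible angular descriptor, and substitutes a nearest-key-pose lookup where the paper uses a multi-class SVM.

    # Illustrative sketch only (not the published implementation).
    # Joint names, limb pairs, and the distance threshold are assumptions.
    import numpy as np

    LIMBS = [("shoulder_l", "elbow_l"), ("elbow_l", "hand_l"),
             ("shoulder_r", "elbow_r"), ("elbow_r", "hand_r")]

    def torso_frame(joints):
        """Orthonormal frame anchored at the torso, so the descriptor is
        largely invariant to the sensor viewpoint."""
        up = joints["neck"] - joints["torso"]
        right = joints["shoulder_r"] - joints["shoulder_l"]
        up /= np.linalg.norm(up)
        right -= np.dot(right, up) * up
        right /= np.linalg.norm(right)
        forward = np.cross(up, right)
        return np.stack([right, up, forward])  # rows are the frame axes

    def angular_descriptor(joints):
        """Describe a pose by the spherical angles (theta, phi) of each limb
        segment expressed in the torso frame."""
        frame = torso_frame(joints)
        feats = []
        for a, b in LIMBS:
            v = frame @ (joints[b] - joints[a])
            v /= np.linalg.norm(v)
            theta = np.arccos(np.clip(v[1], -1.0, 1.0))  # inclination from 'up'
            phi = np.arctan2(v[2], v[0])                 # azimuth around 'up'
            feats.extend([theta, phi])
        return np.array(feats)

    def classify_key_pose(descriptor, key_poses, threshold=0.5):
        """Nearest stored key pose (a stand-in for the paper's multi-class
        SVM); returns None when no key pose is close enough."""
        best_label, best_dist = None, np.inf
        for label, ref in key_poses.items():
            d = np.linalg.norm(descriptor - ref)
            if d < best_dist:
                best_label, best_dist = label, d
        return best_label if best_dist < threshold else None

In the method described by the abstract, the stream of key-pose labels produced this way would then be fed to a decision forest that labels the gesture on-the-fly, implicitly handling time warping without a neutral starting pose.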
source Directory Content | there are no files |
agreement Directory Content | |
Conditions of access and use area | |
data URL | http://urlib.net/rep/8JMKD3MGPBW34M/3C8G5FH |
zipped data URL | http://urlib.net/zip/8JMKD3MGPBW34M/3C8G5FH |
Language | en |
Target File | gesture_learning_sibgrapi_certified.pdf |
User Group | thalesv@gmail.com |
Visibility | shown |
Allied materials area | |
Mirror Repository | sid.inpe.br/banon/2001/03.30.15.38.24 |
Host Collection | sid.inpe.br/banon/2001/03.30.15.38 |
Notes area | |
Empty Fields | accessionnumber archivingpolicy archivist area callnumber copyholder copyright creatorhistory descriptionlevel dissemination documentstage doi edition electronicmailaddress group holdercode isbn issn label lineage mark nextedition nexthigherunit notes numberofvolumes orcid organization pages parameterlist parentrepositories previousedition previouslowerunit progress project readergroup readpermission resumeid rightsholder secondarydate secondarykey secondarymark secondarytype serieseditor session shorttitle sponsor subject tertiarymark type url versiontype volume |