dc.contributor.author | Burke, Michael G |
dc.contributor.author | Lasenby, J |
dc.date.accessioned | 2016-01-20T09:47:30Z |
dc.date.available | 2016-01-20T09:47:30Z |
dc.date.issued | 2015-10 |
dc.identifier.citation | Burke, M.G. and Lasenby, J. 2015. Pantomimic gestures for human-robot interaction. IEEE Transactions on Robotics, vol. 31(5), pp 1225-1237 | en_US
dc.identifier.issn | 1552-3098 |
dc.identifier.uri | http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7274529 |
dc.identifier.uri | http://hdl.handle.net/10204/8345 |
dc.description | Copyright: 2015 IEEE Xplore. Due to copyright restrictions, the attached PDF file contains only the abstract of the full-text item. For access to the full text, please consult the publisher's website. The definitive version of the work is published in IEEE Transactions on Robotics, vol. 31(5), pp 1225-1237 | en_US
dc.description.abstract | This paper introduces a pantomimic gesture interface, which classifies human hand gestures using unmanned aerial vehicle (UAV) behavior recordings as training data. We argue that pantomimic gestures are more intuitive than iconic gestures and show that a pantomimic gesture recognition strategy using micro-UAV behavior recordings can be more robust than one trained directly using hand gestures. Hand gestures are isolated by applying a maximum information criterion, with features extracted using principal component analysis and compared using a nearest neighbor classifier. These features are biased in that they are better suited to classifying certain behaviors. We show how a Bayesian update step accounting for the geometry of training features compensates for this, resulting in fairer classification results, and introduce a weighted voting system to aid in sequence labeling. | en_US
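As a rough, illustrative sketch of the core pipeline the abstract describes (trajectory features extracted with principal component analysis, then labelled with a nearest-neighbour classifier), the Python fragment below uses hypothetical, randomly generated data. All names, dimensions, and class counts are invented for the example; this is not the authors' implementation, and the gesture isolation, Bayesian update, and weighted voting steps are omitted.

    # Illustrative sketch only: PCA feature extraction plus a nearest-neighbour
    # classifier, in the spirit of the pipeline described in the abstract.
    # All data, dimensions, and class counts below are hypothetical.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)

    # Hypothetical training set: each "behavior recording" is a fixed-length
    # trajectory (T samples x 3 axes), flattened into one feature vector.
    T = 50
    X_train = rng.standard_normal((40, T * 3))   # 40 recorded behaviors
    y_train = rng.integers(0, 4, size=40)        # 4 behavior classes

    # Project trajectories onto their principal components, then classify an
    # unseen gesture by its nearest neighbour in the reduced feature space.
    pca = PCA(n_components=10)
    features = pca.fit_transform(X_train)

    knn = KNeighborsClassifier(n_neighbors=1)
    knn.fit(features, y_train)

    X_test = rng.standard_normal((1, T * 3))     # one hypothetical gesture
    print(knn.predict(pca.transform(X_test)))    # predicted behavior class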
dc.language.iso | en | en_US
dc.publisher | IEEE Xplore | en_US
dc.relation.ispartofseries | Workflow;15659 |
dc.subject | Gesture recognition | en_US
dc.subject | Human-robot interaction | en_US
dc.subject | Pantomimic | en_US
dc.subject | PCA | en_US
dc.subject | Time series classification | en_US
dc.title | Pantomimic gestures for human-robot interaction | en_US
dc.type | Article | en_US
dc.identifier.apacitation | Burke, M. G., & Lasenby, J. (2015). Pantomimic gestures for human-robot interaction. http://hdl.handle.net/10204/8345 | en_ZA
dc.identifier.chicagocitation | Burke, Michael G, and J Lasenby. "Pantomimic gestures for human-robot interaction." (2015) http://hdl.handle.net/10204/8345 | en_ZA
dc.identifier.vancouvercitation | Burke MG, Lasenby J. Pantomimic gestures for human-robot interaction. 2015; http://hdl.handle.net/10204/8345. | en_ZA
dc.identifier.ris |
TY - Article
AU - Burke, Michael G
AU - Lasenby, J
AB - This paper introduces a pantomimic gesture interface, which classifies human hand gestures using unmanned aerial vehicle (UAV) behavior recordings as training data. We argue that pantomimic gestures are more intuitive than iconic gestures and show that a pantomimic gesture recognition strategy using micro-UAV behavior recordings can be more robust than one trained directly using hand gestures. Hand gestures are isolated by applying a maximum information criterion, with features extracted using principal component analysis and compared using a nearest neighbor classifier. These features are biased in that they are better suited to classifying certain behaviors. We show how a Bayesian update step accounting for the geometry of training features compensates for this, resulting in fairer classification results, and introduce a weighted voting system to aid in sequence labeling.
DA - 2015-10
DB - ResearchSpace
DP - CSIR
KW - Gesture recognition
KW - Human-robot interaction
KW - Pantomimic
KW - PCA
KW - Time series classification
LK - https://researchspace.csir.co.za
PY - 2015
SM - 1552-3098
T1 - Pantomimic gestures for human-robot interaction
TI - Pantomimic gestures for human-robot interaction
UR - http://hdl.handle.net/10204/8345
ER -
| en_ZA