dc.contributor.author |
Kleinhans, A
|
|
dc.contributor.author |
Rosman, Benjamin S
|
|
dc.contributor.author |
Michalik, M
|
|
dc.contributor.author |
Tripp, B
|
|
dc.contributor.author |
Detry, R
|
|
dc.date.accessioned |
2016-07-11T10:42:50Z |
|
dc.date.available |
2016-07-11T10:42:50Z |
|
dc.date.issued |
2015-05 |
|
dc.identifier.citation |
Kleinhans, A., Rosman, B.S., Michalik, M., Tripp, B. and Detry, R. 2015. G3DB: A database of successful and failed grasps with RGB-D images, point clouds, mesh models and gripper parameters. In: Workshop on Robotic Hands, Grasping, and Manipulation, at the IEEE International Conference on Robotics and Automation, 26-30 May 2015, Seattle, Washington, USA |
en_US |
dc.identifier.uri |
http://www.benjaminrosman.com/papers/2015-hgm-abstract.pdf
|
|
dc.identifier.uri |
http://hdl.handle.net/10204/8613
|
|
dc.description |
Workshop on Robotic Hands, Grasping, and Manipulation, at the IEEE International Conference on Robotics and Automation, 26-30 May 2015, Seattle, Washington, USA |
en_US |
dc.description.abstract |
Autonomous grasping is a problem that receives continuous attention from our community because it is both key to many applications and difficult to solve. The complexity of robot grasping is counter-intuitive. For us humans, planning a grasp is a trivial task that requires neither considerable effort nor time; it is difficult to imagine the tremendous challenges that our brain has to overcome to allow us to interact with objects with such ease. Firstly, much information is missing. The weight, mass distribution, or friction of an object cannot be measured prior to manipulating it, and these properties have a dramatic impact on the behavior of a grasp. We thus have to infer the most likely values for them from past experience or common sense, and adapt to the actual object properties during the grasp. Another key piece of missing information is the object’s 3D shape. Humans and mobile robots perceive the world from a single viewpoint, making the back and often the sides of an object inaccessible to our senses. We are thus forced to consider grasps in which at least one finger comes into contact with a part of the object that we cannot perceive. Further, the faces of an object that we do perceive are sampled through noisy sensors, yielding unreliable depth or color. To complete the list of challenges associated with grasping, we note that grasps are parametrized in a high-dimensional space: six parameters to fix the position and orientation of the gripper, plus the parameters that define the shape of the hand – 25 for a human hand, 4 in the case of the Barrett hand considered in this paper. The space to explore to plan a grasp is high-dimensional, and, given the mechanical complexity of grasping, good solutions are sparsely distributed. |
en_US |
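The grasp parametrization described in the abstract (a six-parameter gripper pose plus the hand-shape parameters, 4 for a Barrett hand) can be sketched as a simple data structure. This is an illustrative assumption, not the G3DB schema; the field names below are hypothetical:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class GraspParameters:
    """A grasp as a point in a high-dimensional parameter space.

    Six numbers fix the gripper pose (position plus orientation),
    and additional numbers define the hand shape: 4 joint
    parameters for a Barrett hand, 25 for a human hand.
    """
    position: List[float]      # x, y, z of the gripper (3 values)
    orientation: List[float]   # e.g. roll, pitch, yaw (3 values)
    hand_shape: List[float]    # joint parameters (4 for a Barrett hand)

    def dimensionality(self) -> int:
        # Total size of the search space a grasp planner must explore.
        return len(self.position) + len(self.orientation) + len(self.hand_shape)

# A Barrett-hand grasp lives in a 6 + 4 = 10-dimensional space.
barrett_grasp = GraspParameters(
    position=[0.1, 0.0, 0.25],
    orientation=[0.0, 1.57, 0.0],
    hand_shape=[0.5, 0.5, 0.5, 0.0],  # three finger flexions + spread
)
print(barrett_grasp.dimensionality())  # 10
```

The sketch only illustrates why the planning space is high-dimensional and why, as the abstract notes, good solutions are sparsely distributed within it.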
dc.language.iso |
en |
en_US |
dc.publisher |
ICRA 2015 |
en_US |
dc.relation.ispartofseries |
Workflow;16651 |
|
dc.subject |
Grasping |
en_US |
dc.subject |
Database |
en_US |
dc.subject |
Manipulation |
en_US |
dc.subject |
Neurophysiology |
en_US |
dc.title |
G3DB: A database of successful and failed grasps with RGB-D images, point clouds, mesh models and gripper parameters |
en_US |
dc.type |
Conference Presentation |
en_US |
dc.identifier.apacitation |
Kleinhans, A., Rosman, B. S., Michalik, M., Tripp, B., & Detry, R. (2015). G3DB: A database of successful and failed grasps with RGB-D images, point clouds, mesh models and gripper parameters. ICRA 2015. http://hdl.handle.net/10204/8613 |
en_ZA |
dc.identifier.chicagocitation |
Kleinhans, A., Benjamin S. Rosman, M. Michalik, B. Tripp, and R. Detry. "G3DB: A database of successful and failed grasps with RGB-D images, point clouds, mesh models and gripper parameters." (2015): http://hdl.handle.net/10204/8613 |
en_ZA |
dc.identifier.vancouvercitation |
Kleinhans A, Rosman BS, Michalik M, Tripp B, Detry R. G3DB: A database of successful and failed grasps with RGB-D images, point clouds, mesh models and gripper parameters; ICRA 2015; 2015. http://hdl.handle.net/10204/8613 . |
en_ZA |
dc.identifier.ris |
TY - Conference Presentation
AU - Kleinhans, A
AU - Rosman, Benjamin S
AU - Michalik, M
AU - Tripp, B
AU - Detry, R
AB - Autonomous grasping is a problem that receives continuous attention from our community because it is both key to many applications and difficult to solve. The complexity of robot grasping is counter-intuitive. For us humans, planning a grasp is a trivial task that requires neither considerable effort nor time; it is difficult to imagine the tremendous challenges that our brain has to overcome to allow us to interact with objects with such ease. Firstly, much information is missing. The weight, mass distribution, or friction of an object cannot be measured prior to manipulating it, and these properties have a dramatic impact on the behavior of a grasp. We thus have to infer the most likely values for them from past experience or common sense, and adapt to the actual object properties during the grasp. Another key piece of missing information is the object’s 3D shape. Humans and mobile robots perceive the world from a single viewpoint, making the back and often the sides of an object inaccessible to our senses. We are thus forced to consider grasps in which at least one finger comes into contact with a part of the object that we cannot perceive. Further, the faces of an object that we do perceive are sampled through noisy sensors, yielding unreliable depth or color. To complete the list of challenges associated with grasping, we note that grasps are parametrized in a high-dimensional space: six parameters to fix the position and orientation of the gripper, plus the parameters that define the shape of the hand – 25 for a human hand, 4 in the case of the Barrett hand considered in this paper. The space to explore to plan a grasp is high-dimensional, and, given the mechanical complexity of grasping, good solutions are sparsely distributed.
DA - 2015-05
DB - ResearchSpace
DP - CSIR
KW - Grasping
KW - Database
KW - Manipulation
KW - Neurophysiology
LK - https://researchspace.csir.co.za
PY - 2015
T1 - G3DB: A database of successful and failed grasps with RGB-D images, point clouds, mesh models and gripper parameters
TI - G3DB: A database of successful and failed grasps with RGB-D images, point clouds, mesh models and gripper parameters
UR - http://hdl.handle.net/10204/8613
ER -
|
en_ZA |