This work proposes a framework that enables arbitrary robots with unknown kinematic models to acquire skills by imitating human demonstrations, and to reproduce them in real time. The diversity of robots operating outside laboratory environments is growing constantly, and we therefore present an approach that allows users to easily teach a skill to a robot of any body configuration. Our method takes a motion trajectory captured from human demonstrations with a Kinect sensor and projects it onto a corresponding human skeleton model. The kinematic mapping between the robot and the human model is learned using Local Procrustes Analysis, which enables the demonstrated trajectory to be transferred from the human model to the robot. Finally, the transferred trajectory is modeled using Dynamic Movement Primitives, allowing it to be reproduced in real time. Experiments in simulation on a 4-degree-of-freedom robot show that our method correctly imitates various skills demonstrated by a human.
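
Local Procrustes Analysis (LPA) learns the human-to-robot kinematic mapping from paired pose data by fitting locally weighted linear alignments. As a minimal sketch of its building block, the hypothetical Python helper below (illustrative names and shapes, not the authors' implementation) fits a single global Procrustes transform: the scale, rotation, and translation that best map human-model poses onto robot poses in the least-squares sense. LPA fits one such transform per local neighborhood of the data and blends them.

import numpy as np

def procrustes_map(X, Y):
    # Fit one similarity transform (scale s, rotation R, translation) that
    # maps paired human poses X (n x d) onto robot poses Y (n x d).
    # This is the local alignment LPA fits per neighborhood, sketched globally.
    muX, muY = X.mean(axis=0), Y.mean(axis=0)
    A, B = X - muX, Y - muY
    # Orthogonal Procrustes: rotation from the SVD of the cross-covariance.
    U, S, Vt = np.linalg.svd(B.T @ A)
    D = np.eye(X.shape[1])
    D[-1, -1] = np.sign(np.linalg.det(U @ Vt))     # guard against reflections
    R = U @ D @ Vt
    s = (S * D.diagonal()).sum() / (A ** 2).sum()  # least-squares scale
    return lambda q: s * (q - muX) @ R.T + muY     # maps new human poses

A new human-model configuration q is then sent to the robot's space with procrustes_map(X, Y)(q); in the locally weighted case, the outputs of the local maps would be blended with weights that decay with distance from each neighborhood.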
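
The transferred joint trajectory is then encoded with Dynamic Movement Primitives (DMPs) so it can be reproduced, and adapted to new start and goal states, in real time. The sketch below assumes the standard one-dimensional discrete DMP formulation (one primitive per joint, Euler integration); the class name, gains, and basis count are illustrative choices, not the paper's parameters.

import numpy as np

class DMP1D:
    # Minimal 1-D discrete DMP: a spring-damper pulled toward the goal g,
    # shaped by a learned forcing term that fades with the phase x.
    def __init__(self, n_basis=20, alpha_z=25.0, beta_z=6.25, alpha_x=4.0):
        self.alpha_z, self.beta_z, self.alpha_x = alpha_z, beta_z, alpha_x
        # Gaussian basis centres placed along the exponentially decaying phase.
        self.c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
        self.h = n_basis ** 1.5 / self.c   # common width heuristic
        self.w = np.zeros(n_basis)

    def fit(self, y, dt):
        # Learn forcing-term weights from one demonstrated trajectory y(t)
        # by locally weighted regression, one weight per basis function.
        T = len(y)
        yd = np.gradient(y, dt)
        ydd = np.gradient(yd, dt)
        self.y0, self.g = y[0], y[-1]
        x = np.exp(-self.alpha_x * dt * np.arange(T))   # canonical system
        f_target = ydd - self.alpha_z * (self.beta_z * (self.g - y) - yd)
        s = x * (self.g - self.y0)
        for i in range(len(self.w)):
            psi = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = (s * psi @ f_target) / (s * psi @ s + 1e-10)

    def rollout(self, dt, T):
        # Integrate the learned system forward to reproduce the motion.
        y, yd, x, out = self.y0, 0.0, 1.0, []
        for _ in range(T):
            psi = np.exp(-self.h * (x - self.c) ** 2)
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (self.g - self.y0)
            ydd = self.alpha_z * (self.beta_z * (self.g - y) - yd) + f
            yd += ydd * dt
            y += yd * dt
            x += -self.alpha_x * x * dt
            out.append(y)
        return np.array(out)

For example, fitting one joint of a transferred trajectory and replaying it:

t = np.linspace(0.0, 1.0, 200)
demo = np.sin(np.pi * t)    # stand-in for one transferred joint angle
dmp = DMP1D()
dmp.fit(demo, dt=t[1] - t[0])
reproduction = dmp.rollout(dt=t[1] - t[0], T=200)

Because the start y0 and goal g enter the dynamics explicitly, the same learned weights reproduce the shape of the motion for new targets, which is what makes real-time reproduction cheap.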
Reference:
Hiratsuka, M., Makondo, N., Rosman, B.S. and Hasegawa, O. 2016. Trajectory learning from human demonstrations via manifold mapping. In: 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 9-14 October 2016, Daejeon Convention Center, Daejeon, Korea. http://hdl.handle.net/10204/8935