Publication

Abstract

In this paper we present an automated system that tracks and grasps a moving object within the workspace of a manipulator, using range images acquired with a Microsoft Kinect sensor. Real-time tracking is achieved by a geometric particle filter on the affine group. Based on the tracker output, the pose of a 7-DoF WAM robotic arm is continuously updated using dynamic motor primitives until a distance measure between the tracked object and the gripper mounted on the arm falls below a threshold; the gripper then closes its three fingers and grasps the object. The tracker runs in real time and is robust to noise and partial occlusions. Using only depth data makes the tracker independent of texture, which is one of the key design goals of our approach. An experimental evaluation is provided, along with a comparison of the proposed tracker with state-of-the-art approaches, including the OpenNI tracker. The developed system is integrated with ROS and made available as part of IRI's ROS stack.
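To give a sense of the overall control flow described above (track, approach, grasp when close enough), the following is a minimal Python sketch. The interfaces get_depth_frame, get_gripper_position, move_arm_towards and close_gripper, as well as the threshold value, are hypothetical placeholders and are not part of the paper or of IRI's ROS stack; in the actual system the tracking step is the geometric particle filter on the affine group and the arm motion is generated with dynamic motor primitives.

import numpy as np

GRASP_DISTANCE_THRESHOLD = 0.03  # metres; illustrative value, not from the paper


def track_object(depth_frame, particles):
    """Placeholder for the tracking step.

    In the real system this propagates particles on the affine group and
    weights them against the range image; here the pose is reduced to a
    3-D position and estimated as the particle mean, for brevity only.
    """
    estimated_position = np.mean(particles, axis=0)
    return estimated_position, particles


def control_loop(get_depth_frame, get_gripper_position,
                 move_arm_towards, close_gripper):
    """Illustrative track-approach-grasp loop (assumed interfaces)."""
    particles = np.random.randn(100, 3)  # toy particle initialisation
    while True:
        depth_frame = get_depth_frame()
        object_position, particles = track_object(depth_frame, particles)
        distance = np.linalg.norm(object_position - get_gripper_position())
        if distance < GRASP_DISTANCE_THRESHOLD:
            close_gripper()  # three-finger grasp once the object is close enough
            break
        # In the paper the arm pose is updated with dynamic motor primitives;
        # here that step is abstracted into a single call.
        move_arm_towards(object_position)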

Categories

computer vision, manipulators, robot kinematics.

Scientific reference

F. Husain, A. Colomé, B. Dellen, G. Alenyà and C. Torras. Realtime tracking and grasping of a moving object from range video. 2014 IEEE International Conference on Robotics and Automation, Hong Kong, China, pp. 2617-2622.