By Dov Katz, Oliver Brock | Robotics and Biology Lab
Abstract: Robust robotic manipulation and perception remain difficult challenges, particularly in unstructured environments. To address these challenges, we propose to couple manipulation and perception: the robot observes its own deliberate interactions with the world.
These interactions reveal sensory information that would otherwise remain hidden and facilitate the interpretation of perceptual data. To demonstrate the effectiveness of interactive perception, we present a skill for manipulating articulated objects. Using this skill, we show how UMan, our mobile manipulation platform, obtains the kinematic model of an unknown object; this model then enables the robot to perform purposeful manipulation of that object. Our algorithm is highly robust, requires no prior knowledge of the object, is insensitive to lighting, texture, color, and specularities, and is computationally efficient.
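One core step behind recovering a kinematic model from observed interaction is grouping tracked visual features into rigid bodies: features that belong to the same rigid part keep constant mutual distances while the object moves, whereas features on different parts (connected by a joint) do not. The sketch below illustrates this idea on synthetic 2-D trajectories; the function names and the distance-variation threshold are illustrative assumptions, not the paper's actual implementation.

```python
import math

def dist(p, q):
    # Euclidean distance between two 2-D points
    return math.hypot(p[0] - q[0], p[1] - q[1])

def cluster_rigid_bodies(tracks, tol=1e-3):
    """Group feature trajectories into rigid bodies.

    tracks: list of trajectories; each trajectory is a list of (x, y)
    positions, one per frame. Two features are assigned to the same
    rigid body if their mutual distance stays (near-)constant over time.
    (Hypothetical sketch, not the authors' implementation.)
    """
    n = len(tracks)
    parent = list(range(n))  # union-find over feature indices

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            ds = [dist(a, b) for a, b in zip(tracks[i], tracks[j])]
            if max(ds) - min(ds) < tol:  # rigidly connected
                parent[find(i)] = find(j)

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(sorted(g) for g in groups.values())

# Synthetic example: features 0-1 are static; features 2-3 rotate
# together about the origin, as if on the far side of a revolute joint.
frames = 20
tracks = [[(1.0, 0.0)] * frames, [(1.0, 1.0)] * frames]
for radius, phase in [(2.0, 0.0), (3.0, 0.5)]:
    tracks.append([(radius * math.cos(0.1 * t + phase),
                    radius * math.sin(0.1 * t + phase))
                   for t in range(frames)])

print(cluster_rigid_bodies(tracks))  # → [[0, 1], [2, 3]]
```

Once features are grouped into rigid bodies, the relative motion between body pairs can be examined to classify the connecting joint (e.g., rotation about a fixed point suggests a revolute joint, pure translation a prismatic one).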