Model-free vision-based shaping of deformable plastic materials
We address the problem of autonomously shaping deformable plastic materials using non-prehensile actions. Shaping plastic objects is challenging, since they are difficult both to model and to track visually. We study this problem using kinetic sand, a plastic toy material that mimics the physical properties of wet sand. Inspired by a pilot study in which humans shaped kinetic sand, we define two types of actions: pushing the material from the sides and tapping it from above. The chosen actions are executed with a robotic arm using image-based visual servoing. From the current and desired views of the material, we define states based on visual features such as the outer contour shape and the pixel luminosity values. These states are mapped to actions, which are applied iteratively to reduce the image error until convergence. For pushing, we propose three methods for mapping the visual state to an action: heuristic methods and a neural network trained on human actions. We show that it is possible to autonomously achieve simple shapes with the kinetic sand, without explicitly modeling the material. Our approach is limited in the types of shapes it can achieve; a richer set of action types and multi-step reasoning would be needed for more sophisticated shapes.
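The iterative perceive-and-act loop described above (compare current and desired views, pick an action, repeat until the image error is small) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the error metric (mean absolute pixel difference) and the state-to-action mapping (`choose_action`, which here simply targets the pixel of largest discrepancy) are placeholder assumptions standing in for the contour- and luminosity-based features and the heuristic/learned mappings described in the abstract.

```python
import numpy as np

def image_error(current, desired):
    """Mean absolute pixel difference between current and desired views.
    (Stand-in for the paper's contour/luminosity-based image error.)"""
    return np.mean(np.abs(current.astype(float) - desired.astype(float)))

def choose_action(current, desired):
    """Hypothetical state-to-action mapping: act where the views differ most.
    Returns the (row, col) of the largest pixel discrepancy."""
    diff = np.abs(desired.astype(float) - current.astype(float))
    return np.unravel_index(np.argmax(diff), diff.shape)

def shaping_loop(current, desired, apply_action, tol=1.0, max_iters=50):
    """Iteratively select and apply actions until the image error
    falls below tol or the iteration budget is exhausted."""
    for _ in range(max_iters):
        if image_error(current, desired) < tol:
            break
        action = choose_action(current, desired)
        current = apply_action(current, action)  # robot acts, new view observed
    return current, image_error(current, desired)
```

Here `apply_action` abstracts the robot executing a push or tap and returning the newly observed view; in a toy setting where each action perfectly corrects one pixel, the loop converges to zero error.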
A. Cherubini, V. Ortenzi, A. Cosgun, R. Lee, P. Corke. Model-free vision-based shaping of deformable plastic materials. Int. Journal of Robotics Research Special Issue on Soft Manipulation (submitted).
A. Cherubini, J. Leitner, V. Ortenzi, P. Corke. Towards vision-based manipulation of plastic materials. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, IROS'18.
A. Cherubini, P. Corke. Towards sensor-based manipulation of flexible objects. ICRA 2017 Workshop on Sensor-Based Object Manipulation for Collaborative Assembly.
Last update on 13/04/2019