See and Touch: 1st Workshop on multimodal sensor-based robot control for HRI and soft manipulation
Organizers: A. Cherubini, Y. Mezouar, D. Navarro-Alarcon, M. Prats, J. A. Corrales Ramon
Recent technological developments in bio-inspired sensors have made them affordable and lightweight, and have therefore eased their use on robots, in particular on anthropomorphic ones (e.g., humanoids and dexterous hands). These sensors include RGB-D cameras, tactile skins, force/moment transducers, and capacitive proximity sensors.
Historically, heterogeneous sensor data was fed to fusion algorithms (e.g., Kalman- or Bayesian-based methods) to provide state estimation for modeling the environment. However, since these sensors generally measure different physical phenomena, it is preferable to use them directly in the low-level servo controller rather than to apply multi-sensory fusion or to design complex state machines. This idea, originally proposed in the hybrid position-force control paradigm, brings new challenges to controller design when extended to feedback from multiple sensors, e.g., challenges related to the sensors' characteristics (synchronization, hybrid control, task compatibility, etc.) or to the task representation.
Nevertheless, this approach best represents many of our cognitive processes (which directly link perception and action), and it is fundamental in many innovative robotic applications, such as soft material manipulation and human-robot interaction. Whole-body control is another field of research that would greatly profit from the discussed methods: multiple tasks (manipulation, self-collision avoidance, etc.) can be realized simultaneously by exploiting the diverse sensing capabilities of the robot body.
The purpose of this workshop is to bring together researchers with common interests in the area of multimodal servo-control, based on a variety of feedback signals, including vision (2D and 3D), touch (haptics), position, force, proximity (from capacitive measurements) etc.
Multimodal robot control, based on the concurrent use of various sensors directly at the control level, is crucial in many applications. For instance, human-robot interaction for collaborative tasks often relies on force/tactile feedback to transmit the user's intention to the robot. However, the robot should be capable of recognizing this intention even without direct contact between the two. A possible solution comes from visual data, which should then be combined with haptics to obtain the best result. This is of particular interest in whole-body control of humanoid robots, since their actuators and sensors are generally bio-inspired to facilitate interaction with humans.
The automatic manipulation of soft materials (e.g., in the food industry) represents a second important case study. The natural evolution of recent work on vision-based servoing of soft objects is the integration of haptic and force feedback.
For all these reasons, we believe that adaptive sensor-based methods directly linking perception to action, as in the above-mentioned approaches, can provide better solutions in unpredictable scenarios than traditional planning and model-based techniques, which require a priori models of the environment.
We propose a half-day workshop to foster active collaboration and discuss formal methods for sensor-based control. The invited speakers will share their experience and give insight into the evolution and current status of multimodal control. The workshop is also open to paper submissions, and the final schedule will be adapted depending on the quantity and quality of the submissions. We will organize a poster session for the submitted papers to ease interaction and discussion between participants.
Prospective participants are required to submit an extended abstract (maximum 2 pages in length), but videos are also welcome!
All submissions will be reviewed using a single-blind review process.
Accepted contributions will be presented during the workshop as posters.
Submissions must be sent as a PDF, following the IEEE conference style (two-column), to:
indicating [IROS 2015 Workshop] in the e-mail subject.
After the workshop, the organizing committee will consider pursuing publication of extended versions of the best papers in a journal special issue or an edited book.
Submission Deadline: August 27
Notification of acceptance: September 2
Camera-ready deadline: September 4
Workshop day: September 28
João Bimbo, King's College London, United Kingdom
Jeannette Bohg, Max-Planck Institute for Intelligent Systems, Tübingen, Germany
Gianni Borghesan, University of Leuven, Belgium
Giorgio Cannata, Università degli Studi di Genova, Italy
Eris Chinellato, University of Leeds, United Kingdom
Juan Antonio Corrales Ramón, Institut Pascal, Clermont-Ferrand, France
Joris De Schutter, University of Leuven, Belgium
Robert Haschke, Universität Bielefeld, Germany
Björn Hein, Karlsruhe Institute of Technology, Germany
Norman Hendrich, TAMS, Universität Hamburg, Germany
Wang Hesheng, Shanghai Jiao Tong University, China
Olivier Kermorgant, Université de Strasbourg, France
Jun Kinugawa, Tohoku University, Japan
James Kuffner, Google Research, USA
Zheng Li, The Chinese University of Hong Kong, China
Philip Edward Long, IRT Jules Verne, France
Philipp Mittendorfer, TUM, Germany
Benjamin Navarro, Université d'Orléans / Université de Montpellier, France
Stefan Escaida Navarro, Karlsruhe Institute of Technology, Germany
Veronique Perdereau, ISIR, Paris, France
Mohamed Sorour, PSA / Université de Montpellier, France
| 14:00 - 14:10 | Opening |
| 14:10 - 14:40 | Joris De Schutter - A constraint-based approach for multi-modal servo-control |
| 14:40 - 15:10 | Anh-Van Ho - Multimodal Sense of Touch: Complexity and Feasibility in HMI |
| 15:10 - 15:40 | Jaeheung Park - Active sensing strategies for contact using constraints between the robot and environment |
| 15:40 - 15:50 | Poster teasers |
| 15:50 - 16:30 | Poster session / Coffee break with refreshments |
| 16:30 - 17:00 | Eris Chinellato - Multimodal Integration in Nature: Lessons for Robotics Research |
| 17:00 - 17:30 | Stefan Escaida Navarro - Multi-Modal Robot Skins: Proximity Servoing and further Applications |
Support of IEEE RAS Technical Committees
This workshop is supported by the IEEE RAS Technical Committees on:
- Robotic Hands, Grasping and Manipulation,
- Human-Robot Interaction and Coordination,
- Whole-Body Control,
- Computer & Robot Vision,
- Humanoid Robotics.
Last updated: 01/07/2016