See, Touch, and Hear: 2nd Workshop on multimodal sensor-based robot control for HRI and soft manipulation

Organizers: A. Cherubini, Y. Mezouar, D. Navarro-Alarcon, J. A. Corrales Ramon, R. Haschke

Objectives

The recent development of bio-inspired sensors (which are nowadays affordable and lightweight) has spurred their use on robots, particularly on anthropomorphic ones (e.g., humanoids and dexterous hands). Such sensors include RGB-D cameras, tactile skins, microphones, joint torque sensors, and capacitive proximity sensors.

Since these sensors generally measure different physical phenomena, it is preferable to use them directly in the low-level servo controller rather than to apply multi-sensory fusion or to design complex state machines. We believe that adaptive sensor-based methods directly linking perception to action can provide better solutions in unpredictable scenarios than traditional planning and model-based techniques, which require a priori models of the environment. This idea, originally proposed in the hybrid position-force control paradigm, brings new challenges to controller design when extended to feedback from multiple sensors, e.g., challenges related to the sensors' characteristics (synchronization, hybrid control, task compatibility, etc.) or to the task representation.
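To make the idea concrete, below is a toy sketch of a servo loop that feeds visual and force measurements directly into a single task-space velocity command, in the spirit of hybrid position-force control. It is only an illustration: the gains, signal names, and selection matrix are hypothetical, not a reference implementation.

    import numpy as np

    def multimodal_servo(visual_err, force_err, S, k_v=1.0, k_f=0.005):
        """One hybrid vision/force control step in the task frame.
        S is a diagonal selection matrix choosing, per axis, whether
        the visual loop or the force loop drives the motion."""
        v_vision = k_v * visual_err   # velocity demand from visual servoing
        v_force = k_f * force_err     # velocity demand from an F/T sensor
        return S @ v_vision + (np.eye(S.shape[0]) - S) @ v_force

    # Example: vision drives x and y, force regulates contact along z.
    S = np.diag([1.0, 1.0, 0.0])
    v_cmd = multimodal_servo(np.array([0.02, -0.01, 0.00]),  # position error (m)
                             np.array([0.00, 0.00, 2.50]),   # force error (N)
                             S)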

This multimodal approach best mirrors our own cognitive processes (which directly link perception and action), and is fundamental in many innovative robotic applications, such as human-robot interaction, soft material manipulation, and whole-body control.

Human-robot interaction for collaborative tasks often relies on force/tactile feedback to transmit the user's intention to the robot. However, the robot should also be capable of recognizing this intention when there is no direct contact with the human. Possible solutions come from audio and/or visual data, which should then be combined with haptics to obtain the best result. These approaches are particularly useful in whole-body control of humanoid robots, since their actuators and sensors are generally bio-inspired to facilitate interaction with humans.

The automatic manipulation of soft materials (e.g., in the food industry) is a second important case study. The natural evolution of recent works on vision-based servoing of soft objects is the integration of haptic and force feedback.

Whole-body control is a third field of research that would greatly profit from the discussed methods. In fact, multiple tasks (manipulation, self-collision avoidance, etc.) can be realized simultaneously by exploiting the diverse sensing capabilities of the robot body.

We propose a half-day workshop to foster active collaboration and to discuss formal methods for sensor-based control. The purpose of the workshop is to bring together researchers with common interests in multimodal control based on a variety of sensed signals, including vision (2D and 3D), touch (haptics), audio, position, force, and proximity (from capacitive measurements). The invited speakers will share their experience and give insight into the evolution and current status of multimodal control. The workshop will also be open to paper submissions, and the final schedule will be adapted depending on the quantity and quality of the submissions. We will organize a poster session for the submitted papers, to ease interaction and discussion among participants.

Topics of interest

  • hands-on applications where multimodal control is necessary
  • whole-body control with heterogeneous sensors
  • bio-inspired approaches to multimodal control
  • theoretical foundations of multimodal control (e.g., task frame approaches or constraint-based task specification)
  • new trends in sensor-based control, based on prospective integration with other modalities (e.g., visual deformation servoing, tactile/haptic servoing, robot audition, proximity servoing)

Previous edition

Follow this link to the website of the "1st Workshop on multimodal sensor-based robot control for HRI and soft manipulation" at IROS 2015 (see photos below).

[Photos from the 1st edition: workshop attendants and poster sessions]

Submission information

Prospective participants are required to submit an extended abstract (maximum 2 pages); accompanying videos are also welcome!
All submissions will be reviewed using a single-blind review process.
Accepted contributions will be presented during the workshop as posters.
Submissions must be sent as a PDF, following the two-column IEEE conference style, to:

cherubini_AT_lirmm_DOT_fr

indicating [IROS 2016 Workshop] in the e-mail subject.

Submission Deadline: August 29
Notification of acceptance: September 12
Camera-ready deadline: September 16
Workshop day: October 10

The organizing committee is preparing a Special Issue of the journal Robotics and Autonomous Systems, which will include extended versions of the best papers; see:
http://www.journals.elsevier.com/robotics-and-autonomous-systems/call-for-papers/special-issue-on-multimodal-sensor-based-robot-control-for-i/
(select “VSI:Multimodal robot control” as “Article Type Name” in the submission process)

Program Committee

Kaspar Althoefer, King's College London, United Kingdom
Yasemin Bekiroglu, University of Birmingham, United Kingdom
João Bimbo, King's College London, United Kingdom
Jeannette Bohg, Max-Planck Institute for Intelligent Systems, Tübingen, Germany
Gianni Borghesan, University of Leuven, Belgium
Robert Haschke, Universität Bielefeld, Germany
Björn Hein, Karlsruhe Institute of Technology, Germany
Norman Hendrich, TAMS, Universität Hamburg, Germany
Anh-Van Ho, Ryukoku University, Japan
Jun Kinugawa, Tohoku University, Japan
Darwin Lau, The Chinese University of Hong Kong, China
Qiang Li, Universität Bielefeld, Germany
Philip Edward Long, IRT Jules Verne, France
Kazuhiro Nakadai, Honda Research, Japan
Lorenzo Natale, IIT Genova, Italy
Stefan Escaida Navarro, Karlsruhe Institute of Technology, Germany
Jaeheung Park, Seoul National University, South Korea
Veronique Perdereau, ISIR, Paris, France
Hesheng Wang, Shanghai Jiao Tong University, China
Li Xiang, The Chinese University of Hong Kong, China

Program

14:00 - 14:15 Opening
14:15 - 14:45 Lorenzo Natale - Seeing and touching: multimodal perception on the iCub robot
The iCub is a humanoid robot shaped like a four-year-old child. Its sensory system includes cameras, an inertial unit, F/T sensors, and a tactile system distributed over the arms and legs. In this talk I will illustrate how these sensors have been used to control the interaction between the robot and the environment. I will show examples of whole-body control, object manipulation, and object perception.
14:45 - 15:15 Kazuhiro Nakadai - Robot Audition and Its Application to Rescue Robots
Robot audition was proposed in 2000 to realize a robot's auditory functions with its own ears. Since robots are exposed to surrounding noise sources, including ego-noise, we defined three main auditory functions for robots: sound source localization, sound source separation, and automatic speech recognition of the separated speech in acoustically noisy conditions. We developed microphone array processing for these functions and, by compiling the reported techniques, released open-source software for robot audition called HARK (Honda Research Institute Japan Audition for Robots with Kyoto University). We are now deploying HARK in many fields, such as ICT, cars, ethology, and ecology. In this talk, I will focus on rescue-robot applications, i.e., a UAV with a microphone array that searches for people in disaster situations, as part of the JST ImPACT Tough Robotics Challenge. We showed that the developed UAV can detect sound sources in noisy conditions where the signal-to-noise ratio is around -15 dB.
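As a minimal illustration of the first of these functions, sound source localization, the sketch below estimates the time difference of arrival between two microphones with GCC-PHAT, a standard technique. This is not HARK code, and the signal names and parameters are hypothetical.

    import numpy as np

    def gcc_phat(sig, ref, fs, max_tau=None):
        """Time difference of arrival (seconds) between two microphone
        channels, estimated with the phase transform (GCC-PHAT)."""
        n = len(sig) + len(ref)
        SIG = np.fft.rfft(sig, n=n)
        REF = np.fft.rfft(ref, n=n)
        R = SIG * np.conj(REF)
        R /= np.abs(R) + 1e-12          # PHAT weighting: keep phase only
        cc = np.fft.irfft(R, n=n)
        max_shift = n // 2
        if max_tau is not None:
            max_shift = min(int(fs * max_tau), max_shift)
        cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
        return (np.argmax(np.abs(cc)) - max_shift) / fs

    # For two microphones spaced d metres apart, the source bearing follows
    # from sin(theta) = tau * c / d, with speed of sound c ~ 343 m/s.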
15:15 - 15:30 Aly Magassouba et al. - Binaural auditory interaction without HRTF for humanoid robots: A sensor-based control approach
15:30 - 16:00 Coffee break with refreshments
16:00 - 16:30 Kaspar Althoefer - Force and tactile sensing in soft and flexible robot arms
In my talk, I will present an overview of recent developments in my research group pertaining to the creation of sensors that are particularly suited for integration with soft and flexible robot arms. Developing appropriate sensors for such robots and integrating them with the robot structure is challenging, since the main intrinsic features of these robots, namely their flexibility and softness, are not to be compromised by the introduction of the sensing elements. I will report on the sensing concepts and prototypes we have developed.
16:30 - 16:45 Stefan Escaida Navarro et al. - Flexible Spatial Resolution for Preshaping with a Modular Capacitive Tactile Proximity Sensor
16:45 - 17:15 Robert Haschke - Tactile Servoing – Tactile-Based Robot Control
Modern tactile sensor arrays provide contact force information at frame rates suitable for robot control. Using machine learning techniques such as deep learning, we can learn to detect incipient slippage, to control grasping forces, and to explore object surfaces, all without knowing the shape, weight, or friction properties of the object.
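As a rough sketch of the tactile-servoing idea (not the speaker's implementation), one can treat the array reading as an image, extract simple contact features, and close a proportional loop on them. The gains and the fingertip-twist convention below are hypothetical.

    import numpy as np

    def tactile_features(pressure_img):
        """Contact centroid (in taxel coordinates) and total normal
        force from a 2-D tactile array reading."""
        total = pressure_img.sum()
        if total < 1e-6:                     # no contact detected
            return None, 0.0
        ys, xs = np.indices(pressure_img.shape)
        centroid = np.array([(xs * pressure_img).sum(),
                             (ys * pressure_img).sum()]) / total
        return centroid, total

    def tactile_servo_step(pressure_img, target_xy, target_force,
                           k_pos=0.01, k_f=0.001):
        """One proportional step: fingertip velocity (dx, dy in the
        sensor plane, dz along the contact normal)."""
        centroid, force = tactile_features(pressure_img)
        if centroid is None:
            return np.array([0.0, 0.0, k_f * target_force])  # approach
        v_xy = k_pos * (target_xy - centroid)  # slide toward target contact
        v_z = k_f * (target_force - force)     # regulate contact force
        return np.array([v_xy[0], v_xy[1], v_z])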
17:15 - 17:30 Joshua Wade et al. - Force and Thermal Sensing with a Fabric-based Skin
17:30 - 17:45 Angel Delgado et al. - Tactile Image Interface for Tactile Sensors Based on a Method of Mixture of Gaussians with Adaptable Deviations
17:45 - 18:15 Yasemin Bekiroglu - Learning Approaches for Robust Grasping and Manipulation based on Experience
To perform robust grasping, a robot should be able to assess and adapt its grasp configuration to deal with uncertainties about the physical properties of objects, e.g., the object's weight and the friction at the contact points. A change of grasp configuration, based on corrective actions, depends on the controller, the employed sensory feedback, and the type of uncertainties inherent to the problem. In this talk, I will present probabilistic approaches that use real sensory data obtained from exploration (e.g., visual and tactile) to construct object models, to assess grasp success (which can be used to trigger plan corrections), and to find better grasp configurations when failure is predicted.
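A minimal sketch of this kind of grasp-success assessment is given below, with a simple logistic-regression stand-in and synthetic data in place of real tactile/visual recordings; the probabilistic models discussed in the talk are more elaborate than this.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical training data: one tactile feature vector per grasp
    # attempt (e.g., flattened taxel pressures at grasp completion) and
    # a binary label from lifting trials (1 = held, 0 = dropped).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 32))               # stand-in for recorded features
    y = (X[:, :4].sum(axis=1) > 0).astype(int)   # stand-in labels

    clf = LogisticRegression(max_iter=1000).fit(X, y)

    def assess_grasp(features, threshold=0.7):
        """Probability of grasp success; trigger a corrective regrasp
        when the predicted confidence falls below the threshold."""
        p = clf.predict_proba(features.reshape(1, -1))[0, 1]
        return p, (p < threshold)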
18:15 - 18:30 Juan Rojas et al. - Growing A Robot’s Haptic Awareness

Support of IEEE RAS Technical Committees

This workshop is supported by the IEEE RAS Technical Committees on:
- Whole-Body Control,
- Humanoid Robotics,
- Robotic Hands, Grasping and Manipulation,
- Haptics,
- Human-Robot Interaction & Coordination.
The number of Technical Committees supporting the workshop confirms its relevance for the RAS community.

Last update on 09/03/2017