Interactive Digital Humans
The goal of IDH (Interactive Digital Humans) is to develop robots that help people in healthcare and industrial scenarios.
Our expertise lies in robot software (control, perception, artificial intelligence) rather than hardware (mechanical and electronic design).
In a nutshell, we provide existing commercial robots with the cognitive capabilities needed to help humans.
First, our robots must be capable of inferring human intention using multimodal sensing.
This requires solid representations of the human model, and expertise in the use of cutting-edge sensors to update this model and to control the robot accordingly.
These sensors include 3D vision, tactile/proximity skins and even human-machine interfaces such as brain-computer interfaces (BCI) and electromyography (EMG), with an emphasis on signal processing and machine learning applied to physiological and human movement data.
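As a toy illustration of how such multimodal cues can be combined to infer intention, the sketch below fuses per-sensor likelihoods with a naive Bayes rule. The intention labels, likelihood values and the conditional-independence assumption are all illustrative, not taken from any specific IDH system.

```python
import numpy as np

# Hypothetical example: infer which of three human intentions is most
# probable, given likelihoods from three sensing modalities.
intentions = ["reach", "hand_over", "idle"]
prior = np.array([1 / 3, 1 / 3, 1 / 3])  # uniform prior over intentions

# Per-modality likelihoods P(observation | intention), assumed to come
# from upstream classifiers (3D vision, EMG, tactile skin).
vision_lik = np.array([0.7, 0.2, 0.1])
emg_lik = np.array([0.6, 0.3, 0.1])
tactile_lik = np.array([0.5, 0.4, 0.1])

# Naive Bayes fusion: multiply likelihoods under a conditional
# independence assumption, then normalise to obtain a posterior.
posterior = prior * vision_lik * emg_lik * tactile_lik
posterior /= posterior.sum()

best = intentions[int(np.argmax(posterior))]
print(best, posterior.round(3))
```

With these (made-up) numbers, the "reach" intention dominates because all three modalities agree; in practice each likelihood would be produced by a learned model over the sensor stream.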
A crucial objective is that the user controls the robot transparently, so as to ultimately feel embodied in their augmented avatar.
To achieve this feeling of embodiment, our robots are anthropomorphic, i.e., humanoid.
A major research challenge is controlling their whole body (feet, arms, head, etc.) to realise multiple simultaneous tasks, just as humans do.
This often requires planning and controlling, in real time, the robot's contact points with both the environment and the human user, in a manner that is safe for the robot, the person and the environment.
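One classical way to execute multiple simultaneous whole-body tasks is task-priority control with nullspace projection, sketched below: a secondary task is resolved only in the nullspace of the primary task's Jacobian, so it cannot disturb the primary one. The Jacobians, task dimensions and desired velocities here are illustrative placeholders, not the team's actual controller.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 7  # joints of a hypothetical 7-DOF arm
J1 = rng.standard_normal((3, n))  # primary task Jacobian (e.g. hand position)
J2 = rng.standard_normal((2, n))  # secondary task Jacobian (e.g. gaze)
x1_dot = np.array([0.1, 0.0, -0.05])  # desired primary task velocity
x2_dot = np.array([0.02, 0.01])       # desired secondary task velocity

J1_pinv = np.linalg.pinv(J1)
N1 = np.eye(n) - J1_pinv @ J1  # nullspace projector of the primary task

# Joint velocities: achieve the primary task exactly, and the secondary
# task as well as possible without disturbing the primary one.
q_dot = J1_pinv @ x1_dot + N1 @ np.linalg.pinv(J2 @ N1) @ (
    x2_dot - J2 @ J1_pinv @ x1_dot
)

# Since J1 has full row rank here, the primary task is realised exactly.
print(np.allclose(J1 @ q_dot, x1_dot))
```

Modern whole-body controllers typically replace this strict hierarchy with weighted quadratic programs that also handle contact forces, joint limits and safety constraints, but the projection idea above is the underlying principle.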
Recently, we have pushed these aspects further, towards non-conventional manipulation, including intentional impacts and the intentional deformation of soft objects.
Permanent Members
Sofiane Ramdani, Associate Professor, UM
Abderrahmane Kheddar, Research Director, CNRS
Ganesh Gowrishankar, Researcher, CNRS
Andrea Cherubini, Professor, UM
Philippe Fraisse, Professor, UM
Associates & Students
Ahmed Zermane, Algerian Government
Antonin Dallard, CNRS
Ege Gürsoy, UM
Meriem Naamani, CNRS
Louise Scherrer, CNRS
Celia Saghour, UM
Carole Fournier, CNRS
Saber Riadh Khaldi, Algerian Government
Sandra Victor, CNRS
Enzo Indino, CNRS
Hugo Lefevre, CNRS
Lea Boillereaux, UM
Guillaume Gourmelen, Fixed-term Teaching & Research Fellow, UM
Nadine Jacquet, Fixed-term Engineer/Technician, CNRS
Youcan Yan, Fixed-term Researcher, CNRS
Leo Moussafir, Fixed-term Researcher, CNRS
Yukiko Iwasaki, Fixed-term Researcher, CNRS
Julien Roux, Fixed-term Teaching & Research Fellow, UM
Yale Lee, External PhD Student, Université Côte d’Azur
Arnaud Tanguy, Fixed-term Engineer/Technician, CNRS
Wanchen Li, Fixed-term Engineer/Technician, UM
- Representing and modeling human motion (Fraisse, Cherubini, Ramdani)
- Multimodal sensing and control (Kheddar, Fraisse, Cherubini)
- Physiological signal processing (Kheddar, Ramdani, Ganesh)
- Data analysis for human-machine interfaces (Kheddar, Ramdani, Ganesh)
- Whole-body robot control (Kheddar, Fraisse)
- Multicontact control and planning (Kheddar)
- Physical human-robot interaction (Kheddar, Fraisse, Cherubini, Ganesh)
- Non-conventional manipulation (Kheddar, Cherubini)
- Embodied robotics and virtual reality (Ganesh, Kheddar)
MovCap (Mocap on the Move)