Acronym LIMOMAN
Name Developmental Learning of Internal Models for Robotic Manipulation based on Motor Primitives and Multisensory Integration
Funding Reference FP7-PEOPLE-IEF-628315
URL http://limoman-project.blogspot.pt/
Dates 2014-05|2016-04
Summary

Dexterous manipulation is a key challenge for the dissemination of robots in our society: most of the tasks robots can be useful for involve some form of object manipulation. However, unlike humans, robots achieve good performance only in very controlled settings, failing to scale to unknown environments or novel objects. This project focuses on three main aspects of human motor control that can be combined to improve the performance of current robots: internal models, development and multisensory integration.
We propose the concept of “hierarchical, probabilistic and contextual” internal models, which should allow the system to cope with the main issues of motor control in the real world. A hierarchical organization of models dealing with different levels of complexity will allow the system to represent behaviors ranging from simple grasping to fine manipulation, and to respond properly to the environment depending on the available sensory information. These different levels of complexity will be acquired incrementally, through motor experience, in a developmental way. Different sensory modalities will be combined in a probabilistic (Bayesian) fashion, depending on their reliability and the associated computational cost.
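The reliability-weighted combination of modalities mentioned above can be sketched as follows, assuming each modality yields a Gaussian estimate of the same quantity (e.g. an object's position); the modality names and numeric values are illustrative only, not part of the project's actual implementation.

```python
# Minimal sketch of probabilistic (Bayesian) cue combination: each sensory
# modality provides a (mean, variance) estimate, and estimates are fused by
# precision weighting, so more reliable (lower-variance) cues dominate.

def fuse_gaussian_cues(cues):
    """Fuse a list of (mean, variance) estimates into one Gaussian.

    This is maximum-likelihood integration of independent Gaussian cues:
    fused_mean = sum(mu_i / var_i) / sum(1 / var_i),
    fused_var  = 1 / sum(1 / var_i).
    """
    precisions = [1.0 / var for _, var in cues]
    total_precision = sum(precisions)
    fused_mean = sum(p * mu for p, (mu, _) in zip(precisions, cues)) / total_precision
    fused_var = 1.0 / total_precision
    return fused_mean, fused_var

# Hypothetical example: vision (variance 1.0) is more reliable than touch
# (variance 4.0), so the fused estimate lies closer to the visual cue.
vision = (10.0, 1.0)   # visual estimate of object position (cm)
touch = (12.0, 4.0)    # tactile estimate of the same position
mean, var = fuse_gaussian_cues([vision, touch])
# mean = 10.4, var = 0.8: biased toward vision, more precise than either cue.
```

Note that the fused variance is always smaller than the variance of any single cue, which is the standard benefit of multisensory integration.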
We aim both at i) proposing a general framework for learning and control in complex systems, and ii) devising a working solution for robotic manipulation.
Moreover, due to the bio-inspired nature of the project, a secondary goal is to support hypotheses proposed by psychologists and neuroscientists about human development and learning.
The work will be implemented on the humanoid robot iCub, one of the most advanced robotic platforms for research on cognition, and will combine several results of past and ongoing European projects in the fields of dexterous manipulation (HANDLE), cognitive modeling (RobotCub and RoboSoM) and motor learning theories (Poeticon++), in which the host laboratory has been or is currently involved.

Research Groups Computer and Robot Vision Lab (VisLab)
ISR/IST Responsible Lorenzo Jamone