The field of robotics is undergoing a major revolution as it is increasingly applied to general purposes outside the production line: health, rehabilitation and professional services, domestic and leisure environments, as well as hazardous environments. In these settings, one keystone for robots to carry out accurate and intelligent tasks, with and for people, is their ability both to handle all sorts of objects autonomously and to use human tools. However, today’s robots are unable to achieve dexterous and fine manipulation, especially when this requires in-hand manipulation. They are far from being able to understand and reason about their environments, their goals and their own capabilities, to learn skills and improve their performance from what they have been taught and from their own experience, or to interact with their environments with the efficiency of humans.
The HANDLE project aims at understanding how humans manipulate objects in order to replicate grasping and skilled in-hand movements with an anthropomorphic artificial hand, and thereby move robot grippers from current best practice towards more autonomous, natural and effective articulated hands. The project involves not only technological development but also fundamental multidisciplinary research, in order to endow the robotic hand system with advanced perception capabilities, high-level feedback control and elements of intelligence that allow recognition of objects and context, reasoning about actions, and a high degree of recovery from failure during the execution of dexterous tasks.
Integrating findings from disciplines such as neuroscience, developmental psychology, cognitive science, robotics, multimodal perception and machine learning, we will develop a method based on an original blend of learning and predicting behaviours from imitation and “babbling”, allowing the robot to respond to gaps in its knowledge.
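To make the role of “babbling” concrete, the following minimal sketch illustrates the general idea of exploratory motor babbling: a robot issues random motor commands, records the observed outcomes, fits a simple forward model, and flags queries that fall far from anything it has explored (a gap in its knowledge). The toy two-link kinematics, function names and parameters are assumptions made for illustration only; this is not the HANDLE project's implementation.

```python
# Illustrative sketch only (not the HANDLE implementation): motor babbling
# with a toy planar 2-link arm, followed by a nearest-neighbour forward model
# that reports when a queried command lies outside the explored region.
import numpy as np

rng = np.random.default_rng(0)

def forward_kinematics(q):
    """Toy planar 2-link arm: joint angles (rad) -> fingertip position (x, y)."""
    l1, l2 = 1.0, 0.8
    x = l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1])
    y = l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])
    return np.array([x, y])

# Babbling phase: issue random joint commands and record the outcomes.
commands = rng.uniform(-np.pi, np.pi, size=(200, 2))
outcomes = np.array([forward_kinematics(q) for q in commands])

def predict(q, novelty_threshold=0.5):
    """Predict the outcome of command q from the babbling data; also report
    whether q is far from every explored command (a knowledge gap)."""
    dists = np.linalg.norm(commands - q, axis=1)
    nearest = int(np.argmin(dists))
    is_gap = dists[nearest] > novelty_threshold
    return outcomes[nearest], is_gap

predicted, gap = predict(np.array([0.3, -1.2]))
print("predicted fingertip:", predicted, "| knowledge gap:", gap)
```

In a full system, such self-generated exploration data would complement demonstrations obtained by imitation, and the detection of a knowledge gap would trigger further exploration or a request for guidance rather than a simple printout.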