Funding Reference: FCT - PTDC/EEA-ACR/71032/2006
There is an increasing interest in advanced human-robot interfaces due to a growing need for “service robots” designed to perform a variety of assistive tasks in human-inhabited environments. Head and eye movements are particularly important for human-humanoid interaction because they constitute a highly attended and communicative part of the human body, able to convey emotions and express intentions and goals. On one hand, the way a robot directs its gaze toward targets may elicit different emotional interpretations from the user: fast motions may indicate deep engagement in a task in time-critical or dangerous situations, while smooth motions may indicate idleness and availability for interaction. On the other hand, the visual locations on which a robot concentrates its attention convey information about the objects and spatial locations relevant to the current task, drawing the user's attention to the important items in the scene (shared attention). Concentrating the direction of observation on particular objects or humans indicates whether the robot is engaged in a well-defined task or intends to interact with the human. Both modalities constitute basic implicit communication skills that will contribute to the development of advanced human-humanoid interfaces.
Computer and Robot Vision Lab (VisLab)
Project Partners: University of Uppsala (SE)
Biomimetic Oculomotor Control for Humanoid Robots