Naturally occurring gestures in a human–robot teaching scenario
Abstract: This paper describes our general framework for investigating how human gestures can facilitate interaction and communication between humans and robots. Two studies were carried out to reveal which "naturally occurring" gestures can be observed in a scenario where users had to explain to a robot how to perform a home task. Both studies followed a within-subjects design: participants had to demonstrate to a robot how to lay a table using two different methods, either gestures alone or gestures combined with speech. The first study enabled the validation of the COGNIRON coding scheme for human gestures in Human–Robot Interaction (HRI). Based on the data collected in both studies, an annotated video corpus was produced, and characteristics such as the frequency and duration of the different gestural classes were gathered to help capture requirements for designers of HRI systems. The results of the first study regarding the frequencies of the gestural types suggest an interaction between the order of presentation of the two methods and the type of gestures produced. However, analysis of the speech produced along with the gestures revealed no differences due to the ordering of the experimental conditions. The second study expanded the issues addressed by the first: we aimed to extend the role of the interaction partner (the robot) by introducing positive acknowledgement of the participants' activity. In contrast to the first study, the results show no significant differences in the distribution of gestures (frequency and duration) between the two explanation methods. Implications for HRI are discussed, focusing on issues relevant to the design of the robot's communication skills to support the interaction loop with humans in home scenarios.
Document Type: Regular Paper
Publication date: 2008-12-01
Published in: Social Behaviour and Communication in Biological and Artificial Systems