TY - GEN
T1 - Towards automated human-robot mutual gaze
AU - Broz, Frank
AU - Kose-Bagci, Hatice
AU - Nehaniv, Chrystopher L.
AU - Dautenhahn, Kerstin
PY - 2011
Y1 - 2011
N2 - The role of gaze in interaction has been an area of increasing interest to the field of human-robot interaction. Mutual gaze, the pattern of behavior that arises when humans look directly at each other's faces, sends important social cues communicating attention and personality traits and helping to regulate conversational turn-taking. In preparation for learning a computational model of mutual gaze that can be used as a controller for a robot, data from human-human pairs in a conversational task was collected using a gaze-tracking system and face-tracking algorithm. The overall amount of mutual gaze observed between pairs agreed with predictions from the psychology literature. But the duration of mutual gaze was shorter than predicted, and the amount of direct eye contact detected was, surprisingly, almost nonexistent. The results presented show the potential of this automated method to capture detailed information about human gaze behavior, and future applications for interaction-based robot language learning are discussed. The analysis of human-human mutual gaze using automated tracking allows further testing and extension of past results that relied on hand-coding and can provide both a method of data collection and input for control of interactive robots.
KW - Human-robot interaction
KW - Markov model
KW - Mutual gaze
KW - Psychology
UR - http://www.scopus.com/inward/record.url?scp=84883097959&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84883097959
SN - 9781612081175
T3 - ACHI 2011 - 4th International Conference on Advances in Computer-Human Interactions
SP - 222
EP - 227
BT - ACHI 2011 - 4th International Conference on Advances in Computer-Human Interactions
T2 - 4th International Conference on Advances in Computer-Human Interactions, ACHI 2011
Y2 - 23 February 2011 through 28 February 2011
ER -