Beat Tracking for Interactive Dancing Robots

Joao Lobato Oliveira, Gokhan Ince, Keisuke Nakamura, Kazuhiro Nakadai, Hiroshi G. Okuno, Fabien Gouyon, Luis Paulo Reis

Research output: Contribution to journal › Article › peer-review

10 Citations (Scopus)

Abstract

Dance movement is intrinsically connected to the rhythm of music and is a fundamental form of nonverbal communication present in daily human interactions. In order to enable robots to interact with humans in natural real-world environments through dance, these robots must be able to listen to music, robustly track the beat of continuous musical stimuli, and simultaneously respond to human speech. In this paper, we propose the integration of a real-time beat tracking system with state recovery, combined with different preprocessing solutions used in robot audition, for application to interactive dancing robots. The proposed system is assessed under real-world acoustic conditions of increasing complexity, which involve multiple audio sources of different kinds, multiple noise sources of different natures, continuous musical and speech stimuli, and the effects of beat-synchronous ego-motion noise and of jitter in the ego noise (EN). The overall results show improved beat tracking accuracy with lower reaction times to music transitions, while also improving automatic speech recognition (ASR) running in parallel under the most challenging conditions. These results support the application of the proposed system to interactive dancing robots.
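For readers unfamiliar with the basic machinery the abstract refers to, the sketch below illustrates, in broad strokes, what a minimal causal beat tracker can look like: spectral-flux onset strength, autocorrelation-based tempo estimation, and greedy beat-phase alignment. This is not the authors' system (which combines a real-time beat tracker with state recovery and robot-audition preprocessing such as noise suppression); every function name, parameter value, and the synthetic test signal here is a hypothetical choice made purely for illustration.

```python
# Illustrative sketch only: a minimal causal beat tracker built from
# spectral-flux onset strength and autocorrelation-based tempo estimation.
# This is NOT the article's beat tracking system; names and parameters
# are hypothetical and chosen for readability.

import numpy as np

def onset_strength(audio, sr, frame=1024, hop=512):
    """Half-wave-rectified spectral flux per analysis frame."""
    window = np.hanning(frame)
    n_frames = 1 + (len(audio) - frame) // hop
    prev_mag = np.zeros(frame // 2 + 1)
    flux = np.zeros(n_frames)
    for i in range(n_frames):
        seg = audio[i * hop:i * hop + frame] * window
        mag = np.abs(np.fft.rfft(seg))
        flux[i] = np.sum(np.maximum(mag - prev_mag, 0.0))
        prev_mag = mag
    return flux, sr / hop  # onset envelope and its frame rate

def estimate_tempo(flux, frame_rate, bpm_range=(60, 180)):
    """Pick the autocorrelation lag with the strongest periodicity."""
    flux = flux - flux.mean()
    ac = np.correlate(flux, flux, mode="full")[len(flux) - 1:]
    min_lag = int(frame_rate * 60.0 / bpm_range[1])
    max_lag = int(frame_rate * 60.0 / bpm_range[0])
    lag = min_lag + int(np.argmax(ac[min_lag:max_lag]))
    return 60.0 * frame_rate / lag, lag

def track_beats(flux, lag):
    """Greedy phase alignment: place beats near onset peaks, one period apart."""
    beats = [int(np.argmax(flux[:lag]))]  # strongest onset in the first period
    while beats[-1] + lag < len(flux):
        centre = beats[-1] + lag
        lo, hi = max(0, centre - lag // 8), min(len(flux), centre + lag // 8 + 1)
        beats.append(lo + int(np.argmax(flux[lo:hi])))  # snap to a nearby onset peak
    return np.array(beats)

if __name__ == "__main__":
    sr = 16000
    t = np.arange(sr * 10) / sr
    # Synthetic percussive test signal: clicks at 120 BPM buried in noise.
    audio = 0.05 * np.random.randn(len(t))
    audio[np.arange(0, len(t), sr // 2)] += 1.0
    flux, frame_rate = onset_strength(audio, sr)
    bpm, lag = estimate_tempo(flux, frame_rate)
    beats = track_beats(flux, lag)
    print(f"Estimated tempo: {bpm:.1f} BPM, {len(beats)} beats tracked")
```

In a robot-audition setting such as the one described in the abstract, an onset envelope of this kind would be computed on a preprocessed (noise-suppressed) signal, and a state-recovery mechanism would re-estimate tempo and phase whenever the music changes; those components are beyond this toy sketch.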

Original language: English
Article number: 1550023
Journal: International Journal of Humanoid Robotics
Volume: 12
Issue number: 4
DOIs
Publication status: Published - 1 Dec 2015

Bibliographical note

Publisher Copyright:
© 2015 World Scientific Publishing Company.

Funding

This work was partially supported by the PhD scholarship SFRH/BD/43704/2008, endorsed by the Portuguese Government through FCT, and partially supported by the Istanbul Technical University Scientific Research Projects Foundation under contract BAP 37537.

Funders | Funder number
Istanbul Technical University Scientific Research Projects Foundation | BAP 37537
Japan Society for the Promotion of Science | 24220006
Fundação para a Ciência e a Tecnologia |

Keywords

• beat tracking
• human-robot interaction
• noise suppression
• robot audition
• robot dancing
