TY - GEN
T1 - Online audio beat tracking for a dancing robot in the presence of ego-motion noise in a real environment
AU - Oliveira, João Lobato
AU - Ince, Gökhan
AU - Nakamura, Keisuke
AU - Nakadai, Kazuhiro
PY - 2012
Y1 - 2012
N2 - This paper presents the design and implementation of a real-time, real-world beat-tracking system that runs on a dancing robot. The main problem for such a robot is that, while it is moving, ego noise generated by its motors directly degrades the quality of the audio features used for beat tracking. We therefore propose incorporating ego noise reduction as a pre-processing stage prior to our tempo induction and beat tracking system. The beat tracking algorithm is based on an online strategy of competing agents that sequentially process a continuous musical input while maintaining parallel hypotheses about tempo and beats. The system is applied to a humanoid robot that processes the audio from its embedded microphones on the fly while performing simple dancing motions. A detailed, multi-criteria evaluation of the system across different music genres and varying stationary/non-stationary noise conditions shows improved performance and noise robustness, with the proposed system outperforming our conventional beat tracker (i.e., without ego noise suppression) by 15.2 points in tempo estimation and 15.0 points in beat-time prediction.
UR - http://www.scopus.com/inward/record.url?scp=84864481812&partnerID=8YFLogxK
U2 - 10.1109/ICRA.2012.6224998
DO - 10.1109/ICRA.2012.6224998
M3 - Conference contribution
AN - SCOPUS:84864481812
SN - 9781467314039
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 403
EP - 408
BT - 2012 IEEE International Conference on Robotics and Automation, ICRA 2012
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2012 IEEE International Conference on Robotics and Automation, ICRA 2012
Y2 - 14 May 2012 through 18 May 2012
ER -