TY - JOUR
T1 - Object-aware interactive perception for tabletop scene exploration
AU - Koc, Cagatay
AU - Sariel, Sanem
N1 - Publisher Copyright:
© 2024 Elsevier B.V.
PY - 2024/5
Y1 - 2024/5
N2 - Recent advancements in sensors and deep learning techniques have improved the reliability of robotic perception systems, but current systems are not robust enough for real-world challenges such as occlusions and sensing uncertainties in cluttered scenes. To overcome these issues, active or interactive perception actions are often necessary, such as repositioning a sensor or manipulating an object to reveal more information about the scene. Existing perception systems lack a comprehensive approach that incorporates both active and interactive action spaces, thereby limiting the robot's perception capabilities. Moreover, these systems focus on exploring a single object or scene without utilizing object information to guide the exploration of multiple objects. In this work, we propose an object-aware hybrid perception system that selects the next best action by considering both active and interactive action spaces and enhances the selection process with an object-aware approach to guide a cognitive robot operating in tabletop scenarios. Novel volumetric utility metrics are used to evaluate actions that include positioning sensors from a heterogeneous set or manipulating objects to gain a better perspective of the scene. The proposed system maintains a volumetric representation of the scene that includes semantic information about objects, enabling it to exploit object information, associate occlusions with the corresponding objects, and make informed decisions about object manipulation. We evaluate the performance of our system in both simulated and real-world experiments using a Baxter robotic platform equipped with two arms and RGB and depth cameras. Our experimental results show that the proposed system outperforms the compared state-of-the-art methods in the given scenarios, achieving an 11.2% performance increase.
AB - Recent advancements in sensors and deep learning techniques have improved the reliability of robotic perception systems, but current systems are not robust enough for real-world challenges such as occlusions and sensing uncertainties in cluttered scenes. To overcome these issues, active or interactive perception actions are often necessary, such as repositioning a sensor or manipulating an object to reveal more information about the scene. Existing perception systems lack a comprehensive approach that incorporates both active and interactive action spaces, thereby limiting the robot's perception capabilities. Moreover, these systems focus on exploring a single object or scene without utilizing object information to guide the exploration of multiple objects. In this work, we propose an object-aware hybrid perception system that selects the next best action by considering both active and interactive action spaces and enhances the selection process with an object-aware approach to guide a cognitive robot operating in tabletop scenarios. Novel volumetric utility metrics are used to evaluate actions that include positioning sensors from a heterogeneous set or manipulating objects to gain a better perspective of the scene. The proposed system maintains a volumetric representation of the scene that includes semantic information about objects, enabling it to exploit object information, associate occlusions with the corresponding objects, and make informed decisions about object manipulation. We evaluate the performance of our system in both simulated and real-world experiments using a Baxter robotic platform equipped with two arms and RGB and depth cameras. Our experimental results show that the proposed system outperforms the compared state-of-the-art methods in the given scenarios, achieving an 11.2% performance increase.
KW - Active perception
KW - Cognitive robotics
KW - Interactive perception
KW - Robotic scene exploration
KW - View evaluation
KW - Volumetric utility metrics
UR - http://www.scopus.com/inward/record.url?scp=85188029121&partnerID=8YFLogxK
U2 - 10.1016/j.robot.2024.104674
DO - 10.1016/j.robot.2024.104674
M3 - Article
AN - SCOPUS:85188029121
SN - 0921-8890
VL - 175
JO - Robotics and Autonomous Systems
JF - Robotics and Autonomous Systems
M1 - 104674
ER -