
A Predictive Visuo-Tactile Interactive Perception Framework
for Active Exploration of Object Properties

This work focuses on autonomously inferring the physical properties of homogeneous, heterogeneous, and articulated objects using a robotic system equipped with vision and tactile sensors. We propose a novel predictive perception framework that identifies the properties of diverse objects by leveraging versatile exploratory actions: non-prehensile pushing and prehensile pulling. As part of the framework, we propose a novel active shape perception method to seamlessly initiate exploration. Our dual differentiable filtering with Graph Neural Networks learns the object-robot interaction and performs consistent inference of non-observable, time-invariant object properties. In addition, we formulate an $N$-step information gain approach to actively select the most informative actions for efficient learning and inference. Extensive real-robot experiments with planar objects show that our predictive perception framework outperforms the state-of-the-art baseline, and we demonstrate the framework in three major applications: i) object tracking, ii) a goal-driven task, and iii) environment change detection.
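The $N$-step information gain criterion lends itself to a compact illustration. Below is a minimal sketch, assuming a Gaussian belief over time-invariant object properties and a linearized observation model, so the expected posterior covariance after a simulated action sequence can be rolled out in closed form. All function names, the toy observation Jacobian, and the action parameterization are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: scoring candidate N-step action sequences by expected
# information gain over a Gaussian belief about latent object properties.
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy of a multivariate Gaussian belief."""
    d = cov.shape[0]
    return 0.5 * (d * (1.0 + np.log(2.0 * np.pi)) + np.log(np.linalg.det(cov)))

def predicted_posterior_cov(cov, H, R):
    """Kalman-style covariance update for one simulated observation."""
    S = H @ cov @ H.T + R              # innovation covariance
    K = cov @ H.T @ np.linalg.inv(S)   # Kalman gain
    return (np.eye(cov.shape[0]) - K @ H) @ cov

def n_step_info_gain(cov, action_seq, obs_jacobian, R):
    """Expected entropy reduction after executing an N-step action sequence."""
    cov_n = cov.copy()
    for a in action_seq:
        cov_n = predicted_posterior_cov(cov_n, obs_jacobian(a), R)
    return gaussian_entropy(cov) - gaussian_entropy(cov_n)

# Toy usage: 3 latent properties (e.g. mass, friction, center-of-mass offset),
# candidate push/pull actions parameterized by a single contact angle.
rng = np.random.default_rng(0)
cov0 = np.eye(3)                       # prior belief covariance
R = 0.1 * np.eye(2)                    # observation noise covariance
def obs_jacobian(angle):               # stand-in for a learned sensor model
    return np.array([[np.cos(angle), np.sin(angle), 0.0],
                     [0.0, np.cos(angle), np.sin(angle)]])

candidates = [rng.uniform(0, np.pi, size=3) for _ in range(8)]   # N = 3 steps
best = max(candidates,
           key=lambda seq: n_step_info_gain(cov0, seq, obs_jacobian, R))
print("most informative 3-step action sequence:", best)
```

Because the belief is Gaussian and the observation model linearized, the posterior covariance does not depend on the actual measurement values, so each candidate sequence can be scored without sampling observations.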

Figure 1: Overview of the proposed visuo-tactile interactive perception framework for active object exploration.

Additional experimental videos and the dataset will be uploaded soon!
