Abstract

Grasping is a crucial task in robotics, and robustly grasping objects under various conditions and with differing physical properties necessitates tactile feedback and reactive grasp adjustments. In this article, we introduce LeTac-MPC, a learning-based model predictive control (MPC) approach for tactile-reactive grasping. Our approach enables the gripper to grasp objects with different physical properties in dynamic and force-interactive tasks. We utilize a vision-based tactile sensor, GelSight (Yuan et al. 2017), which is capable of perceiving high-resolution tactile feedback that contains information on the physical properties and states of the grasped object. LeTac-MPC incorporates a differentiable MPC layer designed to model the embeddings extracted by a neural network from tactile feedback. This design enables convergent and robust grasping control at a frequency of 25 Hz. We propose a fully automated data collection pipeline and collect a dataset using only standardized blocks with different physical properties; nevertheless, the trained controller generalizes to everyday objects with different sizes, shapes, materials, and textures. The experimental results demonstrate the effectiveness and robustness of the proposed approach. We compare LeTac-MPC with two purely model-based tactile-reactive controllers (MPC and PD) and with open-loop grasping; LeTac-MPC achieves the best performance in dynamic and force-interactive tasks, as well as the best generalizability.
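To make the described architecture concrete, the sketch below shows one way a tactile encoder could feed a differentiable MPC layer. This is a minimal illustration, not the paper's implementation: all names, dimensions, and the learned linear embedding dynamics are assumptions, and the MPC is reduced to an unconstrained linear-quadratic problem so it admits a closed-form, fully differentiable solution.

# Hypothetical sketch: CNN encoder for tactile frames plus a
# differentiable finite-horizon MPC layer over the embedding.
import torch
import torch.nn as nn

class TactileEncoder(nn.Module):
    """Maps a tactile frame (e.g., a 3x240x320 GelSight image) to an embedding."""
    def __init__(self, embed_dim: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, embed_dim),
        )

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        return self.net(img)

class DifferentiableMPC(nn.Module):
    """Finite-horizon LQ problem over the embedding, solved in closed form.

    Minimizes sum_t ||z_t||^2 + r * ||u_t||^2 subject to the learned
    linear dynamics z_{t+1} = A z_t + B u_t. Stacking the horizon into
    one least-squares problem keeps the layer differentiable, so A and B
    can be trained end-to-end together with the encoder.
    """
    def __init__(self, embed_dim: int = 8, horizon: int = 5, r: float = 1e-2):
        super().__init__()
        self.A = nn.Parameter(0.9 * torch.eye(embed_dim))
        self.B = nn.Parameter(0.1 * torch.randn(embed_dim, 1))
        self.horizon, self.r = horizon, r

    def forward(self, z0: torch.Tensor) -> torch.Tensor:
        # Prediction matrices: stacked future state z = F z0 + G u.
        n, m, T = self.A.shape[0], self.B.shape[1], self.horizon
        F = torch.cat([torch.matrix_power(self.A, t + 1) for t in range(T)])
        G = torch.zeros(T * n, T * m)
        for i in range(T):
            for j in range(i + 1):
                G[i*n:(i+1)*n, j*m:(j+1)*m] = torch.matrix_power(self.A, i - j) @ self.B
        # Closed-form minimizer of ||F z0 + G u||^2 + r ||u||^2.
        H = G.T @ G + self.r * torch.eye(T * m)
        u = torch.linalg.solve(H, -G.T @ F @ z0.unsqueeze(-1))
        return u[:m].squeeze(-1)  # receding horizon: apply only the first control

# Example closed-loop step at (nominally) 25 Hz: encode the current
# tactile frame and compute a gripper command for this control step.
encoder, mpc = TactileEncoder(), DifferentiableMPC()
frame = torch.randn(1, 3, 240, 320)   # placeholder tactile image
du = mpc(encoder(frame)[0])

A real tactile-reactive controller would additionally need gripper constraints (e.g., width and rate limits), for which a differentiable QP layer such as cvxpylayers or mpc.pytorch would be a natural substitute for the closed-form solve used here.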

Comments

This is the author-accepted manuscript of Z. Xu and Y. She, "LeTac-MPC: Learning Model Predictive Control for Tactile-Reactive Grasping," in IEEE Transactions on Robotics, vol. 40, pp. 4376-4395, 2024. (c) 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works. The version of record is available at DOI: 10.1109/TRO.2024.3463470.

Keywords

Grasping; Tactile sensors; Robots; Real-time systems; Grippers; Dynamics; Shape; Deep learning in robotics and automation; perception for grasping and manipulation; tactile control

Date of this Version

2024
