Functional Force-Aware Retargeting from Virtual Human Demos to Soft Robot Policies

2026-04-01

Robotics
AI summary

The authors present SoftAct, a new way to teach soft robot hands to do tasks like humans by focusing on how the fingers touch and push objects. They use virtual reality to capture detailed human hand movements and forces, then translate these into commands for soft robot hands using a two-step process that balances forces and adjusts finger positions in real time. Their approach works better than traditional methods that only copy finger movements, especially for robots that don't look like human hands. Tests show SoftAct controls the robot fingers more accurately and succeeds more often in real-world tasks.

soft robotic hands, contact forces, virtual reality demonstrations, force-aware retargeting, finger kinematics, trajectory tracking, non-anthropomorphic robot, compliance, real-time control
Authors
Uksang Yoo, Mengjia Zhu, Evan Pezent, Jom Preechayasomboon, Jean Oh, Jeffrey Ichnowski, Amir Memar, Ben Abbatematteo, Homanga Bharadhwaj, Ashish Deshpande, Harsha Prahlad
Abstract
We introduce SoftAct, a framework for teaching soft robot hands to perform human-like manipulation skills by explicitly reasoning about contact forces. Leveraging immersive virtual reality, our system captures rich human demonstrations, including hand kinematics, object motion, dense contact patches, and detailed contact force information. Unlike conventional approaches that retarget human joint trajectories, SoftAct employs a two-stage, force-aware retargeting algorithm. The first stage attributes demonstrated contact forces to individual human fingers and allocates robot fingers proportionally, establishing a force-balanced mapping between human and robot hands. The second stage performs online retargeting by combining baseline end-effector pose tracking with geodesic-weighted contact refinements, using contact geometry and force magnitude to adjust robot fingertip targets in real time. This formulation enables soft robotic hands to reproduce the functional intent of human demonstrations while naturally accommodating extreme embodiment mismatch and nonlinear compliance. We evaluate SoftAct on a suite of contact-rich manipulation tasks using a custom non-anthropomorphic pneumatic soft robot hand. SoftAct's controller reduces fingertip trajectory tracking RMSE by up to 55 percent and tracking variance by up to 69 percent compared to kinematic and learning-based baselines. At the policy level, SoftAct achieves consistently higher success rates in zero-shot real-world deployment and in simulation. These results demonstrate that explicit modeling of contact geometry and force distribution is essential for effective skill transfer to soft robotic hands, and that this information cannot be recovered through kinematic imitation alone. Project videos and additional details are available at https://soft-act.github.io/.
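The two-stage pipeline described in the abstract can be illustrated with a minimal sketch. Stage 1 distributes robot fingers across human fingers in proportion to each finger's share of the demonstrated contact force; stage 2 nudges a baseline fingertip target toward a demonstrated contact point with a weight that decays with geodesic distance on the object surface and scales with force magnitude. All function names, the Gaussian form of the geodesic weight, and the parameters `sigma` and `gain` are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np


def allocate_fingers(human_forces, n_robot_fingers):
    """Stage 1 (sketch): force-balanced finger allocation.

    Each human finger receives robot fingers in proportion to its share
    of the total demonstrated contact force, with largest-remainder
    rounding so the allocation sums exactly to n_robot_fingers.
    """
    shares = np.asarray(human_forces, dtype=float)
    shares = shares / shares.sum()
    raw = shares * n_robot_fingers
    alloc = np.floor(raw).astype(int)
    remainder = raw - alloc
    # Hand out leftover fingers to the largest fractional remainders.
    for i in np.argsort(remainder)[::-1][: n_robot_fingers - alloc.sum()]:
        alloc[i] += 1
    return alloc


def refine_target(baseline_tip, contact_point, geodesic_dist, force_mag,
                  sigma=0.05, gain=0.01):
    """Stage 2 (sketch): geodesic-weighted contact refinement.

    Shifts the baseline fingertip target toward the demonstrated contact
    point; the shift shrinks with geodesic distance (Gaussian falloff,
    an assumed form) and grows with contact-force magnitude.
    """
    baseline_tip = np.asarray(baseline_tip, dtype=float)
    contact_point = np.asarray(contact_point, dtype=float)
    w = np.exp(-(geodesic_dist / sigma) ** 2)  # geodesic proximity weight
    return baseline_tip + w * gain * force_mag * (contact_point - baseline_tip)
```

For example, if the thumb carries 3 N and the index finger 1 N of demonstrated force, `allocate_fingers([3, 1], 4)` assigns three of four robot fingers to the thumb's role; `refine_target` then runs per control tick, so the refinement responds to contact geometry and force in real time.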