Ruka-v2: Tendon Driven Open-Source Dexterous Hand with Wrist and Abduction for Robot Learning
2026-03-27 • Robotics
Robotics • Artificial Intelligence
AI summary
The authors improved their previously open-sourced robot hand, called Ruka, by adding two important new features: a wrist that moves independently in two ways and fingers that can spread apart or come together. These upgrades help the robot hand better imitate how a real human hand moves, making it easier to pick up thin objects and move in tight spaces. In tests, the new Ruka-v2 hand performed tasks faster and more successfully than the original. The authors also showed that it can be used in various robot learning scenarios, and have shared all their design files and software publicly.
tendon-driven • degrees of freedom • wrist mobility • finger abduction/adduction • robot teleoperation • dexterous manipulation • robot learning • humanoid hand • open-source robotics • 3D printing
Authors
Xinqi Liu, Ruoxi Hu, Alejandro Ojeda Olarte, Zhuoran Chen, Kenny Ma, Charles Cheng Ji, Lerrel Pinto, Raunaq Bhirangi, Irmak Guzey
Abstract
The lack of accessible, dexterous robot hardware has been a significant bottleneck to achieving human-level dexterity in robots. Last year, we released Ruka, a fully open-sourced, tendon-driven humanoid hand with 11 degrees of freedom (2 per finger and 3 at the thumb), buildable for under $1,300. It was one of the first fully open-sourced humanoid hands, and it introduced a novel data-driven approach to finger control that captures tendon dynamics within the control system. Despite these contributions, Ruka lacked two degrees of freedom essential for closely imitating human behavior: wrist mobility and finger abduction/adduction. In this paper, we introduce Ruka-v2: a fully open-sourced, tendon-driven humanoid hand featuring a decoupled 2-DOF parallel wrist and abduction/adduction at the fingers. The parallel wrist adds smooth, independent flexion/extension and radial/ulnar deviation, enabling manipulation in confined environments such as cabinets. Abduction enables motions such as grasping thin objects, in-hand rotation, and calligraphy. We present the design of Ruka-v2 and evaluate it against Ruka through user studies on teleoperated tasks, finding a 51.3% reduction in completion time and a 21.2% increase in success rate. We further demonstrate its full range of applications for robot learning: bimanual and single-arm teleoperation across 13 dexterous tasks, and autonomous policy learning on 3 tasks. All 3D print files, assembly instructions, controller software, and videos are available at https://ruka-hand-v2.github.io/.