TouchFusion: Multimodal Wristband Sensing for Ubiquitous Touch Interactions
2026-02-16 • Human-Computer Interaction
AI summary
The authors created TouchFusion, a wristband that lets you interact with nearby surfaces by detecting your hand movements and touches, without needing cameras or extra devices. It uses several types of sensors to understand how your hand is moving and what it is touching, enabling simple gestures and trackpad-like controls on any surface, or even on your own body. The authors tested it with 100 people to make sure it works well across different wrist shapes and skin types. TouchFusion can pair with devices like smart glasses to offer easy, adaptable ways to control technology just by tapping or swiping on nearby surfaces.
surface electromyography (sEMG) • bioimpedance • inertial sensing • optical sensing • wearable sensing • gesture recognition • contextual interfaces • augmented reality • trackpad interaction • human-computer interaction
Authors
Eric Whitmire, Evan Strasnick, Roger Boldu, Raj Sodhi, Nathan Godwin, Shiu Ng, Andre Levi, Amy Karlson, Ran Tan, Josef Faller, Emrah Adamey, Hanchuan Li, Wolf Kienzle, Hrvoje Benko
Abstract
TouchFusion is a wristband that enables touch interactions on nearby surfaces without any additional instrumentation or computer vision. TouchFusion combines surface electromyography (sEMG), bioimpedance, inertial, and optical sensing to capture multiple facets of hand activity during touch interactions. Through a combination of early and late fusion, TouchFusion enables stateful touch detection on both environmental and body surfaces, simple surface gestures, and tracking functionality for contextually adaptive interfaces as well as basic trackpad-like interactions. We validate our approach on a dataset of 100 participants, significantly exceeding the population size of typical wearable sensing studies to capture a wider variance of wrist anatomies, skin conductivities, and behavioral patterns. We show that TouchFusion can enable several common touch interaction tasks. Using TouchFusion, a wearer can summon a trackpad on any surface, control contextually adaptive interfaces based on where they tap, or use their palm as an always-available touch surface. When paired with smart glasses or augmented reality devices, TouchFusion enables a ubiquitous, contextually adaptive interaction model.
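To make the fusion idea concrete, below is a minimal sketch of how early fusion (concatenating per-frame features from the sEMG, bioimpedance, inertial, and optical channels) could feed a stateful touch detector with hysteresis. The feature dimensions, thresholds, and the extract_features/TouchStateMachine names are illustrative assumptions, not the paper's implementation; the per-frame touch score would come from a trained model rather than the random stand-in used here.

```python
import numpy as np

def extract_features(semg, bioimpedance, imu, optical):
    """Early fusion: concatenate per-modality features into one vector.
    Dimensions are illustrative, not taken from the paper."""
    return np.concatenate([semg, bioimpedance, imu, optical])

class TouchStateMachine:
    """Late-fusion stage (assumed): turn per-frame touch scores into
    stateful touch-down / touch-up events using hysteresis thresholds."""
    def __init__(self, on_thresh=0.7, off_thresh=0.3):
        self.on_thresh = on_thresh
        self.off_thresh = off_thresh
        self.touching = False

    def update(self, touch_score):
        # Enter the touching state only on a confident "down" score and
        # leave it only on a confident "up" score, to avoid chatter.
        if not self.touching and touch_score > self.on_thresh:
            self.touching = True
            return "touch_down"
        if self.touching and touch_score < self.off_thresh:
            self.touching = False
            return "touch_up"
        return None

# Example: stream of fused frames scored by some classifier
# (stubbed with random features and scores for illustration).
fsm = TouchStateMachine()
rng = np.random.default_rng(0)
for frame in range(10):
    feats = extract_features(rng.normal(size=8),   # sEMG channels
                             rng.normal(size=2),   # bioimpedance
                             rng.normal(size=6),   # IMU (accel + gyro)
                             rng.normal(size=4))   # optical
    score = float(rng.random())                    # stand-in for model output
    event = fsm.update(score)
    if event:
        print(frame, event)
```

Once touch-down and touch-up events are available, the same stream of fused features could drive the contextual behaviors the abstract describes, such as summoning a trackpad on the touched surface or routing taps to surface-specific controls.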