Sparse Haptic Proxy: Touch Feedback in Virtual Environments Using a General Passive Prop

Abstract

We propose a class of passive haptics that we call Sparse Haptic Proxy: a set of geometric primitives that simulate touch feedback in elaborate virtual reality scenes. Unlike previous passive haptics that replicate the virtual environment in physical space, a Sparse Haptic Proxy simulates a scene's detailed geometry by redirecting the user's hand to a matching primitive of the proxy. To bridge the divergence of the scene from the proxy, we augment an existing Haptic Retargeting technique with on-the-fly target remapping: we predict users' intentions during interaction in the virtual space by analyzing their gaze and hand motions, and consequently redirect their hand to a matching part of the proxy. We conducted three user studies on our haptic retargeting technique and implemented our system based on their three main results: 1) The maximum angle participants found acceptable for retargeting their hand is 40°, with an average rating of 4.6 out of 5. 2) Tracking participants' eye gaze reliably predicts their touch intentions (97.5%), even while simultaneously manipulating the user's hand-eye coordination for retargeting. 3) Participants preferred minimized retargeting distances over better-matching surfaces of our Sparse Haptic Proxy when receiving haptic feedback for single-finger touch input. We demonstrate our system with two virtual scenes: a flight cockpit and a room quest game. While their scene geometries differ substantially, both use the same Sparse Haptic Proxy to provide haptic feedback to the user during task completion.
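The mechanism described above pairs a gaze-based prediction of which virtual surface the user is reaching for with a body-warping style of retargeting that offsets the rendered hand toward the matching proxy primitive. The sketch below illustrates one way such a pipeline could look; the function names, the closest-ray gaze predictor, and the linear warp ratio are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def predict_target(gaze_origin, gaze_dir, candidate_targets):
    """Pick the candidate virtual surface whose center lies closest to the
    gaze ray. A stand-in for gaze-based intention prediction; the paper's
    predictor also takes hand motion into account."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best, best_dist = None, float("inf")
    for target in candidate_targets:
        to_target = target["virtual_pos"] - gaze_origin
        along = np.dot(to_target, gaze_dir)
        # Perpendicular distance from the target center to the gaze ray.
        dist = np.linalg.norm(to_target - along * gaze_dir)
        if along > 0 and dist < best_dist:
            best, best_dist = target, dist
    return best

def retargeted_hand(real_hand, reach_start, target):
    """Body-warping retargeting in the spirit of Azmandian et al. (2016):
    render the virtual hand with an offset that grows as the real hand
    approaches the physical proxy primitive, so that touching the proxy
    coincides with touching the virtual surface. The linear ramp of the
    warp ratio is an assumption for illustration."""
    virtual_target = target["virtual_pos"]   # where the user sees the surface
    physical_target = target["proxy_pos"]    # matching primitive on the proxy
    total = np.linalg.norm(physical_target - reach_start)
    travelled = np.linalg.norm(real_hand - reach_start)
    alpha = np.clip(travelled / max(total, 1e-6), 0.0, 1.0)
    offset = virtual_target - physical_target
    return real_hand + alpha * offset
```

In this sketch, the predicted target would be re-evaluated as the reach unfolds (the "on-the-fly" remapping), and the 40° result above would act as a cap on how far the chosen virtual surface may diverge angularly from its proxy primitive.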

Publication
Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems