Abstract:
We introduce WeARHand, which allows a user to manipulate virtual 3D objects
with a bare hand in a wearable augmented reality (AR) environment. Our method
requires no environmentally tethered tracking devices; instead, it localizes both
a pair of near-range and far-range RGB-D cameras mounted on a head-worn display
and the user's moving bare hand in 3D space by exploiting depth input data. Depth perception
is enhanced through egocentric visual feedback, including a semi-transparent
proxy hand. We implement a virtual hand interaction technique and feedback
approaches, and evaluate their performance and usability. The proposed method
can be applied to many 3D interaction scenarios that use hands in a wearable AR
environment, such as AR information browsing, maintenance, design, and games.