Abstract:
This paper presents a method that combines object-based manipulation and
spatial augmented reality for remote guidance. Previous remote guidance
methods have typically made no use of semantic information about the
physical properties of the environment, requiring the helper and worker to
provide this context themselves. Our new prototype system introduces a
level of abstraction for the remote expert, allowing them to directly specify the
object movements required of a local worker. We use 3D tracking to create a
hidden virtual reality scene, mirroring the real world, with which the remote
expert interacts while viewing a camera feed of the physical workspace. The
intended manipulations are then rendered to the local worker using Spatial
Augmented Reality (SAR). We report on the implementation of a functional
prototype that demonstrates an instance of this approach. We anticipate that
techniques such as the one we present will allow more efficient collaborative
remote guidance in a range of physical tasks.