Modeling, Tracking, Annotating and Augmenting a 3D Object in less than 5 Minutes Steve Bourgeois, Boris Meden, Vincent Gay-Bellile, Mohamed Tamaazousti, Sebastian Knödel We present an easy-to-use framework for 3D object tracking that allows modeling, tracking and annotating a 3D object in a few minutes. While the modeling process relies on 3D reconstruction using a depth-sensing camera (Kinect/Xtion), the tracking is achieved with a 2D camera. To complete the software framework, we will also exhibit the Selltic Wand device, an intuitive point-and-shoot interface for scanning and annotating.
Real Time Relighting for an Arbitrary Shaped Object using an RGB-D Camera Takuya Ikeda, Francois de Sorbier and Hideo Saito We propose a new relighting approach for arbitrarily shaped objects using an RGB-D camera. We focus on refining the noisy depth map to segment the object region, and on normal estimation for accurate relighting. Our implementation of the method achieves relighting at 15 fps.
Mobile Interactive Hologram Verification Andreas Hartl, Jens Grubert, Dieter Schmalstieg, Gerhard Reitmayr Holograms present on documents appear very differently depending on the current viewing direction and illumination conditions. We present a mobile AR system which provides interactive assistance in capturing and verifying these elements. The user may then decide whether to accept the element as genuine or reject it.
Level-of-Detail AR Jae-In Hwang, Min-Hyuk Sung, Yongmin Choi, Junho Kim, Ig-Jae Kim, Heedong Ko Our level-of-detail (LOD) AR system handles multi-layered information of large-scale images. We propose a points-of-interest tree structure and a method to identify which part the user is most attentive to. We demonstrate the feasibility of our idea by implementing a mobile AR system.
osgGap: Scene Graph Library for Mobile based on Hybrid Web App Framework Youna Lee, Seungmin Rho, Jae-In Hwang, Heedong Ko, Junho Kim While providing high performance on mobile devices, osgGap enables programmers to develop 3D content using JavaScript and HTML5 instead of mobile platform-specific languages. In the demonstration, we will show attendees several mobile 3D apps based on osgGap, including vision-based AR and touch-based interactions, running across mobile platforms.
Implementing an overtaking assistance system through augmented reality glasses Michel Ferreira, Pedro Gomes, Michelle Krüger Silvéria and Fausto Vieira This demo will show the See-Through System (STS) in operation, implemented with augmented reality smart glasses. The STS is an innovative video-based cooperative driver assistance system designed to enhance a driver’s visual perception while overtaking a vision-obstructing vehicle.
Geometrically-correct projection-based texture mapping onto a deformable object Yuichiro Fujimoto, Takafumi Taketomi, Goshiro Yamamoto, Jun Miyazaki, Hirokazu Kato, Ross Smith, and Bruce Thomas In our demonstration, we illustrate geometrically-correct projection-based texture mapping onto a deformable object with a pattern marker. Participants will be given the opportunity to twist and bend the substrate by hand, and a correct projection will be calculated and displayed on the surface after reshaping.
Arbitrary Textured 3D Object Tracking System Dissaphong Thachasongtham, Takumi Yoshida, François de Sorbier, Hideo Saito This demonstration presents a 3D object tracking system based on viewpoint generative learning. It can track any arbitrarily textured 3D object as long as a textured 3D model is available. Many rendered images of the model from different viewpoints are generated virtually, and the system learns from these rendered images in a short time.
Pathfinder Vision: Future Prediction Interface for Vehicle Operation Naoya Maeda, Maki Sugimoto Our robot operation interface generates a model of the real environment and the future path of the vehicle, and presents images depicting the predicted events. Attendees of our demonstration will be able to drive the vehicle using our system.
Human Engine Mohammad Poswal, Kelley Hecker, Vincent Bohossian and Debra Isaac Downing Human Engine is an innovative approach for transforming 4D scan data into virtual humans, clothing, or anything else you can put in front of a camera.
Shapes AR Mohammad Poswal, Kelley Hecker, Vincent Bohossian and Debra Isaac Downing Shapes AR is an interactive augmented reality physics game. Users have a marker on a spinning disc, another marker on the end of a wand, and an iPad with the camera feed of the marker displayed on-screen. When the software recognizes the marker, it displays augmented reality shapes.
MindLight Brian Mullins, Gaia Dempsey, Andrew Krage, Rajay Kumar, Greg Khachaturyan MindLight is a unique combination of thought detection and augmented reality. Using an EEG brainwave headset combined with Google Glass and an innovative AR interface created by DAQRI, the viewer can turn on a lamp by thinking about it. In this demonstration you will experience controlling real-world objects with your mind.
A mobile augmented reality application by using Chroma keying Yuan Wang, Linh Chi Nguyen, and Do Yi Luen Ellen This demonstration showcases a mobile device (iPad) application with real-time video processing through augmented reality (AR) technology. Magic Keyer provides a new way to present technology and entertainment by applying a green-screen video effect in different environments.
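The abstract does not detail how Magic Keyer performs its keying; as an illustration only, a basic chroma key can be expressed as a per-pixel green-dominance test. The function name and threshold below are hypothetical, not taken from the demo:

```python
import numpy as np

def chroma_key(frame, background, threshold=1.4):
    """Replace green-dominant pixels in `frame` with `background`.

    A pixel is keyed out when its green channel exceeds the mean of
    its red and blue channels by `threshold` (an illustrative rule;
    real keyers typically work in a chroma space with edge softening).
    """
    frame_f = frame.astype(np.float32)
    r, g, b = frame_f[..., 0], frame_f[..., 1], frame_f[..., 2]
    mask = g > threshold * (r + b) / 2.0   # True on green-screen pixels
    out = frame_f.copy()
    out[mask] = background[mask]           # composite background through mask
    return out.astype(np.uint8), mask
```

Running this per camera frame against a stored background image yields the green-screen effect; a production keyer would add despill and soft matte edges.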
Bare Hand Natural Interaction with Augmented Objects Lucas Figueiredo, Jorge Lindoso, Rafael Roberto, Veronica Teichrieb, Ronaldo dos Anjos, Edvar Neto, Manoela da Silva This demonstration will allow visitors to experience direct interaction with augmented objects through hand gestures. The developed tool is designed for tabletop applications, making it possible to add augmented objects to a common work table scenario as well as empowering the user with the ability to grab these objects.
Mobile Augmented Reality - Tracking and Mapping Qualcomm Research Our demos show state-of-the-art methods in visual SLAM, tracking with sensor fusion, and online creation of object detection datasets. All demos are hands-on and run completely on off-the-shelf mobile phones.
Insight: A Mobile Web Browser for HTML5-based AR Applications Sangchul Ahn, Byounghyun Yoo, Heedong Ko, and Steven Feiner This demo presents a novel mobile augmented reality (AR) content platform that uses HTML5 as its content structure. Insight is a mobile AR Web browser that executes applications developed as normal HTML5 documents with popular visualization components (e.g., jQuery UI, SVG, and WebGL) under the current Web ecosystem.
Interactive Syntactic Modeling for Augmented Reality
Settlers of Catan Quick-StartAR Demo Naman Thakar, Flora Salim, and Stefan Greuter This demo will enable users to learn the basic concepts of the board game Settlers of Catan. Helpful hints are displayed on the physical game tokens and the map using Augmented Reality, specifically the visual overlay approach. The use of AR will not replace the conventional way of playing the game, but will complement it as a new medium for learning the game.
Calibration of Head-Mounted Finger Tracking to Optical See-Through Head Mounted Display Yuta Itoh, Frieder Pankratz, Christian Waechter, Gudrun Klinker Try out a simple way to employ a new environment-instrumented finger tracking device in an AR scenario with an OST HMD. The calibration procedure uses the Single Point Active Alignment Method (SPAAM) to determine the static spatial relationship between the display and the finger tracking device rigidly mounted to the HMD.
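SPAAM reduces OST-HMD calibration to collecting 2D screen points that the user aligns with a tracked 3D point, then solving for a 3x4 projection matrix via the Direct Linear Transform. A minimal sketch of that least-squares core, assuming at least six non-coplanar correspondences (function names and data are illustrative, not the authors' code):

```python
import numpy as np

def spaam_dlt(points_3d, points_2d):
    """Estimate a 3x4 projection matrix from 2D-3D correspondences.

    Each aligned pair contributes two linear constraints; the matrix is
    the null vector of the stacked system, recovered (up to scale) as
    the last right-singular vector.
    """
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(G, X):
    """Project a 3D point through G with perspective division."""
    p = G @ np.append(np.asarray(X, dtype=float), 1.0)
    return p[:2] / p[2]
```

In SPAAM the display-to-tracker transform is folded into this single matrix, which is why one-point alignments suffice without a separate extrinsic calibration step.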
Multi-projector Calibration Method for Spatial Augmented Reality System by Planar Photo Detectors Tatsuya Kodera, Maki Sugimoto, Ross Smith, Guy Webber, Michael Marner, and Bruce Thomas We demonstrate a projector calibration method for a Spatial Augmented Reality (SAR) system using planar photo detectors. Our method measures the X-Y position using gray coding and a scan-line pattern to find a subpixel-accurate location. Using this technique we adjust the projection patterns in a SAR environment.
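In gray-coded structured light, each projected frame carries one bit plane of the column index; because adjacent columns differ in exactly one bit, a detector straddling a stripe boundary decodes to an index that is off by at most one. A minimal sketch of the encode/decode step (illustrative only; the demo's subpixel scan-line refinement is not shown):

```python
def to_gray(n):
    """Binary-reflected Gray code of column index n: the bit sequence
    a photo detector at that column observes across projected frames."""
    return n ^ (n >> 1)

def from_gray(g):
    """Decode a Gray-coded detector reading back to the column index
    by folding each higher bit into the bits below it."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n
```

The one-bit-difference property is what makes the coarse position robust before the scan-line pass refines it below pixel resolution.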
KITE: Platform for Mobile Augmented Reality Gaming and Interaction using Magnetic Tracking and Depth Sensing Thammathip Piumsomboon, Adrian Clark, Mark Billinghurst KITE is a mobile Augmented Reality (AR) platform using a magnetic tracker and depth sensor to support unique interaction. Using off-the-shelf hardware and efficiently designed software, we demonstrate four possible modalities based on hand input and provide a platform that game and interaction designers can use to explore AR.
A Visual Programming Language for Advanced AR Applications in Unity Adrian Clark, Mark Billinghurst, Tristan Scott In this demo we present a visual programming authoring tool which allows non-programmers to create complex interactive Augmented Reality (AR) experiences on mobile devices, utilizing the power of the Unity development environment and the Qualcomm Vuforia AR engine, without ever writing a single line of code.
Robust Monocular SLAM in Dynamic Environments Wei Tan, Haomin Liu, Zilong Dong, Guofeng Zhang and Hujun Bao We present a novel real-time monocular SLAM system which can work robustly in dynamic environments. Unlike traditional methods, our system allows parts of the scene to be dynamic or the whole scene to change gradually. Two augmented reality games will be provided for people to play.
Handling Pure Camera Rotation in Keyframe-Based SLAM Christian Pirchheim, Dieter Schmalstieg, Gerhard Reitmayr We demonstrate a monocular visual SLAM system that combines 6DOF and panoramic SLAM into a hybrid keyframe-based system, as described in our ISMAR’13 paper. It handles temporary rotations away from the mapped part of the scene and dynamically switches between full 6DOF and 3DOF tracking and mapping modes.
Social Program