Date & Time: Thursday, September 11, 12:00 pm - 12:30 pm
Location: TBA
Posters:
Augmentation of Live Excavation Work for Subsurface Utilities Engineering
Organizers:
Stéphane Côté, Ian Létourneau, Jade Marcoux-Ouellet
Description:
The virtual excavation is a well-known augmentation technique originally proposed for city road environments. It can be used for planning excavation work by augmenting the road surface with a virtual excavation that reveals subsurface utility pipes. In this paper, we propose an extension of the virtual excavation technique for live augmentation of excavation work sites. Our miniaturized setup, consisting of a sandbox and a Kinect device, was used to simulate dynamic terrain topography capture. We hypothesized that the virtual excavation could be used live on the ground being excavated, which could facilitate the excavator operator's work. Our results show that the technique can indeed be adapted to dynamic terrain topography, but that it occludes the terrain in a potentially hazardous way. Potential solutions include the use of virtual paint markings instead of a virtual excavation.
Augmented Reality Binoculars on the Move
Organizers:
Taragay Oskiper, Mikhail Sizintsev, Vlad Branzoi, Supun Samarasekera, Rakesh Kumar
Description:
In this paper, we expand our previous work on augmented reality (AR) binoculars to support a much wider range of user motion: up to a thousand square meters, compared to only a few square meters before. We present the latest improvements and additions to our pose estimation pipeline and demonstrate stable registration of objects onto real-world scenery while the binoculars undergo a significant amount of parallax-inducing translation.
Contact-view: A Magic-lens Paradigm Designed to Solve the Dual-view Problem
Organizers:
Klen Čopič Pucihar, Paul Coulton
Description:
Handheld AR systems typically use a single back-facing camera and the screen to implement device transparency. This creates the dual-view problem, a consequence of virtual transparency that does not match true transparency (what the user would see looking through a transparent glass pane).
The dual-view problem affects the usability of handheld AR systems and is commonly addressed through user-perspective rendering. While this approach produces promising results, the complexity of implementing user-perspective rendering, and the fact that it does not address all sources of the dual-view problem, mean that it only ever solves part of the problem.
This paper seeks a more complete solution to the dual-view problem that is applicable to readily available handheld devices. We pursue this goal by designing, implementing and evaluating a novel interaction paradigm we call 'contact-view'. By utilizing the back- and front-facing cameras and the environment's base-plane texture (predefined or incrementally created on the fly), we enable placing the device directly on top of the base-plane. As long as the position of the phone relative to the base-plane is known, the appropriate segment of the occluded base-plane can be rendered on the device screen, resulting in a form of transparency in which the dual-view problem is eliminated.
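The core idea of rendering the occluded base-plane segment can be illustrated with a minimal sketch. This is our simplification, not the authors' implementation: it assumes an axis-aligned device footprint on a known base-plane texture, whereas the actual system would handle arbitrary device orientation.

```python
import numpy as np

def contact_view_crop(base_plane_tex, device_xy, screen_wh):
    """Return the patch of the base-plane texture occluded by a device
    lying flat on the plane (axis-aligned simplification).

    base_plane_tex : HxW array, texture of the base plane in plane pixels
    device_xy      : (x, y) of the screen's top-left corner on the plane
    screen_wh      : (w, h) of the screen footprint in plane pixels
    """
    x, y = device_xy
    w, h = screen_wh
    # The occluded segment is simply the texture under the device footprint;
    # displaying it on the screen yields transparency with no dual view,
    # since the rendered pixels coincide with the covered plane region.
    return base_plane_tex[y:y + h, x:x + w]

tex = np.arange(100).reshape(10, 10)   # toy 10x10 base-plane texture
patch = contact_view_crop(tex, (2, 3), (4, 2))
```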
HMD Video See-Through AR with Unfixed Camera Vergence
Organizers:
Vincenzo Ferrari, Fabrizio Cutolo, Emanuele Maria Calabrò, Mauro Ferrari
Description:
Stereoscopic video see-through AR systems permit accurate marker-based video registration. To guarantee accurate registration, the cameras are normally rigidly fixed, even though the user may need to change their vergence. We propose a solution, based on lightweight hardware, that guarantees registration accuracy using pre-determined calibration data, without requiring a new calibration of the cameras' relative pose after each vergence adjustment.
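The abstract does not detail how the pre-determined calibration data are organized; one plausible sketch (entirely our assumption, including the table values and parameterization) is a lookup table of relative-pose calibrations measured offline at discrete vergence settings, with linear interpolation between the two nearest samples:

```python
import numpy as np

# Hypothetical table: vergence angle (degrees) -> pre-measured relative-pose
# parameters for the stereo pair (e.g. baseline in mm plus rotation terms).
calib_table = {
    0.0:  np.array([60.0, 0.0, 0.0]),
    5.0:  np.array([59.8, 0.0, 1.2]),
    10.0: np.array([59.5, 0.0, 2.5]),
}

def relative_pose(vergence_deg):
    """Interpolate pre-measured calibration between the nearest samples,
    avoiding any online re-calibration after a vergence adjustment."""
    angles = np.array(sorted(calib_table))
    lo = angles[angles <= vergence_deg].max()
    hi = angles[angles >= vergence_deg].min()
    if lo == hi:
        return calib_table[lo]
    t = (vergence_deg - lo) / (hi - lo)
    return (1 - t) * calib_table[lo] + t * calib_table[hi]
```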
Local Optimization for Natural Feature Tracking Targets
Organizers:
Elias Tappeiner, Dieter Schmalstieg, Tobias Langlotz
Description:
In this work, we present an approach for optimizing targets for natural-feature-based pose tracking, such as that used in Augmented Reality applications. Our contribution is a method for locally optimizing a given tracking target, instead of applying global optimizations as proposed in the literature. The local optimization, together with a visualized trackability rating, yields a tool for creating high-quality tracking targets.
Motion Detection based Ghosted Views for Occlusion Handling in Augmented Reality
Organizers:
Arthur Padilha, Veronica Teichrieb
Description:
This work presents an improvement to the scene analysis pipeline of a visualization technique called Ghosting. Computer vision and image processing techniques are used to extract natural features from each video frame. These features guide the assignment of transparency to pixels in order to produce the ghosting effect while blending the virtual object into the real scene. Video sequences were obtained from traditional RGB cameras. The main contribution of this work is the inclusion of a motion detection technique in the scene feature analysis step. This leads to a better perception of the augmented scene, because the proper ghosting effect is achieved when a moving, salient natural object that catches the user's attention passes in front of an augmented one.
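How motion detection might feed into per-pixel transparency can be sketched as follows. This uses plain frame differencing as a stand-in for whatever motion detector the paper employs; the function name, threshold, and opacity values are all illustrative assumptions.

```python
import numpy as np

def ghosting_alpha(prev_frame, curr_frame, motion_thresh=20, moving_alpha=0.2):
    """Per-pixel opacity for the virtual overlay.

    Pixels where motion is detected (simple frame differencing, a stand-in
    for the paper's motion detector) get low overlay opacity, so a moving
    real object stays visible through the 'ghosted' virtual content.
    """
    # Signed difference needs a wider type than uint8 to avoid wraparound.
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving = diff > motion_thresh
    alpha = np.ones_like(curr_frame, dtype=np.float32)  # opaque overlay
    alpha[moving] = moving_alpha                        # ghost over movers
    return alpha
```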
Ongoing development of a user-centered, AR testbed in industry
Organizers:
Luca Bertuccelli, Taimoor Khawaja, Paul O'Neill, Bruce Walker
Description:
User experience assessment of new augmented reality (AR) technology is an increasingly important area of research, including in industrial applications. In our domain, many field service technicians have traditionally relied on stand-alone tools with restricted connectivity and information visualization capabilities for on-site diagnostics and maintenance. With new handheld and wearable AR technology and recent developments in cloud-based computing, new services can be delivered that are more interactive and more connected, with the ultimate goal of improving the efficiency and productivity of the technician. For acceptance, it is fundamental that this technology enable a high-quality user experience, and a user-centered design framework is necessary for testing and evaluating these new technologies. This paper presents a testbed we are building at United Technologies Research Center that leverages a user-centered design framework for developing and deploying AR applications, both for handheld devices and for wearable AR glasses. We present two test cases from our testbed: (a) a handheld AR application for active diagnostics in building HVAC systems; (b) an interactive AR application for aircraft engine maintenance based on wearable see-through AR glasses.
QR Code Alteration for Augmented Reality Interactions
Organizers:
Han Park, Taegyu Kim, Jun Park
Description:
Thanks to its recognition robustness and data capacity, the QR code has often been used for Augmented Reality applications as well as for other commercial applications. However, it is difficult to enable tangible interactions through which users may change 3D models or animations, because QR codes are generated automatically according to fixed rules and are not easily modifiable. Our goal was to enable QR-code-based Augmented Reality interactions. Through analysis and experiments, we discovered that some parts of a QR code can be altered to change the text string that the code represents. In this paper, we introduce a prototype for QR-code-based Augmented Reality interactions that allows for Rubik's-cube-style rolling interactions.
Smartwatch-Aided Handheld Augmented Reality
Organizers:
Darko Stanimirovic, Daniel Kurz
Description:
We propose a novel method for human interaction with real objects in the surrounding environment, combining Visual Search and Augmented Reality (AR). The method uses a smartwatch tethered to a smartphone and is designed to provide a more user-friendly experience than approaches based only on a handheld device such as a smartphone or tablet computer. The smartwatch has a built-in camera, which enables scanning objects without taking the smartphone out of the pocket. An image captured by the watch is sent wirelessly to the phone, which performs Visual Search and then informs the smartwatch whether digital information related to the object is available.
We thereby distinguish between three cases. If no information is available or object recognition failed, the user is notified accordingly. If digital information is available that can be presented using the smartwatch display and/or audio output, it is presented there. In the third case, the recognized object has related digital information that would be best experienced in an Augmented Reality view spatially registered with the object in real time; the smartwatch then informs the user that this option exists and encourages using the smartphone to experience the Augmented Reality view. The user thus only needs to take the phone out of the pocket when Augmented Reality content is available and of interest.
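The three-case dispatch described above can be sketched as a small state handler; the enum names and message strings here are ours, not part of the authors' system.

```python
from enum import Enum, auto

class Result(Enum):
    NOT_RECOGNIZED = auto()   # recognition failed / no info available
    WATCH_CONTENT = auto()    # info fits the watch display or audio output
    AR_CONTENT = auto()       # info benefits from a registered AR view

def handle_scan(result):
    """Dispatch the three cases of the smartwatch scanning protocol."""
    if result is Result.NOT_RECOGNIZED:
        return "notify: no information available"
    if result is Result.WATCH_CONTENT:
        return "present on smartwatch display/audio"
    # Only here does the user need to take the phone out of the pocket.
    return "prompt: take out phone for AR view"
```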
Towards User Perspective Augmented Reality for Public Displays
Organizers:
Jens Grubert, Hartmut Seichter, Dieter Schmalstieg
Description:
We work towards ad-hoc augmentation of public displays on handheld devices, supporting user perspective rendering of display content. Our prototype system only requires access to a screencast of the public display, which can be easily provided through common streaming platforms and is otherwise self-contained. Hence, it easily scales to multiple users.
Turbidity-based Aerial Perspective Rendering for Mixed Reality
Organizers:
Carlos Morales, Takeshi Oishi, Katsushi Ikeuchi
Description:
In outdoor Mixed Reality (MR), objects distant from the observer are subject to an effect called aerial perspective, which fades the color of the objects and blends it with the color of the environmental light. Aerial perspective can be modeled using a physics-based approach; however, handling changing and unpredictable environmental illumination is demanding. We present a turbidity-based method for rendering a virtual object with an aerial perspective effect in an MR application. The proposed method first estimates the turbidity by matching the luminance distributions of sky models against a captured omnidirectional sky image. The obtained turbidity is then used to render the virtual object with aerial perspective.
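A minimal sketch of the fading-and-blending behavior is the standard single-scattering attenuation model, used here as an illustrative stand-in for the paper's physics-based model (the function and the link from turbidity to `beta` are our assumptions):

```python
import math

def aerial_perspective(obj_color, sky_color, distance, beta):
    """Blend an object's color toward the environmental light with distance.

    Attenuation exp(-beta * d) fades the object's own color, and the
    complementary factor mixes in the environmental (sky) light. In the
    paper's setting, beta would be derived from the estimated turbidity:
    higher turbidity -> larger beta -> stronger fading at a given distance.
    """
    t = math.exp(-beta * distance)
    return tuple(t * o + (1.0 - t) * s for o, s in zip(obj_color, sky_color))
```

At distance zero the object keeps its own color; at large distances it converges to the environmental light color, matching the fading behavior the abstract describes.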
Using Augmented Reality to Support Information Exchange of Teams in the Security Domain
Organizers:
Dragos Datcu, Marina Cidota, Heide Lukosch, Stephan Lukosch
Description:
For operational units in the security domain that work together in teams, it is important to exchange context-related information quickly and adequately. This extended abstract investigates the potential of augmented reality (AR) techniques to facilitate information exchange and situational awareness for teams in the security domain. First, we describe different scenarios from the security domain, elicited using an end-user-oriented design approach. Second, we briefly present a usability study based on an experiment with experts from operational security units. The results show that the scenarios are well defined and that the AR environment can successfully support information exchange in teams operating in the security domain.
Utilizing Contact-view as an Augmented Reality Authoring Method for Printed Document Annotation
Organizers:
Klen Čopič Pucihar, Paul Coulton
Description:
In Augmented Reality (AR), the real world is enhanced with superimposed digital information, commonly visualized through augmented annotations. The visualized data come from many different sources, and one increasingly important source is user-generated content. Unfortunately, AR tools that support user-generated content are not common; hence, the majority of augmented data within AR applications is not generated using AR technology. In this paper, we discuss the main reasons for this and evaluate how the contact-view paradigm could enhance the annotation authoring process within the class of tabletop-size AR workspaces. The evaluation is based on a prototype that allows musicians to annotate a music score manuscript by freehand drawing on top of the device screen. Experimentation showed the potential of the contact-view paradigm as an annotation authoring method that performs well in both single-user and collaborative multi-user situations.
Visualization of Solar Radiation Data in Augmented Reality
Organizers:
Maria Beatriz Carmo, Ana Paula Cláudio, António Ferreira, Ana Paula Afonso, Paula Redweik, Cristina Catita, Miguel Centeno Brito, José Nunes Pedrosa
Description:
We present an AR application for visualizing solar radiation data on building facades, generated from LiDAR data and climatic observations. The data can be visualized using colored surfaces and glyphs. A user study revealed that the proposed AR visualizations were easy to use, which can help leverage their potential benefits: detecting errors in the simulated data, supporting the installation of photovoltaic equipment, and raising public awareness of the use of facades for power production.