Author: Felix Löw
Supervisor: Prof. Gudrun Klinker
Advisor: Martin Wagner
Submission Date: 15.05.2005

Abstract

Augmented Reality (AR) applications enrich the real world by augmenting it with virtual objects. To achieve this fusion of real environment and virtual content, Augmented Reality setups use common graphical output hardware such as Head Mounted Displays or Tablet PCs together with tracking technologies that estimate the position and orientation of tracking targets. Frequently used vision-based techniques such as Natural Feature Tracking are sensitive to camera movements: features have to be found again in subsequent video frames. The basic idea of this work is to adapt the search area for features to the change in orientation of the user interface hardware.

This work is a first step towards solving this problem for a special class of Augmented Reality applications, Table Top Augmented Reality. It provides a hybrid tracking approach that brings tracking and the user's movement context together. Orientation information from an additional tracker is used to dynamically configure the vision-based tracking routine, a texture tracking algorithm, at runtime. To accomplish this, a special software architecture is proposed.

After introducing the basic ideas of Table Top Augmented Reality, we present the design, execution, and evaluation of a user study. Its goal is to find an approximation for a linear mapping between user motion and the search window of the texture tracking routine. Applying statistical techniques, we show that it is possible to derive such a mapping and that it can be expressed by a simple linear function with the change of orientation as input parameter. We also show that user behavior is related to the performed tasks. Finally, we identify tasks for Table Top AR and discuss their implications for the tracking routine.
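The central idea, a simple linear function from the change in orientation of the user interface hardware to the size of the texture tracker's search window, can be illustrated with a short Python sketch. The function name, coefficients, and clamping bound below are hypothetical example values chosen for illustration, not figures taken from the thesis.

    # Hypothetical sketch: adapt the texture tracker's search window to the
    # change in orientation reported by an additional (e.g. inertial) tracker.
    # The coefficients a, b and the clamp limit are assumed example values.
    def search_window_size(delta_orientation_deg, a=2.0, b=16.0, max_size=128.0):
        """Map the magnitude of the per-frame orientation change (degrees)
        to the side length (pixels) of the square search window used by
        the vision-based texture tracking routine."""
        size = a * abs(delta_orientation_deg) + b
        return min(size, max_size)

    # A nearly static device keeps the window small; a fast rotation of the
    # Head Mounted Display or Tablet PC enlarges it.
    print(search_window_size(0.5))   # 17.0
    print(search_window_size(20.0))  # 56.0

In this sketch the search window grows proportionally with faster device rotations, so the tracker can re-find features displaced by large camera motion, while small motions keep the search cheap.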

