Stable Vision-Aided Navigation for Large-Area Augmented Reality

Citation

T. Oskiper, H. Chiu, Z. Zhu, S. Samarasekera, and R. Kumar, “Stable vision-aided navigation for large-area augmented reality,” in Proc. IEEE VR, 2011, pp. 63-70.

Abstract

In this paper, we present a unified approach for a drift-free and jitter-reduced vision-aided navigation system. This approach is based on an error-state Kalman filter algorithm using both relative (local) measurements obtained from image-based motion estimation through visual odometry, and global measurements resulting from landmark matching against a pre-built visual landmark database. To improve the accuracy of pose estimation for augmented reality applications, we capture the 3D local reconstruction uncertainty of each landmark point as a covariance matrix, so the filter implicitly relies more on closer points. We conduct a number of experiments aimed at evaluating different aspects of our Kalman filter framework, and show that our approach can provide highly accurate and stable pose both indoors and outdoors over large areas. The results demonstrate both the long-term stability and the overall accuracy of our algorithm, which is intended to provide a solution to the camera tracking problem in augmented reality applications.
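The covariance-weighting idea in the abstract can be illustrated with a toy Kalman measurement update. The sketch below is not the paper's implementation (which uses a full error-state filter over 6-DOF pose); it only shows, under the simplifying assumption of a 2D position state observed directly, how a measurement with a small reconstruction covariance (a close landmark) pulls the estimate strongly while one with a large covariance (a distant landmark) barely moves it:

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update: fuse observation z with prior (x, P)."""
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ (z - H @ x)            # corrected state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P   # corrected state covariance
    return x_new, P_new

# Prior camera-position estimate (2D for illustration) with unit uncertainty.
x = np.zeros(2)
P = np.eye(2)
H = np.eye(2)  # toy model: landmark fixes observe position directly

# A landmark-derived position fix that disagrees with the prior.
z = np.array([1.0, 1.0])

# Near landmark: small 3D reconstruction covariance -> strong correction.
x_near, _ = kalman_update(x, P, z, H, R=0.1 * np.eye(2))

# Far landmark: large 3D reconstruction covariance -> weak correction.
x_far, _ = kalman_update(x, P, z, H, R=10.0 * np.eye(2))

print(x_near)  # pulled most of the way toward the measurement
print(x_far)   # stays close to the prior
```

In the real system the measurement covariance R would come from propagating the triangulation uncertainty of each matched landmark point, which grows with distance from the camera; that is what makes the filter trust closer points more without any explicit distance threshold.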
