Multi-Sensor Navigation Algorithm Using Monocular Camera, IMU and GPS for Large Scale Augmented Reality

Citation

Oskiper, T.; Samarasekera, S.; Kumar, R., “Multi-sensor navigation algorithm using monocular camera, IMU and GPS for large scale augmented reality,” in IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 71-80, 5-8 Nov. 2012.

Abstract

A camera tracking system for augmented reality applications that can operate both indoors and outdoors is described. The system uses a monocular camera, a MEMS-type inertial measurement unit (IMU) with 3-axis gyroscopes and accelerometers, and a GPS unit to accurately and robustly track the camera motion in 6 degrees of freedom (with correct scale) in arbitrary indoor or outdoor scenes. IMU and camera fusion is performed in a tightly coupled manner by an error-state extended Kalman filter (EKF), such that each visually tracked feature contributes as an individual measurement, as opposed to more traditional approaches in which camera pose estimates are first extracted by feature tracking and then used as measurement updates in a filter framework. Robustness in feature tracking, and hence in visual measurement generation, is achieved by IMU-aided feature matching and a two-point relative pose estimation method that removes outliers from the raw feature point matches. A landmark-matching mechanism that contains long-term drift in orientation via on-the-fly, user-generated geo-tiepoints is also described.
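To make the tightly coupled update concrete, below is a minimal Python sketch of how each tracked feature can contribute its own 2-D measurement to an error-state EKF. The error-state layout, the Jacobian convention, and the `per_feature_update` function are illustrative assumptions, not the paper's actual formulation: the real filter also carries velocity and IMU biases, propagates the state with IMU data between camera frames, and applies the IMU-aided matching and two-point outlier rejection before any feature reaches the update step.

```python
import numpy as np

# Illustrative sketch only: a sequential, per-feature EKF measurement update.
# The 6-D error state [dp (3), dtheta (3)] and the sign conventions below are
# assumptions made for brevity, not taken from the paper.

def skew(v):
    """Cross-product (skew-symmetric) matrix of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def per_feature_update(p_w, P, R_wc, landmarks, observations, pix_sigma=1.0):
    """Fuse each tracked feature as an individual 2-D measurement.

    p_w:          nominal camera position in the world frame (3,).
    P:            6x6 covariance of the error state [dp, dtheta].
    R_wc:         nominal camera-to-world rotation matrix.
    landmarks:    (N, 3) feature positions in the world frame.
    observations: (N, 2) normalized image coordinates of those features.
    """
    R_cw = R_wc.T
    R_meas = (pix_sigma ** 2) * np.eye(2)        # per-feature measurement noise
    for l, z in zip(landmarks, observations):
        pc = R_cw @ (l - p_w)                    # feature in the camera frame
        if pc[2] <= 1e-6:
            continue                             # behind the camera; skip
        z_hat = pc[:2] / pc[2]                   # pinhole projection
        # Jacobian of the projection w.r.t. the camera-frame point
        J_pi = np.array([[1.0 / pc[2], 0.0, -pc[0] / pc[2] ** 2],
                         [0.0, 1.0 / pc[2], -pc[1] / pc[2] ** 2]])
        # Jacobian w.r.t. the error state [dp, dtheta] (one common convention)
        H = np.hstack([-J_pi @ R_cw, J_pi @ skew(pc)])
        # Standard EKF update, applied one feature at a time
        S = H @ P @ H.T + R_meas
        K = P @ H.T @ np.linalg.inv(S)
        dx = K @ (z - z_hat)
        p_w = p_w + dx[:3]                       # inject the position correction
        # (the orientation correction dx[3:] would be injected into R_wc here)
        P = (np.eye(6) - K @ H) @ P
    return p_w, P
```

The design point the abstract highlights is visible in the loop: because every feature is fused (and can be gated) individually, no intermediate camera pose estimate ever enters the filter, so a handful of bad matches degrades the update far less than a single corrupted pose measurement would.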
