Collision Sensing by Stereo Vision and Radar Sensor Fusion

Citation

Wu, S., Decker, S., Chang, P., Camus, T., & Eledath, J. (2008). “Collision sensing by stereo vision and radar sensor fusion,” 2008 IEEE Intelligent Vehicles Symposium, pp. 404–409, 4–6 June 2008.

Abstract

To take advantage of both stereo cameras and radar, this paper proposes a fusion approach to accurately estimate the location, size, pose, and motion of a threat vehicle with respect to the host vehicle from observations obtained by both sensors. We first fit the contour of the threat vehicle from stereo depth information and find the closest point on that contour from the vision sensor. The fused closest point is then obtained by fusing the radar observations with the vision closest point. Next, the fitted contour is translated to the fused closest point, yielding the fused contour. Finally, the fused contour is tracked using rigid-body constraints to estimate the location, size, pose, and motion of the threat vehicle. Experimental results on both synthetic data and real-world road-test data demonstrate the effectiveness of the proposed algorithm.
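The core fusion step described above combines a vision-derived closest point with a radar observation. The abstract does not specify the fusion rule, but a common approach for combining two noisy position estimates is inverse-covariance (information-form) weighting, which exploits the complementary error characteristics of the two sensors (vision tends to be more accurate laterally, radar in range). The sketch below is illustrative only; the point coordinates, covariance values, and the `fuse_closest_point` helper are assumptions, not the paper's actual implementation.

```python
import numpy as np

def fuse_closest_point(p_vision, cov_vision, p_radar, cov_radar):
    """Fuse two 2-D closest-point estimates by inverse-covariance weighting.

    Each estimate is weighted by its information matrix (inverse covariance),
    so the sensor with lower uncertainty along an axis dominates on that axis.
    Illustrative sketch only -- not the paper's exact fusion rule.
    """
    info_v = np.linalg.inv(cov_vision)
    info_r = np.linalg.inv(cov_radar)
    cov_fused = np.linalg.inv(info_v + info_r)           # combined uncertainty
    p_fused = cov_fused @ (info_v @ p_vision + info_r @ p_radar)
    return p_fused, cov_fused

# Assumed example values: vision has good lateral (x) accuracy,
# radar has good longitudinal (y, range) accuracy.
p_v = np.array([1.2, 14.8])          # vision closest point [m]
C_v = np.diag([0.05, 0.60])          # vision covariance
p_r = np.array([1.5, 15.1])          # radar closest point [m]
C_r = np.diag([0.50, 0.05])          # radar covariance

p_f, C_f = fuse_closest_point(p_v, C_v, p_r, C_r)
```

With these assumed covariances, the fused x-coordinate stays close to the vision estimate while the fused y-coordinate stays close to the radar estimate, and the fused covariance is tighter than either input on both axes.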
