A new Kalman-filter based framework for fast and accurate visual tracking of rigid objects

Youngrock Yoon, Purdue University


The best of the Kalman-filter based frameworks reported in the literature for rigid object tracking work well only if the object motions are smooth (which allows tight uncertainty bounds to be used for where to look for the object features to be tracked). In this thesis, we present a new Kalman-filter based framework that carries out fast and accurate rigid object tracking even when the object motions are large and jerky. The new framework has several novel features, the most significant of which is as follows: Traditional backtracking consists of undoing, one at a time, the model-to-scene matchings as the pose-acceptance criterion is violated. In our new framework, once a violation of the pose-acceptance criterion is detected, we seek the best largest subset of the candidate scene features that fulfills the criterion, and then continue the search until all the model features have been paired up with their scene correspondents (while, of course, allowing a nil-mapping for some of the model features). With the new backtracking framework, our Kalman filter is able to update in real time the pose of a typical industrial 3D object moving at approximately 5 cm per second (typical for automobile assembly lines) using off-the-shelf PC hardware. Pose updating occurs at the rate of 7 frames per second and is immune to large jerks introduced manually while the object is in motion. The objects are tracked with an average translational accuracy of 4.8 mm and an average rotational accuracy of 0.27°.
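The subset-seeking step described above can be illustrated with a minimal sketch. The code below is not the thesis's implementation: it substitutes a toy 1-D criterion (all model-to-scene offsets must be explained by one common translation, within a tolerance `eps`) for the full 3D pose-acceptance criterion, and it uses a brute-force largest-subset-first search, whereas a practical tracker would prune the search. The function names and data layout are assumptions made for illustration.

```python
from itertools import combinations

def consistent(pairs, eps=0.5):
    """Toy stand-in for the pose-acceptance criterion: the pairings are
    accepted if all model->scene offsets agree within eps, i.e. they are
    explained by a single common translation. (The thesis uses a full 3D
    pose criterion; this 1-D version is only for illustration.)"""
    offsets = [s - m for m, s in pairs]
    return not offsets or max(offsets) - min(offsets) <= eps

def largest_consistent_subset(pairs, accepts=consistent):
    """On a criterion violation, instead of undoing pairings one at a
    time, search subsets of the candidate pairings from largest to
    smallest and keep the first (hence best-largest) subset that
    satisfies the acceptance criterion."""
    for k in range(len(pairs), 0, -1):
        for subset in combinations(pairs, k):
            if accepts(list(subset)):
                return list(subset)
    return []

# Example: three pairings share the translation +10; the last is an outlier.
pairs = [(0.0, 10.1), (1.0, 11.0), (2.0, 12.2), (3.0, 25.0)]
inliers = largest_consistent_subset(pairs)
# inliers retains the three pairings consistent with the +10 translation;
# the outlier model feature would then receive a nil-mapping.
```

The brute-force enumeration is exponential in the number of pairings, which is why the efficiency of the backtracking strategy matters for the real-time (7 frames per second) performance the thesis reports.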




Avinash C. Kak, Purdue University.

Subject Area

Engineering, Electronics and Electrical|Computer Science
