A multi-Kalman filtering approach for video tracking of human-delineated objects in cluttered environments

Jean Xuejing Gao, Purdue University

Abstract

In this paper, we propose a new approach that uses a motion-estimation-based framework for video tracking of objects in cluttered environments. Our approach is semi-automatic, in the sense that a human is called upon to delineate the boundary of the object to be tracked in the first frame of the image sequence. The approach requires no camera calibration; therefore, it is not necessary that the camera be stationary. The heart of the approach lies in extracting features and estimating motion through multiple applications of Kalman filtering. The estimated motion is used to place constraints on where to seek feature correspondences; successful correspondences are subsequently used for Kalman-based recursive updating of the motion parameters. Associated with each feature is the frame number in which the feature makes its first appearance in the image sequence. All features that make their first appearance in the same frame are grouped together for Kalman-based updating of the motion parameters. Finally, in order to make the tracked object look visually familiar to the human observer, the system also makes its best attempt at extracting the boundary contour of the object—a difficult problem in its own right, since self-occlusion created by any rotational motion of the tracked object can cause large sections of the boundary contour in the previous frame to disappear in the current frame. The boundary contour is estimated by projecting the previous-frame contour into the current frame in order to create neighborhoods in which to search for the true boundary in the current frame. Our approach has been tested on a wide variety of video sequences, some of which are shown in this paper.
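To make the recursive-updating idea concrete, the following is a minimal, illustrative sketch of one Kalman predict/update cycle for a single scalar motion parameter (for instance, the horizontal displacement of a tracked feature between frames). It is not the dissertation's actual filter, which estimates full motion parameters from grouped feature correspondences; the state model, function name, and noise variances here are assumptions chosen for clarity.

```python
def kalman_step(x, P, z, q=0.01, r=0.25):
    """One predict/update cycle for a scalar random-walk state.

    x, P : prior state estimate and its variance
    z    : new measurement (e.g., a matched feature's displacement)
    q, r : process and measurement noise variances (assumed values)
    """
    # Predict: under a random-walk model the state carries over
    # and its uncertainty grows by the process noise.
    x_pred = x
    P_pred = P + q
    # Update: blend prediction and measurement via the Kalman gain.
    K = P_pred / (P_pred + r)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

# Recursive updating as successive frames supply new measurements:
x, P = 0.0, 1.0  # weak initial estimate with high variance
for z in [1.2, 0.9, 1.1, 1.0]:
    x, P = kalman_step(x, P, z)
```

With each frame's measurement, the estimate converges toward the true displacement while its variance shrinks; in the tracking system, this shrinking variance is what tightens the search neighborhoods for subsequent feature correspondences.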

Degree

Ph.D.

Advisors

Kosaka, Purdue University.

Subject Area

Electrical engineering

