
Proceedings Paper

Position, rotation, scale, and orientation invariant object tracking from cluttered scenes
Author(s): Peter Bone; Rupert Young; Chris Chatwin

Paper Abstract

A method of tracking objects in video sequences despite any kind of perspective distortion is demonstrated. Moving objects are initially segmented from the scene using a background subtraction method to minimize the search area of the filter. A variation on the Maximum Average Correlation Height (MACH) filter is used to create invariance to orientation while giving high tolerance to background clutter and noise. A log r-θ mapping is employed to give invariance to in-plane rotation and scale by transforming rotation and scale variations of the target object into vertical and horizontal shifts. The MACH filter is trained on the log r-θ map of the target for a range of orientations and applied sequentially over the regions of movement in successive video frames. Areas of movement producing a strong correlation response indicate an in-class target and can then be used to determine the position, in-plane rotation, and scale of the target object in the scene and track it over successive frames.
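
For readers who want to prototype the pipeline described in the abstract, the following Python sketch illustrates the two central ideas: the log r-θ (log-polar) remapping that turns scale and in-plane rotation changes into horizontal and vertical shifts, and a heavily simplified MACH-style correlation filter built from a set of training views. This is not the authors' implementation; the function names, the choice of OpenCV's warpPolar and MOG2 background subtractor, and the simplified filter formula (average training spectrum divided by average power spectrum, with the clutter and noise terms of the full MACH design omitted) are assumptions made purely for illustration.

# Minimal sketch (not the paper's code): log r-theta remapping plus a
# simplified MACH-style correlation filter. All APIs and parameters here
# are illustrative assumptions.
import cv2
import numpy as np

def log_polar_map(img, out_size=(128, 128)):
    """Remap an image patch to log r-theta coordinates, so that scale changes
    become horizontal shifts and in-plane rotations become vertical shifts."""
    h, w = img.shape[:2]
    center = (w / 2.0, h / 2.0)
    max_radius = min(center)
    return cv2.warpPolar(img, out_size, center, max_radius,
                         cv2.WARP_POLAR_LOG | cv2.INTER_LINEAR)

def mach_filter(train_patches, eps=1e-3):
    """Very simplified MACH-style filter: the average training spectrum divided
    by the average power spectrum (noise/clutter terms omitted for brevity).
    All patches must share the same size, e.g. the log r-theta output size."""
    X = np.stack([np.fft.fft2(p.astype(np.float32)) for p in train_patches])
    mean_spectrum = X.mean(axis=0)
    avg_power = (np.abs(X) ** 2).mean(axis=0)
    return mean_spectrum / (avg_power + eps)

def correlate(scene_patch, H):
    """Frequency-domain correlation of a log r-theta patch with the filter."""
    F = np.fft.fft2(scene_patch.astype(np.float32))
    c = np.fft.ifft2(F * np.conj(H))
    return np.abs(np.fft.fftshift(c))

# Usage sketch: segment motion with background subtraction, then correlate the
# log r-theta map of each moving region against the trained filter.
# bg = cv2.createBackgroundSubtractorMOG2()
# fg_mask = bg.apply(frame)           # regions of movement in the current frame
# patch_lp = log_polar_map(gray_patch)
# response = correlate(patch_lp, H)   # a strong peak suggests an in-class target;
#                                     # the peak offset indicates rotation/scale shift

The peak location in the correlation plane plays the role described in the abstract: a strong response flags an in-class target within a region of movement, and its vertical/horizontal offsets give estimates of in-plane rotation and scale.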

Paper Details

Date Published: 17 April 2006
PDF: 9 pages
Proc. SPIE 6245, Optical Pattern Recognition XVII, 624508 (17 April 2006); doi: 10.1117/12.664048
Author Affiliations:
Peter Bone, Univ. of Sussex (United Kingdom)
Rupert Young, Univ. of Sussex (United Kingdom)
Chris Chatwin, Univ. of Sussex (United Kingdom)


Published in SPIE Proceedings Vol. 6245:
Optical Pattern Recognition XVII
David P. Casasent; Tien-Hsin Chao, Editor(s)
