Proceedings Paper
Tracking point features with distributed imaging sensors
The use of multiple distributed imaging sensors to track point or blob features is motivated by a desire to estimate 3-dimensional kinematics that are not easily derived from individual imaging sensors. Several distinct processing steps can be identified. First, the desired features are tracked based only on data collected by each sensor individually (sensor-level tracks). Next, these sensor-level tracks are compared with one another, and those corresponding to common features are identified. Based on this data correspondence, the sensor-level tracks are combined to form measurements of 3-D kinematic parameters. Finally, the resulting time-sequence of measurements is smoothed or filtered to improve accuracy as well as to estimate 3-D parameters not measured directly. This paper briefly reviews algorithms for each of these processing steps and describes a tracking system that results from considering the interaction among the individual steps.
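As a rough illustration of the measurement-formation step described above, the sketch below combines two corresponded sensor-level 2-D observations into a 3-D point measurement using standard linear (DLT) triangulation. The camera projection matrices and point values here are invented for the example; the paper itself does not prescribe this particular algorithm or geometry.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linearly triangulate a 3-D point from two corresponded 2-D
    observations x1, x2, given 3x4 camera projection matrices P1, P2.

    Each observation contributes two rows of a homogeneous system
    A X = 0; the least-squares solution is the right singular vector
    of A with the smallest singular value.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                # homogeneous solution
    return X[:3] / X[3]       # dehomogenize to 3-D coordinates

# Hypothetical two-sensor geometry: identical cameras, second one
# translated by 1 unit along the x-axis.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Synthesize corresponded sensor-level observations of a known point.
X_true = np.array([0.5, 0.2, 2.0])
Xh = np.append(X_true, 1.0)
x1 = (P1 @ Xh)[:2] / (P1 @ Xh)[2]
x2 = (P2 @ Xh)[:2] / (P2 @ Xh)[2]

X_est = triangulate(P1, P2, x1, x2)
```

With noise-free synthetic observations, `X_est` recovers `X_true` to numerical precision; with real sensor-level tracks, the resulting time-sequence of such 3-D measurements would then be passed to the smoothing/filtering stage described in the abstract.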