Proceedings Volume 6238

Acquisition, Tracking, and Pointing XX

View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 2 May 2006
Contents: 4 Sessions, 12 Papers, 0 Presentations
Conference: Defense and Security Symposium 2006
Volume Number: 6238

Table of Contents

  • Laser Systems Acquisition, Tracking, and Pointing Technologies
  • Tracking Systems Technologies I
  • Tracking Systems Technologies II
  • Algorithms for Radar Tracking Systems
Laser Systems Acquisition, Tracking, and Pointing Technologies
Laser system for cooperative pointing and tracking of moving terminals over long distance
Dean S. Grinch, James A. Cunningham, Donald S. Fisher
This paper presents the system approach and hardware implementation of a practical precision laser pointing and tracking system designed to enable operation of free-space optical (FSO) communication "lasercom" terminals in motion across a nominal range exceeding 100 km. The technique is applicable to other widely spaced mobile systems that must remain accurately pointed toward one another. The system's initial pointing relies upon previously known or GPS-provided coordinates, and subsequent tracking is maintained through detection of a laser beacon signal emanating from the remote terminal. Besides meeting the range and accuracy goals for this particular communications application, a major consideration in the hardware design was to facilitate continuing product refinement and migration toward more compact, more robust, lower-cost mobile or airborne systems suitable for production in quantity.
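The initial-pointing step described in the abstract reduces to simple geometry once the coordinates are known. The sketch below, assuming a local east-north-up (ENU) frame and an illustrative 100 km offset (the function name and frame convention are assumptions, not the authors' implementation), computes the pointing angles:

```python
import math

def pointing_angles(east, north, up):
    """Azimuth (degrees clockwise from north) and elevation (degrees)
    to a remote terminal, given its offset in a local east-north-up
    (ENU) frame, in metres."""
    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    elevation = math.degrees(math.atan2(up, math.hypot(east, north)))
    return azimuth, elevation

# Remote terminal 100 km due east of, and 500 m above, the local one.
az, el = pointing_angles(100_000.0, 0.0, 500.0)
```

At these ranges the elevation offset is a fraction of a degree, which illustrates why the beacon-based fine-tracking loop must take over from coordinate-based open-loop pointing.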
Estimation filters for missile tracking with airborne laser
This paper examines the use of various estimation filters on the highly non-linear problem of tracking a ballistic missile during boost phase from a moving airborne platform. The aircraft receives passive bearing data from an IR sensor and range data from a laser rangefinder. The aircraft is assumed to have a laser weapon system that requires highly accurate bearing information in order to keep the laser on target from a distance of 100-200 km. The tracking problem is made more difficult by the changing acceleration of the missile, especially during stage drop-off and ignition. The Extended Kalman Filter (EKF), Unscented Kalman Filter (UKF), 'bootstrap' Particle Filter (PF), and Gaussian Sum Particle Filter (GSPF) are explored using different values for sensor accuracy in bearing and range, and various degrees of uncertainty in the target and platform dynamics. Scenarios were created using Satellite Toolkit© for trajectories from a Southeast Asia launch with associated sensor observations. MATLAB© code modified from the ReBEL Toolkit© was used to run the EKF, UKF, PF, and GSPF sensor track filters. Mean square error results are given for tracking during the period when the target is in view of the radar and IR sensors. This paper provides insight into the accuracy requirements of the sensors and the suitability of the given estimators.
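The 'bootstrap' particle filter compared in the paper can be sketched in miniature on a scalar random-walk target; the model, noise levels, and particle count here are illustrative assumptions, not the paper's missile dynamics:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_pf(zs, n=500, q=0.1, r=0.5):
    """Minimal bootstrap particle filter for a scalar random walk
    x_k = x_{k-1} + w_k observed as z_k = x_k + v_k."""
    parts = rng.normal(0.0, 1.0, n)                    # initial particle cloud
    est = []
    for z in zs:
        parts = parts + rng.normal(0.0, q, n)          # propagate each particle
        w = np.exp(-0.5 * ((z - parts) / r) ** 2)      # Gaussian likelihood weights
        w /= w.sum()
        est.append(float(np.dot(w, parts)))            # weighted-mean state estimate
        parts = parts[rng.choice(n, size=n, p=w)]      # resample (the 'bootstrap' step)
    return est

truth = np.cumsum(rng.normal(0.0, 0.1, 50))            # synthetic target track
zs = truth + rng.normal(0.0, 0.5, 50)                  # noisy measurements
est = bootstrap_pf(zs)
```

The EKF and UKF replace the particle cloud with a single Gaussian propagated through linearized or sigma-point transforms; the GSPF maintains a Gaussian mixture instead of raw particles.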
Amplitude phase adaptive correction of optical waves distortions
Several schemes for realizing amplitude-phase correction are considered. The problem of applying adaptive correction to the formation of laser beams in a turbulent atmosphere is solved using a reference beam that propagates toward the initial beam. In the plane from which the radiation is launched, the adaptive correction introduces "predistortions" into the initial beam according to wave-front reversal ("phase conjugation") algorithms applied to the reference-beam field; that is, the initial beam is restructured on the basis of measured fluctuations in the reference beam. A numerical study is carried out of how errors in approximating the amplitude and phase of the field affect the efficiency of beam-formation correction in a turbulent atmosphere under conditions of strong intensity fluctuations.
Tracking Systems Technologies I
Tracking filter algorithm for automatic video tracking
In addition to servo control and power amplification, motion control systems for optical tracking pedestals feature capabilities such as electro-optical tracking using an integrated Automatic Video Tracker (AVT) card. An electro-optical system tracking loop is comprised of sensors mounted on a pointing pedestal, an AVT that detects a target in the sensor imagery, and a tracking filter algorithm that commands the pedestal to follow the target. The tracking filter algorithm receives the target boresight error from the AVT and calculates motion demands for the pedestal servo controller. This paper presents a tracking algorithm based on target state estimation using a Kalman filter. The servo demands are based on calculating the Kalman filter state estimate from absolute line-of-sight angles to the target. Simulations are used to compare its performance to tracking loops without tracking filters, and to other tracking filter algorithms, such as rate feedback loops closed around boresight error. Issues such as data latency and sensor alignment error are discussed.
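A constant-velocity Kalman filter over a single line-of-sight angle, the general kind of estimator the paper builds its servo demands on, might look like the sketch below; the state model, noise values, and 50 Hz update rate are assumptions for illustration, not the paper's design:

```python
import numpy as np

def make_cv_kalman(dt, q, r):
    """Constant-velocity Kalman filter over one line-of-sight angle.
    State is [angle, angular_rate]; the measurement is the absolute
    angle (AVT boresight error plus current pedestal angle)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])                        # state transition
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])  # process noise
    H = np.array([[1.0, 0.0]])
    R = np.array([[r]])
    x = np.zeros(2)
    P = np.eye(2)

    def step(z):
        nonlocal x, P
        x = F @ x                              # predict
        P = F @ P @ F.T + Q
        y = z - H @ x                          # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        return x.copy()                        # [angle, rate] estimate

    return step

step = make_cv_kalman(dt=0.02, q=1.0, r=1e-4)
for k in range(200):
    est = step(0.5 * (k * 0.02))               # target moving at 0.5 rad/s
```

The rate component of the state is what distinguishes this from a plain rate feedback loop closed around boresight error: the servo can be fed a feed-forward rate demand even across frames of data latency.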
Autonomous target reacquisition after image disturbance
Lee Wren, John Thornton, David White, et al.
Many Command-to-Line-of-Sight missile systems use ground-based electro-optic sensors to track their targets. Both optical and infrared systems can be affected by launch effects, which can include camera shake on launch and target obscuration due to the missile exhaust plume. Further effects can be encountered during flight, including aimpoint disturbance, launch debris, and countermeasures. An automatic video tracking system (AVT) is required to cope with all of these disturbances whilst maintaining track on the primary target. If track is broken during the engagement, the AVT needs to employ a strategy that will enable reacquisition of the primary target with the minimum of delay. This task can be significantly more complicated in a cluttered scene. This paper details such a reacquisition algorithm, the primary purpose of which is to correctly identify the primary target whilst reducing the reacquisition timeline. Results are presented against synthetic imagery and actual missile firings.
Tracking Systems Technologies II
Spherical alignment of imagers using optical flow fields
Ben Lambert, Jason F. Ralph, Lee Wren, et al.
Optical flow fields can be used to recover some components of the camera ego-motion such as velocity and angular velocity. In this paper, we discuss the use of optical flow fields to estimate the relative orientation of two imagers with non-overlapping fields of view. The algorithms proposed are based on a spherical alignment technique which is closely related to rapid transfer alignment methods used to align aircraft inertial navigation systems. Of particular importance is the relationship between the accuracy of the optical flow field (which is dependent upon the complexity of the scene and the resolution of the cameras) and the accuracy of the resultant alignment process.
Electronic image stabilization based on the spatial intensity gradient
The presence of parasitic jitter in video sequences can degrade imaging system performance. Image stabilization systems correct for this jitter by estimating motion and then compensating for undesirable movements. These systems often require tradeoffs between stabilization performance and factors such as system size and computational complexity. This paper describes the theory and operation of an electronic image stabilization technique that provides sub-pixel accuracy while operating at real-time video frame rates. This technique performs an iterative search on the spatial intensity gradients of video frames to estimate and refine motion parameters. Then an intelligent segmentation approach separates desired motion from undesired motion and applies the appropriate compensation. This computationally efficient approach has been implemented in the existing hardware of compact infrared imagers. It is designed for use as both a standalone stabilization module and as a part of more complex electro-mechanical stabilization systems. For completeness, a detailed comparison of theoretical response characteristics with actual performance is also presented.
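The gradient-based estimation step can be illustrated with a global-translation variant of the technique; this is a sketch under simplifying assumptions (a single whole-frame shift, a smooth synthetic test image, periodic boundaries), not the imager's actual implementation, which must also segment desired from undesired motion:

```python
import numpy as np

def estimate_shift(ref, cur, iters=10):
    """Estimate a global frame-to-frame translation (dy, dx) by
    iterative least squares on the spatial intensity gradients of
    the reference frame (a global-shift, Lucas-Kanade-style search)."""
    gy, gx = np.gradient(ref.astype(float))
    A = np.array([[np.sum(gy * gy), np.sum(gy * gx)],
                  [np.sum(gy * gx), np.sum(gx * gx)]])
    dy = dx = 0.0
    for _ in range(iters):
        ry, rx = int(round(dy)), int(round(dx))
        # Compensate the current integer estimate, then refine sub-pixel.
        shifted = np.roll(np.roll(cur, -ry, axis=0), -rx, axis=1)
        err = shifted.astype(float) - ref
        b = -np.array([np.sum(gy * err), np.sum(gx * err)])
        step = np.linalg.solve(A, b)           # residual sub-pixel shift
        dy, dx = ry + step[0], rx + step[1]
    return dy, dx

# Synthetic check: a smooth blob shifted by (2, 3) pixels.
yy, xx = np.mgrid[0:64, 0:64]
ref = np.exp(-((yy - 32.0) ** 2 + (xx - 32.0) ** 2) / 50.0)
cur = np.roll(ref, (2, 3), axis=(0, 1))
dy, dx = estimate_shift(ref, cur)
```

The normal-matrix solve is what gives sub-pixel resolution without any search over a discrete grid, which is why this family of methods suits real-time frame rates on compact hardware.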
Study of compact stereoscopic system for target distance estimation
Distance measurement is necessary in a variety of fields, including targeting, surveillance, reconnaissance, robotics, and cartography. Today, the most commonly used method for distance measurement is laser ranging. However, laser rangers, being active systems, require more energy and cost more than passive systems, and they can be detected by the adversary. Stereoscopic vision, a passive technique, requires little power and allows covert operation. This study considers stereoscopic vision with a compact, portable system and investigates the essential parameters that can be optimized for accurate distance measurement. The main parameters addressed are the distance between the two cameras, the kernel size used for correlation between the two images, and the quality of the image as measured by the standard deviation of pixel values. The distance estimation accuracy is determined as a function of these parameters and the range to target. To represent a compact, portable system, the study considered parallel camera pairs placed 6 inches or 12 inches apart. Using small, visible-light digital cameras, the slant-range measurement error is less than 3% with 12-inch camera spacing and a correlation kernel 30 pixels in width. Larger camera spacing and shorter ranges to target increase the disparity and decrease the distance estimation error. Analytical error predictions explain the experimental results.
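The geometry behind these trends is the standard parallel-pair relation Z = f·B/d and its first-order error sensitivity. In the sketch below the 12-inch baseline matches the study, but the 2000-pixel focal length and half-pixel disparity error are hypothetical values for illustration:

```python
def stereo_range(baseline_m, focal_px, disparity_px):
    """Range from a parallel stereo pair: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def range_error(baseline_m, focal_px, range_m, disparity_err_px):
    """First-order range error for a disparity error: dZ = Z^2 / (f*B) * dd."""
    return range_m ** 2 / (focal_px * baseline_m) * disparity_err_px

B = 0.3048                                   # 12-inch baseline, in metres
Z = stereo_range(B, 2000.0, 12.2)            # target at roughly 50 m
dZ = range_error(B, 2000.0, Z, 0.5)          # error from a half-pixel mismatch
```

Because the error grows as Z² and shrinks as 1/B, larger camera spacing and shorter ranges both reduce the relative range error, exactly the trend the study reports.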
Automated position estimation of target using view extrapolation
Humera Noor, Shahid Hafeez Mirza
Position estimation of a target has always been a critical task in defense applications, and a variety of techniques exist for evaluating the position of objects over time. This paper discusses the idea of view morphing to generate future images of scenes containing moving objects. It is highlighted that the concepts of view interpolation may be extended to synthesize new views that are not between the given views with respect to time and/or position; this problem is addressed using view extrapolation, which rests on the assumption that the present motion of non-stationary objects will continue in the same direction and at an unvarying speed. The problem is solved by dividing it into three steps: prewarping, view extrapolation, and postwarping. It is pointed out that due consideration must be given to the time elapsed between the capture of the original views and the time at which the new view is to be generated, as this helps in finding the motion-related parameters of the scene. The paper outlines an algorithm and highlights the issues to be considered in generating future images from the information in existing ones.
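The constant-direction, constant-speed assumption amounts to linear extrapolation of position between the capture times. A minimal sketch of that step alone (2-D points and the helper name are illustrative; the paper's full method warps whole views, not single points):

```python
def extrapolate(p0, t0, p1, t1, t):
    """Linearly extrapolate a 2-D position to time t, assuming the
    constant velocity implied by observations p0 at t0 and p1 at t1."""
    vx = (p1[0] - p0[0]) / (t1 - t0)
    vy = (p1[1] - p0[1]) / (t1 - t0)
    return (p1[0] + vx * (t - t1), p1[1] + vy * (t - t1))

# Object seen at (0, 0) at t=0 and (2, 1) at t=1; predict t=3.
pred = extrapolate((0.0, 0.0), 0.0, (2.0, 1.0), 1.0, 3.0)
```

Prewarping and postwarping bracket this step so that the extrapolation is performed in a rectified geometry where image motion is well approximated by such linear paths.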
Algorithms for Radar Tracking Systems
A MUSIC (multiple signal classification) algorithm for specifying azimuth, elevation, and range of multiple sources
The MUSIC (Multiple Signal Classification) algorithm uses the phase difference at different elements of a receiving antenna array to determine the azimuth and elevation angles of a source. This algorithm can be extended to determine the range of multiple sources as well as their azimuth and elevation angles. In this report a generalized MUSIC algorithm is presented that accomplishes this task when the receiving antenna consists of a planar, rectangular array of receiving elements. Greater range accuracies can be achieved by increasing the signal-to-noise ratio, increasing the number of PRIs per CPI, and searching for a solution over range space with a finer mesh. The mesh employed in this study had a range gate size that was 10% of the range space searched. The increase in range accuracy gained by the latter two methods comes at the price of increased processing time.
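The classical azimuth-only special case on a uniform linear array may help fix ideas; the 8-element geometry, source angles, SNR, and grid below are illustrative assumptions, and the paper's generalized algorithm additionally searches elevation and range on a planar array:

```python
import numpy as np

rng = np.random.default_rng(1)

def music_spectrum(snapshots, n_sources, angles_deg, d=0.5):
    """1-D MUSIC pseudo-spectrum for a uniform linear array with
    element spacing d in wavelengths; snapshots is (elements, samples)."""
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)                      # ascending order
    En = eigvecs[:, :-n_sources]                              # noise subspace
    m = np.arange(snapshots.shape[0])
    p = []
    for th in np.deg2rad(angles_deg):
        a = np.exp(2j * np.pi * d * m * np.sin(th))           # steering vector
        p.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)  # peaks at sources
    return np.array(p)

# Two sources at -20 and 30 degrees, 8 elements, 200 snapshots.
m = np.arange(8)[:, None]
A = np.exp(2j * np.pi * 0.5 * m * np.sin(np.deg2rad([-20.0, 30.0])))
s = rng.standard_normal((2, 200)) + 1j * rng.standard_normal((2, 200))
x = A @ s + 0.1 * (rng.standard_normal((8, 200)) + 1j * rng.standard_normal((8, 200)))
grid = np.arange(-90.0, 90.0, 0.5)
spec = music_spectrum(x, 2, grid)             # sharp peaks near -20 and 30 deg
```

The range extension in the paper works the same way: the steering vector gains range-dependent phase terms across PRIs, and the pseudo-spectrum is searched over a range mesh as well as angle.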
Performance analysis of fuzzy logic particle filter compared to fuzzy IMM in tracking high-performance targets
A high-performance target may accelerate at non-uniform rates and complete sharp turns within short time periods while thrusting, rolling, and pitching, maneuvers that may not follow a linear model. Even though the interacting multiple model (IMM) can be considered a multimodal approach, it still requires prior knowledge of the target model. To overcome this weakness, a fuzzy logic particle filter (FLPF) is used. It is composed of single-input single-output fuzzy relational equations, each expressed in canonical rule-based form. The dynamics of the high-performance target are modeled by multiple switching (jump Markov) systems. The target may follow one of seven dynamic behavior models at any time in the observation period, under the assumption of a coordinated turn model. The FLPF has the advantage that it does not require prior knowledge of statistical process models, as the IMM does. Moreover, it does not need a maneuver detector even when tracking a high-performance target, which results in lower computational complexity. By using an appropriate fuzzy overlap set, only a subset of the total number of models needs to be evaluated, and these will be conditioned on acceleration values close to the estimate. This reduces the computational load compared to the fuzzy IMM (FIMM) algorithm. To cover the whole range of maneuver variables, more models can be added without increasing the computational load, since the number of models evaluated is determined only by the overlap. An example is included to visualize the effectiveness of the proposed algorithm. Simulation results showed that the FLPF has good tracking performance and a lower computational load than the FIMM when applied to systems characterized by large scan periods.
Simplified generalized asynchronous track fusion filter
This paper provides a simplified solution to the authors' general asynchronous track fusion problem. The original solution solves a practical sensor-to-sensor track fusion problem when the sensors are asynchronous, communication delays exist between the sensor platforms and the track fusion center, and tracks may arrive out of sequence. The new fusion rule is derived under mild assumptions and does not require the synchronization of tracks for the purpose of track fusion. The Bar-Shalom-Campo fusion rule is derived as a special case of the new rule for synchronous tracks. The rule is also illustrated on an alpha-beta filter.
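The Bar-Shalom-Campo special case mentioned above can be sketched directly; the function name and the zero-cross-covariance default are illustrative assumptions for this sketch:

```python
import numpy as np

def fuse_tracks(x1, P1, x2, P2, P12=None):
    """Bar-Shalom-Campo fusion of two synchronous track estimates x1, x2
    with covariances P1, P2 and cross-covariance P12 (zero if omitted,
    i.e. independent track errors)."""
    if P12 is None:
        P12 = np.zeros_like(P1)
    D = P1 + P2 - P12 - P12.T
    K = (P1 - P12) @ np.linalg.inv(D)         # fusion gain
    xf = x1 + K @ (x2 - x1)                   # fused state
    Pf = P1 - K @ (P1 - P12.T)                # fused covariance
    return xf, Pf

# Two equal-quality independent tracks: fusion averages the states
# and halves the variance.
xf, Pf = fuse_tracks(np.array([0.0]), np.eye(1), np.array([2.0]), np.eye(1))
```

The cross-covariance term P12 captures the common process noise between tracks of the same target; the asynchronous rule in the paper generalizes this combination to tracks time-stamped at different instants.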