Proceedings Volume 3067

Sensor Fusion: Architectures, Algorithms, and Applications


View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 16 June 1997
Contents: 3 Sessions, 20 Papers, 0 Presentations
Conference: AeroSense '97 1997
Volume Number: 3067

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.

Sessions:
  • Fusion System Architectures and Strategies
  • Fusion System Applications
  • Fusion Algorithm Developments
Fusion System Architectures and Strategies
Novel architecture for expert-assisted decision-level fusion
In this paper, we discuss a new fusion architecture, including some preliminary results on field data. The architecture consists of a new decision-level fusion algorithm, the piecewise level fusion algorithm (PLFA), integrated with a new expert-system-based user assistant that adjusts PLFA parameters to achieve a user-desired classification performance. This architecture is applicable to both multisensor and multilook fusion. The user specifies classification performance by entering a desired confusion matrix at the fusion center. The intelligent assistant suggests input alternatives to reach the performance goal based on previously supplied user inputs and on the performance specifications of the individual sensors. If deadlock results, i.e., the goal is not attainable because of conflicting user inputs, the assistant informs the user. As the user and assistant interact, the assistant calculates the parameters necessary to automatically adjust the PLFA for the required performance. These parameters and calculations are hidden from the user; that is, the architecture is designed so that the inputs are intuitive for an unskilled operator. The implementation of this adaptable fusion architecture is made possible by the relatively simple structure of the PLFA and of the expert system's heuristic rules. We briefly describe the PLFA structure and operation, illustrate some expert system rules, and discuss preliminary performance of the entire architecture, including a sample dialogue between the user and the intelligent assistant. We conclude with a discussion of future extensions to this architecture, including replacing human interactions with dynamic learning techniques.
Decision fusion strategies for target detection with a three-sensor suite
Boolean-logic-based decision fusion strategies for target detection with two sensors have been studied in detail in the literature over the years. Increasing the number of sensors to three offers an added dimension of flexibility in the design of fusion strategies. One could visualize a single-stage fusion wherein the decision outputs of all three sensor subsystems are fused simultaneously under a variety of strategies such as AND logic, majority (simple or firm decisions only) logic, or a no-firm-contradiction logic. Alternatively, one could explore a two-stage fusion strategy, wherein either an AND or an OR logic is used at the first stage to combine the decisions of two of the sensor subsystems, followed by a similar logic choice combining the fused decision from the first stage with the decision from the third sensor subsystem. Of these strategies, while some may turn out to be equivalent in a mathematical sense, others remain clearly unique. The study analyzes these strategies to assess their relative benefits. The paper concludes with a brief discussion of possible extensions in terms of temporal fusion strategies that exploit information derived from multiple looks, and of the potential for application to real-world problems such as mine detection.
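As a rough illustration of how such logics compare (not drawn from the paper; the per-sensor probabilities and the particular two-stage rule below are assumed values chosen only for the example), the following sketch computes system-level detection and false-alarm probabilities for three independent sensors under a few candidate strategies.

```python
# Sketch: fused detection / false-alarm probabilities for three independent sensors.
from itertools import product

def fused_prob(p, rule):
    """Probability that `rule` declares a detection, given per-sensor
    declaration probabilities p = (p1, p2, p3) and independent sensors."""
    total = 0.0
    for d in product([0, 1], repeat=3):          # all 8 joint declaration patterns
        prob = 1.0
        for pi, di in zip(p, d):
            prob *= pi if di else (1.0 - pi)
        if rule(d):
            total += prob
    return total

AND3      = lambda d: all(d)                     # single-stage AND
OR3       = lambda d: any(d)                     # single-stage OR
MAJORITY  = lambda d: sum(d) >= 2                # 2-out-of-3 majority
TWO_STAGE = lambda d: (d[0] and d[1]) or d[2]    # (S1 AND S2) at stage 1, OR with S3 at stage 2

pd  = (0.90, 0.85, 0.80)   # assumed per-sensor detection probabilities
pfa = (0.05, 0.08, 0.10)   # assumed per-sensor false-alarm probabilities

for name, rule in [("AND", AND3), ("OR", OR3),
                   ("MAJORITY", MAJORITY), ("(1 AND 2) OR 3", TWO_STAGE)]:
    print(f"{name:15s} Pd={fused_prob(pd, rule):.3f}  Pfa={fused_prob(pfa, rule):.3f}")
```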
Identity and attribute information fusion using evidential reasoning
Eloi Bosse, Marc-Alain Simard
The aim of this paper is to explore the problem of fusing identity and attribute information emanating from different sources, and to offer the decision maker a quantitative analysis based on statistical methodology that can enhance his/her decision making process regarding the identity of detected objects. Two identity information fusion architectures are discussed here. The first is concerned with the fusion of identity declarations where the sources are expected to provide only useful and complete results such as an identity declaration. The second is concerned with the fusion of attribute information using a modified version of the Dempster-Shafer evidential combination algorithm.
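The paper's modified combination rule is not reproduced here; as a minimal sketch of the underlying idea, the following shows the standard Dempster rule of combination for two identity/attribute sources, with mass functions over an assumed frame of discernment.

```python
# Sketch of Dempster's rule of combination (unmodified form).
# Mass functions are dicts mapping frozensets of identities to belief mass.

def dempster_combine(m1, m2):
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:                                  # compatible propositions reinforce
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:                                      # incompatible propositions conflict
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Hypothetical identity declarations from two sources (values assumed for illustration).
m_source1 = {frozenset({"friend"}): 0.6, frozenset({"friend", "hostile"}): 0.4}
m_source2 = {frozenset({"hostile"}): 0.3, frozenset({"friend", "hostile"}): 0.7}
print(dempster_combine(m_source1, m_source2))
```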
Asymmetric fusion strategies for target detection in multisensor environments
Most of the studies reported in the open literature on sensor fusion for target detection in multisensor environments have proposed fusion strategies that are essentially independent of the identity of the object as observed by the individual sensors. This independence makes the fusion strategies symmetric with respect to the identity of the objects in terms of their target or non-target (decoys, clutter, etc.) status. In this study, new ground is broken in terms of fusion strategies which, by being dependent on the identity of the objects as perceived by the individual sensors, can be asymmetric with respect to the identity of the object under observation. The study analyzes the scope for, and benefits of, deployment of these asymmetric fusion strategies as compared to the conventional Boolean-logic-based symmetric fusion strategies studied previously. Under these conventional fusion strategies, use of the Boolean AND logic for fusion tends to minimize the false alarm rate, while use of OR logic maximizes the detection probability. Under the asymmetric fusion strategies conceived here, it is possible to drive the decision process toward maximizing the probability of detection of lethal objects (targets) while simultaneously minimizing the false alarm rate. The performance of these asymmetric fusion strategies, when embedded in a recursive structure that permits multiple observations and temporal fusion along the time line, is analyzed relative to that of the conventional symmetric fusion strategies to parametrically determine their domains of beneficial deployment.
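To make the symmetric/asymmetric distinction concrete (this example is illustrative only and is not the rule proposed in the paper), a symmetric rule treats the sensors' declarations interchangeably, whereas an asymmetric rule can let a particular declaration from a particular sensor carry special weight:

```python
# Illustrative decision rules for two sensors declaring "target", "decoy", or "unknown".
def symmetric_and(d1, d2):
    # Symmetric: declare a target only if both sensors declare a target.
    return d1 == "target" and d2 == "target"

def asymmetric_rule(d1, d2):
    # Hypothetical asymmetric rule: a "decoy" call from sensor 1 vetoes the detection,
    # but otherwise a single "target" call from either sensor is enough.
    if d1 == "decoy":
        return False
    return d1 == "target" or d2 == "target"

print(symmetric_and("target", "unknown"), asymmetric_rule("target", "unknown"))  # False True
```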
Decision-level data fusion using Bayesian inference
John Parish
This paper describes a blackboard system for integrating observations from multiple sensors. Multiple sensors report observations to the blackboard system, which correlates the observations with a set of active models; the models are both temporally limited and probabilistic. The design is object-oriented to allow for extensions that accommodate new models and sensors. An example application to a grid of sensors is presented.
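As a minimal sketch of decision-level fusion by Bayesian inference (the prior and sensor likelihoods below are assumed values; the paper's blackboard and model-correlation machinery are not shown), independent sensor reports can be combined into a posterior probability for a hypothesis:

```python
# Sketch: Bayesian fusion of independent sensor reports for one hypothesis.
def bayes_fuse(prior, likelihoods):
    """prior: P(target); likelihoods: list of (P(report | target), P(report | no target))."""
    p_t, p_n = prior, 1.0 - prior
    for l_t, l_n in likelihoods:        # independence assumption: multiply likelihoods
        p_t *= l_t
        p_n *= l_n
    return p_t / (p_t + p_n)            # posterior P(target | all reports)

# Two sensors report a detection; assumed detection/false-alarm likelihoods.
posterior = bayes_fuse(prior=0.1, likelihoods=[(0.9, 0.05), (0.8, 0.10)])
print(f"P(target | reports) = {posterior:.3f}")
```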
Fusion System Applications
Gyroscopic data fusion via a quaternion-based complementary filter
Robert Smith, Andy Frost, Penelope J. Probert
We present an application of a complementary filter system to the attitude determination of a remotely operated underwater vehicle (ROV). The main contribution of this paper is to combine existing complementary filter theory with a quaternion attitude representation. The combination allows accurate attitude determination by a real-time system using low-cost sensors. We fuse the estimate from an extended Kalman filter (EKF) with the output from a set of vibrating-structure rate gyroscopes. The EKF supplies high-quality low-frequency information, while the gyroscopes supply the corresponding high-frequency information. The attitude is described via a quaternion representation. We discuss how the use of quaternions is beneficial for estimator design due to the low computational burden and the lack of discontinuities and singularities. The EKF combines the output from two inclinometers and a magnetometer with a vehicle process model. The EKF assumes that sensor and process noise is broadband and that the process model captures all the important dynamics. An underwater vehicle is capable of rapid rotations, which are difficult to model and would require computationally unattainable update rates to track effectively. We develop a filter that uses the difference between the EKF and gyroscopic attitude estimates (an indirect filter) to correct for drift in the gyroscopic attitude estimate. We develop first a feedforward and then a feedback filter. The simplicity of the indirect filter permits very fast update rates, so the system may follow rapid vehicle rotations. We discuss the real-time implementation of the estimator on a transputer-based system mounted within a small ROV. We present experimental results showing the performance of the combined filter system.
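For orientation only, here is a minimal direct-form sketch of a quaternion complementary filter step: propagate the quaternion with the gyro rates (high-frequency path) and nudge it toward a slower reference attitude (low-frequency path). The paper uses an indirect, error-state formulation with an EKF reference, which is not reproduced; the blend gain k and the linear quaternion blend below are assumptions for illustration.

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def complementary_update(q, omega, q_ref, dt, k=0.02):
    """One step: integrate body rates omega (rad/s) into q, then blend toward q_ref."""
    q_dot = 0.5 * quat_mult(q, np.array([0.0, *omega]))   # quaternion kinematic equation
    q = q + q_dot * dt                                     # gyro (high-frequency) propagation
    if np.dot(q, q_ref) < 0:                               # keep both quaternions in same hemisphere
        q_ref = -q_ref
    q = (1.0 - k) * q + k * q_ref                          # small correction toward reference attitude
    return q / np.linalg.norm(q)                           # renormalize to unit length

q = np.array([1.0, 0.0, 0.0, 0.0])
q = complementary_update(q, omega=[0.0, 0.0, 0.1], q_ref=np.array([1.0, 0.0, 0.0, 0.0]), dt=0.01)
print(q)
```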
Combined force and position polyvinylidene fluoride (PVDF) robotic tactile sensing system
Javad Dargahi, Andrew R. Eastwood, Ian J. Kemp
This paper reports on a tactile sensing system with only three sensing elements. The magnitude and position of the applied force are obtained using a triangulation approach; combined with membrane stress analysis, some information about the shape of the contacting object is also obtained. The sensor is designed to overcome the problems of cross-talk between sensing elements, complexity, and fragility that are associated with some PVDF tactile sensors arranged in matrix form. A theoretical analysis of the sensor is presented and compared with experimental results. The limitations of the sensor are also reported.
Fusion Algorithm Developments
Three-dimensional data fusion for biomedical surface reconstruction
John M. Zachary, S. Sitharama Iyengar
Traditional surface reconstruction techniques have focused exclusively on contour sections in one anatomical direction. However, in certain medical situations, such as in presurgical planning and radiation treatment, medical scans are taken of the patient in three orthogonal directions to better localize pathologies. Fusion techniques must be used to register this data with respect to a surface fitting method. We explore the issues involved in fusing data from ellipsoid anatomy, such as the brain, heart, and major organs. The output of the fusion process is a set of data points which are correlated to one another to represent the surface of a single object. This data network is then used as input to a surface fitting algorithm which depends on two sampling metrics which we derive. The solution to this problem is important in presurgical planning, radiation treatment, and telemedical systems.
Fusion System Applications
E-O and SAR image analysis and registration applications
George A. Lampropoulos, Arezki Halet, Maria T. Rey, et al.
In this paper we present an approach for the association and registration of multisensor images. With no loss of generality, we consider herein the association and registration of SAR and visible satellite images. Although the current approach as presented is a semi-automatic registration technique, there is sufficient promise for the development of a fully automatic multisensor, multispectral registration approach. Some experimental results are presented.
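A common building block for this kind of registration (shown here only as a hedged sketch; the paper's semi-automatic procedure and feature extraction are not reproduced, and the control points below are synthetic values for the demonstration) is a least-squares affine transform estimated from corresponding control points in the two images:

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares affine transform mapping control points in one image (src)
    to their correspondences in the other (dst); src, dst are (N, 2), N >= 3."""
    A = np.hstack([src, np.ones((len(src), 1))])        # rows [x, y, 1]
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)    # (3, 2) affine parameters
    return params

# Synthetic control points and a known affine warp, used only to exercise the fit.
src = np.array([[10.0, 12.0], [200.0, 30.0], [50.0, 180.0], [150.0, 160.0]])
dst = src @ np.array([[0.98, 0.02], [-0.03, 1.01]]) + np.array([5.0, -3.0])
print(estimate_affine(src, dst))
```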
Fusion Algorithm Developments
Novel approach to multispectral blind image fusion
D. Kundur, D. Hatzinakos, H. Leung
In this paper we propose a robust method of data fusion for the classification of multispectral images. The approach is novel in that it attempts to remove blurring of the images in conjunction with fusing the data. This produces a more robust and accurate overall classification scheme. The approach is applicable to situations in which registered multispectral images of the same scene are available. The novel scheme is comprised of three main stages. The first stage involves the blind restoration of the degraded multispectral images to combat blurring effects. The results are fused in the second stage with a statistical classification method which performs both pixel-level and intermediate-level classification. The classification output is then passed through a final stage which provides a relative measure of the success of the classification method. This information is fed back to the first stage to improve the reliability of the restoration method. The performance of the proposed scheme is demonstrated by applying the technique to simulated and real photographic data. The simulation results demonstrate the potential of the method for robust classification of degraded data.
Fusion System Applications
Fusion of imaging and nonimaging data for surveillance aircraft
Elisa Shahbazian, Langis Gagnon, Jean Remi Duquet, et al.
This paper describes a phased incremental integration approach for application of image analysis and data fusion technologies to provide automated intelligent target tracking and identification for airborne surveillance on board an Aurora Maritime Patrol Aircraft. The sensor suite of the Aurora consists of a radar, an identification friend or foe (IFF) system, an electronic support measures (ESM) system, a spotlight synthetic aperture radar (SSAR), a forward looking infra-red (FLIR) sensor and a link-11 tactical datalink system. Lockheed Martin Canada (LMCan) is developing a testbed, which will be used to analyze and evaluate approaches for combining the data provided by the existing sensors, which were initially not designed to feed a fusion system. Three concurrent research proof-of-concept activities provide techniques, algorithms and methodology into three sequential phases of integration of this testbed. These activities are: (1) analysis of the fusion architecture (track/contact/hybrid) most appropriate for the type of data available, (2) extraction and fusion of simple features from the imaging data into the fusion system performing automatic target identification, and (3) development of a unique software architecture which will permit integration and independent evolution, enhancement and optimization of various decision aid capabilities, such as multi-sensor data fusion (MSDF), situation and threat assessment (STA) and resource management (RM).
Intelligent fusion processing in BMD applications
Claire L. McCullough, Katherine A. Byrd, Charles A. Bjork, et al.
Intelligent processing techniques are applied to a ballistic missile defense (BMD) application, focused on classifying the objects in a typical threat complex using fused IR and ladar sensors. These techniques indicate the potential to improve designation robustness against 'off-normal'/unexpected conditions, or when sensor data or classifier performance degrades. A fused sensor discrimination (FuSeD) simulation testbed was assembled for designation experiments, to evaluate test and simulation data, assess intelligent processor and classification algorithms, and evaluate sensor performance. Results were produced for a variety of neural net and other nonlinear classifiers, yielding high designation performance and low false alarm rates. Most classifiers yield a few percent in false alarm rate; rates are further improved when multiple techniques are applied via a majority-based fusion technique. Example signatures, features, classifier descriptions, intelligent controller design, and architecture are included. Work was performed for the discriminating interceptor technology program (DITP).
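As a minimal sketch of majority-based fusion of classifier outputs (the class labels and the tie-breaking policy are assumptions, not taken from the paper):

```python
from collections import Counter

def majority_fuse(labels):
    """Majority vote over several classifiers' labels for one object;
    ties are resolved by declaring 'unknown' (an assumed policy)."""
    counts = Counter(labels).most_common()
    if len(counts) > 1 and counts[0][1] == counts[1][1]:
        return "unknown"
    return counts[0][0]

print(majority_fuse(["RV", "decoy", "RV"]))   # -> "RV"
```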
Fusion Algorithm Developments
Decision fusion using channels with communication constraints
Chao-Tang Yu, Pramod K. Varshney
A decentralized detection system usually contains a number of remotely located local sensors that observe a common phenomenon and a data fusion center that makes a final decision. The local sensors are linked to the data fusion center by transmission channels. In this paper, some aspects of decision fusion problems with communication constraints are considered. Two interesting issues, namely, bandwidth allocation among the channels linking local sensors to the fusion center, and the trade-off between the number of sensors and the number of likelihood-ratio quantization levels at local sensors, are studied. Examples are presented for illustration.
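As a small illustration of the bandwidth trade-off discussed here (the thresholds and the 2-bit channel below are assumed for the example; the paper's optimization of these quantities is not reproduced), each local sensor can quantize its log-likelihood ratio into as many levels as its channel allows before transmitting to the fusion center:

```python
import numpy as np

def quantize_llr(llr, thresholds):
    """Map a local log-likelihood ratio to one of len(thresholds)+1 levels,
    i.e. the message sent to the fusion center over a constrained channel."""
    return int(np.searchsorted(thresholds, llr))

# A 2-bit channel supports 4 levels, hence 3 quantization thresholds (assumed values).
thresholds = np.array([-1.0, 0.0, 1.0])
print(quantize_llr(0.3, thresholds))   # -> 2
```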
Fusion System Architectures and Strategies
Formal methods of automated reasoning for situational estimation
Most research and prototype development of automated methods for situational estimation in the data fusion community has applied heuristic approaches coupled to techniques for uncertainty management. Reasoning theorists would label these methods as belonging to the parametric reasoning class. Such methods are reasonable when the so-called 'closed world' assumption can be confidently applied (i.e., expected conditions can be fully pre-specified), which might have been reasonable in the 'Soviet era' but would appear fragile/brittle for current-day application. Motivated in part by these considerations and by the need to consider much more cost-effective knowledge-based system (KBS) development in an era of declining budgets, this paper offers some discussion on the applicability of more formal methods of reasoning for KBSs. It is concluded that strictly formal methods for real-world applications require further theoretical development, but that movement toward formalization is possible.
Fusion Algorithm Developments
Fusion of data from spatially separated sensors using Riemannian manifolds
In this paper, an approach for representing target classes in feature space using Riemannian manifolds is explored. In a formal application of the approach, it is required that several basic assumptions used in the development of differential and Riemannian geometry are satisfied. These assumptions relate to the concepts of allowable parametric representations and allowable coordinate transformations. Developing target class representations which satisfy these assumptions has a direct consequence on the selection of a suitable feature set. Having found a suitable feature set, the approach results in a natural coordinate system in which to calculate distance metrics used in classification algorithms. In this paper, the approach is applied to a situation where an active sensor and a passive sensor are spatially separated and are simultaneously collecting data on a set of targets. It is shown that the use of the natural coordinate system offered by this approach leads to a straightforward and mathematically rigorous method for fusing the sensor data at the feature level.
Linearly constrained least squares approach for multisensor data fusion
Yifeng Zhou, Henry Leung
In this paper, we present a linearly constrained least squares (LCLS) algorithm for multisensor data fusion. With fusion considered within the scope of linear combination, the objective of the LCLS algorithm is to minimize the energy of the linearly fused information based on empirical sensory information. A statistical performance analysis of the LCLS algorithm is carried out, including the consistency and asymptotic covariance of the estimates. The effectiveness of the proposed fusion algorithm is evaluated numerically on the fusion of signals and images.
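One common form of such a problem (a hedged sketch only; the paper's exact constraint set and derivation may differ) minimizes the energy of the fused output subject to the fusion weights summing to one, which has a closed-form solution via the empirical correlation matrix of the sensor outputs:

```python
import numpy as np

def lcls_weights(X):
    """Weights minimizing the energy of the linear combination X @ w subject to
    sum(w) = 1 (one standard LCLS form). X is (samples, sensors)."""
    R = X.T @ X / X.shape[0]            # empirical correlation of sensor outputs
    ones = np.ones(R.shape[0])
    w = np.linalg.solve(R, ones)        # w proportional to R^{-1} 1
    return w / (ones @ w)               # enforce the sum-to-one constraint

# Synthetic example: three noisy observations of the same signal.
rng = np.random.default_rng(0)
signal = rng.standard_normal(1000)
X = np.column_stack([signal + 0.3 * rng.standard_normal(1000) for _ in range(3)])
w = lcls_weights(X)
print(w, w.sum())
```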
Fusion rule estimation using vector space methods
Nageswara S. V. Rao
In a system of N sensors, each sensor is characterized by a model (formula available in paper). The problem is to estimate a fusion rule (formula available in paper), based on the sample, such that the expected square error is minimized over a family of functions F that constitutes a vector space. The function f* that minimizes the expected error cannot be computed since the underlying densities are unknown, and only an approximation f to f* is feasible. We estimate the sample size sufficient to ensure that f provides a close approximation to f* with high probability. The advantages of vector space methods are twofold: (1) the sample size estimate is a simple function of the dimensionality of F, and (2) the estimate f can be easily computed by well-known least squares methods in polynomial time. The results are applicable to the classical potential function methods and also to a recently proposed special class of sigmoidal feedforward neural networks.
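As a minimal sketch of estimating a fusion rule by least squares over a vector space of functions (here F is taken to be the linear functions of the N sensor outputs, an assumed choice; the paper allows more general vector spaces, and the data below are synthetic):

```python
import numpy as np

def fit_fusion_rule(Y, z):
    """Least-squares estimate of a fusion rule f(y) = y @ beta from a sample
    of sensor outputs Y (samples, N) and the quantity to be estimated z."""
    beta, *_ = np.linalg.lstsq(Y, z, rcond=None)
    return beta

# Synthetic sample: N = 3 sensor outputs observing the same underlying quantity.
rng = np.random.default_rng(1)
truth = rng.standard_normal(500)
Y = np.column_stack([truth + 0.2 * rng.standard_normal(500) for _ in range(3)])
beta = fit_fusion_rule(Y, truth)
print(beta)          # a new fused estimate is then Y_new @ beta
```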
Influence of coordinate transform upon measurement error of long-baseline distributed sensors system
Hongwei Cheng, Zhongkang Sun
Coordinate transformation is always concerned in data processing of long-baseline distributed sensor systems. The influence of coordinate transformation upon measurement errors of the distributed sensors system is discussed in the paper. Detail derivations of the relation equation between the original coordinate system and the transformed new one, quantitative analysis of the influence and a number of curves are also given in the paper.
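As a short, generic illustration of how measurement errors propagate through a coordinate transformation (a standard first-order result shown only for context; the paper derives the specific transformation for a long-baseline distributed system, and the sensor accuracies below are assumed), the covariance transforms through the Jacobian of the mapping:

```python
import numpy as np

def propagate_covariance(jacobian, cov):
    """First-order propagation of a measurement covariance through a
    coordinate transformation: C' = J C J^T."""
    return jacobian @ cov @ jacobian.T

# Example: polar (range, bearing) -> Cartesian (x, y) at r = 100 km, b = 30 deg.
r, b = 100e3, np.deg2rad(30.0)
J = np.array([[np.cos(b), -r * np.sin(b)],      # dx/dr, dx/db
              [np.sin(b),  r * np.cos(b)]])     # dy/dr, dy/db
C_polar = np.diag([50.0**2, np.deg2rad(0.1)**2])   # assumed range and bearing errors
print(propagate_covariance(J, C_polar))
```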
Fusion System Applications
Multiple sensor fusion for long-range surveillance: design and performance issues
Graeme Jones
This paper provides a description and detailed review of a multiple hypothesis tracking system, which handles data from two radars, and a number of other sources. An efficient method for processing detections from two (time-offset) radars, and integrating them in the multiple hypothesis framework is described. The implications of such a methodology on the tracking filter also are discussed. The paper then explains the algorithm employed for fusion of automatic dependent surveillance reports into the system, and concludes with a demonstration of some sample results.
Tracking a maneuvering target with a multisensor
Xiaoquan Song, Qi Liu, Zhongkang Sun
A fusion method for tracking a maneuvering target is presented, the fusion information is from track files of each sensor in the presence of clutter. Though the performance of this method is degraded than that of fusion from measurements, the requirements of communication and data processing ability can be effectively reduced, especially when with high density clutter. Since the common process noise correction is taken into account, the optimal track fusion method is derived.