The results from the first strategy (no data distribution) are shown in Figure 12.7. As expected, the system behaves poorly. Because each node operates in isolation, only Node 1 (which measures x) is fully observable. The position variance increases without bound for the three remaining nodes. Similarly, the velocity is observable for Nodes 1, 2, and 4, but it is not observable for Node 3.
The results of the second strategy (all nodes are assumed independent) are shown in Figure 12.8. The effect of the assumed independence of the observations is obvious: all of the estimates for all of the states in all of the nodes (apart from x for Node 3) are inconsistent. This clearly illustrates the problem of double counting.
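To see why double counting makes the estimates overconfident, consider the limiting case in which a node receives back a copy of its own estimate and fuses it as if it were independent. The following minimal numeric sketch (not taken from the chapter) uses the standard information-form combination of two nominally independent scalar estimates.

```python
# A single scalar estimate held by Node A: mean 1.0, variance 4.0.
x, P = 1.0, 4.0

# Node B echoes the same estimate back. Treating it as independent,
# Node A combines the two in information (inverse-variance) form:
P_fused = 1.0 / (1.0 / P + 1.0 / P)   # variance drops from 4.0 to 2.0
x_fused = P_fused * (x / P + x / P)   # mean stays at 1.0

# The reported variance has halved even though no new measurement was made,
# so the fused estimate is overconfident, i.e., inconsistent.
print(x_fused, P_fused)
```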
Finally, the results from the CI distribution scheme are shown in Figure 12.9. Unlike the other two approaches, all the nodes are consistent and observable. Furthermore, as the results in Table 12.2 indicate, the steady-state covariances of all of the states in all of the nodes are smaller than those for case 1. In other words, this example shows that this data distribution scheme successfully and usefully propagates information throughout the network.
FIGURE 12.7 Disconnected nodes. (A) Mean squared error in x. (B) Mean squared error in ẋ. (C) Mean squared error in ẍ. Mean squared errors and estimated covariances for all states in each of the four nodes. The curves for Node 1 are solid, Node 2 dashed, Node 3 dotted, and Node 4 dash-dotted. The mean squared error is the rougher of the two lines for each node.
Data Fusion in Nonlinear Systems

Simon Julier
IDAK Industries

Jeffrey K. Uhlmann
University of Missouri
13.1 Introduction
13.2 Estimation in Nonlinear Systems
Problem Statement • The Transformation of Uncertainty
13.3 The Unscented Transformation (UT)
The Basic Idea • An Example Set of Sigma Points • Properties of the Unscented Transformation
13.4 Uses of the Transformation
Polar to Cartesian Coordinates • A Discontinuous Transformation
13.5 The Unscented Filter (UF)
13.6 Case Study: Using the UF with Linearization Errors
13.7 Case Study: Using the UF with a High-Order Nonlinear System
13.8 Multilevel Sensor Fusion
13.9 Conclusions
Acknowledgments
References
13.1 Introduction
The extended Kalman filter (EKF) has been one of the most widely used methods for tracking and estimation because of its apparent simplicity, optimality, tractability, and robustness. However, after more than 30 years of experience with it, the tracking and control community has concluded that the EKF is difficult to implement, difficult to tune, and only reliable for systems that are almost linear on the time scale of the update intervals. This chapter reviews the unscented transformation (UT), a mechanism for propagating mean and covariance information through nonlinear transformations, and describes its implications for data fusion. This method is more accurate, is easier to implement, and uses the same order of calculations as the EKF. Furthermore, the UT permits the use of Kalman-type filters in applications where, traditionally, their use was not possible. For example, the UT can be used to rigorously integrate artificial intelligence-based systems with Kalman-based systems.
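The UT is described in detail in Section 13.3, but the basic mechanism is compact: a small, deterministically chosen set of sigma points is propagated through the nonlinearity, and the transformed mean and covariance are recovered as weighted sample statistics. The following Python/NumPy sketch is an illustrative implementation of the symmetric sigma-point set with the usual scaling parameter κ (not the authors' code); the function and variable names are chosen here for clarity only.

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=1.0):
    """Propagate (mean, cov) through a nonlinear function f using the
    symmetric sigma-point set of the unscented transformation."""
    n = mean.shape[0]
    # Columns of S satisfy S @ S.T = (n + kappa) * cov (Cholesky factor).
    S = np.linalg.cholesky((n + kappa) * cov)

    # 2n + 1 sigma points: the mean, plus/minus each column of S.
    sigma_points = ([mean] + [mean + S[:, i] for i in range(n)]
                           + [mean - S[:, i] for i in range(n)])

    # Weights sum to one: kappa/(n+kappa) for the central point,
    # 1/(2(n+kappa)) for each of the remaining 2n points.
    weights = np.full(2 * n + 1, 0.5 / (n + kappa))
    weights[0] = kappa / (n + kappa)

    # Push every sigma point through the nonlinearity.
    y = np.array([f(p) for p in sigma_points])

    # Recover the transformed mean and covariance as weighted statistics.
    y_mean = weights @ y
    d = y - y_mean
    y_cov = (weights[:, None] * d).T @ d
    return y_mean, y_cov

# Example: convert a (range, bearing) estimate to Cartesian coordinates.
polar_mean = np.array([1.0, np.pi / 2])
polar_cov = np.diag([0.02 ** 2, (15.0 * np.pi / 180.0) ** 2])
f = lambda p: np.array([p[0] * np.cos(p[1]), p[0] * np.sin(p[1])])
cart_mean, cart_cov = unscented_transform(polar_mean, polar_cov, f)
```

For an n-dimensional Gaussian prior, choosing κ so that n + κ = 3 is a common heuristic, and the cost of propagating 2n + 1 points is of the same order as an EKF update.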
Performing data fusion requires estimates of the state of a system to be converted to a common representation. The mean and covariance representation is the lingua franca of modern systems engineering. In particular, the covariance intersection (CI)¹ and Kalman filter (KF)² algorithms provide mechanisms for fusing state estimates defined in terms of means and covariances, where each mean vector defines the nominal state of the system and its associated error covariance matrix defines a lower bound on the squared error. However, most data fusion applications require the fusion of mean and covariance estimates defining the state of a system in different coordinate frames. For example, a tracking system may maintain its track estimates in Cartesian coordinates while its sensors report measurements in polar coordinates.
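For reference, the CI fusion rule referred to above is itself only a few lines. The sketch below is an illustrative NumPy/SciPy implementation of the standard CI equations, not code from this chapter; choosing the weight w by minimizing the determinant of the fused covariance is one common criterion (minimizing the trace is another), and the names used here are purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def covariance_intersection(xa, Pa, xb, Pb):
    """Fuse two consistent estimates with unknown cross-correlation:
         Pc^{-1}      = w * Pa^{-1}      + (1 - w) * Pb^{-1}
         Pc^{-1} @ xc = w * Pa^{-1} @ xa + (1 - w) * Pb^{-1} @ xb
    with the weight w in [0, 1] chosen to minimize det(Pc)."""
    Ia, Ib = np.linalg.inv(Pa), np.linalg.inv(Pb)

    def fused_cov(w):
        return np.linalg.inv(w * Ia + (1.0 - w) * Ib)

    w = minimize_scalar(lambda w: np.linalg.det(fused_cov(w)),
                        bounds=(0.0, 1.0), method='bounded').x
    Pc = fused_cov(w)
    xc = Pc @ (w * Ia @ xa + (1.0 - w) * Ib @ xb)
    return xc, Pc, w

# Two estimates of the same 2-D state with complementary uncertainty.
xa, Pa = np.array([1.0, 0.0]), np.diag([4.0, 1.0])
xb, Pb = np.array([1.2, 0.1]), np.diag([1.0, 4.0])
xc, Pc, w = covariance_intersection(xa, Pa, xb, Pb)
```

Unlike the KF update, the CI result remains consistent for any actual cross-correlation between the two estimates, which is what makes it suitable for the decentralized data distribution example of the preceding chapter.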
Random Set Theory for Target Tracking and Identification

Ronald Mahler
Lockheed Martin
14.1 Introduction
The “Bayesian Iceberg”: Models, Optimality, Computability • Why Multisource, Multitarget, Multi-Evidence Problems Are Tricky • Finite-Set Statistics (FISST) • Why Random Sets?
14.2 Basic Statistics for Tracking and Identification
Bayes Recursive Filtering • Constructing Likelihood Functions from Sensor Models • Constructing Markov Densities from Motion Models • Optimal State Estimators
14.3 Multitarget Sensor Models
Case I: No Missed Detections, No False Alarms • Case II: Missed Detections • Case III: Missed Detection and False Alarms • Case IV: Multiple Sensors
14.4 Multitarget Motion Models
Case I: Target Number Does Not Change • Case II: Target Number Can Decrease • Case III: Target Number Can Increase and Decrease
14.5 The FISST Multisource-Multitarget Calculus
The Belief-Mass Function of a Sensor Model • The Belief-Mass Function of a Motion Model • The Set Integral and Set Derivative • “Turn-the-Crank” Rules for the FISST Calculus
14.6 FISST Multisource-Multitarget Statistics
Constructing True Multitarget Likelihood Functions • Constructing True Multitarget Markov Densities • Multitarget Prior Distributions • Multitarget Posterior Distributions • Expected Values and Covariances • The Failure of the Classical State Estimators • Optimal Multitarget State Estimators • Cramér-Rao Bounds for Multitarget State Estimators • Multitarget Miss Distance
14.7 Optimal-Bayes Fusion, Tracking, ID
Multisensor-Multitarget Filtering Equations • A Short History of Multitarget Filtering • Computational Issues in Multitarget Filtering • Optimal Sensor Management
14.8 Robust-Bayes Fusion, Tracking, ID
Random Set Models of “Ambiguous” Data • Forms of Ambiguous Data • True Likelihood Functions for Ambiguous Data • Generalized Likelihood Functions • Posteriors Conditioned on Ambiguous Data • Practical Generalized Likelihood Functions • Unified Multisource-Multitarget Data Fusion