(523b) A Feasible Generalized Least-Squares Approach to Disturbance Covariance Identification

Authors 

Zagrobelny, M. A. - Presenter, University of Wisconsin-Madison
Rawlings, J. B., University of Wisconsin-Madison

Identifying disturbance covariances has application both to estimator design and controller performance monitoring. Knowledge of the process and measurement noise covariances is required to find the optimal gain for the Kalman filter for linear systems. An accurate disturbance model is also necessary to calculate the theoretical benchmark for controller performance monitoring. The achieved performance can be compared to this optimal benchmark [3]. For a linear system, the disturbances are modeled as zero-mean white process and measurement noises. Their covariances can be estimated using autocovariance least-squares (ALS). ALS uses the input-output data, as well as any stable estimator, to calculate the sample autocovariances of the prediction error [1, 2]. The unknown covariance matrices are estimated by solving a least-squares problem to fit the theoretical autocovariances to the sample autocovariances.
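As a minimal sketch of the data step underlying ALS, the snippet below computes sample autocovariances of the innovation (prediction error) sequence produced by any stable estimator. The function name and interface are illustrative assumptions, not the published implementation.

```python
import numpy as np

def sample_autocovariances(innovations, max_lag):
    """Sample autocovariances C_k = (1/(N-k)) * sum_t e_{t+k} e_t^T of the
    innovation sequence.

    innovations : (N, p) array, one innovation vector per time step
    max_lag     : largest lag k to compute
    """
    e = np.asarray(innovations)
    N, p = e.shape
    covs = []
    for k in range(max_lag + 1):
        Ck = sum(np.outer(e[t + k], e[t]) for t in range(N - k)) / (N - k)
        covs.append(Ck)
    return covs  # list of (p, p) matrices, one per lag

# The ALS step then stacks these sample autocovariances into a vector b and
# solves a least-squares problem min_x ||A x - b||^2 for the unknown elements
# x of the noise covariances, where A maps those elements to the theoretical
# autocovariances of the innovations.
```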
In the current ALS method, the least-squares problem is weighted with the identity matrix. Although this method is simple to use, the choice of weighting has no theoretical basis and in general does not provide the minimum variance estimator. The generalized least-squares estimator would give the true minimum variance estimates, but it requires knowledge of the variance of the data, in this case the sample autocovariances [2]. The theoretical formula for this variance, given in Rajamani and Rawlings [2], becomes increasingly complicated due to the correlations between the elements of the sample autocovariances. As a result of these correlations, the theoretical variance is intractable to calculate for systems with a large number of data points. In addition to intractability, calculating this variance requires knowledge of the unknown process and measurement noise covariance matrices, and therefore would require an iterative scheme.
As an alternative to generalized least-squares, feasible generalized least-squares estimates the variance used to weight the least-squares problem. For application to ALS, the variance of the autocovariances can be approximated from data. In this approach, the data are divided into several sections, the autocovariances are calculated for each section, and the sample variance of these autocovariances is calculated. The inverse of this sample variance is then used to weight the least-squares portion of the problem. As a purely data-based approximation, this method eliminates the need for knowledge of the unknown disturbance covariances and no longer faces computational limitations. This approach is shown to significantly reduce the variance of the ALS results and thus provide a more reliable estimate.
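A hedged sketch of that data-based weighting is given below: the innovation record is split into sections, a stacked autocovariance vector is formed for each section, and the inverse of their sample covariance weights the least-squares fit. The helper sample_autocovariances is the sketch above; fgls_weight and its interface are assumptions for illustration only.

```python
import numpy as np

def fgls_weight(innovations, max_lag, n_sections):
    """Approximate the feasible generalized least-squares weighting matrix
    from data by sectioning the innovation sequence."""
    e = np.asarray(innovations)
    blocks = np.array_split(e, n_sections)
    b_samples = []
    for blk in blocks:
        covs = sample_autocovariances(blk, max_lag)          # per-section autocovariances
        b_samples.append(np.concatenate([C.ravel() for C in covs]))
    B = np.array(b_samples)            # (n_sections, m) stacked autocovariance vectors
    S = np.cov(B, rowvar=False)        # sample variance of the autocovariances
    # Weighted least squares: x_hat = argmin (A x - b)^T W (A x - b), with W = S^{-1}.
    # A pseudoinverse guards against S being singular when few sections are used.
    W = np.linalg.pinv(S)
    return W
```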
The covariances from the feasible generalized ALS technique are applied to estimator design. As shown by the innovations, the resulting Kalman filter behaves optimally over multiple data sets. These covariances can also be applied to produce a realistic benchmark for controller performance monitoring.
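For the estimator-design application, one way the estimated covariances might be turned into a filter gain is sketched below, assuming the process noise enters the state directly (G = I). It uses SciPy's discrete algebraic Riccati solver and is an illustration under those assumptions, not the paper's procedure.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

def kalman_gain(A, C, Qw, Rv):
    """Steady-state Kalman filter gain for x+ = A x + w, y = C x + v,
    with cov(w) = Qw and cov(v) = Rv (e.g. the ALS estimates).

    Solves the filtering Riccati equation via its dual form and returns the
    gain L used in the measurement update xhat = xpred + L (y - C xpred)."""
    P = solve_discrete_are(A.T, C.T, Qw, Rv)          # a priori error covariance
    L = P @ C.T @ np.linalg.inv(C @ P @ C.T + Rv)     # steady-state filter gain
    return L
```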

References

[1] B. J. Odelson, M. R. Rajamani, and J. B. Rawlings. A new autocovariance least-squares method for estimating noise covariances. Automatica, 42(2):303-308, February 2006.
[2] M. R. Rajamani and J. B. Rawlings. Estimation of the disturbance structure from data using semidefinite programming and optimal weighting. Automatica, 45:142-148, 2009.
[3] M. A. Zagrobelny, L. Ji, and J. B. Rawlings. Quis custodiet ipsos custodes? Annual Rev. Control, 37:260-270, 2013.