(194i) Data-Driven State Observation through Kernel Canonical Correlation Method
2024 AIChE Annual Meeting
Computing and Systems Technology Division
10B: Advances in Process Control I
Monday, October 28, 2024 - 5:38pm to 5:54pm
While input-output data-driven control is possible without state observers, state-feedback control based on a state observer is expected to reduce the suboptimality of data-driven control performance relative to output-feedback control. Under the assumption that the given system is observable, the states are determined by the input and output measurements at both preceding and succeeding times. Kernel canonical correlation analysis (KCCA) is therefore adopted in this work to reconstruct the state sequence of a nonlinear dynamical system from both past and future data, circumventing the need for first-principles models. Specifically, using a nonlinear kernel function that maps the data into a feature space, namely a reproducing kernel Hilbert space (RKHS) [5], we employ canonical correlation analysis (CCA) to find a subspace of maximal correlation between two datasets (i.e., past and future input and output measurements). The resulting subspace is an estimate of the latent state space. The proposed approach is motivated by brief prior work [6], which focused on KCCA as a nonlinear extension of subspace identification for linear time-invariant systems.
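As an illustrative sketch of how the two datasets can be assembled from input-output trajectories, one common construction stacks windowed "past" and "future" measurements row by row (the function name, the windowing, and the use of equal past/future horizons are assumptions for illustration, not necessarily the paper's exact formulation):

```python
import numpy as np

def past_future_blocks(u, y, horizon):
    """Stack input-output data into 'past' and 'future' regressor matrices.

    Row t of `past` collects the `horizon` samples of (u, y) preceding time t;
    row t of `future` collects the `horizon` samples at and after time t.
    u, y: arrays of shape (T, nu) and (T, ny).
    Illustrative construction; equal past/future horizons are assumed.
    """
    T = u.shape[0]
    rows = range(horizon, T - horizon + 1)
    past = np.array([np.hstack([u[t - horizon:t].ravel(),
                                y[t - horizon:t].ravel()]) for t in rows])
    future = np.array([np.hstack([u[t:t + horizon].ravel(),
                                  y[t:t + horizon].ravel()]) for t in rows])
    return past, future
```

Each row of the two matrices then serves as one paired sample for the CCA step, whose canonical subspace plays the role of the latent state at the corresponding time.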
Based on the formulation via Mercer's theorem and the kernel trick, an appropriate form of regularization can further be introduced by adopting a least squares support vector machine (LS-SVM) approach to KCCA. In this formulation, the squared error is minimized with regularization on the weight vectors. The regularized KCCA is a convex optimization problem whose solution reduces to a generalized eigenvalue problem in the dual space [7].
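A minimal sketch of regularized kernel CCA as a generalized eigenvalue problem is given below, assuming Gaussian kernels and a standard ridge-type regularization of the dual problem (the specific kernel, regularization form, and function names are illustrative, not the paper's exact LS-SVM formulation):

```python
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(X, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel
    sq = np.sum(X**2, axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * D)

def center_kernel(K):
    # Center the kernel matrix in feature space
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kcca(Xp, Xf, gamma=1.0, reg=1e-3, n_comp=2):
    """Regularized kernel CCA between 'past' data Xp and 'future' data Xf.

    Returns the leading canonical correlations and the projections of the
    past data onto the canonical directions (a latent-state estimate).
    Illustrative ridge-regularized dual formulation.
    """
    n = Xp.shape[0]
    Kp = center_kernel(rbf_kernel(Xp, gamma))
    Kf = center_kernel(rbf_kernel(Xf, gamma))
    I = np.eye(n)
    # Symmetric generalized eigenvalue problem A v = rho B v in the dual space
    A = np.block([[np.zeros((n, n)), Kp @ Kf],
                  [Kf @ Kp, np.zeros((n, n))]])
    B = np.block([[(Kp + reg * I) @ (Kp + reg * I), np.zeros((n, n))],
                  [np.zeros((n, n)), (Kf + reg * I) @ (Kf + reg * I)]])
    vals, vecs = eigh(A, B)
    order = np.argsort(vals)[::-1][:n_comp]
    rho = vals[order]                       # canonical correlations
    alpha = vecs[:n, order]                 # dual coefficients for the past view
    z = Kp @ alpha                          # latent-state estimate
    return rho, z
```

For example, feeding the function two views driven by a shared latent signal yields canonical correlations near one, while the regularization parameter `reg` trades off correlation against overfitting in the RKHS.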
In addition to addressing the challenges posed by nonlinear dynamics through data-driven approaches, ensuring robust performance guarantees is vital for practical application. To this end, a probabilistic performance guarantee is proposed by leveraging Rademacher complexity to establish the statistical consistency of the KCCA algorithm. Specifically, an upper bound is estimated on the generalization error of the state observation. Application to a numerical example and an exothermic continuous stirred-tank reactor (CSTR) case study demonstrates that, given adequate data, KCCA serves as an effective, performance-guaranteed, model-free state observer.
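The flavor of such a Rademacher-complexity guarantee can be sketched in generic form (the abstract does not state the exact bound; the loss $\ell$, function class $\mathcal{F}$, and constants below are illustrative):

$$
\mathbb{E}\big[\ell(f)\big] \;\le\; \frac{1}{n}\sum_{i=1}^{n}\ell\big(f(x_i)\big) \;+\; 2\,\mathfrak{R}_n(\mathcal{F}) \;+\; 3\sqrt{\frac{\ln(2/\delta)}{2n}},
$$

holding with probability at least $1-\delta$ over the draw of $n$ samples, where $\mathfrak{R}_n(\mathcal{F})$ is the empirical Rademacher complexity of the observer's function class. Bounds of this type shrink as the dataset grows, which is what underlies the "provided adequate data" qualification.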
References:
(1) Rawlings, J. B.; Mayne, D. Q.; Diehl, M. Model Predictive Control: Theory, Computation, and Design, 2nd ed.; Nob Hill Publishing, 2017.
(2) Kravaris, C.; Hahn, J.; Chu, Y. Computers & Chemical Engineering 2013, 51, 111–123.
(3) Tang, W.; Woelk, M. In 2023 American Control Conference (ACC), 2023; pp. 3036–3041.
(4) Tang, W. AIChE Journal 2023, 69, e18224.
(5) Wahba, G.; Schölkopf, B.; Burges, C. J. C. Support vector machines, reproducing kernel Hilbert spaces and the randomized GACV. In Advances in Kernel Methods – Support Vector Learning; MIT Press, 1999; pp. 69–87.
(6) Verdult, V.; Suykens, J. A. K.; Boets, J.; Goethals, I.; De Moor, B. In Proceedings of the 16th International Symposium on Mathematical Theory of Networks and Systems (MTNS 2004), 2004.
(7) Suykens, J. A. K.; De Brabanter, J.; Lukas, L.; Vandewalle, J. Neurocomputing 2002, 48, 85–105.