(712e) BO4IO: A Bayesian Optimization Approach to Inverse Optimization with Uncertainty Quantification
2024 AIChE Annual Meeting
Computing and Systems Technology Division
10C: Data-driven Optimization
Thursday, October 31, 2024 - 4:54pm to 5:15pm
In this work, we propose a Bayesian optimization [7] framework for solving general IOPs, which we call BO4IO. We treat the loss function of the IOP as a black box and approximate it with a nonparametric probabilistic surrogate using Gaussian process (GP) regression [8]. The algorithm seeks the parameter estimates that minimize the decision loss by iteratively selecting candidate solutions through the maximization of an acquisition function. The key advantage of BO4IO is that, at each iteration, it only requires evaluating the loss function at the current parameter estimates; this can be done by directly solving the FOP for every data point, which circumvents the need for a complex reformulation or a specialized algorithm. Consequently, BO4IO remains applicable even when the FOP is nonlinear and nonconvex, and it can accommodate both continuous and discrete decision variables.
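To make this loop concrete, the following minimal sketch illustrates the kind of procedure described above using off-the-shelf tools (scipy, scikit-learn). The forward problem, decision loss, data, acquisition function, and all names here are illustrative assumptions; the abstract does not specify the actual BO4IO implementation.

```python
# Minimal BO4IO-style loop (sketch). The FOP, decision loss, and data are
# placeholders chosen only to make the example self-contained and runnable.
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def solve_fop(theta, context):
    """Forward problem: min_x theta[0]*(x - context)^2 + theta[1]*x^2 (toy QP)."""
    obj = lambda x: theta[0] * (x[0] - context) ** 2 + theta[1] * x[0] ** 2
    return minimize(obj, x0=[0.0]).x

def decision_loss(theta, observations):
    """Sum of squared deviations between predicted and observed decisions."""
    return sum(np.sum((solve_fop(theta, c) - x_obs) ** 2)
               for c, x_obs in observations)

# Synthetic observations generated with a "true" theta (for illustration only).
theta_true = np.array([1.0, 0.5])
observations = [(c, solve_fop(theta_true, c)) for c in rng.uniform(-2, 2, 10)]

# Bayesian optimization over theta in the box [0, 2]^2.
bounds = np.array([[0.0, 2.0], [0.0, 2.0]])
Theta = rng.uniform(bounds[:, 0], bounds[:, 1], size=(5, 2))   # initial design
L = np.array([decision_loss(t, observations) for t in Theta])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(20):
    gp.fit(Theta, L)                                   # refit GP surrogate of the loss
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(500, 2))
    mu, sigma = gp.predict(cand, return_std=True)
    theta_next = cand[np.argmin(mu - 2.0 * sigma)]     # lower-confidence-bound acquisition
    L = np.append(L, decision_loss(theta_next, observations))
    Theta = np.vstack([Theta, theta_next])

print("best estimate:", Theta[np.argmin(L)], "loss:", L.min())
```

Note that each loss evaluation only requires solving the FOP once per data point with the candidate parameters, which is exactly why no reformulation of the bilevel IOP is needed.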
BO4IO also enables us to address another major challenge in IO that is often overlooked, namely the quantification of uncertainty with respect to the estimated parameters. In IO, multiple parameter estimates often lead to the same loss; these "equally good" estimates form the so-called inverse-feasible set, whose size can be viewed as a measure of uncertainty. A common approach to uncertainty quantification in parameter estimation is to derive sample-based confidence intervals for the point estimates based on the profile likelihood method [9]. We find that in BO4IO, the posterior of the GP surrogate can be used to approximate the profile likelihood, which provides uncertainty quantification and insights into the identifiability of individual model parameters.
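As a rough illustration of the mechanics (not the actual BO4IO procedure), one can profile a single parameter by fixing it on a grid and minimizing the GP posterior mean of the loss over the remaining parameters. The sketch below reuses the gp and bounds from the previous block; the function name, grid, and inner random search are all assumptions made for illustration, and the mapping from surrogate loss to a confidence threshold is problem-specific and not covered here.

```python
# Approximate profile of one parameter using the GP posterior mean as a
# cheap surrogate for the decision loss (illustrative sketch only).
def approximate_profile(gp, bounds, fixed_index, grid_size=50, n_inner=1000,
                        rng=np.random.default_rng(1)):
    grid = np.linspace(bounds[fixed_index, 0], bounds[fixed_index, 1], grid_size)
    profile = []
    for value in grid:
        # Fix theta[fixed_index] = value and minimize the surrogate loss over
        # the remaining parameters by random search.
        cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_inner, len(bounds)))
        cand[:, fixed_index] = value
        profile.append(gp.predict(cand).min())
    return grid, np.array(profile)

grid, prof = approximate_profile(gp, bounds, fixed_index=0)
# A flat profile suggests the parameter is practically non-identifiable;
# a sharp, well-defined minimum indicates good identifiability.
```

In the toy FOP above, the optimal decision depends only on the ratio theta[0]/theta[1], so the individual parameters are identifiable only up to scaling and their approximate profiles are correspondingly flat, which is the kind of behavior the profile-likelihood approximation is meant to reveal.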
To assess the efficacy of the proposed BO4IO approach, we perform computational case studies covering various classes of FOPs from convex nonlinear to mixed-integer nonlinear and nonconvex programs. The computational experiments demonstrate that BO4IO can efficiently identify estimates of the unknown model parameters with a relatively small number of iterations. In addition, using the approximate profile likelihood, we can assess how the identifiability of certain parameters changes with the number and quality of observations.
References
[1] T. C. Y. Chan, R. Mahmood, and I. Y. Zhu, "Inverse optimization: Theory and applications," Oper. Res., 2023.
[2] A. Keshavarz, Y. Wang, and S. Boyd, "Imputing a convex objective function," IEEE Int. Symp. Intell. Control - Proc., pp. 613–619, 2011.
[3] T. C. Y. Chan, T. Lee, and D. Terekhov, "Inverse optimization: Closed-form solutions, geometry, and goodness of fit," Manage. Sci., vol. 65, no. 3, pp. 1115–1135, 2019.
[4] L. Wang, "Cutting plane algorithms for the inverse mixed integer linear programming problem," Oper. Res. Lett., vol. 37, no. 2, pp. 114–116, 2009.
[5] R. Gupta and Q. Zhang, "Decomposition and adaptive sampling for data-driven inverse linear optimization," INFORMS J. Comput., 2022.
[6] R. Gupta and Q. Zhang, "Efficient learning of decision-making models: A penalty block coordinate descent algorithm for data-driven inverse optimization," Comput. Chem. Eng., vol. 170, p. 108123, 2023.
[7] B. Shahriari, K. Swersky, Z. Wang, R. P. Adams, and N. de Freitas, "Taking the human out of the loop: A review of Bayesian optimization," Proc. IEEE, vol. 104, no. 1, pp. 148–175, 2016.
[8] C. E. Rasmussen and C. K. I. Williams, Gaussian Processes for Machine Learning. The MIT Press, 2005.
[9] A. Raue et al., "Structural and practical identifiability analysis of partially observed dynamical models by exploiting the profile likelihood," Bioinformatics, vol. 25, no. 15, pp. 1923–1929, 2009.