(227c) Computing Lower Bounds in Global Optimization By Tractably Sampling Convex Relaxations

Authors 

Khan, K. - Presenter, McMaster University
Song, Y., McMaster University
Cao, H., McMaster University
Several chemical engineering applications demand global optimization of nonconvex process models, including safety verification and determination of thermodynamic equilibria. Deterministic methods for global optimization (implemented, for example, in the state-of-the-art solvers BARON and ANTIGONE) typically proceed by generating upper and lower bounds on the unknown objective value at an unknown global minimum, and progressively improving these bounds. Lower bounds in global minimization are typically obtained by minimizing convex relaxations of the original objective function using a local nonlinear programming (NLP) solver. In certain cases, however, this minimization may be difficult: some convex relaxations are significantly nonsmooth or noisy, which hinders the application of smooth NLP solvers, and for other convex relaxations, crucial gradient/subgradient information may be unavailable. This is the case for certain relaxations of ordinary differential equations and for new prototype convex relaxations that are not yet fully understood.

This presentation shows that, for such a convex relaxation of n variables on a box domain, guaranteed affine underestimators and lower bounds may be constructed and evaluated tractably, without requiring a local NLP solver [1]. This construction evaluates the convex relaxation at most (2n+1) times and combines the results using a new variant of centered finite differencing, without requiring any further assumptions. If the underlying convex relaxations converge rapidly to the objective function as the box domain shrinks, then the new bounds inherit this rapid convergence. Noise in the relaxation evaluations is handled easily. Several numerical examples are presented for illustration, including examples in which these new sampled relaxations are deployed effectively in nontrivial global optimization problems. Beyond addressing the obstacles described above, these developments also enable effective global optimization of sums of known nonconvex functions and black-box convex functions.
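To make the sampling idea concrete, the following Python sketch illustrates one way a guaranteed affine underestimator and lower bound can be assembled from (2n+1) black-box evaluations of a convex function on a box. It is a minimal illustration in the spirit of the approach, not the construction of [1]: here the slopes are full-step centered differences, and the offset is half the sum of the per-coordinate secant gaps (each nonnegative by convexity), which is one valid choice that guarantees underestimation for convex f. The estimator in [1] is more general, permitting tunable sampling steps and accounting for evaluation noise. The function name sampled_lower_bound and its interface are hypothetical.

```python
import numpy as np

def sampled_lower_bound(f, x_lo, x_up):
    """Affine underestimator and lower bound for a function f that is
    convex on the box [x_lo, x_up], from (2n+1) samples of f.
    Assumes a nondegenerate box (strictly positive widths)."""
    x_lo = np.asarray(x_lo, dtype=float)
    x_up = np.asarray(x_up, dtype=float)
    w0 = 0.5 * (x_lo + x_up)       # box midpoint
    delta = 0.5 * (x_up - x_lo)    # box half-widths
    n = w0.size
    f0 = f(w0)
    b = np.zeros(n)                # centered-difference slopes
    c = 0.0                        # underestimation offset
    for i in range(n):
        step = np.zeros(n)
        step[i] = delta[i]
        f_plus, f_minus = f(w0 + step), f(w0 - step)
        b[i] = (f_plus - f_minus) / (2.0 * delta[i])
        # each secant-gap term is nonnegative when f is convex along x_i
        c += 0.5 * (f_plus + f_minus - 2.0 * f0)
    # affine underestimator: L(x) = f0 + b @ (x - w0) - c <= f(x) on the box;
    # its minimum over the box sits at a corner determined by the signs of b
    lower_bound = f0 - np.abs(b) @ delta - c
    return b, c, lower_bound
```

As a quick check, applying this sketch to the convex quadratic f(x) = x[0]**2 + (x[1] - 1.0)**2 on the box [-1, 1] x [-1, 1] returns a lower bound that is below the true minimum value of 0, as the theory requires; shrinking the box around the minimizer tightens the bound.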

Reference

[1] Y. Song, H. Cao, C. Mehta, and K.A. Khan, Bounding convex relaxations of process models from below by tractable black-box sampling, under review, 2021.