(598a) Towards Global Optimization on Low-Dimensional Surrogates Via Manifold Learning

Authors 

Matthews, L. R. - Presenter, Texas A&M University
Dietrich, F., Johns Hopkins University
Pozharskiy, D., Princeton University
Kevrekidis, I. G., Princeton University
Global optimization typically requires explicit forms of the objective function and constraints in order to compute valid lower and upper bounds on the solution. It is therefore challenging to perform global optimization for problems in which the objective function or constraints are not known in explicit form, but are available only through sampling a simulation, a black box, or a grey box [1-3]. In such cases, global optimization can only be performed on surrogate models constructed from the data, not on the original problem. These problems become increasingly difficult to solve as dimensionality grows: obtaining enough samples to build meaningful surrogate models becomes expensive, and efficient methods for solving such problems will be required for many applications in the future.

In this work, we seek to perform optimization (global as well as local) of high-dimensional black-box problems for which manifold learning can effectively reduce the problem dimensionality. Our core assumption is the existence of a low-dimensional manifold in the search (parameter) space that is approached quickly by an “inner” optimization algorithm (e.g., a steepest-descent method) applied to the objective function; on the manifold itself, local optimization is assumed to converge much more slowly. The approach we discuss requires no information about the manifold beyond its existence. It has already been shown that identifying the underlying manifolds of an objective function, and taking steps on them, can substantially accelerate traditional local optimization algorithms [4]. From a global optimization perspective, identifying low-dimensional underlying manifolds of non-convex objective functions in a high-dimensional search space has the potential to significantly reduce the required sampling and surrogate-modeling effort. That is, rather than spending most of the computation in the high-dimensional space, the problem can be addressed through its representative low-dimensional manifold. Building a surrogate model at this coarser level may allow the use of deterministic global optimization solvers, such as ANTIGONE [5], as soon as an explicit form of the interpolant of the objective function on the manifold is constructed.
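To make this core assumption concrete, consider the following toy objective of our own construction (purely illustrative, not taken from this work): a stiff transverse term attracts steepest-descent iterates rapidly onto the one-dimensional curve x2 = sin(x1), while a shallow, multi-modal term varies slowly along that curve.

    import numpy as np

    # Hypothetical toy objective: the transverse term is stiff (EPS << 1),
    # so gradient descent collapses quickly onto the curve x2 = sin(x1);
    # the remaining terms are slow and multi-modal along that curve.
    EPS = 1e-2

    def objective(x):
        x1, x2 = x
        transverse = (x2 - np.sin(x1)) ** 2 / EPS   # fast attraction to manifold
        along = np.cos(3.0 * x1) + 0.1 * x1 ** 2    # slow, non-convex on manifold
        return transverse + along

    def grad(x):
        x1, x2 = x
        d = x2 - np.sin(x1)
        return np.array([
            -2.0 * d * np.cos(x1) / EPS - 3.0 * np.sin(3.0 * x1) + 0.2 * x1,
            2.0 * d / EPS,
        ])

    # A brief spurt of steepest descent lands near the manifold long before
    # the iterate approaches any (local) minimizer.
    rng = np.random.default_rng(0)
    x = rng.uniform(-3.0, 3.0, size=2)
    for _ in range(50):
        x = x - 1e-3 * grad(x)                      # small step: stiff problem
    print("distance to manifold:", abs(x[1] - np.sin(x[0])))

After roughly fifty steps the transverse distance has contracted by a factor of about 0.8 per step (the stiff direction has curvature 2/EPS = 200 and the step size is 1e-3), while the position along the curve has barely moved; this time-scale separation is exactly what the approach exploits.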

The approach presented begins with the identification of the lower-dimensional manifold through brief spurts of a local optimization algorithm started at sample points in the high-dimensional space. Because we assume the manifold is strongly attracting for, say, steepest-descent methods, we only need to sample the space of fibers over the manifold, not the full, high-dimensional space. After fast convergence towards the manifold, Diffusion Maps are applied to the points close to the manifold, identifying a (“nonlinear”) parametrization [6]. A surrogate model may then be built on this low-dimensional manifold, enabling the use of standard global optimization algorithms, and the optimal solutions may finally be lifted back from the diffusion-map coordinates to the original high-dimensional coordinates. We apply this algorithm to multiple examples to explore its efficiency and effectiveness for this class of problems.
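A minimal sketch of one possible realization of this pipeline follows, reusing the toy objective and grad from the sketch above. The restriction step, the choice of a polynomial surrogate, the brute-force grid search (standing in for a deterministic solver such as ANTIGONE), and the nearest-neighbor lifting are all our own simplifying assumptions, not the authors' implementation.

    import numpy as np

    def descend(f_grad, x0, steps=50, lr=1e-3):
        # Brief spurt of steepest descent: enough to reach the attracting
        # manifold, deliberately not enough to converge to a minimizer.
        x = x0.copy()
        for _ in range(steps):
            x = x - lr * f_grad(x)
        return x

    def diffusion_maps(X, eps, n_coords=1):
        # Basic diffusion-map embedding (Coifman & Lafon [6]), alpha = 1.
        sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        K = np.exp(-sq / eps)
        q = K.sum(axis=1)
        K = K / np.outer(q, q)                   # density normalization
        P = K / K.sum(axis=1, keepdims=True)     # row-stochastic Markov matrix
        vals, vecs = np.linalg.eig(P)
        order = np.argsort(-vals.real)
        return vecs[:, order[1:1 + n_coords]].real   # skip trivial eigenvector

    # 1) Sample the space and run short descent spurts so the samples
    #    collapse onto (a neighborhood of) the manifold.
    rng = np.random.default_rng(1)
    X = np.array([descend(grad, x0) for x0 in rng.uniform(-3, 3, size=(200, 2))])
    y = np.array([objective(x) for x in X])

    # 2) Parametrize the manifold by the leading diffusion coordinate.
    phi = diffusion_maps(X, eps=0.5).ravel()
    phi = phi / np.abs(phi).max()                # rescale for a well-conditioned fit

    # 3) Fit an explicit one-dimensional surrogate; its closed form is what
    #    a deterministic global solver would receive.
    surrogate = np.poly1d(np.polyfit(phi, y, deg=8))

    # 4) Optimize the surrogate globally (brute force here), then lift the
    #    optimizer back to the ambient space by nearest neighbor in diffusion space.
    grid = np.linspace(phi.min(), phi.max(), 2001)
    phi_star = grid[np.argmin(surrogate(grid))]
    x_star = X[np.argmin(np.abs(phi - phi_star))]
    print("candidate optimum:", x_star, "objective value:", objective(x_star))

The nearest-neighbor lifting used here is the crudest possible choice; Nyström extension or geometric harmonics would be more faithful ways to map diffusion coordinates back to the ambient space.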

  1. Boukouvala, F., & Floudas, C. A. (2017). ARGONAUT: AlgoRithms for Global Optimization of coNstrAined grey-box compUTational problems. Optimization Letters, 11(5), 895-913.
  2. Rios, L. M., & Sahinidis, N. V. (2013). Derivative-free optimization: a review of algorithms and comparison of software implementations. Journal of Global Optimization, 56(3), 1247-1293.
  3. Boukouvala, F., Misener, R., & Floudas, C. A. (2016). Global optimization advances in mixed-integer nonlinear programming, MINLP, and constrained derivative-free optimization, CDFO. European Journal of Operational Research, 252(3), 701-727.
  4. Pozharskiy, D., et al. Manifold learning for coarse-grained optimization. In preparation.
  5. Misener, R., & Floudas, C. A. (2014). ANTIGONE: algorithms for continuous/integer global optimization of nonlinear equations. Journal of Global Optimization, 59(2-3), 503-526.
  6. Coifman, R. R., & Lafon, S. (2006). Diffusion maps. Applied and computational harmonic analysis, 21(1), 5-30.