(598a) Towards Global Optimization on Low-Dimensional Surrogates Via Manifold Learning
2018 AIChE Annual Meeting
Computing and Systems Technology Division
Advances in Optimization with Surrogate and Mixed-Integer Models
Thursday, November 1, 2018 - 8:00am to 8:19am
In this work, we seek to conduct optimization (global as well as local) of high-dimensional black-box problems for which manifold learning may be applied to effectively reduce the problem dimensionality. Our core assumption is the existence of a low-dimensional manifold in search (parameter) space that is approached quickly by an "inner" optimization algorithm (e.g., a steepest descent method) applied to the objective function; on the manifold, local optimization is then assumed to converge much more slowly. The approach we discuss requires no information about the manifold beyond its existence. It has already been shown that identifying the underlying manifolds of the objective function, and taking steps on them, can significantly speed up traditional local optimization algorithms [4]. From a global optimization perspective, identifying low-dimensional underlying manifolds of non-convex objective functions in a high-dimensional search space has the potential to significantly decrease the required sampling and surrogate modeling effort. That is, rather than spending the majority of the time working in a high-dimensional space, the problem can be addressed on its representative low-dimensional manifold. Building a surrogate model at this coarser level may allow the use of global optimization solvers, such as ANTIGONE [5], as soon as an explicit form of the interpolant of the objective function on the manifold is constructed.
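To make the core assumption concrete, below is a minimal Python sketch (a hypothetical illustration, not the authors' implementation): a stiff toy objective whose steepest-descent trajectories are attracted quickly to the slow manifold y = x^2, along which a nonconvex term varies slowly. The objective, the Armijo backtracking line search, and all parameters are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not from the paper): a stiff
# toy objective whose steepest-descent trajectories collapse quickly onto
# the slow manifold y = x**2, where a nonconvex term varies slowly.
import numpy as np

def objective(z):
    x, y = z
    return 50.0 * (y - x**2) ** 2 + np.cos(3.0 * x) + 0.1 * x**2

def gradient(z):
    x, y = z
    r = y - x**2
    return np.array([-200.0 * x * r - 3.0 * np.sin(3.0 * x) + 0.2 * x,
                     100.0 * r])

def burst(z0, steps=20):
    """One brief spurt of steepest descent with Armijo backtracking."""
    z = np.array(z0, dtype=float)          # copy so the sample is untouched
    for _ in range(steps):
        g = gradient(z)
        fz, t = objective(z), 1.0
        while objective(z - t * g) > fz - 1e-4 * t * (g @ g):
            t *= 0.5                       # backtrack to guarantee descent
        z -= t * g
    return z

rng = np.random.default_rng(0)
samples = rng.uniform(-2.0, 2.0, size=(200, 2))   # sample the fiber space
on_manifold = np.array([burst(z) for z in samples])
# The transverse residual |y - x**2| should now be small for every point:
print(np.abs(on_manifold[:, 1] - on_manifold[:, 0] ** 2).max())
```

Each burst plays the role of the "inner" optimizer: it resolves the fast, transverse directions while barely moving along the manifold itself.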
The approach presented begins with the identification of the lower-dimensional manifold through brief spurts of local optimization started at sample points in the high-dimensional space. Because we assume that the manifold is strongly attracting for, say, steepest descent methods, we only need to sample the space of fibers over the manifold, not the full high-dimensional space. After fast convergence towards the manifold, Diffusion Maps are applied to the points close to the manifold, identifying a ("nonlinear") parametrization [6]. A surrogate model may then be built on this low-dimensional manifold, enabling the use of standard global optimization algorithms, and the optimal solutions may finally be lifted back from the diffusion map coordinates to the original high-dimensional representation. We apply this algorithm to multiple examples to explore its efficiency and effectiveness for this class of problems.
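Continuing the sketch above (reusing its `objective` and `on_manifold`), the following illustrates the remaining pipeline. The diffusion map follows the standard kernel construction of Coifman & Lafon [6]; the RBF surrogate, the dense grid search (a stand-in for an explicit global solver such as ANTIGONE), and the nearest-neighbor lifting are simplified, hypothetical choices rather than the authors' implementation.

```python
# Continues the sketch above: diffusion-map parametrization, surrogate on
# the manifold coordinate, 1-D "global" search, and a crude lift back.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

X = on_manifold                          # points near the manifold, (n, 2)
f_vals = np.array([objective(z) for z in X])

# --- Diffusion map: one nontrivial coordinate (Coifman & Lafon [6]) ---
D2 = cdist(X, X, "sqeuclidean")
eps = np.median(D2)                      # a common heuristic bandwidth
W = np.exp(-D2 / eps)
q = W.sum(axis=1)
W = W / np.outer(q, q)                   # alpha = 1: remove density effects
d = W.sum(axis=1)
S = W / np.sqrt(np.outer(d, d))          # symmetric conjugate of the Markov matrix
vals, vecs = eigh(S)                     # ascending eigenvalues
phi = vecs[:, -2] / vecs[:, -1]          # first nontrivial diffusion coordinate

# --- Surrogate model of the objective on the manifold coordinate ---
surrogate = RBFInterpolator(phi[:, None], f_vals, smoothing=1e-8)

# --- "Global" optimization of the 1-D surrogate by dense grid search ---
grid = np.linspace(phi.min(), phi.max(), 2001)[:, None]
phi_star = grid[np.argmin(surrogate(grid))]

# --- Lift back to ambient space: nearest burst endpoint in phi ---
z_star = X[np.argmin(np.abs(phi - phi_star[0]))]
print("candidate minimizer:", z_star, "objective:", objective(z_star))
```

In a full implementation, the lifting step would typically use a Nyström extension or geometric harmonics rather than the nearest burst endpoint, and it is the explicit algebraic form of the interpolant that would be handed to a deterministic global solver.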
[1] Boukouvala, F., & Floudas, C. A. (2017). ARGONAUT: AlgoRithms for Global Optimization of coNstrAined grey-box compUTational problems. Optimization Letters, 11(5), 895-913.
[2] Rios, L. M., & Sahinidis, N. V. (2013). Derivative-free optimization: a review of algorithms and comparison of software implementations. Journal of Global Optimization, 56(3), 1247-1293.
[3] Boukouvala, F., Misener, R., & Floudas, C. A. (2016). Global optimization advances in mixed-integer nonlinear programming, MINLP, and constrained derivative-free optimization, CDFO. European Journal of Operational Research, 252(3), 701-727.
[4] Pozharskiy, D., et al. Manifold Learning for Coarse-Grained Optimization, in preparation.
[5] Misener, R., & Floudas, C. A. (2014). ANTIGONE: algorithms for continuous/integer global optimization of nonlinear equations. Journal of Global Optimization, 59(2-3), 503-526.
[6] Coifman, R. R., & Lafon, S. (2006). Diffusion maps. Applied and Computational Harmonic Analysis, 21(1), 5-30.