(40h) EAGO.jl: Next Generation Global & Robust Optimization in Julia, Revisited
AIChE Annual Meeting
2021
Computing and Systems Technology Division
Software Tools and Implementations for Process Systems Engineering
Sunday, November 7, 2021 - 5:15pm to 5:30pm
EAGO.jl's new modular directed-acyclic multigraph (DAMG) backend provides improved support for extensive user customization. We discuss how this backend enables user-defined composite relaxations and the calculation of gradient information. More importantly, it standardizes a number of tree-walking features present in EAGO.jl, including: common subexpression elimination, automatic differentiation [17], disciplined convex programming [18], forward-reverse propagation of relaxations, and set-valued enclosures of intermediate subexpressions [9, 19, 20]. This is coupled with novel functionality in the broader EAGO.jl package allowing for simplified detection and reformulation of hidden conic forms. Moreover, the unified DAMG representation naturally accommodates both a reduced-space model and an auxiliary variable representation. We exploit this dual representation to develop preliminary results that address a central question surrounding factorable programming approaches: "When might auxiliary variables serve to improve solver performance and reliability?" [6, 11]. In addition to this key revision of EAGO.jl's backend, several improvements have been made to support specialized model forms.
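The forward-reverse propagation mentioned above can be illustrated in miniature. The following Python sketch is an illustration only (EAGO.jl itself is written in Julia, and this `Interval` class and function name are not its API): it propagates interval bounds forward through a single product node y = x1·x2 and then reverse-propagates a constraint on y back onto the operands, in the spirit of the DAG-based methods of [9, 20].

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __truediv__(self, other):
        # valid only when 0 lies outside `other`, which holds in the example below
        p = [self.lo / other.lo, self.lo / other.hi,
             self.hi / other.lo, self.hi / other.hi]
        return Interval(min(p), max(p))

    def intersect(self, other):
        return Interval(max(self.lo, other.lo), min(self.hi, other.hi))

def forward_reverse_mul(x1, x2, y_bound):
    # forward pass: enclose y = x1 * x2, then intersect with the constraint bound
    y = (x1 * x2).intersect(y_bound)
    # reverse pass: push the tightened enclosure of y back onto x1 and x2
    x1 = x1.intersect(y / x2)
    x2 = x2.intersect(y / x1)
    return x1, x2, y

# the constraint x1 * x2 in [0, 2] tightens both operand domains from [1, 4] to [1, 2]
x1, x2, y = forward_reverse_mul(Interval(1, 4), Interval(1, 4), Interval(0, 2))
```

In a full DAMG backend the same forward-then-reverse sweep runs over every node of the expression graph, and the contracted node enclosures in turn tighten the relaxations built on top of them.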
EAGO.jl now extends the JuMP modeling language to provide seamless support for solving semi-infinite programs and models with multi-input, multi-output subexpressions. This has allowed users to address embedded hybrid neural network models [19] as well as dynamical systems. We discuss our recent theoretical advances in constructing relaxations of both embedded hybrid models [19] and dynamical systems, which have recently been incorporated into EAGO.jl's core functionality. We conclude by illustrating EAGO.jl's flexibility and performance on case studies from the literature.
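To give a rough sense of how semi-infinite programs are handled by discretization-style algorithms, the toy Python sketch below runs a Blankenship–Falk-type loop over finite grids. The grids, tolerance, and function names are purely illustrative assumptions; EAGO.jl's actual SIP machinery solves the outer and lower-level subproblems to global optimality rather than by enumeration.

```python
def solve_sip_toy(f, g, xs, ps, tol=1e-8):
    """Toy Blankenship-Falk discretization loop for
    min f(x)  s.t.  g(x, p) <= 0 for all p, over finite grids xs and ps."""
    P_k = [ps[-1]]  # initial finite discretization of the index set
    while True:
        # outer problem: best x feasible for the current discretization P_k
        feasible = [x for x in xs if all(g(x, p) <= tol for p in P_k)]
        x_k = min(feasible, key=f)
        # lower-level problem: most-violated parameter at x_k
        p_k = max(ps, key=lambda p: g(x_k, p))
        if g(x_k, p_k) <= tol:
            return x_k  # x_k satisfies the semi-infinite constraint
        P_k.append(p_k)  # refine the discretization and repeat

# min -x  s.t.  x^2 - p <= 0 for every p in [1, 2]; the binding parameter is p = 1
x_star = solve_sip_toy(lambda x: -x,
                       lambda x, p: x * x - p,
                       xs=[0.0, 0.5, 1.0, 1.25, 1.5],
                       ps=[1.0, 1.5, 2.0])
# x_star == 1.0
```

The loop first accepts x = 1.25 under the coarse discretization, detects the violated parameter p = 1, refines, and converges to x = 1.0.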
REFERENCES
[1] Horst, R. and Tuy, H. Global Optimization: Deterministic Approaches. Springer Berlin Heidelberg, 2013.
[2] Floudas, Christodoulos A., and Chrysanthos E. Gounaris. "A review of recent advances in global optimization." Journal of Global Optimization 45.1 (2009): 3-38.
[3] Tawarmalani, Mohit, and Nikolaos V. Sahinidis. "A polyhedral branch-and-cut approach to global optimization." Mathematical Programming 103.2 (2005): 225-249.
[4] Misener, Ruth, and Christodoulos A. Floudas. "ANTIGONE: algorithms for continuous/integer global optimization of nonlinear equations." Journal of Global Optimization 59.2-3 (2014): 503-526.
[5] Matthew E. Wilhelm and Matthew D. Stuber. "EAGO.jl: easy advanced global optimization in Julia." Optimization Methods and Software, 1-26, 2020.
[6] Mitsos, A., Chachuat, B., and Barton, P.I. "McCormick-Based Relaxations of Algorithms." SIAM Journal on Optimization, 20(2): 573-601, 2009.
[7] Scott, J.K., Stuber, M.D., and Barton, P.I. "Generalized McCormick Relaxations." Journal of Global Optimization, 51:569-606, 2011.
[8] Kamil A. Khan, Harry AJ Watson, and Paul I. Barton. "Differentiable McCormick relaxations." Journal of Global Optimization 67.4 (2017): 687-729.
[9] Wechsung, Achim, Joseph K. Scott, Harry AJ Watson, and Paul I. Barton. "Reverse propagation of McCormick relaxations." Journal of Global Optimization 63.1 (2015): 1-36.
[10] Angelos Tsoukalas and Alexander Mitsos. "Multivariate McCormick relaxations." Journal of Global Optimization, 59: 633-662, 2014.
[11] Jaromír Najman, Dominik Bongartz, and Alexander Mitsos. "Linearization of McCormick relaxations and hybridization with the auxiliary variable method." Journal of Global Optimization, 1-26, 2021.
[12] Jeff Bezanson, Alan Edelman, Stefan Karpinski, and Viral B. Shah. "Julia: A fresh approach to numerical computing." SIAM review 59.1 (2017): 65-98.
[13] Dunning, Iain, Joey Huchette, and Miles Lubin. "JuMP: A modeling language for mathematical optimization." SIAM review 59.2 (2017): 295-320.
[14] Benoit Legat, Oscar Dowson, Joaquim Garcia, and Miles Lubin. "MathOptInterface: a data structure for mathematical optimization problems." INFORMS Journal on Computing. In Press.
[15] Mike Innes. "Flux: Elegant machine learning with Julia." Journal of Open Source Software 3.25 (2018): 602.
[16] Rackauckas, Christopher, et al. "Universal differential equations for scientific machine learning." arXiv preprint arXiv:2001.04385 (2020).
[17] Khan, Kamil A., and Paul I. Barton. "A vector forward mode of automatic differentiation for generalized derivative evaluation." Optimization Methods and Software 30.6 (2015): 1185-1212.
[18] Michael Grant, Stephen Boyd, and Yinyu Ye. "Disciplined convex programming." Global Optimization. Springer, Boston, MA, 2006. 155-210.
[19] Matthew E. Wilhelm, Chenyu Wang, and Matthew D. Stuber. "Robust Optimization with Hybrid First-Principles Data-Driven Models." Under review.
[20] Schichl, Hermann, and Arnold Neumaier. "Interval analysis on directed acyclic graphs for global optimization." Journal of Global Optimization 33.4 (2005): 541-562.
[21] Jaromír Najman and Alexander Mitsos. "Tighter McCormick relaxations through subgradient propagation." Journal of Global Optimization, 75(3): 565-593, 2019.