(216f) Advances in EAGO.jl: Updates, Dynamics, and Parallelism

Authors 

Gottlieb, R. - Presenter, University of Connecticut
Alston, D., University of Connecticut
Xu, P., University of Connecticut
Stuber, M., University of Connecticut
Mathematical optimization is an essential tool for process systems engineers; yet, for many researchers and practitioners, problems of interest often require advanced formulations and complex user-defined functions. For this reason, EAGO.jl was released six years ago as the first open-source global optimizer for general nonlinear programs (NLPs) designed to handle functions defined both in an algebraic modeling language (AML) and directly in user scripts [1, 2]. Since its release, EAGO has continued to mature as a software package, incorporating recent advances in reduced-space McCormick relaxations [3–6], a variety of domain-reduction techniques [7–9], improved linearization methods [10] that also appear in the C++ optimizer MAiNGO [11], and specialized tools for handling the composite bilinear terms that are ubiquitous in NLP formulations [12]. EAGO also benefits from the unique features and high performance of the Julia programming language [13], as well as from a diverse set of active, integrated development communities covering optimization (e.g., JuMP-dev [14] and MathOptInterface [15]), machine learning (e.g., SciML [16, 17] and Flux [18, 19]), and numerical analysis more broadly.
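
As a brief, hedged illustration of the McCormick relaxation machinery cited above [3–6], the following sketch evaluates convex and concave relaxations of a small factorable function using McCormick.jl, the operator-overloading relaxation library in the EAGO ecosystem; the function, interval, and evaluation point are chosen arbitrarily for illustration and are not taken from the abstract.

```julia
# Minimal sketch: McCormick relaxations via operator overloading with
# McCormick.jl. The function f, the interval [1, 4], and the point x = 2
# are arbitrary illustrative choices.
using McCormick   # `Interval` comes from IntervalArithmetic.jl, re-exported here

# An MC object for one decision variable (N = 1) with nonsmooth (NS) rules:
# value 2.0, enclosing interval [1, 4], variable index 1.
x = MC{1,NS}(2.0, Interval(1.0, 4.0), 1)

# A composite (factorable) function written as ordinary Julia code.
f(z) = z * (z - 5.0) * sin(z)

y = f(x)               # propagates relaxations through the expression
println("convex relaxation value:  ", y.cv)
println("concave relaxation value: ", y.cc)
println("interval bounds:          ", y.Intv)
println("convex subgradient:       ", y.cv_grad)
```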

This talk will focus on recent development efforts and future directions for the EAGO package. To start, EAGO’s internal nonlinear expression handling has been updated for compatibility with the latest revisions of the JuMP modeling language and the MathOptInterface backend. We will also highlight our recent work on extending EAGO for GPGPU compatibility, on improving the handling of global dynamic optimization problems through the DynamicBounds.jl subpackage, and on improving the documentation of the core software and its public-facing examples. The utility of the package and the impact of the added features will be demonstrated with a brief tutorial on a relevant case study.
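
To make the updated JuMP/MathOptInterface pathway concrete, the sketch below poses a small nonconvex NLP through JuMP’s current nonlinear interface and solves it globally with EAGO; the model, bounds, and helper function g are invented for illustration, and the snippet assumes a recent JuMP release (≥ 1.15, new nonlinear interface) paired with a compatible EAGO version.

```julia
# Minimal sketch: solving a small nonconvex NLP with EAGO through JuMP.
# The problem data below are illustrative only.
using JuMP, EAGO

# A "script-defined" function written as ordinary Julia code; JuMP traces
# it into a nonlinear expression when it is called on decision variables.
g(a, b) = exp(a) * sin(b) + (a - b)^2

model = Model(EAGO.Optimizer)
@variable(model, -2.0 <= x <= 2.0)
@variable(model, -2.0 <= y <= 2.0)
@constraint(model, x^2 + y^2 <= 2.0)
@objective(model, Min, g(x, y))

optimize!(model)
println("status:    ", termination_status(model))
println("objective: ", objective_value(model))
println("solution:  (", value(x), ", ", value(y), ")")
```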

References

[1] Matthew E. Wilhelm and Matthew D. Stuber. Easy advanced global optimization (EAGO): An open-source platform for robust and global optimization in Julia. In AIChE Annual Meeting. AIChE, 2017.

[2] Matthew E. Wilhelm and Matthew D. Stuber. EAGO.jl: easy advanced global optimization in Julia. Optimization Methods and Software, 37(2):425–450, aug 2022. doi: 10.1080/10556788.2020.1786566.

[3] Alexander Mitsos, Benoît Chachuat, and Paul I. Barton. McCormick-based relaxations of algorithms. SIAM Journal on Optimization, 20(2):573–601, jan 2009. doi: 10.1137/080717341.

[4] Joseph K. Scott, Matthew D. Stuber, and Paul I. Barton. Generalized McCormick relaxations. Journal of Global Optimization, 51(4):569–606, feb 2011. doi: 10.1007/s10898-011-9664-7.

[5] Angelos Tsoukalas and Alexander Mitsos. Multivariate McCormick relaxations. Journal of Global Optimization, 59(2-3):633–662, apr 2014. doi: 10.1007/s10898-014-0176-0.

[6] Kamil A. Khan, Harry A. J. Watson, and Paul I. Barton. Differentiable McCormick relaxations. Journal of Global Optimization, 67(4):687–729, may 2016. doi: 10.1007/s10898-016-0440-6.

[7] Stefan Vigerske. Decomposition in Multistage Stochastic Programming and a Constraint Integer Programming Approach to Mixed-Integer Nonlinear Programming. PhD thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftlichen Fakultät II, 2013.

[8] Achim Wechsung, Joseph K. Scott, Harry A. J. Watson, and Paul I. Barton. Reverse propagation of McCormick relaxations. Journal of Global Optimization, 63(1):1–36, apr 2015. doi: 10.1007/s10898-015-0303-6.

[9] Ambros M. Gleixner, Timo Berthold, Benjamin Müller, and Stefan Weltge. Three enhancements for optimization-based bound tightening. Journal of Global Optimization, 67(4):731–757, jun 2017. doi: 10.1007/s10898-016-0450-4.

[10] Jaromił Najman, Dominik Bongartz, and Alexander Mitsos. Linearization of McCormick relaxations and hybridization with the auxiliary variable method. Journal of Global Optimization, feb 2021. doi: 10.1007/s10898-020-00977-x.

[11] Dominik Bongartz, Jaromił Najman, Susanne Sass, and Alexander Mitsos. MAiNGO - McCormick-based Algorithm for mixed-integer Nonlinear Global Optimization. Technical report, RWTH-Aachen, 2018. URL https://www.avt.rwth-aachen.de/global/show_document.asp?id=aaaaaaaaabclahw.

[12] Matthew E. Wilhelm and Matthew D. Stuber. Improved convex and concave relaxations of composite bilinear forms. Journal of Optimization Theory and Applications, mar 2023. doi: 10.1007/s10957-023-02196-2.

[13] Jeff Bezanson, Alan Edelman, Stefan Karpinski, and Viral B. Shah. Julia: A fresh approach to numerical computing. arXiv preprint arXiv:1411.1607, 2014.

[14] Iain Dunning, Joey Huchette, and Miles Lubin. JuMP: A modeling language for mathematical optimization. SIAM Review, 59(2):295–320, jan 2017. doi: 10.1137/15m1020575.

[15] Benoît Legat, Oscar Dowson, Joaquim Dias Garcia, and Miles Lubin. MathOptInterface: a data structure for mathematical optimization problems. arXiv preprint arXiv:2002.03447, 2020. doi: 10.48550/ARXIV.2002.03447.

[16] Yingbo Ma, Shashi Gowda, Ranjan Anantharaman, Chris Laughman, Viral Shah, and Chris Rackauckas. ModelingToolkit: A composable graph transformation system for equation-based modeling. arXiv preprint arXiv:2103.05244, 2021. doi: 10.48550/ARXIV.2103.05244.

[17] Christopher Rackauckas and Qing Nie. DifferentialEquations.jl – a performant and feature-rich ecosystem for solving differential equations in Julia. Journal of Open Research Software, 5, may 2017. doi: 10.5334/jors.151.

[18] Michael Innes, Elliot Saba, Keno Fischer, Dhairya Gandhi, Marco Concetto Rudilosso, Neethu Mariya Joy, Tejan Karmali, Avik Pal, and Viral Shah. Fashionable modelling with Flux. arXiv preprint arXiv:1811.01457, 2018. doi: 10.48550/ARXIV.1811.01457.

[19] Michael Innes. Flux: Elegant machine learning with Julia. Journal of Open Source Software, 3(25):602, may 2018. doi: 10.21105/joss.00602.