(236h) Recent Developments in EAGO.jl (Easy Advanced Global Optimization in Julia) | AIChE

Authors 

Gottlieb, R. - Presenter, University of Connecticut
Wilhelm, M., University of Connecticut
Xu, P., University of Connecticut
Stuber, M., University of Connecticut
Five years have passed since the public release of EAGO.jl, the first open-source global optimizer for general nonlinear programs (NLPs) [1, 2]. EAGO was created to satisfy the need for a performant global optimizer capable of handling the advanced problem formulations and complex user-defined functions that are prevalent in modern engineering and operations applications. Consequently, its out-of-the-box functionality handles functions defined both within algebraic modeling languages (AMLs) and directly in scripts; in both cases, EAGO takes advantage of recent advances in reduced-space McCormick relaxations [3–6], along with domain reduction techniques [7–9] and improved linearization methods [10] such as those found in the C++ optimizer MAiNGO [11]. Additionally, EAGO was developed as a fully customizable research platform: all of the major subroutines that comprise the base tool can be readily extended for specialized use cases. EAGO's development has been aided by the unique features and high performance of the Julia programming language [12], as well as by the active, integrated development communities for optimization (e.g., JuMP-dev [13] and MathOptInterface [14]), machine learning (e.g., SciML [15, 16] and Flux [17, 18]), and general numerical analysis packages.
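For readers unfamiliar with the core machinery, the McCormick envelope of a bilinear term w = x·y over a box is the elementary construction that relaxation-based global optimizers such as EAGO compose recursively through factorable expressions. A minimal sketch in Python follows; the function name and signature are illustrative only, not EAGO's API:

```python
# Sketch of the McCormick convex/concave envelope of the bilinear term
# w = x*y on the box [xL, xU] x [yL, yU]. Composing such envelopes through
# a factorable expression is the basic mechanism behind McCormick-based
# relaxations [3-6]. Illustrative code only; not taken from EAGO.jl.

def mccormick_bilinear(x, y, xL, xU, yL, yU):
    """Return (convex underestimator, concave overestimator) of x*y at (x, y)."""
    # Convex underestimator: pointwise max of two supporting planes.
    cv = max(yL * x + xL * y - xL * yL,
             yU * x + xU * y - xU * yU)
    # Concave overestimator: pointwise min of the other two planes.
    cc = min(yU * x + xL * y - xL * yU,
             yL * x + xU * y - xU * yL)
    return cv, cc

# At any point of the box, the true product lies between the two envelopes:
x, y = 0.3, -0.7
cv, cc = mccormick_bilinear(x, y, -1.0, 1.0, -1.0, 1.0)
assert cv <= x * y <= cc
```

The gap between cv and cc shrinks as branch-and-bound subdivides the box, which is what drives convergence of the lower bounds.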

During the past year, EAGO has matured into a mixed-integer nonlinear programming (MINLP) optimizer and has seen stability improvements alongside major stability milestones for JuMP [13] and MathOptInterface [14]. In this talk, we discuss recent developments and directions for the EAGO.jl software package. These developments include: compatibility with equation-oriented modeling in Julia; exploitation of GPGPU parallel architectures through source-code transformation approaches; enhanced relaxations of composite bilinear forms; and overall improvements to the relaxation libraries, including relaxations of artificial neural networks. The utility of these new features is demonstrated in two applications of central importance in process systems engineering: global dynamic optimization and global parameter estimation.
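Relaxing a trained artificial neural network inside a branch-and-bound search requires rigorous enclosures of each layer's output. As a conceptual illustration only, the sketch below uses plain interval bound propagation through a dense ReLU layer; EAGO's actual ANN relaxations are tighter, McCormick-based constructions, and all names here are hypothetical:

```python
# Conceptual sketch: interval bound propagation through relu(W @ x + b),
# the simplest rigorous enclosure of a dense ReLU layer. Illustrative only;
# not EAGO's implementation.

def interval_dense_relu(W, b, in_lo, in_hi):
    """Propagate elementwise input bounds [in_lo, in_hi] through relu(W @ x + b)."""
    out_lo, out_hi = [], []
    for Wi, bi in zip(W, b):
        zl, zu = bi, bi
        for w, l, u in zip(Wi, in_lo, in_hi):
            # A positive weight maps the lower input bound to the lower
            # output bound; a negative weight swaps the two bounds.
            zl += w * l if w >= 0 else w * u
            zu += w * u if w >= 0 else w * l
        # relu is monotone, so it can be applied to both interval endpoints.
        out_lo.append(max(zl, 0.0))
        out_hi.append(max(zu, 0.0))
    return out_lo, out_hi

# Two-neuron layer on the input box [-1, 1] x [-1, 1]:
W = [[1.0, -2.0], [0.5, 0.5]]
b = [0.0, -1.0]
lo, hi = interval_dense_relu(W, b, [-1.0, -1.0], [1.0, 1.0])
```

Stacking such enclosures layer by layer bounds the whole network, at the cost of rapidly growing overestimation, which is precisely what tighter relaxation libraries aim to reduce.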

[1] Matthew E. Wilhelm and Matthew D. Stuber. Easy advanced global optimization (EAGO): An open-source platform for robust and global optimization in Julia. In AIChE Annual Meeting. AIChE, 2017.

[2] Matthew E. Wilhelm and Matthew D. Stuber. EAGO.jl: easy advanced global optimization in Julia. Optimization Methods and Software, pages 1–26, aug 2020. doi: 10.1080/10556788.2020.1786566.

[3] Alexander Mitsos, Benoît Chachuat, and Paul I. Barton. McCormick-based relaxations of algorithms. SIAM Journal on Optimization, 20(2):573–601, jan 2009. doi: 10.1137/080717341.

[4] Joseph K. Scott, Matthew D. Stuber, and Paul I. Barton. Generalized McCormick relaxations. Journal of Global Optimization, 51(4):569–606, feb 2011. doi: 10.1007/s10898-011-9664-7.

[5] Angelos Tsoukalas and Alexander Mitsos. Multivariate McCormick relaxations. Journal of Global Optimization, 59(2-3):633–662, apr 2014. doi: 10.1007/s10898-014-0176-0.

[6] Kamil A. Khan, Harry A. J. Watson, and Paul I. Barton. Differentiable McCormick relaxations. Journal of Global Optimization, 67(4):687–729, may 2016. doi: 10.1007/s10898-016-0440-6.

[7] Stefan Vigerske. Decomposition in Multistage Stochastic Programming and a Constraint Integer Programming Approach to Mixed-Integer Nonlinear Programming. PhD thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftlichen Fakultät II, 2013.

[8] Achim Wechsung, Joseph K. Scott, Harry A. J. Watson, and Paul I. Barton. Reverse propagation of McCormick relaxations. Journal of Global Optimization, 63(1):1–36, apr 2015. doi: 10.1007/s10898-015-0303-6.

[9] Ambros M. Gleixner, Timo Berthold, Benjamin Müller, and Stefan Weltge. Three enhancements for optimization-based bound tightening. Journal of Global Optimization, 67(4):731–757, jun 2017. doi: 10.1007/s10898-016-0450-4.

[10] Jaromił Najman, Dominik Bongartz, and Alexander Mitsos. Linearization of McCormick relaxations and hybridization with the auxiliary variable method. Journal of Global Optimization, feb 2021. doi: 10.1007/s10898-020-00977-x.

[11] Dominik Bongartz, Jaromił Najman, Susanne Sass, and Alexander Mitsos. MAiNGO – McCormick-based Algorithm for mixed-integer Nonlinear Global Optimization. 2018.

[12] Jeff Bezanson, Alan Edelman, Stefan Karpinski, and Viral B. Shah. Julia: A fresh approach to numerical computing, 2014.

[13] Iain Dunning, Joey Huchette, and Miles Lubin. JuMP: A modeling language for mathematical optimization. SIAM Review, 59(2):295–320, jan 2017. doi: 10.1137/15m1020575.

[14] Benoît Legat, Oscar Dowson, Joaquim Dias Garcia, and Miles Lubin. MathOptInterface: a data structure for mathematical optimization problems. 2020. doi: 10.48550/ARXIV.2002.03447.

[15] Yingbo Ma, Shashi Gowda, Ranjan Anantharaman, Chris Laughman, Viral Shah, and Chris Rackauckas. ModelingToolkit: A composable graph transformation system for equation-based modeling. 2021. doi: 10.48550/ARXIV.2103.05244.

[16] Christopher Rackauckas and Qing Nie. DifferentialEquations.jl – a performant and feature-rich ecosystem for solving differential equations in Julia. Journal of Open Research Software, 5, may 2017. doi: 10.5334/jors.151.

[17] Michael Innes, Elliot Saba, Keno Fischer, Dhairya Gandhi, Marco Concetto Rudilosso, Neethu Mariya Joy, Tejan Karmali, Avik Pal, and Viral Shah. Fashionable modelling with Flux. 2018. doi: 10.48550/ARXIV.1811.01457.

[18] Michael Innes. Flux: Elegant machine learning with Julia. Journal of Open Source Software, 3(25):602, may 2018. doi: 10.21105/joss.00602.