(282d) A Multiparametric Programming Approach to Solving Neural Network-Based Optimization Problems with Application to Control
AIChE Annual Meeting
2023
Computing and Systems Technology Division
Data-driven and Surrogate Optimization in Operation I
Tuesday, November 7, 2023 - 1:33pm to 1:54pm
In this presentation, we propose a novel bounds-tightening procedure based on a multiparametric programming reformulation of the corresponding ReLU-reformulated optimization problem. The procedure features (1) generating valid, tight bounds on the individual auxiliary variables introduced by the ReLU NN reformulation, and (2) generating bounds on the binary variables associated with each layer of the ReLU reformulation. The tightened bounds are valid for all parameter realizations, so the tightening needs to be performed only once, offline. Because the procedure runs offline, more computationally expensive tightening methods become applicable than when tightening must occur purely online, and correspondingly larger computational benefits can be observed. We demonstrate the effectiveness of this method in a case study of model predictive control of a nonlinear chemostat whose dynamics are approximated via a ReLU NN.
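To illustrate the kind of auxiliary-variable bounds the procedure tightens, the sketch below propagates a box of input bounds through a ReLU network with plain interval arithmetic. This is a minimal baseline, not the multiparametric procedure proposed in the presentation; all weights, function names, and the two-layer example are illustrative assumptions. In the standard big-M MILP encoding of a neuron y = max(0, z) with z = Wx + b, the constraints y <= z - z_lo*(1 - sigma) and y <= z_hi*sigma use exactly these per-neuron bounds (z_lo, z_hi), so tighter bounds yield a tighter relaxation, and a neuron with z_lo >= 0 or z_hi <= 0 lets its binary sigma be fixed outright.

```python
# Sketch: interval-arithmetic bound propagation for a ReLU network.
# These per-neuron pre-activation bounds are the quantities a big-M
# MILP encoding of the network depends on; an offline tightening
# procedure would start from (and improve on) bounds like these.

def layer_bounds(W, b, lo, hi):
    """Bounds on the pre-activation z = Wx + b over the input box [lo, hi].

    Each weight takes its worst case: a positive weight pairs with the
    matching input bound, a negative weight with the opposite one.
    """
    z_lo, z_hi = [], []
    for row, bias in zip(W, b):
        l = bias + sum(w * (lo[j] if w >= 0 else hi[j]) for j, w in enumerate(row))
        u = bias + sum(w * (hi[j] if w >= 0 else lo[j]) for j, w in enumerate(row))
        z_lo.append(l)
        z_hi.append(u)
    return z_lo, z_hi

def relu_network_bounds(layers, lo, hi):
    """Pre-activation bounds for every layer; ReLU clips the box in between."""
    all_bounds = []
    for W, b in layers:
        z_lo, z_hi = layer_bounds(W, b, lo, hi)
        all_bounds.append((z_lo, z_hi))
        lo = [max(0.0, l) for l in z_lo]  # post-ReLU lower bounds
        hi = [max(0.0, u) for u in z_hi]  # post-ReLU upper bounds
    return all_bounds
```

Interval arithmetic is the cheapest valid bounding scheme; because it ignores correlations between neurons, its bounds loosen with depth, which is precisely why a one-time, offline, optimization-based tightening pass can pay off in repeated online solves such as MPC.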