(373h) A Global Optimization Approach for the Symbolic Design of Iterative Algorithms
2024 AIChE Annual Meeting
Computing and Systems Technology Division
10C: Interactive Session: Systems and Process Operations
Tuesday, October 29, 2024 - 3:30pm to 5:00pm
In this work, we model the update function as an expression tree that can represent many functional forms using simple mathematical operators (addition, subtraction, multiplication, division). The resulting algorithm design problem is a Mixed-Integer Nonlinear Programming (MINLP) problem, which is solved to global optimality with BARON. This formulation allows the solver to consider multiple functional forms for the update function and to identify a symbolic expression that satisfies a given convergence criterion while optimizing a desired objective, such as the complexity of the update function. We apply the proposed approach to gradient descent-based algorithms. The results show that the proposed formulation discovers both linear and nonlinear update functions that converge faster than standard gradient descent for a given initial guess.
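To illustrate the kind of formulation involved, the sketch below sets up a toy MINLP in Pyomo and hands it to BARON. It is not the formulation used in this work: the expression tree is collapsed to a small parametric update family x_{k+1} = x_k - (c1 + y*c2*x_k)*grad f(x_k), the test problem is a scalar quadratic, and the binary variable y stands in for the complexity objective. The test function, bounds, iteration count, and use of the Pyomo interface are all assumptions made only for this sketch.

import pyomo.environ as pyo

# Toy test problem (assumption): f(x) = 0.5 * a * x**2, so grad f(x) = a * x
a = 4.0
x0 = 1.0      # initial guess
K = 5         # iterations allowed to the candidate algorithm
tol = 1e-3    # convergence criterion on the final gradient

m = pyo.ConcreteModel()
m.k = pyo.RangeSet(0, K)

# Iterates generated by the candidate update function
m.x = pyo.Var(m.k, bounds=(-10, 10))
m.x[0].fix(x0)

# Parametric update x_{k+1} = x_k - (c1 + y*c2*x_k) * grad f(x_k);
# the binary y switches the nonlinear term on or off (a stand-in for a full expression tree)
m.c1 = pyo.Var(bounds=(0.0, 1.0))
m.c2 = pyo.Var(bounds=(-1.0, 1.0))
m.y = pyo.Var(domain=pyo.Binary)

def update_rule(m, k):
    # No update constraint is written for the last index
    if k == K:
        return pyo.Constraint.Skip
    grad = a * m.x[k]
    return m.x[k + 1] == m.x[k] - (m.c1 + m.y * m.c2 * m.x[k]) * grad

m.update = pyo.Constraint(m.k, rule=update_rule)

# Convergence criterion: the gradient at the final iterate must be small
m.converged = pyo.Constraint(expr=(a * m.x[K]) ** 2 <= tol ** 2)

# Complexity-style objective: prefer the simpler (linear) update when it suffices
m.obj = pyo.Objective(expr=m.y, sense=pyo.minimize)

# Requires a local BARON installation visible to Pyomo
pyo.SolverFactory('baron').solve(m, tee=True)
print(pyo.value(m.c1), pyo.value(m.c2), pyo.value(m.y))

In the full approach described above, binary variables would instead select among the elementary operators at each node of an expression tree; the single switch y here only mimics the trade-off between satisfying the convergence criterion and keeping the update function simple.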