(15d) Interior-Point Decomposition Approaches for Parallel Solution of Large-Scale Nonlinear Parameter Estimation Problems
AIChE Annual Meeting
2006 Annual Meeting
Computing and Systems Technology Division
Advances in Optimization I
Monday, November 13, 2006 - 9:45am to 10:10am
The importance of commodity polymers in today's chemical process industry cannot be overstated. Low-density polyethylene (LDPE) is a low-cost plastic used primarily for packaging (e.g., soft plastic bags) and protective coatings. Many grades of this commodity polymer can be produced, and it is important for any process to deliver a consistent product with the desired properties.
A typical process for producing LDPE is the high-pressure, gas-phase reaction of ethylene and comonomer in the presence of various initiators inside a multi-zone tubular reactor. These reactors consist of reaction and cooling zones, with initiator and feed mixtures injected at multiple points along the length of the reactor. The properties of the polymer produced are affected by the reactor design, the additives or chain-transfer agents used, and the process operating conditions. Producing polymer with the desired properties therefore requires effective design and operation of these complex reactor systems.
The ability to model these systems accurately is central to designing appropriate operating strategies, and a number of models exist in the literature [1-3]. The accuracy of these and other models depends strongly on the values used for the model parameters. Given the complexity of the process and the lack of an all-encompassing model with a consistent database of parameters, it is often necessary to estimate the model parameters from data gathered on the particular industrial process under study. This task is made difficult by the large sets of complex differential-algebraic equations (DAEs) used to model the process and by the lack of efficient solution strategies for the resulting large-scale optimization problems. In previous work, some of the authors of this paper presented a fully simultaneous formulation [4] that could be solved efficiently using IPOPT [5], a large-scale, interior-point, nonlinear programming solver (projects.coin-or.org/Ipopt). That work considered standard least-squares formulations as well as errors-in-variables-measured (EVM) formulations. By including more data sets in the estimation, the 95% confidence regions were reduced drastically. Using this approach, the authors were able to include up to 6 data sets simultaneously in the problem formulation. While it is desirable to include still more data sets and further tighten the confidence regions, doing so requires a decomposition approach to solve the resulting large-scale nonlinear programming problem efficiently. In the current work, we solve problems with many more data sets by using an internal decomposition approach made possible by a recent implementation of IPOPT.
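In generic form (a sketch in our own notation; the symbols and weighting matrices below are illustrative assumptions, not taken from [4]), the simultaneous multi-set EVM estimation problem reads

\[
\min_{\theta,\,\{z_k\},\,\{\hat{u}_k\}} \;\; \sum_{k=1}^{N} \Big[ \big(y_k^{\mathrm{meas}} - y_k(z_k)\big)^{\top} V_y^{-1} \big(y_k^{\mathrm{meas}} - y_k(z_k)\big) + \big(u_k^{\mathrm{meas}} - \hat{u}_k\big)^{\top} V_u^{-1} \big(u_k^{\mathrm{meas}} - \hat{u}_k\big) \Big]
\]
\[
\text{s.t.} \quad c_k(z_k, \hat{u}_k, \theta) = 0, \qquad k = 1, \dots, N,
\]

where $c_k = 0$ collects the discretized DAE model of data set $k$, $\theta$ holds the parameters shared by all data sets, $V_y$ and $V_u$ are assumed measurement covariances, and the second (EVM) term is absent in a standard least-squares formulation that treats the inputs as exact.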
The size of the least-squares estimation problem grows affinely with the number of data sets. However, the problem has an almost block-diagonal structure: the model equations for each data set are independent except for the shared model parameters. This structure can be exploited to enable parallel solution of very large-scale problems containing many data sets. The traditional approach to solving structured problems is a problem-level decomposition such as Benders decomposition or Lagrangian relaxation. Internal decomposition approaches, on the other hand, solve the entire problem with a large-scale nonlinear programming solver and apply the decomposition to the individual linear algebra calculations required by the algorithm.
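This coupling pattern can be seen directly in the constraint Jacobian. With $w_k$ denoting all discretized state and local variables of data set $k$ (again a schematic in the notation introduced above):

\[
\frac{\partial c}{\partial (w_1, \dots, w_N, \theta)} =
\begin{bmatrix}
C_1 &        &     & B_1 \\
    & \ddots &     & \vdots \\
    &        & C_N & B_N
\end{bmatrix},
\qquad
C_k = \frac{\partial c_k}{\partial w_k}, \quad
B_k = \frac{\partial c_k}{\partial \theta}.
\]

Only the narrow final block column, whose width equals the number of shared parameters, links the data sets; everything else is block diagonal.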
The dominant cost of the interior-point solution approach, in both memory and computation, is the solution of the primal-dual KKT linear system required at each iteration of the optimization algorithm. The structure of the nonlinear parameter estimation problem induces an arrowhead, or block-bordered, structure in this linear system, where the dimension of the border equals the number of parameters common to all data sets. This structured linear system can be solved efficiently in parallel using a Schur complement decomposition approach [6]. In this work, we solve the large-scale parameter estimation problem with 32 simultaneous data sets, showing a significant reduction of the 95% confidence regions. This problem has over 350,000 constraints and 1000 degrees of freedom. While we demonstrate excellent scale-up properties of the decomposition approach on this particular real-world problem, the approach is general in nature and amenable to any large-scale parameter estimation problem.
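To illustrate the linear algebra, the following is a minimal serial sketch of a Schur complement solve for an arrowhead system of this kind (dense NumPy for clarity; the block names are our own, and a production implementation would use sparse factorizations and distribute the loop over data sets across processors):

import numpy as np

def schur_solve(K_blocks, A_blocks, K_theta, r_blocks, r_theta):
    """Solve the arrowhead (block-bordered) system
        [ K_1              A_1 ] [ dw_1 ]   [ r_1 ]
        [       ...        ... ] [ ...  ] = [ ... ]
        [            K_N   A_N ] [ dw_N ]   [ r_N ]
        [ A_1' ...  A_N'   K_t ] [ dth  ]   [ r_t ]
    by forming a Schur complement on the parameter (border) block.
    Each pass of the loop touches only one data set's blocks, which
    is what permits parallel/distributed execution.
    """
    S = K_theta.copy()       # Schur complement: S = K_t - sum_k A_k' K_k^{-1} A_k
    rhs = r_theta.copy()     # reduced right-hand side
    cache = []               # store K_k^{-1} A_k and K_k^{-1} r_k for back-substitution
    for K, A, r in zip(K_blocks, A_blocks, r_blocks):
        X = np.linalg.solve(K, np.column_stack([A, r]))
        KiA, Kir = X[:, :-1], X[:, -1]
        S -= A.T @ KiA
        rhs -= A.T @ Kir
        cache.append((KiA, Kir))
    d_theta = np.linalg.solve(S, rhs)                    # small coupled solve
    d_w = [Kir - KiA @ d_theta for KiA, Kir in cache]    # recover block steps
    return d_w, d_theta

The only serial bottleneck in this scheme is the solve with S, whose dimension equals the number of shared parameters; the N independent block solves, which dominate the work, can proceed concurrently.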
[1] Kiparissides, C.; Verros, G.; MacGregor, J. F., "Mathematical Modeling, Optimization, and Quality Control of High-Pressure Ethylene Polymerization Reactors", J.M.S.-Rev. Macromol. Chem. Phys. 1993, C33, 437-527.
[2] Brandolin, A.; Lacunza, M. H.; Ugrin, P. E.; Capiati, N. J., "High-Pressure Polymerization of Ethylene: An Improved Mathematical Model for Industrial Tubular Reactors", Polym. React. Eng. 1996, 4, 193-241.
[3] Bokis, C. P., "Physical Properties, Reactor Modeling, and Polymerization Kinetics in the Low-Density Polyethylene Tubular Reactor Process", Ind. Eng. Chem. Res. 2002, 41, 1017-1030.
[4] Zavala, V. M.; Biegler, L. T., "Large-Scale Parameter Estimation in Low-Density Polyethylene Tubular Reactors", submitted to Ind. Eng. Chem. Res.
[5] Wächter, A.; Biegler, L. T., "On the Implementation of an Interior-Point Filter Line-Search Algorithm for Large-Scale Nonlinear Programming", Math. Program. 2006, 106, 25-57.
[6] Laird, C.; Biegler, L. T., "A Block-Bordered Interior Point Approach for the Solution of Multiperiod Nonlinear Programs", presented at the AIChE Annual Meeting, Cincinnati, OH, USA, 2005.