(240d) Improved Genetic Algorithms for Deterministic Optimization and Optimization under Uncertainty
AIChE Annual Meeting
2005 Annual Meeting
Computing and Systems Technology Division
Poster Session: Recent Developments in Computers in Operations and Information Processing
Tuesday, November 1, 2005 - 3:00pm to 6:00pm
Summary
Many optimization problems that include a large number of continuous or discrete design variables fall into the categories of integer programming (IP), mixed-integer linear programming (MILP), and mixed-integer nonlinear programming (MINLP). Branch and bound (BB), generalized Benders decomposition (GBD), and outer approximation (OA)1 are commonly used to solve IP, MILP, and MINLP problems. However, these methods run into difficulty when: 1) the functions do not satisfy convexity conditions, 2) the system exhibits a large combinatorial explosion, or 3) the solution space is discontinuous. One probabilistic optimization technique, the evolutionary algorithm (EA), developed on the basis of Darwin's natural selection and Mendel's genetics, provides an alternative to mixed-integer programming techniques such as BB, GBD, and OA. Among evolutionary algorithms, the genetic algorithm (GA) is the most widely used. GA was first introduced by Holland2 and has received increasing attention owing to a series of successful applications in disciplines such as biology, medicine, and various branches of engineering3.
GA starts from a population, rather than a single point, in the potential solution space of a problem and allows that population to evolve from generation to generation through genetic operators such as selection, crossover, and mutation until the stopping criteria are satisfied. During this evolution, GA uses a structured yet randomized information exchange to form a search direction. The behavior of GA is characterized by a balance between exploitation and exploration. This balance is strongly affected by strategic parameters such as population size and the crossover and mutation rates, as well as by the mechanisms employed for: 1) choosing the initial population, 2) representing individuals, and 3) performing evolution. A number of modifications arising from these considerations have been developed over the last several decades to obtain better performance, such as real-number encoding4, the simulated binary crossover operator (SBX)5, parameter adaptation6, and the hybrid genetic algorithm (HGA)7.
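The generation-to-generation loop described above (selection, crossover, and mutation applied until a stopping criterion is met) can be sketched as a minimal real-coded GA. The operator choices below (binary tournament selection, arithmetic crossover, uniform mutation) and all parameter values are illustrative assumptions, not the specific algorithm proposed in this paper:

```python
import random

def genetic_algorithm(fitness, bounds, pop_size=20, generations=50,
                      crossover_rate=0.8, mutation_rate=0.1, seed=0):
    """Minimal real-coded GA (minimization). Operator and parameter
    choices here are illustrative, not those of the paper."""
    rng = random.Random(seed)
    # Initial population: uniform random points within the variable bounds.
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(ind) for ind in pop]

        def select():
            # Binary tournament: lower fitness wins (minimization).
            a, b = rng.randrange(pop_size), rng.randrange(pop_size)
            return pop[a] if scores[a] < scores[b] else pop[b]

        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = select(), select()
            if rng.random() < crossover_rate:
                # Arithmetic (blend) crossover between the two parents.
                w = rng.random()
                child = [w * x + (1 - w) * y for x, y in zip(p1, p2)]
            else:
                child = list(p1)
            # Uniform mutation: resample a gene within its bounds.
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mutation_rate:
                    child[i] = rng.uniform(lo, hi)
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=fitness)

# Usage: minimize the 2-D sphere function on [-5, 5]^2.
best = genetic_algorithm(lambda x: sum(v * v for v in x), [(-5, 5)] * 2)
```

The balance between exploitation and exploration noted above shows up directly in these parameters: a higher mutation rate favors exploration, while stronger selection pressure favors exploitation.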
This paper proposes three new variants of the genetic algorithm to solve deterministic and stochastic optimization problems. In these algorithms, an efficient sampling technique, Hammersley Sequence Sampling (HSS)8, is utilized in both initial population generation and population updating. For stochastic optimization problems, HSS is additionally used to propagate parametric uncertainties through the model. The k-dimensional uniformity property of HSS is exploited in developing the efficient genetic algorithm (EGA) for deterministic optimization problems. Case studies performed in this work show that EGA outperforms its traditional counterpart, in which a Monte Carlo random number generator is commonly employed. For stochastic optimization problems, the Hammersley stochastic genetic algorithm (HSGA), coupled with a better confidence interval based on a fractal geometry approach9, is introduced. Case studies show that the new algorithm significantly outperforms state-of-the-art stochastic optimization algorithms.
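For illustration, a k-dimensional Hammersley point set on the unit hypercube can be generated with the common textbook construction: the first coordinate of point i is i/n, and the remaining coordinates are radical inverses of i in successive prime bases. Indexing and scaling conventions vary across references, so this sketch is not necessarily the exact variant used by the authors:

```python
def radical_inverse(i, base):
    """Van der Corput radical inverse of integer i in the given base:
    reflect the base-b digits of i about the radix point."""
    inv, f = 0.0, 1.0 / base
    while i > 0:
        inv += (i % base) * f
        i //= base
        f /= base
    return inv

def hammersley(n, k):
    """n points of the k-dimensional Hammersley set on [0, 1)^k.
    First coordinate is i/n; the remaining k-1 coordinates use
    radical inverses in the first k-1 primes."""
    primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
    assert k - 1 <= len(primes), "extend the prime list for higher dimensions"
    return [[i / n] + [radical_inverse(i, p) for p in primes[:k - 1]]
            for i in range(n)]

# Usage: 8 points in 2 dimensions.
pts = hammersley(8, 2)
```

Unlike pseudo-random Monte Carlo points, these points are evenly stratified in all k dimensions jointly, which is the uniformity property the abstract attributes to HSS.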
References:
(1) Diwekar, U. M., 2003. Introduction to Applied Optimization. Applied Optimization, 80, Kluwer Academic Publishers, Massachusetts.
(2) Holland, J. H., 1975. Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, MI.
(3) Tarafder, A., Rangaiah, G. P. and Ray, A. K., 2005. Multiobjective optimization of an industrial styrene monomer manufacturing process. Chemical Engineering Science, 60, 347-363.
(4) Michalewicz, Z., 1996. Genetic Algorithms + Data Structures = Evolution Programs, 3rd edition, Springer-Verlag, New York.
(5) Deb, K. and Agrawal, R. B., 1995. Simulated binary crossover for continuous search space. Complex Systems, 9, 115-148.
(6) Herrera, F. and Lozano, M., 1996. Adaptation of genetic algorithm parameters based on fuzzy logic controllers, in Herrera, F. and Verdegay, J., editors, Genetic Algorithms and Soft Computing, 95-125.
(7) Özdamar, L., 1999. A genetic algorithm approach to a general category project scheduling problem. IEEE Trans. Syst., Man, Cybern.-Part C: Applications and Reviews, 29, 44-59.
(8) Diwekar, U. M. and Kalagnanam, J. R., 1997. Efficient sampling technique for optimization under uncertainty. AIChE J., 43, 440-447.
(9) Kim, K. and Diwekar, U., 2005. Fractal dimension approach to error characterization in sampling, submitted to Technometrics.