
Stochastic Nelder–Mead simplex method – A new globally convergent direct search method for simulation optimization

Bibliographic Details
Published in: European Journal of Operational Research, 2012-08, Vol. 220 (3), p. 684-694
Main Author: Chang, Kuo-Hao
Format: Article
Language: English
Description
Summary:
► A new direct search method, SNM, is developed for simulation optimization.
► With the developed sample size scheme, SNM can handle noise effectively.
► Without requiring gradient estimation, SNM can handle nonsmooth response functions.
► We prove that SNM has an attractive global convergence property.
► Numerical experiments show that the performance of SNM is promising.
The Nelder–Mead simplex method (NM), originally developed for deterministic optimization, is an efficient direct search method that optimizes the response function merely by comparing function values. While successful in deterministic settings, the application of NM to simulation optimization suffers from two problems: (1) it lacks an effective sample size scheme for controlling noise, so the algorithm can be misled in the wrong direction by noise; and (2) it is a heuristic algorithm, so the quality of the estimated optimal solution cannot be quantified. We propose a new variant, called the Stochastic Nelder–Mead simplex method (SNM), that employs an effective sample size scheme and a specially designed global and local search framework to address these two problems. Without using gradient information, SNM can handle problems where the response function is nonsmooth or the gradient does not exist; this is complementary to existing gradient-based approaches. We prove that SNM can converge to the true global optima with probability one. An extensive numerical study also shows that the performance of SNM is promising and worthy of further investigation.
ISSN: 0377-2217, 1872-6860
DOI: 10.1016/j.ejor.2012.02.028
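
Illustrative sketch: the summary above describes SNM only at a high level, so the Python sketch below is offered purely to illustrate the general idea of a simplex search that averages a growing number of noisy replications per vertex before comparing them. The geometric sample-size schedule, the reflection/shrink move set, and the names snm_sketch and sim are assumptions made here for illustration; they are not the algorithm specified in the paper.

import numpy as np

def snm_sketch(sim, x0, budget=50, n0=2, growth=1.1, step=0.5, seed=0):
    """Illustrative stochastic Nelder-Mead-style minimization.

    sim(x, rng) must return one noisy observation of the response at x.
    Each vertex is evaluated by averaging a number of replications that
    grows geometrically across iterations (a hypothetical schedule
    standing in for the paper's sample size scheme).
    """
    rng = np.random.default_rng(seed)
    x0 = np.asarray(x0, dtype=float)
    d = x0.size
    # Initial simplex: x0 plus d axis-perturbed vertices.
    simplex = [x0] + [x0 + step * e for e in np.eye(d)]
    n = n0
    for _ in range(budget):
        reps = int(round(n))
        # Estimate the response at each vertex by averaging noisy replications.
        est = [np.mean([sim(v, rng) for _ in range(reps)]) for v in simplex]
        order = np.argsort(est)
        simplex = [simplex[i] for i in order]
        est = [est[i] for i in order]
        best, worst = simplex[0], simplex[-1]
        centroid = np.mean(simplex[:-1], axis=0)
        # Reflect the worst vertex through the centroid (the basic NM move).
        reflected = centroid + (centroid - worst)
        refl_est = np.mean([sim(reflected, rng) for _ in range(reps)])
        if refl_est < est[0]:
            simplex[-1] = reflected          # reflection improves: accept it
        else:
            # Otherwise shrink the simplex toward the current best vertex.
            simplex = [best + 0.5 * (v - best) for v in simplex]
        n *= growth                          # grow the sample size to damp noise
    return simplex[0]

# Example: noisy quadratic with true minimum at the origin.
noisy_quadratic = lambda x, rng: float(np.sum(x ** 2)) + rng.normal(scale=0.1)
x_best = snm_sketch(noisy_quadratic, x0=[2.0, -1.5])
print(x_best)

Averaging more replications at later iterations is what allows comparisons of vertex values to remain reliable as the search narrows; the paper's actual sample size scheme and its global and local search framework should be taken from the full text.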