ABSTRACT

This chapter aims to demonstrate how Gaussian process (GP) surrogate modeling can assist in optimizing a blackbox objective function: a function about which one knows little – one opaque to the optimizer – and that can only be probed through expensive evaluation. Models deployed to assist in optimization can be both statistical and non-statistical; however, the latter often have strikingly similar statistical analogs. The use of statistical methods in optimization, in particular of noisy blackbox functions, probably goes back to Box and Draper, a precursor to a canonical response surface methods text by the same authors. In the computer experiments literature, folks have been using GPs to optimize functions for some time. In the mid-1990s, Matthias Schonlau was working on his dissertation, which revisited Mockus' Bayesian optimization idea from a GP and computer experiments perspective. He came up with a heuristic called expected improvement (EI), which is the basis of the so-called efficient global optimization (EGO) algorithm.
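As a flavor of the expected improvement heuristic mentioned above, here is a minimal sketch of its well-known closed form under a Gaussian predictive distribution. It assumes the GP surrogate supplies a predictive mean mu and standard deviation sigma at a candidate input, that f_min is the best (smallest) objective value observed so far, and that the goal is minimization; the function name and signature are illustrative, not from any particular library.

```python
import math

def expected_improvement(mu, sigma, f_min):
    """EI(x) = E[max(f_min - Y(x), 0)] where Y(x) ~ N(mu, sigma^2).

    mu, sigma: GP predictive mean and standard deviation at candidate x
    f_min: best objective value observed so far (minimization)
    """
    if sigma <= 0:
        # No predictive uncertainty: improvement is deterministic.
        return max(f_min - mu, 0.0)
    z = (f_min - mu) / sigma
    # Standard normal pdf and cdf via the math module (no external deps).
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    # Closed-form EI: exploitation term + exploration term.
    return (f_min - mu) * cdf + sigma * pdf

# A candidate predicted well below f_min scores high; one predicted
# above f_min with little uncertainty scores near zero.
print(expected_improvement(0.5, 0.2, 1.0))
print(expected_improvement(1.2, 0.01, 1.0))
```

The two terms balance exploitation (low predicted mean) against exploration (high predictive uncertainty), which is what lets EGO-style search trade off sampling near the incumbent best against probing unexplored regions.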