Simultaneous kriging-based sampling for optimization and uncertainty propagation
Abstract
Robust analysis and optimization is typically based on repeated calls to a deterministic simulator that aim at propagating uncertainties and finding optimal design variables. Without loss of generality, a double set of simulation parameters can be assumed: x are deterministic optimization variables, u are random parameters of known probability density function, and f(x, u) is the objective function attached to the simulator. Most robust optimization methods involve two nested tasks: uncertainty propagation over the u's (e.g., Monte Carlo simulations, reliability index calculation), which is recursively performed inside optimization iterations on the x's. In practice, f is often calculated through computationally expensive software, which makes the computational cost one of the principal obstacles to optimization in the presence of uncertainties. This report proposes a new, efficient method for minimizing the mean objective function, min_x E[f(x, U)]. The efficiency stems from the simultaneous sampling of f for uncertainty propagation and optimization, i.e., the hierarchical nesting is avoided. A kriging model Y(x,u)(ω) of f(x, u), that is, a Gaussian process conditioned on t past calculations of f, is built, and the mean process Z(x)(ω) = E[Y(x,U)(ω)] is analytically derived from it. The sampling criterion that yields both x and u is the one-step-ahead minimum variance of the mean process Z at the maximizer of the expected improvement. The method is compared with Monte Carlo and kriging-based approaches on analytical test functions in two, four and six dimensions.
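To make the sampling loop concrete, the sketch below illustrates the idea under simplifying assumptions that are not the report's: the kriging model is sklearn's GaussianProcessRegressor with an RBF kernel, the analytically derived mean process and its one-step-ahead variance are replaced by Monte Carlo quadrature over samples of U, the search for the next u is restricted to x fixed at the expected-improvement maximizer (the report yields x and u from the criterion itself), and the toy simulator f, the distribution of U and all numerical settings are illustrative only.

```python
# Hedged sketch of simultaneous kriging-based sampling for min_x E[f(x, U)].
# Assumptions: RBF kernel, U ~ N(0, 0.2^2), quadrature instead of analytic formulas.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def f(x, u):                                   # toy stand-in for the expensive simulator
    return (x - 0.3) ** 2 + 0.5 * np.sin(3 * (x + u)) + u ** 2

# Initial design of experiments in the joint (x, u) space.
X = rng.uniform(0.0, 1.0, (8, 1))              # deterministic design variables
U = rng.normal(0.0, 0.2, (8, 1))               # random parameters
XU = np.hstack([X, U])
y = np.array([f(xi[0], ui[0]) for xi, ui in zip(X, U)])

u_quad = rng.normal(0.0, 0.2, (64, 1))         # quadrature samples of U
x_grid = np.linspace(0.0, 1.0, 101).reshape(-1, 1)
u_grid = np.linspace(-0.6, 0.6, 41)

for it in range(10):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3),
                                  alpha=1e-8, normalize_y=True).fit(XU, y)

    # Mean process Z(x) = E_U[Y(x, U)], approximated by averaging the GP over
    # the quadrature samples of U (the report derives it analytically).
    def z_mean_std(x):
        pts = np.hstack([np.full_like(u_quad, x), u_quad])
        mu, cov = gp.predict(pts, return_cov=True)
        return mu.mean(), np.sqrt(max(cov.mean(), 1e-12))

    stats = np.array([z_mean_std(xv[0]) for xv in x_grid])
    z_mu, z_sd = stats[:, 0], stats[:, 1]

    # Expected improvement of the mean process; the next x maximizes it.
    imp = z_mu.min() - z_mu
    ei = imp * norm.cdf(imp / z_sd) + z_sd * norm.pdf(imp / z_sd)
    x_next = x_grid[np.argmax(ei), 0]

    # Next u: minimize the one-step-ahead variance of Z(x_next), i.e. maximize
    # the variance reduction Cov(Z(x_next), Y(x_next, u))^2 / Var(Y(x_next, u)).
    base = np.hstack([np.full_like(u_quad, x_next), u_quad])
    best_u, best_gain = 0.0, -np.inf
    for u in u_grid:
        pts = np.vstack([base, [[x_next, u]]])
        _, cov = gp.predict(pts, return_cov=True)
        cov_zy = cov[:-1, -1].mean()            # Cov(Z(x_next), Y(x_next, u))
        gain = cov_zy ** 2 / max(cov[-1, -1], 1e-12)
        if gain > best_gain:
            best_gain, best_u = gain, u

    # Evaluate the simulator at the jointly chosen (x, u) and enrich the design.
    XU = np.vstack([XU, [[x_next, best_u]]])
    y = np.append(y, f(x_next, best_u))

print("estimated minimizer of E[f(x, U)]:", x_grid[np.argmin(z_mu), 0])
```

In this simplified form, each iteration adds a single (x, u) point that serves both purposes at once: it refines the estimate of the mean objective E[f(x, U)] where the optimizer is expected to be, instead of spending a full inner uncertainty-propagation loop per candidate x.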