A Bayesian framework for quantifying uncertainty in stochastic simulation

Wei Xie, Barry L. Nelson, Russell R. Barton

Research output: Contribution to journal › Article › peer-review

74 Scopus citations

Abstract

When we use simulation to estimate the performance of a stochastic system, the simulation often contains input models that were estimated from real-world data; therefore, there is both simulation and input uncertainty in the performance estimates. In this paper, we provide a method to measure the overall uncertainty while simultaneously reducing the influence of simulation estimation error due to output variability. To reach this goal, a Bayesian framework is introduced. We use a Bayesian posterior for the input-model parameters, conditional on the real-world data, to quantify the input-parameter uncertainty; we propagate this uncertainty to the output mean using a Gaussian process posterior distribution for the simulation response as a function of the input-model parameters, conditional on a set of simulation experiments. We summarize overall uncertainty via a credible interval for the mean. Our framework is fully Bayesian, makes more effective use of the simulation budget than other Bayesian approaches in the stochastic simulation literature, and is supported with both theoretical analysis and an empirical study. We also make clear how to interpret our credible interval and why it is distinctly different from the confidence intervals for input uncertainty obtained in other papers.
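The following is a minimal sketch (not the authors' code) of the two-layer idea described in the abstract: a Bayesian posterior for an input-model parameter given real-world data, a Gaussian process posterior for the simulation response as a function of that parameter fitted to a small set of simulation experiments, and a percentile credible interval for the mean that reflects both sources of uncertainty. The toy exponential input model, conjugate Gamma prior, design points, and use of scikit-learn's GP are illustrative assumptions, not the paper's exact procedure.

```python
# Hedged sketch of Bayesian input-uncertainty propagation through a GP metamodel.
# All modeling choices below (exponential input model, Gamma prior, design grid)
# are assumptions for illustration only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# --- Input-parameter uncertainty: exponential input model, unknown rate ---
# "Real-world" data plus a conjugate Gamma(a0, b0) prior give a Gamma posterior.
real_data = rng.exponential(scale=1.0 / 1.2, size=50)   # true rate 1.2 (toy)
a0, b0 = 1.0, 1.0
a_post, b_post = a0 + real_data.size, b0 + real_data.sum()

# --- Simulation experiments at a few design points of the rate ---
def noisy_simulation(rate, n_reps=20):
    """Stand-in stochastic simulation: noisy estimate of the mean response."""
    return rng.exponential(scale=1.0 / rate, size=n_reps).mean()

design = np.linspace(0.8, 1.8, 8)
responses = np.array([noisy_simulation(r) for r in design])

# --- Metamodel uncertainty: GP posterior for the response surface ---
kernel = RBF(length_scale=0.5) + WhiteKernel(noise_level=1e-2)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(
    design.reshape(-1, 1), responses)

# --- Propagate: posterior draws of the rate through the GP posterior ---
theta_draws = rng.gamma(shape=a_post, scale=1.0 / b_post, size=500)
mu, sd = gp.predict(theta_draws.reshape(-1, 1), return_std=True)
response_draws = rng.normal(mu, sd)   # marginal GP draws at each parameter draw

lo, hi = np.percentile(response_draws, [2.5, 97.5])
print(f"95% credible interval for the mean response: [{lo:.3f}, {hi:.3f}]")
```

The pooled draws mix parameter (input) uncertainty with GP (simulation/metamodel) uncertainty, so the resulting percentile interval is a credible interval for the mean response rather than a classical confidence interval for input uncertainty alone.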

Original language: English (US)
Pages (from-to): 1439-1452
Number of pages: 14
Journal: Operations Research
Volume: 62
Issue number: 6
DOIs
State: Published - Nov 1 2014

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Management Science and Operations Research
