TY - GEN
T1 - The dangers of sparse sampling for uncertainty propagation and model calibration
AU - Hemez, François M.
AU - Atamturktur, H. Sezer
N1 - Copyright:
Copyright 2020 Elsevier B.V., All rights reserved.
PY - 2011
Y1 - 2011
AB - Activities such as sensitivity analysis, statistical effect screening, uncertainty propagation, or model calibration have become integral to the Verification and Validation (V&V) of numerical models and computer simulations. Because these analyses involve performing multiple runs of a computer code, they can rapidly become computationally expensive. For example, propagating uncertainty with 1,000 Monte Carlo samples wrapped around a finite element calculation that takes only 10 minutes to run requires seven days of single-processor time. An alternative is to combine a design of computer experiments with meta-modeling and replace the potentially expensive computer simulation by a fast-running surrogate. The surrogate can then be used to estimate sensitivities, propagate uncertainty, and calibrate model parameters at a fraction of the cost it would take to wrap a sampling algorithm or optimization solver around the analysis code. In this publication, we focus on the dangers of using a too sparsely populated design of experiments to propagate uncertainty or train a fast-running surrogate model. One danger for sensitivity analysis or calibration is to develop meta-models that include erroneous sensitivities. This is illustrated with a high-dimensional, non-linear mathematical function in which several parameter effects are statistically insignificant, thereby mimicking a situation that is often encountered in practice. It is shown that using a sparse design of computer experiments leads to an incorrect approximation of the function.
UR - http://www.scopus.com/inward/record.url?scp=80051486847&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=80051486847&partnerID=8YFLogxK
U2 - 10.1007/978-1-4419-9834-7_48
DO - 10.1007/978-1-4419-9834-7_48
M3 - Conference contribution
AN - SCOPUS:80051486847
SN - 9781441998330
T3 - Conference Proceedings of the Society for Experimental Mechanics Series
SP - 537
EP - 556
BT - Structural Dynamics - Proceedings of the 28th IMAC, A Conference on Structural Dynamics, 2010
PB - Springer New York LLC
ER -