Journal: Reliability Engineering & System Safety

The dangers of sparse sampling for the quantification of margin and uncertainty


Abstract

Activities such as global sensitivity analysis, statistical effect screening, uncertainty propagation, or model calibration have become integral to the Verification and Validation (V&V) of numerical models and computer simulations. One of the goals of V&V is to assess prediction accuracy and uncertainty, which feeds directly into reliability analysis or the Quantification of Margin and Uncertainty (QMU) of engineered systems. Because these analyses involve multiple runs of a computer code, they can rapidly become computationally expensive. An alternative to Monte Carlo-like sampling is to combine a design of computer experiments with meta-modeling, replacing the potentially expensive computer simulation by a fast-running emulator. The surrogate can then be used to estimate sensitivities, propagate uncertainty, and calibrate model parameters at a fraction of the cost it would take to wrap a sampling algorithm or optimization solver around the physics-based code. Doing so, however, carries the risk of developing an incorrect emulator that erroneously approximates the "true-but-unknown" sensitivities of the physics-based code. We demonstrate the extent to which this occurs when Gaussian Process Modeling (GPM) emulators are trained in high-dimensional spaces using too-sparsely populated designs of experiments. Our illustration analyzes a variant of the Rosenbrock function in which several effects are made statistically insignificant while others are strongly coupled, thereby mimicking a situation that is often encountered in practice. In this example, the combination of GPM emulator and design of experiments leads to an incorrect approximation of the function. A mathematical proof of the origin of the problem is proposed. The adverse effects that too-sparsely populated designs may produce are discussed for the coverage of the design space, estimation of sensitivities, and calibration of parameters.
This work attempts to raise awareness to the potential dangers of not allocating enough resources when exploring a design space to develop fast-running emulators.
