
On the Deep Active-Subspace Method


Abstract

The deep active-subspace method is a neural-network-based tool for propagating uncertainty through computational models with high-dimensional input spaces. Unlike the original active-subspace method, it does not require access to the gradient of the model. It relies on an orthogonal projection matrix, constructed via Gram-Schmidt orthogonalization, to reduce the input dimensionality. This matrix is incorporated into a neural network as the weight matrix of the first hidden layer (acting as an orthogonal encoder) and optimized using backpropagation to identify the active subspace of the input. We propose several theoretical extensions, starting with a new analytic relation for the derivatives of Gram-Schmidt vectors, which are required for backpropagation. We also study the use of vector-valued model outputs, which is difficult in the case of the original active-subspace method. Additionally, we investigate an alternative neural network with an encoder without embedded orthonormality, which shows equally good performance compared to the deep active-subspace method. Two epidemiological models are considered as applications, one of which requires supercomputer access to generate the training data.
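The orthogonal-encoder idea described above can be sketched in a few lines: unconstrained weights are mapped through Gram-Schmidt orthogonalization to an orthonormal projection matrix, whose transpose encodes high-dimensional inputs into low-dimensional active-subspace coordinates. This is a minimal NumPy illustration under assumed toy dimensions (d = 5, k = 2), not the authors' implementation; in the paper the underlying weights are trained by backpropagating through the Gram-Schmidt map itself.

```python
import numpy as np

def gram_schmidt(W):
    """Orthonormalize the columns of W via classical Gram-Schmidt."""
    Q = np.zeros_like(W, dtype=float)
    for j in range(W.shape[1]):
        v = W[:, j].astype(float)
        for i in range(j):
            # Subtract the projection onto each previously built basis vector
            v = v - (Q[:, i] @ W[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)
    return Q

# Toy setup: project d = 5 inputs onto a k = 2 active subspace
rng = np.random.default_rng(0)
W = rng.normal(size=(5, 2))   # unconstrained first-layer weights (trainable)
Q = gram_schmidt(W)           # orthonormal projection matrix, Q.T @ Q = I

x = rng.normal(size=5)        # a high-dimensional input sample
z = Q.T @ x                   # reduced (active-subspace) coordinates
```

In the full method, `z` would feed into the remaining hidden layers of the network, and the loss gradient would flow back through `gram_schmidt` into `W`, which is why the paper's analytic derivatives of the Gram-Schmidt vectors are needed.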

