Statistics and Computing

Convergence analysis of herded-Gibbs-type sampling algorithms: effects of weight sharing
           

Abstract

Herded Gibbs (HG) and discretized herded Gibbs (DHG), which combine Gibbs sampling with herding, are deterministic sampling algorithms for Markov random fields with discrete random variables. In this paper, we introduce the notion of "weight sharing" to view these HG-type algorithms systematically, and we investigate their convergence both theoretically and numerically. We show that, by sharing and thereby reducing the number of weight variables, an HG-type algorithm achieves fast initial convergence at the expense of asymptotic convergence. This means that HG-type algorithms can be more efficient in practice than conventional Markov chain Monte Carlo algorithms, although their estimates do not necessarily converge to the target asymptotically. Moreover, we decompose the numerical integration error of HG-type algorithms into several components and evaluate each of them in relation to herding and weight sharing. Using this formulation, we also propose novel variants of the HG-type algorithm that reduce the asymptotic bias.
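The herding update underlying these algorithms can be illustrated on a single fixed discrete distribution: each weight is incremented by its target probability, the state with the largest weight is emitted, and that weight is decremented by one, so empirical frequencies track the target at an O(1/T) rate. The sketch below is our minimal illustration of this update (the function name and setup are ours, not from the paper), not the full herded Gibbs sweep over a Markov random field.

```python
import numpy as np

def herding_sample(p, T):
    """Deterministic herding for a discrete distribution p.

    Each step adds p to the per-state weights, emits the argmax
    state, and subtracts 1 from that state's weight. The weights
    stay bounded, so |counts/T - p| shrinks at an O(1/T) rate.
    """
    p = np.asarray(p, dtype=float)
    w = np.zeros_like(p)        # one weight per state
    counts = np.zeros_like(p)   # emitted-state frequencies
    for _ in range(T):
        w += p                  # accumulate target probabilities
        s = int(np.argmax(w))   # emit the most "owed" state
        w[s] -= 1.0             # settle the emitted state's debt
        counts[s] += 1
    return counts / T

freq = herding_sample([0.5, 0.3, 0.2], 1000)
```

In herded Gibbs, an update of this form is applied to each conditional distribution during the sweep; the full-weight variant keeps one weight vector per variable and per neighbor configuration, and weight sharing (as analyzed in the paper) reduces this number of weight vectors.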

