IEEE International Conference on Acoustics, Speech and Signal Processing

Approximate Inference by Kullback-Leibler Tensor Belief Propagation



Abstract

Probabilistic programming provides a structured approach to signal processing algorithm design. The design task is formulated as a generative model, and the algorithm is derived through automatic inference. Efficient inference is a major challenge; e.g., the Shafer-Shenoy algorithm (SS) performs badly on models with large treewidth, which arise from various real-world problems. We focus on reducing the size of discrete models with large treewidth by storing intermediate factors in compressed form, thereby decoupling the variables through conditioning on introduced weights. This work proposes pruning of these weights using Kullback-Leibler divergence. We adapt a strategy from the Gaussian mixture reduction literature, leading to Kullback-Leibler Tensor Belief Propagation (KL-TBP), in which we use agglomerative hierarchical clustering to subsequently merge pairs of weights. Experiments using benchmark problems show KL-TBP consistently achieves lower approximation error than existing methods with competitive runtime.
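The reduction step described above — repeatedly merging pairs of introduced weights via agglomerative hierarchical clustering under a Kullback-Leibler cost — can be illustrated on a small discrete mixture. The sketch below is an assumption-laden illustration, not the paper's implementation: the function names are hypothetical, and the moment-matched merge with a weighted-KL merge cost is carried over from the Gaussian mixture reduction literature (Runnalls-style bound) that the abstract adapts.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """KL divergence D(p || q) between two discrete distributions."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))

def merge_weights(weights, comps, target):
    """Greedy agglomerative reduction of a weighted discrete mixture.

    weights: length-K mixture weights; comps: (K, D) array, rows are
    distributions over D states. Repeatedly merges the pair whose
    weighted-KL cost is smallest until `target` components remain.
    """
    weights = list(np.asarray(weights, dtype=float))
    comps = [np.asarray(c, dtype=float) for c in comps]
    while len(weights) > target:
        best = None
        for i in range(len(weights)):
            for j in range(i + 1, len(weights)):
                wm = weights[i] + weights[j]
                # Moment-matched merge of the two components.
                qm = (weights[i] * comps[i] + weights[j] * comps[j]) / wm
                # Upper bound on the KL cost of this merge.
                cost = weights[i] * kl(comps[i], qm) + weights[j] * kl(comps[j], qm)
                if best is None or cost < best[0]:
                    best = (cost, i, j, wm, qm)
        _, i, j, wm, qm = best
        # Replace component i with the merged one, drop component j.
        weights[i], comps[i] = wm, qm
        del weights[j], comps[j]
    return np.array(weights), np.vstack(comps)
```

For example, reducing `weights=[0.5, 0.3, 0.2]` with components `[0.9, 0.1]`, `[0.8, 0.2]`, `[0.1, 0.9]` to two components merges the two similar distributions and preserves the total weight; in KL-TBP an analogous merge would be applied to the weights introduced when intermediate factors are stored in compressed form.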
