IEEE Transactions on Information Theory

Mixture Models, Bayes Fisher Information, and Divergence Measures



Abstract

This paper presents Bayes Fisher information measures, defined as the expected Fisher information under a distribution for the parameter, for the arithmetic, geometric, and generalized mixtures of two probability density functions. The Fisher information of the arithmetic mixture about the mixing parameter is related to the chi-square divergence, the Shannon entropy, and the Jensen-Shannon divergence. The Bayes Fisher measures of the three mixture models are related to the Kullback-Leibler, Jeffreys, Jensen-Shannon, Rényi, and Tsallis divergences. These measures indicate that the farther apart the two components are, the more informative the data are about the mixing parameter. We also unify three different relative-entropy derivations of the geometric mixture that are scattered across the statistics and physics literatures. Extending two of these formulations to the minimization of the Tsallis divergence yields the generalized mixture as the solution.
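
For concreteness, the display below is a minimal sketch of the arithmetic-mixture setup the abstract refers to; the symbols f_0, f_1, the mixing parameter α, and the prior π are generic notation introduced here for illustration rather than taken from the paper itself. The arithmetic mixture of two densities and its Fisher information about the mixing parameter are

\[
  f_\alpha(x) = \alpha f_1(x) + (1-\alpha) f_0(x), \qquad 0 \le \alpha \le 1,
\]
\[
  \mathcal{I}_X(\alpha)
  = \mathrm{E}_\alpha\!\left[\bigl(\partial_\alpha \log f_\alpha(X)\bigr)^2\right]
  = \int \frac{\bigl(f_1(x) - f_0(x)\bigr)^2}{f_\alpha(x)}\,dx .
\]

A standard calculation then gives the endpoints of this curve as Pearson chi-square divergences, \( \mathcal{I}_X(0) = \chi^2(f_1 \| f_0) \) and \( \mathcal{I}_X(1) = \chi^2(f_0 \| f_1) \), which is one way the chi-square divergence enters the picture. The Bayes Fisher information is the average of \( \mathcal{I}_X(\alpha) \) under a prior \( \pi \) for the mixing parameter,

\[
  \mathcal{I}_B = \int_0^1 \mathcal{I}_X(\alpha)\,\pi(\alpha)\,d\alpha .
\]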

