IEEE Transactions on Information Theory

Information-theoretic asymptotics of Bayes methods


Abstract

In the absence of knowledge of the true density function, Bayesian models take the joint density function for a sequence of n random variables to be an average of densities with respect to a prior. The authors examine the relative entropy distance D_n between the true density and the Bayesian density and show that the asymptotic distance is (d/2)(log n) + c, where d is the dimension of the parameter vector. Therefore, the relative entropy rate D_n/n converges to zero at rate (log n)/n. The constant c, which the authors explicitly identify, depends only on the prior density function and the Fisher information matrix evaluated at the true parameter value. Consequences are given for density estimation, universal data compression, composite hypothesis testing, and stock-market portfolio selection.
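As a pointer for readers of the abstract, below is a sketch of the expansion in the form in which this result is usually quoted. The notation is assumed here rather than fixed by the abstract: w denotes the prior density and I(θ₀) the Fisher information matrix at the true parameter θ₀, with regularity conditions as stated in the paper.

% Sketch of the asymptotic expansion summarized in the abstract.
% Assumed notation: w(.) is the prior density, I(theta_0) the Fisher
% information matrix at the true parameter theta_0.
\[
  D_n \;=\; \frac{d}{2}\log\frac{n}{2\pi e}
        \;+\; \frac{1}{2}\log\det I(\theta_0)
        \;-\; \log w(\theta_0) \;+\; o(1),
\]
% so the constant c of the abstract takes the form
\[
  c \;=\; \frac{d}{2}\log\frac{1}{2\pi e}
      \;+\; \frac{1}{2}\log\det I(\theta_0)
      \;-\; \log w(\theta_0),
\]
which depends only on the prior density and the Fisher information at the true parameter, as the abstract states.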
