...
IEEE Transactions on Information Theory

How Biased Is Your Model? Concentration Inequalities, Information and Model Bias



Abstract

We derive tight and computable bounds on the bias of statistical estimators, or more generally of quantities of interest, when evaluated on a baseline model P rather than on the typically unknown true model Q. Our proposed method combines the scalable information inequality derived by P. Dupuis, K. Chowdhary, the authors and their collaborators together with classical concentration inequalities (such as Bennett's and Hoeffding-Azuma inequalities). Our bounds are expressed in terms of the Kullback-Leibler divergence R(Q|P) of model Q with respect to P and the moment generating function for the statistical estimator under P. Furthermore, concentration inequalities, i.e. bounds on moment generating functions, provide tight and computationally inexpensive model bias bounds for quantities of interest. Finally, they allow us to derive rigorous confidence bands for statistical estimators that account for model bias and are valid for an arbitrary amount of data.
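To make the abstract's statement concrete, the following is a minimal sketch (not the authors' published code) of how bounds of this type can be evaluated. It uses an information inequality of the form E_Q[f] - E_P[f] <= inf_{c>0} (1/c) [ log E_P exp(c (f - E_P[f])) + R(Q|P) ], estimating the moment generating function under the baseline P either from Monte Carlo samples or, when f is bounded in [a, b], via Hoeffding's lemma, which yields the closed form (b - a) sqrt(R(Q|P) / 2). The function names, the grid of c values, and the boundedness assumption are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.special import logsumexp

def bias_bound_hoeffding(kl_qp, a, b):
    """Closed-form bound on |E_Q[f] - E_P[f]| assuming a <= f <= b.
    Hoeffding's lemma bounds the centered cumulant generating function by
    c^2 (b - a)^2 / 8; optimizing over c > 0 gives (b - a) * sqrt(KL / 2)."""
    return (b - a) * np.sqrt(kl_qp / 2.0)

def bias_bound_mgf(f_samples_p, kl_qp, c_grid=None):
    """Monte Carlo bound: estimate Lambda(c) = log E_P[exp(c (f - E_P[f]))]
    from samples of f drawn under the baseline model P, then minimize
    (Lambda(c) + KL(Q|P)) / c over a grid of c > 0."""
    f = np.asarray(f_samples_p, dtype=float)
    fc = f - f.mean()                       # center f under P
    if c_grid is None:
        c_grid = np.logspace(-3, 3, 400)    # illustrative search grid for c
    lam = np.array([logsumexp(c * fc) - np.log(fc.size) for c in c_grid])
    return float(np.min((lam + kl_qp) / c_grid))

# Example: f uniform on [0, 1] under P, with a model discrepancy R(Q|P) = 0.05.
rng = np.random.default_rng(0)
samples = rng.uniform(0.0, 1.0, size=100_000)
print(bias_bound_hoeffding(0.05, 0.0, 1.0))   # ~0.158
print(bias_bound_mgf(samples, 0.05))          # tighter, uses the actual MGF under P
```

The Monte Carlo variant is typically tighter than the Hoeffding closed form because it uses the distribution of the quantity of interest under P rather than only its range; repeating the minimization with c < 0 (equivalently, with -f) yields the corresponding lower bound.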


