IEEE Transactions on Information Theory

On the maximum entropy of the sum of two dependent random variables


Abstract

We investigate the maximization of the differential entropy h(X+Y) of arbitrary dependent random variables X and Y under the constraint that X and Y have the same fixed marginal density f. We show that max[h(X+Y)] = h(2X) if and only if f is log-concave, with the maximum achieved when X = Y. If f is not log-concave, the maximum is strictly greater than h(2X). For example, identically distributed Gaussian random variables have log-concave densities and therefore satisfy max[h(X+Y)] = h(2X), attained at X = Y. More general inequalities in this direction should lead to capacity bounds for additive noise channels with feedback.
