
Fast kernel entropy estimation and optimization

Abstract

Differential entropy is a quantity used in many signal processing problems. Often we need to calculate not only the entropy itself, but also its gradient with respect to various variables, for efficient optimization, sensitivity analysis, etc. Entropy estimation can be based on an estimate of the probability density function, which is computationally costly if done naively. Some prior algorithms use computationally efficient non-parametric entropy estimators. However, differentiation of the previously proposed estimators is difficult and may even be undefined. To counter these obstacles, we consider non-parametric kernel entropy estimation that is differentiable. We present two different accelerated kernel algorithms. The first accelerates the entropy gradient calculation based on a back-propagation principle. It allows calculating the differential entropy gradient in the same complexity as that of calculating the entropy itself. The second algorithm accelerates the estimation of both entropy and its gradient by using fast convolution over a uniform grid. As an example, we apply both algorithms to blind source separation. (c) 2005 Elsevier B.V. All rights reserved.
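To make the grid-based acceleration concrete, here is a minimal Python sketch of a resubstitution kernel entropy estimate for 1-D data, sped up by binning the samples onto a uniform grid and convolving the bin counts with a sampled Gaussian kernel via FFT. The function name kernel_entropy_grid, the bandwidth rule, and the grid parameters are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np
from scipy.signal import fftconvolve

def kernel_entropy_grid(x, n_bins=512, bandwidth=None):
    """Resubstitution kernel estimate of differential entropy for 1-D data,
    accelerated by binning onto a uniform grid and using fast convolution."""
    x = np.asarray(x, dtype=float)
    n = x.size
    if bandwidth is None:
        # Silverman's rule of thumb for a Gaussian kernel (illustrative choice)
        bandwidth = 1.06 * x.std() * n ** (-0.2)

    # Uniform grid covering the samples, padded by a few bandwidths
    lo, hi = x.min() - 4 * bandwidth, x.max() + 4 * bandwidth
    edges = np.linspace(lo, hi, n_bins + 1)
    dx = edges[1] - edges[0]
    counts, _ = np.histogram(x, bins=edges)

    # Gaussian kernel K_h sampled at the grid spacing, normalized to integrate to 1
    half = int(np.ceil(4 * bandwidth / dx))
    t = np.arange(-half, half + 1) * dx
    kernel = np.exp(-0.5 * (t / bandwidth) ** 2)
    kernel /= kernel.sum() * dx

    # Binned KDE on the grid: p(g_j) ~ (1/n) * sum_k counts_k * K_h(g_j - g_k),
    # computed with an FFT-based convolution instead of a double loop
    density = fftconvolve(counts / n, kernel, mode="same")

    # Resubstitution entropy H ~ -(1/n) * sum_i log p(x_i),
    # reading the density off the grid cell each sample falls into
    idx = np.clip(((x - lo) / dx).astype(int), 0, n_bins - 1)
    p = np.maximum(density[idx], 1e-300)
    return -np.mean(np.log(p))

# Example: the entropy of a standard Gaussian is 0.5 * log(2 * pi * e) ~ 1.4189
rng = np.random.default_rng(0)
print(kernel_entropy_grid(rng.standard_normal(10_000)))
```

For the gradient, the same pipeline could in principle be differentiated end-to-end (for instance with an automatic-differentiation framework), which echoes the back-propagation idea behind the first algorithm, although the paper derives the gradient analytically.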
