Quantized Minimum Error Entropy Criterion

Abstract

Compared with traditional learning criteria such as the mean square error, the minimum error entropy (MEE) criterion is superior in nonlinear and non-Gaussian signal processing and machine learning. The argument of the logarithm in Rényi's entropy estimator, called the information potential (IP), is a popular MEE cost in information-theoretic learning. The computational complexity of the IP is, however, quadratic in the number of samples due to a double summation, which creates a computational bottleneck, especially for large-scale data sets. To address this problem, we propose an efficient quantization approach that reduces the computational burden of the IP, decreasing the complexity from O(N²) to O(MN) with M ≪ N. The new learning criterion is called the quantized MEE (QMEE). Some basic properties of QMEE are presented, and illustrative examples with linear-in-parameter models are provided to verify the excellent performance of QMEE.
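The complexity reduction described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: it uses a Gaussian kernel for the IP estimator and a simple nearest-codeword online quantizer with threshold `eps` (both the quantizer details and the parameter names are assumptions). The key idea from the abstract is preserved: the inner sum over N samples is replaced by a sum over M ≪ N codewords weighted by their sample counts, turning an O(N²) double summation into an O(MN) one.

```python
import numpy as np

def gauss_kernel(x, sigma):
    """Gaussian kernel G_sigma(x) = exp(-x^2 / (2 sigma^2))."""
    return np.exp(-x**2 / (2.0 * sigma**2))

def information_potential(e, sigma):
    """Plain IP estimator: V(e) = (1/N^2) sum_i sum_j G_sigma(e_i - e_j).
    The double summation makes this O(N^2) in the sample number N."""
    diffs = e[:, None] - e[None, :]          # N x N pairwise differences
    return gauss_kernel(diffs, sigma).mean()

def quantize(e, eps):
    """Sketch of an online quantizer (an assumption, not the paper's scheme):
    map each sample to the nearest existing codeword if it lies within eps,
    otherwise add the sample as a new codeword. Returns the codebook and
    the number of samples assigned to each codeword."""
    codebook, counts = [], []
    for v in e:
        if codebook:
            d = np.abs(np.asarray(codebook) - v)
            j = int(d.argmin())
            if d[j] <= eps:
                counts[j] += 1
                continue
        codebook.append(v)
        counts.append(1)
    return np.asarray(codebook), np.asarray(counts)

def quantized_ip(e, sigma, eps):
    """QMEE-style IP: V_Q(e) = (1/N^2) sum_i sum_m M_m G_sigma(e_i - c_m),
    where c_1..c_M are the codewords and M_m their counts -> O(MN)."""
    c, m = quantize(e, eps)
    N = len(e)
    K = gauss_kernel(e[:, None] - c[None, :], sigma)   # N x M kernel matrix
    return float((K * m[None, :]).sum() / (N * N))
```

Because the quantizer perturbs each sample by at most `eps` and the Gaussian kernel is Lipschitz, the quantized IP stays close to the exact IP for small `eps`, while the cost drops from O(N²) to O(MN).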
