IEEE Transactions on Information Theory

Analysis of KNN Density Estimation

Abstract

We analyze the convergence rates of the $k$ nearest neighbor (kNN) density estimation method under the $\ell_{\alpha}$ norm with $\alpha \in [1,\infty]$. Our analysis covers two cases, depending on whether the support set is bounded. In the first case, the probability density function has bounded support. We show that if the support set is known, the kNN density estimator is minimax optimal under $\ell_{\alpha}$ for both $\alpha \in [1,\infty)$ and $\alpha = \infty$. If the support is unknown, the kNN density estimator is still minimax optimal under $\ell_{1}$, but it is suboptimal under $\ell_{\alpha}$ for $\alpha > 1$ and not consistent under $\ell_{\infty}$. In the second case, the support is unbounded and the probability density function is smooth everywhere; moreover, the Hessian is assumed to decay with the density values. For this case, our result shows that the $\ell_{\infty}$ error of kNN density estimation is nearly minimax optimal, whereas the $\ell_{\alpha}$ error of the original kNN density estimator is not consistent. To address this issue, we design a new adaptive kNN estimator, which can select a different $k$ for each sample. With this adaptive estimator, the $\ell_{\alpha}$ bound is minimax optimal. For comparison, we show that the popular kernel density estimator is not minimax optimal in this case.
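For readers unfamiliar with the estimator being analyzed, the standard (non-adaptive) kNN density estimate at a point $x$ is $\hat{f}(x) = k / (n\, V_d\, R_k(x)^d)$, where $R_k(x)$ is the distance from $x$ to its $k$-th nearest sample and $V_d$ is the volume of the unit ball in $\mathbb{R}^d$. Below is a minimal NumPy sketch of this baseline estimator (function name and brute-force distance computation are illustrative choices, not from the paper; the adaptive per-sample choice of $k$ proposed in the paper is not implemented here):

```python
import numpy as np
from math import gamma, pi

def knn_density(x_query, data, k):
    """Standard kNN density estimate: f_hat(x) = k / (n * V_d * R_k(x)^d),
    where R_k(x) is the distance from x to its k-th nearest sample and
    V_d is the volume of the unit ball in d dimensions."""
    data = np.asarray(data, dtype=float)
    x_query = np.atleast_2d(np.asarray(x_query, dtype=float))
    n, d = data.shape
    v_d = pi ** (d / 2) / gamma(d / 2 + 1)  # volume of the unit d-ball
    # brute-force distances from each query point to every sample
    dists = np.linalg.norm(data[None, :, :] - x_query[:, None, :], axis=2)
    r_k = np.sort(dists, axis=1)[:, k - 1]  # distance to the k-th neighbor
    return k / (n * v_d * r_k ** d)

# Example: on uniform data over [0, 1], the estimate near the center
# should be close to the true density value of 1.
rng = np.random.default_rng(0)
samples = rng.uniform(0.0, 1.0, size=(2000, 1))
print(knn_density([[0.5]], samples, k=50))
```

For large datasets one would replace the brute-force distance matrix with a spatial index (e.g. a k-d tree), but the estimator itself is unchanged.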
