We analyze the convergence rates of the $k$ nearest neighbor (kNN) density estimation method under the $\ell_\alpha$ norm with $\alpha \in [1,\infty]$. Our analysis covers two cases, depending on whether the support set is bounded. In the first case, the probability density function has bounded support. We show that if the support set is known, then the kNN density estimator is minimax optimal under $\ell_\alpha$ for both $\alpha \in [1,\infty)$ and $\alpha = \infty$. If the support is unknown, the kNN density estimator remains minimax optimal under $\ell_1$, but is suboptimal under $\ell_\alpha$ for $\alpha > 1$, and is not consistent under $\ell_\infty$. In the second case, the support is unbounded and the probability density function is smooth everywhere; moreover, the Hessian is assumed to decay with the density values. For this case, our results show that the $\ell_\infty$ error of kNN density estimation is nearly minimax optimal, whereas the $\ell_\alpha$ error of the original kNN estimator with finite $\alpha$ is not consistent. To address this issue, we design a new adaptive kNN estimator, which can select a different $k$ for each sample. With this adaptive estimator, the $\ell_\alpha$ bound becomes minimax optimal. For comparison, we show that the popular kernel density estimator is not minimax optimal in this case.
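For orientation, here is a minimal sketch of the standard fixed-$k$ estimator $\hat{f}(x) = k / (n\, V_d\, \rho_k(x)^d)$, where $\rho_k(x)$ is the distance from $x$ to its $k$-th nearest sample and $V_d$ is the unit-ball volume in $\mathbb{R}^d$. This is the generic textbook form that the analysis concerns, not the adaptive estimator proposed in the paper; the function and variable names are illustrative.

```python
import numpy as np
from math import gamma, pi

def knn_density(x_query, samples, k):
    """Fixed-k kNN density estimate: f_hat(x) = k / (n * V_d * r_k(x)^d).

    x_query: (q, d) array of query points.
    samples: (n, d) array of i.i.d. draws from the unknown density.
    k:       number of neighbors (same k for every query point).
    """
    n, d = samples.shape
    # Pairwise distances from each query point to every sample: shape (q, n).
    dists = np.linalg.norm(samples[None, :, :] - x_query[:, None, :], axis=-1)
    # Distance to the k-th nearest sample for each query point.
    r_k = np.sort(dists, axis=1)[:, k - 1]
    # Volume of the unit ball in R^d.
    v_d = pi ** (d / 2) / gamma(d / 2 + 1)
    return k / (n * v_d * r_k ** d)
```

On an evenly spaced 1-D grid over $[0,1]$ (approximating the uniform density), the estimate at an interior point is close to 1, which matches the true density.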