Frontiers in Applied Mathematics and Statistics

Optimal Rates for the Regularized Learning Algorithms under General Source Condition

Abstract

We consider learning algorithms under a general source condition with polynomial decay of the eigenvalues of the integral operator in the vector-valued function setting. We discuss the upper convergence rates of the Tikhonov regularizer under a general source condition corresponding to an increasing monotone index function. The convergence issues are studied for general regularization schemes using the concept of operator monotone index functions in the minimax setting. Further, we also address the minimum possible error for any learning algorithm.
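For orientation only, the objects named in the abstract can be written out in the standard notation of learning theory over a reproducing kernel Hilbert space; this is a minimal sketch, and the symbols $\mathcal{H}$, $L_K$, $\phi$, $R$, and $b$ are assumptions introduced here for illustration rather than taken from the page:

\[
f_{\mathbf{z}}^{\lambda} = \operatorname*{arg\,min}_{f \in \mathcal{H}} \left\{ \frac{1}{m} \sum_{i=1}^{m} \| f(x_i) - y_i \|^2 + \lambda \| f \|_{\mathcal{H}}^2 \right\}
\qquad \text{(Tikhonov regularizer)}
\]

\[
f_{\rho} \in \Omega_{\phi, R} = \left\{ f \in \mathcal{H} : f = \phi(L_K)\, g,\ \| g \|_{\mathcal{H}} \le R \right\}
\qquad \text{(general source condition with index function } \phi\text{)}
\]

\[
\lambda_i(L_K) \asymp i^{-b}, \quad b > 1
\qquad \text{(polynomial decay of the eigenvalues of the integral operator)}
\]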