Quickly Boosting Decision Trees - Pruning Underachieving Features Early

International Conference on Machine Learning

Abstract

Boosted decision trees are among the most popular learning techniques in use today. While exhibiting fast speeds at test time, relatively slow training renders them impractical for applications with real-time learning requirements. We propose a principled approach to overcome this drawback. We prove a bound on the error of a decision stump given its preliminary error on a subset of the training data; the bound may be used to prune unpromising features early in the training process. We propose a fast training algorithm that exploits this bound, yielding speedups of an order of magnitude at no cost in the final performance of the classifier. Our method is not a new variant of Boosting; rather, it is used in conjunction with existing Boosting algorithms and other sampling methods to achieve even greater speedups.
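To make the pruning rule concrete, below is a minimal Python sketch of the feature-selection step, not the authors' reference implementation. It assumes binary labels in {-1, +1}, measures weighted error as an unnormalized sum, and takes the preliminary subset to be the m heaviest-weighted samples; the helper names best_stump_error and train_stump_pruned are hypothetical. The bound used is the simple one implied by the abstract: a stump's error on a subset can only grow as the remaining samples are added, so the best achievable error on the subset lower-bounds the best achievable error on the full set.

```python
import numpy as np

def best_stump_error(x, y, w):
    """Minimal weighted error of a threshold stump on one feature.

    x: feature values, y: labels in {-1, +1}, w: sample weights.
    Returns an unnormalized weighted misclassification error, minimized
    over all thresholds and both polarities (ties in x ignored for brevity).
    """
    order = np.argsort(x)
    y_s, w_s = y[order], w[order]
    pos = np.cumsum(w_s * (y_s > 0))   # weight of positives at or below each split
    neg = np.cumsum(w_s * (y_s < 0))   # weight of negatives at or below each split
    total_pos, total_neg = pos[-1], neg[-1]
    err_lo = pos + (total_neg - neg)   # predict -1 below threshold, +1 above
    err_hi = neg + (total_pos - pos)   # predict +1 below threshold, -1 above
    # total_pos / total_neg cover the degenerate constant predictions.
    return min(err_lo.min(), err_hi.min(), total_pos, total_neg)

def train_stump_pruned(X, y, w, m):
    """Pick the best stump over all features, pruning hopeless features early.

    For each feature, the error of the best stump on the m heaviest samples
    lower-bounds its error on the full set; if that bound already exceeds
    the best full error found so far, the feature is skipped without
    touching the remaining samples.
    """
    heavy = np.argsort(w)[::-1][:m]           # indices of the m heaviest samples
    best_err, best_feat = np.inf, None
    for j in range(X.shape[1]):
        eps_m = best_stump_error(X[heavy, j], y[heavy], w[heavy])
        if eps_m >= best_err:
            continue                          # pruned: cannot beat the current best
        eps = best_stump_error(X[:, j], y, w) # full evaluation only if still promising
        if eps < best_err:
            best_err, best_feat = eps, j
    return best_feat, best_err
```

Inside a Boosting loop, w would be the current sample weights; because Boosting concentrates weight on a small set of hard examples, a modest m already accounts for most of the total weight, which is what lets a bound of this kind reject most features after touching only a fraction of the data.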
