
Fast ensemble method of strong classifiers based on instances (基于实例的强分类器快速集成方法)

         

Abstract

Focusing on the issue that ensemble classifiers built from weak base classifiers must sacrifice a large amount of training time to reach high accuracy, an instance-based ensemble method for strong classifiers, named Fast Strong-classifiers Ensemble (FSE), was proposed. Firstly, an evaluation method was used to eliminate substandard classifiers and to rank the remaining classifiers by accuracy and diversity, yielding a set of classifiers with the highest accuracy and the greatest diversity. Secondly, the FSE algorithm broke the existing sample distribution and re-sampled so that the classifiers focused more on hard-to-learn samples. Finally, the ensemble was completed by simultaneously determining the weight of each classifier. Experiments were conducted on a UCI dataset and a customized real-world dataset. Boosting reached accuracies of 90.2% and 90.4% on the two datasets respectively, while FSE reached 95.6% and 93.9%. When both methods reached the same accuracy, the ensemble built with FSE shortened training time by 75% and 80% compared with the ensemble built with Boosting. Theoretical analysis and simulation results show that the FSE ensemble model can effectively improve recognition accuracy and shorten training time.
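The abstract describes FSE only at a high level; the paper's exact evaluation measure, re-sampling rule, and weighting formula are not given here. The following is a minimal Python sketch of an FSE-style pipeline under stated assumptions: the strong classifiers are pre-trained scikit-learn models, "substandard" removal is a simple accuracy threshold, diversity is measured by pairwise disagreement, and classifier weights come from an AdaBoost-like instance re-weighting pass over the already trained models. All function names, thresholds, and the weighting formula are illustrative, not the authors' exact method.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

def select_classifiers(models, X_val, y_val, min_acc=0.7, k=3):
    """Drop classifiers below min_acc, then greedily keep the k models that
    are accurate and disagree most with the ones already chosen (assumed
    stand-in for the paper's accuracy/diversity ranking)."""
    preds = {m: m.predict(X_val) for m in models}
    qualified = [m for m in models if np.mean(preds[m] == y_val) >= min_acc]
    qualified.sort(key=lambda m: np.mean(preds[m] == y_val), reverse=True)
    chosen = qualified[:1]
    while chosen and len(chosen) < min(k, len(qualified)):
        rest = [m for m in qualified if m not in chosen]
        # pick the remaining model with the largest average disagreement
        best = max(rest, key=lambda m: np.mean(
            [np.mean(preds[m] != preds[c]) for c in chosen]))
        chosen.append(best)
    return chosen

def fse_weights(models, X, y):
    """AdaBoost-style pass: re-weight hard instances and derive one weight
    per (already trained) strong classifier."""
    w = np.full(len(y), 1.0 / len(y))
    alphas = []
    for m in models:
        miss = (m.predict(X) != y)
        err = max(float(np.dot(w, miss)), 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)
        alphas.append(alpha)
        w *= np.exp(alpha * np.where(miss, 1.0, -1.0))  # focus on hard samples
        w /= w.sum()
    return np.array(alphas)

def predict_ensemble(models, alphas, X):
    """Weighted vote of the selected strong classifiers (binary 0/1 labels)."""
    votes = np.array([a * np.where(m.predict(X) == 1, 1.0, -1.0)
                      for m, a in zip(models, alphas)])
    return (votes.sum(axis=0) > 0).astype(int)

# Toy usage on synthetic data; real experiments in the paper use UCI and a custom dataset.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
candidates = [SVC().fit(X_tr, y_tr),
              DecisionTreeClassifier(max_depth=5).fit(X_tr, y_tr),
              KNeighborsClassifier().fit(X_tr, y_tr)]
chosen = select_classifiers(candidates, X_tr, y_tr)
alphas = fse_weights(chosen, X_tr, y_tr)
print("ensemble accuracy:", np.mean(predict_ensemble(chosen, alphas, X_te) == y_te))

Because the base classifiers are strong and trained only once, the expensive part of a weak-learner Boosting loop (repeated training on re-sampled data) is avoided; only the re-weighting pass over fixed predictions remains, which is consistent with the reported reduction in training time.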


