International Conference on Artificial Neural Networks

Central-Diffused Instance Generation Method in Class Incremental Learning



Abstract

Class incremental learning is widely applied in classification scenarios where the number of classes changes dynamically. Class imbalance often arises at the same time, because a newly emerging class is typically represented by only a few instances. Previous studies have proposed various methods to handle this problem, but they focus on classification tasks with a fixed class set and cannot adjust the peripheral contour of the original instance distribution. As a result, classification performance degrades seriously in an open, dynamic environment, and the synthetic instances stay clustered inside the original distribution. To handle class imbalance effectively within class incremental learning, we propose a Central-diffused Instance Generation Method, called CdIGM, which generates instances for the minority class as a new class emerges. The key idea is to shoot direction vectors of fixed length in random directions from the center of the new-class instances, expanding the instance distribution space. The diffused vectors form a distribution that is optimized to yield a multi-class discriminative classifier with good performance. We conduct experiments on artificial data streams with different imbalance rates as well as real-world streams, comparing CdIGM with other proposed methods, e.g. SMOTE, OPCIL, OB and SDCIL. The results show that CdIGM achieves average performance improvements of more than 4.01%, 4.49%, 8.81% and 9.76% over SMOTE, OPCIL, OB and SDCIL, respectively, and outperforms them in terms of overall and real-time accuracy. The method is shown to combine the strengths of class incremental learning and class imbalance learning with good accuracy and robustness.
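The abstract only sketches the generation step. A minimal illustrative sketch of the central-diffused idea (not the authors' implementation) might look like the following, assuming synthetic points are placed at a fixed radius from the centroid of the new-class instances along uniformly random directions; the function name generate_central_diffused, the radius parameter, and the toy data are hypothetical.

```python
import numpy as np

def generate_central_diffused(minority_X, n_synthetic, radius, seed=None):
    """Sketch of central-diffused instance generation.

    Places synthetic instances at a fixed distance `radius` from the
    centroid of the minority (new) class, along uniformly random
    directions, so the generated points spread outward instead of
    clustering inside the original distribution.
    """
    rng = np.random.default_rng(seed)
    center = minority_X.mean(axis=0)          # center of the new-class instances
    dim = minority_X.shape[1]
    # Random directions: sample Gaussian vectors and normalize to unit length.
    directions = rng.normal(size=(n_synthetic, dim))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    # Shoot each fixed-length direction vector from the center.
    return center + radius * directions

# Example: 5 observed instances of a new class in 2-D, oversampled to 20 points.
X_new = np.array([[0.9, 1.1], [1.0, 1.0], [1.1, 0.9], [0.95, 1.05], [1.05, 0.95]])
synthetic = generate_central_diffused(X_new, n_synthetic=20, radius=0.3, seed=0)
print(synthetic.shape)  # (20, 2)
```

Note that the paper further optimizes the resulting distribution so that the generated instances support a well-performing multi-class discriminative classifier; that optimization step is not reflected in this sketch.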
