IEEE Winter Conference on Applications of Computer Vision

MUSCLE: Strengthening Semi-Supervised Learning Via Concurrent Unsupervised Learning Using Mutual Information Maximization



Abstract

Deep neural networks are powerful, massively parameterized machine learning models that have been shown to perform well in supervised learning tasks. However, very large amounts of labeled data are usually needed to train deep neural networks. Several semi-supervised learning approaches have been proposed to train neural networks using smaller amounts of labeled data together with a large amount of unlabeled data. The performance of these semi-supervised methods degrades significantly as the size of the labeled dataset decreases. We introduce Mutual-information-based Unsupervised & Semi-supervised Concurrent LEarning (MUSCLE), a hybrid learning approach that uses mutual information to combine both unsupervised and semi-supervised learning. MUSCLE can be used as a standalone training scheme for neural networks, and can also be incorporated into other learning approaches. We show that the proposed hybrid model outperforms the state of the art on several standard benchmarks, including CIFAR-10, CIFAR-100, and Mini-Imagenet. Furthermore, the performance gain consistently increases with the reduction in the amount of labeled data, as well as in the presence of bias. We also show that MUSCLE has the potential to boost classification performance when used in the fine-tuning phase for a model pre-trained only on unlabeled data.
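The abstract describes combining a supervised objective with an unsupervised mutual-information-maximization term over unlabeled data. The sketch below is a minimal illustration of that general idea, not the paper's actual implementation: it estimates mutual information between soft class assignments of two augmented views of the same unlabeled inputs (an IIC-style joint estimate) and adds its negative, weighted by a hypothetical coefficient `lam`, to a supervised cross-entropy value. All function names and the weighting scheme are illustrative assumptions.

```python
import numpy as np

def mutual_information(p1, p2, eps=1e-8):
    """Estimate mutual information between two soft class-assignment
    matrices of shape (batch, classes), e.g. softmax outputs for two
    augmented views of the same unlabeled batch.

    NOTE: illustrative IIC-style estimator, not MUSCLE's exact objective.
    """
    # Empirical joint distribution over class pairs, shape (C, C).
    joint = p1.T @ p2 / p1.shape[0]
    # Symmetrize, since view order should not matter.
    joint = (joint + joint.T) / 2
    pi = joint.sum(axis=1, keepdims=True)  # marginal of view 1
    pj = joint.sum(axis=0, keepdims=True)  # marginal of view 2
    # I(Z1; Z2) = sum_ij P(i,j) * log( P(i,j) / (P(i) P(j)) )
    return float((joint * (np.log(joint + eps)
                           - np.log(pi + eps)
                           - np.log(pj + eps))).sum())

def hybrid_loss(ce_labeled, p1, p2, lam=1.0):
    """Supervised cross-entropy plus a weighted negative-MI term,
    so that minimizing the loss maximizes mutual information."""
    return ce_labeled - lam * mutual_information(p1, p2)
```

For intuition: if the two views receive perfectly consistent, confident assignments spread evenly over C classes, the MI estimate approaches its maximum of log C; if the assignments are uniform (uninformative), it approaches zero, so the unsupervised term rewards consistent, decisive predictions on unlabeled data.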
