International Journal of Pattern Recognition and Artificial Intelligence

INCREMENTAL SPARSE PSEUDO-INPUT GAUSSIAN PROCESS REGRESSION



Abstract

In this paper, we devise a novel method that incrementally learns pseudo-data representing the whole training data set for Gaussian Process (GP) regression. The method involves a sparse approximation of the GP, extending the work of Snelson and Ghahramani. We call the proposed method Incremental Sparse Pseudo-input Gaussian Process (ISPGP) regression. Unlike Snelson and Ghahramani's work, the proposed ISPGP algorithm can train either from a huge amount of training data by scanning through it only once or from an online, incrementally arriving training data set. We also design a likelihood weighting scheme to incrementally determine pseudo-data while maintaining representational power. Owing to the nature of the incremental learning algorithm, the proposed ISPGP algorithm can in theory handle infinite data streams, to which the conventional GP or Sparse Pseudo-input Gaussian Process (SPGP) algorithm is not applicable. Our experimental results on the KIN40K data set show that the proposed ISPGP algorithm is comparable to the conventional GP algorithm trained on the same number of data points. It also significantly reduces the computational cost and memory requirement of regression and scales to a large training data set without significant performance degradation. Although the proposed ISPGP algorithm performs slightly worse than Snelson and Ghahramani's SPGP algorithm, the level of performance degradation is acceptable.
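To make the pseudo-input idea concrete, the following is a minimal sketch of the SPGP/FITC-style predictive mean that the paper builds on: a set of m pseudo-inputs stands in for the full n-point training set, reducing prediction cost from O(n^3) to roughly O(n m^2). This is an illustrative implementation of the standard FITC predictive equations, not the authors' ISPGP algorithm; the kernel hyperparameters and the function names (`rbf`, `spgp_predict`) are our own assumptions for the sketch, and the pseudo-inputs are simply placed on a grid rather than selected by the paper's likelihood weighting scheme.

```python
import numpy as np

def rbf(A, B, ell=1.0, sf2=1.0):
    # Squared-exponential kernel matrix between row sets A (n x d) and B (m x d).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return sf2 * np.exp(-0.5 * d2 / ell**2)

def spgp_predict(X, y, Xu, Xs, noise=1e-2, ell=1.0, sf2=1.0):
    """FITC/SPGP predictive mean at test inputs Xs, using pseudo-inputs Xu."""
    m = Xu.shape[0]
    Kuu = rbf(Xu, Xu, ell, sf2) + 1e-8 * np.eye(m)   # jitter for stability
    Kfu = rbf(X, Xu, ell, sf2)
    Kuu_inv = np.linalg.inv(Kuu)
    # Diagonal FITC correction: training variance not explained by the pseudo-points.
    qff = np.einsum('ij,jk,ik->i', Kfu, Kuu_inv, Kfu)
    Lam = sf2 - qff + noise                    # diag(Kff - Qff) + sigma^2
    # Sigma = Kuu + Kuf Lam^{-1} Kfu  (only m x m systems are solved)
    Sigma = Kuu + Kfu.T @ (Kfu / Lam[:, None])
    alpha = np.linalg.solve(Sigma, Kfu.T @ (y / Lam))
    return rbf(Xs, Xu, ell, sf2) @ alpha
```

The key point for the incremental setting described above is that all training information enters only through the m x m matrix `Sigma` and the m-vector `Kfu.T @ (y / Lam)`, both of which are sums over training points and can therefore be updated as new data arrive in a single pass.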
