Journal of Nonparametric Statistics

Multiple predicting K-fold cross-validation for model selection


Abstract

K-fold cross-validation (CV) is widely adopted as a model selection criterion. In K-fold CV, (K - 1) folds are used for model construction and the hold-out fold is allocated to model validation. This implies model construction is emphasised more than the model validation procedure. However, some studies have revealed that greater emphasis on the validation procedure may result in improved model selection. Specifically, leave-m-out CV with n samples may achieve variable-selection consistency when m/n approaches 1. In this study, a new CV method is proposed within the framework of K-fold CV. The proposed method uses (K - 1) folds of the data for model validation, while the other fold is used for model construction. This provides (K - 1) predicted values for each observation, which are averaged to produce a final predicted value. Model selection based on the averaged predicted values then reduces variation in the assessment due to the averaging. The variable-selection consistency of the suggested method is established. Its advantage over K-fold CV with finite samples is examined under linear, non-linear, and high-dimensional models.
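The averaging scheme described in the abstract can be illustrated with a minimal sketch: each fold in turn serves as the construction set, the remaining (K - 1) folds are predicted, and every observation's (K - 1) predictions are averaged before the selection criterion is computed. This assumes a linear model fit by ordinary least squares; the function name `mpcv_score` and the random fold assignment are illustrative, not taken from the paper.

```python
import numpy as np

def mpcv_score(X, y, K=5, seed=0):
    """Multiple predicting K-fold CV (sketch).

    One fold builds the model; the other K-1 folds validate it.
    Each observation therefore receives K-1 predictions, which are
    averaged before computing the mean squared selection criterion.
    """
    n = len(y)
    rng = np.random.default_rng(seed)
    folds = rng.permutation(n) % K            # random fold assignment
    Xd = np.column_stack([np.ones(n), X])     # add an intercept column
    pred_sum = np.zeros(n)
    pred_cnt = np.zeros(n)
    for k in range(K):
        train = folds == k                    # ONE fold for construction
        test = ~train                         # K-1 folds for validation
        beta, *_ = np.linalg.lstsq(Xd[train], y[train], rcond=None)
        pred_sum[test] += Xd[test] @ beta
        pred_cnt[test] += 1
    avg_pred = pred_sum / pred_cnt            # pred_cnt == K-1 everywhere
    return np.mean((y - avg_pred) ** 2)       # averaged-prediction MSE
```

To use this for model selection, one would compute `mpcv_score` for each candidate set of predictor columns and keep the candidate with the smallest score.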
