
Utilizing the multivariate reduced rank regression model for selecting predictors in linear multivariate regression


Abstract

Applications in which several quantities are to be predicted from a common set of predictor variables are becoming increasingly important in many disciplines. However, three problems arise in applying the usual multivariate multiple regression model. First, the model assumes that the regression coefficient matrix is of full rank. Second, it may include a large number of predictors, some of which are only weakly correlated with the dependent variables or are redundant because of high correlations with other independent variables. Third, the model does not account for the correlations among the dependent variables, which may reduce prediction accuracy.

Multivariate variable selection procedures (stepwise regression and all-possible-regressions) are usually used to address the second problem; the first and third problems, however, are usually ignored by users of multivariate multiple regression. Reinsel & Velu (1998) propose a technique that addresses these problems simultaneously. Their variable selection procedure, which uses the reduced rank regression model, takes the interrelations among the multiple dependent variables into account when searching for the "best" subset of predictors. This approach is called the reduced rank regression (RRR) procedure.

This study compares, using Monte Carlo methods, several multivariate variable selection procedures and the RRR procedure for selecting the "best" subset of predictors under various conditions: four levels of sample size (n = 30, 60, 100, and 500), two levels of the total number of predictors (k = 5 and 9), three levels of the correlation among the dependent variables (ρ_y = .2, .5, and .8), three levels of the correlation among the independent variables (ρ_x = .1, .3, and .5), and two levels of the correlation between the set of independent variables and the dependent variables (ρ_zy = .1 and .4).

The results demonstrate that the RRR procedure is superior to all of the other multivariate variable selection procedures studied. It is the best of the selection criteria under most of the study conditions, although Akaike's corrected information criterion (AICc) performed satisfactorily under some conditions.
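For readers unfamiliar with reduced rank regression, the following is a minimal NumPy sketch of the basic idea behind the model referenced above: fit the ordinary least-squares coefficient matrix, then project it onto the leading right singular vectors of the fitted values, which gives the rank-constrained least-squares solution under identity error weighting. This is an illustration only, not the selection procedure evaluated in the dissertation; the function name and the simulated data are purely illustrative.

    import numpy as np

    def reduced_rank_regression(X, Y, rank):
        # Minimize ||Y - X B||_F^2 subject to rank(B) <= rank.
        # Classical solution (identity weighting): project the OLS coefficients
        # onto the leading right singular vectors of the OLS fitted values.
        B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)   # k x q full-rank OLS fit
        Y_hat = X @ B_ols                               # fitted values, n x q
        _, _, Vt = np.linalg.svd(Y_hat, full_matrices=False)
        V_r = Vt[:rank].T                               # q x r response subspace
        return B_ols @ V_r @ V_r.T                      # rank-r coefficient matrix

    # Illustrative use on simulated data (dimensions loosely echo the study design)
    rng = np.random.default_rng(0)
    n, k, q, r = 100, 5, 3, 2
    X = rng.normal(size=(n, k))
    B_true = rng.normal(size=(k, r)) @ rng.normal(size=(r, q))   # true rank r
    Y = X @ B_true + rng.normal(scale=0.5, size=(n, q))
    B_rrr = reduced_rank_regression(X, Y, rank=r)
    print(np.linalg.matrix_rank(B_rrr))                          # prints 2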

Bibliographic Information

  • Author: Al-Subaihi, Ali Ahmed
  • Author affiliation: University of Pittsburgh
  • Degree-granting institution: University of Pittsburgh
  • Subject: Statistics
  • Degree: Ph.D.
  • Year: 2000
  • Pages: 98 p.
  • Total pages: 98
  • Format: PDF
  • Language: eng
