21st International Conference on Computational Statistics (COMPSTAT 2014), Geneva, Switzerland, 19-22 August 2014
Principal covariates regression (PCovR) is a method that combines dimension reduction with regression: the predictors (X) are reduced to a few components, on which the criteria (Y) are regressed. The relative emphasis on the two aspects is controlled by a weighting parameter alpha, ranging from 0 (corresponding to reduced-rank regression) to 1 (corresponding to principal components regression). However, it is not obvious how the value of alpha should be tuned, nor how the number of components affects the optimal alpha. Recently, we integrated the scattered findings on the impact of alpha and conducted a simulation study that confirmed the resulting hypothesis that the effect of alpha is most pronounced when the underlying components differ strongly in strength (i.e., explained variance in X) and relevance (i.e., explained variance in Y). Additionally, we proposed model selection techniques that jointly select alpha and the number of components. In the present study, we evaluate the performance of these techniques, especially in conditions where model selection is challenging (e.g., the presence of components that have a zero regression weight, multiple criteria). Specifically, we study the recovery of the criteria and the underlying components.
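To make the role of alpha concrete, the trade-off can be sketched with a minimal numpy implementation. This follows the standard closed-form PCovR solution, in which the component scores are the leading eigenvectors of a weighted sum of XX' (scaled by alpha) and the outer product of the fitted values of Y (scaled by 1 - alpha); it is an illustrative sketch, not the specific implementation used in the study, and the function name and interface are hypothetical.

```python
import numpy as np

def pcovr(X, Y, n_components, alpha):
    """Illustrative PCovR sketch (not the authors' implementation).

    alpha = 1 emphasizes reconstructing X (principal components regression);
    alpha = 0 emphasizes predicting Y (reduced-rank regression).
    X: (n, p) predictor matrix, Y: (n, q) criterion matrix.
    """
    # Fitted values of Y from the full regression on X (hat matrix applied to Y)
    Yhat = X @ np.linalg.lstsq(X, Y, rcond=None)[0]
    # Weighted combination of X-reconstruction and Y-prediction objectives,
    # each normalized by its total sum of squares
    G = (alpha * (X @ X.T) / np.sum(X ** 2)
         + (1 - alpha) * (Yhat @ Yhat.T) / np.sum(Y ** 2))
    # Component scores: leading eigenvectors of the symmetric matrix G
    evals, evecs = np.linalg.eigh(G)          # ascending eigenvalues
    T = evecs[:, ::-1][:, :n_components]      # top components, orthonormal scores
    Px = T.T @ X                              # X loadings
    Py = T.T @ Y                              # regression weights for Y on T
    return T, Px, Py
```

Tuning alpha then amounts to rerunning this decomposition over a grid of alpha values (and numbers of components) and scoring each solution, e.g. by cross-validated prediction error, which is precisely where the model selection techniques discussed above come in.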