In this paper, conditions for the consistent selection of a subset from a large set of potential regressors are derived. It is assumed that the number of potential regressors increases with the sample size and, in addition, that the regressors are orthogonal. Subset-selection criteria are proposed that satisfy these conditions. These criteria do not depend on any tuning parameters. It is also shown that some other criteria, including AIC and BIC, violate these conditions. Simulation studies with different sample sizes and large sets of orthogonal regressors are conducted to compare the performance of the new criteria with that of conventional model-selection criteria. The results of these simulation studies corroborate the theoretical findings. In large samples, the consistent criteria always make the correct decisions: they include all genuine regressors and exclude the others. In contrast, AIC always tends to select the maximum number of regressors, and BIC is also not competitive when the number of potential regressors grows too fast.
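As a rough illustration of the setting (not the paper's proposed criteria), the following Python sketch generates orthonormal regressors and applies AIC- and BIC-style subset selection. With orthonormal columns, each regressor's contribution to the residual sum of squares decouples, so selection reduces to a per-regressor threshold; the threshold used here is the standard first-order approximation to the information-criterion comparison, and all parameter values (sample size, number of candidates, coefficient scale) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 500, 20, 5  # sample size, candidate regressors, genuine regressors

# Orthonormal design: columns of Q from a QR decomposition of Gaussian noise.
Q, _ = np.linalg.qr(rng.standard_normal((n, p)))
beta = np.zeros(p)
beta[:k] = 5.0  # illustrative signal strength for the genuine regressors
y = Q @ beta + rng.standard_normal(n)

# With orthonormal regressors the OLS coefficients are simply Q.T @ y,
# and including regressor j lowers the RSS by (Q[:, j] @ y) ** 2.
z2 = (Q.T @ y) ** 2

def select(penalty):
    """Keep regressor j when its RSS reduction exceeds penalty * sigma^2.

    This is the usual first-order approximation to comparing
    n * log(RSS / n) + penalty * df with and without regressor j.
    """
    rss_full = np.sum(y ** 2) - np.sum(z2)  # residual SS of the full model
    sigma2 = rss_full / n
    return np.flatnonzero(z2 > sigma2 * penalty)

aic = select(2.0)         # AIC: penalty of 2 per parameter
bic = select(np.log(n))   # BIC: penalty of log(n) per parameter
print("AIC keeps:", aic)
print("BIC keeps:", bic)
```

Because the BIC penalty log(n) exceeds the AIC penalty 2 for n > 7, the BIC-selected set is always a subset of the AIC-selected set; spurious regressors whose chi-squared-distributed RSS reductions fall between the two thresholds are the ones AIC over-selects.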