Predictor relative importance and matching regression parameters
Predictor importance is one of the main operational tools that applied regression modeling offers to managers and decision-makers. The paper considers the estimation of predictor importance in regression using measures introduced by Gibson and R. Johnson (GJ), modified by Green, Carroll, and DeSarbo, and developed further by J. Johnson (JJ). These importance indices are based on an orthonormal decomposition of the data matrix, and the work shows how to improve this approximation. Using the importance estimates, the regression coefficients can also be adjusted to reach the best data fit while remaining meaningful and interpretable. The results are compared with Shapley value regression (SVR), which is robust to multicollinearity but computationally demanding. They show that the JJ index performs well for importance estimation, but the GJ index outperforms it when both predictor importance and regression coefficients are needed; hence, the GJ index can be used in place of the more computationally intensive SVR estimation. The proposed approach is easy to compute and very useful in practical regression modeling and analysis, especially for big data.
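As a rough illustration of the kind of computation the abstract refers to, the sketch below estimates Johnson's relative weights (the JJ index) from the correlation matrices via the orthonormal approximation of the predictors. The function name and the NumPy-based formulation are assumptions for illustration, not the paper's own implementation or code.

```python
import numpy as np

def johnson_relative_weights(X, y):
    """Sketch of Johnson's relative weights (JJ index).

    X: (n, p) predictor matrix, y: (n,) response.
    Returns raw weights (summing to R-squared) and percentage shares.
    """
    p = X.shape[1]

    # Correlations among predictors and between predictors and y
    R = np.corrcoef(np.column_stack([X, y]), rowvar=False)
    Rxx, Rxy = R[:p, :p], R[:p, p]

    # Symmetric square root of Rxx from its eigendecomposition; its
    # entries are the correlations between the predictors and their
    # closest set of orthonormal variables Z (the orthonormal approximation)
    evals, evecs = np.linalg.eigh(Rxx)
    Lam = evecs @ np.diag(np.sqrt(evals)) @ evecs.T

    # Standardized coefficients of y regressed on the orthonormal Z
    beta = np.linalg.solve(Lam, Rxy)

    # Raw relative weights: squared loadings times squared coefficients
    raw = (Lam ** 2) @ (beta ** 2)
    return raw, 100.0 * raw / raw.sum()

# Hypothetical usage on simulated data
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = X @ np.array([0.5, 0.3, 0.1, 0.0]) + rng.normal(size=500)
raw, pct = johnson_relative_weights(X, y)
```

The raw weights are nonnegative and sum to the model R-squared, which is what makes them convenient importance shares; by contrast, Shapley value regression averages contributions over all 2^p predictor subsets, which is the source of the computational cost the paper contrasts against.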
Keywords: Johnson relative weights; Shapley value regression; multicollinearity; orthonormal approximation; predictor importance
Document Type: Research Article
Affiliations: GfK North America, 8401 Golden Valley Road, Minneapolis, MN, 55427, USA
Publication date: 04 May 2015