
On cross‐validation for sparse reduced rank regression


In high dimensional data analysis, regularization methods pursuing sparsity and/or low rank have received much attention recently. To provide a proper amount of shrinkage, it is typical to use a grid search and a model comparison criterion to find the optimal regularization parameters. However, we show that fixing the parameters across all folds may result in an inconsistency issue, and it is more appropriate to cross‐validate projection–selection patterns to obtain the best coefficient estimate. Our in‐sample error studies in jointly sparse and rank deficient models lead to a new class of information criteria with four scale‐free forms to bypass the estimation of the noise level. By use of an identity, we propose a novel scale‐free calibration to help cross‐validation to achieve the minimax optimal error rate non‐asymptotically. Experiments support the efficacy of the methods proposed.
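The standard scheme the abstract critiques — tuning a single regularization parameter by grid search, with that parameter held fixed across all cross-validation folds — can be sketched as follows. This is a minimal illustration only, using a ridge penalty and synthetic data as stand-ins for the paper's jointly sparse and rank-deficient setting; all variable names and the 5-fold/lambda-grid choices are illustrative assumptions, not the authors' procedure.

```python
import numpy as np

# Illustrative data: sparse truth, as a stand-in for the paper's setting.
rng = np.random.default_rng(0)
n, p = 60, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + 0.5 * rng.normal(size=n)

def ridge_fit(X, y, lam):
    # Closed-form ridge estimate: (X'X + lam*I)^{-1} X'y.
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def cv_error(X, y, lam, k=5):
    # Average validation MSE over k folds, with lam held FIXED
    # across all folds -- the scheme the paper argues against.
    idx = np.arange(len(y))
    errs = []
    for val in np.array_split(idx, k):
        tr = np.setdiff1d(idx, val)
        b = ridge_fit(X[tr], y[tr], lam)
        errs.append(np.mean((y[val] - X[val] @ b) ** 2))
    return float(np.mean(errs))

# Grid search: pick the single lambda minimizing the CV error,
# then refit on the full data with that fixed parameter.
lam_grid = [0.01, 0.1, 1.0, 10.0, 100.0]
best_lam = min(lam_grid, key=lambda lam: cv_error(X, y, lam))
beta_hat = ridge_fit(X, y, best_lam)
```

The paper's point is that fixing `lam` across folds in this way can be inconsistent, and that cross-validating the induced projection-selection patterns (rather than the raw parameter values) yields a better coefficient estimate.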

Keywords: Cross‐validation; Information criterion; Low rank estimation; Scale‐free regularization; Variable selection

Document Type: Research Article

Publication date: February 1, 2019
