Penalized likelihood methods provide a range of practical modelling tools, including spline smoothing, generalized additive models and variants of ridge regression. Selecting the correct weights for penalties is a critical part of using these methods and in the single-penalty case the analyst has several well-founded techniques to choose from. However, many modelling problems suggest a formulation employing multiple penalties, and here general methodology is lacking. A wide family of models with multiple penalties can be fitted to data by iterative solution of the generalized ridge regression problem: minimize $\|W^{1/2}(Xp - y)\|^2 \rho + \sum_{i=1}^{m} \theta_i p' S_i p$ ($p$ is a parameter vector, $X$ a design matrix, $S_i$ a non-negative definite coefficient matrix defining the $i$th penalty with associated smoothing parameter $\theta_i$, $W$ a diagonal weight matrix, $y$ a vector of data or pseudodata and $\rho$ an 'overall' smoothing parameter included for computational efficiency). This paper shows how smoothing parameter selection can be performed efficiently by applying generalized cross-validation to this problem and how this allows non-linear, generalized linear and linear models to be fitted using multiple penalties, substantially increasing the scope of penalized modelling methods. Examples of non-linear modelling, generalized additive modelling and anisotropic smoothing are given.
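The penalized least squares problem stated in the abstract has a closed-form solution for fixed smoothing parameters, and a generalized cross-validation (GCV) score can be computed from the resulting influence matrix. The following is a minimal illustrative sketch of that computation (the function name, basis and penalty choices are assumptions for illustration, not the paper's algorithm, which selects all the $\theta_i$ efficiently rather than evaluating the score at fixed values):

```python
import numpy as np

def fit_multi_penalty(X, y, S_list, theta, W=None, rho=1.0):
    """Illustrative sketch: solve
        min_p  rho * ||W^{1/2}(X p - y)||^2 + sum_i theta_i * p' S_i p
    for fixed smoothing parameters theta, and return the fitted
    parameter vector together with the GCV score for that theta.
    """
    n, q = X.shape
    if W is None:
        W = np.eye(n)
    # Combined penalty matrix: sum_i theta_i * S_i
    S = sum(t * Si for t, Si in zip(theta, S_list))
    # Normal equations for the penalized problem
    A = rho * X.T @ W @ X + S
    b = rho * X.T @ W @ y
    p = np.linalg.solve(A, b)
    # Influence (hat) matrix H mapping y to fitted values: H = rho * X A^{-1} X' W
    H = rho * X @ np.linalg.solve(A, X.T @ W)
    resid = y - X @ p
    rss = float(resid.T @ W @ resid)
    edf = np.trace(H)                     # effective degrees of freedom
    gcv = n * rss / (n - edf) ** 2        # generalized cross-validation score
    return p, gcv
```

In practice one would minimize the GCV score over the smoothing parameters, e.g. by evaluating it on a grid of $\theta$ values or, as the paper develops, by a far more efficient dedicated scheme.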