Smoothing spline Gaussian regression: more scalable computation via efficient approximation

Abstract:

Smoothing splines via the penalized least squares method provide versatile and effective nonparametric models for regression with Gaussian responses. The computation of smoothing splines is generally of the order O(n³), n being the sample size, which severely limits its practical applicability. We study more scalable computation of smoothing spline regression via certain low dimensional approximations that are asymptotically as efficient. A simple algorithm is presented and the Bayes model that is associated with the approximations is derived, with the latter guiding the porting of Bayesian confidence intervals. The practical choice of the dimension of the approximating space is determined through simulation studies, and empirical comparisons of the approximations with the exact solution are presented. Also evaluated is a simple modification of the generalized cross-validation method for smoothing parameter selection, which to a large extent fixes the occasional undersmoothing problem that is suffered by generalized cross-validation.
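
For readers unfamiliar with the setup, a minimal sketch of the standard penalized least squares formulation behind smoothing splines (generic notation, not necessarily that of the paper): the estimate minimizes

\[
\frac{1}{n}\sum_{i=1}^{n}\bigl(Y_i - \eta(x_i)\bigr)^2 + \lambda\, J(\eta),
\]

where \(J(\eta)\) is a roughness penalty (for cubic smoothing splines, \(\int (\eta'')^2\,dx\)) and \(\lambda\) is the smoothing parameter chosen, e.g., by generalized cross-validation. The exact minimizer lies in an n-dimensional function space, which is what drives the O(n³) cost; the low dimensional approximations studied in the paper restrict the minimization to a space of much smaller dimension while remaining asymptotically as efficient.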

Keywords: Bayesian confidence interval; Computation; Generalized cross-validation; Penalized least squares

Document Type: Research Article

DOI: http://dx.doi.org/10.1046/j.1369-7412.2003.05316.x

Affiliations: 1: Yale University, New Haven, USA. 2: Purdue University, West Lafayette, USA.

Publication date: April 1, 2004
