
Penalized Gaussian Process Regression and Classification for High‐Dimensional Nonlinear Data


Abstract:

The model based on a Gaussian process (GP) prior and a kernel covariance function can be used to fit nonlinear data with multidimensional covariates. It has served as a flexible nonparametric approach to curve fitting, classification, clustering, and other statistical problems, and has been widely applied to complex nonlinear systems in many areas, particularly in machine learning. However, the model becomes challenging for large-scale and high-dimensional data sets, for example, the meat data discussed in this article, which have 100 highly correlated covariates. For such data, the model suffers from large variance in parameter estimation and high predictive error, and its computation is numerically unstable. In this article, a penalized likelihood framework is applied to GP-based models. Different penalties are investigated, and their suitability for the characteristics of GP models is discussed. Asymptotic properties are established, with the relevant proofs. Several applications to real biomechanical and bioinformatics data sets are reported.
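To illustrate the general idea of penalizing a GP marginal likelihood (not the authors' specific method or penalties), the following sketch fits a GP regression with an automatic-relevance-determination (ARD) kernel and adds a ridge penalty on the inverse length-scales, shrinking the relevance of uninformative covariates. All names, the penalty choice, and the data are hypothetical illustrations.

```python
# Minimal sketch, assuming an ARD squared-exponential kernel and an
# L2 penalty on inverse length-scales; NOT the authors' implementation.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 40, 5                          # small example; only covariate 0 matters
X = rng.normal(size=(n, p))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

def ard_kernel(X1, X2, w):
    # ARD squared-exponential kernel; w[j] is the inverse length-scale
    # (the "relevance") of covariate j.
    d = (X1[:, None, :] - X2[None, :, :]) * w
    return np.exp(-0.5 * np.sum(d ** 2, axis=-1))

def penalized_nll(w, lam=1.0, noise=0.1):
    # Negative log marginal likelihood of the GP plus a ridge penalty
    # on w, which pulls irrelevant covariates toward zero relevance.
    K = ard_kernel(X, X, w) + noise ** 2 * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    nll = 0.5 * y @ alpha + np.sum(np.log(np.diag(L)))
    return nll + lam * np.sum(w ** 2)

res = minimize(penalized_nll, x0=np.ones(p), method="L-BFGS-B")
w_hat = np.abs(res.x)                 # kernel is symmetric in the sign of w
print(np.round(w_hat, 2))             # relevance of covariate 0 should dominate
```

With a sparsity-inducing penalty in place of the ridge term, the estimated relevances of irrelevant covariates can be driven exactly to zero, which is the variable-selection behaviour the abstract alludes to for highly correlated, high-dimensional covariates.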

Document Type: Research Article

DOI: http://dx.doi.org/10.1111/j.1541-0420.2011.01576.x

Affiliations: 1: School of Mathematics & Statistics, Newcastle University, United Kingdom 2: Department of Statistics, Korea University, South Korea

Publication date: December 1, 2011
