
Properties of bagged nearest neighbour classifiers




It is shown that bagging, a computationally intensive method, asymptotically improves the performance of nearest neighbour classifiers provided that the resample size is less than 69% of the actual sample size, in the case of with-replacement bagging, or less than 50% of the sample size, for without-replacement bagging. However, for larger sampling fractions there is no asymptotic difference between the risk of the regular nearest neighbour classifier and its bagged version. In particular, neither achieves the large sample performance of the Bayes classifier. In contrast, when the sampling fractions converge to 0, but the resample sizes diverge to ∞, the bagged classifier converges to the optimal Bayes rule and its risk converges to the risk of the latter. These results are most readily seen when the two populations have well-defined densities, but they may also be derived in other cases, where densities exist in only a relative sense. Cross-validation can be used effectively to choose the sampling fraction. Numerical calculation is used to illustrate these theoretical properties.
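The procedure studied in the abstract — bagging the 1-nearest-neighbour classifier with a resample size smaller than the sample size — can be sketched as follows. This is an illustrative implementation only, not the authors' code; the function name, the parameters `B` (number of resamples) and `frac` (the sampling fraction, i.e. the ratio of resample size to sample size), and the majority-vote tie-breaking are all choices made here for exposition.

```python
import numpy as np

def bagged_1nn(X, y, x0, B=500, frac=0.5, replace=False, rng=None):
    """Classify query point x0 by bagged 1-nearest-neighbour voting.

    Each of B resamples draws m = round(frac * n) training points,
    with or without replacement; x0 is classified by the 1-NN rule
    on each resample, and the final label is the majority vote.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n = len(X)
    m = max(1, int(round(frac * n)))  # resample size
    votes = []
    for _ in range(B):
        idx = rng.choice(n, size=m, replace=replace)
        dists = np.linalg.norm(X[idx] - x0, axis=1)
        votes.append(y[idx][np.argmin(dists)])  # label of nearest point
    labels, counts = np.unique(votes, return_counts=True)
    return labels[np.argmax(counts)]
```

Per the abstract, the interesting regime is `frac` small (resample size m → ∞ but m/n → 0), where the bagged vote smooths the hard 1-NN decision enough that the risk approaches the Bayes risk; `frac` could in practice be chosen by cross-validation.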

Keywords: Bayes risk; Bootstrap; Classification error; Cross-validation; Density; Discrimination; Error rate; Marked point process; Poisson process; Prediction; Regret; Statistical learning; With-replacement sampling; Without-replacement sampling

Document Type: Research Article


Affiliations: 1: Australian National University, Canberra, Australia 2: Australian National University, Canberra, Australia, and University of Cambridge, UK

Publication date: 2005-06-01
