
Intentionally biased bootstrap methods


A class of weighted bootstrap techniques, called biased bootstrap or b-bootstrap methods, is introduced. It is motivated by the need to adjust empirical methods, such as the ‘uniform’ bootstrap, in a surgical way: altering some of their features while leaving others unchanged. Depending on the nature of the adjustment, the b-bootstrap can be used to reduce bias, to reduce variance, or to render some characteristic equal to a predetermined quantity. Examples of the last application include a b-bootstrap approach to hypothesis testing in nonparametric contexts, where the b-bootstrap enables simulation ‘under the null hypothesis’ even when the hypothesis is false, and a b-bootstrap competitor to Tibshirani's variance stabilization method. An example of the bias reduction application is adjustment of Nadaraya–Watson kernel estimators to make them competitive with local linear smoothing. Other applications include density estimation under constraints, outlier trimming, sensitivity analysis, skewness or kurtosis reduction, and shrinkage.
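The core mechanism described above — reweighting the empirical distribution before resampling so that a chosen characteristic takes a predetermined value — can be illustrated with a small sketch. The example below tilts the resampling probabilities via empirical-likelihood weights so that the weighted mean equals a target `mu0`, then resamples ‘under that constraint’. This is an illustrative reconstruction, not the authors' code: the function name `el_weights`, the bisection solver, and the exponential test data are all this sketch's own choices.

```python
import numpy as np

def el_weights(x, mu0, tol=1e-10):
    """Empirical-likelihood tilting weights p_i = 1 / (n * (1 + lam * (x_i - mu0))),
    with lam chosen so that sum(p_i * x_i) == mu0.

    Assumes mu0 lies strictly inside the range of the data, so that
    d.min() < 0 < d.max() and a valid lam exists.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    d = x - mu0
    # g(lam) = sum(d_i / (1 + lam * d_i)) is strictly decreasing in lam,
    # +inf at lam -> -1/d.max() and -inf at lam -> -1/d.min(); bisect for the root.
    lo = -1.0 / d.max() + 1e-9
    hi = -1.0 / d.min() - 1e-9

    def g(lam):
        return np.sum(d / (1.0 + lam * d))

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    p = 1.0 / (n * (1.0 + lam * d))
    return p / p.sum()

# Demo: exponential data with sample mean near 1; tilt so the
# resampling distribution has mean 0.8, then draw b-bootstrap samples.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=50)
p = el_weights(x, mu0=0.8)                       # tilted weights
boot = rng.choice(x, size=(1000, len(x)), replace=True, p=p)
means = boot.mean(axis=1)                        # centred near 0.8, not near x.mean()
```

The weights are the closest distribution to the uniform one (in the empirical-likelihood sense) that satisfies the constraint, which is what makes the adjustment ‘surgical’: only the constrained feature is moved, and the rest of the empirical distribution is disturbed as little as possible.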

Keywords: Bias reduction; Empirical likelihood; Hypothesis testing; Local linear smoothing; Nonparametric curve estimation; Variance stabilization; Weighted bootstrap

Document Type: Original Article

Affiliations: Australian National University, Canberra, Australia

Publication date: 1999-01-01
