Intentionally biased bootstrap methods

Abstract:

A class of weighted bootstrap techniques, called biased bootstrap or b-bootstrap methods, is introduced. It is motivated by the need to adjust empirical methods, such as the ‘uniform’ bootstrap, in a surgical way, altering some of their features while leaving others unchanged. Depending on the nature of the adjustment, the b-bootstrap can be used to reduce bias, to reduce variance, or to render some characteristic equal to a predetermined quantity. Examples of the last application include a b-bootstrap approach to hypothesis testing in nonparametric contexts, where the b-bootstrap enables simulation ‘under the null hypothesis’, even when the hypothesis is false, and a b-bootstrap competitor to Tibshirani's variance stabilization method. An example of the bias reduction application is adjustment of Nadaraya–Watson kernel estimators to make them competitive with local linear smoothing. Other applications include density estimation under constraints, outlier trimming, sensitivity analysis, skewness or kurtosis reduction and shrinkage.
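To make the idea of simulation ‘under the null hypothesis’ concrete, here is a minimal, purely illustrative Python sketch, not the authors' implementation. It uses exponential tilting as a simple stand-in for the empirical-likelihood reweighting suggested by the keywords: observation weights are chosen so that the weighted sample mean equals a hypothesised value mu0, and bootstrap resamples are then drawn with those weights instead of uniformly. The function names tilted_weights and b_bootstrap_pvalue are hypothetical.

```python
import numpy as np

def tilted_weights(x, mu0, lam_bounds=(-50.0, 50.0), tol=1e-10):
    """Exponentially tilted weights p_i proportional to exp(lam * x_i),
    with lam chosen so the weighted mean equals mu0 (a simple stand-in
    for the empirical-likelihood tilting described in the paper)."""
    x = np.asarray(x, dtype=float)
    xs = (x - x.mean()) / x.std()            # standardise for numerical stability
    target = (mu0 - x.mean()) / x.std()

    def weighted_mean(lam):
        w = np.exp(lam * xs - np.max(lam * xs))   # subtract max to avoid overflow
        w /= w.sum()
        return w, w @ xs

    lo, hi = lam_bounds
    for _ in range(200):                      # bisection: the tilted mean is monotone in lam
        mid = 0.5 * (lo + hi)
        w, m = weighted_mean(mid)
        if abs(m - target) < tol:
            break
        if m < target:
            lo = mid
        else:
            hi = mid
    return w

def b_bootstrap_pvalue(x, mu0, n_boot=2000, rng=None):
    """Bootstrap test of H0: E[X] = mu0, resampling 'under the null'
    by drawing observations with the tilted weights instead of uniformly."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    w = tilted_weights(x, mu0)
    t_obs = abs(x.mean() - mu0)               # observed test statistic
    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(x, size=x.size, replace=True, p=w)
        t_boot[b] = abs(xb.mean() - mu0)
    return (1 + np.sum(t_boot >= t_obs)) / (n_boot + 1)

# Example: skewed data whose true mean is 1, testing H0: mean = 1
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100)
print(b_bootstrap_pvalue(x, mu0=1.0, rng=1))
```

The point of the reweighting is that the resampling distribution satisfies the null constraint exactly even when the observed data do not, so the bootstrap distribution of the test statistic is computed under the hypothesis being tested; other choices of constraint (variance, skewness, kurtosis) would be handled in the same way.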

Keywords: Bias reduction; Empirical likelihood; Hypothesis testing; Local linear smoothing; Nonparametric curve estimation; Variance stabilization; Weighted bootstrap

Document Type: Original Article

DOI: http://dx.doi.org/10.1111/1467-9868.00168

Affiliations: Australian National University, Canberra, Australia

Publication date: January 1, 1999
