
Using the Non-Parametric Classifier CART to Model Forest Tree Mortality


A binary classification tree (CART) was used to predict forest tree mortality for two conifer species. CART models were fitted by binary recursive splitting of the data set into increasingly homogeneous subsets. Models were compared in terms of improvement in prediction accuracy, representativeness of the average size of the predicted mortality trees, and interpretability of the results. For shade-intolerant ponderosa pine, crown ratio, predicted diameter increment, and variables indicating the relative position of a tree in a stand were used for splitting. For shade-tolerant white fir, predicted height increment and stand density were selected for splitting. The prediction accuracies for mortality trees of the best CART models were between 28% and 36% for ponderosa pine and between 11% and 17% for white fir. CART was also compared with logistic regression using both a stochastic and a deterministic assignment of mortality. Efficiencies similar to those achieved with CART were reached with deterministic logistic models using threshold probabilities. However, CART and the logistic model tended to utilize different predictor variables, especially for white fir. For white fir, CART uncovered additional factors important for predicting mortality that the logistic regression did not identify. For. Sci. 44(4):507-516.
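The binary recursive splitting that the abstract describes can be illustrated with a minimal sketch (not the authors' implementation): at each node, the split that most reduces Gini impurity is chosen, so the resulting subsets become increasingly homogeneous in the mortality label. The feature names and data below are purely illustrative.

```python
def gini(labels):
    """Gini impurity of a list of binary class labels (0 = died, 1 = survived)."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2.0 * p * (1.0 - p)

def best_split(rows, labels):
    """Return the (feature index, threshold, score) minimizing the
    size-weighted Gini impurity of the two resulting subsets."""
    n = len(rows)
    best = (None, None, gini(labels))
    for j in range(len(rows[0])):
        for t in sorted({r[j] for r in rows}):
            left = [y for r, y in zip(rows, labels) if r[j] <= t]
            right = [y for r, y in zip(rows, labels) if r[j] > t]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / n
            if score < best[2]:
                best = (j, t, score)
    return best

def build_tree(rows, labels, min_size=2):
    """Recursively split until subsets are pure or too small;
    leaves predict the majority class."""
    j, t, score = best_split(rows, labels)
    if j is None or len(rows) < min_size:
        return round(sum(labels) / len(labels))  # majority vote at the leaf
    left = [(r, y) for r, y in zip(rows, labels) if r[j] <= t]
    right = [(r, y) for r, y in zip(rows, labels) if r[j] > t]
    return {"feature": j, "threshold": t,
            "left": build_tree([r for r, _ in left], [y for _, y in left], min_size),
            "right": build_tree([r for r, _ in right], [y for _, y in right], min_size)}

def predict(tree, row):
    """Descend from the root to a leaf and return its class."""
    while isinstance(tree, dict):
        branch = "left" if row[tree["feature"]] <= tree["threshold"] else "right"
        tree = tree[branch]
    return tree

# Illustrative data: (crown ratio, diameter increment) -> survived (1) / died (0)
rows = [(0.2, 0.1), (0.3, 0.2), (0.6, 0.8), (0.7, 0.9)]
labels = [0, 0, 1, 1]
tree = build_tree(rows, labels)
```

A production analysis would add cost-complexity pruning and cross-validation, which the CART methodology uses to avoid overfitting; this sketch shows only the splitting rule itself.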

Keywords: Tree mortality; classification trees; logistic regression

Document Type: Journal Article

Publication date: November 1, 1998
