
Sparse Bayesian Estimation of Forest Stand Characteristics from Airborne Laser Scanning


In this article, a new method is applied to modeling forest stand characteristics from airborne laser scanning measurements. The method, called the sparse Bayesian method, is an alternative to the cross-validation procedure for variable selection used with ordinary least-squares (OLS) regression and seemingly unrelated regression (SUR): it selects the features used in the model automatically and, thanks to its Bayesian formulation, does not suffer from overfitting. The proposed method is applied to sample plot data obtained from inventory by compartments. The results show that the sparse Bayesian method performs as well as the OLS and SUR methods in terms of the accuracy of total stand characteristics. The methods are also comparable in their demand for sample plot data: none loses much accuracy even when only a few dozen sample plots are available. The Bayesian approach makes it possible to automate the model formulation and sample plot selection processes, so a different model can be generated automatically for each intrastand stratum, thus addressing intrastand variability. The proposed method automatically maintains a balance between the number of forest parameters and the rank of the model used to estimate them.
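The automatic feature selection described above is characteristic of sparse Bayesian regression with automatic relevance determination (ARD): each candidate predictor receives its own prior precision, and maximizing the marginal likelihood drives the precisions of irrelevant predictors to infinity, pruning them without cross-validation. The following is a minimal illustrative sketch of that idea (MacKay-style fixed-point updates on synthetic data), not the authors' implementation; all names and data here are hypothetical:

```python
import numpy as np

def ard_regression(X, y, n_iter=200, alpha_cap=1e6):
    """Sparse Bayesian (ARD) linear regression via fixed-point updates.

    Each weight w_j has a Gaussian prior N(0, 1/alpha_j). Maximizing the
    marginal likelihood pushes alpha_j toward infinity for irrelevant
    features, which prunes them automatically (no cross-validation).
    Illustrative sketch only, not the method from the article.
    """
    n, d = X.shape
    alpha = np.ones(d)            # per-feature prior precisions
    beta = 1.0 / np.var(y)        # noise precision (initial guess)
    for _ in range(n_iter):
        # Posterior over weights given the current hyperparameters
        S = np.linalg.inv(beta * X.T @ X + np.diag(alpha))
        mu = beta * S @ X.T @ y
        # Effective number of well-determined parameters per feature
        gamma = 1.0 - alpha * np.diag(S)
        # Fixed-point hyperparameter updates (capped to avoid overflow)
        alpha = np.minimum(gamma / (mu ** 2 + 1e-12), alpha_cap)
        resid = y - X @ mu
        beta = max(n - gamma.sum(), 1e-3) / (resid @ resid + 1e-12)
    return mu, alpha

# Synthetic example: 6 candidate predictors, only 2 actually relevant.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 6))
w_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=80)
mu, alpha = ard_regression(X, y)
```

After fitting, `mu` should be close to `w_true` on the relevant predictors while the precisions `alpha` of the irrelevant ones grow large, shrinking their weights toward zero; this is the mechanism by which the number of selected features adapts to the available sample plots.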

Keywords: forest inventory; laser scanning; lidar; model; stand volume

Document Type: Research Article

Publication date: October 1, 2008
