
An unsupervised change detection and recognition system for forestry

Abstract:

This article presents a new unsupervised method (AutoChange) for change detection and identification. It takes as input two images acquired on different dates and a parameter list given by the user. Change detection and identification are performed in separate procedures, and the output is a five-channel image estimating the change magnitude and characterizing the changed and unchanged areas. The method carries out the change analysis on homogeneous units selected from the images, and only in the final phase is the whole image classified. Changes are detected and identified using clustering in two phases. First, clustering is performed on the earlier and later images to form so-called 'primary clusters'. Second, clustering is performed within the primary clusters of the later image to produce 'secondary clusters'. The change magnitude and change type are then obtained by comparing the primary clusters of the earlier image to the secondary clusters of the later image. The method, which was tested in southern Finnish boreal forest using Landsat Thematic Mapper data, could reliably detect and identify clearcuts. In addition, the method provided information on forest damage, since the type of spectral change was consistent in damaged areas even though the magnitude of the change was minor.
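
The following is a minimal sketch of the two-phase clustering idea summarized above, assuming two co-registered multispectral images stored as (rows, cols, bands) NumPy arrays. The cluster counts, the distance-based change magnitude, and all names below are illustrative assumptions, not the exact AutoChange parameterization or output format described in the paper.

# Sketch of two-phase clustering for change detection (assumed parameterization).
import numpy as np
from sklearn.cluster import KMeans


def two_phase_change(img_t1, img_t2, n_primary=5, n_secondary=3, seed=0):
    """Return per-pixel change magnitude and primary/secondary cluster labels."""
    h, w, b = img_t1.shape
    x1 = img_t1.reshape(-1, b).astype(float)
    x2 = img_t2.reshape(-1, b).astype(float)

    # Phase 1: 'primary clusters' computed separately for each date.
    km1 = KMeans(n_clusters=n_primary, n_init=10, random_state=seed).fit(x1)
    km2 = KMeans(n_clusters=n_primary, n_init=10, random_state=seed).fit(x2)

    # Phase 2: 'secondary clusters' within each primary cluster of the later image.
    sec_labels = np.zeros(len(x2), dtype=int)
    sec_centers = {}
    for c in range(n_primary):
        idx = np.where(km2.labels_ == c)[0]
        k = min(n_secondary, len(idx))
        if k == 0:
            continue  # empty primary cluster: nothing to subdivide
        km_sec = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(x2[idx])
        sec_labels[idx] = km_sec.labels_
        sec_centers[c] = km_sec.cluster_centers_

    # Comparison step: change magnitude taken here as the spectral distance between
    # a pixel's primary-cluster centre at t1 and its secondary-cluster centre at t2.
    c1 = km1.cluster_centers_[km1.labels_]
    c2 = np.array([sec_centers[p][s] for p, s in zip(km2.labels_, sec_labels)])
    magnitude = np.linalg.norm(c2 - c1, axis=1)

    return (magnitude.reshape(h, w),
            km1.labels_.reshape(h, w),
            km2.labels_.reshape(h, w),
            sec_labels.reshape(h, w))

The returned layers can be stacked into a multi-channel result; in the method described above, the change type would additionally be characterized by the combination of cluster labels across the two dates, which is how a consistent spectral change of small magnitude (e.g. forest damage) could still be identified.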

Document Type: Research Article

DOI: http://dx.doi.org/10.1080/014311698215612

Publication date: April 1, 1998
