
Data classification using the Dempster–Shafer method

In this paper, the Dempster–Shafer (D–S) method is used as the theoretical basis for creating data classification systems. Testing is carried out using three popular multiple-attribute benchmark data-sets that have two, three and four classes. In each case, a subset of the available data is used for training to establish thresholds, limits or likelihoods of class membership for each attribute, and hence to create mass functions that assign a probability of class membership to each attribute of the test data. Each data item is then classified by combining these probabilities via Dempster's rule of combination. Results for the first two data-sets show extremely high classification accuracy that is competitive with other popular methods. The third data-set is non-numerical and difficult to classify, but good results can be achieved provided the system and mass functions are designed carefully and the right attributes are chosen for combination. In all cases, the D–S method provides performance comparable to other, more popular algorithms, but the overhead of generating accurate mass functions increases the complexity as new attributes are added. Overall, the results suggest that the D–S approach provides a suitable framework for the design of classification systems, and that automating the design and calculation of mass functions would increase the viability of the algorithm for complex classification problems.
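The abstract outlines a two-step pipeline: per-attribute mass functions estimated from training data, followed by fusion with Dempster's rule of combination. The sketch below illustrates that pipeline in Python for a two-class frame. The combine function implements the standard Dempster rule; attribute_mass, its thresholds and the confidence parameter are hypothetical stand-ins for the paper's actual mass function design, which is not reproduced here.

```python
from functools import reduce
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Each mass function maps frozenset hypotheses (subsets of the frame
    of discernment) to masses that sum to 1.
    """
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        w = wa * wb
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w
        else:
            conflict += w  # mass falling on the empty set (the conflict K)
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are fully contradictory")
    # Normalise the remaining mass by 1 - K
    return {h: w / (1.0 - conflict) for h, w in combined.items()}

def attribute_mass(x, low, high, confidence=0.9):
    """Hypothetical threshold-based mass function for a two-class frame {A, B}.

    Values at or below `low` support class A, values at or above `high`
    support class B, and intermediate values leave the mass on the whole
    frame (ignorance). In practice the thresholds would be learned from
    the training subset, as described in the abstract.
    """
    A, B, THETA = frozenset("A"), frozenset("B"), frozenset("AB")
    if x <= low:
        return {A: confidence, THETA: 1.0 - confidence}
    if x >= high:
        return {B: confidence, THETA: 1.0 - confidence}
    return {THETA: 1.0}

# Illustrative test item with two attributes and assumed per-attribute thresholds
sample = [2.1, 3.5]
thresholds = [(3.0, 6.0), (4.0, 7.0)]

masses = [attribute_mass(x, lo, hi) for x, (lo, hi) in zip(sample, thresholds)]
fused = reduce(combine, masses)

# Classify as the singleton hypothesis carrying the largest combined mass
singletons = [h for h in fused if len(h) == 1]
prediction = max(singletons, key=lambda h: fused[h]) if singletons else None
print(prediction, fused)  # frozenset({'A'}) with mass ~0.99 in this toy example
```

In this toy example both attributes support the same class, so little mass falls on the empty set and normalisation is trivial; with more attributes, designing mass functions that remain accurate is the main overhead the abstract refers to.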

Keywords: Dempster's rule of combination; Dempster–Shafer theory; data classification

Document Type: Research Article

Affiliations:
1: School of Computer Science, University of Nottingham, Jubilee Campus, Wollaton Road, Nottingham NG8 1BB, UK
2: School of Computer Science, University of Nottingham Malaysia Campus, Jalan Broga, 43500 Semenyih, Selangor Darul Ehsan, Malaysia

Publication date: 02 October 2014
