
Evaluating the accuracy and calibration of expert predictions under uncertainty: predicting the outcomes of ecological research


Aim  Expert knowledge routinely informs ecological research and decision‐making. Its reliability is often questioned, but is rarely subject to empirical testing and validation. We investigate the ability of experts to make quantitative predictions of variables for which the answers are known.

Location  Global.

Methods  Experts in four ecological subfields were asked to predict the outcomes of scientific studies, presented as unpublished (in press) journal articles, using only the information in each article's introduction and methods sections. For comparison, estimates were also elicited from students for one case study. For each variable, participants gave a lower bound, an upper bound, a best guess and a level of confidence that the observed value would lie within the interval they specified. Responses were assessed for (1) accuracy: the degree to which predictions corresponded with observed experimental results, (2) informativeness: the precision of the uncertainty bounds, and (3) calibration: the degree to which the uncertainty bounds contained the truth as often as specified.
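The three criteria can be sketched in code. The function below is an illustrative reading of the abstract, not the study's actual scoring rules: accuracy is taken as mean absolute error of best guesses, informativeness as mean interval width, and calibration as the hit rate of the intervals (to be compared against the stated confidence level, e.g. 80%). The names `assess`, and the toy data, are hypothetical.

```python
def assess(predictions, truths):
    """predictions: list of (lower, upper, best_guess, confidence) tuples.
    truths: observed values, one per prediction."""
    n = len(predictions)
    # Accuracy: mean absolute error of best guesses vs observed values.
    accuracy = sum(abs(best - t)
                   for (_, _, best, _), t in zip(predictions, truths)) / n
    # Informativeness: mean interval width (narrower = more informative).
    width = sum(hi - lo for (lo, hi, _, _) in predictions) / n
    # Calibration: fraction of intervals that contain the truth; a
    # well-calibrated 80% interval should contain it ~80% of the time.
    hit_rate = sum(lo <= t <= hi
                   for (lo, hi, _, _), t in zip(predictions, truths)) / n
    return accuracy, width, hit_rate

# Toy example: three 80% intervals, only the first contains the truth.
preds = [(2.0, 4.0, 3.0, 0.8), (1.0, 2.0, 1.5, 0.8), (5.0, 6.0, 5.5, 0.8)]
obs = [3.5, 2.5, 7.0]
acc, width, hit_rate = assess(preds, obs)  # hit_rate = 1/3, far below 0.8
```

A hit rate well below the stated confidence level, as in the toy example, is exactly the overconfidence pattern the study reports for experts.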

Results  Expert responses were found to be overconfident, specifying 80% confidence intervals that captured the truth only 49–65% of the time. In contrast, student 80% intervals captured the truth 76% of the time, displaying close to perfect calibration. Best estimates from experts were on average more accurate than those from students. The best students outperformed the worst experts. No consistent relationships were observed between performance and years of experience, publication record or self‐assessment of expertise.

Main conclusions  Experts possess valuable knowledge but may require training to communicate this knowledge accurately. Expert status is a poor guide to good performance. In the absence of training and information on past performance, simple averages of expert responses provide a robust counter to individual variation in performance.
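The recommended aggregation is arithmetically simple. A minimal sketch, with hypothetical best guesses for a single variable (the values are illustrative, not from the study):

```python
# Hypothetical best guesses from four experts for one variable.
expert_guesses = [3.0, 4.5, 2.5, 5.0]

# Simple (unweighted) average: robust to individual variation in
# performance when past-performance data are unavailable.
pooled = sum(expert_guesses) / len(expert_guesses)  # 3.75
```

The unweighted mean needs no information about which experts are reliable, which is why it serves as a robust default in the absence of training or performance records.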

Document Type: Research Article


Affiliations: Australian Centre of Excellence for Risk Analysis, School of Botany, University of Melbourne, Parkville, Vic. 3010, Australia

Publication date: 2012-08-01
