An IRT forecasting model: linking proper scoring rules to item response theory

Journal Title: Judgment and Decision Making - Year 2017, Vol 12, Issue 2

Abstract

This article proposes an item response theory (IRT) forecasting model that incorporates proper scoring rules and evaluates forecasters’ expertise in relation to the features of the specific questions they answer. We illustrate the model using geopolitical forecasts obtained from the Good Judgment Project (GJP) (see Mellers, Ungar, Baron, Ramos, Gurcay, Fincher, Scott, Moore, Atanasov, Swift, Murray, Stone & Tetlock, 2014). The expertise estimates from the IRT model, which account for variation in the difficulty and discrimination power of the events, capture the underlying construct being measured and are highly correlated with the forecasters’ Brier scores. Furthermore, our expertise estimates based on the first three years of the GJP data are better predictors of the forecasters’ fourth-year Brier scores and their activity level than either the overall Brier scores or Merkle’s (2016) predictions based on the same period. Lastly, we discuss the benefits of using event-characteristic information in forecasting.
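The two ingredients the abstract links are standard and can be sketched generically. The Brier score for a probabilistic forecast over mutually exclusive outcomes is the sum of squared differences between the forecast probabilities and the realized outcome indicator (lower is better; 0 is perfect). A two-parameter logistic (2PL) item response function models the probability of a correct response given a forecaster ability θ, an event difficulty b, and a discrimination a. The sketch below is a minimal generic illustration of these textbook definitions, not the authors’ specific model; all names and numbers are illustrative.

```python
import math

def brier_score(forecast_probs, outcome_index):
    """Brier score for one multi-category forecast.

    forecast_probs: probabilities assigned to each outcome (sum to 1).
    outcome_index: index of the outcome that actually occurred.
    """
    return sum(
        (p - (1.0 if i == outcome_index else 0.0)) ** 2
        for i, p in enumerate(forecast_probs)
    )

def two_pl(theta, a, b):
    """2PL item response function: probability of a correct
    response given ability theta, discrimination a, difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A binary forecast of 75% on the event that then occurs:
# (0.75 - 1)^2 + (0.25 - 0)^2 = 0.125
print(brier_score([0.75, 0.25], 0))   # 0.125

# When ability equals the event's difficulty, the 2PL gives 0.5
# regardless of discrimination.
print(two_pl(theta=1.0, a=2.0, b=1.0))  # 0.5
```

Note that under this convention the multi-category Brier score ranges from 0 (perfect) to 2 (confidently wrong), which is why lower Brier scores indicate greater expertise in the abstract’s comparisons.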

Authors and Affiliations

Yuanchao Emily Bo, David V. Budescu, Charles Lewis, Philip E. Tetlock and Barbara Mellers

How To Cite

Yuanchao Emily Bo, David V. Budescu, Charles Lewis, Philip E. Tetlock and Barbara Mellers (2017). An IRT forecasting model: linking proper scoring rules to item response theory. Judgment and Decision Making, 12(2), -. https://europub.co.uk/articles/-A-678279