The effects of feedback on judgmental interval predictions
dc.citation.epage | 39 | en_US |
dc.citation.issueNumber | 1 | en_US |
dc.citation.spage | 29 | en_US |
dc.citation.volumeNumber | 20 | en_US |
dc.contributor.author | Bolger, F. | en_US |
dc.contributor.author | Önkal-Atay, D. | en_US |
dc.date.accessioned | 2015-07-28T11:57:15Z | |
dc.date.available | 2015-07-28T11:57:15Z | |
dc.date.issued | 2004 | en_US |
dc.department | Department of Management | en_US |
dc.description.abstract | The majority of studies of probability judgment have found that judgments tend to be overconfident and that the degree of overconfidence is greater the more difficult the task. Further, these effects have been resistant to attempts to ‘debias’ via feedback. We propose that under favourable conditions, provision of appropriate feedback should lead to significant improvements in calibration, and the current study aims to demonstrate this effect. To this end, participants first specified ranges within which the true values of time series would fall with a given probability. After receiving feedback, forecasters constructed intervals for new series, changing their probability values if desired. The series varied systematically in terms of their characteristics including amount of noise, presentation scale, and existence of trend. Results show that forecasts were initially overconfident but improved significantly after feedback. Further, this improvement was not simply due to ‘hedging’, i.e. shifting to very high probability estimates and extremely wide intervals; rather, it seems that calibration improvement was chiefly obtained by forecasters learning to evaluate the extent of the noise in the series. © 2003 International Institute of Forecasters. Published by Elsevier B.V. All rights reserved. | en_US |
dc.identifier.doi | 10.1016/S0169-2070(03)00009-8 | en_US |
dc.identifier.issn | 0169-2070 | |
dc.identifier.uri | http://hdl.handle.net/11693/11251 | |
dc.language.iso | English | en_US |
dc.publisher | Elsevier | en_US |
dc.relation.isversionof | http://dx.doi.org/10.1016/S0169-2070(03)00009-8 | en_US |
dc.source.title | International Journal of Forecasting | en_US |
dc.subject | Judgmental forecasting | en_US |
dc.subject | Calibration | en_US |
dc.subject | Feedback | en_US |
dc.subject | Overconfidence | en_US |
dc.subject | Confidence intervals | en_US |
dc.title | The effects of feedback on judgmental interval predictions | en_US |
dc.type | Article | en_US |
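Note: the abstract evaluates calibration by comparing the probability a forecaster attaches to an interval with how often the true values actually fall inside it. The record does not include the paper's scoring procedure, so the following is only a minimal illustrative sketch of that hit-rate comparison; the function name, data values, and 90% target below are hypothetical and are not taken from the paper.

```python
# Minimal sketch (not the authors' code): comparing the observed hit rate of
# judgmental prediction intervals with the stated coverage probability.
# All data values here are hypothetical.

def hit_rate(lowers, uppers, actuals):
    """Proportion of actual values falling inside the stated [lower, upper] intervals."""
    hits = sum(lo <= y <= hi for lo, hi, y in zip(lowers, uppers, actuals))
    return hits / len(actuals)

# Hypothetical example: a forecaster states 90% intervals for five future values.
lowers  = [ 95, 100, 102,  98, 101]
uppers  = [115, 120, 118, 112, 119]
actuals = [110, 123, 105, 100, 140]

stated = 0.90                         # stated coverage probability
observed = hit_rate(lowers, uppers, actuals)
print(f"stated coverage:   {stated:.2f}")
print(f"observed hit rate: {observed:.2f}")
# An observed hit rate below the stated probability indicates overconfidence
# (intervals too narrow); feedback on this gap is the kind of calibration
# feedback the study investigates.
print("overconfident" if observed < stated else "well calibrated or underconfident")
```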
Files
Original bundle
1 - 1 of 1
- Name: 10.1016-S0169-2070(03)00009-8.pdf
- Size: 168.99 KB
- Format: Adobe Portable Document Format
- Description: Full printable version