
Problem: Few practitioners or academics use findings from nearly a century of experimental research that would allow them to substantially reduce forecast errors. In order to improve forecasting practice, this paper develops evidence-based guidance in a form that is easy for forecasters and decision-makers to access, understand, and use: checklists.
Methods: Meta-analyses of experimental research on forecasting were used to identify the principles and methods that lead to accurate out-of-sample forecasts. Cited authors were contacted to check that summaries of their research were correct. Checklists to help forecasters and their clients practice evidence-based forecasting were then developed from the research findings. Finally, appeals to identify errors of omission or commission in the analyses and summaries of research findings were sent to leading researchers.
Findings: Seventeen simple forecasting methods can, between them, provide accurate forecasts for diverse problems. Knowledge on forecasting is summarized in the form of five checklists with guidance on selecting the methods most suitable for the problem, and on their implementation.
Originality: Three of the five checklists—addressing (1) evidence-based methods, (2) regression analysis, and (3) assessing uncertainty—are new. A fourth—the Golden Rule checklist—has been improved. The fifth—the Simple Forecasting checklist (Occam's Razor)—remains the same.
Usefulness: Forecasters can use the checklists as tools to reduce forecast errors—often by more than one-half—compared to those of forecasts from commonly used methods. Scientists can use the checklists to devise valid tests of the predictive validity of their hypotheses. Finally, clients and other interested parties can use the checklists to determine whether forecasts were derived using evidence-based procedures and can, therefore, be trusted.

Please send the authors suggestions of evidence they may have missed (papers with relevant experimental evidence from tests of alternative methods), or mistakes they have made, by November 21 at the latest. The latest version of the working paper is available from ResearchGate, here.