Special Issue on Simple Versus Complex Forecasting
Journal of Business Research
Volume 68, Issue 8, Pages 1657-1818 (August 2015)
Green, K. C., & Armstrong, J. S., http://dx.doi.org/10.1016/j.jbusres.2015.03.026 - Simple versus complex forecasting: The evidence. 1678-1685
Armstrong, J. S., Green, K. C., & Graefe, A., http://dx.doi.org/10.1016/j.jbusres.2015.03.031 - Golden Rule of Forecasting: Be conservative. 1717-1731
Fildes, R., & Petropoulos, F., http://dx.doi.org/10.1016/j.jbusres.2015.01.059 - Is there a Golden Rule? 1742-1745
Goodwin, P., http://dx.doi.org/10.1016/j.jbusres.2015.01.060 - Is a more liberal approach to conservatism needed in forecasting? 1753-1754
Soyer, E., & Hogarth, R. M., http://dx.doi.org/10.1016/j.jbusres.2015.03.029 - The Golden Rule of Forecasting: Objections, refinements, and enhancements. 1702-1704
Gardner Jr., E. S., http://dx.doi.org/10.1016/j.jbusres.2015.03.033 - Conservative forecasting with the damped trend. 1739-1741
Green, K. C., Armstrong, J. S., & Graefe, A., http://dx.doi.org/10.1016/j.jbusres.2015.03.036 - Golden rule of forecasting rearticulated: Forecast unto others as you would have them forecast unto you. 1768-1771
Brighton, H., & Gigerenzer, G., http://dx.doi.org/10.1016/j.jbusres.2015.01.061 - The bias bias. 1772-1784
Fildes, R., & Petropoulos, F., http://dx.doi.org/10.1016/j.jbusres.2015.03.028 - Simple versus complex selection rules for forecasting many time series. 1692-1701
Wright, M. J., & Stern, P., http://dx.doi.org/10.1016/j.jbusres.2015.03.032 - Forecasting new product trial with analogous series. 1732-1738
Tessier, T. H., & Armstrong, J. S., http://dx.doi.org/10.1016/j.jbusres.2015.03.035 - Decomposition of time-series by level and change. 1755-1758
Graefe, A., http://dx.doi.org/10.1016/j.jbusres.2015.03.038 - Improving forecasts using equally weighted predictors. 1792-1799
Woike, J. K., Hoffrage, U., & Petty, J. S., http://dx.doi.org/10.1016/j.jbusres.2015.03.030 - Picking profitable investments: The success of equal weighting in simulated venture capitalist decision making. 1705-1716
Syntetos, A. A., Babai, M. Z., & Gardner Jr., E. S., http://dx.doi.org/10.1016/j.jbusres.2015.03.034 - Forecasting intermittent inventory demands: simple parametric methods vs. bootstrapping. 1746-1752
Nikolopoulos, K., Litsa, A., Petropoulos, F., Bougioukos, V., & Khammash, M., http://dx.doi.org/10.1016/j.jbusres.2015.03.037 - Relative performance of methods for forecasting special events. 1785-1791
Huddleston, S. H., Porter, J. H., & Brown, D. E., http://dx.doi.org/10.1016/j.jbusres.2015.03.040 - Improving forecasts for noisy geographic time series. 1810-1818
Goodwin, P., http://dx.doi.org/10.1016/j.jbusres.2015.03.027 - When simple alternatives to Bayes formula work well: Reducing the cognitive load when updating probability forecasts. 1686-1691
Lyon, A., Wintle, B. C., & Burgman, M., http://dx.doi.org/10.1016/j.jbusres.2014.08.012 - Collective wisdom: Methods of confidence interval aggregation. 1759-1767
Hogarth, R. M., & Soyer, E., http://dx.doi.org/10.1016/j.jbusres.2015.03.039 - Communicating forecasts: The simplicity of simulated experience. 1800-1809
Green, K. C. and Armstrong, J. S. (2007). "The Ombudsman: Value of Expertise for Forecasting Decisions in Conflicts," Interfaces, 37, 287-299.
Green and Armstrong's paper (available in full text as a working paper and from the publisher) provides evidence on the accuracy of forecasts from the method usually used to predict the decisions people will make in conflict situations: unaided expert judgment. The authors obtained 106 forecasts from experts and 169 forecasts from novices about eight real conflicts, including a military conflict in the Middle East, a hostile takeover attempt in the telecommunications industry, and a union-management dispute between nurses and the hospital that employed them. The experts' forecasts were little more accurate than the novices', and were not meaningfully more accurate than choosing at random.
The Green and Armstrong paper provides evidence on two principles:
6.3 Use structured rather than unstructured forecasting methods
Green and Armstrong's (G&A's) evidence on this principle is indirect: the research reported in the paper did not test structured methods. Other research by the same authors, however, shows that two structured methods (structured analogies and simulated interaction) provided forecasts for the same situations that were substantially more accurate than those from the unaided expert judgment reported in G&A.
6.7 Match the forecasting method(s) to the situation.
Prior research has shown that unaided judgment is not an appropriate forecasting method when situations are complex, when relationships are unclear, when feedback on predictions is poor, and when experts are biased. All four of these problems are likely to arise in conflict situations. Although this evidence has been available for many years, unaided expert judgment remains the method of choice for forecasting decisions in conflicts.
Kesten Green
August 28, 2007
Goodwin, P. (2002). "Integrating management judgment with statistical methods to improve short-term forecasts," Omega, 30, 127-135.
This paper reviews the research literature to assess the effectiveness of methods that are designed to allow management judgment and statistical methods to be integrated when short-term point forecasts are required. A systematic search led to 45 empirical studies.
Two integration processes are identified: voluntary integration (where the forecaster is able to choose how much weight the statistical forecast will have in establishing the 'final' forecast) and mechanical integration (where the 'final' forecast is obtained by applying a statistical process to the judgmental forecasts). The main findings are:
Voluntary Integration
· When only time series information is available (i.e., there is no domain knowledge), judgmental adjustments of statistical forecasts will tend to reduce accuracy because people attempt to forecast the noise in the series (supporting principle 11.4 Limit subjective adjustments of quantitative forecasts).
· Judgmental adjustments should therefore be made only on the basis of 'important' domain knowledge that is not reflected in the statistical forecast (supporting principle 7.5 Adjust for events expected in the future).
· Requiring forecasters to record reasons for any adjustments is likely to reduce the likelihood of damaging adjustments being made (supporting principle 8.3 Ask experts to justify their forecasts in writing).
· More research is needed to establish the effectiveness of structured decomposition of judgmental adjustments (see principle 11.2 Use structured judgment as inputs to quantitative models).
Mechanical integration
· Combination of judgmental and statistical forecasts is most effective when the forecasts are unbiased and their errors are negatively correlated (supporting principle 12.1 Combine forecasts from approaches that differ).
· In many business environments there is insufficient data to justify using anything other than equal weights in the combination (supporting principle 12.4 Start with equal weights; see the sketch after this list).
· There is some empirical evidence that statistical correction for judgmental bias is likely to be more effective than combination in situations where 'strong' time-series patterns are absent but forecasters possess domain knowledge that is difficult to model statistically.
· When only time series information is available there is, as yet, no evidence to suggest that the use of judgmental bootstrapping will improve accuracy (see principle 11.5 Use judgmental bootstrapping instead of expert forecasts).
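Equal-weight combination is simple to implement. The sketch below (in Python; the function name and the numbers are hypothetical, purely for illustration and not taken from Goodwin's paper) shows a 'final' forecast formed as the unweighted average of a statistical point forecast and a judgmental point forecast, in the spirit of principle 12.4.

```python
# Minimal sketch of equal-weight forecast combination (principle 12.4).
# All names and values below are hypothetical, for illustration only.

def combine_equal_weights(forecasts):
    """Return the unweighted (equal-weight) average of a list of point forecasts."""
    return sum(forecasts) / len(forecasts)

statistical_forecast = 1040.0  # e.g., output of an exponential smoothing model
judgmental_forecast = 1100.0   # e.g., a manager's estimate drawing on domain knowledge

final_forecast = combine_equal_weights([statistical_forecast, judgmental_forecast])
print(final_forecast)  # -> 1070.0
```

With only two forecasts this is simply their mean; the same function applies unchanged when forecasts from several differing approaches are available (principle 12.1).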
Voluntary versus mechanical integration
· It is essential that forecasting methods are acceptable to decision makers (Principle 1.5 Obtain decision makers’ agreement on methods). Acceptance is more likely to be achieved through voluntary integration methods.
Abstract
The complementary strengths that management judgment and statistical methods can bring to the forecasting process have been widely discussed. This paper reviews research on the effectiveness of methods that are designed to allow judgment and statistical methods to be integrated when short-term point forecasts are required. The application of both voluntary and mechanical integration methods is considered, and conditions are identified where the use of particular methods is appropriate, according to current research. While acknowledging the power of mechanical integration methods that exclude the judgmental forecaster from the integration process, the paper suggests that future research effort should focus on the design of forecasting support systems that facilitate voluntary integration. (Reprinted with permission, Copyright © 2002 Elsevier Science Ltd.)
This page is for researchers, practitioners, and students who want to know about papers related to forecasting principles.

Papers with information on forecasting principles
Full-text copies of most of the other papers can be obtained through JSTOR or Business Premier (EBSCO) at subscribing institutions.

Special Issue on "Simple versus Complex Forecasting"
A Special Issue of the Journal of Business Research (Volume 68, Issue 8, pp. 1657-1818), guest edited by Kesten Green and Scott Armstrong, is devoted to evidence on the effect of simplicity versus complexity on forecast accuracy. The Table of Contents for the Special Issue appears above. The issue includes a summary of the evidence in the first paper and a paper describing a unifying theory of forecasting, the Golden Rule of Forecasting.

Add your papers to the RePEc Archive or the Munich Personal RePEc Archive (MPRA)
You may add your own materials to RePEc through a departmental or institutional archive. All institutions are welcome to join and contribute their materials by establishing and maintaining their own RePEc archive. RePEc does not support personal archives, only institutional archives.

Working papers by authors seeking reviews and advice
If you would like your paper posted here, please send a copy or a URL. YOUR PAPER COULD BE HERE!
Papers with information on forecasting principles
The papers below are provided in full text (PDF format). Articles from Elsevier Science, John Wiley, and other publishers have been reproduced with permission. Single copies of these articles may be downloaded and printed for the reader's personal research and study. Researchers are invited to submit relevant papers.
Reviews of important papers on forecasting before 1985
This section contains J. Scott Armstrong's summaries and critiques (written from 1982 to 1985 for the Journal of Forecasting) of important articles on forecasting that originally appeared in various major journals. Authors of the original papers were invited to comment, and they did so in almost all cases. The list is not comprehensive; it is simply a selection of papers that were thought to be important. These summaries are reproduced here with the kind permission of the journals' editors and John Wiley. Click on the author[s] and date to read the review.
Reviews of important papers on forecasting from 1985 onward
This section contains J. Scott Armstrong's summaries and critiques of important articles on forecasting; the summaries originally appeared in the International Journal of Forecasting and other major journals. Authors of the original papers were invited to comment, and they did so in almost all cases. The list is not comprehensive; it is simply a selection of papers that were thought to be important. Papers from the International Journal of Forecasting were not included. These summaries are reproduced here with the kind permission of the journals' editors and Elsevier Science. Click on the author[s] and date to read the review.