Diffusion of Forecasting Principles by Software

The following analysis was prepared by J. Scott Armstrong. The SAS grants can be used to address these issues.

Forecasting software can:

  1. provide the most cost-effective way to apply forecasting principles,
  2. incorporate the latest research findings, and
  3. help gain acceptance of principles (principles built into software are applied by default, so users must take deliberate action to avoid them).

But does it?

The table below identifies which principles are currently used and how other principles could be incorporated.

FORECASTING PRINCIPLES

Each principle carries one status code, shown in brackets:

  [1]   already used in software
  [2]   already used, and used well
  [A]   should be added
  [A*]  should be added, and is easy to add
  [DS]  needs a front-end decision-support (DS) module
  [Judge], [Bootstrp], [Combine], [Evaluate], [Pred Int], [Scenario]
        needs a new program; the name indicates its focus

PROBLEM

1. Setting Objectives

   1.1. Describe decisions that might be affected.  [DS]
   1.2. Agree on actions for different possible forecasts.  [DS]
   1.3. Make forecast independent of organizational politics.  [DS]
   1.4. Consider whether events or series are forecastable.  [1]
   1.5. Gain decision makers' agreement on methods.  [DS]

2. Structuring the Problem

   2.1. Identify possible outcomes prior to making forecasts.  [DS]
   2.2. Tailor the level of data aggregation to the decisions.  [DS]
   2.3. Decompose the problem into subproblems.  [A]
   2.4. Decompose time series by causal forces.  [A*]
   2.5. Structure problems to deal with important interactions.  [A]
   2.6. Structure problems that involve causal chains.  [A]
   2.7. Decompose time series by level and trend.  [A*]
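Principle 2.7 (decompose time series by level and trend) can be illustrated with a short sketch. This is not from the source: Holt's linear method is one standard way to separate level from trend, and the series, smoothing constants, and function names below are made-up examples.

```python
# Sketch of level/trend decomposition via Holt's linear method.
# The data and the smoothing constants (alpha, beta) are invented.

def holt(series, alpha=0.5, beta=0.3):
    """Return final (level, trend) estimates for a series."""
    level, trend = series[0], series[1] - series[0]
    for y in series[2:]:
        last_level = level
        # Update the level toward the new observation.
        level = alpha * y + (1 - alpha) * (level + trend)
        # Update the trend toward the latest level change.
        trend = beta * (level - last_level) + (1 - beta) * trend
    return level, trend

def forecast(level, trend, h):
    """h-step-ahead forecast: current level plus h trend steps."""
    return level + h * trend

level, trend = holt([10, 12, 13, 15, 16, 18])
print(round(forecast(level, trend, 3), 2))
```

Once level and trend are held separately, each component can be extrapolated (or damped) on its own, which is what makes the decomposition useful for forecasting.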

INFORMATION

3. Identifying Information Sources

   3.1. Use theory to guide information search on explanatory variables.  [DS]
   3.2. Ensure that data match the forecasting situation.  [DS]
   3.3. Avoid biased data sources.  [DS]
   3.4. Use diverse sources of data.  [DS]
   3.5. Obtain information from similar (analogous) series or cases.  [DS]

4. Collecting Data

   4.1. Use unbiased and systematic procedures to collect data.  [DS]
   4.2. Ensure that information is reliable.  [DS]
   4.3. Ensure that information is valid.  [DS]
   4.4. Obtain all important data.  [DS]
   4.5. Avoid collection of irrelevant data.  [DS]
   4.6. Obtain the most recent data.  [DS]

5. Preparing Data

   5.1. Clean the data.  [2]
   5.2. Use transformations as required by expectations.  [2]
   5.3. Adjust intermittent series.  [2]
   5.4. Adjust for unsystematic past events (outliers).  [2]
   5.5. Adjust for systematic events (e.g., seasonality).  [2]
   5.6. Use multiplicative adjustments for seasonality for stable series with trends.  [A*]
   5.7. Damp seasonal factors for uncertainty.  [A*]
   5.8. Use graphical displays for data.  [2]
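Principles 5.5 through 5.7 (multiplicative seasonal adjustment, damped for uncertainty) can be sketched briefly. This is not from the source: the quarterly figures and the damping weight are made-up examples, and `seasonal_factors` and `damp` are hypothetical helper names.

```python
# Sketch of multiplicative seasonal factors with damping toward 1.0.
# All numbers are invented for illustration.

def seasonal_factors(series, period):
    """Average ratio of each season's values to the overall mean."""
    overall = sum(series) / len(series)
    factors = []
    for s in range(period):
        vals = series[s::period]          # every period-th observation
        factors.append((sum(vals) / len(vals)) / overall)
    return factors

def damp(factors, weight=0.5):
    """Shrink each factor toward 1.0; weight=0 removes seasonality."""
    return [1.0 + weight * (f - 1.0) for f in factors]

quarterly = [80, 120, 100, 100, 84, 126, 105, 105]
raw = seasonal_factors(quarterly, 4)
print([round(f, 3) for f in damp(raw, 0.5)])
```

Damping pulls extreme factors toward 1.0, which protects the forecast when the seasonal pattern is estimated from little data or is unstable.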

METHODS

6. Selecting Methods

   6.1. Develop a list of all important criteria.  [DS]
   6.2. Ask unbiased experts to rate potential methods.  [DS]
   6.3. Use structured forecasting methods rather than unstructured.  [DS]
   6.4. Use quantitative methods rather than qualitative methods.  [DS]
   6.5. Use causal rather than naïve methods.  [DS]
   6.6. Select simple methods unless evidence favors complex methods.  [2]
   6.7. Match forecasting method(s) to the situation.  [2]
   6.8. Compare track records of various methods.  [2]
   6.9. Assess acceptability and understandability of methods to users.  [DS]
   6.10. Examine the value of alternative forecasting methods.  [DS]

7. Implementing Methods: General

   7.1. Keep methods simple.  [A*]
   7.2. Provide a realistic representation of the forecasting situation.  [DS]
   7.3. Be conservative in situations of uncertainty or instability.  [A*]
   7.4. Do not forecast cycles.  [DS]
   7.5. Adjust for expected events in the future.  [2]
   7.6. Pool similar types of data.  [A]
   7.7. Ensure consistency with forecasts of related series.  [A]

8. Implementing Methods: Judgment

   8.1. Pretest questions used to solicit judgmental forecasts.  [Judge]
   8.2. Use questions that have been framed in alternative ways.  [Judge]
   8.3. Ask experts to justify their forecasts.  [Judge]
   8.4. Use numerical scales with several categories.  [Judge]
   8.5. Obtain forecasts from heterogeneous experts.  [Judge]
   8.6. Obtain intentions or expectations from representative samples.  [Judge]
   8.7. Obtain forecasts from a sufficient number of respondents.  [Judge]
   8.8. Obtain multiple estimates of an event from each expert.  [Judge]

9. Implementing Methods: Quantitative

   9.1. Tailor the forecasting model to the horizon.  [A*]
   9.2. Match the model to the underlying process.  [A*]
   9.3. Do not use fit to develop a model.  [2]
   9.4. Weight the most relevant data more heavily.  [2]
   9.5. Update models frequently.  [A*]
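Principle 9.4 (weight the most relevant data more heavily) has a classic quantitative expression. This is not from the source: simple exponential smoothing is one standard technique that implicitly gives recent observations geometrically larger weights; the data and the smoothing constant are made-up examples.

```python
# Sketch of recency weighting via simple exponential smoothing.
# alpha and the series are invented for illustration.

def exp_smooth(series, alpha=0.4):
    """Exponentially weighted level: recent observations count more.

    Each update gives weight alpha to the newest point and (1 - alpha)
    to everything seen so far, so older points decay geometrically.
    """
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

print(round(exp_smooth([100, 102, 98, 110, 115]), 2))
```

With alpha = 0.4 the newest observation gets weight 0.4, the one before it 0.24, then 0.144, and so on, which is the sense in which "most relevant" (here, most recent) data are weighted more heavily.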

10. Implementing Methods: Quantitative Models with Explanatory Variables

   10.1. Use theory and domain expertise to select causal variables.  [A]
   10.2. Use all important variables.  [A]
   10.3. Use theory and domain expertise to specify directions of relationships.  [A]
   10.4. Use theory and domain expertise to estimate/limit relationships.  [A]
   10.5. Use different types of data to estimate a relationship.  [A]
   10.6. Forecast for at least two alternative environments.  [A]
   10.7. Forecast for alternative interventions.  [A]
   10.8. Apply the same principles to the forecasts of explanatory variables.  [2]
   10.9. Shrink forecasts of change if there is uncertainty in the explanatory variables.  [A]

11. Integrating Judgmental and Quantitative Methods

   11.1. Use structured procedures to do the integration.  [DS]
   11.2. Use structured judgment as inputs to models.  [A]
   11.3. Use prespecified domain knowledge as input in selecting, weighting, and modifying quantitative methods.  [2]
   11.4. Limit subjective adjustments of quantitative forecasts.  [A*]
   11.5. Use judgmental bootstrapping instead of expert forecasts.  [Bootstrp]

12. Combining Forecasts

   12.1. Combine forecasts from approaches that differ.  [Combine]
   12.2. Use many approaches (or forecasters), preferably at least five.  [Combine]
   12.3. Use formal procedures to combine forecasts.  [2]
   12.4. Start with equal weights.  [Combine]
   12.5. Use trimmed means.  [Combine]
   12.6. Use evidence on each method's accuracy to vary the weights on the component forecasts.  [Combine]
   12.7. Use domain knowledge to vary the weights on the component forecasts.  [Combine]
   12.8. Combine when there is uncertainty about which method is best.  [Combine]
   12.9. Combine when uncertainty exists about the situation.  [Combine]
   12.10. Combine when it is important to avoid large errors.  [Combine]
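Principles 12.3 through 12.5 (formal combining, equal weights, trimmed means) fit in a few lines. This is not from the source: the component forecasts are made-up numbers, and `trimmed_mean` is a hypothetical helper name.

```python
# Sketch of formal combination with equal weights and a trimmed mean.
# The five component forecasts below are invented; one (140) is an
# outlier that the trimming protects against.

def trimmed_mean(forecasts, trim=1):
    """Drop the `trim` highest and lowest values, average the rest
    with equal weights."""
    kept = sorted(forecasts)[trim:len(forecasts) - trim]
    return sum(kept) / len(kept)

components = [95, 100, 102, 104, 140]
print(trimmed_mean(components))
```

The equal-weighted trimmed mean is a formal rule (12.3) that starts from equal weights (12.4) while limiting the damage a single wild component forecast can do (12.5, 12.10).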

EVALUATION

13. Evaluating Methods

   13.1. Compare reasonable methods.  [Evaluate]
   13.2. Use objective tests of assumptions.  [2]
   13.3. Design test situations to match the forecasting problem.  [2]
   13.4. Describe conditions associated with the forecasting problem.  [A]
   13.5. Tailor the analysis to the decision.  [Evaluate]
   13.6. Describe potential forecaster biases.  [Evaluate]
   13.7. Assess reliability and validity of the data.  [Evaluate]
   13.8. Provide easy access to the data.  [Evaluate]
   13.9. Provide full disclosure of methods.  [Evaluate]
   13.10. Test assumptions for validity.  [Evaluate]
   13.11. Test the client's understanding of the methods.  [Evaluate]
   13.12. Use direct replications of the evaluations to identify mistakes.  [Evaluate]
   13.13. Use replications of the forecast evaluations to assess reliability.  [Evaluate]
   13.14. Use extensions of evaluations for generalizability.  [Evaluate]
   13.15. Conduct extensions of evaluations in realistic situations.  [Evaluate]
   13.16. Compare forecasts generated by different methods.  [Evaluate]
   13.17. Examine all important criteria.  [Evaluate]
   13.18. Specify criteria prior to analyzing the data.  [Evaluate]
   13.19. Assess face validity.  [Evaluate]
   13.20. Use error measures that adjust for scale.  [2]
   13.21. Ensure that error measures are valid.  [Evaluate]
   13.22. Use error measures insensitive to the degree of difficulty in forecasting.  [Evaluate]
   13.23. Avoid biased error measures.  [Evaluate]
   13.24. Avoid error measures with high sensitivity to outliers.  [Evaluate]
   13.25. Use multiple measures of accuracy.  [2]
   13.26. Use out-of-sample (ex ante) error measures.  [2]
   13.27. Use ex post accuracy tests to evaluate effects.  [A]
   13.28. Do not use adjusted R-square to compare models.  [A*]
   13.29. Use statistical significance only to compare the accuracy of reasonable methods.  [A*]
   13.30. Do not use root-mean-square errors to make comparisons.  [A*]
   13.31. Base comparisons on large samples.  [DS]
   13.32. Conduct explicit cost-benefit analyses.  [DS]
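Principle 13.20 (error measures that adjust for scale) is easy to show concretely. This is not from the source: MAPE is one common scale-adjusted measure (the actuals and forecasts below are made-up numbers), and it contrasts with root-mean-square error, which 13.30 warns against for comparisons.

```python
# Sketch of a scale-adjusted error measure: mean absolute percentage
# error (MAPE). Actual and forecast values are invented examples.

def mape(actual, forecast):
    """Mean absolute percentage error; actuals must be nonzero.

    Dividing each error by the actual value puts series of very
    different magnitudes on a comparable percentage scale.
    """
    errs = [abs(a - f) / abs(a) for a, f in zip(actual, forecast)]
    return 100 * sum(errs) / len(errs)

actual = [100, 200, 400]
forecast = [110, 180, 400]
print(round(mape(actual, forecast), 2))
```

A 10-unit miss on a value of 100 and a 20-unit miss on a value of 200 both count as 10 percent here, which is the sense in which the measure adjusts for scale; an unscaled measure would weight the larger series more.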

14. Assessing Uncertainty

   14.1. Estimate prediction intervals (PIs).  [2]
   14.2. Use objective procedures.  [2]
   14.3. Develop PIs using a realistic representation of the situation.  [Pred Int]
   14.4. Use transformations when needed to estimate symmetric PIs.  [A*]
   14.5. Ensure consistency over the forecast horizon.  [A*]
   14.6. List reasons why the forecast might be wrong.  [1]
   14.7. Consider the likelihood of alternative outcomes in assessing PIs.  [Pred Int]
   14.8. Obtain good feedback on accuracy and reasons for errors.  [Pred Int]
   14.9. Combine PIs from alternative methods.  [1]
   14.10. Use safety factors for PIs.  [A*]
   14.11. Conduct experiments.  [Pred Int]
   14.12. Do not assess uncertainty in a traditional group meeting.  [DS]
   14.13. For prediction intervals, incorporate the uncertainty associated with the prediction of the explanatory variables.  [A]
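Principles 14.1 and 14.10 (estimate prediction intervals; widen them with safety factors) can be sketched together. This is not from the source: the past errors, the normal z-value, and the 1.2 safety factor are made-up examples, and `prediction_interval` is a hypothetical helper name. The safety factor reflects the general finding that intervals estimated from past fit tend to be too narrow.

```python
# Sketch of an empirical prediction interval widened by a safety
# factor. All numbers are invented for illustration.

def prediction_interval(point, past_errors, z=1.96, safety=1.2):
    """Point forecast +/- z * (std of past errors) * safety factor."""
    n = len(past_errors)
    mean = sum(past_errors) / n
    # Sample variance of past forecast errors.
    var = sum((e - mean) ** 2 for e in past_errors) / (n - 1)
    half = z * var ** 0.5 * safety
    return point - half, point + half

lo, hi = prediction_interval(100, [-3, 1, 2, -2, 4, -1, 3, -4])
print(round(lo, 1), round(hi, 1))
```

Setting `safety=1.0` recovers the usual normal-theory interval; values above 1.0 trade a wider band for better real-world coverage.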

USING FORECASTS

15. Presenting Forecasts

   15.1. Provide a clear summary of the forecasts and data.  [A*]
   15.2. Provide a clear explanation of methods.  [2]
   15.3. Describe assumptions.  [2]
   15.4. Present point forecasts and prediction intervals.  [2]
   15.5. Present forecasts as scenarios.  [Scenario]

16. Learning

   16.1. Consider the use of adaptive models.  [A]
   16.2. Seek feedback about forecasts.  [Judge]
   16.3. Use a formal review process for forecasting methods.  [DS]
   16.4. Use a formal review process for the use of forecasts.  [DS]

For a detailed description of each principle, see Principles of Forecasting, J. Scott Armstrong (ed.), Kluwer, 2001.