The Unscaled Mean Bounded Relative Absolute Error (UMBRAE) is a new forecast-error measure proposed, and well supported, in Chen, Twycross, and Garibaldi (2017), "A new accuracy measure based on bounded relative error for time series forecasting". The new measure appears to be a promising alternative and is certainly worthy of further comparative research. Some analysts may want to continue using the RAE until further testing is done; we suggest using both measures in the meantime.
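For readers who want to try the measure alongside the RAE, here is a minimal sketch in Python, assuming the definitions given in the Chen, Twycross, and Garibaldi paper: each error is bounded relative to a benchmark (e.g. naive) forecast error, the bounded errors are averaged, and the mean is then "unscaled" back to a relative-error scale. The function name and parameter names are our own, not from the paper.

```python
import numpy as np

def umbrae(actual, forecast, benchmark):
    """Unscaled Mean Bounded Relative Absolute Error (sketch).

    Bounded relative absolute error for each period:
        BRAE_t = |e_t| / (|e_t| + |e*_t|)
    where e_t is the forecast error and e*_t the benchmark forecast error.
    UMBRAE unscales the mean BRAE:
        UMBRAE = MBRAE / (1 - MBRAE)
    Values below 1 indicate the forecast beat the benchmark on average.
    """
    actual = np.asarray(actual, dtype=float)
    e = np.abs(actual - np.asarray(forecast, dtype=float))
    e_star = np.abs(actual - np.asarray(benchmark, dtype=float))
    brae = e / (e + e_star)  # undefined when both errors are zero
    mbrae = brae.mean()
    return mbrae / (1.0 - mbrae)
```

A perfect forecast gives UMBRAE of 0, and a forecast whose errors match the benchmark's gives exactly 1.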

We have added Ev Gardner's spreadsheet for damped trend exponential smoothing to the Software Page.
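For those who prefer code to a spreadsheet, the damped trend method itself can be sketched in a few lines. This is a generic implementation of Gardner–McKenzie damped-trend exponential smoothing, not a translation of the spreadsheet; the default smoothing parameters are arbitrary illustrations.

```python
def damped_trend_forecast(y, alpha=0.5, beta=0.3, phi=0.9, h=4):
    """Damped-trend exponential smoothing (Gardner-McKenzie), sketch.

    level:  l_t = alpha*y_t + (1-alpha)*(l_{t-1} + phi*b_{t-1})
    trend:  b_t = beta*(l_t - l_{t-1}) + (1-beta)*phi*b_{t-1}
    h-step forecast: l_n + (phi + phi^2 + ... + phi^h)*b_n
    The damping parameter phi (0 < phi <= 1) flattens the trend
    as the horizon grows; phi = 1 recovers Holt's linear trend.
    """
    l, b = y[0], y[1] - y[0]  # simple initialisation
    for t in range(1, len(y)):
        l_prev = l
        l = alpha * y[t] + (1 - alpha) * (l_prev + phi * b)
        b = beta * (l - l_prev) + (1 - beta) * phi * b
    return [l + sum(phi ** i for i in range(1, k + 1)) * b
            for k in range(1, h + 1)]
```

With phi below 1, the forecast trend decays geometrically toward a flat line, which is the feature that makes the method robust for longer horizons.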

Don Miller and Dan Williams have recently re-posted spreadsheets and X-12 specifications that can be used to implement the seasonal damping method they proposed in "Shrinkage estimators of time series seasonal factors and their effect on forecasting accuracy," International Journal of Forecasting 19(4): 669-684, and "Damping seasonal factors: Shrinkage estimators for the X-12-ARIMA program," International Journal of Forecasting 20(4): 529-549.

Damping is needed to counteract excessive seasonal variation produced by classical decomposition and X-12, which is an artifact of the random noise in the data. These articles show that a small but significant reduction in forecast error is obtained when using seasonal damping. The benefit is more pronounced when seasonal factors are determined through classical decomposition, but remains significant when X-12 is used.
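The core idea can be illustrated with a short sketch: shrink each multiplicative seasonal factor toward 1.0, so that noisy estimates contribute less seasonal swing to the forecast. Note that Miller and Williams derive the amount of shrinkage from the data; the fixed damping constant `d` below is a hypothetical illustration, not their estimator.

```python
def damp_seasonal_factors(factors, d=0.7):
    """Shrink multiplicative seasonal factors toward 1.0 (sketch).

    damped_s = 1 + d*(s - 1), with 0 <= d <= 1:
      d = 1 leaves the factors unchanged,
      d = 0 removes seasonality entirely.
    """
    return [1.0 + d * (s - 1.0) for s in factors]
```

For example, with d = 0.5 a factor of 1.2 is damped to 1.1 and a factor of 0.8 is raised to 0.9, halving the estimated seasonal swing.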

The spreadsheet software for seasonal factor damping is available from the Software Page.

Scott Armstrong and Kesten Green are seeking suggestions of relevant experimental evidence that they have overlooked in their new working paper, "Demand Forecasting II: Evidence-based methods and checklists". They describe the problem that the paper addresses as follows:

Decision makers in the public and private sectors would benefit from more accurate forecasts of demand for goods and services. Most forecasting practitioners are unaware of discoveries from experimental research over the past half-century that can be used to reduce errors dramatically, often by more than half. The objective of this paper is to improve demand forecasting practice by providing forecasting knowledge to forecasters and decision makers in a form that is easy for them to use.

The paper is available from ResearchGate, here.

Scott Armstrong presented a talk at Heartland's Twelfth International Conference on Climate Change (ICCC12) on March 23 in Chicago that summarised his research with Kesten Green on forecasting climate and the effects of climate policies.

The talk asked the question, "Are long-term forecasts of dangerous global warming scientific?", and concluded...

"No, because...

  1. the only 2 papers with scientific forecasts found no long-term trends
  2. IPCC methods violate 81% of the 89 relevant scientific principles
  3. IPCC long-term forecast errors for 90-100 years ahead were 12 times larger than those of the no-trend forecasts
  4. tests on three other data sets, one going back to 112 AD, found similarly poor accuracy
  5. the "long-term global cooling" hypothesis was twice as accurate as the dangerous global warming hypothesis
    Also "no" because the warming alarm...
  6. ignores all 20 of the relevant Golden Rule of Forecasting guidelines; the AGS scientific forecasts violated only one
  7. violates Occam's razor
  8. fails to comply with any of the 8 criteria for scientific research
  9. fails to provide scientific forecasts of harm to people
  10. fails to provide scientific forecasts that "solutions" will work
  11. fails to meet any of the 10 necessary conditions for successful regulation
  12. is similar to 23 earlier environmental alarms supported by the government: all lacked scientific forecasts and all were wrong."

A video of his presentation and a more complete set of slides with links to evidence are available here.