Polly the parrot is back to forecast the outcome of the U.S. presidential election. To calculate the PollyVote forecast, Polly uses the evidence-based principle of combining forecasts: averaging forecasts within and across different methods, each of which relies on different information. Election forecasting thus provides ideal conditions for demonstrating the benefits of combining. In fact, PollyVote has provided highly accurate U.S. election forecasts since her first appearance in 2004.
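The combining principle is straightforward to sketch in code. The snippet below is only an illustration of within- and across-method averaging with hypothetical component forecasts; PollyVote's actual components, numbers, and weighting are not reproduced here.

```python
# Sketch of the combining principle: average forecasts within each
# method, then average across methods with equal weight.
# All numbers are hypothetical, not actual PollyVote components.

def combine_within(forecasts):
    """Average several forecasts produced by the same method."""
    return sum(forecasts) / len(forecasts)

def combine_across(by_method):
    """Average the within-method combinations with equal weight."""
    within = [combine_within(f) for f in by_method.values()]
    return sum(within) / len(within)

# Hypothetical two-party vote-share forecasts (%) for one party.
by_method = {
    "polls":              [51.0, 52.0, 51.8],
    "prediction_markets": [51.2, 51.6],
    "expert_judgment":    [50.5, 52.5, 51.0, 52.0],
    "quantitative_models": [50.8, 51.4],
}

print(round(combine_across(by_method), 2))
```

Equal weighting is the usual evidence-based default when there is no strong prior knowledge about which component is more accurate.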

The PollyVote currently predicts that the Democrats will gain 51.5% of the national popular two-party vote, compared to 48.5% for the Republicans. Yet, there is still a lot of uncertainty until the candidates are known. You can track the daily updated forecast at pollyvote.com.

The political leaders and government officials who gathered in Paris have made agreements to implement disruptive and expensive policies on the basis of forecasts of dangerous manmade global warming. The forecasts—which are called scenarios and projections by the U.N. Intergovernmental Panel on Climate Change (IPCC)—are the product of complex computer models involving multitudes of interacting assumptions.

The finding of Kesten Green and Scott Armstrong's recent review that complexity increased forecast errors by 27% on average should have given delegates at the Paris climate policy talks pause for thought: Occam's razor applies to scientific forecasting, too.

At this year's International Symposium on Forecasting, Kesten and Scott presented a review of the IPCC's modeling procedures using a nine-item checklist on conformance with evidence-based guidance on simplicity in forecasting. They found that the IPCC procedures have a "simplicity rating" of 19%. That figure contrasts with a simplicity rating of 93% for the Green, Armstrong and Soon no-change (no-trend) model of long-term global average temperatures.
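A no-change model is simple enough to state in a few lines. The sketch below illustrates the general persistence idea with made-up temperature-anomaly values; it is not the Green, Armstrong and Soon dataset or implementation.

```python
# Sketch of a no-change (persistence) benchmark: the forecast for every
# future horizon is simply the last observed value, i.e. no trend.
# The anomaly values below are made up for illustration.

def no_change_forecast(series, horizons):
    """Return the persistence forecast for each of `horizons` steps ahead."""
    last = series[-1]
    return [last] * horizons

observed = [0.12, 0.18, 0.15, 0.21, 0.19]  # hypothetical anomalies (deg C)
print(no_change_forecast(observed, 3))
```

The appeal of such a benchmark is exactly its simplicity: it uses cumulative knowledge that long-term trends are hard to forecast, and it has almost nothing to misspecify.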

There is no evidence that climate forecasting is an exception to Occam's razor.

The global political and media elites descending on Paris for talks on climate policy might want to consider the Golden Rule of Forecasting.

The Golden Rule derives from many decades of experimental research on forecasting across diverse fields and all kinds of forecasting problems. The Golden Rule of Forecasting "requires forecasters to be conservative by forecasting in a way that is consistent with cumulative knowledge about the situation and about forecasting."

Ignoring the Golden Rule has important practical consequences: The size of forecast errors is typically increased by more than 40%.

The Paris climate talks are predicated on the dangerous manmade global warming scenarios and projections of the UN's Intergovernmental Panel on Climate Change (IPCC). These projections are being treated as forecasts by the policy makers and media attending the Paris talks.

In anticipation of the Paris talks, Scott Armstrong and Kesten Green delivered a paper at this year's International Symposium on Forecasting assessing whether the IPCC's procedures are consistent with the Golden Rule. They found that the IPCC's procedures violated all 20 of the relevant Golden Rule guidelines.

The abstract and slides of their paper, titled "Are dangerous warming forecasts consistent with the Golden Rule?" are available from ResearchGate, here. A supporting flyer is available, here.

The 36th International Symposium on Forecasting is to be held in Santander, Spain, from the 19th to the 22nd of June, 2016, at the Palace of La Magdalena.

The International Symposium on Forecasting (ISF) is the premier forecasting conference, attracting the world's leading forecasting researchers, practitioners, and students. Through a combination of keynote speaker presentations, academic sessions, workshops, and social programs, the ISF provides many excellent opportunities for networking, learning, and fun. 

Important Dates:
Invited Session Proposals: January 31, 2016
Abstract Submissions: March 16, 2016
Early Registration Ends: May 15, 2016

For more information on the 2016 ISF see here. Keynote & Feature Speakers are listed here.

At the recent International Symposium on Forecasting in Riverside, Scott Armstrong and Kesten Green presented three papers illustrating the effects of simplicity and conservatism in forecasting. In one example, with Andreas Graefe, they described applying three Golden Rule checklist items to improve eight established election forecasting models. Their resulting simple model reduced forecast error by 45% compared to the original models. To see the slides for their papers, visit the pages at goldenruleofforecasting.com and simple-forecasting.com, and scroll down.
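An error-reduction figure of this kind is typically computed by comparing the mean absolute error (MAE) of the revised model against the original over the same set of elections. The sketch below shows the arithmetic with hypothetical forecasts and outcomes, not the authors' data.

```python
# Sketch of how a percent error reduction is computed: compare the
# mean absolute error (MAE) of a revised model against the original.
# All forecast and actual values are hypothetical.

def mae(forecasts, actuals):
    """Mean absolute error of forecasts against actual outcomes."""
    return sum(abs(f - a) for f, a in zip(forecasts, actuals)) / len(actuals)

actual   = [51.1, 48.8, 52.3, 50.2]  # hypothetical vote shares (%)
original = [49.0, 51.0, 49.9, 52.4]  # hypothetical original-model forecasts
revised  = [50.0, 49.9, 51.0, 51.3]  # hypothetical revised-model forecasts

reduction = 1 - mae(revised, actual) / mae(original, actual)
print(f"{reduction:.0%}")
```

With these made-up numbers the revised model cuts MAE roughly in half; the 45% figure reported above comes from the authors' actual validation, not from this sketch.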