
The Evolution of Forecasting

Eric Garland DataLab

This infographic shows that foresight has always been a primary human pursuit. Blessed and cursed with the ability to perceive linear time, humankind has always sought to learn from the past and present to improve its lot in the future.

Looking at forecasting from prehistory to the 21st century, two distinct constraints emerge: data sets and analytical tools. For most of history, the biggest constraint was data. Lacking reliable information on which to base decisions, humans turned to drawing inferences from the natural world, or to literally rolling sheep knuckle bones, to see if patterns could be identified, analyzed, and turned into actionable insights.

The Renaissance- and Industrial-Age versions of forecasting mostly addressed the data constraint, seeking out reliable information. As for interpreting that data, until recently most analytical tools amounted to little more than applying local traditions, prejudices, and superstitions.

It is only with the arrival of Big Data that the remaining constraint of the Information Age becomes clear: analytical sophistication. With modern sensors and databases, organizations and their leaders can now command a nearly limitless number of data points for the first time in history. We are now faced with the more pressing issues of:

  1. determining whether we are asking the right questions and
  2. asking whether we are able to interpret the data accordingly.

What’s next in the evolution of forecasting? Leaders will probably have to deal with the trouble caused by elaborate and shiny visions of the future that are well supported by massive data sets, polished in their presentation, and completely, amusingly wrong.