Most predictive analytics suffer from a fatal flaw: they are confined to historical views. Analysts try everything from machine learning (ML) to traditional forecasting to flashy BI dashboards and visualizations in attempts to estimate details about the future of complex business operations. The predictions fail.
A wake does not drive the ship, and the wake cannot tell us where the ship is going next. Looking solely backward, at past business operations, delivers severely impaired future forecasts. We wouldn’t drive ahead at full speed while focused on the rear-view mirror; why would we attempt to support enterprise operations that way?
By subscribing to rear-facing predictions, businesses are limited to executing a reactive string of tactical actions, always responding to the past rather than developing informed strategy from accurate views of the future. Chasing the effects of last quarter, the past year, or even older observations results in error, waste, and missed opportunities. Worse still, some organizations build long-term strategies on historical views. This monumental error steers business actions in the wrong direction for years.
Today, large budgets are devoted to collecting and harnessing the rapidly expanding volume, velocity, variety, and uncertainty of historical data. The cost of these big data programs intensifies pressure to demonstrate ROI, so rear-facing BI, algorithms, and data scientists are flung at the mountain of past information. A few tactical insights and some interesting correlations are uncovered, but truly meaningful decision support remains elusive.
Too many predictive models attempt accurate forecasts by examining historical data; applying these methods to environments rich in complexity and uncertainty produces decisions that miss their target time and again. For operations that include layers of interconnected inputs and process variability, planning the future while looking back at history simply does not work, regardless of the modeling and analysis method applied.
Here’s the simplified overview: if today’s business operations are simply a repeat of yesterday’s, and each subsequent day is like the last, then it certainly makes sense to base business tactics and strategy on historical data. In fact, these cases do not require complex software and analytic applications; we can achieve accurate results, for steady-state cases, with a simple spreadsheet. Yet businesses rarely operate in a steady state; they typically face daily uncertainties and changing environments. Traditional, historical methods quickly fail in these cases. Establishing a connection to the past feels comfortable and is mathematically attainable. We can spend time and effort collecting volumes of historical data, but we will miss the links between the present and the future if we focus entirely on rear-facing information. With history-driven analytics, our decisions and strategies become nothing more than reactions.
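To make the contrast concrete, here is a minimal sketch, using invented demand numbers, of how the "simple spreadsheet" approach (forecast tomorrow as the historical average) works well in a steady state but misses badly once the environment shifts. All figures are synthetic and chosen purely for illustration.

```python
import random

random.seed(42)

# Steady state: daily demand fluctuates around a stable mean of 100 units.
steady = [random.gauss(100, 5) for _ in range(60)]

# Changing environment: the same process, but demand jumps to ~140 on day 31.
shifted = steady[:30] + [random.gauss(140, 5) for _ in range(30)]

def spreadsheet_forecast(history):
    """The spreadsheet approach: forecast the next day as the historical mean."""
    return sum(history) / len(history)

# In the steady case the backward-looking forecast lands near reality...
steady_error = abs(spreadsheet_forecast(steady) - 100)

# ...but after the shift it splits the difference between old and new regimes,
# landing roughly 20 units below where demand actually is now.
shifted_error = abs(spreadsheet_forecast(shifted) - 140)
```

No amount of extra history fixes the second case: adding more pre-shift days only drags the forecast further from the new reality.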
Even predictive analysis based on newly renamed processes like machine learning and AI will produce answers that are far off-target in complex, changing, uncertain settings. These approaches rely on examining data produced by past events. The more advanced analyses leverage near-real-time data, and may even incorporate streaming data from sensors into a machine learning algorithm that tries to keep up. All of these approaches assume, to some degree or another, that the past equals the future. In some cases, the very recent past is heavily weighted and the predictions adjust quickly to whatever just happened.
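That recency weighting is easy to see in simple exponential smoothing, one classic forecasting method, where a single parameter alpha controls how heavily the latest observation counts. The sketch below uses invented flat-demand data with a one-off spike to show how a high alpha makes the forecast chase whatever just happened:

```python
def exp_smooth(series, alpha):
    """Simple exponential smoothing; the final level is the next-period forecast.

    Higher alpha weights recent observations more heavily.
    """
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

# Thirty days of flat demand at 100 units, ending with a one-off spike to 200.
history = [100.0] * 29 + [200.0]

low_alpha_forecast = exp_smooth(history, alpha=0.1)   # 110.0: barely reacts
high_alpha_forecast = exp_smooth(history, alpha=0.8)  # 180.0: chases the spike
```

Neither setting is right: the low alpha lags genuine change, and the high alpha mistakes a blip for a trend. Both are still extrapolations of the past.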
Few businesses operate in a predictable steady state, where the past closely approximates the future. In those rare cases, sweeping assumptions, clever algorithms, and forecasts will be accurate. But such cases can be managed with a few well-known, simplistic formulas; data science is overkill for steady-state ventures, which are well served by decades-old traditional forecasting approaches. Steady-state business operations, however, are rare.
Rapid advancements in the Industrial Internet of Things (IIoT) will soon deliver billions of interconnected machines streaming sensor data in unprecedented volumes and at unprecedented speed. We’ll be able to quickly map details about conditions across complex networks of assets. We’ll have a clearer view of the past than ever before. Still, a swift, near-perfect view of the past will not change the inescapable fact that the future is not entirely caused by the past. To develop strategic insights and apply them to future operations, we need the ability to link the past, present, and future in a holistic manner. In reality, modeled events are not separate; they are elements of an operational continuum. Observing future operations across many possible outcomes fulfills this requirement and allows us to analyze the complete view of the continuous processes that disjoint events only partially represent.
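One way to observe future operations across many possible outcomes is Monte Carlo simulation: instead of producing a single point forecast, generate thousands of candidate futures and read off their distribution. Here is a minimal sketch, with an invented random-walk demand model standing in for a real operational model; the parameter values are assumptions for illustration only.

```python
import random

random.seed(7)

def simulate_quarter(runs=10_000, days=90, start_demand=100.0, daily_sigma=3.0):
    """Generate many possible 90-day futures for total demand.

    Each run is one candidate future: demand drifts as a random walk,
    so uncertainty compounds the further out we look.
    """
    totals = []
    for _ in range(runs):
        demand, total = start_demand, 0.0
        for _ in range(days):
            demand = max(0.0, demand + random.gauss(0.0, daily_sigma))
            total += demand
        totals.append(total)
    return sorted(totals)

outcomes = simulate_quarter()

# Read off the spread of futures rather than a single number.
p10 = outcomes[len(outcomes) // 10]
p50 = outcomes[len(outcomes) // 2]
p90 = outcomes[(len(outcomes) * 9) // 10]
```

The result is a range of quarterly outcomes (here, the 10th, 50th, and 90th percentiles) rather than one forecast, which is what lets a decision maker plan against uncertainty instead of reacting to a single rear-facing estimate.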
Dynamic demands, changing environments, and even chaotic situations influence businesses on a daily basis. Decision makers can influence only a subset of these variable inputs; however, accurate predictive analysis should have the capacity to capture the detail and uncertainty inherent in all of these factors. Future events and policies must be represented in high resolution for insights from predictive analytics to be meaningful. Unfortunately, many decision makers are distracted by the many approaches strictly bound to historical data: ML, inference, debates between frequentist and Bayesian methods, mistaking correlation for causality, traditional forecasting, and simplistic BI. We’ll examine each of these and highlight their murky entanglements.
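Of these pitfalls, mistaking correlation for causality is the easiest to demonstrate. In the sketch below, two hypothetical, entirely unrelated quantities that both happen to trend upward produce a near-perfect Pearson correlation; the figures are invented and the pairing is deliberately absurd.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Five years of hypothetical figures: annual cloud spend ($M) and
# office coffee consumption (kg). Both simply grow over time.
cloud_spend = [1.2, 1.8, 2.5, 3.1, 4.0]
coffee_kg = [40.0, 44.0, 47.0, 52.0, 58.0]

r = pearson(cloud_spend, coffee_kg)  # close to 1.0, yet neither causes the other
```

A rear-facing analysis that treated this correlation as a driver would conclude that buying more coffee grows cloud revenue; any two quantities that trend over the same period will co-move, which is exactly why historical correlations are such treacherous ground for strategy.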