Hannibal Smith, the fearless leader of the A-Team, always loved it when a plan came together, and on that campy ’80s TV show, plans always seemed to, no matter how intricate and contrived. It seems quaint now. In the real world, things rarely happen as we imagine they will.
As Mike Tyson, another icon of the ’80s, liked to say, everybody’s got a plan until they get hit. And, like it or not, we all get hit, usually sooner rather than later. When that happens, as it inevitably does, even our best-laid plans go awry.
In truth, planning has never really been about strategy, but about control, and control has always been an illusion. Nevertheless, for a long time the illusion held well enough to be useful. Plans set direction and, if problems arose, plans could always be changed. The problem is that as technology cycles compress, planning cycles can’t keep up. We need a new model.
The Rise and Fall of Strategic Planning
When Alfred Sloan conceived the modern corporation at General Motors, he modeled it on hierarchical military organizations. Companies were split into divisions, each with its own leadership.
By the 1980s, the seams had started to show. When Jack Welch took the helm at General Electric, he largely dismantled the strategic planning process because, as he said at the time, “the books got thicker, the printing got more sophisticated, the covers got harder and the drawings got better,” but none of it improved how the company performed.
Today, planning has become even less tenable. As the pace of change continues to accelerate and technology cycles become shorter than corporate planning cycles, the false certainty that planning engenders is becoming an impediment to, rather than a tool for, attaining objectives.
As Roger Martin put it in a recent article in Harvard Business Review, strategy is not planning. There is a better way.
Why Our Numbers Are Always Wrong
In the academic world of statistics, a similar debate has been raging over how we should use information. Until recently, the predominant approach was frequentist statistics, the kind you most likely learned in school, which uses controlled variables and large sample sizes to arrive at conclusions that are statistically significant.
If a hypothesis passes a significance test, it is treated as true, but a lot can go wrong. Sometimes controls are overlooked, data is mishandled, or you simply get freak results (95% confidence is standard, which still leaves one result in twenty to chance). Just because a study says something is true doesn’t mean it is.
A recent study in the journal Nature found that a majority of cancer research studies could not be replicated. That’s scary. Incredibly scary. Do we really believe that the data we’re feeding into our strategic plans is any better?
How to Become Less Wrong Over Time
The alternative is the Bayesian method, in which you start with a guess at the answer (a prior) and then keep updating it as data comes in. It doesn’t convey the same sense of certainty as the “hard numbers” of the frequentist method, and you can still get wrong answers, but they become less wrong over time and you get a sense of how likely they are to be right.
To give a simple example, if a frequentist was asked to evaluate a free throw shooter in basketball, he couldn’t give you an answer until he had seen a fixed number of trials (say, 100). You would simply have to wait. If you wanted to check again later, you would have to do another 100 trials.
A Bayesian, however, would keep updating his evaluation from the very first attempt and would continue to keep a running score. In today’s business environment, we simply can’t wait for the strategic equivalent of 100 free throws.
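The Bayesian’s running score can be sketched in a few lines. This is a minimal illustration, assuming a standard Beta-Binomial model with a uniform prior; the sequence of shots is made up.

```python
# A minimal sketch of Bayesian updating for the free-throw example.
# We model the shooter's true percentage with a Beta prior and revise
# the estimate after every single attempt -- no waiting for 100 trials.

def update(made, attempts, prior_alpha=1.0, prior_beta=1.0):
    """Return the posterior mean estimate of the shooter's percentage.

    With a Beta(alpha, beta) prior, the posterior after observing
    `made` successes in `attempts` shots is
    Beta(alpha + made, beta + attempts - made).
    """
    alpha = prior_alpha + made
    beta = prior_beta + (attempts - made)
    return alpha / (alpha + beta)

# The estimate sharpens with every shot (1 = made, 0 = missed):
shots = [1, 0, 1, 1, 1, 0, 1, 1]  # hypothetical data
made = 0
for n, shot in enumerate(shots, start=1):
    made += shot
    print(f"after shot {n}: estimated {update(made, n):.2f}")
```

Note that before any shots the estimate is simply the prior mean of 50%, and the more shots we observe, the less each new one moves it.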
Unfortunately, there seems to be something about guessing that doesn’t sit right with tough-minded, hard-data types, which is why Bayes and his method fell out of favor. Nevertheless, it has been making a comeback lately.
The Road To The White House
The 2012 presidential election was unusual for two reasons. First, despite a poor economy, the incumbent, Barack Obama, won by a surprisingly comfortable margin. Second, polling played an unusually prominent role in the coverage of the race. Bayesian methods played a part in both.
As Sasha Issenberg reported in The Victory Lab, Obama’s campaign collected a large amount of data to determine the probability that a particular individual would support the president and turn out for the election. As more data came in, these scores were adjusted. The results were then used to run simulations.
Nate Silver also used Bayesian methods to predict the outcome of the election. He aggregated public polls along with other data and ran all of it through a model to test outcomes under different conditions. He not only got the final outcome right, but correctly predicted the results of all 50 states and the District of Columbia.
In a sense, there wasn’t just one election but millions of them: the one in the real world and the countless others run by computer models.
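Running millions of elections is less exotic than it sounds. Here is a toy Monte Carlo simulation in the spirit of the approach described above; the state names, win probabilities, and electoral votes are entirely hypothetical, and this is not Nate Silver’s actual model.

```python
# A toy Monte Carlo election: flip a weighted coin for each state,
# tally electoral votes, repeat many times, and count how often
# Candidate A reaches a majority. All numbers are illustrative.
import random

STATES = {  # name: (electoral_votes, prob_candidate_A_wins_state)
    "StateOne": (29, 0.70),
    "StateTwo": (18, 0.55),
    "StateThree": (20, 0.45),
    "StateFour": (11, 0.30),
}
NEEDED = 40  # majority of the 78 hypothetical electoral votes

def simulate_once():
    """Run one simulated election and return Candidate A's vote total."""
    return sum(votes for votes, p in STATES.values() if random.random() < p)

def win_probability(trials=100_000):
    """Fraction of simulated elections that Candidate A wins."""
    wins = sum(simulate_once() >= NEEDED for _ in range(trials))
    return wins / trials

print(f"Candidate A wins about {win_probability():.1%} of simulations")
```

In a real forecast the per-state probabilities would themselves come from Bayesian aggregation of polls, and the coin flips would be correlated across states rather than independent as they are here.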
Management by Simulation
It has become increasingly obvious that the old planning model is untenable. It not only relies too heavily on weak evidence, it’s simply too slow. We need to discard our false sense of certainty and learn to rely more on simulation. Perhaps not surprisingly, “test and learn” has become a standard phrase in the corporate lexicon.
Much as financial analysts use scenario planning to test multiple strategies, digital marketers constantly run small campaigns as A/B tests so that performance improves over time, and logistics operations run their routes through genetic algorithms before they start loading trucks.
Failing “fast and cheap” is becoming too slow and too expensive.
As our information technology continues to advance and the Web of Things gives us an unprecedented amount of data to work with, we’re beginning to use agent-based models and Bayesian methods to predict likely outcomes of everything from disease epidemics to the consequences of economic and political policy.
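To make “agent-based model” concrete, here is a toy epidemic simulation of the kind alluded to above: each agent is susceptible, infected, or recovered, and the contact process is run forward step by step. Every parameter here is illustrative, not calibrated to any real disease.

```python
# A toy agent-based SIR epidemic: agents are "S" (susceptible),
# "I" (infected), or "R" (recovered). Each step, every infected agent
# meets a few random others and may transmit, then may recover.
import random

def run_epidemic(n_agents=500, n_steps=60, contacts=4,
                 p_transmit=0.05, p_recover=0.1, seed_infected=5):
    state = ["I"] * seed_infected + ["S"] * (n_agents - seed_infected)
    for _ in range(n_steps):
        nxt = state[:]
        for i, s in enumerate(state):
            if s == "I":
                for j in random.sample(range(n_agents), contacts):
                    if state[j] == "S" and random.random() < p_transmit:
                        nxt[j] = "I"
                if random.random() < p_recover:
                    nxt[i] = "R"
        state = nxt
    # Agents only move S -> I -> R, so I + R = everyone ever infected.
    return state.count("I") + state.count("R")

print(f"{run_epidemic()} of 500 agents were eventually infected")
```

Run it a few thousand times and you get a distribution of outbreak sizes rather than a single prediction, which is exactly the shift from planning to simulation the article describes.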
One thing’s for sure: the days of plans are over. In the future, we’ll need to simulate failure in order to succeed in the real world.