XP Day 2: David Stoughton

Is uncertainty really unmanageable?

All our plans are based on assumptions. Market changes break assumptions.

Strategic planning tries to plan for the long term: someone inspects the business to model possible futures and construct scenarios. It's the assumptions that are critical, but they get forgotten as the organisation travels along the lines set out by the main scenario.

Sensitivity analysis is another method: create a model, change key variables, and work out which variable affects outcomes the most. It's a good means of discarding some scenarios, but in general only looks at local events, rather than causes: "this set of customers stops buying", not why they've stopped.
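A minimal sketch of the sensitivity-analysis idea (the model, variable names, and all numbers here are hypothetical, not from the talk): perturb each input of a simple outcome model by the same percentage and rank the inputs by how much they move the result.

```python
def profit(price, volume, unit_cost, overhead):
    # Toy outcome model: revenue minus variable and fixed costs.
    return price * volume - unit_cost * volume - overhead

baseline = dict(price=10.0, volume=1000, unit_cost=6.0, overhead=2000)

def sensitivity(model, inputs, delta=0.10):
    """Return the outcome change from a +10% move in each variable."""
    base = model(**inputs)
    impacts = {}
    for name, value in inputs.items():
        perturbed = dict(inputs, **{name: value * (1 + delta)})
        impacts[name] = model(**perturbed) - base
    return impacts

impacts = sensitivity(profit, baseline)
# Rank variables by absolute impact on the outcome.
ranked = sorted(impacts, key=lambda k: abs(impacts[k]), reverse=True)
```

Note the limitation the talk points out: this tells you *which* variable matters ("price swings hurt most"), not *why* that variable might move.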

Then there's emergent strategy, with folks on the shop floor making decisions. In practice this usually only allows for minor adjustments, anything major needs to go through planning: so it's flexible administration of rigid policy.

Three disabling assumptions:

  1. We predicate the configuration of our business on stable market conditions, as we can't respond to chaos;
  2. To change is to lose face;
  3. Change is difficult and expensive;

The result:

  1. All uncertainty must be treated as a risk;
  2. Change is only made in fits and starts as poor alignment between company configuration and market realities becomes unbearable;

Risk management isn't enough; the clash between technology and social change is accelerating: look at what has already happened to the holiday and music industries. Organisations can't cope with sudden changes because they can't anticipate. They're reactive.

Action is preferred to inaction because management likes to be seen to act. Decisions are made before uncertainty is resolved. Changing strategies means losing face.

What's missing? We need to make our assumptions explicit: what people will pay, what the market is, etc. Financial ones tend to be more explicit than non-financial ones.

How can we model assumptions? Look at all of them and their knock-on effects. (Shows a very complicated model of technical, political, competitive, etc. assumptions.)

It's tough to model P&L, though you can model likely demand and likely costs of supply.

Use the model to reduce response times. You need to react before the outcome of a change in conditions occurs: the trigger point needs to be earlier. You need to plan systemic responses to predicted outcomes and assign or acquire enablers and resources for critical response capability - which is a cost.

What do we need to understand about markets? Complexity, competitors, substitutes and complements. Cumulative impact of remote events on local variables: several factors combining. Positive feedback loops and negative (damping) ones, abrupt changes caused by positive feedback.

Currently we work with a Bayesian network model: embed the dynamics of the system into a cause/effect structure. We have a pile of indicators for each variable: confidence, valid range, etc.
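A toy cause-and-effect fragment of the kind of network described above (the structure and all probabilities are illustrative assumptions, not the speaker's actual model): a single cause variable, an effect variable, and Bayes' rule to reason back from an observed effect to its likely cause.

```python
# Prior: probability a competitor launches a substitute this period.
p_substitute = 0.2
# Conditional probabilities for the effect variable "demand falls".
p_fall_given_sub = 0.7      # demand falls if a substitute appears
p_fall_given_no_sub = 0.1   # baseline chance demand falls anyway

# Marginal probability that demand falls (sum over the cause).
p_fall = (p_fall_given_sub * p_substitute
          + p_fall_given_no_sub * (1 - p_substitute))

# Bayes: if we observe demand falling, how likely is the substitute cause?
p_sub_given_fall = p_fall_given_sub * p_substitute / p_fall
```

A real model would chain many such variables, each carrying the indicators mentioned above (confidence, valid range), but the inference step is the same cause/effect propagation.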

To plan:

  1. create systems to gather information about each variable
  2. establish a timebox, matched to the clock rate of market change
  3. automate data collection
  4. view current status through a dashboard
  5. review and prioritise

Deploy options according to trigger conditions.
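A hypothetical sketch of the trigger idea (indicator names, ranges, and readings are invented for illustration): each monitored variable carries a valid range, and a response option fires as soon as an indicator leaves its range, before the outcome itself arrives.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    low: float
    high: float

    def triggered(self, reading: float) -> bool:
        # The trigger fires when the reading leaves the valid range.
        return not (self.low <= reading <= self.high)

indicators = [
    Indicator("weekly_orders", low=800, high=1200),
    Indicator("competitor_price", low=9.0, high=12.0),
]

readings = {"weekly_orders": 640, "competitor_price": 10.5}

# Collect the trigger conditions that have fired on this review cycle.
fired = [i.name for i in indicators if i.triggered(readings[i.name])]
```

In this sketch `weekly_orders` has fallen below its range, so its planned response would be deployed while `competitor_price` stays quiet.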

This implies redundancy of effort and assets, which doesn't fit well with minimising costs. Audience member points out that maximising utilisation suboptimises for throughput (which I think is the thesis behind Slack).

Is it all worth it? The value of systems is enhanced. A lot of this is about protecting old assets rather than creating new ones. This is an agile system: incremental, evolutionary, frequent delivery.

There's prejudice remaining: redundancy is seen as expensive... when we know agile is cheaper.