LOBUS.WORKS
Flow & Measurement

The Dashboard Delusion

14 Dec 2025 · 7 min

Every organisation I've worked with has dashboards. Velocity charts, burndown reports, sprint completion rates, status summaries that roll up into executive decks with green, amber, and red indicators.

Almost none of them show what's actually happening.

The measurement theatre

The problem isn't the data. The data is accurate. The problem is the question the dashboard was built to answer. Most delivery dashboards answer: "Are we doing what we said we'd do?" What they should answer is: "Is our delivery system healthy, and are we getting better?"

Those are different questions. The first rewards compliance. The second rewards learning.

Warning: When a dashboard reports green and the engineering team is exhausted, the dashboard isn't wrong. It's answering the wrong question.

What your dashboards are hiding

I built FlowMaster — an analytics engine that connects to Jira and runs eight diagnostic engines — because I needed to see what dashboards hide. Here's what consistently surfaces:

Metric gaming. Teams manipulate statuses, bulk-update items at sprint boundaries, and time their transitions to make completion rates look right. This isn't cheating — it's a rational response to an incentive structure that rewards appearance over reality. FlowMaster's anomaly detection flags 15 patterns of gaming. Each one tells a story about psychological safety and power.
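To make the idea concrete, here is a minimal sketch of one such detector — my own illustration, not FlowMaster's actual implementation; the function name and the spike threshold are assumptions. It flags days where the count of status transitions spikes far above the daily median, the signature of a sprint-boundary bulk update:

```python
from collections import Counter
from statistics import median

def bulk_update_days(transition_dates, spike_factor=5):
    """Flag days where status transitions spike far above the median
    daily count -- a common signature of sprint-boundary bulk updates.
    `transition_dates` holds one entry (any hashable day key) per
    status transition pulled from the issue history."""
    per_day = Counter(transition_dates)
    if not per_day:
        return []
    typical = median(per_day.values())
    return sorted(day for day, n in per_day.items()
                  if n >= spike_factor * typical)
```

A run over real history would feed in the changelog timestamps bucketed by day; any day returned is worth a conversation, not an accusation — the point, as above, is the incentive structure, not the individual.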

Flow debt. The gap between your proxy-based cycle time (WIP divided by throughput, via Little's Law) and your measured cycle time grows wider every quarter. This divergence measures how far your stated process has drifted from reality. Positive flow debt means your measured numbers understate how long work really spends in the system — your governance reports optimistically. The gap is governance theatre, quantified.
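Flow debt is simple to compute once you have both numbers. A minimal sketch — the function name and inputs are my own; it assumes you can pull average WIP, daily throughput, and timestamp-based cycle time from your tracker:

```python
def flow_debt(avg_wip, throughput_per_day, measured_cycle_days):
    """Flow debt = Little's-Law cycle time minus measured cycle time.

    avg_wip:             average number of items in progress
    throughput_per_day:  completed items per day
    measured_cycle_days: cycle time computed from issue timestamps

    Positive debt: items sit in the system longer than the board's
    timestamps admit."""
    proxy_cycle = avg_wip / throughput_per_day  # Little's Law: CT = WIP / throughput
    return proxy_cycle - measured_cycle_days
```

For example, 30 items in flight at 2 completions a day implies a 15-day cycle time; if the board's timestamps say 9 days, you are carrying 6 days of flow debt.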

Invisible constraints. The same bottleneck sits in the same status for months. Everyone works around it. Nobody has the authority or incentive to fix it. The dashboard doesn't show bottleneck dwell time, so the constraint remains invisible to leadership while the teams feel it every day.

WIP explosion. Work in progress is three times capacity. New items keep arriving. Nobody says no. The dashboard shows individual items progressing. What it doesn't show is that the system is overloaded, aging work is piling up, and every new item makes every existing item slower.
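The overload itself is easy to surface once you look for it. A sketch of a check a dashboard could run — names and the shape of the report are my assumptions, not an existing tool's API:

```python
from datetime import date

def wip_overload(in_progress_started, capacity, today=None):
    """Report system overload: WIP against capacity, plus the age of the
    oldest in-flight item.

    in_progress_started: start date of each item currently in progress
    capacity:            items the team can realistically work in parallel"""
    today = today or date.today()
    ages = sorted((today - start).days for start in in_progress_started)
    return {
        "wip": len(ages),
        "load_ratio": len(ages) / capacity,   # > 1.0 means overloaded
        "oldest_item_days": ages[-1] if ages else 0,
    }
```

A load ratio of 3.0 is the "three times capacity" situation above: every item in the queue is competing for the same people, so every new arrival slows everything already in flight.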

Percentiles, not averages

The most fundamental delusion is the average. Your average cycle time is 14 days. This number is meaningless.

What matters is the distribution. Your 50th percentile (median) is 8 days. Your 85th percentile is 22 days. Your 95th percentile is 47 days. That spread tells you everything: your process is unpredictable, your forecasts are unreliable, and your stakeholders are making commitments based on a number that describes nobody's actual experience.

Percentile-based forecasting replaces the fiction of the average with a conversation about probability: "We have 85% confidence this will be done in 22 days." That's a conversation leaders can actually use.
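A percentile-based forecast needs nothing more than the cycle times of recently completed items. A minimal sketch using the nearest-rank method — the function names are mine:

```python
def percentile(cycle_times, p):
    """Nearest-rank percentile of completed-item cycle times (days)."""
    ordered = sorted(cycle_times)
    k = -(-p * len(ordered) // 100) - 1   # ceil(p * n / 100) - 1
    return ordered[max(0, min(k, len(ordered) - 1))]

def forecast(cycle_times, confidence=85):
    """The number for: 'We have {confidence}% confidence this will be
    done in N days.'"""
    return percentile(cycle_times, confidence)
```

Feed it the last quarter's completions and `forecast(times, 85)` gives you the 22 in "85% confidence this will be done in 22 days" — a commitment grounded in the distribution, not the average.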

Process behaviour charts: signal vs noise

When your cycle time goes from 14 to 18 days, is that a problem? The dashboard doesn't know. A process behaviour chart does.

Statistical process control distinguishes signal (a real change in the system) from noise (normal variation). Without this distinction, organisations chase every fluctuation, firefighting symptoms while the structural causes go unaddressed.

The four control rules that separate signal from noise are simple enough to fit on a sticky note. But most organisations have never heard of them, because their dashboards show trend lines instead of control limits.
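The limits themselves genuinely do fit on a sticky note. A sketch of an XmR (individuals) chart using Wheeler's 2.66 constant on the average moving range; the detection rule shown is the first of the four — a single point outside the limits:

```python
from statistics import mean

def xmr_limits(values):
    """Control limits for an XmR process behaviour chart:
    mean +/- 2.66 * average moving range."""
    centre = mean(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    spread = 2.66 * mean(moving_ranges)
    return centre - spread, centre, centre + spread

def is_signal(new_value, history):
    """Detection rule 1: a point outside the control limits."""
    lower, _, upper = xmr_limits(history)
    return new_value < lower or new_value > upper
```

Run it on a baseline of weekly cycle times and a move from 14 to 18 days may well fall inside the limits — noise, not signal — while the same chart will catch a genuine shift the trend line buries.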

The structural diagnosis

Every dashboard failure I've described is a structural failure, not a data failure:

  • Metric gaming exists because the incentive structure rewards appearance over reality. Fix the incentives, not the data.
  • Flow debt accumulates because governance is disconnected from delivery reality. Close the gap between stated and actual process.
  • Invisible constraints persist because nobody has the authority to fix them. Redesign accountability.
  • WIP explosion continues because nobody says no. That's a governance design problem.

The dashboards aren't lying. They're faithfully reflecting an organisation that's measuring the wrong things, asking the wrong questions, and optimising for compliance instead of learning.


FlowMaster is the diagnostic analytics engine I built to see what dashboards hide. Learn more about the diagnostic.