The current finds its path around whatever is fixed in place. The same principle applies to analytics: define the decision first, and let the data flow from there.

Every executive I know wants better visibility. They want more dashboards and real-time data. The wants are valid, and the requests always sound reasonable. The results rarely are.

A company spends six months and $400K building a comprehensive analytics platform with sophisticated visualizations, tons of KPIs, and real-time updates.

Six months later, three people use it regularly. The rest of the organization makes decisions the same way they always did: by gut feel, spreadsheets, and whoever talks loudest in the meeting.

The numbers confirm what observation suggests.

According to Gartner's 2024 CDAO Agenda Survey, only 22% of organizations have clearly defined, tracked, and shared business impact metrics for their data and analytics projects. Research shows that only 20% of enterprise decision-makers who could be using BI applications actually do. The other 80% rely on the data skills of that active minority.

Meanwhile, 73-88% of company data goes entirely unused for analytics, sitting in warehouses as expensive digital inventory that nobody knows exists.

The problem has a name. Researchers refer to it as the decision framework gap. Understanding it explains why throwing money at dashboards produces expensive wallpaper.

Behavioral scientists Bart De Langhe and Stefano Puntoni spent years studying why investments in analytics fail. Their conclusion, published in MIT Sloan Management Review and expanded in their 2024 book, inverts conventional wisdom: "Instead of finding a purpose for data, find data for a purpose."

Most organizations build analytics backwards. They start with available data, construct dashboards around it, and then hope decision-makers will extract value.

De Langhe and Puntoni's research shows this approach anchors teams on the wrong questions: the ones their data happens to answer, rather than the ones their business actually needs answered.

The right sequence looks different:

  1. Identify the specific decisions you need to make

  2. Determine what information would help you choose between alternatives

  3. Figure out what data would provide that information

  4. Only then build the analytics to deliver it

This sounds obvious, but almost nobody does it.
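
To make the inversion concrete, here is a minimal sketch of that sequence captured as a written artifact, in Python only for legibility. The DecisionSpec structure and the inventory example are my own illustration, not anything prescribed by De Langhe and Puntoni:

```python
from dataclasses import dataclass

@dataclass
class DecisionSpec:
    """Work backwards from the decision, not forwards from the data."""
    decision: str                   # 1. the specific decision to be made
    alternatives: list[str]         # the options being chosen between
    information_needed: list[str]   # 2. what would help choose between them
    data_required: list[str]        # 3. what data provides that information
    analytics_deliverable: str      # 4. only then: what actually gets built

# Hypothetical example: one concrete weekly decision, not a wall of KPIs.
reorder = DecisionSpec(
    decision="Which SKUs do we expedite reorders for this week?",
    alternatives=["expedite", "standard reorder", "let stock run down"],
    information_needed=[
        "projected stockout date per SKU",
        "margin impact of a stockout",
    ],
    data_required=[
        "daily sales velocity",
        "current on-hand inventory",
        "supplier lead times",
    ],
    analytics_deliverable=(
        "A Monday-morning exception report listing only SKUs projected "
        "to stock out inside supplier lead time."
    ),
)
```

Notice that the deliverable is the last field filled in, and it is scoped to the decision, not to whatever data happens to be available.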

Cassie Kozyrkov, Google's first Chief Decision Scientist, offers a simple test before any analytics investment: "What would it take to change your mind?"

If stakeholders can't articulate what evidence would shift their decisions, the dashboard will become decoration for decisions already made.

Kozyrkov coined the term "data-inspired" to critique organizations that claim to be data-driven but actually use data as a post hoc justification. The dashboard is displayed in meetings, people nod at the charts, and then they do what they were going to do anyway.

The Wavestone 2024 Data and AI Leadership Executive Survey of Fortune 1000 companies puts a number on this: 74% of companies aim to be data-driven, but only 29% believe they're good at translating analytics into action.

That 45-point gap is the decision framework gap made concrete: organizations have invested in data infrastructure without investing in decision-making infrastructure.

What does decision-making infrastructure actually look like? It starts with documented answers to specific questions before any dashboard gets built:

  • What decisions will this inform?

  • Who has the authority to act on these insights?

  • What would we do differently if the numbers showed X versus Y?

  • What's the timeline from insight to action?

Without these answers, you're building a monitoring system, not a decision system. Monitoring systems are useful for pattern recognition and anomaly detection, but they're terrible for driving organizational change.

Dylan Etkin, former Atlassian development manager for Jira and Bitbucket, describes the dashboard death cycle he's observed repeatedly:

A dashboard is created with the best intentions. People look at it for a while. Nothing happens most of the time. Data becomes stale. People notice and care less. The dashboard is ignored entirely.


His conclusion cuts to the root of the problem: "Dashboards don't come with interpretation. The same data means different things to different people."

For operational data, you want alerts, not dashboards. For business KPIs, you want periodic reports with context. Dashboards are suboptimal for both use cases, yet organizations continue to build them.
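
The distinction is worth making concrete: an alert encodes the decision rule and interrupts someone only when the decision is live, while a dashboard waits to be noticed. A minimal sketch, with a made-up metric, threshold, and escalation path:

```python
from typing import Optional

def check_queue_depth(current_depth: int, threshold: int = 500) -> Optional[str]:
    """Return an actionable alert message, or None when no action is needed."""
    if current_depth > threshold:
        # The decision rule lives in the code, not in a viewer's head.
        return (f"Queue depth {current_depth} exceeds {threshold}: "
                f"scale out workers or shed load.")
    return None  # below threshold: no alert, nothing to stare at

alert = check_queue_depth(current_depth=742)
if alert:
    print(alert)  # in practice: page the on-call or post to the ops channel
```

The dashboard version of this shows the queue depth to anyone who happens to look; the alert version decides when looking is required.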

Dashboards in the Age of GenAI

The generative AI wave has made this worse, not better.

MIT's NANDA Initiative "GenAI Divide" report found that 95% of generative AI projects failed to achieve ROI targets. A 2025 survey found that 42% of companies had abandoned most of their AI initiatives, up from 17% in 2024.

The technology improved, but the organizational capacity to convert insights into action did not.

AI tools multiply data without multiplying decision-making capability. Tableau's own analysis of what they call the "last mile" problem identifies the persistent barriers: context switching (BI tools live outside operational systems), limited accessibility (platforms built for data experts, not decision-makers), and overreliance on users to drive action.

More sophisticated AI doesn't solve these barriers; it amplifies them.

BCG and McKinsey research came to the same conclusion: "AI does not lack capabilities; it is organizations that lack the structure to absorb them."

Redesigning workflows requires renegotiating scopes, roles, and hierarchies. It calls into question middle managers, decision cycles, and control models. Organizations skip this uncomfortable work, hoping technology will substitute for organizational change.

It won't.

Two Examples

The UK National Health Service's National Programme for IT stands as perhaps the most expensive dashboard failure in history. From 2002 to 2011, the NHS spent £12.7 billion on integrated electronic patient records, online booking systems, and national network infrastructure, yet delivered only £2.6 billion in documented benefits.

The systems largely functioned, but clinical staff didn't trust them. Doctors maintained parallel spreadsheets because the data didn't align with their decision-making needs. A University of Cambridge case study concluded the project was "too large for leadership to manage competently."

The technology worked, but the decisions didn't follow.

Now consider Georgia State University. They implemented predictive analytics to identify at-risk students. The system analyzes 800 risk factors daily for each student.

But the critical difference is that every alert triggers a specific advisor intervention. The decision framework preceded the technology.

The result: a 23 percentage-point increase in the graduation rate and the elimination of achievement gaps based on race, ethnicity, and income. Fifty-two thousand proactive interventions annually.

The NHS built dashboards and hoped decisions would follow, while Georgia State defined decisions and built systems to support them: same technology category, opposite outcomes.

What Actually Works

The organizations that extract value from analytics share a consistent pattern. They define decisions before building dashboards. They link every metric to a specific action and embed analytics within operational workflows rather than treating them as observation posts.

Avalign Technologies, a medical device manufacturer, followed this approach. They defined the decision first: decreasing downtime and increasing throughput for grinding and lathe equipment. Then they built dashboards specifically to support it.

Within nine months: 40% increase in overall equipment effectiveness, $4.5 million in capacity utilization gains, 14,000 hours saved.

The diagnostic for any analytics investment is straightforward.

Before building anything, document:

  • What specific decision does this inform?

  • What would trigger a change in that decision?

  • Who has the authority to act?

  • What's the response time from insight to action?

If the answers are vague, the dashboard will be just another place to look at numbers, and the time and money will have been wasted.
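
One way to make the diagnostic stick, sketched below with hypothetical field names and answers, is to treat the four questions as required fields and refuse to start the build while any of them is blank:

```python
# Illustrative pre-build gate: the question set mirrors the list above.
REQUIRED_ANSWERS = {
    "decision_informed": "What specific decision does this inform?",
    "change_trigger": "What would trigger a change in that decision?",
    "decision_owner": "Who has the authority to act?",
    "response_time": "What's the response time from insight to action?",
}

def ready_to_build(answers: dict[str, str]) -> list[str]:
    """Return the unanswered questions; an empty list means proceed."""
    return [question
            for key, question in REQUIRED_ANSWERS.items()
            if not answers.get(key, "").strip()]

# Hypothetical intake form for a proposed support-staffing dashboard.
gaps = ready_to_build({
    "decision_informed": "Weekly staffing levels for the support team",
    "change_trigger": "",  # nobody could say what would change their mind
    "decision_owner": "Director of Support",
    "response_time": "Decided in the Monday ops review",
})
if gaps:
    print("Not ready to build:", gaps)
```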

For existing dashboards, audit:

  • When was the last time this dashboard changed a decision?

  • Can anyone name a specific action taken because of this data?

  • If this dashboard disappeared tomorrow, what decisions would become harder?

If the audit reveals dashboards that inform nothing and change nothing, you've discovered where the budget is hiding.

The 80% failure rate in BI projects has remained consistent for over a decade, despite exponential technological advances. The problem was never the technology. The problem is that organizations treat analytics as an observation function rather than a decision function.

Data should inform strategy. Data should support decisions.

But "data-driven" too often becomes "we cannot make a decision unless the data supports it 90%," followed by "It’s not our fault we failed; the data told us it would work."

Being data-informed preserves judgment, experience, and the capacity to see what the metrics cannot measure.

The executives who will extract value from their analytics investments are those who start with decisions, not data. They ask "what would we do differently?" before "what can we measure?" They build decision frameworks first, then fill them with analytics.

The 80% failure rate persists because most organizations refuse to work in this sequence, continuing to hope that enough data, enough dashboards, and enough AI will eventually produce decisions worth making.

It won't.

The decision framework comes first.

Everything else is expensive decoration.

This is the kind of problem I work through with executive teams: determining what's worth building before anything is built. If your analytics investments are producing reports that nobody uses, that's a solvable problem.

More at ericbrown.com or subscribe to my newsletter at newsletter.ericbrown.com.

Newsletter Recommendations

The Magnus Memo

A personal dispatch from my corner of the tech world, 25 years in the making: a blend of tech wisdom, hard-won lessons, behind-the-scenes stories, and the occasional life hack.

Westenberg.

Where Builders Come to Think.

Brian Maierhofer

One decision to change your life; one decision to save your heart
