Russell Beech

Global Service Principal (Design & Product)
AI

December 17, 2025

The ambiguity about ‘value’ is the silent killer of project success and client trust

We don’t need a new metric; we need a new mindset.

In technology consulting, we’re constantly talking about ‘value’: added value, delivering value, time-to-value, and many other phrases where ‘value’ carries universally positive connotations. It feels good to talk about value and associate it with the work we do, but what happens when a consultant, a client, a shareholder, and an end-user all have different expectations of what ‘value’ means?

The disconnect between delivery metrics and our clients’ business outcomes can cost millions in misdirected effort.

This isn’t because of poor intentions or lack of skill – it’s about fundamentally different definitions of success existing in parallel but never actually aligning.

Fundamental questions

For consultants focused on delivery, value is typically defined by the health, speed, and stability of the software delivery process. In other words, “are we building things right?”

From a service design perspective, meanwhile, the crucial value question is about outcomes for customers and the organisation: “are we building the right things?”

A narrow focus on measuring our own practice puts us at risk of building the wrong thing, efficiently. This leads to misaligned expectations between teams and stakeholders, unmet user needs, and a failure to deliver truly meaningful impact.

So the goal here isn’t to pit these views against each other, but to recognise that true, sustainable value lies at their intersection.

But how?

What to measure?

There’s no lack of metrics in the software industry. They all perform an important job in their own right, though each has limitations (I’ve seen OKRs that read like JIRA epics more times than I can remember). But the point here isn’t to focus on specific metrics per se; it’s to highlight an existing bias towards measuring output rather than outcome.

Our clients operate services and deliver products that have a job to do; the things we build for them are components in a wider system that delivers customer outcomes. It’s this wider system that we need to understand better, so that we can measure it better.

Irrespective of how small or buried in the backend a technology product may be, everything an organisation does should be linked to value generation for its customers.

So while DORA metrics are vital for measuring the health of our delivery process, they don’t tell us if we have built the right thing or if it’s having the intended effect.

While the fundamental question remains the same – “what outcome are we trying to create?” – the broader definition of value we’re seeking requires a broader set of measurements; a holistic, layered story that connects the performance of our delivery engine to the real-world impact it has:

  • Organisation and user outcomes: the WHY? This is the highest level of value. Did we achieve the intended purpose? For product managers, this might be measured through OKRs – for example increased activation rates or reduced churn; for commercial businesses, it could be market share or customer lifetime value; for public services, the outcome is often societal. Service design takes a holistic perspective, accounting for all of this when creating service performance frameworks.

For years, HM Passport Office processed millions of paper applications on an inefficient, expensive legacy system. Through our transformational work we’ve delivered an award-winning Digital Application Processing (DAP) system that’s brought faster passport processing for millions of citizens, and a more resilient service that can be operated from anywhere, which proved critical during the pandemic.

Success was measured in millions of applications processed, reduced manual interventions, and the ability to keep a vital national service running under any circumstances. Other outcomes can be binary but critical, such as achieving legislative compliance or solving a specific user problem.

  • Experience and interaction quality: the WHAT? This measures the journey for everyone involved. Are we making life easier for users? We can track this with indicators like the Customer Effort Score (CES). Crucially, this also includes the employee experience. A service that creates immense manual work and stress for staff has not created value; it has simply shifted the cost.

Co-op Food was struggling to manage rota scheduling for 45,000 staff with generic, off-the-shelf systems that only managers could access.

Through contextual enquiry, EE surfaced the true needs of all staff across multiple locations, enabling the design of a customised digital product.

The web application we built meant managers could complete their tasks on mobile store tablets or their own mobile devices. And for the first time, staff were given access to functionality like viewing rotas, holiday requests, payslips and clocking in and out of shifts. The app achieved record adoption, improved attendance rates and positive feedback from staff, thanks to the greatly increased control it gave them across 2,500 stores. It’s become a vital component of every store’s operation.

  • Delivery and operational performance: the HOW? This is where our delivery engine metrics, like DORA*, fit in. They are the essential indicators of our ability to deliver and maintain the service.

Our work with IG Group, for example, demonstrates two of the DORA metrics. We dramatically reduced the Change Failure Rate from 33% to 0%, and cut Lead Time for Changes from 18 hours to under 4 hours.

Our client was experiencing frequent production issues that impacted trader confidence and platform stability. By improving deployment practices, automated testing, and observability, the team created the foundation for reliable, rapid delivery of new trading features. 

* Google’s DevOps Research and Assessment (DORA) programme measures four metrics of software delivery performance: Deployment Frequency, Lead Time for Changes, Change Failure Rate, and Mean Time to Recovery.
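As a rough illustration of how these four metrics are derived in practice, the sketch below computes each one from a small, entirely hypothetical deployment log (the field names and figures are invented for the example, not drawn from any client engagement):

```python
from datetime import datetime

# Hypothetical deployment log: when a change was committed, when it was
# deployed, whether the deployment failed in production, and (if it failed)
# how many hours it took to restore service.
deployments = [
    {"committed": datetime(2025, 1, 6, 9),  "deployed": datetime(2025, 1, 6, 13), "failed": False, "recovery_hours": 0},
    {"committed": datetime(2025, 1, 7, 10), "deployed": datetime(2025, 1, 7, 12), "failed": True,  "recovery_hours": 2},
    {"committed": datetime(2025, 1, 8, 11), "deployed": datetime(2025, 1, 8, 14), "failed": False, "recovery_hours": 0},
]
days_observed = 3  # length of the observation window, in days

# Deployment Frequency: deployments per day over the window.
deployment_frequency = len(deployments) / days_observed

# Lead Time for Changes: mean hours from commit to deployment.
lead_time_hours = sum(
    (d["deployed"] - d["committed"]).total_seconds() / 3600 for d in deployments
) / len(deployments)

# Change Failure Rate: share of deployments that caused a production failure.
change_failure_rate = sum(d["failed"] for d in deployments) / len(deployments)

# Mean Time to Recovery: mean hours to restore service after a failure.
failures = [d for d in deployments if d["failed"]]
mttr_hours = (
    sum(d["recovery_hours"] for d in failures) / len(failures) if failures else 0.0
)

print(f"Deployment Frequency: {deployment_frequency:.1f}/day")
print(f"Lead Time for Changes: {lead_time_hours:.1f}h")
print(f"Change Failure Rate: {change_failure_rate:.0%}")
print(f"Mean Time to Recovery: {mttr_hours:.1f}h")
```

The point of the sketch is that all four metrics describe the delivery pipeline itself; none of them says anything about whether the deployed changes achieved an outcome for users or the business.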

The power of this approach isn’t in choosing one perspective over another; it’s in connecting them. And when they are aligned through a shared understanding of value, teams can move fast without losing sight of what matters.

The AI productivity detour

The current rush to adopt AI tools in development practice has led to a very engineering-centric view of value, driven by the seductive promise of more code, much faster.

The AI hype machine has been full of bold claims about drastic efficiency gains, but as we’ve come to realise, the bottleneck has never been limited to how fast developers can type.

A recent randomised controlled trial found that experienced developers using AI tools actually took 19% longer to complete complex tasks. The real time sinks are non-coding activities: meetings, documentation, debugging complex systems, and navigating organisational politics.

Phil Parker examines these bottlenecks through the lens of the theory of constraints in his recent blog post on the challenge (and need) of measuring developer productivity.

Duncan Brown, CTO for Digital Prevention Services Portfolio at NHS England, recently wrote a compelling blog post entitled ‘Team dynamics after AI’, in which he speaks to the real-world experience of running teams seeking the benefits of AI amplification.

He states “if you [incorrectly] understand much of the work of the team to be making artefacts then making 10 times more artefacts is undeniably good”, then goes on to recount:

“I led a project which took full advantage of AI’s production tools, only to run heavily and terminally aground on organisational governance and confused expectations in the department with which we were engaging… there was literally no quantity of artefacts that would have helped… all of the real “people problems” still exist.

And this is finally proof positive, hard evidence, that artefacts are not really what we’re here for. AI is telling you in quite a painful way that being able to get feedback on your artefacts is much, much more important than the artefacts themselves.”

With our attention drawn to the perceived software development productivity gains promised by AI, we risk losing sight of what really matters. Solving the right problem, and then building the right thing, matters just as much now as it did before. Perhaps even more so, given how far and how fast AI can take a solution in the wrong direction.

The path forward: a shared language of value

We need a deliberate shift in our conversations and practices: engaging our clients at the first opportunity and co-creating a shared understanding of value, grounded in what success looks like for their services.

It’s not about finding a new metric; it’s about adopting a new mindset.

Many consultancies emphasise delivery velocity. We believe true value requires a broader lens – one that connects technical excellence with business and user outcomes from the very start.

First, we must start with the outcome. Before a single line of code is written, we need to align with our clients on the “job of the service.” This means using tools like Service Blueprints to map the entire end-to-end journey, not just the digital touchpoints. This forces us to see the system holistically – across user interactions, staff processes, and technical enablers – and ensures every decision is anchored to the overall purpose.

Then, we must connect all work back to the customer. Every technical decision, every sprint, every DORA metric should ultimately be justifiable in terms of how it contributes to the higher-level service outcomes we defined at the start. This is how we bridge the gap between building things right and building the right things.

Conclusion: value is a dialogue

A technically perfect dashboard that nobody uses isn’t a success; it’s a monument to misaligned assumptions. The same is true for any perfectly engineered system. Its true value is not in its technical elegance, but in the human or business outcome it enables.

Achieving this requires us to move beyond our silos and create a shared language of value, one that integrates engineering excellence with a deep, empathetic understanding of the user’s world. It requires a dialogue where every perspective is heard and aligned toward a common vision.

  • How does your organisation define value?
  • Are your delivery metrics connected to the outcomes that matter most to your customers?
  • If you’re struggling to align technical delivery with business impact, or if you’re curious about how service design thinking can help bridge this gap, we’d welcome the conversation.


About the author:

Russell Beech is a Principal Consultant and Service Designer at Equal Experts, focused on helping organisations create technology transformations that deliver meaningful outcomes for users and businesses alike.

