How To Measure What Matters

Typical product metrics

A typical organisation might track the following product management metrics:

  • Number of new users
  • Net promoter score
  • Retention rate

These metrics tell you something about the product. When tracked over time, they tell you if these measures of the product are improving. It is a good thing to be measured and held accountable for improving product results.

However, these metrics don’t tell you the single most important thing you need to know: how to improve.

To achieve results, you must first understand what causes the results.

Drivers vs results

An effective way to get a handle on this is to group metrics (KPIs) into categories – drivers and results.

Drivers are the cause of results. They are human behaviours. You can quantify a driver by measuring the health of selected processes or methods: things like the amount of WIP, cycle time and defects found before release.

Results come from optimising the drivers: things like the number of new users, net promoter score and retention rate.

Consider an initiative to migrate product modules to a new UI. Separating the drivers from the results helps identify where to focus improvement efforts.
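As a minimal sketch of this grouping (the KPI names and values below are hypothetical, chosen to match the UI-migration example rather than taken from any real tool), one way to tag each metric as a driver or a result and report the drivers first might look like this:

```python
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    DRIVER = "driver"   # human behaviours / process health
    RESULT = "result"   # outcomes that follow from the drivers

@dataclass
class Kpi:
    name: str
    category: Category
    value: float

# Hypothetical snapshot for the UI-migration initiative
kpis = [
    Kpi("modules in progress (WIP)", Category.DRIVER, 7),
    Kpi("cycle time per module (days)", Category.DRIVER, 12),
    Kpi("defects found before release", Category.DRIVER, 4),
    Kpi("new users per month", Category.RESULT, 1800),
    Kpi("net promoter score", Category.RESULT, 31),
    Kpi("retention rate (%)", Category.RESULT, 86),
]

# Report the drivers first: they are where improvement effort goes
for category in (Category.DRIVER, Category.RESULT):
    print(category.value.upper())
    for kpi in (k for k in kpis if k.category is category):
        print(f"  {kpi.name}: {kpi.value}")
```

Separating the two lists makes the point of the next section visible at a glance: the result numbers are the scoreboard, while the driver numbers are the levers you can actually pull.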

Summary: Measure drivers over results

Focus your improvement efforts on drivers, not results. If you identify and improve the true drivers, improvement in the results will follow.

Are You Afraid?

Fear!

It might seem like an odd question to ask a Product Manager, but here goes … Are you afraid? If you’re working on a non-trivial, interesting product, of course you are! If you’re not at least a little afraid, you’re probably not being adequately challenged.

A better question to ask is: how afraid are you (or should you be!) about your organisation’s intentions for your product? And what impact does this have on the best ways to steer product efforts? What lessons have already been learnt about how to effectively deal with and manage this kind of fear?

Size & uncertainty

It’s common knowledge that agile methods came about because, on most technology projects, scope is most uncertain at the beginning, and that uncertainty only reduces through continuous learning as the project progresses.

One way to understand the impact this has on the optimal approach is to consider the range of all possible product initiatives on a scale of uncertainty. The less uncertainty there is, the more you can lean towards waterfall or step-by-step phase-gated project approaches. Conversely, agile methods come into their own for initiatives with more uncertainty. It’s a spectrum of uncertainty: as uncertainty increases, agile methods work better.

Size1.jpg

Equally important is the project management approach and rigour required. A single project (eg. building a customer invoicing product) requires a different management approach to a portfolio of projects (eg. building the modules that make up an ERP system). This is a spectrum of project management approaches according to size: as size increases, more rigour is required.

Size2.jpg

But wait: project size is closely correlated with project complexity. Leading authors have called out size as the reason for complexity in large software projects, simply because as projects get bigger, interactions between elements of the system increase in a nonlinear way. From Fred Brooks’ 1986 classic No Silver Bullet:

…a scaling-up of a software entity is not merely a repetition of the same elements in larger sizes, it is necessarily an increase in the number of different elements. In most cases, the elements interact with each other in some nonlinear fashion, and the complexity of the whole increases much more than linearly.

Clearly uncertainty and size are correlated, but not perfectly so. When uncertainty is plotted against size, each product initiative has its own position on a matrix, and that position can be used to understand when it is appropriate to apply which techniques.

Methodology matrix v1

It is impossible to determine the exact uncertainty and size of a given initiative ahead of time. However, it is useful to consider the relative position of initiatives within your organisation’s context. This can inform the best product management approach to apply in each circumstance.

Size3.jpg

To apply this practically, consider some examples plotted relative to each other on a size/uncertainty chart:

Size4.jpg

Building a residential house is a relatively small initiative and it has low scope uncertainty (millions of houses have been built before). It is a single project and it is appropriate to use waterfall together with conventional project management techniques.

Staging the Olympic Games is a massive undertaking, consisting of many sub-projects. Yet it too has low uncertainty: after all, the Olympics has been run many times before and the format remains largely unchanged, so we pretty much know the requirements upfront. It is appropriate to use waterfall, but this time with the added discipline and rigour of programme and portfolio management techniques.

Most mobile applications are relatively small single projects, but often with high uncertainty. While the functional requirements may be known in full up front, the appropriate user experience (UX) can typically only be discovered with some level of trial and error. Agile methods are appropriate here, together with conventional project management.

A technology start-up is likely to have very high scope uncertainty and potentially unbounded size. We know that agile methods are appropriate to help deal with the scope uncertainty, and for all but the simplest examples some form of portfolio management is called for.

Size5.jpg
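As a minimal sketch of how such a matrix might be used (the 0.5 cut-offs, the size/uncertainty scores and the approach labels below are illustrative assumptions, not part of any formal method), an initiative’s relative size and uncertainty can be mapped to a broad approach like so:

```python
def suggest_approach(size: float, uncertainty: float) -> str:
    """Map relative size and uncertainty (both 0.0-1.0) to a broad approach.

    The 0.5 cut-offs are arbitrary illustrations; in practice you would judge
    initiatives relative to each other within your organisation's context.
    """
    big = size >= 0.5
    uncertain = uncertainty >= 0.5
    if not big and not uncertain:
        return "Waterfall with conventional project management (e.g. a house build)"
    if big and not uncertain:
        return "Waterfall with programme/portfolio management (e.g. the Olympic Games)"
    if not big and uncertain:
        return "Agile with conventional project management (e.g. a mobile app)"
    return "Agile with portfolio management (e.g. a technology start-up)"

# The four examples from the text, with illustrative relative scores
for name, size, uncertainty in [
    ("Residential house", 0.2, 0.1),
    ("Olympic Games", 0.9, 0.2),
    ("Mobile app", 0.3, 0.7),
    ("Technology start-up", 0.8, 0.9),
]:
    print(f"{name}: {suggest_approach(size, uncertainty)}")
```

The point of the sketch is the shape of the decision, not the numbers: placement on the matrix is always a relative judgement, made against the other initiatives in your portfolio.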

Methodology matrix v2

A second version of the matrix shows common methodologies and when they are relevant in terms of size plotted against uncertainty:

Size6.jpg

PRojects IN Controlled Environments (PRINCE2)/Project Management Body of Knowledge (PMBoK): These methods were designed to bring discipline to individual projects with low uncertainty. Both come with a significant body of knowledge and associated certifications. PRINCE2 was developed in the UK and is the most commonly utilised project management methodology in many industries and contexts. PMBoK was developed in the USA. PRINCE2 and PMBoK position themselves as complementary products – PRINCE2 as a “methodology” and PMBoK as a “standard”.

PRINCE2 Agile/DSDM: These are ways to apply the traditional single-project methods while simultaneously utilising agile methods to deal with uncertainty. PRINCE2 Agile combines agile principles with the discipline and governance of PRINCE2. The Dynamic Systems Development Method (DSDM) project lifecycle grew out of the Rapid Application Development (RAD) movement in the 1990s, a precursor to modern agile methods.

Managing Successful Programmes (MSP): MSP is a framework for breaking large and complex changes into manageable, inter-related projects. It defines a set of programme management principles and best practices and is designed to be adapted to specific circumstances. Notable MSP users include the London Olympics and the UK Ministry of Defence.

Management of Portfolios (MoP): MoP was developed as a way to scale project management disciplines to the management of multiple programmes. It focusses on the needs of decision makers at senior levels and how to sustain progress across a range of parallel initiatives. MoP extends the programme and project management methodologies of MSP and PRINCE2 by emphasising change management and organisation-level benefits realisation, rather than individual projects or programmes.

Scaled Agile Framework (SAFe) / Large Scale Scrum (LeSS) / Scrum@Scale / Disciplined Agile Delivery (DAD): These methods are appropriate for large projects with high uncertainty; the idea is to get the benefits of agile methods but at scales beyond single teams.