
Doing a Best Practices Scan?

Leave it for a later phase of your project.

December 11, 2018

We have all read them, and many of us have conducted them. An industry best practices scan is where many of us start a project that seeks to solve a specific problem.

Looking at what others have done when facing problems similar to yours seems to make a lot of sense. Yet, starting with industry scans may not be optimal if you’re looking to innovate, deliver on time/budget, and manage your team’s morale.

The aim of this post is not to discredit a trusted approach to problem-solving, but rather to contemplate an alternative timing and role for it in the project timeline.

Why We Like Them and Readily Take Them On

Don’t quote my science here – but this is how I see the fundamental mechanics of our desire to problem-solve through best practices emulation:

  • When facing a problem, the impulse to find out whether someone else has already solved it is natural, and even efficient from a cognitive perspective (think of Daniel Kahneman’s description of the “Lazy System Two” part of your brain).
  • The preference for a ready-made solution may be amplified by what is known as the Worse-Than-Average Effect – a tendency to underestimate one’s capabilities relative to others. If we have a problem that we can’t readily solve, then surely someone out there has already cracked it, right…?
  • And then there’s the internet, with its promise of instant answers, which seem like they’re only a few clicks away. We may even have an example or two in mind, which makes us think that more should be readily available.

The cumulative effect of these three factors is that even industry veterans with deep sector knowledge readily agree to research how other governments, utilities, or regulators deal with a given problem. Yet their findings are often far from ground-breaking, particularly relative to the time and effort spent.

What We Miss Out On in the Process

Starting a best practices scan is easy. It is, however, more difficult to stop. When do you know that you’ve done enough? Even if you find some examples – do they really amount to “best” practices – or are they just “most readily available practices”? It’s always hard to tell.

The situation is worse when one does not find many relevant examples. Returning to a superior claiming that nothing of relevance is “out there” reeks of ineptitude or lack of diligence. If a superior knows of a single example not covered in your results (whether or not it is real, relevant, or both), the credibility of your research is more or less compromised.

So… you research some more, spending more time and resources that you could have invested in active problem-solving – that is, looking to resolve the issue at hand in the relevant context of your operating environment, devising an original solution.

As you uncover examples from other jurisdictions, you make assumptions that may or may not hold, are often tempted to stretch the relevance of things that only marginally fit the bill, and spend mental energy contemplating examples that may be relevant but lack context. You proceed until you generate a respectable list – most often measured in volume rather than quality.

Beyond the “operational” struggles of best practices research, its danger stems from the reality that most of us are doing it. As we all look to others for answers to common problems, everyone is trudging along, but no one is innovating. When major shifts do happen (think RIIO), we anchor ourselves on them, and then spend more time adapting their most relevant aspects to our own environments.

To be sure, adapting best practices from elsewhere is not always a “square peg / round hole” scenario. But it’s rarely anything resembling a slam-dunk. Meanwhile, the time, money and cognitive energy poured into the best practices churn are diverted from devising a homegrown solution to a “local” problem.

In the process, best practices quickly become common practices – known in competitive market settings by the unflattering term “commodity”: things devoid of originality and unlikely to be perceived as valuable by customers.

Another Way to Deploy Best Practice Scans

There is, however, one context in which jurisdictional practice scans can be very useful – and that’s when they are employed as short-spurt “sanity checks” for original solutions, programs or policies further along their development path. In this case, a team / organization first develops a “homegrown” solution that fits a given problem’s specific dynamics.

When the concept is beginning to take shape, the team can deploy an industry scan to verify the baseline validity of their intended approach and/or see whether it has any major unaccounted-for vulnerabilities. In fact, these shorter, focused scans can be used at several junctures – so long as the first instance occurs after an original idea has emerged.

In this manner, the team is less likely to anchor itself on the low-hanging fruit of “most readily available practices” and may be more efficient in its subsequent scans – using more specific search terms reflective of its refined understanding of the problem. Having a potential solution in hand also makes lengthy scans less likely, since the fear of missing something is no longer driving the search.

Granted, a best practices search of the “verification” variety must be sufficiently thorough to ensure that the team looks far enough past the solution at hand. There are, however, ways to account for this risk – such as framing the research as an explicit attempt to uncover vulnerabilities in the homegrown approach.

In the end, if the project scope calls for a best practices scan, it may be worth placing it further along the critical path – as an ongoing validation tool rather than an idea generator. Doing so enables researchers to focus on developing a range of their own solutions, informed by the context of their specific issue. Having done this, scanning the industry landscape for relevant examples would amount to stress-testing an idea – not defining it.

Even if a subsequent search reveals solutions identical to the one you devised, the thinking you did in developing it on your own should advance you further along the implementation path, in the manner most relevant to your circumstances.

Ultimately, jurisdictional best practices scans arise out of a need to solve a local problem. If the problem is real, it requires a resolution – one that need not be anchored on whether and how others may have dealt with it. It simply needs an effective and workable resolution.

Using best practices scans to validate an original concept once it is developed – rather than treating them as a means of finding a solution at the outset – preserves the scans’ fundamental value as a comparative tool. It also helps researchers innovate, by steering them clear of anchoring on ideas adopted elsewhere.

Might be a practice worth trying.