About a month ago I had a fantastic meal in a restaurant in central London for an anniversary. It was slightly expensive, but absolutely worth it – I would recommend that place to friends.
Last week I went to the corner shop and bought milk. There was nothing wrong with the experience, but I wouldn’t recommend either the shop or the milk. Why? Because none of my friends want that recommendation.
This is a small but vital part of the problem with Net Promoter Score (NPS). If you’re familiar with NPS, you know the drill: after an interaction, the customer receives a survey asking “How likely is it that you would recommend our service to your friends?”
Respondents are split into promoters, passives, and detractors on a 0-10 scale, and the results tend to correlate (more or less) with customer satisfaction.
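The scoring rule behind that split is simple enough to sketch in a few lines. This is a minimal illustration of the standard NPS calculation (promoters score 9–10, detractors 0–6, and the score is the percentage of promoters minus the percentage of detractors); the sample responses are made up:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).

    Passives (7-8) count towards the total but not towards the score.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Ten hypothetical survey responses: 6 promoters, 2 passives, 2 detractors
print(nps([10, 9, 9, 8, 7, 6, 10, 3, 9, 10]))  # → 40
```

Note that the two passives (the 7 and the 8) dilute the denominator but otherwise vanish from the result.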
What the results don’t correlate with is whether or not the customer will really promote the brand. The reason there’s no significant correlation is that consumers only really discuss and promote a few kinds of product.
Movies, restaurants, apps – yes. Financial institutions, energy companies, telecoms providers – not so much.
But it must be better than nothing?
Ok, you might be saying, so NPS doesn’t really align with word of mouth promotion – it’s still telling us something useful, right?
There are two ways to answer that. The first answer is, if you’re putting resources into surveys, you should know what they’re really measuring. NPS might tell you something about customer satisfaction, and it might tell you something about word of mouth. Maybe.
The second answer is that the score itself is nearly useless. The 11-point scale NPS uses has notoriously unsteady predictive power, and it treats a customer who scores ‘0’ just the same as one who scores ‘6’ – both are simply ‘detractors’.
There’s also this issue:
Which of these three restaurants would you rather eat at? Which would you rather own? If we go by NPS alone, these restaurants look exactly the same, but they’re clearly providing different experiences to their customers.
Let’s find a credible alternative
So those are the issues – NPS does a bad job of measuring something it also fails to define. What’s the opposite of that?
Rather than ask hypothetical questions about the future, Customer Effort Score asks customers about an experience they’ve already had. When dealing with your business via IVR or an advisor, what’s important to them? They want it to be quick and easy. So what’s the best measure of success? It’s effort.
A few years ago the Harvard Business Review conducted research on the impact of ‘Customer Delight’ initiatives in service. They found that while 89% of businesses were trying to go above and beyond for their customers, only 16% of the customers had noticed.
That tells me consumers are less interested in the bells and whistles than in the consistency of the core service. It tells me that – all together now – they want it to be quick and easy.
Have you ever thought about the way you measure customer experience? If you want to learn more about implementing Customer Effort Score, drop us a line today and we’ll tell you more.