After a week of startlingly frequent and frustrating customer service encounters, I’ve wondered: to what extent do customer self-service metrics measure what matters to customers? There are many metrics for assessing the performance of online self-service: average visits per customer, average page views, online resolution rate, number of zero-results searches, customer feedback on FAQ quality … But how well do these measure whether a customer is satisfied? Let’s think about what customers want:

  • Fast resolution. No one wants to spend unnecessary time finding an answer to a question. In fact, 57% of online consumers tell us that they are very likely to abandon an online purchase if they can’t find quick answers to their questions (North American Technographics Customer Experience Online Survey, Q4 2009 [US]).
  • Quality/accuracy of resolution. We all want accurate and thorough answers to our questions. Incomplete answers, failure to guide us to the next step, or wrong information are all disappointing.

Performance metrics can reasonably assess the two consumer needs above. But the next two consumer desires are harder to measure – and more important in making the difference between a loyal customer and one who will never return or recommend you:
  • Effort needed to resolve. Most companies look at metrics to determine whether an issue was resolved, but few consider how much effort the resolution required. I had to return a product under warranty earlier this week and needed to know the process for doing so. I logged in and searched the company’s help pages but got zero results. The information in my product insert was incomplete and inaccurate, and it took three calls to the call center to find someone who could provide the information I needed. It should have been a quick process; instead, it took a disproportionate amount of effort to find out where to send the returned item.
  • Gaining a positive/avoiding a negative emotional response. The emotional element is directly related to all of the above. In my example, I wanted to feel confident about the information I was receiving and hoped for a little sympathy from someone who would apologize for the defective product. Instead, I felt irritated, that my time was taken for granted, and uncertain that I was getting accurate information. If the company I was dealing with had asked me about my online or overall service experience, they would have learned that the warranty insert listed an incorrect phone number and omitted a return mailing address, that the information was not available on their website via either search or FAQs, and that there was confusion among call center reps about a co-branded product. They could have been alerted to an opportunity to improve online self-service content that would deflect calls from the call center, reduce resolution time, and improve my satisfaction. Alas, they didn’t ask.

Inadequate metrics are not an unfamiliar issue for US retailers: only 24% believe that they have the best metrics in place to measure their customer service activities, and only 24% have a system in place to analyze structured and unstructured customer feedback (based on Forrester's Q4 2009 US Retail Executive Online Survey). eBusiness professionals must be able to determine the success of their online self-service and cross-channel customer service. Here are some suggestions:

  • Explore what analytics are available to you. Talk to your vendor about your business needs and wish lists – there may be more data available than you think.
  • Ask customers a yes-or-no question: “Did this FAQ/search/recommendation solve your problem?” Only 23% of web sites in Forrester's July 2009 eBusiness Customer Service And Support Benchmarks invited customers to provide feedback on how effectively a question was answered – a huge missed opportunity to improve answer relevance.
  • If you require a log-in to access self-service, consider surveying your customers. The key here is to be prompt – don’t wait more than 24 hours, or your customers’ memories will fade and a delayed effort may seem half-hearted.
  • Use analytics to determine how many customers used self-service but had to reach out for additional support through another channel. Forrester’s Cross-Channel Review can help you examine the transitions between online self-service, email, interactive voice response (IVR), and telephone agents.
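The yes-or-no feedback question above translates directly into a per-answer resolution rate. Here is a minimal sketch of that calculation; the event shape, FAQ identifiers, and function name are my own assumptions for illustration, not any particular vendor's analytics API:

```python
# Hypothetical sketch: compute a per-FAQ resolution rate from
# "Did this answer solve your problem?" yes/no feedback events.
from collections import defaultdict

def resolution_rates(feedback_events):
    """feedback_events: iterable of (faq_id, answered_yes) pairs."""
    yes = defaultdict(int)
    total = defaultdict(int)
    for faq_id, answered_yes in feedback_events:
        total[faq_id] += 1
        if answered_yes:
            yes[faq_id] += 1
    return {faq_id: yes[faq_id] / total[faq_id] for faq_id in total}

# Illustrative data only.
events = [
    ("warranty-returns", False),
    ("warranty-returns", False),
    ("warranty-returns", True),
    ("reset-password", True),
]
rates = resolution_rates(events)
# "warranty-returns" resolves only 1 of 3 times, flagging that
# answer as a candidate for rewriting.
```

Even this crude ratio, tracked over time, shows which answers are actually solving problems and which are quietly sending customers to the phone.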
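The cross-channel suggestion can likewise be sketched as an escalation-rate calculation: the share of self-service visitors who still contacted another channel within a follow-up window. The 24-hour window, data shapes, and names below are assumptions for illustration, not a prescribed methodology:

```python
# Hypothetical sketch: share of self-service visitors who escalated to
# another channel (phone, email) within a follow-up window.
from datetime import datetime, timedelta

ESCALATION_WINDOW = timedelta(hours=24)  # assumed window, tune to your business

def escalation_rate(self_service_visits, contacts):
    """self_service_visits: {customer_id: visit_time};
    contacts: list of (customer_id, contact_time) from other channels."""
    escalated = set()
    for customer_id, contact_time in contacts:
        visit_time = self_service_visits.get(customer_id)
        if visit_time is not None and \
                timedelta(0) <= contact_time - visit_time <= ESCALATION_WINDOW:
            escalated.add(customer_id)
    return len(escalated) / len(self_service_visits)

# Illustrative data only: one of two visitors later called the call center.
visits = {
    "alice": datetime(2010, 1, 4, 9, 0),
    "bob": datetime(2010, 1, 4, 10, 0),
}
calls = [("alice", datetime(2010, 1, 4, 11, 30))]
rate = escalation_rate(visits, calls)  # 1 of 2 visitors escalated
```

A falling escalation rate after a content fix is the kind of evidence that the warranty-return story above never produced, because nobody was measuring it.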