The Dawn of the Era of “Human-assisted Machine Service” in Customer Care

For the longest time, we have looked at how to complement human labor in the contact center with computer programs – to reduce costs, provide 24×7 service, and offer quicker access to basic information. IVR systems are the prime example of technology that complements human service, pre-qualifying a caller through a few simple questions and routing them to the right agent. Until recently, though, customers usually preferred the “human touch” over automation. The terms “automation,” “bots,” and “IVR” tend to carry heavily negative connotations. TV ads mock bad IVR systems and tout short wait times to reach live operators. Websites have been launched to show consumers the quickest way to reach an operator.

As another example of how technology is complementing human labor, let’s look at the agent workplace itself. In the contact center, agents are equipped with Internet access, dedicated knowledge bases, document management systems, CRM systems, and more, to have the answer to the customer’s question ready as quickly as possible.

All of this, however, is slowly changing. Rather than having machines assist humans, we are slowly entering the era of the reverse: algorithm-based customer service, assisted by humans who put the “finishing touches” on an otherwise increasingly impeccable experience. Virtual assistants on websites have recently been gaining popularity because they expose a more natural interface (conversational, “spoken language” style) for finding information than manual search. The younger demographic prefers texting to calling, and older generations seem to be catching up and agreeing. We now carefully send a text first to check whether it is OK to call. What a remarkable change in behavior and expectations!

Why is this change happening?

First and foremost: the mobility revolution. The introduction of the iPhone in 2007 truly brought the mass consumer market to realize the power of computer technology applied to daily life. For the first time ever, the Internet – whose user-facing form, the World Wide Web, came into being around 1989 – found its way into a pocket-sized device without compromising the user experience. More and more people realized that with a mobile device of this form factor, they had the world of information and communication literally at their fingertips. It suddenly became cool to be a nerd – a person who understands and can program computers. Consumers now seem to shout “give me an app, I can look this up quicker than your agents!” at any company they do business with.

Other enhancements came with the iPhone and Apple’s insistence on quality user experiences. Siri, Apple’s speech assistant, became possible only once general-purpose data connectivity and sufficient bandwidth allowed overcoming the restrictions of the PSTN (public switched telephone network) in terms of the sound frequency range it transmits. Classic telephony samples audio at 8 kHz and therefore cannot carry anything above 4 kHz (in practice, the usable passband ends near 3.4 kHz) – yet sounds such as “s,” “f,” or “th” are distinguished precisely by energy in those higher frequency ranges. Siri can transmit the full range of an utterance recording in high definition, which improves speech recognition accuracy. Cloud computing does its share by quickly responding with a transcription of what was said, to which Siri then applies semantic analysis to truly “understand” the user – at least to the extent that it can perform the operation the user asked for.
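The bandwidth point follows directly from the Nyquist sampling theorem: audio sampled at rate fs can only represent frequencies below fs/2. A minimal sketch, using the standard narrowband and wideband telephony sample rates:

```python
def nyquist_ceiling_hz(sample_rate_hz: float) -> float:
    """Highest frequency representable at a given sample rate (Nyquist limit)."""
    return sample_rate_hz / 2

# Classic PSTN codecs sample at 8 kHz, so nothing above 4 kHz survives;
# the usable passband is even narrower in practice (roughly 300-3400 Hz).
narrowband_ceiling = nyquist_ceiling_hz(8_000)

# App-based capture ("HD voice") commonly samples at 16 kHz or more,
# preserving the high-frequency energy of fricatives like "s" and "f".
wideband_ceiling = nyquist_ceiling_hz(16_000)

print(narrowband_ceiling, wideband_ceiling)  # 4000.0 8000.0
```

This is why an app-based assistant hears strictly more of the signal than any system sitting behind an ordinary phone line.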

Big Data and today’s capability to automatically harvest large quantities of data help make systems such as Siri ever more accurate. The formula is simple: the more context you have, the better you can understand someone. While this is most obviously true for pragmatic context (i.e. dialog history and “world” knowledge), it matters even in the phonetic domain alone. As an example: try to understand a few words of a spoken conversation by hearing only a second or so of what was said. Without any context at all, you will find it is not that easy. If, however, you know the domain and topic of the conversation and have heard the immediately neighboring words, your brain uses deduction and other mental techniques, in addition to pure “hearing,” to understand an utterance.
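The effect can be sketched with a toy example: two transcription hypotheses that sound alike are rescored by how plausible their word sequences are. The bigram scores below are invented purely for illustration, but real recognizers combine acoustic and language-model evidence in essentially this way:

```python
# Toy bigram "language model": invented plausibility scores standing in
# for statistics a real system would learn from large text corpora.
BIGRAM_SCORES = {
    ("recognize", "speech"): 0.9,
    ("wreck", "a"): 0.3,
    ("a", "nice"): 0.5,
    ("nice", "beach"): 0.4,
}

def sequence_score(words):
    """Multiply bigram plausibilities; unseen pairs get a small floor value."""
    score = 1.0
    for pair in zip(words, words[1:]):
        score *= BIGRAM_SCORES.get(pair, 0.01)
    return score

# Two acoustically similar hypotheses for the same stretch of audio:
h1 = ["recognize", "speech"]
h2 = ["wreck", "a", "nice", "beach"]

best = max([h1, h2], key=sequence_score)
print(" ".join(best))  # context favors "recognize speech"
```

The acoustics alone cannot separate the two candidates; the surrounding words do.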

What does all of this have to do with customer service?

Recently, companies and technologies have emerged that invert the paradigm of machines helping humans become better. There are companies in the field of IVR technology that run call centers of people who do nothing but listen to what callers tell IVR systems. They never talk to the callers directly; they jump in only when automated speech recognition cannot tell what a caller said, or is not confident it understood the caller properly (speech recognition engines always associate a confidence score with a recognition hypothesis). Upon hearing an utterance, they either quickly type in what they heard, so that the IVR system can move forward, or they click on predefined data items (the so-called “semantic interpretation” of a verbatim utterance) expected in the current dialog step. This is a case of “human-assisted machine service” in the field of customer service that is an amazing testament to the change taking place.
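The workflow described above boils down to a confidence-threshold dispatch. The threshold value and the helper function here are hypothetical stand-ins, but the control flow mirrors the description: accept the recognizer’s hypothesis when it is confident, otherwise hand the audio to a human, invisibly to the caller:

```python
CONFIDENCE_THRESHOLD = 0.8  # hypothetical cutoff; real systems tune this per dialog step

def ask_human_transcriber(audio: bytes) -> str:
    """Stand-in for the human-in-the-loop step: an agent listens and types
    the verbatim text, or clicks a predefined semantic interpretation."""
    return "checking account balance"  # placeholder agent response

def interpret_utterance(audio: bytes, hypothesis: str, confidence: float) -> str:
    """Return a usable interpretation, falling back to a human below threshold."""
    if confidence >= CONFIDENCE_THRESHOLD:
        return hypothesis                 # machine is confident: fully automated
    return ask_human_transcriber(audio)   # machine unsure: human assists silently

# The caller never knows which path was taken:
print(interpret_utterance(b"...", "checking account balance", 0.95))
print(interpret_utterance(b"...", "chicken casserole ballots", 0.35))
```

Either branch feeds the same answer back into the dialog, so the IVR simply moves forward.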

After its great success on TV’s “Jeopardy!,” IBM released Watson to developers to build applications that use Watson’s unique cognitive capabilities in creative new ways. A prime use case for Watson, however, is customer service. When done right, Watson can engage with customers, say through chat on a website, as if the customer were talking to a live person. Watson doesn’t merely bring up Web pages that seem to contain the information you are asking for – it answers your question. Google increasingly does something similar in its core search product (try searching for “who wrote Norwegian Wood,” and Google will answer the question directly, in addition to showing relevant websites). Watson goes beyond Google, though, in that it can ask follow-up questions to narrow down your query and lead you to the right answer. It can deduce. It can learn. Like a child absorbing everything, or a very astute student. Most importantly: Watson learns from unstructured data, i.e. data expressed in human language such as English. That’s a new level of computing, beyond plain big data analysis.

With Watson, humans again take a step back from the spotlight and operate “behind the scenes.” They need to feed Watson information, constantly. Watson doesn’t go out by itself to learn. It needs to be fed product brochures, manuals, data sheets, research papers, books, etc. – anything relevant to the domain of knowledge Watson is operating in.
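That division of labor can be sketched generically. This is not Watson’s actual API – just a minimal, hypothetical keyword index – but it captures the split: humans curate the corpus, the machine answers from whatever it has been fed:

```python
class KnowledgeBase:
    """Minimal keyword index illustrating the human-curated corpus idea.
    Purely a sketch; real systems like Watson do far more."""

    def __init__(self):
        self.documents = []  # (title, text) pairs curated by humans

    def feed(self, title: str, text: str) -> None:
        """The human role: keep the corpus accurate and up to date."""
        self.documents.append((title, text.lower()))

    def answer(self, question: str) -> str:
        """The machine role: answer from the best-matching document
        (here: crude substring overlap with the question's terms)."""
        terms = question.lower().split()
        def overlap(doc):
            return sum(term in doc[1] for term in terms)
        best = max(self.documents, key=overlap, default=None)
        return best[0] if best else "No documents fed yet."

kb = KnowledgeBase()
kb.feed("Router setup guide", "how to configure your wireless router and wifi password")
kb.feed("Billing FAQ", "questions about invoices billing cycles and refunds")
print(kb.answer("how do I change my wifi password"))  # -> Router setup guide
```

Nothing gets answered that was not first fed in – which is exactly the new human responsibility the paragraph describes.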

This is the emerging new role of humans in customer service: make sure the data is accurate, but let machines do the “talking” and “serving.” Humans then step in when the “human touch” is really needed: not to answer the simple questions, but to mediate in complex situations, calm down angry customers, and provide a level of confidence and confidentiality when needed, e.g. in financial advising.

It’s going to be exciting to see what the limits are.