AI in the Contact Center – Natural Language Understanding
- Posted by: Donna Penwell
People use Artificial Intelligence (AI) more than ever to do things like verbally place orders for goods and services or ask for directions to an unfamiliar place. With the introduction of voice assistants like Amazon's Alexa and Google Home, we're seeing the commercial use of AI more and more.
AI also continues to transform the way businesses provide a better customer experience in their Contact Centers. Organizations are also looking to Natural Language Understanding (NLU) applications like IBM's Watson to do things like help medical providers collect and interpret clinical data from patients.
But how does it all come together? What is it that allows you to issue a command and get back what you need from an AI-enabled application?
How Do Computers Understand Us?
A machine uses its own language for processing commands and performing tasks. To interact directly with people, it needs a way to turn human language into a syntax the computer can read. This is where natural language processing (NLP) comes in.
Let’s say you chat with Alexa and issue a voice command. NLP takes those words and parses them into machine language. You then get a response within seconds that answers your question.
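As a rough illustration, here is a minimal Python sketch of that first parsing step: turning a raw transcript into normalized tokens a machine can work with. Real assistants use trained speech-to-text and language models; this sketch is illustrative only.

```python
# Minimal sketch of the NLP parsing step: raw transcript -> normalized tokens.
# Real systems use trained speech and language models; this is illustrative only.

def parse_utterance(transcript: str) -> list[str]:
    """Lowercase the transcript, strip punctuation, and split it into tokens."""
    cleaned = "".join(ch for ch in transcript.lower() if ch.isalnum() or ch.isspace())
    return cleaned.split()

tokens = parse_utterance("Alexa, give me directions to San Francisco.")
print(tokens)  # ['alexa', 'give', 'me', 'directions', 'to', 'san', 'francisco']
```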
Is that it? Not even close. Alexa at this point has the words from your query, but no idea what you’re asking for or why. Think of NLP as your smartphone. From the outside, you see the sleek design that presents visual and audio confirmation of your requests. But inside there’s a lot more going on.
How Does It Work?
If NLP is your new smartphone, natural language understanding (NLU) is the engine inside working hard to keep things going. NLP captures your command and turns it into machine language. But the machine still has no idea what you actually want.
NLU takes that machine language and uses various algorithms to parse out your actual intent. Let's say you're in a chat with Alexa. You ask, "Alexa, give me directions to San Francisco." NLU breaks it down the following way:
- Directions [intent]
- San Francisco [location]
Once it understands that you need directions to San Francisco, it can take things further. It knows it needs to gather your location from your device and search for the sources that most closely match your query.
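To make that breakdown concrete, here is a toy, rule-based Python sketch of intent and slot extraction. The intent name get_directions and the keyword rules are illustrative assumptions; production NLU relies on trained statistical models, not hand-written rules like these.

```python
# Toy rule-based NLU: map normalized tokens to an intent and a location slot.
# The intent name and keyword rules are illustrative assumptions, not how
# Alexa actually works.

def understand(tokens: list[str]) -> dict:
    intent = "get_directions" if "directions" in tokens else "unknown"
    location = None
    if "to" in tokens:
        # Treat everything after the last "to" as the destination slot.
        idx = len(tokens) - 1 - tokens[::-1].index("to")
        location = " ".join(tokens[idx + 1:]).title()
    return {"intent": intent, "location": location}

print(understand(["give", "me", "directions", "to", "san", "francisco"]))
# {'intent': 'get_directions', 'location': 'San Francisco'}
```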
This is where NLP kicks in again. NLP still has no idea what you want or why you asked for it, but it knows the result needs to be delivered in a way you understand. It takes that information, turns it into a verbal or visual response, and presents it back to you in human language.
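That last step can be as simple as filling a response template with NLU's structured result. A minimal sketch, assuming a get_directions intent and a distance figure that a real system would fetch from a maps service:

```python
# Minimal sketch of the response step: turn NLU's structured result back into
# human language. Real assistants generate far richer spoken and visual output.

def respond(result: dict, distance_miles: float) -> str:
    # distance_miles is a placeholder for data a real system would fetch
    # from a maps service using the device's location.
    if result["intent"] == "get_directions" and result["location"]:
        return f"{result['location']} is {distance_miles:.0f} miles away. Starting directions now."
    return "Sorry, I didn't catch that."

print(respond({"intent": "get_directions", "location": "San Francisco"}, 42))
```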
In the Future?
NLU is the heart of these AI systems. NLP facilitates interactive voice response between humans and machines and allows NLU to receive the information it needs to interpret and respond to human commands.
The continuing challenge for computer scientists and programmers is designing NLU algorithms that adequately interpret and understand the variations in human language. To do that, AI must continuously interact with humans so it can learn and grow as our language does.
Chrysalis can help you incorporate AI in your Contact Center. Call or write us: http://www.chrysalis.net/contact/