Building Bots with Natural Language Processing

If you talk to a man in a language he understands, that goes to his head. If you talk to him in his language, that goes to his heart.

— Nelson Mandela

I would venture to guess that most people had their first encounter with natural language processing (NLP) when Apple added Siri to the iPhone. Starting with the iPhone 4S, you could ask “her” simple questions such as “Who was the 12th president of the United States?” (Zachary Taylor) and “Will you marry me?” (We hardly know one another). Personally, I use Siri on a near daily basis for getting me to where I need to go and finding the best Indian, Thai, or Mediterranean restaurant once I arrive there.

Of course, NLP isn’t limited to iPhones today. You can now talk to your Android devices, and contact centers are increasingly adding automated “Tell me what you are calling about” functionality. It’s not out of the realm of possibility to envision a world where typing becomes as old-fashioned as rotary telephones and stick shifts.

In order to help you bridge the gap from NLP-confused to moderately dangerous, I would like to take you deeper into the technology and give you a few examples of how developers are creating NLP solutions using some of today’s most common platforms.

The Basics of Natural Language Processing

To understand NLP, it’s important to know what’s going on underneath the covers. While a detailed look at NLP is beyond the scope of this article, there are a few simple concepts that should supply most people with enough knowledge to consider themselves dangerous.

First, there is the intent. As the name implies, an intent is the intention conveyed by the user. For instance, “weather” is the intent of the question, “Will it rain today?”

You can classify intents into two groups. Casual intents are like small talk. Greetings such as “hello” and “goodbye” are casual intents. If I say “Hi” to a text bot, an appropriate response might be “What can I do for you today?” The same can be said for affirmative and negative responses — “Yes,” “Thank you,” and “Not today” fall into the casual intent category.

The second group, business intents, corresponds directly to the focus of the statement or conversation. “When will my package arrive?” would direct the NLP system to return a delivery date or send a tracking number.

The next big concept of NLP is entities. An entity is the metadata of an intent.

I like to think of entities as modifiers of intents. For example, the intent of “When will my package arrive?” might be ArrivalQuery. If I were to rephrase the query as “Will my package arrive today,” I would still have the intent ArrivalQuery, but “today” represents an entity I might call @Date. Valid values for @Date could be today, tomorrow, next Tuesday, or July 12th.
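To make the intent-plus-entity idea concrete, here is a minimal sketch in Python. Real platforms use trained machine-learning models rather than keyword rules, and the function and names here (parse_utterance, ArrivalQuery, the @Date values) are my own illustrative assumptions, but the shape of the output — an intent plus a bag of entities — is what all of these systems hand back.

```python
# Toy matcher: maps an utterance to an intent plus any @Date entity found.
# A real NLP engine does this with a trained classifier, not substring rules.
DATE_WORDS = ["today", "tomorrow", "next tuesday", "july 12th"]

def parse_utterance(text):
    lowered = text.lower().rstrip("?")
    # Crude intent detection: the word "arrive" signals ArrivalQuery.
    intent = "ArrivalQuery" if "arrive" in lowered else "Unknown"
    entities = {}
    for phrase in DATE_WORDS:
        if phrase in lowered:
            entities["@Date"] = phrase
            break
    return {"intent": intent, "entities": entities}

print(parse_utterance("Will my package arrive today?"))
# {'intent': 'ArrivalQuery', 'entities': {'@Date': 'today'}}
```

The point of the sketch is the output contract: downstream code never sees the raw sentence, only the structured intent and entities.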

The last major NLP concept is the dialog.  A dialog consists of the words returned following NLP processing.

For example, “When will my package arrive” might return a dialog of “Your package is scheduled to arrive on May 10th.”

In this case, the dialog was dynamically created based on the intent and any entities encountered during processing.  To satisfy the request, a backend database may have been queried to obtain shipping information.  A more sophisticated solution might involve IoT sensors placed in delivery trucks.

Some dialogs might be fixed. For example, “Hi” might always return “Good day to you. How might I help?”
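The static-versus-dynamic split can be sketched in a few lines of Python. This is a hypothetical illustration, not any platform’s actual API: the lookup function stands in for the backend database query mentioned above, and all the names are mine.

```python
from datetime import date

# Static dialogs: canned responses keyed by intent.
STATIC_DIALOGS = {
    "Greeting": "Good day to you. How might I help?",
}

def lookup_arrival_date(tracking_number):
    # Stand-in for a real backend query (database, carrier API, or
    # even IoT sensors in delivery trucks).
    return date(2019, 5, 10)

def build_dialog(intent, tracking_number=None):
    # Fixed dialogs are returned as-is.
    if intent in STATIC_DIALOGS:
        return STATIC_DIALOGS[intent]
    # Dynamic dialogs interpolate data fetched at request time.
    if intent == "ArrivalQuery":
        arrival = lookup_arrival_date(tracking_number)
        return f"Your package is scheduled to arrive on {arrival:%B %d}."
    return "Sorry, I didn't understand that."
```

Calling `build_dialog("Greeting")` always returns the same string, while `build_dialog("ArrivalQuery", "1Z999")` is assembled fresh from whatever the backend reports.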

Teach Me

Once you’ve designed your intents and entities, the next step is to train the solution. This requires you to ask a series of questions that the NLP system might encounter. For a weather bot, you might enter the following:

“Will it be sunny today?”

“Will it snow in Minneapolis today?”

“What are the chances of it raining tomorrow?”

“Is there snow in the forecast?”

“Do I need to bring an umbrella to work?”

These questions train the system in the many ways that a user might ask for the same information. A simple rule for training is that you can never provide the system with too much data. Additionally, training is not a one-and-done operation. A well-behaved NLP system must be trained and retrained throughout its entire lifecycle.

I can add entities to the above questions with phrases such as these:

“Will it be sunny @Date?”

“Will it snow in @Location @Date?”

Dialogs for intents must also be created before the solution is fully configured.  This may involve both static and dynamic dialogs.
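One way to picture what the platform does with those @-annotated phrases is to expand each placeholder across its valid entity values. The sketch below is my own simplification, with made-up entity lists; real tools generalize from annotated examples rather than enumerating them, but the multiplication effect is the same.

```python
# Annotated training templates, as in the examples above.
TEMPLATES = [
    "Will it be sunny @Date?",
    "Will it snow in @Location @Date?",
]

# Hypothetical entity values for illustration.
ENTITY_VALUES = {
    "@Date": ["today", "tomorrow"],
    "@Location": ["Minneapolis", "Chicago"],
}

def expand(template, values=ENTITY_VALUES):
    """Expand every entity placeholder into concrete sample utterances."""
    phrases = [template]
    for entity, options in values.items():
        if entity not in template:
            continue
        # Each phrase fans out once per entity value.
        phrases = [p.replace(entity, v) for p in phrases for v in options]
    return phrases
```

A template with one entity yields two utterances here; one with two entities yields four. That fan-out is why a handful of annotated phrases can stand in for a large amount of raw training data.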

NLP in Practice

There are a number of different development tools that can be used to create NLP solutions.  I have personally used Google’s Dialogflow, IBM Watson’s Assistant, and Amazon’s Alexa Developer.

Here are a few screenshots of the different tools.  Notice how they all implicitly or explicitly reference all the NLP concepts I wrote about.

Putting it to the Test

Once the flow/skill has been created and any connections to backend fulfillment systems are in place, the solution needs to be deployed. This generally involves integrating it into software and hardware that communicates with a user. I have deployed my solutions to Google Home and Alexa devices, WebEx Teams, Microsoft Teams, SMS text, and web chat. If done correctly, the same solution can run nearly unchanged on any of these user interfaces – regardless of whether the user types or speaks his or her questions.

The following is an example of my CloudHawk fleet management IoT bot integrated into Microsoft Teams.

To see (and hear) that same bot running as an Amazon Alexa skill, watch this very short video.

Mischief Managed

So, where is this technology useful? A better question might be, “Where isn’t it useful?” From help desks, to contact centers, to text bots, to whatever, processing language into actionable components is the future of communication. In fact, I will venture to say that within the next several years, it will be nearly impossible to know “who” you are talking to when you make that customer service call or chat. For better or worse, machines will become indistinguishable from humans.

If you plan on attending Avaya Engage this year, be sure to see my “Bot Building Basics” breakout session.  I will explain NLP and bots in more detail, while walking you through the steps to create your own bots.
