
A Starter's Guide to Building a Customer Service Chatbot

December 4th, 2017


Becca Le Blond



Automation is the future of customer service, and chatbots are leading the charge. They help cut costs, speed up resolutions to customer queries, and deliver better customer experiences (CX). Knowing how to make a chatbot is fast becoming an essential skill for any customer service contact centre that wants to survive.

We’ve collaborated with a few of our clients to produce several chatbots, and we want to share our experience. We’re going to talk about the building blocks you’ll need to create your own customer service chatbot, and why you need them.

A brief

Want your own chatbot? Start by defining what you want it to achieve, and use that definition to shape the scope of the project. You’ll be tempted to dive straight into the coding/mechanics of a bot and fix problems as they arise. But that isn’t always possible. The best way to avoid starting from scratch, after pouring hours and money into development, is to create a blueprint of what you want to achieve before you start.

You need to decide if you’re going to have a bot that works independently of, or alongside, a human CS team. This is the most important decision to be made. Not only will it influence the software decisions you’ll be making, it’s going to define how your chatbot affects your CX.

Front-end bots (those that can handle interactions independently) can provide greater cost savings by removing labour costs if you decide to cut your human team. But the current limitations of artificial intelligence (AI) mean that bots can only reliably deal with simple queries.

One way around this limitation is to have a front-end chatbot deal with said simple queries, whilst more complex questions go straight to your human team. Otherwise, you’re going to have to accept that your bot is going to negatively affect customer satisfaction, and weigh that against the financial benefit it presents.

Bots that work collaboratively with your team enable you to cut costs without sacrificing quality. They can take over small data gathering tasks within conversations before passing a customer to an operator (for example, asking a customer for their account number). Or, they could recommend a response to an operator based on context collected from the customer’s messages.

Once you know what you want your bot to be able to do, you’re ready to start looking for the software you’re going to need.

Natural language processor (NLP)

A bot’s ability to derive meaning from human language depends on an NLP. This is the building block that determines a bot’s artificial intelligence. NLPs are the result of a lot of very complicated programming and, thankfully, you won’t need to create your own. There are plenty of NLP software suppliers you can work with. The difficulty, instead, is making sure you pick the best NLP for your bot.

And so, we reach one of our top tips: choose an NLP engine capable of machine learning.


You have to consider how to make a chatbot learn from customers’ messages, rather than simply understanding them. People use language in different ways, and your chatbot will need to be able to decipher unique queries.

Think about it this way: a chatbot without machine learning that understands “how much does the six-month warranty cost?” won’t recognise “whats the price for teh coverage for if it breaks?”.  Avoid sending a frustrating “I’m sorry, I didn’t understand that” because the message didn’t contain the key phrases ‘six-month’ or ‘warranty’. An AI that knows that ‘coverage’ means the same thing (and that ‘teh’ is a common misspelling of ‘the’) can ask the customer how long they’d like to be covered for.

Whilst you won’t need a team of developers to create your NLP, you will need an implementation team to teach it the intent behind phrases specific to your operation. For example, if your online store sells big-name items, you’ll need to teach it to recognise the brand name, any colloquial terms for the brand (e.g. ‘JD’ as opposed to ‘Jack Daniels’), and the product the brand name is applied to (‘whiskey’).
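To make that concrete, here’s a rough sketch of the kind of training data an implementation team might prepare. The structure below is purely illustrative - every NLP engine has its own format for intents, training phrases, and entity synonyms - but it shows the sort of examples the engine learns from.

# Hypothetical training data for an NLP engine. The structure is illustrative
# only: each engine has its own format for intents, training phrases, and
# entity synonyms.

training_data = {
    "intents": {
        "ask_warranty_price": [
            "how much does the six-month warranty cost?",
            "whats the price for teh coverage for if it breaks?",
            "how much is it to cover my order if it breaks?",
        ],
    },
    "entities": {
        "brand": {"Jack Daniels": ["JD"]},    # colloquial terms for the brand
        "product": {"whiskey": ["whisky"]},   # the product the brand is applied to
    },
}

# A machine-learning engine generalises from these examples, so new phrasings
# and common misspellings can still be matched to the right intent.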

Once your bot understands what a customer is asking for, it's going to need to find an answer.

Integrations

For a bot to be able to independently find answers for customers, you’ll need to connect your NLP to the relevant software using application programming interfaces (APIs - the interfaces that let two pieces of software talk to each other). What do we mean by relevant software? That will be unique to your business, but it generally includes any software that contains customer account or order details. For example, if you’re an online retailer, this will include an order management system and delivery management system.

This is the difference between a bot that sends generic stock answers and a bot that can send personalised messages that reflect an order’s latest update. We know which response we’d prefer to receive.

You’re going to need a team of developers for this step. They’ll need APIs for each system you’re using so that your NLP can grab the information it needs to form a response. Most systems will already have APIs available, which significantly reduces the expensive developer time needed.
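To give a feel for what those developers will be building, here’s a minimal sketch of a single integration in Python. The endpoint, credentials, and field names are all hypothetical - the real call depends entirely on the order management system you use - but the idea is the same: pull live order details so the bot’s reply reflects the latest update rather than a stock answer.

# A minimal integration sketch, assuming a hypothetical order management
# system with a REST endpoint. The URL, auth header, and field names are
# placeholders; real systems will differ.

import requests

ORDER_API_URL = "https://example-oms.internal/api/orders"  # hypothetical endpoint
API_TOKEN = "your-api-token"                               # placeholder credential

def get_order_status(order_number: str) -> str:
    """Fetch the latest status for an order so the bot can personalise its reply."""
    response = requests.get(
        f"{ORDER_API_URL}/{order_number}",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=5,
    )
    response.raise_for_status()
    order = response.json()
    return f"Your order {order_number} is currently: {order['status']}"

# Example: the NLP extracts an order number from the customer's message,
# then the integration turns it into a personalised response.
# print(get_order_status("123456789"))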

Once your systems are all hooked up to your NLP, the next question you need to ask yourself is: how is my bot going to receive my customer service queries? There are a few possible solutions, and the best one for you will depend on how you’re planning to use your bot.

Responses

If you want a front-end bot, then responses can be sent straight from your NLP. You’ll need an API for each platform you take customer queries on.

This can cause problems for a Twitter-based CS operation, because Twitter has a native AI that tends to block other bots. Some NLP engines have created a solution to this problem, so if you're hoping to handle Twitter interactions with a bot, be sure to include this in your NLP research. 


You can use an API to send your received messages to the NLP, which can then send responses using the same API. But we don’t recommend this approach until AI has improved further (we’ve written more about why in another blog).

A bot working alongside human agents will need to be able to send responses via the communication platform your operators are using. Any communication platform that plans to stay relevant will be able to integrate with NLPs. If an incoming message fits the criteria for a conversation to be handled by your bot, the platform can send it to the NLP, which sends back a response for the platform to pass on to the customer.
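As a rough sketch, here’s how that hand-off might look if the platform delivers messages to you over a simple webhook. The payload shape and the ask_nlp helper are hypothetical stand-ins for your platform’s callback format and your NLP engine’s API.

# A minimal webhook sketch: the communication platform posts an incoming
# message here, we pass it to the NLP, and return the suggested reply.
# The payload fields and the ask_nlp helper are hypothetical.

from flask import Flask, request, jsonify

app = Flask(__name__)

def ask_nlp(message_text: str) -> str:
    """Stand-in for a call to your NLP engine's API."""
    # In practice this would be an HTTP request to whichever engine you chose.
    return "Thanks for getting in touch! Could you confirm your order number?"

@app.route("/incoming-message", methods=["POST"])
def incoming_message():
    payload = request.get_json()
    customer_text = payload["text"]  # hypothetical field name
    reply = ask_nlp(customer_text)
    # The platform then sends this reply on to the customer on the original channel.
    return jsonify({"reply": reply})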

By nature, bots will attempt to reply to every message they receive. If you don’t plan for your bot to reply to all customer queries, you’re going to need a way to gate its access to incoming messages. How you achieve this will depend on the brief you created when you first stepped down the path of creating your bot.

How will the bot be helping your team? A data-gathering bot will only need access to new conversations where a customer hasn’t yet supplied the information an operator needs to help them. If your bot is designed to take on simple interactions - so your operators can focus on more complicated queries - then it needs to be able to determine the reason a customer has made contact.

Such message-gating won’t be possible with an NLP alone. We use the workflow system within our communication platform, Gnatta. With regexes (regular expressions: patterns that let a programme find text matching a given character structure, such as a 9-digit order number), we can recognise whether a customer has already sent the information an operator needs before they can advise them. Once the bot has collected the necessary security information, the conversation goes to an operator. This cuts labour costs because no time is lost to easy, run-of-the-mill messages. Customers benefit from an increase in responsiveness when messaging the bot, and from higher quality responses.
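Here’s a small Python example of the kind of pattern involved - checking whether a message already contains a 9-digit order number - so you can see what a gating rule like this is actually looking for. In a platform like Gnatta this sort of rule would sit in the workflow configuration rather than in code you write yourself.

# A simple regex check: does the customer's message already contain a
# 9-digit order number? If so, the conversation can go straight to an operator.

import re

ORDER_NUMBER_PATTERN = re.compile(r"\b\d{9}\b")  # exactly nine digits

def extract_order_number(message: str) -> str | None:
    match = ORDER_NUMBER_PATTERN.search(message)
    return match.group(0) if match else None

print(extract_order_number("Hi, order 123456789 still hasn't arrived"))  # 123456789
print(extract_order_number("Hi, my order still hasn't arrived"))         # None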

Testing

The final stage of chatbot development is testing. The purpose of this stage is to do everything in your power to trip your bot up, and then fix whatever caused it to stumble. Doing this yourself reduces the chance of a customer uncovering the same problem once your bot goes live.

Be it one person, or one hundred, your testing team will be responsible for teaching your bot any phrases it doesn’t understand. They’ll need to feed it lots of messages based on the types of query your bot will face, and a diverse team will help you to cover a wider range of phrases.
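Your testing team doesn’t need anything fancy to keep track of this. Even a simple script that runs a batch of real-world phrasings past the bot and flags the ones it gets wrong will do. The sketch below assumes a classify_intent function standing in for your NLP engine’s API; the phrases and intent names are just examples.

# A minimal testing loop: run a batch of sample phrases past the bot and
# flag any that don't come back with the expected intent. classify_intent
# is a hypothetical stand-in for your NLP engine's API call.

test_phrases = [
    ("how much does the six-month warranty cost?", "ask_warranty_price"),
    ("whats the price for teh coverage for if it breaks?", "ask_warranty_price"),
    ("where is my order?", "order_status"),
]

def classify_intent(message: str) -> str:
    """Stand-in for the NLP engine's intent classifier."""
    return "unknown"  # replace with the real engine call

for phrase, expected in test_phrases:
    actual = classify_intent(phrase)
    if actual != expected:
        print(f"Needs teaching: '{phrase}' -> got '{actual}', expected '{expected}'")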


Creating a customer service chatbot isn’t simple. Implementation can be fiddly, and expertise comes with experience. We’ve been working with some of the biggest brands in the UK to create customer service bots that delight customers whilst reducing customer service costs. If you’d like to find out more about our work with automation, contact us here. We'd love to have a chat with you.
