Artificial Intelligence
Using Chatbots To Improve Customer Experience
Better customer experience, faster response times, and around-the-clock availability are some of the many benefits of leveraging automation and artificial intelligence (AI). As more federal teams consider using chatbots to improve access for the millions of people who need government services, here are some tips and lessons learned from three case studies.
How to get started
One of the greatest advantages of AI is that it makes decision-making faster and smarter. Chatbots can perform the same task 24/7 without getting bored or tired, and the outcome is relatively consistent. Answering basic frequently asked questions about a topic or service is therefore usually a good first process to automate.
A team in the Technology Transformation Services (TTS) at the U.S. General Services Administration (GSA) was charged with improving USA.gov content for users looking for information on scams. After talking to users who engaged with the site through various channels, the team decided that breaking into AI with a chatbot that uses a question-and-answer format would be a good solution. This approach has a more straightforward setup than one that processes free-form text entered by users.
Tapping into already available technology can provide an easy way to get started even on a smaller scale. You can test out the solution, and see how well it is adopted by your users.
The USA.gov team built the chatbot internally, using human-centered design and already available technology. Marietta Jelks, USA.gov research lead, suggests that one way to start is to ask: “Do you have somebody on your team comfortable with or able to create the logic — if this, then that — on all the different response paths who can understand what the words and language will be to go with that?”
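The "if this, then that" logic Jelks describes can start as simply as a lookup table of topics and canned answers with a fallback path. The sketch below is a hypothetical illustration, not the USA.gov team's actual implementation; the topic names and answers are placeholders.

```python
# Hypothetical "if this, then that" response paths for a Q&A-style bot.
# Topics and answers are illustrative placeholders, not USA.gov content.
SCAM_TOPICS = {
    "phone scams": "Hang up, and report the call to the Federal Trade Commission.",
    "email scams": "Don't click any links, and delete the message.",
    "imposter scams": "Government agencies will not demand immediate payment by phone.",
}

def respond(choice: str) -> str:
    """Map the user's menu choice to a canned answer, with a fallback path."""
    answer = SCAM_TOPICS.get(choice.strip().lower())
    if answer:
        return answer
    return "Sorry, I don't have information on that. Please pick another topic."
```

Someone on the team still has to write and maintain every branch, which is why Jelks frames it as a content and logic question before a technology one.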
In the case of the Transportation Security Administration (TSA), the team managing AskTSA on Twitter and AskTSA on Facebook needed a faster way to answer questions from people about to board a plane. The team's goal is to respond to the traveling public's travel-related questions and ease the burden at the checkpoint.
TSA social media manager Janis Burl said that to prepare for AI use, her team broke down the questions they received to see what was redundant versus what was more complex and needed a person to answer. They noticed that questions about prohibited and permitted items on an airplane were easy and thus prime candidates for automation.
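A first pass at that kind of triage can be scripted: flag questions that mention the bot's known topics as automatable and route everything else to a person. This is a hypothetical sketch, not the AskTSA team's actual tooling, and the keyword list is an assumption for illustration.

```python
import re

# Keywords the bot is assumed to handle well (prohibited/permitted items).
# This set is a hypothetical placeholder, not TSA's actual taxonomy.
AUTOMATABLE_KEYWORDS = {"bring", "carry-on", "checked", "allowed", "prohibited", "liquids"}

def needs_human(question: str) -> bool:
    """Route a question to a person if it mentions none of the bot's topics."""
    words = set(re.findall(r"[a-z-]+", question.lower()))
    return not (words & AUTOMATABLE_KEYWORDS)
```

Running a check like this over a backlog of past questions gives a rough count of how much volume automation could absorb before committing to a solution.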
The bot has dramatically reduced response time and increased customer satisfaction scores.
“When we first started … we had a 1.5 hour wait time just to answer a question for someone standing in line at the airport. By the time we would answer that question most people were already on the plane and flying and it was too late. Now our time to reply and answer a question is less than two minutes,” said Burl.
Integration and technology
Unlike the USA.gov team, the AskTSA team purchased an off-the-shelf chatbot solution. That approach can get a team off the ground faster. Important requirements to consider when choosing a chatbot solution include whether it collects any personally identifiable information (PII), whether it integrates with other federal systems, and whether it needs a FedRAMP authorization or an Authority to Operate (ATO). Engaging your agency’s privacy officer and CIO early in the process can make the decision smoother.
When the team at the Federal Student Aid (FSA) started designing Aidan, their virtual assistant, they decided that providing information about a customer’s account – such as a loan balance – was critical to success. This required integration with their existing legacy system, so their chatbot solution had to comply with FSA’s cybersecurity and privacy regulations.
Abraham Marinez, director of the Product Design Group at FSA, said that when they started the project, there were no FedRAMP-authorized chatbot solutions available, so the team reviewed their options and chose an open-source conversational AI platform. They installed the platform in FSA’s FedRAMP-authorized environment, then built and trained the bot. Aidan is currently available only to authenticated users; in the near future, it will also be available to unauthenticated users on StudentAid.gov.
Test and retest
Conducting user research and usability testing before launching the chatbot has been crucial for the teams’ success.
You might have all the data in your operational reports, but you can learn a lot from the qualitative data you get by observing somebody using your product, Marinez said. His team conducts frequent usability testing on Aidan to learn how to improve the product. They conduct testing on both desktop and mobile, and have been able to conduct it while most people are working remotely.
In addition to usability testing, the teams also recommend listening to your customers to see if they report issues—whether via contact centers, on social media or other channels—to be able to fix any issues in a timely manner.
Should you name your bot?
The Federal Student Aid team created the Aidan persona after talking with their customers. They had multiple name choices, but Aidan resonates with “aid,” and the name wasn’t already trademarked. It is important to trademark the name, icon, or other branding you use to avoid potential future litigation.
Regardless of whether you name your chatbot, it is best to be upfront with users that they are interacting with a bot, not a person. Even unnamed bots that guide users through the steps to find an answer can have such a friendly personality that they are mistaken for human beings.
Lessons learned
Getting started with AI may seem daunting at first, but that shouldn’t be a deterrent.
Marinez said one of the key things is to dare to experiment.
“There is no such thing as failure. It is working and getting data to help improve the customer experience. Be perseverant and have that mindset of experimentation. It is not about failing, it’s all about experimentation,” he said.
Burl shared that one of her lessons learned was that “just because nobody else has it doesn’t mean you can’t do it.”
Her advice to teams getting started is to define the problem they are trying to solve and make sure they can show a return on investment, however they define that ROI. Staying focused helps navigate the many options and vendors available to design and develop chatbots.
In addition, Jelks encourages teams to think early about metrics and how they will measure success, as well as whether they want to offer the chatbot in other languages. It is also important to consider whether you have the content and resources to create multilingual versions, and the ability to sustain them.
Content is (still) king
Chatbots can help users find information faster, but they are not a replacement for answers to frequently asked questions on your organization’s website. Chatbots require good content that can be easily parsed, and that content must be created and maintained by humans.
Sometimes the chatbot may not know the answer. When that happens, a chatbot can transfer users to an agent or direct them to a self-service knowledge center. Being resourceful and continuing to improve the quality of your content can help create a successful chatbot solution and improve overall customer experience.
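That fallback path can be a simple confidence check: answer only when the bot is reasonably sure, otherwise hand off. The threshold, messages, and URL below are illustrative assumptions, not any of these teams' production logic.

```python
# Hypothetical fallback logic: answer when confident, otherwise escalate
# to an agent or point users to a self-service knowledge center.
HELP_CENTER_URL = "https://example.gov/help"  # placeholder URL

def handle(best_answer: str, confidence: float, agents_online: bool) -> str:
    """Return the bot's reply given its best answer and a confidence score."""
    if confidence >= 0.75:  # assumed threshold; tune against real transcripts
        return best_answer
    if agents_online:
        return "I'm not sure about that one. Let me connect you with an agent."
    return f"I'm not sure about that one. You can browse our help center: {HELP_CENTER_URL}"
```

Reviewing the questions that fall through to the low-confidence path is also a natural feedback loop for deciding what content to write next.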