The “Last Mile” Problem
In telecommunications, the “last mile” is the final stretch of a signal’s journey to the subscriber’s premises. It is the most difficult and expensive part of the network to provide.
In contact centre knowledge management terms, the “last mile” is when the agent searches for an answer to a specific question in the contact centre’s databases.
This is frequently problematic. Companies store information in many forms, including CRM records, wikis and PDF instruction manuals.
This knowledge is often widely dispersed across separate databases and other tools.
When companies merge and their separate knowledge management tools are brought together, finding a “single source of truth” becomes even more difficult.
It’s almost impossible to include everything in one search. Agents often produce their own quick notes or “cheat sheets”, which do not appear in any database or search engine. This compounds the problem: agents’ answers become inconsistent, and often wrong, when a favourite “cheat sheet” is not kept up to date.
How much time could be saved if agents could access all this data through a “one stop shop”?
How much more time would be saved if agents didn’t have to look for the information they need at all?
This might seem like science fiction, but the technology is here and coming to a browser near you!
Amazon Connect Wisdom, according to AWS’s own website, “delivers agents the information they need, reducing the time spent searching for answers.”
It’s a machine learning powered search engine that can be connected to various applications and databases through connectors. AWS provides ready-made connectors for Salesforce, Zendesk and ServiceNow, among others, as well as APIs that allow users to develop their own connectors for their own applications. Wisdom can also be connected to Contact Lens to speed up the search process.
Connected to all these data sources, Wisdom becomes a one-stop shop where agents can search for and retrieve information from every knowledge base.
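For developers, Wisdom’s search can also be called programmatically via its QueryAssistant API. The sketch below, using the boto3 SDK, shows roughly what such a call might look like; the assistant ID and region are placeholders, and the exact response shape should be checked against the Amazon Connect Wisdom API reference rather than taken from this example.

```python
# Illustrative sketch of querying Wisdom programmatically with boto3.
# The assistant ID and region are placeholder assumptions.

def build_query(assistant_id: str, question: str, max_results: int = 5) -> dict:
    """Assemble the keyword arguments for Wisdom's QueryAssistant call."""
    return {
        "assistantId": assistant_id,
        "queryText": question,
        "maxResults": max_results,
    }

def search_wisdom(question: str) -> list:
    # boto3 is imported here so build_query works even without it installed.
    import boto3

    client = boto3.client("wisdom", region_name="eu-west-1")
    response = client.query_assistant(
        **build_query("YOUR-ASSISTANT-ID", question)
    )
    # Each result pairs a matched document with a relevance score.
    return response["results"]
```

A front-end tool could call `search_wisdom` with the agent’s question and render the returned documents in the agent’s workspace.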
Search users often run into the “magic word” issue: they do not know the exact search term they need, so it takes several attempts before they get what they want.
Wisdom overcomes this by using natural language processing (NLP). The search engine responds to an “intent” instead of relying on the agent to enter the exact search term, making it easier for less experienced agents to retrieve the information they need.
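To see why intent matching beats exact keywords, consider this deliberately simplified sketch. The intents and phrasings are invented for illustration, and Wisdom’s NLP understands meaning rather than matching fixed strings as this lookup does.

```python
from typing import Optional

# Invented intents and phrasings, purely for illustration.
INTENT_PHRASES = {
    "reset_password": ("reset my password", "forgot password", "can't log in"),
    "cancel_order": ("cancel my order", "stop the delivery"),
}

def detect_intent(utterance: str) -> Optional[str]:
    """Return the first intent whose known phrasing appears in the utterance."""
    text = utterance.lower()
    for intent, phrases in INTENT_PHRASES.items():
        if any(phrase in text for phrase in phrases):
            return intent
    return None
```

With a keyword-only engine, an agent typing “customer can’t log in” would miss articles titled “Password reset procedure”; an intent layer maps both to the same answer.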
Search engine responses can also vary in their relevance. Wisdom allows users to rate how useful each response was, which helps “train” the system to return more relevant results for specific intents or key phrases.
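A minimal sketch of how such a feedback loop can work in principle. Wisdom’s actual ranking model is not public, so the scoring scheme here is an assumption made only to illustrate the idea of ratings influencing future results.

```python
from collections import defaultdict

# Hypothetical feedback store: a running score per (query, document) pair.
scores = defaultdict(float)

def rate(query: str, doc_id: str, helpful: bool) -> None:
    """Record an agent's rating: helpful answers gain score, unhelpful lose it."""
    scores[(query, doc_id)] += 1.0 if helpful else -1.0

def rank(query: str, doc_ids: list) -> list:
    """Order candidate documents by their accumulated rating for this query."""
    return sorted(doc_ids, key=lambda d: scores[(query, d)], reverse=True)
```

After a few ratings, documents that agents found helpful for “billing query” would surface ahead of those they marked unhelpful.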
Wisdom can be connected to Contact Lens, AWS’s real-time speech-to-text transcription and analysis engine. Wisdom can then initiate searches based on phrases or intents expressed by agents or customers during the call. It can automatically generate search results for agents without any intervention at all.
The system can be used to retrieve information or recommendations for the next best action. If recommendations are set up to be presented in a series, an agent could even be guided through a sequence of steps with branching paths based on specific phrases. This could be very effective in helping newly trained agents handle troubleshooting scenarios that would normally be beyond their capabilities.
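The branching guidance described above can be pictured as a small decision graph: each step recommends an action, and the phrase detected on the call selects the next step. The steps, phrases and data structure below are invented for illustration, not how Wisdom stores recommendations.

```python
# Illustrative branching troubleshooting flow (invented content).
TROUBLESHOOTING_FLOW = {
    "start": {
        "action": "Ask the customer to restart the device.",
        "branches": {"still not working": "check_cable", "working now": "resolved"},
    },
    "check_cable": {
        "action": "Ask the customer to check the power cable.",
        "branches": {"cable was loose": "resolved", "still not working": "escalate"},
    },
    "resolved": {"action": "Confirm resolution and close the call.", "branches": {}},
    "escalate": {"action": "Escalate to second-line support.", "branches": {}},
}

def next_step(current: str, detected_phrase: str) -> str:
    """Return the next step for a phrase heard on the call (or stay put)."""
    return TROUBLESHOOTING_FLOW[current]["branches"].get(detected_phrase, current)
```

A new agent never has to memorise the flow: as Contact Lens surfaces each phrase, the next recommended action appears automatically.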
Wisdom and Contact Lens simplify and speed up searches for relevant information and advice for agents in the contact centre.
In their “Traeger” case study, AWS quotes Bryan Carey, Traeger’s Head of Operations and Analytics, who says: “agents using Amazon Connect Wisdom have seen an increase in customer satisfaction and first contact resolution of roughly 15%. They have also decreased their call handle times by roughly 15%.”
To find out more about how Omningage can work with you to realize the benefits of using Wisdom and Contact Lens in your contact centre, contact your local AWS Partner or get in touch with our sales team.