Two weeks ago we launched Air Charter Amy, an Alexa skill for use in a sales role. In this post I’ll cover why it makes sense, where it fits in the sales funnel, what it does, and some of the challenges we faced using the Alexa platform.
Air Charter Amy is a combination of my interest in conversational AI and my ownership of Planeviz, a private jet charter brokerage. Being your own executive sponsor can expedite a project, but the use case must still be solid if it is to succeed.
Using voice AI for sales requires slotting it into the sales funnel where it makes sense. And where it makes sense is where it adds value both to the customer and to the business.
With customers spending five figures for a few hours of flight time, private jet charter is a high-touch business. The first question to answer, then, is where in the jet charter business model voice AI makes sense. Where can it add value to both the CX and the business?
One common private jet use case is the “on demand” flight in which a person rents the jet for a specific trip. Unlike booking an airline flight, there is significant manual work involved in reserving a jet for an on-demand trip.
The process starts with the traveler’s request for a quote, typically made through a website form or a phone call. There are many jet charter companies and it’s in a customer’s best interest to shop around and negotiate for the desired combination of plane, price, and confidence. (A jet charter broker typically does this on behalf of the customer.)
On the other side of this shopping experience are sales agents who field the quote requests. It’s a very competitive market and many quotes produce no further action by the shopper. In other words, there’s plenty of low value busywork.
The opportunity to use voice AI lies at the very top of the sales funnel where it can potentially improve the user experience and augment the existing business process—with minimal impact on the high-touch aspects of the business.
What Air Charter Amy does
For now, Amy does one thing: she engages in conversation to gather just the essential trip information and prospect contact details, then sends this information to the sales agent and to the prospect.
How does this help?
- It’s a better customer experience than having the prospect complete a web form
- Amy is on the job around the clock
- Conversations can be personalized with the prospect’s name (and later with personal preferences)
- Prospect contact details can be automatically added to a CRM system
To get the prospect’s contact details we can either request the user’s Alexa profile info or connect to an external account, such as Google. In both cases the user grants or denies permission. We chose to ask for the Alexa first name and email contact details.
Accessing a user’s contact details enables us to use the person’s first name in the dialog and to welcome that person back by name if they return in the future.
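A minimal sketch of how that personalization might work, assuming the first name has already been fetched via the Alexa customer profile permission (the function and message wording here are illustrative, not the skill’s actual code):

```python
def welcome_message(first_name=None, returning=False):
    """Build Amy's opening line, personalized when profile access was granted."""
    if first_name and returning:
        return f"Welcome back, {first_name}! Ready to plan another trip?"
    if first_name:
        return f"Hi {first_name}, I'm Amy. Where would you like to fly?"
    # Fall back to a generic greeting when the user declined profile access.
    return "Hi, I'm Amy. Where would you like to fly?"
```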
With the trip details in hand, we package them in an email that is sent to the sales agent and to the prospect. This information can also be sent to a CRM system for a nurturing campaign.
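The packaging step could look something like this sketch, using Python’s standard email library. The dict keys and addresses are illustrative placeholders, not the skill’s real field names:

```python
from email.message import EmailMessage

def build_quote_email(trip, prospect):
    """Package the gathered trip details into a confirmation email.

    `trip` and `prospect` are plain dicts with illustrative keys;
    the resulting message can be handed to smtplib or an email API.
    """
    msg = EmailMessage()
    msg["Subject"] = f"Charter quote request: {trip['origin']} to {trip['destination']}"
    msg["From"] = "amy@example.com"                      # placeholder sender
    msg["To"] = f"{prospect['email']}, sales@example.com"
    msg.set_content(
        f"Hi {prospect['first_name']},\n\n"
        "Here are the trip details you gave Amy:\n"
        f"  From: {trip['origin']}\n"
        f"  To: {trip['destination']}\n"
        f"  Departing: {trip['departure']}\n"
        f"  Passengers: {trip['passengers']}\n\n"
        "A sales agent will follow up with quotes shortly."
    )
    return msg
```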
For bigger companies it might make sense to send the quote request to a Slack channel or another team communication tool to leverage the advantages they offer over email.
We stayed away from making Amy a transaction-capable voice assistant for two reasons:
- Actually getting the quotes (versus a request for quotes) moves automation farther down the sales funnel into the high-touch part of the sales process.
- Voice AI is not the right engagement method for evaluating many choices (multiple charter quotes). “Conversation Design Starts with User Context” explains why.
The Air Charter Amy graphic has been the face of the Planeviz brand for years, so it made sense to use it to personify the skill. A cheery face in the confirmation email gives it some personality.
We are using the stock Alexa voice. Alexa supports a speech markup language (SSML) that can be used to make her sound more human-like; maybe we’ll give her a verbal makeover in a future update.
The other voice options were hiring a voice actor to record the dialog or generating a synthetic voice based on a human voice. For Amy version 1.0, neither made sense.
I modeled the dialog after a typical request-for-quote phone call with a human. It’s a to-the-point, guided conversation.
People respond to questions in various ways and we added some of those possibilities to her intelligence. For example, Amy asks “when are you leaving?” and a prospect might reply “tomorrow afternoon” or “May 1” or “April 30 around 7am”.
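Alexa’s built-in date and time slots normalize replies like “tomorrow” or “around 7 a.m.” into ISO-style strings before they reach the skill code. A small sketch of merging the two slot values into one departure datetime (slot names and formats here are assumptions for illustration):

```python
from datetime import datetime

def combine_departure(date_slot, time_slot=None):
    """Merge normalized date and time slot values into a single datetime.

    `date_slot` is expected as "YYYY-MM-DD" and `time_slot` as "HH:MM",
    the ISO-style forms that Alexa's built-in slots typically produce.
    """
    if time_slot:
        return datetime.strptime(f"{date_slot} {time_slot}", "%Y-%m-%d %H:%M")
    # No time given ("May 1"): default to midnight on that date.
    return datetime.strptime(date_slot, "%Y-%m-%d")
```

So “April 30 around 7am” might arrive as `combine_departure("2019-04-30", "07:00")`, while “May 1” arrives with the date slot alone.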
We also rotate through variations of the questions she asks to keep the dialog fresh every time someone speaks with her. She might say “where are you going?” or “flying to?”
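The rotation itself can be as simple as picking a random phrasing from a list each turn (the prompts below are examples, not Amy’s exact wording):

```python
import random

DESTINATION_PROMPTS = [
    "Where are you going?",
    "Flying to?",
    "What's your destination?",
]

def destination_prompt():
    # Pick a different phrasing each time to keep the dialog fresh.
    return random.choice(DESTINATION_PROMPTS)
```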
Our simple interaction model intentionally does not handle conversational diversions. These occur when a person reaches a point in the dialog, then goes off topic or returns to an earlier point to change something or to ask a question related to that earlier part of the conversation.
If the user makes a mistake or wants to make a change, he or she can just say “start over” and make the changes. The customer experience cost of starting again is not that high in our case.
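Handling “start over” amounts to clearing the collected fields and restarting the dialog. A sketch, where `session_attributes` stands in for the skill’s per-session state dict and the key names are illustrative:

```python
def handle_start_over(session_attributes):
    """Clear the collected trip fields so the dialog restarts cleanly."""
    for key in ("origin", "destination", "departure", "passengers"):
        session_attributes.pop(key, None)
    # Rewind the dialog to its first question.
    session_attributes["state"] = "ASK_ORIGIN"
    return "No problem, let's start over. Where are you flying from?"
```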
In an earlier IBM Watson chatbot project we did implement diversions. From that experience I learned it adds a level of design and development complexity that I don’t feel is necessary for Amy, at least as she is now.
At first I was writing the dialog in a Google Doc. That was fine for me but hard for my developer to keep straight while coding. We adopted draw.io to map the interaction model; it integrates with Google Drive for easy sharing. The blue and gray boxes are dialog; the rest is logic and functions.
Developing on Alexa
Why Alexa? To expand our development repertoire and to leverage Amazon’s continuous promotion of Alexa to the public.
Despite the Alexa PR bombardment, I read a Microsoft study that broke down the percentage of respondents who have used each voice assistant: Siri 36%, Google Assistant 36%, Alexa 25%, and Cortana 19%.
While I chose Alexa for round one, the advantage of developing a Google Assistant action over an Alexa skill is the ubiquity of Android. That said, Amazon is making a push into hands-free in the car, which was my original vision for Air Charter Amy two years ago, albeit with Siri.
With Watson we built our own NLU model using our own data. With Alexa, Amazon owns and controls the natural language model. In practice, if Watson didn’t understand something we just trained it to be smarter. With Alexa, all we can do is add the expressions people might use in conversation and leave the rest to Alexa.
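Adding those expressions means declaring sample utterances in the skill’s interaction model. A trimmed illustration of what that JSON looks like (the intent and slot names are made up for this example):

```json
{
  "intents": [
    {
      "name": "PlanTripIntent",
      "slots": [
        { "name": "departureDate", "type": "AMAZON.DATE" }
      ],
      "samples": [
        "I'm leaving {departureDate}",
        "we fly out {departureDate}",
        "departing {departureDate}"
      ]
    }
  ]
}
```

Alexa generalizes from these samples; anything outside them is left to Amazon’s model to interpret.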
This manifested itself with our invocation name, Air Charter Amy, which Alexa would routinely overrule in favor of “Air Cheddar Amy” or “Air Trotter Amy,” followed by the inevitable “sorry, I don’t know that one.” Not the best outcome for demoing your cool Alexa skill.
This hiccup wasn’t a matter of a thick accent or speaking while under the influence; it had to do with the algorithm Alexa uses to surface skills. It’s apparently the voice equivalent of the “did you mean” suggestions we get from search engines.
The Alexa developer support team was great and fixed this on their end. Coincidence or not, shortly after this hiccup they changed the skill certification requirement to only one invocation (“open Air Charter Amy”) instead of three.
Just 20 years ago ecommerce was a novelty, no enterprise would entrust its data to “the cloud”, and streaming media was, well, it wasn’t. Early adopters Amazon, Salesforce, and Netflix jumped in and are today’s market leaders.
While we made Amy for the jet charter business, her information-gathering capabilities can be used in other businesses that work with calendar and contact information too.
Air Charter Amy is available in the US, Canada, and UK Amazon Alexa stores.
Contact us if you’re thinking about adding voice AI to your multichannel mix.