Virtual sales meetings have made it tougher than ever for salespeople to read the room. So some well-funded tech providers are stepping in with a bold sales pitch of their own: that AI can not only help sellers communicate better, but also detect the “emotional state” of a deal and of the people they’re selling to.
While AI researchers have spent decades attempting to instill human emotion into otherwise cold, calculating machines, sales and customer service software companies including Uniphore and Sybill are building products that use AI in an attempt to help humans understand and respond to human emotion. Virtual meeting powerhouse Zoom also plans to provide similar features in the future.
“It’s very hard to build rapport in a relationship in that type of environment,” said Tim Harris, director of Product Marketing at Uniphore, regarding virtual meetings. The company sells software that attempts to detect whether a potential customer is interested in what a salesperson has to say during a video call, alerting the salesperson in real time during the meeting if someone seems more or less engaged in a particular topic.
The system, called Q for Sales, might indicate that a potential customer’s sentiment or engagement level perked up when a salesperson mentioned a particular product feature, but then drooped when the price was mentioned. Sybill, a competitor, also uses AI in an attempt to analyze people’s moods during a call.
Uniphore’s software incorporates computer vision, speech recognition, natural-language processing and emotion AI to pick up on the behavioral cues associated with someone’s tone of voice, eye and facial movements or other non-verbal body language, then analyzes that data to assess their emotional attitude.
And there’s an actual digital emotion scorecard.
Sitting alongside someone’s on-camera image during a virtual meeting, the Q for Sales application visualizes emotion through fluctuating gauges that indicate detected levels of sentiment and engagement, based on the system’s combined interpretation of eight categories: satisfaction, happiness, engagement, surprise, anger, disgust, fear and sadness. The software requires video calls to be recorded, and it can only assess someone’s sentiment when that individual customer, or room full of potential customers, and the salesperson have approved recording.
Image: Uniphore
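Uniphore has not published how Q for Sales combines those categories into its on-screen gauges. As a rough illustration only, here is a minimal Python sketch of one way per-frame emotion scores could be rolled up into a single sentiment reading; the category groupings and score format are assumptions, not the company’s actual method.

```python
# Hypothetical sketch: collapse per-frame emotion probabilities into one
# sentiment gauge. The positive/negative groupings are invented for
# illustration and are not Uniphore's documented approach.
POSITIVE = {"satisfaction", "happiness", "surprise"}
NEGATIVE = {"anger", "disgust", "fear", "sadness"}

def sentiment_gauge(frame_scores: dict[str, float]) -> float:
    """frame_scores maps an emotion label to a probability for one video frame."""
    pos = sum(frame_scores.get(e, 0.0) for e in POSITIVE)
    neg = sum(frame_scores.get(e, 0.0) for e in NEGATIVE)
    return pos - neg  # roughly -1 (negative) to +1 (positive)

frame = {"happiness": 0.5, "surprise": 0.2, "anger": 0.1}
print(sentiment_gauge(frame))  # 0.6, a mildly positive reading
```

Engagement, the second gauge, would presumably be tracked as a separate signal, drawing on cues such as gaze and speaking time.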
Although Harris said Uniphore does not compile profiles of individual people based on the data it collects and generates, its software does provide data it says indicates the “emotional state of a deal,” an aggregate based on the sentiment and engagement of all members of a buying committee who have been present in meetings across the timeline of discussions with that potential customer.
Always be … recording?
But the mere request to record a virtual conversation can alter a customer’s attitude, said Grace Briscoe, senior vice president of Client Development at digital ad company Basis Technologies. “As soon as that recording alert comes up, it puts people on guard,” she said. “I think it would be off-putting for the clients; they would be less candid. I don’t think it would be conducive to the kind of relationship building that we want to do.”
While some sales meeting participants might be uncomfortable being recorded, others will be more open to it, said Josh Dulberger, head of Product, Data and AI at Zoom. “Part of it is the culture of the sales team,” he said, noting that recording might not be tolerated when selling to more sensitive industries such as financial services.
Zoom, the king of virtual meetings, said Wednesday it is introducing new features called Zoom IQ for Sales that provide sales meeting hosts with post-meeting conversation transcriptions and sentiment analysis. Although some AI-based transcription services have been known to make mistakes, Dulberger said Zoom’s software was built in-house using its own automated speaker recognition and natural-language-understanding system. The system is integrated with Salesforce.
“We’re looking at things like speaker cadence and other factors in the linguistic approach to try to disentangle one speaker from another,” Dulberger said.
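Zoom has not detailed how that disentangling works. As a toy illustration of the general idea, known as speaker diarization, the sketch below clusters invented per-segment voice features (pitch and speaking rate) into two speakers; the features, data and use of k-means are all assumptions made for this example.

```python
# Toy diarization sketch: group speech segments by speaker using k-means
# on invented acoustic features. Not Zoom's method; for illustration only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical per-segment features: [mean pitch (Hz), syllables per second]
speaker_a = rng.normal(loc=[120.0, 3.5], scale=[10.0, 0.4], size=(20, 2))
speaker_b = rng.normal(loc=[210.0, 5.0], scale=[12.0, 0.5], size=(20, 2))
segments = np.vstack([speaker_a, speaker_b])

# Assign each segment to one of two speaker clusters.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(segments)
print(labels)  # 0/1 assignments approximating "who spoke when"
```

Production systems typically use learned speaker embeddings rather than hand-picked features, but the clustering step is conceptually similar.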
For now, the new Zoom features for salespeople do not assess sentiment in real time during a meeting. Instead, they deliver post-meeting analysis. For instance, Dulberger said an interaction might be labeled as “low engagement” if the potential customer did not speak much.
“You will be able to measure that they weren’t very well engaged,” he said, noting that salespeople aim for balanced conversations during which customers talk as often as a sales rep.
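A metric like that can be computed directly from diarized talk time. Here is a minimal sketch of the idea; the segment format and the 30% cutoff for a “balanced” conversation are assumptions, not Zoom’s actual thresholds.

```python
# Minimal sketch of a talk-time "engagement" check. The segment format
# and the 30% threshold are assumptions for illustration.
def talk_time_totals(segments):
    """segments: list of (speaker, start_sec, end_sec) tuples."""
    totals = {}
    for speaker, start, end in segments:
        totals[speaker] = totals.get(speaker, 0.0) + (end - start)
    return totals

segments = [("rep", 0, 120), ("customer", 120, 140), ("rep", 140, 300)]
totals = talk_time_totals(segments)
customer_share = totals["customer"] / sum(totals.values())
if customer_share < 0.3:  # hypothetical cutoff for a balanced conversation
    print(f"Low engagement: customer spoke {customer_share:.0%} of the time")
```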
Frustration detected. Show empathy.
Sentiment analysis is nothing new. Since the early days of social media, software providers have sucked up text from posts, tweets and product reviews, analyzing the content to help determine what it means for consumer brands, restaurants or political candidates. Today, software for help desk chats and call centers employs voice recognition and natural-language-processing AI to prompt customer service reps to speak more slowly or be more energetic. For example, Amazon has partnered with Salesforce to bring sentiment analysis to apps used by customer service agents, and a product from Cogito uses in-call voice analysis to assess the emotional state of callers or service reps.
“Frustration detected. Show empathy,” states an alert shown as an example on Cogito’s website.
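Text-level sentiment classification of that kind is widely available in open-source tooling. As a quick example using the Hugging Face transformers library (not any of the vendors’ proprietary systems described here):

```python
# Off-the-shelf text sentiment analysis with Hugging Face transformers.
# The example sentence is invented; the pipeline downloads a default
# English sentiment model on first run.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("The new pricing is way too high for what we get."))
# e.g. [{'label': 'NEGATIVE', 'score': 0.99...}]
```

What the sales-focused products add on top of this is a real-time, multimodal layer: voice, face and body-language signals rather than text alone.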
Questionable AI for coaching basic human skills
But what companies such as Uniphore, which recently collected $400 million in Series E funding at a valuation of $2.5 billion, and Sybill are doing goes further than customer service prompts. Uniphore and Sybill aim to monitor human behavior during video calls in real time. And they are betting that even seasoned salespeople can benefit from the guidance of their emotion AI coaching.
Dulberger said Zoom also has active research underway to incorporate emotion AI into the company’s products in the future. He pointed to research he said shows improvements in AI used to detect people’s emotions, including a study of a technique that separates facial images from background imagery that can confuse computers, and a new data set that combined facial expression data, physiological signals such as heart rate and body temperature, and self-reported emotions.
“These are informational signals that can be useful; they’re not necessarily decisive,” Dulberger said, noting that metrics based on emotion AI could be added to provide salespeople with a richer understanding of what happened during a sales meeting, for instance by detecting, “We think sentiments went south in this part of the call.”
Briscoe said she recognized the potential value of emotion-AI-based technologies as management tools to help determine which salespeople might be experiencing problems. However, she said, “Companies should hire people who have some level of emotional intelligence. If the people on our team cannot read that someone has lost interest, those are basic human skills that I don’t know why you’d need AI [to facilitate].”
Image: Uniphore
Even if emotional AI guidance is appealing to some sales teams, its validity is in question.
“The claim that a person’s interior state can be accurately assessed by analyzing that person’s face is premised on shaky evidence,” wrote Kate Crawford in a 2021 article in The Atlantic. In the article, Crawford, an AI ethics scholar, research professor at USC Annenberg and a senior principal researcher at Microsoft Research, cited a 2019 research paper that stated, “The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more than what would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation.”
“We’re able to look at faces and classify them into different emotional expressions established by psychologists that are pretty standard out there,” said Patrick Ehlen, Uniphore’s vice president of AI.
Ehlen said the technology Uniphore has developed uses the same signals people use to infer what others are thinking or feeling, such as facial expressions, body language and tone of voice. “We endeavor to do as well as a human,” he said. Uniphore’s software incorporates computer vision and human emotion analysis technology the company acquired when it purchased Emotion Research Labs in 2021 for an undisclosed price.
Uniphore’s AI model was trained using open-source and private data sets featuring images of people from diverse ethnic groups, Ehlen said. Some of that data came from actual sales meetings the company held. To teach the machine which facial cues represent which emotions, the image data was labeled by people Uniphore hired to make those annotations, following guidelines the company established and then refined according to how consistently the annotators agreed with one another, he said.
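Measuring that agreement is a standard step in building labeled data sets. A minimal sketch, assuming two annotators and invented labels, using Cohen’s kappa from scikit-learn (a common choice, though the article does not say which statistic Uniphore used):

```python
# Inter-annotator agreement check with Cohen's kappa. Annotators and
# labels below are invented for illustration.
from sklearn.metrics import cohen_kappa_score

annotator_1 = ["happy", "angry", "neutral", "happy", "sad"]
annotator_2 = ["happy", "angry", "happy", "happy", "sad"]

kappa = cohen_kappa_score(annotator_1, annotator_2)
print(f"Cohen's kappa: {kappa:.2f}")  # low values would prompt revising the guidelines
```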
“Going forward there’s always room for these things to improve as the system gets in the hands of larger domains,” Ehlen said. The company is also conducting a validation study of the software.
But Ehlen recognized the limitations of the technology. “There is no real objective way to measure people’s emotions,” he said. “You could be smiling and nodding, and in fact, you’re thinking about your vacation next week.”