AI customer service.
Photo credit: theFreesheet/Google ImageFX

Artificial intelligence systems designed to detect human emotions – sometimes called “emotion AI” – can improve customer service but work most effectively when integrated with human employees, according to a new study from the University of Texas McCombs School of Business. The research suggests that a blended approach helps companies manage complaints effectively while mitigating risks, such as customers “gaming” the system by exaggerating dissatisfaction.

The study, led by Assistant Professor Yifan Yu, analysed how emotion AI could be deployed in customer care scenarios. Using game theory, the researchers modelled interactions among customers, employees, and companies, accounting for factors such as emotional intensity and the cost of resolving complaints.

“Firms can refine how they use AI to ensure fairer, more effective decision-making,” said Yu. “Our study provides a practical framework for businesses to navigate this balance, particularly in customer care, where emotional communication plays a crucial role.”

The analysis concluded that a hybrid human-AI model generally outperforms AI or humans alone. Key recommendations include:

  • Enhancing chatbots: Adding emotion detection to basic AI chatbots allows them to “better gauge frustration, confusion, or urgency” and tailor responses or escalate complex issues to human agents more effectively, according to Yu.
  • AI as first responder: Using emotion AI as the initial contact point for angry customers can reduce the emotional burden on human staff and potentially lower employee turnover. Humans can then intervene for situations requiring more nuance.
  • Channel-specific strategies: Human agents may be better suited for handling complaints in public forums like social media due to the need for sensitivity. Emotion AI might be more appropriate for private channels like phone calls.
  • ‘Weaker’ AI can be better: Counterintuitively, the study found that emotion AI with more “noise” (random or irrelevant data hampering perfect emotion recognition) might discourage customers from exaggerating complaints to gain benefits. “When AI is too strong, customers are more likely to game the system by exaggerating their emotions, creating a ‘rat race’ of emotional escalation,” Yu explained. This can lead to wasted resources.
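The deterrence effect of noise can be illustrated with a toy threshold model. This is a simplified sketch of the incentive logic, not the study's formal game: assume the AI reads a customer's emotion plus Gaussian noise and grants a concession when the reading crosses a threshold, while exaggerating costs the customer some effort.

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def exaggeration_gain(true_e, exag_e, threshold, sigma, reward, cost):
    """Expected payoff gain from exaggerating an emotion of true_e up to
    exag_e, when the AI observes emotion + N(0, sigma) noise and grants
    `reward` only if the reading exceeds `threshold`; faking costs `cost`."""
    p_true = 1 - norm_cdf((threshold - true_e) / sigma)  # win prob if honest
    p_exag = 1 - norm_cdf((threshold - exag_e) / sigma)  # win prob if faking
    return reward * (p_exag - p_true) - cost

# Precise AI (low noise): exaggeration almost always pays off -> "rat race"
print(exaggeration_gain(0.5, 1.5, 1.0, sigma=0.1, reward=1.0, cost=0.4))  # > 0
# Noisier AI: the expected gain no longer covers the effort of faking
print(exaggeration_gain(0.5, 1.5, 1.0, sigma=3.0, reward=1.0, cost=0.4))  # < 0
```

With little noise, pushing the reading past the threshold is nearly guaranteed to be rewarded, so exaggeration is rational; as noise grows, the probability bump from faking shrinks below its cost, matching the "weaker AI can be better" finding.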

While suggesting emotion AI could also assist in screening job candidates or monitoring employees, Yu emphasised the necessity of retaining a human element in all applications.

“AI has made remarkable strides in reasoning and problem-solving,” Yu noted. “But its ability to understand and respond to human emotions is still in its early stages.”
