AI customer service.
Photo credit: theFreesheet/Google ImageFX

Artificial intelligence systems designed to detect human emotions – sometimes called “emotion AI” – can improve customer service but work most effectively when integrated with human employees, according to a new study from the University of Texas McCombs School of Business. The research suggests that a blended approach helps companies manage complaints effectively while mitigating risks, such as customers “gaming” the system by exaggerating dissatisfaction.

The study, led by Assistant Professor Yifan Yu, analysed how emotion AI could be deployed in customer care scenarios. Using game theory, the researchers modelled interactions among customers, employees, and companies, accounting for factors such as emotional intensity and the cost of resolving complaints.

“Firms can refine how they use AI to ensure fairer, more effective decision-making,” said Yu. “Our study provides a practical framework for businesses to navigate this balance, particularly in customer care, where emotional communication plays a crucial role.”

The analysis concluded that a hybrid human-AI model generally outperforms either AI or human agents alone. Key recommendations include:

  • Enhancing chatbots: Adding emotion detection to basic AI chatbots allows them to “better gauge frustration, confusion, or urgency” and tailor responses or escalate complex issues to human agents more effectively, according to Yu.
  • AI as first responder: Using emotion AI as the initial contact point for angry customers can reduce the emotional burden on human staff and potentially lower employee turnover. Humans can then intervene for situations requiring more nuance.
  • Channel-specific strategies: Human agents may be better suited for handling complaints in public forums like social media due to the need for sensitivity. Emotion AI might be more appropriate for private channels like phone calls.
  • ‘Weaker’ AI can be better: Counterintuitively, the study found that emotion AI with more “noise” (random or irrelevant data hampering perfect emotion recognition) might discourage customers from exaggerating complaints to gain benefits. “When AI is too strong, customers are more likely to game the system by exaggerating their emotions, creating a ‘rat race’ of emotional escalation,” Yu explained. This can lead to wasted resources.
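The intuition behind that last finding can be illustrated with a toy simulation (this is a hedged sketch, not the researchers' actual game-theoretic model: the frustration levels, the 0.1 exaggeration cost, and the 0.8 compensation threshold are all invented for illustration). A customer with moderate frustration may exaggerate at some effort cost; the firm's emotion AI reads the emotional signal plus noise and grants a goodwill credit when the reading crosses a threshold. As the noise grows, the payoff gap between exaggerating and being honest shrinks, weakening the incentive to game the system:

```python
import random

def expected_payoff(exaggerate, noise_sd, trials=100_000, seed=0):
    """Toy model: a customer with true frustration 0.5 may exaggerate
    (signalling 0.9) at a personal effort cost of 0.1. The firm's emotion
    AI observes the signal plus Gaussian noise and grants a goodwill
    credit (payoff 1.0) when the reading exceeds a 0.8 threshold.
    All numbers are illustrative assumptions, not from the study."""
    rng = random.Random(seed)
    signal = 0.9 if exaggerate else 0.5
    cost = 0.1 if exaggerate else 0.0
    wins = sum(signal + rng.gauss(0, noise_sd) > 0.8 for _ in range(trials))
    return wins / trials - cost

for noise_sd in (0.05, 0.5):  # a sharp detector vs a noisy one
    gain = expected_payoff(True, noise_sd) - expected_payoff(False, noise_sd)
    print(f"noise_sd={noise_sd}: payoff gain from exaggerating = {gain:+.2f}")
```

With the sharp detector the exaggerator almost always clears the threshold while the honest customer almost never does, so exaggeration pays handsomely; with the noisy detector both readings scatter across the threshold and the advantage largely evaporates, mirroring the "rat race" argument above.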

While suggesting emotion AI could also assist in screening job candidates or monitoring employees, Yu emphasised the necessity of retaining a human element in all applications.

“AI has made remarkable strides in reasoning and problem-solving,” Yu noted. “But its ability to understand and respond to human emotions is still in its early stages.”
