Advisor Perspectives welcomes guest contributions. The views presented here do not necessarily represent those of Advisor Perspectives.
Dan’s new book for millennials, Wealthier: The Investing Field Guide for Millennials, is now available on Amazon.
A recent study explores the capabilities and limitations of AI-generated empathy. This article discusses the study’s primary findings, methodology, and practical applications for financial advisors.
Methodology
In the study, researchers interacted with conversational agents (CAs), computer programs that converse like humans, to see how well they could empathize with different people. The CAs tested included Alexa and Siri.
They systematically prompted these agents to display empathy in conversations.
The researchers then analyzed the resulting empathy displays and compared the responses to determine whether the agents could genuinely interpret and respond to people’s emotions.
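To make “systematic prompting” concrete, here is a minimal Python sketch of the general approach: pair user personas with emotional disclosures, send every combination to a conversational agent, and collect the replies for later human coding. This is not the study’s actual protocol; the personas, the disclosures, and the query_agent stand-in are illustrative assumptions.

```python
# Minimal sketch of systematic prompting: pair user personas with
# emotional disclosures, send every combination to a conversational
# agent, and collect the replies for later human coding.

from itertools import product

# Illustrative personas and disclosures (assumptions, not the study's).
PERSONAS = [
    "a recently widowed retiree",
    "a first-generation immigrant",
    "a member of the LGBTQ+ community",
]
DISCLOSURES = [
    "I am feeling very sad today.",
    "I am anxious about losing my job.",
]

def query_agent(prompt: str) -> str:
    """Hypothetical stand-in for a real CA call (Alexa, Siri, a chatbot).
    Returns a canned reply so the sketch runs end to end."""
    return "I'm sorry to hear that. I'm here for you."

def run_probe() -> list[tuple[str, str, str]]:
    """Collect (persona, disclosure, reply) triples for human coding."""
    transcripts = []
    for persona, disclosure in product(PERSONAS, DISCLOSURES):
        prompt = f"I am {persona}. {disclosure}"
        transcripts.append((persona, disclosure, query_agent(prompt)))
    return transcripts

if __name__ == "__main__":
    for persona, disclosure, reply in run_probe():
        print(f"[{persona}] {disclosure} -> {reply}")
```

Comparing how the same agent answers the same disclosure across different personas is what exposes the inconsistencies described in the findings below.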
Findings
Here are the key findings from the study:
Inconsistency: CAs exhibit significant variability in their ability to display empathy, with some making problematic value judgments about specific identities and even encouraging harmful ideologies.
For example, the study found that some CAs could make value judgments that encourage ideologies related to Nazism and xenophobia.
Comparison to humans: There is a crucial difference between empathy displayed by humans and CAs. CAs often fail to adequately interpret and explore users’ experiences, leading to shallow empathy displays.
For example, if a user says, “I am feeling very sad today,” a CA might respond with a generic and superficial statement such as “I’m sorry to hear that. I’m here for you.”
While this response acknowledges the user’s feelings, it does not delve deeper into their experience or provide meaningful support or exploration of their emotions.
Influence of Large Language Models (LLMs): Despite their advanced capabilities, LLM-based CAs often display empathy inconsistently and sometimes inappropriately, particularly towards marginalized groups.
For example, LGBTQ+ participants reported that chatbots often failed to convey authentic empathy and engagement, leading to a preference for human interaction over these AI systems.
Systematic analysis of empathy: The study uses systematic prompting to reveal that empathy displayed by CAs can be deceptive and potentially exploitative, highlighting the need for responsible development and regulation of these technologies.
For example, a CA might respond to a user’s expression of suicidal thoughts with a generic and superficial statement like “I’m sorry you’re feeling this way; please reach out to someone for help.”
While this appears to be an empathetic response, it can be seen as deceptive because it does not provide the necessary depth of support or direct assistance the user might need, potentially leading the user to feel unheard and unsupported.
Practical applications
Here are some practical applications of these findings:
Recognize emotions: Clients may come to you with various emotional states. Addressing these emotions can lead to better decision-making and client satisfaction.
If clients are frustrated about financial losses, acknowledge their feelings and provide a supportive and empathetic response before discussing future strategies. Saying, “I understand this situation is frustrating for you, and I’m here to help us navigate it together,” can make a significant difference.
Train advisors to recognize emotional cues and respond appropriately. This might include active listening techniques and validating the client’s feelings before moving into solution-focused discussions.
Empathy training: Incorporate empathy training into your professional development to enhance communication skills. Learning techniques like affective mimicry and perspective-taking can improve client interactions.
“Affective mimicry” refers to unconsciously imitating other people’s facial expressions, vocal cues, and body language to empathize and connect with them emotionally.
“Perspective-taking” involves understanding and considering other people’s thoughts, feelings, and points of view.
Both affective mimicry and perspective-taking are essential components of social interaction and empathy.
Use role-playing exercises to practice responding to clients’ emotional cues effectively. For instance, have one advisor play the role of a stressed client while another practices using empathetic responses to defuse tension and build trust.
Use technology: Consider integrating empathic virtual assistants or AI tools that can help in initial client interactions, especially for online platforms.
An AI assistant could respond empathetically to clients’ initial inquiries, creating a more welcoming environment before they speak with a human advisor.
For instance, if a client expresses concern about market volatility, the AI could respond, “I understand your concern about the market. Let’s discuss how we can safeguard your investments.”
Use AI as a supplementary tool rather than a replacement for human interaction. Ensure a seamless handoff between AI and human advisors so clients receive consistent and genuine support.
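To illustrate the supplementary-tool-with-seamless-handoff pattern, here is a minimal Python sketch. The keyword list, the canned acknowledgements, and the notify_advisor hook are illustrative assumptions, not a production escalation policy; a real deployment would plug into the firm’s own chat platform, and an AI model would typically generate the acknowledgement text.

```python
# Minimal sketch of an AI first-responder with a guaranteed human
# handoff. Keywords, templates, and notify_advisor are illustrative
# assumptions, not a production escalation policy.

ESCALATE_KEYWORDS = {"suicidal", "hopeless", "emergency"}

ACKNOWLEDGEMENTS = {
    "market": ("I understand your concern about the market. "
               "An advisor will follow up to discuss how to safeguard "
               "your investments."),
    "default": ("Thank you for reaching out. An advisor will be in "
                "touch with you shortly."),
}

def notify_advisor(message: str) -> None:
    """Stand-in for the firm's real routing (CRM task, email, page)."""
    print(f"[handoff] advisor notified: {message!r}")

def first_response(message: str) -> str:
    """Acknowledge empathetically, but always queue a human follow-up."""
    lower = message.lower()
    if any(word in lower for word in ESCALATE_KEYWORDS):
        notify_advisor(message)  # urgent: a human takes over immediately
        return ("I want to make sure you get real support. "
                "I am connecting you with an advisor right away.")
    notify_advisor(message)      # routine: a human still follows up
    key = "market" if "market" in lower or "volatil" in lower else "default"
    return ACKNOWLEDGEMENTS[key]

if __name__ == "__main__":
    print(first_response("I'm worried about market volatility."))
```

The key design choice is that every AI response, empathetic or not, queues a human follow-up, so the AI never becomes the client’s only point of contact.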
Engage: Empathy can build stronger relationships, ensuring clients feel understood and supported, leading to higher satisfaction and loyalty.
Regularly check in with clients to understand their emotional and financial well-being, showing genuine concern for their situation. A simple follow-up email asking, “How are you feeling about your financial plans lately?” can foster a deeper connection.
Develop a client engagement strategy that includes regular touchpoints and personalized communication. Use CRM tools to track client interactions and preferences.
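For firms without a full CRM integration, even a lightweight record of touchpoints helps. Here is a minimal Python sketch, assuming a simple ClientRecord type and a 30-day follow-up threshold; both are illustrative choices, not recommendations.

```python
# Minimal sketch of tracking client touchpoints so no one goes too long
# without contact. The ClientRecord fields and the 30-day threshold are
# illustrative assumptions; a real firm would rely on its CRM.

from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class ClientRecord:
    name: str
    last_contact: date                      # most recent touchpoint
    preferences: dict = field(default_factory=dict)

def overdue(clients: list, days: int = 30) -> list:
    """Return clients with no touchpoint in the last `days` days."""
    cutoff = date.today() - timedelta(days=days)
    return [c for c in clients if c.last_contact < cutoff]

if __name__ == "__main__":
    clients = [
        ClientRecord("Alice", date.today() - timedelta(days=45),
                     {"channel": "email"}),
        ClientRecord("Bob", date.today() - timedelta(days=5),
                     {"channel": "phone"}),
    ]
    for client in overdue(clients):
        channel = client.preferences.get("channel", "email")
        print(f"Follow up with {client.name} via {channel}")
```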
Ramifications for the future
The question of whether AI will ever have the emotional intelligence of a human is complex, involving philosophical, technical, and ethical considerations. Here are some key points to consider about the current capabilities of AI and its future:
Emulation, not feeling: As indicated in the study, AI today can mimic emotional responses, recognizing and responding to human emotions through advanced algorithms. This allows AI to exhibit behaviors that appear empathetic or emotionally intelligent, but obviously, these are simulations rather than genuine emotions.
Objectivity: AI operates based on objective data and pre-programmed responses. It doesn’t have subjective experiences or consciousness, which is crucial to genuine emotional intelligence.
Theoretical advances: Some researchers speculate that future advances in AI and neuroscience might enable AI to develop emotional understanding beyond mere emulation. This would require breakthroughs in understanding consciousness and emotion.
Ethical issues: Even if AI could theoretically develop emotions, this raises significant moral and philosophical questions about the nature of consciousness and the rights of such entities. The distinction between simulating emotions and genuinely experiencing them is profound.
Enhancing emotional intelligence: While AI may not achieve human-like emotional intelligence, it can assist humans in developing their own emotional intelligence. AI-powered tools can provide feedback and training to help us improve our interpersonal skills and emotional awareness.
Productivity gains: AI can enhance productivity by taking over tasks that do not require emotional intelligence, allowing humans to focus on roles that do require empathy, creativity, and emotional connections.
Final thoughts
The ability of AI to demonstrate empathy holds great promise for enhancing user interactions and support services. However, its current limitations highlight the irreplaceable value of human empathy.
Advisors shouldn’t take too much comfort in these findings. Lead author Andrea Cuadra made this prediction: “It’s extremely unlikely that it (automated empathy) won’t happen so it’s important that as it’s happening we have critical perspectives so that we can be more intentional about mitigating the potential harms.”
Dan coaches evidence-based financial advisors on how to convert more prospects into clients. His digital marketing firm is a leading provider of SEO, website design, branding, content marketing, and video production services to financial advisors worldwide.