What's Missing in Conversations about Artificial Empathy?

Is your company using AI? Considering it? Looking into Artificial Empathy? Before you go any further, consider this:

A Google search for “artificial empathy” (AE) returns over 8 million results. Clearly it’s a hot topic today, in business, healthcare, agriculture, and just about every other sector. Wikipedia defines AE as “the development of AI systems − such as companion robots − that are able to detect and respond to human emotions.” As an executive business professional, I find it a fascinating concept: making technology more human by teaching it about feelings. However, I am also a practicing clinician, a licensed mental health therapist, and the struggle comes when I put my therapist hat on and think about the deep, immersive, complicated world of human emotions. My challenge is that in ALL the material I read in this space, there is ALMOST NEVER any mention, quote, contribution, or insight from a psychologist or therapist. Granted, I haven’t read all 8 million Google hits, but in the many articles I do read, I see engineers, data scientists, researchers, programmers, executives, linguists, and more…but rarely a psychologist or therapist.

Why does that matter? I understand that human emotions are produced by complex transactions in the human body as we process an event, register the experience, create a story, and birth emotions from that sequence. Deeply understanding human emotions and behaviors requires extensive training, and exercising that muscle regularly is one reason I maintain a small clinical practice; without it, like any muscle, my strength would weaken. I also understand that the closest machines and programming can get to mimicking this, for now at least, is by being taught markers they can “measure” (frown, smile, facial muscle movement, physical response metrics, etc.). But not having a clinician in these conversations is like constructing a building without talking to the civil engineer: you’re missing information relevant to the foundation of the work you’re doing.

So how would this work? I’m certain the AI world is far more complicated than I understand, and perhaps what I’m about to say is already being done, but again, it’s never mentioned in what I see. So here goes. Consider this:

A company is investing in AI technology for its marketing efforts. It is working with a team that is attempting to teach the AI platform to understand consumer feelings and preferences, then translate those into purchasing potential. Though every member of the team mentioned above can be necessary to this process, I believe the clinician is the only professional who lives fully enough in the marriage of words, phrases, context, body language, facial expressions, cultural considerations, nature, nurture, and environment as a whole to provide feedback on how effective the platform will be toward the desired outcome in real-world application. This is perhaps the human, “un-science” part of all this science, the absence of which makes AI and artificial empathy continue to feel just that: artificial.

As the AI platform is designed, the engineers make sure it works properly, the data scientists analyze the information gathered, and the linguists make their assessments; the clinician is then able to offer more “practical” input on how the complex human engaging with the platform is receiving and responding to what’s happening.

I believe this integration is a critical piece of weaving AI more seamlessly into everyday application.

ENGAGE:  Want to discuss how to make your AI efforts more “human”? Click here to contact me and let’s get started. 

GO FURTHER:  I’d love to hear your thoughts on my perspective. Feel free to comment below or send me a message.

EMOTION DRIVES ACTION!