Using EQ to personalize AI

Are you excited about all the advancements with AI? Curious how we can continue to personalize its integration into our daily lives?  Read on…

The evolution of artificial intelligence (AI) in healthcare is all the buzz right now, with everyone wondering how it will impact care, cost and revenue, staffing needs, and more. Interest in AI is exploding, with Accenture forecasting that healthcare AI will grow into a $6.6 billion market within a few short years, at a 40% compound annual growth rate.

A recent techadvisory.com article discussed how "AI is changing the face of healthcare." One aspect of this change is the integration of virtual personal health assistants such as Siri, Cortana, and Alexa, and within that, a push toward more personalized interactions with consumers. For example, Libertana Home Health is using Alexa on the Amazon Echo Dot to help elderly clients live independently, offering cognitive games that make them think and laugh, simple tools like timers, and reminders to take medications at specified times.

This is fantastic. What a way to use AI to improve quality of life! But one challenge with this model is that the technology still tends to feel "algorithmic" rather than "personal." To be truly personal, it would need to know details such as which events have been emotionally significant to an individual.

For example, imagine this: when consumers first purchase a virtual personal assistant (VPA), part of the setup includes building a profile. Over time, the profile would become progressively more intimate, based on the user's comfort level. The user could enter dates such as the anniversary of the loss of a loved one, and the VPA would then ask, "How can I support you on this date?" Options would include sending a supportive message, checking in with you, or setting a reminder to call an encouraging friend.
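To make the idea concrete, here is a minimal sketch in Python of how such a profile entry and follow-up question might work. All names, fields, and the schema itself are hypothetical illustrations of the concept, not a description of any existing VPA API.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical schema for one emotionally significant date in a user's profile.
@dataclass
class SignificantDate:
    label: str                 # e.g. "Anniversary of a loss"
    on: date                   # the date to remember each year
    support_action: str = ""   # "supportive_message", "check_in", or "remind_to_call_friend"

@dataclass
class EmotionalProfile:
    user_name: str
    dates: list[SignificantDate] = field(default_factory=list)

    def add_date(self, entry: SignificantDate) -> str:
        """Store the entry and return the VPA's follow-up question."""
        self.dates.append(entry)
        return f"How can I support you on {entry.on:%B %d}?"

    def todays_support(self, today: date) -> list[str]:
        """Return the support actions scheduled for today's month and day."""
        return [
            d.support_action
            for d in self.dates
            if (d.on.month, d.on.day) == (today.month, today.day) and d.support_action
        ]

# Example: the user records an anniversary, then chooses a check-in as the support option.
profile = EmotionalProfile(user_name="Jamie")
print(profile.add_date(SignificantDate("Anniversary of a loss", date(2015, 4, 12))))
profile.dates[-1].support_action = "check_in"
print(profile.todays_support(date(2024, 4, 12)))  # -> ['check_in']
```

The key design point is that the profile only grows as the user volunteers more, which keeps the pacing of intimacy in the user's hands.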

So as not to feel intrusive, with each entry the VPA would ask, "Would you like to tell me more?" If the user says yes, the VPA would proceed to deeper questions, much as one might in building a personal relationship: "What was this like for you? How were you feeling?" The responses could then be tied back to the remarkable work being done in natural language processing for mental health, helping the VPA formulate appropriate, engaging replies. This personalization could be taken much further over time (mood trend tracking and analysis, learning how the user reacts to the VPA's responses through voice analysis, etc.).
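The mood-trend idea could start as simply as comparing a short-term average of self-reported mood against a longer baseline. Below is a minimal sketch, assuming a hypothetical daily 1-10 self-report collected by the VPA; the class name, window sizes, and thresholds are invented for illustration, and a real product would lean on the clinical NLP work mentioned above rather than a simple average.

```python
from collections import deque
from datetime import date
from statistics import mean

# Hypothetical mood log: the VPA records a self-reported mood score (1-10) each day
# and compares a short-term rolling average with a longer baseline to spot a downward trend.
class MoodTrendTracker:
    def __init__(self, short_window: int = 7, long_window: int = 30):
        self.short_window = short_window
        self.scores: deque[tuple[date, int]] = deque(maxlen=long_window)

    def log(self, day: date, score: int) -> None:
        self.scores.append((day, score))

    def trend(self) -> str:
        """Compare the recent average against the longer baseline."""
        if len(self.scores) < self.short_window:
            return "not enough data"
        all_scores = [s for _, s in self.scores]
        recent = mean(all_scores[-self.short_window:])
        baseline = mean(all_scores)
        if recent < baseline - 1.0:
            return "mood trending down -- consider a gentler, more supportive tone"
        if recent > baseline + 1.0:
            return "mood trending up"
        return "mood stable"

# Example usage: a gradual decline over twelve days triggers the downward-trend message.
tracker = MoodTrendTracker()
for i, score in enumerate([8, 8, 7, 7, 7, 6, 5, 4, 4, 3, 3, 2]):
    tracker.log(date(2024, 1, 1 + i), score)
print(tracker.trend())
```

Even this crude signal could shape how the VPA responds, long before any deeper voice or language analysis is layered in.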

So much opportunity lies ahead for how we as humans continue to build our relationship with AI. Exciting times ahead!

ENGAGE:  Looking for a way to inject some EQ into your AI product? Click here to contact me and let’s get started. 

GO FURTHER:  I’d love to hear your thoughts. Feel free to comment below or send me a message.

EMOTION DRIVES ACTION!