AI Detects Human Emotions

Emotion AI: How Emotionally Intelligent Computers Can Benefit Society

Rana el Kaliouby is Deputy CEO of Smart Eye and co-founder of Affectiva, a technology company developing emotion artificial intelligence (AI) software that enables machines to detect human emotions and complex cognitive states. With AI developing fast and offering many potential applications in communications technology and connected devices, we spoke with Rana at the Arch Summit in Luxembourg to learn how emotion AI could change the way people learn, work and monitor their health.

“In the future, our technology is going to interact with us in the way we interact with one another,” says Rana el Kaliouby. “It’s going to be perceptive, empathetic, emotional and social. And there are so many applications for that.”

Technology: a black hole for emotions

When Rana el Kaliouby moved from Cairo to study at the University of Cambridge in 2000, working to build emotionally intelligent machines as part of a PhD in computer vision and machine learning, her husband remained in Egypt.

They would keep in touch through instant messaging platforms, as international calls were too expensive and video conferencing wasn't readily available. Rana, alone with her laptop, was homesick. And her laptop had no clue.

“It dawned on me how completely insensitive computers are to our emotional state. I was messaging home, but there was no way for me to genuinely transmit how I felt. Technology was a black hole for emotions – all the richness of nonverbal communication that we as humans use to build empathy, love, loyalty and trust was gone. That sent me on a journey to bring emotional intelligence into our machines.”

The commercial opportunity for emotionally intelligent devices

After Cambridge, Rana took her ideas to the Massachusetts Institute of Technology (MIT), where she joined Dr Rosalind Picard’s lab.

Together, they worked to develop emotion AI for different applications, initially focusing on how the technology could be used to help children with autism better understand emotional expression.  

At an MIT event for corporate sponsors, Rana noted an increasing commercial interest in her work. Realising she had a unique opportunity to take something she cared about and deliver it at scale, she left MIT and partnered with Rosalind to set up Affectiva.

One of their early projects involved a partnership with Brain Power, a tech startup which embedded Affectiva’s technology into smart glasses as part of a research programme for children with autism and developmental disabilities.

After 12 years, and more than US$51 million raised across funding rounds, their company was acquired by Smart Eye, where Rana is now Deputy CEO.

Applications for health and wellness

The Covid-19 pandemic accelerated interest in emotion AI, as many businesses and schools shifted to remote work and online learning. The growing reliance on video conferencing platforms like Microsoft Teams and Zoom brought a new set of challenges.

“Certain issues quickly rose to the top. We saw a drop off in learner and student engagement, and concerns with mental health. Through emotion AI, we can detect the facial and vocal biomarkers of stress, anxiety, depression, and even suicidal intent. The data is there, it’s just a case of deploying that technology at scale. But imagine the potential, with us all spending so much time on our devices,” Rana says.

Rana also sits on the board of an early-stage AI start-up, Videra Health, which operates in the remote patient monitoring space.

Videra provides an app that enables patients with depression, for example, to log in and upload a self-recorded video. Videra then uses algorithms to quantify the patients’ levels of stress and anxiety.

“The problem,” Rana says, “is that not everybody has this app. There is a need to partner with hospitals, clinicians and therapists so the technology can scale and support as many patients as possible.”

How technology can intersect with health and wellness continues to be a key focus area for Rana.

Having spent most of her career exploring how AI can sense emotion expressed physically, she is turning inward – how can sensors and AI work together to track internal health indicators, such as gut health or microbiome levels, hydration, or even ageing clocks?

“Sensors give us the data, AI can help us analyse it and personalise interventions for different people, like specific diets or exercise programmes. I think this intersection of AI and biology is the next area where AI can have a lot of impact.”

Source: Vodafone media announcement
