Part one: What is emotional AI and how is it embracing computing?


How affective computing and emotional AI are changing society


"…there are lots of facets to how we signal emotion; body language captures this" … "multi-level signals can be monitored in real-time" - a talk on Emotional AI by Amir Tamrakar, Sr. Technical Manager at SRI International

Our emotions are intrinsically associated with humanity. They inform cognition and are a vital aspect of human communication. So how can a machine possibly recognize human feelings and itself seem 'emotional'?

Many artistic presentations of the "emotional computer" have had dystopian undertones. The 1973 book Demon Seed, by Dean R. Koontz, presents an emotionally controlling and even angry intelligent computer. The 2013 film Her is a love story between a man and a digital assistant. However, Emotional AI (Artificial Intelligence) has a more utopian back story influenced by biology.


The next step on the evolutionary roadmap of the computer is emotion recognition. Artificial Intelligence is the digital equivalent of a computer dipping a toe into the ocean; add in emotion, and the computer can dive in and swim far from shore.

AI is driving the new discipline of 'Affective Computing', where machines take on human-like feelings to bring a new era in technology to life. Affective Computing is a truly multi-disciplinary approach to computing, combining the skills and knowledge from areas as diverse as engineering, neuroscience, and behavioral psychology.

What is affective computing?

We already have the nascent murmurings of human-machine connectivity in the form of digital assistants. Amazon Echo has initiated 'emotional responses' the world over, some good, some not so good. The ability to interpret biometric data to manage the machine is evidenced in the Toyota Concept-i car. This smart car is based on SRI International's Multimodal Driver Monitoring System (DMS), equipped with technology that uses biometric sensors to monitor the driver's condition and adjust operations based on those inputs.
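To make the idea concrete, here is a minimal sketch of how a driver-monitoring loop of this kind might fuse biometric inputs into a response. The sensor fields, thresholds, and vehicle actions are purely illustrative assumptions, not details of SRI's DMS or the Toyota Concept-i.

```python
# Illustrative sketch only: a simplified driver-monitoring loop in the spirit of a
# multimodal DMS. The sensor interface and thresholds below are hypothetical.
from dataclasses import dataclass

@dataclass
class DriverState:
    heart_rate_bpm: float      # from a hypothetical seat or steering-wheel sensor
    eye_closure_ratio: float   # fraction of time the eyes are closed (0.0-1.0)

def assess_driver(state: DriverState) -> str:
    """Fuse two biometric signals into a coarse alertness label."""
    if state.eye_closure_ratio > 0.4 or state.heart_rate_bpm < 50:
        return "drowsy"
    if state.eye_closure_ratio > 0.2:
        return "distracted"
    return "alert"

def adjust_vehicle(label: str) -> str:
    """Map the assessed state to a (hypothetical) vehicle response."""
    responses = {
        "drowsy": "sound alert and suggest a rest stop",
        "distracted": "tighten lane-keeping assistance",
        "alert": "no change",
    }
    return responses[label]

if __name__ == "__main__":
    sample = DriverState(heart_rate_bpm=48.0, eye_closure_ratio=0.45)
    print(adjust_vehicle(assess_driver(sample)))  # -> sound alert and suggest a rest stop
```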


Affective Computing takes smart to new levels. It is all about emotions. "Affect" is another word for emotion. The discipline, as already mentioned, takes its steer from many areas that deal with computing and human behavior, pulling these together to create highly innovative and game-changing tools.

The concept was first proposed in a beautifully composed seminal paper by Rosalind Picard, published in 1995 and entitled "Affective Computing". The paper is written with an emphasis on the (then early) development of 'wearables'. One of the conclusions of the paper is:

"emotions play a necessary role not only in human creativity and intelligence, but also in rational human thinking and decision-making. Computers that will interact naturally and intelligently with humans need the ability to at least recognize and express affect."

The building blocks of Emotional AI

The emotional computer is in many ways like its human counterpart. Just as the human brain uses the limbic system for emotions, working across multiple connected parts, so too, Affective Computing is built up from fundamental building blocks.

In 2018, IEEE published an article that outlines the three building blocks of emotional AI:

Emotion recognition

This fundamental area of emotional AI spans music, sound, images, video, and text. It primarily works by analyzing acoustic speech, written content, facial expressions, posture and movement, and even brain activity. Early emotion recognition engines include openSMILE, used for audio analysis, and OpenCV, which is used with video content. Current emotional AI-related solutions, such as the End2You toolkit, focus heavily on end-to-end learning.
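As a small illustration of the visual side of such a pipeline, the sketch below uses OpenCV's bundled Haar cascade to locate faces in an image; each face crop would then be handed to a separate emotion classifier, which is assumed here rather than shown.

```python
# A minimal sketch of the visual front end of an emotion-recognition pipeline:
# OpenCV locates faces, and each crop would then go to an emotion classifier
# (not shown; any trained model could be substituted).
import cv2

# Haar cascade shipped with OpenCV; detects frontal faces in a grayscale frame.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def find_face_crops(image_path: str):
    """Return cropped face regions from an image file."""
    frame = cv2.imread(image_path)
    if frame is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [frame[y:y + h, x:x + w] for (x, y, w, h) in faces]

# Usage (assumes a local "driver.jpg" exists):
# crops = find_face_crops("driver.jpg")  # each crop is ready for an emotion model
```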

Emotion generation

These technologies have been around for more than three decades and primarily rely on pre-defined rules rather than data-trained models for their functionality. Text-to-speech systems such as the MARY text-to-speech (MARYtts) engine are the most prevalent examples of these solutions.
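A toy sketch of what "pre-defined rules" can mean in practice is shown below: a fixed table maps an emotion label to prosody adjustments that a text-to-speech engine could apply. The table values and request format are illustrative assumptions, not MARYtts parameters.

```python
# Rule-based emotion generation in miniature: no trained model, just a fixed
# lookup table of prosody settings. Values are illustrative, not engine defaults.
PROSODY_RULES = {
    "happy":   {"pitch_shift_pct": +15, "rate_pct": +10, "volume_pct": +5},
    "sad":     {"pitch_shift_pct": -10, "rate_pct": -20, "volume_pct": -10},
    "angry":   {"pitch_shift_pct": +5,  "rate_pct": +15, "volume_pct": +20},
    "neutral": {"pitch_shift_pct": 0,   "rate_pct": 0,   "volume_pct": 0},
}

def synthesize_request(text: str, emotion: str) -> dict:
    """Build a (hypothetical) synthesis request with emotion-driven prosody."""
    rules = PROSODY_RULES.get(emotion, PROSODY_RULES["neutral"])
    return {"text": text, **rules}

print(synthesize_request("Your package has arrived.", "happy"))
```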

Emotion augmentation

These solutions center on taking human-facing AI engines and adding emotional capabilities to them. The SEMAINE project and ARIA-VALUSPA are examples of AI engines that enable developers to create virtual characters that can sustain interactions with humans for an extended period of time and react appropriately to the user's non-verbal behavior.
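The skeleton below sketches the augmentation idea in miniature: an existing, emotion-unaware dialogue function is wrapped so that it also reacts to a detected non-verbal cue. The cue names and canned reactions are hypothetical and only stand in for the far richer behavior of systems like SEMAINE or ARIA-VALUSPA.

```python
# Illustrative only: augmenting a human-facing agent with reactions to
# non-verbal cues. The cue names and responses are hypothetical.
def base_agent_reply(utterance: str) -> str:
    """Stand-in for the original, emotion-unaware dialogue engine."""
    return f"I heard: {utterance}"

def augmented_reply(utterance: str, nonverbal_cue: str) -> str:
    """Wrap the base agent so it also reacts to a detected non-verbal signal."""
    reactions = {
        "frown": "You seem unhappy. Would you like me to slow down?",
        "smile": "Glad that helps!",
        "gaze_away": "I'll pause until you're ready.",
    }
    reply = base_agent_reply(utterance)
    extra = reactions.get(nonverbal_cue)
    return f"{reply} {extra}" if extra else reply

print(augmented_reply("Show me the report.", "frown"))
```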

Emotional AI and affective computing: where next?

The building blocks of Emotional AI are established, with enabling developments within each of these areas; these provide the bedrock for further development. As we continue to build more accurate systems based on Emotional AI, we will likely see further innovation in areas such as wearables, smart cars, healthcare, and many more.

In Part Two, we will explore further the areas Emotional AI is pushing into, while keeping a watchful eye on the privacy aspects of the technology.

