Story February 21, 2020

Part one: What is emotional AI and how is it embracing computing?

How affective computing and emotional AI are changing society



“…there are lots of facets to how we signal emotion, body language captures this… multi-level signals can be monitored in real-time” — from a talk on Emotional AI by Amir Tamrakar, Sr. Technical Manager, SRI International

Our emotions are intrinsically associated with humanity. They inform cognition and are a vital aspect of human communication. So how can a machine possibly recognize human feelings and itself seem ‘emotional’?

Many artistic presentations of the “emotional computer” have had dystopian undertones. The 1973 novel Demon Seed, by Dean R. Koontz, presents an emotionally controlling and even angry intelligent computer. The 2013 film Her is a love story between a man and a digital assistant. Emotional AI (Artificial Intelligence), however, has a more utopian backstory influenced by biology.


The next step on the evolutionary roadmap of the computer is emotion recognition. Artificial Intelligence provides the digital equivalent of a computer dipping a toe into the ocean; add in emotion, and computers can climb aboard a boat and sail away from shore.

AI is driving the new discipline of ‘Affective Computing’, in which machines take on human-like feelings to bring a new era in technology to life. Affective Computing is a truly multi-disciplinary approach, combining skills and knowledge from areas as diverse as engineering, neuroscience, and behavioral psychology.

What is affective computing?

We already have the nascent murmurings of human-machine connectivity in the form of digital assistants. Amazon Echo has initiated ‘emotional responses’ the world over, some good, some not so good. The ability to interpret biometric data to manage the machine is evidenced in the Toyota Concept-i car. This smart car incorporates SRI International’s Multimodal Driver Monitoring System (DMS), which uses biometric sensors to monitor the driver’s condition and adjust the vehicle’s operations based on those inputs.
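To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of logic such a system might apply: a few biometric readings are folded into an alertness score, which then selects a response. The signal names, weights, thresholds, and actions are assumptions for illustration only, not details of SRI’s actual DMS.

```python
# Illustrative only: combine a few normalized biometric readings into a
# driver-alertness score and pick a vehicle response. All signal names,
# weights, and thresholds are hypothetical, not SRI's DMS.
def alertness_score(eye_closure: float, heart_rate: int, head_nod_rate: float) -> float:
    """Return 0.0 (drowsy) to 1.0 (alert) from normalized sensor inputs."""
    score = 1.0
    score -= 0.5 * eye_closure      # fraction of time the eyes are closed
    score -= 0.3 * head_nod_rate    # head nods per minute, normalized to 0-1
    if heart_rate < 55:             # unusually low heart rate while driving
        score -= 0.2
    return max(0.0, min(1.0, score))

def vehicle_response(score: float) -> str:
    """Map the alertness score to a (hypothetical) vehicle action."""
    if score < 0.3:
        return "suggest a rest stop and tighten lane-keeping assistance"
    if score < 0.6:
        return "issue an audible alert"
    return "no action"

print(vehicle_response(alertness_score(eye_closure=0.4, heart_rate=52, head_nod_rate=0.5)))
```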


Affective Computing takes smart to new levels, and it is all about emotions: “affect” is simply another word for emotion. The discipline, as already mentioned, draws on many areas that deal with computing and human behavior, pulling them together to create highly innovative, game-changing tools.

The concept was first proposed in a beautifully composed seminal paper by Rosalind Picard, published in 1995, entitled “Affective Computing”. The paper is written with an emphasis on the (then early) development of ‘wearables’. One of the conclusions of the paper is:

“emotions play a necessary role not only in human creativity and intelligence, but also in rational human thinking and decision-making. Computers that will interact naturally and intelligently with humans need the ability to at least recognize and express affect.”

The building blocks of Emotional AI

The emotional computer is in many ways like its human counterpart. Just as the human brain relies on the limbic system for emotion, working across multiple connected parts, so too Affective Computing is built up from fundamental building blocks.


In 2018, IEEE published an article that outlines the three building blocks of emotional AI:

Emotion recognition

This fundamental area of emotional AI applies to music, sound, images, video, and text. It works primarily by analyzing acoustic speech, written content, facial expressions, posture and movement, and even brain activity. Early emotion recognition engines include openSMILE, used for audio analysis, and OpenCV, used with video content. More recent solutions, such as the End2You toolkit, focus heavily on end-to-end learning.
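As a concrete illustration of the video side of such a pipeline, the sketch below uses OpenCV’s bundled Haar cascade detector to find faces in a frame. The classify_expression function is a hypothetical placeholder for a trained emotion model (for example, one built with a toolkit like End2You); everything else uses standard OpenCV calls.

```python
# Minimal sketch: face detection with OpenCV as the first stage of a
# facial-expression pipeline. The classifier step is a placeholder; a
# real system would pass the cropped face to a trained emotion model.
import cv2

# OpenCV ships with pre-trained Haar cascade face detectors.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_expression(face_pixels):
    """Hypothetical stand-in for a trained emotion classifier."""
    return "neutral"  # placeholder label

def recognize_emotions(frame_bgr):
    """Detect faces in a video frame and label each with an emotion."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        face = frame_bgr[y:y + h, x:x + w]
        results.append(((x, y, w, h), classify_expression(face)))
    return results

if __name__ == "__main__":
    # Usage: grab one frame from a webcam and print detected faces + labels.
    capture = cv2.VideoCapture(0)
    ok, frame = capture.read()
    capture.release()
    if ok:
        print(recognize_emotions(frame))
```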

Emotion generation

These technologies have been around for more than three decades and rely primarily on pre-defined rules rather than trained, data-driven models. Text-to-speech systems such as the MARY text-to-speech (MARYtts) engine are the most prevalent examples.
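The rule-based flavor of these systems can be illustrated with a small sketch: a fixed table maps an emotion label to prosody adjustments (pitch, rate, volume), with no learned model involved. The parameter names and values below are illustrative assumptions, not the MARYtts API.

```python
# Minimal sketch of rule-based emotion generation: a fixed table of
# prosody adjustments per emotion, in the spirit of early text-to-speech
# systems. Names and values are illustrative only, not the MARYtts API.
from dataclasses import dataclass

@dataclass
class Prosody:
    pitch_shift: float  # relative pitch change
    rate: float         # speaking-rate multiplier
    volume: float       # loudness multiplier

# Pre-defined rules rather than a trained model.
EMOTION_RULES = {
    "neutral": Prosody(pitch_shift=0.0, rate=1.0, volume=1.0),
    "happy":   Prosody(pitch_shift=+0.20, rate=1.15, volume=1.1),
    "sad":     Prosody(pitch_shift=-0.15, rate=0.85, volume=0.9),
    "angry":   Prosody(pitch_shift=+0.10, rate=1.10, volume=1.3),
}

def prosody_for(emotion: str) -> Prosody:
    """Look up prosody settings for an emotion, defaulting to neutral."""
    return EMOTION_RULES.get(emotion, EMOTION_RULES["neutral"])

print(prosody_for("happy"))
```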

Emotion augmentation

These solutions center on taking human-facing AI engines and adding emotional capabilities to them. The SEMAINE project and ARIA-VALUSPA are examples of AI engines that let developers create virtual characters that can sustain interactions with humans for an extended period and react appropriately to the user’s non-verbal behavior.
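A highly simplified sketch of the idea follows: an affect-aware layer wrapped around an existing human-facing agent chooses how the virtual character should react, based on the user’s detected non-verbal state. All function names, signals, and thresholds here are illustrative assumptions; projects such as SEMAINE implement far richer versions of this loop.

```python
# Minimal sketch of "emotion augmentation": adapt an agent's reaction to
# the user's detected non-verbal state. Signals and thresholds are
# hypothetical; real systems use trained multimodal models.
REACTIONS = {
    "engaged":    "continue the current topic",
    "confused":   "rephrase the last explanation more simply",
    "frustrated": "acknowledge the difficulty and offer help",
    "bored":      "change topic or ask the user a question",
}

def detect_user_state(nonverbal_signals: dict) -> str:
    """Hypothetical classifier over gaze, facial, and vocal cues."""
    if nonverbal_signals.get("gaze_on_screen", 1.0) < 0.3:
        return "bored"
    if nonverbal_signals.get("brow_furrow", 0.0) > 0.6:
        return "confused"
    return "engaged"

def choose_reaction(nonverbal_signals: dict) -> str:
    """Pick the character's next move from the detected state."""
    state = detect_user_state(nonverbal_signals)
    return REACTIONS.get(state, REACTIONS["engaged"])

print(choose_reaction({"gaze_on_screen": 0.2}))
```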

Emotional AI and affective computing: where next?

The building blocks of Emotional AI are established, and the enabling developments within each of these areas provide the bedrock for further progress. As we continue to build more accurate systems based on Emotional AI, we will likely see further innovation in wearables, smart cars, healthcare, and many other areas.

In Part Two, we will explore further the areas Emotional AI is pushing into, while keeping a watchful eye on the privacy aspects of the technology.
