
What is the difference between emotion detection and emotion recognition?

Recently, we have been doing research on emotion recognition. We all know that most emotion recognition is framed as classification, and there is a large body of work on it. Later I became very interested in the term "emotion detection", but I don't yet understand it deeply, so I wondered: is it something like time-series prediction? What has been done in current research on emotion detection?
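
One concrete way to see the distinction being asked about: emotion recognition is usually posed as static classification, one label per sample, while detection over a live signal emits a prediction per time step, which is why it feels closer to time-series modeling. Below is a minimal sketch of both framings; the data is synthetic and the features are placeholders, so treat it as an illustration rather than a reference implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# --- Emotion *recognition* as static classification ---
# One pooled feature vector per utterance/image, one label per sample.
X = rng.normal(size=(200, 16))       # synthetic stand-in for audio/face features
y = rng.integers(0, 4, size=200)     # four emotion classes
clf = LogisticRegression(max_iter=1000).fit(X, y)
print("one label per sample:", clf.predict(X[:1]))

# --- Emotion *detection* over a stream ---
# A sliding window over a continuous signal yields one prediction per step,
# which is what makes detection resemble time-series prediction.
stream = rng.normal(size=(1000, 16))   # frame-level features over time
window, hop = 50, 25                   # window length and stride, in frames
labels = []
for t in range(0, len(stream) - window, hop):
    pooled = stream[t:t + window].mean(axis=0, keepdims=True)
    labels.append(clf.predict(pooled)[0])
print(f"{len(labels)} per-window labels over the stream")
```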

By jean208 · Published 3 years ago · 5 min read

As our interactions with machines get deeper and deeper, wouldn't you hope for an accurate, even surprising response when chatting with Siri, or for your smart speaker to change the songs it plays to match your mood?

Making machines understand you better is the goal of researchers in the field of affective computing. Laughing when happy, blushing when shy, speaking faster when nervous: these are human emotional expressions, and machines are coming to understand them through collection, recognition and computation.

Emotion recognition is the key technology for machines to understand human emotions. From text, speech, facial expressions and body movements to physiological signals inside the body, researchers are trying to integrate ever more emotional signals to make recognition more accurate and human-computer interaction more natural, smooth and warm.

Deciphering human emotions:

In 1995, Professor Rosalind W. Picard of the MIT Media Laboratory first proposed creating computer systems that can perceive, recognize and understand human emotions, and that respond intelligently, sensitively and in a friendly way by recognizing human emotional signals.

In 1997, Picard's monograph "Affective Computing" was published, in which she defined affective computing as "computing that relates to, arises from, or deliberately influences emotions", opening up a new field of computer science; affective computing dates from this point. In the report "Affective Computing of Artificial Intelligence", jointly released by the Artificial Intelligence Institute of Harvard University and other institutions, affective computing is divided into three modules: the collection of emotional signals; the analysis, modeling and recognition of emotional signals; and research on algorithms for fusing emotional signals. Emotion recognition itself includes sub-modules such as language, facial expression, voice and body movement.
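
Read as an engineering blueprint, those three modules form a pipeline: collect signals, recognize emotion per modality, then fuse. The sketch below is only a schematic rendering of that decomposition; the interfaces, names and scores are invented for illustration and are not taken from the report.

```python
from dataclasses import dataclass

@dataclass
class Signals:                        # module 1 output: raw emotional signals
    text: str
    voice: list
    face: list

def collect() -> Signals:
    """Module 1: gather signals from sensors and logs (stubbed)."""
    return Signals(text="I'm fine.", voice=[0.2, 0.7], face=[0.1, 0.3])

def recognize(sig: Signals) -> dict:
    """Module 2: each modality emits its own emotion scores (stubbed)."""
    return {
        "text":  {"neutral": 0.6, "sad": 0.4},
        "voice": {"neutral": 0.3, "sad": 0.7},
        "face":  {"neutral": 0.8, "sad": 0.2},
    }

def fuse(per_modality: dict) -> str:
    """Module 3: average the per-modality scores and pick the top emotion."""
    total = {}
    for scores in per_modality.values():
        for emotion, p in scores.items():
            total[emotion] = total.get(emotion, 0.0) + p / len(per_modality)
    return max(total, key=total.get)

print(fuse(recognize(collect())))     # -> neutral
```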

Current research focuses on recognition from text, facial expressions and speech. Often, however, human emotions are not externalized in the face or voice, and some researchers argue that relying on facial expressions to recognize human emotions is inaccurate.

In 2019, five experts in the United States published a paper entitled "Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements", pointing out that the way people express anger, sadness, surprise and other emotions varies across cultures and situations, and that even in the same situation, different people express them differently.

"People's emotional performance is not just language, expression and voice. Emotion itself is particularly complex. For example, when I am nervous, my heart rate, blood pressure and body temperature will change. Facial expression, language, voice, physiological signals and other modes can reflect human emotional changes." Li Taihao, assistant director of the cutting-edge theory research center of the Artificial Intelligence Research Institute of Zhijiang laboratory, pointed out.

In 2016, researchers at MIT's Computer Science and Artificial Intelligence Laboratory developed a device called "EQ-Radio" that recognizes a person's emotions through wireless signals. By monitoring subtle changes in breathing and heart rate, the team claims, EQ-Radio can judge whether a person is excited, happy, angry or sad with an accuracy of 87%. Li Taihao pointed out that the accuracy of single-modality emotion recognition is not high, and that multimodal emotion recognition is now the direction researchers find most promising. "Language and external expressions can be deliberately controlled, but physiological signals cannot. It is difficult to judge human emotions accurately from a single modality. We now hope to improve on the accuracy of the original single modality through multimodality, integrating all kinds of perceptible information and signals for emotion recognition."

Multimodal emotion recognition is still at a very early, cutting-edge research stage, Li Taihao said, and there are two major problems to be solved in the field.

The first is perception. "On the one hand, we need breakthroughs in hardware, such as a simple, comfortable wearable device that can accurately capture changes in heart rate, blood pressure, EEG and other physiological signals. On the other hand, some studies are trying to collect physiological signals without contact, for example using machine vision and image processing to obtain temperature, blood pressure and other data. On the whole, though, accuracy remains a problem," Li Taihao explained.
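
For the contactless route Li mentions, one widely studied example is estimating heart rate from subtle skin-color changes in video (remote photoplethysmography). The sketch below illustrates the core idea on a simulated signal; the frame rate, frequency band and signal model are all assumptions, and a real system needs face tracking and far more careful signal processing.

```python
import numpy as np

# Assume we already have the mean green-channel intensity of a face region
# for each video frame; here we simulate that trace instead of capturing it.
fps = 30.0                                   # camera frame rate (assumed)
t = np.arange(0, 20, 1 / fps)                # 20 s of frames
true_hr_hz = 72 / 60.0                       # simulate a 72 bpm pulse
signal = 0.05 * np.sin(2 * np.pi * true_hr_hz * t)           # tiny skin-color change
signal += 0.5 * np.sin(2 * np.pi * 0.25 * t)                 # breathing/motion drift
signal += np.random.default_rng(0).normal(0, 0.02, t.size)   # sensor noise

# Band-limit via FFT: keep only plausible heart-rate frequencies (0.7-4 Hz)
# and read the heart rate off the strongest spectral peak in that band.
freqs = np.fft.rfftfreq(signal.size, d=1 / fps)
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
band = (freqs >= 0.7) & (freqs <= 4.0)
est_hr_bpm = 60.0 * freqs[band][np.argmax(spectrum[band])]
print(f"estimated heart rate: {est_hr_bpm:.1f} bpm")  # ~72 bpm
```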

The second is multimodal fusion. "How to associate so much information, and how to fuse multiple modalities rather than rely too heavily on any single one, is a difficult point," Li Taihao said. "Once the perceptual signals are collected well and fusion algorithms make a breakthrough, machines may recognize emotions more accurately than people do."
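
To make the fusion problem concrete, here is a toy late-fusion rule that averages per-modality scores under capped reliability weights, so that no single (and possibly consciously controllable) modality can dominate the vote. All numbers are invented for illustration; this is not Li's method or any particular published algorithm.

```python
import numpy as np

# Hypothetical per-modality emotion scores (all numbers invented).
# Face and voice can be consciously masked; physiology is harder to fake.
emotions = ["neutral", "happy", "angry", "sad"]
scores = {
    "face":       np.array([0.10, 0.70, 0.15, 0.05]),  # a polite smile
    "voice":      np.array([0.30, 0.30, 0.30, 0.10]),
    "heart_rate": np.array([0.05, 0.05, 0.80, 0.10]),  # elevated arousal
}

# Reliability weights, capped so no single modality can dominate.
weights = {"face": 0.8, "voice": 0.3, "heart_rate": 0.6}
cap = 0.4
w = np.array([min(weights[m], cap) for m in scores])
w = w / w.sum()                       # normalize the capped weights

fused = sum(wi * s for wi, s in zip(w, scores.values()))
print(dict(zip(emotions, fused.round(3))))
print("fused label:", emotions[int(np.argmax(fused))])  # -> angry
```

Here the physiological channel outweighs the smiling face, which is the behavior the cap is meant to allow: a controllable modality cannot single-handedly decide the outcome.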

Letting robots have "emotions":

Once human emotions can be decoded, in which work and life scenarios can this be applied?

In autonomous driving, there are already reported cases. Affectiva, a Boston-based start-up, launched a software product called "Affectiva Automotive AI" aimed at a car's occupants: it monitors the driver's state, recognizing emotions from voice, body language and facial expressions, to help reduce traffic accidents.

The head of the EQ-Radio project mentioned above has likewise predicted that emotion-recognition devices could be used in entertainment, consumer behavior, health care and other fields: film studios and advertisers could test audience reactions in real time with such a system, and a smart home could use your emotional state to adjust the temperature, and so on.

"An advertisement doesn't know the effect before it is put into use. At this time, emotional recognition can play a role. For example, if the audience of an advertisement is young people, the advertiser can select some samples in advance and judge whether the advertisement has achieved the effect through emotional recognition, so as to make corrections." Li Taihao explained. In addition, emotion recognition is useful in the fields of auxiliary diagnosis of mental diseases, education, transportation and so on.

Service robots may be the largest future application scenario for multimodal emotion recognition. In Li Taihao's view, today's robots are task-driven and without warmth, unable to interact emotionally with people. "There are more and more elderly people, and in the future robots will likely help care for them. If robots are just machines, the elderly may feel even lonelier. People are social and need to communicate. If service robots are really to enter everyday use, they must have 'emotion'," he said.

In the Disney animated film "Big Hero 6", the robot Baymax has won over countless viewers: it is not only a personal health assistant but also a friend to the protagonist Hiro, guiding him out of his gloom. With the development of artificial intelligence technologies such as affective computing, perhaps one day in the future robots will no longer be cold instruments, and each of us can have a warm "Baymax" of our own.
