What is the difference between emotion detection and emotion recognition?
Researchers in the field of affective computing are trying to make machines understand you better. Laughing when happy, blushing when shy, and talking fast when nervous are all human expressions of emotion, and machines are learning to read them through collection, recognition and computation. Emotion recognition is the key technology that lets machines understand human emotions. Drawing on text, speech, facial expressions, body movements and physiological signals from inside the body, researchers are trying to integrate more and more emotional signals, to make recognition more accurate and human-computer interaction more natural, smooth and warm.

Deciphering human emotion

In 1995, Professor Rosalind W. Picard of the Massachusetts Institute of Technology Media Lab first proposed building computer systems that can sense, recognize and understand human emotion, and respond intelligently, sensitively and in a friendly way by reading human emotional signals. In 1997, Picard published her book Affective Computing, in which she argued that affective computing is computing that measures and analyzes people's outward emotional expression and can in turn influence emotion, breaking new ground in computer science. This is where affective computing started.

A report on affective computing in artificial intelligence, jointly issued by Tsinghua University's artificial intelligence research institute and other institutions, divides the field into three modules: emotional signal acquisition; emotion analysis, modeling and recognition; and research on algorithms for fusing emotional signals. Emotion recognition itself spans language, facial expression, voice, body movement and many other sub-modules. Current research focuses on text, facial expression and speech recognition.

But much of the time, human emotion is not externalized in faces, words or sounds. Some researchers argue that relying on facial expressions to identify human emotions is simply not accurate. In 2019, five experts in the United States published an article entitled "Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements", which points out that the way people express anger, sadness, surprise and other emotions changes with culture and situation; even in the same situation, different people express themselves in different ways.

"A person's emotional expression is not just a few fragments of language, facial expression and voice. Emotions themselves are very complicated. For example, when I am nervous, my heart rate, blood pressure and body temperature all change. Facial expressions, language, voice and physiological signals are all modalities that reflect human emotion," said Li Taihao, assistant director of the Frontier Theory Research Center at the Artificial Intelligence Research Institute of Zhijiang Laboratory.

In 2016, researchers at MIT's Computer Science and Artificial Intelligence Laboratory developed a device called EQ-Radio that can identify a person's emotions using wireless signals. By monitoring subtle changes in breathing and heart rate, EQ-Radio is claimed to tell with 87 percent accuracy whether a person is excited, happy, angry or sad.

Li Taihao pointed out that the accuracy of single-modality emotion recognition is not high, and that multimodal emotion recognition is the most promising direction for researchers at present. "Language and outward expressions can be controlled, but physiological signals cannot. It is hard to judge human emotion accurately from a single modality.
Now we hope to use multiple modalities to improve on the accuracy of any single one, and to integrate all kinds of perceptual information and signals for emotion recognition."

Multimodal emotion recognition is still at the frontier of research. Li says there are two big problems in this area that need to be solved. The first is perception. "On the one hand, we need breakthroughs in hardware, such as simple and comfortable wearable devices that can accurately capture changes in physiological signals such as heart rate, blood pressure and EEG. On the other hand, other studies are experimenting with contact-free ways of collecting physiological signals, such as using machine vision and image processing to estimate temperature and blood pressure. But overall, accuracy is still a problem," Li Taihao explains.

The second is multimodal fusion. "The difficulty is how all this information relates to each other, and how to integrate multiple modalities rather than rely too heavily on any one of them," Li says. "When perceptual signals are collected well and the algorithms fuse them properly, it is possible that machines will identify emotions more accurately than humans." (A simple code sketch of this fusion idea appears at the end of this article.)

Once human emotions can be decoded, where can the technology be applied in work and life? There are already reported cases in autonomous driving. Affectiva Automotive AI, a system from the Boston-based startup Affectiva, monitors drivers' moods, identifying emotions in their voice, body language and facial expressions to help reduce accidents. The leaders of the EQ-Radio project have also predicted that emotion recognition devices could be used in entertainment, consumer research and health care: movie studios and advertisers could test viewers' reactions in real time, and smart home systems could use your emotional state to adjust the temperature, among other things.

"You don't know the effect of an ad until it has been delivered, and that is where emotion recognition comes into play. For example, if an ad is aimed at young people, an advertiser can select a sample audience in advance and use emotion recognition to judge whether the ad is working," Li Taihao explains. In addition, emotion recognition can assist in diagnosing mental illness and be applied in education, transportation and other fields.

Service robots may be the biggest application scenario for multimodal emotion recognition in the future. In Li Taihao's view, today's robots are task-oriented, without warmth, and unable to interact with people "emotionally". "As there are more and more elderly people, it is very likely that robots will take care of them in the future. If robots are just machines, the elderly will only be lonelier. Human beings are social and need to communicate. For service robots to truly enter real-world use, they must have 'emotions'."

In the Disney animated film Big Hero 6, Baymax is a lovable robot who serves not only as a personal health assistant but also as Hiro's friend, guiding him through his darkest days. With advances in artificial intelligence such as affective computing, robots may one day be a warm Baymax rather than a cold machine.
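As a rough illustration of the multimodal fusion Li describes, here is a minimal late-fusion sketch in Python. It is not code from any system mentioned above: the modality names, emotion labels, probabilities and trust weights are all hypothetical values chosen for the example, and a real system would learn its per-modality recognizers and fusion weights from data.

```python
# Minimal late-fusion sketch (illustrative only; all numbers below are
# hypothetical, not taken from any system described in the article).
from typing import Dict

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse_modalities(scores: Dict[str, Dict[str, float]],
                    weights: Dict[str, float]) -> Dict[str, float]:
    """Combine per-modality emotion probabilities with a weighted average.

    `scores` maps a modality name (e.g. "face", "voice", "heart_rate") to a
    probability distribution over EMOTIONS; `weights` says how much each
    modality is trusted. Both are assumptions made for this illustration.
    """
    fused = {emotion: 0.0 for emotion in EMOTIONS}
    total_weight = sum(weights[m] for m in scores)
    for modality, dist in scores.items():
        w = weights[modality] / total_weight
        for emotion, p in dist.items():
            fused[emotion] += w * p
    return fused

if __name__ == "__main__":
    # Hypothetical outputs of three single-modality recognizers.
    per_modality = {
        "face":       {"happy": 0.55, "sad": 0.10, "angry": 0.05, "neutral": 0.30},
        "voice":      {"happy": 0.40, "sad": 0.20, "angry": 0.10, "neutral": 0.30},
        "heart_rate": {"happy": 0.25, "sad": 0.15, "angry": 0.35, "neutral": 0.25},
    }
    # Physiological signals are harder to fake, so this example weights them higher,
    # echoing Li's point that expressions can be controlled but physiology cannot.
    trust = {"face": 1.0, "voice": 1.0, "heart_rate": 1.5}
    fused = fuse_modalities(per_modality, trust)
    print(max(fused, key=fused.get), fused)
```

This weighted-average scheme is only the simplest form of late fusion, in which each modality is recognized separately and the predictions are merged afterwards; research systems also explore fusing modalities earlier, at the feature level, so the model can learn how the signals relate to one another.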