What is the difference between emotion detection and emotion recognition?
Researchers in the field of affective computing are trying to make machines understand you better. Laughing when you are happy, blushing when you are shy and talking faster when you are nervous are all human expressions of emotion, and machines are learning to understand these emotions through collection, recognition and computation. Emotion recognition is the key technology for machines to understand human emotions. Researchers are trying to integrate ever more emotional signals, from text, speech, facial expressions and body movements to internal physiological signals, to make recognition more accurate and human-computer interaction more natural, smooth and warm.

Deciphering human emotions

In 1995, Professor Rosalind W. Picard of the MIT Media Lab first proposed building computer systems that can perceive, recognize and understand human emotions and respond intelligently, sensitively and in a friendly way by identifying human emotional signals. In 1997, Picard's monograph Affective Computing was published, in which she defined affective computing as computing that relates to, arises from or deliberately influences emotion, opening a new field of computer science. This is where affective computing began.

In a report on affective computing in artificial intelligence jointly issued by the Artificial Intelligence Research Institute of Zhijiang Laboratory, Tsinghua University and other institutions, affective computing is divided into three modules: emotional signal acquisition; emotion analysis, modeling and recognition; and research on emotional signal fusion algorithms. Emotion recognition itself comprises many sub-modules, covering language, facial expressions, voice, body movements and more. At present, most research focuses on text, facial expressions and speech.

But much of the time, human emotions are not externalized in the face or in the sound of words. Some researchers believe that relying on facial expressions alone to identify human emotions is not accurate. In 2019, five U.S. experts published an article titled Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements, which points out that the ways people express emotions such as anger, sadness and surprise change with culture and situation, and that even in the same situation different people use different expressions.

"The expression of human emotion is not just words, expressions and sounds. Emotions themselves are very complex. For example, my heart rate, blood pressure and temperature all change when I am nervous. Facial expressions, language, voice, physiological signals and other modalities can all reflect changes in human emotion," pointed out Li Taihao, assistant director of the Frontier Theory Research Center at the Artificial Intelligence Research Institute of Zhijiang Laboratory.

In 2016, researchers at MIT's Computer Science and Artificial Intelligence Laboratory developed a device called EQ-Radio that can identify emotions via wireless signals. EQ-Radio is claimed to be able to tell with 87 percent accuracy whether a person is excited, happy, angry or sad by monitoring subtle changes in breathing and heart rate.

Li Taihao pointed out that the accuracy of single-modality emotion recognition is not high, and multimodal emotion recognition is now a promising direction for researchers. "Speech and physical expressions can be controlled, but physiological signals cannot. It is difficult to judge human emotion accurately from a single modality, so we now hope to improve on the accuracy of any single modality by integrating all kinds of perceived information and signals for emotion recognition."
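To make the idea of integrating modalities concrete, here is a minimal sketch of one common approach, late fusion, in which each single-modality classifier outputs a probability distribution over emotions and the distributions are combined with weights so that no single modality dominates the final decision. The modality names, emotion labels, weights and numbers below are illustrative assumptions only, not details from Zhijiang Laboratory's work or the EQ-Radio project.

```python
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def late_fusion(modality_probs, weights=None):
    """Combine per-modality emotion probability distributions.

    modality_probs: dict mapping a modality name (e.g. "face", "speech",
        "physiology") to a probability vector over EMOTIONS, as produced
        by a single-modality classifier.
    weights: optional dict of per-modality weights (e.g. confidence
        scores); defaults to equal weighting so no modality dominates.
    """
    if weights is None:
        weights = {m: 1.0 for m in modality_probs}
    fused = np.zeros(len(EMOTIONS))
    total = 0.0
    for modality, probs in modality_probs.items():
        w = weights.get(modality, 1.0)
        fused += w * np.asarray(probs, dtype=float)
        total += w
    fused /= total  # renormalize to a probability distribution
    return EMOTIONS[int(np.argmax(fused))], fused

# Example: the face model is unsure, but speech and physiological cues agree.
label, fused = late_fusion({
    "face":       [0.40, 0.30, 0.20, 0.10],
    "speech":     [0.10, 0.60, 0.20, 0.10],
    "physiology": [0.15, 0.55, 0.20, 0.10],
})
print(label, fused)  # -> "sad", because two of the three modalities agree
```

In this toy example the face model leans toward "happy," but because the speech and physiological cues agree on "sad," the fused decision is "sad." Real systems typically learn the weights from data and often fuse features earlier in the pipeline rather than only at the decision stage.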
Multimodal emotion recognition is still at the frontier of research. Li Taihao says there are two big problems in this area that need to be addressed. The first is perception. "On the one hand, we need breakthroughs in hardware, such as simple and comfortable wearable devices that can accurately capture changes in physiological signals such as heart rate, blood pressure and EEG. On the other hand, some studies are trying to collect physiological signals without contact, for example using machine vision and image processing to obtain body temperature, blood pressure and other data. But overall there is still a problem with accuracy," Li explained. The second is the problem of multimodal fusion. "The difficulty is how all this information relates to each other, and how to integrate multiple modalities without relying too much on any one of them," Li said. "When the perceptual signals are collected better and the fusion algorithms improve, machines may be able to recognize emotions more accurately than humans."

Let robots have "emotions"

Once human emotions can be deciphered, in what work and life scenarios can this be applied? Cases have already been reported in the field of autonomous driving. Affectiva, a Boston-based startup, has developed Automotive AI, an in-cabin sensing software product that can help reduce traffic accidents by monitoring the state of drivers and passengers and identifying emotions from their voice, body language and facial expressions.

Leaders of the EQ-Radio project have also predicted that their emotion-recognition device could be used in entertainment, consumer behavior research and health care. Movie studios and advertisers could use the system to test audience reactions in real time, and smart home systems could use information about your mood to adjust the temperature, for example. "You don't know how an ad will work until it is launched, and that is where emotion recognition comes in. For example, if an ad is aimed at young people, the advertiser can select a sample audience in advance and use emotion recognition to determine whether the ad is effective, and then make corrections," Li Taihao explained. In addition, emotion recognition has been used to assist the diagnosis of mental illness and in education, transportation and other fields.

Service robots may be the biggest application scenario for multimodal emotion recognition in the future. In Li's view, today's robots are task-oriented; they have no warmth and cannot interact "emotionally" with humans. "There are more and more elderly people, and it is very likely that robots will take care of them in the future. If robots are just machines, the elderly may feel even more lonely. People are social and need to communicate, and service robots must have 'emotion' if they are to be used."

In Disney's "Big Hero 6," Baymax is a robot, a personal healthcare companion and a friend to Hiro, guiding him through his darkest days and winning the hearts of millions of viewers. With the development of artificial intelligence technologies such as affective computing, perhaps one day robots will no longer be cold devices, and we too will have our own warm "Baymax."