patents.google.com

CN104091153A - Emotion judgment method applied to chatting robot - Google Patents

  • Wed Oct 08 2014

Emotion judgment method applied to chatting robot

Info

Publication number
CN104091153A
CN104091153A (application CN201410312835.0A) Authority
CN
China
Prior art keywords
chatter
chat
facial features
facial
emotion
Prior art date
2014-07-03
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410312835.0A
Other languages
Chinese (zh)
Inventor
赵展
魏雯
王勤
王栋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Vocational Institute of Industrial Technology
Original Assignee
Suzhou Vocational Institute of Industrial Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2014-07-03
Filing date
2014-07-03
Publication date
2014-10-08
2014-07-03 Application filed by Suzhou Vocational Institute of Industrial Technology
2014-07-03 Priority to CN201410312835.0A
2014-10-08 Publication of CN104091153A
Status: Pending

Links

  • 238000000034 method Methods 0.000 title claims abstract description 18
  • 230000008451 emotion Effects 0.000 title claims abstract description 14
  • 230000014509 gene expression Effects 0.000 claims abstract description 27
  • 230000001815 facial effect Effects 0.000 claims abstract description 26
  • 230000008921 facial expression Effects 0.000 claims abstract description 16
  • 210000004709 eyebrow Anatomy 0.000 claims description 11
  • 230000004886 head movement Effects 0.000 claims description 4
  • 210000001747 pupil Anatomy 0.000 claims description 3
  • 238000005516 engineering process Methods 0.000 abstract description 4
  • 230000006854 communication Effects 0.000 description 3
  • 238000004891 communication Methods 0.000 description 2
  • 230000002996 emotional effect Effects 0.000 description 2
  • 210000000744 eyelid Anatomy 0.000 description 2
  • 238000013473 artificial intelligence Methods 0.000 description 1
  • 230000007812 deficiency Effects 0.000 description 1
  • 210000000887 face Anatomy 0.000 description 1
  • 210000001061 forehead Anatomy 0.000 description 1
  • 210000003128 head Anatomy 0.000 description 1
  • 238000012544 monitoring process Methods 0.000 description 1
  • 230000036651 mood Effects 0.000 description 1
  • 238000006467 substitution reaction Methods 0.000 description 1

Landscapes

  • Manipulator (AREA)

Abstract

The invention provides an emotion judgment method applied to a chat robot. The method uses an image acquisition device that captures digital images of a human face and comprises the following steps: 1) building a facial expression database that stores the facial features of a human face under different expressions; 2) periodically capturing digital images of the chat partner with the image acquisition device; 3) recognizing the chat partner's facial organs in the digital images and measuring the chat partner's facial features; 4) according to the chat partner's facial features, obtaining from the facial expression database the expression corresponding to these positional relationships and taking it as the chat partner's expression. By making full use of face recognition technology and psychological knowledge, the method infers the chat partner's true meaning and adjusts the chat content accordingly, so that the chat partner experiences a conversation that feels close to a real one, greatly improving the realism and accuracy of the chat robot.

Description

Emotion judgment method applied to chatting robot

Technical Field

The invention relates to information technology, and in particular to a method, applied to a chat robot, for recognizing the emotion of the person the robot is talking to.

Background Art

A chat robot is a robot (piece of software) that uses artificial intelligence to recognize the meaning of text and choose reasonable answers, with the goal of simulating a conversation with a real person; a recent example is "Microsoft XiaoIce" (微软小冰) on the WeChat platform, which has attracted intense attention online. At present, chat robot development concentrates on better recognizing the other party's meaning and choosing reasonable answers; although there is a great deal of related research, it is still limited to text communication between the two parties. Real communication, however, is by no means limited to language: it also involves facial expressions, body movements, and so on. The same words, accompanied by different expressions and tones of voice, can express completely opposite meanings. For example, if someone says "You're inviting me to dinner?" with a dismissive expression, it probably signals refusal, whereas a delighted, surprised expression clearly signals acceptance. The direction for chat robots should be to integrate this additional information and achieve a communication process that is as authentic and credible as possible, so that the chat partner has the "illusion" of chatting with a real person.


Summary of the Invention

In view of the above problems, the present invention provides an emotion judgment method for chat robots that is highly realistic and judges emotion accurately.

The present invention provides an emotion judgment method applied to a chat robot, which uses an image acquisition device that captures digital images of a human face and is characterized in that it comprises the following steps:

1) Build a facial expression database that stores the facial features of a human face under different expressions;

2) Periodically capture digital images of the chat partner with the image acquisition device;

3) Recognize the chat partner's facial organs in the digital images and measure the chat partner's facial features;

4) According to the chat partner's facial features, obtain from the facial expression database the expression corresponding to these positional relationships and take it as the chat partner's expression.

Preferably, in step 1) the facial features of the following expressions are stored: neutral, happiness, sadness, anxiety, anger, sorrow, disgust, surprise, and contempt.

Preferably, in step 1) the facial features for the different expressions are stored under the following fields: race, gender, and age.

Preferably, the facial features include: eyebrow direction, distance between the two eyebrows, eye size, pupil size, angle of the two mouth corners, mouth size, shape of the upper and lower lips, and nostril diameter.

Preferably, step 3) further includes recognizing the chat partner's head movements, shoulder movements, and hand movements from the digital images.

Preferably, the method further includes the following step:

5) Derive the chat topic from the chat content and, based on the chat partner's current words and expression, judge the chat partner's true attitude towards the current topic.

Preferably, the method further includes the following step:

6) Adjust the chat strategy according to the chat partner's true attitude towards the current topic obtained in step 5): change the chat topic, revise the robot's view on the topic, or adjust the robot's language style.

To address the shortcomings of the prior art, the present invention provides an emotion judgment method for chat robots that is highly realistic and judges emotion accurately. It makes full use of mature face recognition technology and basic psychological knowledge, continuously inferring the chat partner's true meaning and adjusting the chat content accordingly, so that the chat partner experiences a conversation that feels close to a real one, greatly improving the realism and accuracy of the chat robot. The invention can be used not only for chat robots but also for other robots, security monitoring equipment, and the like.


Brief Description of the Drawings

The present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.

Fig. 1 is a flow chart of the present invention.

Detailed Description of the Embodiments

To help those skilled in the art better understand the solution of the present invention, and to make its purposes, features, and advantages clearer, the present invention is described in further detail below in conjunction with the embodiments and the accompanying drawings.

As shown in Fig. 1, an emotion judgment method applied to a chat robot according to the present invention comprises the following steps:

1) Build a facial expression database that stores the facial features of a human face under different expressions, mainly covering common expressions such as neutral, happiness, sadness, anxiety, anger, sorrow, disgust, surprise, and contempt. Each expression has corresponding facial features, for example:

  • Sadness: squinted eyes, tightened eyebrows, mouth corners pulled down, chin raised or tightened.
  • Fear: open mouth and eyes, raised eyebrows, flared nostrils.
  • Anger: lowered eyebrows, furrowed forehead, tense eyelids and lips.
  • Disgust: a scornful sniff, raised upper lip, lowered eyebrows, squinted eyes.
  • Surprise: dropped jaw, relaxed lips and mouth, widened eyes, slightly raised eyelids and eyebrows.
  • Contempt: most characteristically, one corner of the mouth raised in a sneer or smirk.

By capturing changes in these facial features, the chat partner's current mood can be judged. The stored records are further classified by race, gender, and age, because people of different races, genders, and ages show different facial features when expressing the same emotion; for example, many Caucasians use more exaggerated expressions and larger movements than people of East Asian descent when expressing the same feeling. Chat partners can also be distinguished by how expressive they are: some people wear their feelings openly on their faces, while others show only subtle local changes, so their true state of mind can only be captured from fine cues. Based on the chat history, the robot can classify a chat partner as expressive or not expressive and apply a different judgment strategy accordingly, improving the accuracy of the judgment.
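
By way of illustration only, the expression database of step 1) could be organized as a set of per-expression feature templates keyed by demographic group. The sketch below is a minimal Python layout under that assumption; the patent prescribes no schema, and every name in it (FaceFeatures, ExpressionRecord, ExpressionDB) is hypothetical.

```python
# Hypothetical sketch of the facial expression database from step 1).
# The patent prescribes no schema; this is one possible in-memory layout.
from dataclasses import dataclass

EXPRESSIONS = ["neutral", "happy", "sad", "anxious", "angry",
               "sorrowful", "disgusted", "surprised", "contemptuous"]

@dataclass
class FaceFeatures:
    """Geometric features named in the patent (normalized, arbitrary units)."""
    brow_angle: float          # eyebrow direction
    brow_gap: float            # distance between the two eyebrows
    eye_size: float
    pupil_size: float
    mouth_corner_angle: float  # angle of the two mouth corners
    mouth_size: float
    lip_shape: float           # simplified scalar descriptor of the lip shapes
    nostril_diameter: float

@dataclass
class ExpressionRecord:
    expression: str            # one of EXPRESSIONS
    race: str                  # demographic fields used for classification
    gender: str
    age_group: str
    features: FaceFeatures

class ExpressionDB:
    """In-memory stand-in for the facial expression database."""

    def __init__(self) -> None:
        self.records: list[ExpressionRecord] = []

    def add(self, record: ExpressionRecord) -> None:
        self.records.append(record)

    def candidates(self, race: str, gender: str, age_group: str) -> list[ExpressionRecord]:
        # Restrict matching to the chat partner's demographic group, since the
        # patent stores features separately by race, gender and age.
        return [r for r in self.records
                if (r.race, r.gender, r.age_group) == (race, gender, age_group)]
```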

2) Periodically capture digital images of the chat partner with the image acquisition device. Using a camera or similar equipment, images of the chat partner are captured in real time while the chat is in progress, mainly covering facial expressions, shoulder movements, head movements, and hand movements. Shoulder movements include common gestures such as shrugging; head movements include nodding and shaking the head; hand movements include waving, beckoning, pressing down, raising a hand, and other common gestures.
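
A minimal sketch of the periodic capture in step 2), assuming an ordinary webcam driven through OpenCV; the patent only requires some image acquisition device, so the library choice and the two-second interval are illustrative assumptions.

```python
# Hypothetical sketch of step 2): periodically grabbing frames from a camera.
# The patent only requires "an image acquisition device"; OpenCV and the
# two-second interval are assumptions made purely for illustration.
import time

import cv2

def capture_frames(interval_s: float = 2.0, camera_index: int = 0):
    """Yield one frame every interval_s seconds while the chat is running."""
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if ok:
                yield frame          # BGR image as a numpy array
            time.sleep(interval_s)
    finally:
        cap.release()
```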

3) Recognize the chat partner's facial organs in the digital images and measure the facial features, which include: eyebrow direction, distance between the two eyebrows, eye size, pupil size, angle of the two mouth corners, mouth size, shape of the upper and lower lips, and nostril diameter.
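
The geometric measurements of step 3) could plausibly be derived from facial landmarks. The sketch below assumes a 68-point landmark array in the common dlib-style ordering and covers only a subset of the listed features; the detector itself and the index choices are assumptions, not part of the patent.

```python
# Hypothetical sketch of step 3): deriving the patent's geometric features
# from facial landmarks. A 68-point array in the common dlib-style ordering
# is assumed; the landmark detector itself is out of scope, and pupil size
# and nostril diameter are omitted (they would need finer segmentation).
import numpy as np

def measure_features(lm: np.ndarray) -> dict:
    """lm: array of shape (68, 2) holding (x, y) landmark coordinates."""
    def dist(a: int, b: int) -> float:
        return float(np.linalg.norm(lm[a] - lm[b]))

    scale = dist(36, 45) or 1.0      # inter-ocular distance normalizes for face size

    brow_gap = dist(21, 22) / scale                             # gap between inner brow ends
    eye_size = (dist(37, 41) + dist(43, 47)) / (2 * scale)      # mean vertical eye opening
    mouth_size = dist(48, 54) / scale                           # mouth width
    dx, dy = lm[54] - lm[48]
    mouth_corner_angle = float(np.degrees(np.arctan2(dy, dx)))  # up- vs. down-turned corners
    bx, by = lm[21] - lm[17]
    brow_angle = float(np.degrees(np.arctan2(by, bx)))          # slope of the left eyebrow

    return {
        "brow_angle": brow_angle,
        "brow_gap": brow_gap,
        "eye_size": eye_size,
        "mouth_corner_angle": mouth_corner_angle,
        "mouth_size": mouth_size,
    }
```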

4) According to the chat partner's facial features, obtain from the facial expression database the expression corresponding to these positional relationships and take it as the chat partner's expression.
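
Step 4) amounts to looking up the stored expression whose feature template best matches the measured values. The patent does not prescribe a matching rule, so the sketch below assumes plain nearest-neighbour Euclidean distance over per-expression templates (which could, for instance, be averaged from the records returned by the hypothetical ExpressionDB.candidates above).

```python
# Hypothetical sketch of step 4): nearest-neighbour matching of the measured
# features against per-expression templates. The patent does not prescribe a
# matching rule; plain Euclidean distance is assumed here.
import numpy as np

FEATURE_KEYS = ["brow_angle", "brow_gap", "eye_size",
                "mouth_corner_angle", "mouth_size"]

def classify_expression(measured: dict, templates: dict[str, dict]) -> str:
    """templates maps an expression name to its stored feature dict."""
    x = np.array([measured[k] for k in FEATURE_KEYS])
    best, best_d = "neutral", float("inf")
    for expression, ref in templates.items():
        d = float(np.linalg.norm(x - np.array([ref[k] for k in FEATURE_KEYS])))
        if d < best_d:
            best, best_d = expression, d
    return best
```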

5) Derive the chat topic from the chat content and, based on the chat partner's current words and expression, judge the chat partner's true attitude towards the current topic. For example, when the chat robot states a fact and the chat partner replies "No way?", the expression helps disambiguate the reply: a surprised expression suggests the chat partner has probably accepted the fact, whereas a contemptuous expression shows that the chat partner does not accept it.
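
A hedged sketch of step 5): combining the latest reply text with the recognized expression. Only the patent's own example is encoded (a doubtful reply read as acceptance under surprise and as rejection under contempt); the remaining rules and the judge_attitude name are placeholders.

```python
# Hypothetical sketch of step 5): reading the chat partner's real attitude
# from the reply text plus the recognized expression. Only the patent's own
# example is encoded; the other rules are placeholders.
def judge_attitude(reply: str, expression: str) -> str:
    doubtful = any(p in reply.lower() for p in ("no way", "really", "can't be"))
    if doubtful and expression == "surprised":
        return "accepts"       # surprise suggests the stated fact was taken on board
    if doubtful and expression == "contemptuous":
        return "rejects"       # contempt suggests the stated fact is not believed
    if expression in ("happy", "surprised"):
        return "positive"
    if expression in ("angry", "disgusted", "contemptuous", "sad"):
        return "negative"
    return "neutral"
```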

6) Adjust the chat strategy according to the chat partner's true attitude towards the current topic obtained in step 5): change the chat topic, revise the robot's view on the topic, or adjust the robot's language style to suit the chat partner, so that the chat partner has a relatively pleasant chat experience. After all, a chat robot is only a tool for keeping someone company in conversation, not a real person; there is no need to argue with the chat partner, and leaning somewhat towards the chat partner's views and positions makes it easier for the chat partner to enjoy the conversation. Steps 2) to 6) are repeated throughout the conversation as the chat progresses.
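
Finally, a sketch of how steps 2) to 6) might repeat over the course of a conversation, with the step-6 adjustments (change topic, revise stance, soften style) applied whenever a negative attitude is detected. The helpers are the illustrative sketches above or callables supplied by the caller; none of this structure is dictated by the patent.

```python
# Hypothetical sketch of the repeated loop over steps 2)-6). capture_frames,
# measure_features, classify_expression and judge_attitude are the sketches
# above; detect_landmarks, get_reply, send_message and compose_answer are
# assumed to be supplied by the caller.
def adjust_strategy(attitude: str, state: dict) -> None:
    """Step 6: lean towards the chat partner rather than argue."""
    if attitude in ("rejects", "negative"):
        state["topic"] = "new topic"     # change the chat topic
        state["stance"] = "agreeable"    # revise the robot's view on the topic
        state["style"] = "softer"        # adjust the robot's language style

def chat_loop(db_templates, detect_landmarks, get_reply, send_message, compose_answer):
    state = {"topic": "small talk", "stance": "neutral", "style": "friendly"}
    for frame in capture_frames():                                # step 2
        landmarks = detect_landmarks(frame)                       # step 3
        features = measure_features(landmarks)
        expression = classify_expression(features, db_templates)  # step 4
        attitude = judge_attitude(get_reply(), expression)        # step 5
        adjust_strategy(attitude, state)                          # step 6
        send_message(compose_answer(state))                       # reply in the adjusted style
```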

The above describes only specific embodiments of the present invention. The scope of protection of the present invention is not limited thereto; any variation or substitution that a person skilled in the art could readily conceive within the technical scope disclosed herein falls within the scope of protection of the present invention. The scope of protection of the present invention is therefore defined by the claims.

Claims (7)

1. An emotion judgment method applied to a chat robot, using an image acquisition device that captures digital images of a human face, characterized in that it comprises the following steps: 1) building a facial expression database that stores the facial features of a human face under different expressions; 2) periodically capturing digital images of the chat partner with the image acquisition device; 3) recognizing the chat partner's facial organs in the digital images and measuring the chat partner's facial features; 4) according to the chat partner's facial features, obtaining from the facial expression database the expression corresponding to these positional relationships and taking it as the chat partner's expression.

2. The emotion judgment method applied to a chat robot according to claim 1, characterized in that in step 1) the facial features of the following expressions are stored: neutral, happiness, sadness, anxiety, anger, sorrow, disgust, surprise, and contempt.

3. The emotion judgment method applied to a chat robot according to claim 2, characterized in that in step 1) the facial features for the different expressions are stored under the following fields: race, gender, and age.

4. The emotion judgment method applied to a chat robot according to claim 3, characterized in that the facial features include: eyebrow direction, distance between the two eyebrows, eye size, pupil size, angle of the two mouth corners, mouth size, shape of the upper and lower lips, and nostril diameter.

5. The emotion judgment method applied to a chat robot according to claim 4, characterized in that step 3) further includes recognizing the chat partner's head movements, shoulder movements, and hand movements from the digital images.

6. The emotion judgment method applied to a chat robot according to claim 5, characterized in that it further comprises the following step: 5) deriving the chat topic from the chat content and, based on the chat partner's current words and expression, judging the chat partner's true attitude towards the current topic.

7. The emotion judgment method applied to a chat robot according to claim 6, characterized in that it further comprises the following step: 6) adjusting the chat strategy according to the chat partner's true attitude towards the current topic obtained in step 5): changing the chat topic, revising the robot's view on the topic, and adjusting the robot's language style.

CN201410312835.0A 2014-07-03 2014-07-03 Emotion judgment method applied to chatting robot Pending CN104091153A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410312835.0A CN104091153A (en) 2014-07-03 2014-07-03 Emotion judgment method applied to chatting robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410312835.0A CN104091153A (en) 2014-07-03 2014-07-03 Emotion judgment method applied to chatting robot

Publications (1)

Publication Number Publication Date
CN104091153A true CN104091153A (en) 2014-10-08

Family

ID=51638868

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410312835.0A Pending CN104091153A (en) 2014-07-03 2014-07-03 Emotion judgment method applied to chatting robot

Country Status (1)

Country Link
CN (1) CN104091153A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100189358A1 (en) * 2007-06-18 2010-07-29 Canon Kabushiki Kaisha Facial expression recognition apparatus and method, and image capturing apparatus
CN102117117A (en) * 2010-01-06 2011-07-06 致伸科技股份有限公司 System and method for controlling user gestures by using image capture device
CN103246879A (en) * 2013-05-13 2013-08-14 苏州福丰科技有限公司 Expression-recognition-based intelligent robot system

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104644189A (en) * 2015-03-04 2015-05-27 刘镇江 Analysis method for psychological activities
CN104700094A (en) * 2015-03-31 2015-06-10 江苏久祥汽车电器集团有限公司 Face recognition method and system for intelligent robot
CN107924482A (en) * 2015-06-17 2018-04-17 情感爱思比株式会社 Emotional control system, system and program
CN105049249A (en) * 2015-07-09 2015-11-11 中山大学 Scoring method and system of remote visual conversation services
CN105184058B (en) * 2015-08-17 2018-01-09 安溪县凤城建金产品外观设计服务中心 A kind of secret words robot
CN105184058A (en) * 2015-08-17 2015-12-23 李泉生 Private conversation robot
CN105893771A (en) * 2016-04-15 2016-08-24 北京搜狗科技发展有限公司 Information service method and device and device used for information services
CN107437052A (en) * 2016-05-27 2017-12-05 深圳市珍爱网信息技术有限公司 Blind date satisfaction computational methods and system based on micro- Expression Recognition
WO2017219450A1 (en) * 2016-06-21 2017-12-28 中兴通讯股份有限公司 Information processing method and device, and mobile terminal
CN106372604A (en) * 2016-08-31 2017-02-01 北京光年无限科技有限公司 Intelligent robot negative emotion detection method and system
CN106650621A (en) * 2016-11-18 2017-05-10 广东技术师范学院 Deep learning-based emotion recognition method and system
CN106909896A (en) * 2017-02-17 2017-06-30 竹间智能科技(上海)有限公司 Man-machine interactive system and method for work based on character personality and interpersonal relationships identification
CN106909896B (en) * 2017-02-17 2020-06-30 竹间智能科技(上海)有限公司 Man-machine interaction system based on character personality and interpersonal relationship recognition and working method
CN106985137A (en) * 2017-03-09 2017-07-28 北京光年无限科技有限公司 Multi-modal exchange method and system for intelligent robot
CN106695843A (en) * 2017-03-22 2017-05-24 海南职业技术学院 Interactive robot capable of imitating human facial expressions
US10838967B2 (en) 2017-06-08 2020-11-17 Microsoft Technology Licensing, Llc Emotional intelligence for a conversational chatbot
CN107943974A (en) * 2017-11-28 2018-04-20 合肥工业大学 Consider the automatic session method and system of emotion
CN108009490A (en) * 2017-11-29 2018-05-08 宁波高新区锦众信息科技有限公司 A kind of determination methods of chat robots system based on identification mood and the system
CN108153169A (en) * 2017-12-07 2018-06-12 北京康力优蓝机器人科技有限公司 Guide to visitors mode switching method, system and guide to visitors robot
US10832002B2 (en) 2018-05-08 2020-11-10 International Business Machines Corporation System and method for scoring performance of chatbots
WO2020087534A1 (en) * 2018-11-02 2020-05-07 Microsoft Technology Licensing, Llc Generating response in conversation
CN109598217A (en) * 2018-11-23 2019-04-09 南京亨视通信息技术有限公司 A kind of system that the micro- Expression analysis of human body face is studied and judged
CN110046580A (en) * 2019-04-16 2019-07-23 广州大学 A kind of man-machine interaction method and system based on Emotion identification
CN110121026A (en) * 2019-04-24 2019-08-13 深圳传音控股股份有限公司 Intelligent capture apparatus and its scene generating method based on living things feature recognition
CN111327772B (en) * 2020-02-25 2021-09-17 广州腾讯科技有限公司 Method, device, equipment and storage medium for automatic voice response processing
CN111327772A (en) * 2020-02-25 2020-06-23 广州腾讯科技有限公司 Method, device, equipment and storage medium for automatic voice response processing
CN111696537A (en) * 2020-06-05 2020-09-22 北京搜狗科技发展有限公司 Voice processing method, apparatus and medium
CN111696538A (en) * 2020-06-05 2020-09-22 北京搜狗科技发展有限公司 Voice processing method, apparatus and medium
CN111696536A (en) * 2020-06-05 2020-09-22 北京搜狗科技发展有限公司 Voice processing method, apparatus and medium
CN111696536B (en) * 2020-06-05 2023-10-27 北京搜狗智能科技有限公司 Voice processing method, device and medium
CN111696537B (en) * 2020-06-05 2023-10-31 北京搜狗科技发展有限公司 Voice processing method, device and medium
CN111696538B (en) * 2020-06-05 2023-10-31 北京搜狗科技发展有限公司 Voice processing method, device and medium

Similar Documents

Publication Publication Date Title
CN104091153A (en) 2014-10-08 Emotion judgment method applied to chatting robot
US9031293B2 (en) 2015-05-12 Multi-modal sensor based emotion recognition and emotional interface
CN107765852A (en) 2018-03-06 Multi-modal interaction processing method and system based on visual human
Dubey et al. 2016 Automatic emotion recognition using facial expression: a review
EP3627381A1 (en) 2020-03-25 Method of facial expression generation with data fusion and related device
TWI661363B (en) 2019-06-01 Smart robot and human-computer interaction method
CN106933345B (en) 2020-02-07 Multi-modal interaction method and device for intelligent robot
CN108009490A (en) 2018-05-08 A kind of determination methods of chat robots system based on identification mood and the system
WO2019050881A1 (en) 2019-03-14 Methods and apparatus for silent speech interface
TW201530326A (en) 2015-08-01 Method for selecting music based on face recognition, music selecting system and electronic apparatus
WO2020215590A1 (en) 2020-10-29 Intelligent shooting device and biometric recognition-based scene generation method thereof
Biancardi et al. 2017 Analyzing first impressions of warmth and competence from observable nonverbal cues in expert-novice interactions
Niewiadomski et al. 2015 Automated laughter detection from full-body movements
KR102351008B1 (en) 2022-01-14 Apparatus and method for recognizing emotions
CN109117952A (en) 2019-01-01 A method of the robot emotion cognition based on deep learning
Dupont et al. 2016 Laughter research: a review of the ILHAIRE project
Zhang et al. 2014 Intelligent Facial Action and emotion recognition for humanoid robots
Jain et al. 2021 Study for emotion recognition of different age groups students during online class
Zheng et al. 2019 Babebay-a companion robot for children based on multimodal affective computing
CN113920561A (en) 2022-01-11 Facial expression recognition method and device based on zero sample learning
Savadi et al. 2014 Face based automatic human emotion recognition
KR20220075908A (en) 2022-06-08 Method and apparatus for supporting online virtual classroom learning
KR102285482B1 (en) 2021-08-03 Method and apparatus for providing content based on machine learning analysis of biometric information
CN108960191B (en) 2021-12-14 A robot-oriented multimodal fusion emotional computing method and system
WO2020175969A1 (en) 2020-09-03 Emotion recognition apparatus and emotion recognition method

Legal Events

Date Code Title Description
2014-10-08 C06 Publication
2014-10-08 PB01 Publication
2014-10-29 C10 Entry into substantive examination
2014-10-29 SE01 Entry into force of request for substantive examination
2018-03-27 RJ01 Rejection of invention patent application after publication (application publication date: 20141008)