International Journal of Social Robotics

Can You Read My Face? A Methodological Variation for Assessing Facial Expressions of Robotic Heads


Abstract

Our paper reports on an online study of robot facial expressions. On the one hand, we conducted this study to assess the quality of the current facial expressions of two robot heads. On the other hand, we aimed to develop a simple, easy-to-use methodological variation for evaluating the facial expressions of robotic heads. Short movie clips of two different robot heads showing happy, sad, surprised, and neutral facial expressions were compiled into an online survey to examine how people interpret these expressions. Additionally, we added a control condition with a human face showing the same four emotions. The results showed that the facial expressions of both heads were recognized well. Even the blended emotion surprised was recognized, although it evoked both positive and negative connotations. These results underline the importance of situational context for correctly interpreting emotional facial expressions. Besides the expected finding that the human was perceived as significantly more anthropomorphic and animate than both robot heads, the robot head with a more human-like design was rated significantly higher in anthropomorphism than the robot head using animal-like features. In terms of the validation procedure, we provide evidence for a feasible two-step approach: by first assessing participants' dispositional empathy with a questionnaire, it can be ensured that they are generally able to decode facial expressions into the corresponding emotions; subsequently, robot facial expressions can be validated with a closed-question approach.
