Cognitive Methods in Situation Awareness and Decision Support (CogSIMA), 2012 IEEE International Multi-Disciplinary Conference on

Intraindividual and interindividual multimodal emotion analyses in Human-Machine-Interaction



Abstract

Interactive computer systems today interact nearly effortlessly with humans through menu-driven mouse- and text-based input. For other modalities, such as audio and gesture control, systems still lack flexibility. To respond appropriately, these intelligent systems require specific cues about the user's internal state. Reliable emotion recognition by technical systems is therefore an important issue in computer science and its applications. In order to develop an appropriate methodology for emotion analysis, a multimodal study is introduced here. Audio and video signals as well as biopsychological signals of the user are used to detect intraindividual behavioural prototypes that can serve as predictors of the user's emotional states. Additionally, interindividual differences are considered and discussed. Statistical analyses showed significant results in most cases, with probability values p < 0.05 and effect sizes d > 1.05.
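
The abstract reports its findings as a probability value (p < 0.05) and an effect size (d > 1.05), i.e. Cohen's d. The sketch below is illustrative only and is not the authors' analysis pipeline: using assumed data for a single hypothetical physiological feature measured under two induced emotional states, it shows how such a significance test and effect size could be computed with NumPy and SciPy.

```python
# Illustrative sketch (not the authors' code): computing a p-value from an
# independent-samples t-test and a Cohen's d effect size for one hypothetical
# feature (e.g. mean heart rate) recorded under two experimental conditions.
import numpy as np
from scipy import stats

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Cohen's d using the pooled standard deviation of both samples."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Hypothetical per-trial measurements for two induced emotional states.
rng = np.random.default_rng(0)
neutral = rng.normal(loc=70.0, scale=5.0, size=30)   # e.g. heart rate, neutral condition
stressed = rng.normal(loc=78.0, scale=5.0, size=30)  # e.g. heart rate, stress condition

t_stat, p_value = stats.ttest_ind(stressed, neutral)
d = cohens_d(stressed, neutral)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, Cohen's d = {d:.2f}")
```

With the assumed parameters above, such a comparison would typically yield p well below 0.05 and d above 1.0, matching the scale of effects the abstract describes.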


