IEEE Transactions on Affective Computing

The MatchNMingle Dataset: A Novel Multi-Sensor Resource for the Analysis of Social Interactions and Group Dynamics In-the-Wild During Free-Standing Conversations and Speed Dates



Abstract

We present MatchNMingle, a novel multimodal/multisensor dataset for the analysis of free-standing conversational groups and speed-dates in-the-wild. MatchNMingle leverages wearable devices and overhead cameras to record the social interactions of 92 people during real-life speed-dates, followed by a cocktail party. To our knowledge, MatchNMingle has the largest number of participants, the longest recording time, and the largest set of manual annotations for social actions available in this context in a real-life scenario. It consists of 2 hours of data from wearable acceleration, binary proximity, video, audio, personality surveys, frontal pictures, and speed-date responses. Participants' positions and group formations were manually annotated, as were social actions (e.g., speaking, hand gestures) for 30 minutes at 20 FPS, making it the first dataset to incorporate the annotation of such cues in this context. We present an empirical analysis of the performance of crowdsourcing workers against trained annotators in simple and complex annotation tasks, finding that although crowdsourcing is efficient for simple tasks, using crowdsourcing workers for more complex tasks like social action annotation led to additional overhead and poor inter-annotator agreement compared to trained annotators (differences up to 0.4 in Fleiss' Kappa coefficients). We also provide example experiments of how MatchNMingle can be used.
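The abstract quantifies inter-annotator agreement with Fleiss' Kappa. As background for interpreting that 0.4 difference, here is a minimal sketch of the standard Fleiss' Kappa computation (this is the textbook formula, not code from the paper; the matrix shape and rater counts below are illustrative assumptions):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for an N x k matrix of category counts.

    counts[i][j] = number of raters who assigned item i to category j.
    Every row must sum to the same number of raters n.
    """
    N = len(counts)            # number of annotated items
    n = sum(counts[0])         # raters per item
    k = len(counts[0])         # number of categories

    # Mean observed per-item agreement P_bar.
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N

    # Chance agreement P_e from the marginal category proportions.
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    P_e = sum(pj * pj for pj in p)

    return (P_bar - P_e) / (1 - P_e)


# Illustrative binary annotation (e.g. speaking vs. not speaking),
# 3 raters on 4 items; all raters agree on every item -> kappa = 1.0.
print(fleiss_kappa([[3, 0], [3, 0], [0, 3], [0, 3]]))
```

A drop of 0.4 in this coefficient (e.g., from "substantial" toward "fair" agreement on common interpretation scales) is a large degradation, which is why the authors favor trained annotators for complex social-action labels.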
