Information Processing & Management

Introduction to the special issue on evaluating interactive information retrieval systems



Abstract

Evaluation has always been a strong element of Information Retrieval (IR) research, with much of our focus on how we evaluate IR algorithms. As a research field we have benefited greatly from initiatives such as Cranfield, TREC, CLEF and INEX, which have added to our knowledge of how to create test collections, the reliability of system-based evaluation criteria, and our understanding of how to interpret the results of an algorithmic evaluation. In contrast, evaluations whose main focus is the user experience of searching have not yet reached the same level of maturity. Such evaluations are complex to design and assess because of the larger number of variables a study must incorporate, the lack of standard tools (for example, test collections) and the difficulty of selecting appropriate evaluation criteria.
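The system-oriented tradition the abstract refers to (Cranfield, TREC, CLEF, INEX) rests on computing relevance-based metrics over a test collection of documents, topics and relevance judgements. As a minimal illustrative sketch, not drawn from the editorial itself, the snippet below computes two common system-based criteria, Precision@k and nDCG@k, for a single ranked result list against hypothetical graded judgements (qrels):

```python
import math

def precision_at_k(ranking, qrels, k):
    """Fraction of the top-k retrieved documents judged relevant."""
    top_k = ranking[:k]
    return sum(1 for doc in top_k if qrels.get(doc, 0) > 0) / k

def ndcg_at_k(ranking, qrels, k):
    """Normalised Discounted Cumulative Gain over the top-k results."""
    def dcg(gains):
        return sum(g / math.log2(i + 2) for i, g in enumerate(gains))
    gains = [qrels.get(doc, 0) for doc in ranking[:k]]
    ideal = sorted(qrels.values(), reverse=True)[:k]
    return dcg(gains) / dcg(ideal) if dcg(ideal) > 0 else 0.0

# Hypothetical relevance judgements (doc id -> graded relevance) and a ranked run.
qrels = {"d1": 2, "d3": 1, "d7": 2}
ranking = ["d3", "d5", "d1", "d2", "d7"]

print(precision_at_k(ranking, qrels, 5))  # 0.6 (three of the top five are relevant)
print(ndcg_at_k(ranking, qrels, 5))       # roughly 0.74
```

Interactive evaluations have no equivalent off-the-shelf computation: the variables of interest (task, user, interface, session behaviour) are not captured by relevance judgements alone, which is the gap in maturity the abstract describes.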

