International Conference on Neural Information Processing; Annual Conference of the Asia-Pacific Neural Network Society

Cross-View Image Retrieval - Ground to Aerial Image Retrieval Through Deep Learning



Abstract

Cross-modal retrieval aims to measure the content similarity between different types of data. The idea has previously been applied to visual, text, and speech data. In this paper, we present a novel cross-modal retrieval method specifically for multi-view images, called Cross-view Image Retrieval (CVIR). Our approach aims to find a feature space as well as an embedding space in which samples from street-view images can be compared directly to satellite-view images (and vice versa). For this comparison, a novel deep-metric-learning-based solution, "DeepCVIR", is proposed. Previous cross-view image datasets are deficient in that they (1) lack class information; (2) were originally collected for the cross-view image geolocalization task, with coupled images; and (3) do not include any images from off-street locations. To train, compare, and evaluate the performance of cross-view image retrieval, we present a new six-class cross-view image dataset, termed Cross ViewRet, comprising freeway, mountain, palace, river, ship, and stadium images, with 700 high-resolution dual-view images per class. Results show that the proposed DeepCVIR outperforms conventional matching approaches on the CVIR task for the given dataset and can also serve as a baseline for future research.
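The abstract does not give DeepCVIR's architecture, but the core idea it describes — two view-specific branches projecting ground and aerial images into a shared embedding space, trained with a metric-learning objective so that matching cross-view pairs lie close together — can be sketched minimally. The following is an illustrative sketch only (the branch weights and triplet margin loss below are generic stand-ins, not the paper's actual model):

```python
import numpy as np

def embed(x, W):
    """Project an image feature vector into the shared embedding space
    and L2-normalize it, so street-view and satellite-view samples can
    be compared directly by Euclidean distance."""
    z = W @ x
    return z / np.linalg.norm(z)

def triplet_loss(anchor, positive, negative, margin=0.5):
    """Standard triplet margin loss: pull the matching aerial embedding
    toward the ground-view anchor, push a non-matching one away."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

rng = np.random.default_rng(0)
dim_in, dim_emb = 128, 32

# One projection per view (in practice these would be deep CNN branches).
W_ground = rng.normal(size=(dim_emb, dim_in))  # street-view branch
W_aerial = rng.normal(size=(dim_emb, dim_in))  # satellite-view branch

ground       = embed(rng.normal(size=dim_in), W_ground)  # anchor
aerial_match = embed(rng.normal(size=dim_in), W_aerial)  # same location
aerial_other = embed(rng.normal(size=dim_in), W_aerial)  # different location

loss = triplet_loss(ground, aerial_match, aerial_other)
```

At retrieval time, a ground-view query would be embedded once and ranked against the embeddings of all aerial images by distance in this shared space.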
