IEEE Transactions on Pattern Analysis and Machine Intelligence

Robust Stereo Matching Using Adaptive Normalized Cross-Correlation



Abstract

A majority of existing stereo matching algorithms assume that corresponding color values are similar to each other. In practice, however, this assumption often fails because image color values are affected by various radiometric factors such as illumination direction, illuminant color, and imaging device changes. For this reason, the raw color recorded by a camera cannot be relied on completely, and the assumption of color consistency does not hold between stereo images of real scenes. As a result, the performance of most conventional stereo matching algorithms can be severely degraded under radiometric variations. In this paper, we present a new stereo matching measure that is insensitive to radiometric variations between the left and right images. Unlike most stereo matching measures, our framework uses the color formation model explicitly, and we propose a new measure, called the Adaptive Normalized Cross-Correlation (ANCC), for robust and accurate correspondence. The advantage of our method is that it is robust to changes in lighting geometry, illuminant color, and camera parameters between the left and right images, and, unlike conventional Normalized Cross-Correlation (NCC), it does not suffer from the fattening effect. Experimental results show that our method outperforms other state-of-the-art stereo methods under severely different radiometric conditions between stereo images.
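The abstract contrasts ANCC with conventional NCC, the baseline window-based measure it extends with a color formation model and adaptive weighting. As a point of reference only, the sketch below shows plain NCC used as a stereo matching cost on a rectified grayscale pair; it is not the authors' ANCC, and the window size, disparity range, and function names are illustrative assumptions.

```python
# Minimal sketch of a plain NCC matching cost for a rectified grayscale
# stereo pair. This is the conventional baseline the paper improves on,
# not the ANCC measure itself; window size and disparity range are
# arbitrary illustrative choices.
import numpy as np

def ncc(patch_left: np.ndarray, patch_right: np.ndarray, eps: float = 1e-8) -> float:
    """Normalized cross-correlation between two equally sized patches.

    Values lie in [-1, 1]; NCC is invariant to a global gain/offset change
    per window, which is why it tolerates simple radiometric differences
    but still breaks down under the spatially varying changes ANCC targets.
    """
    a = patch_left.astype(np.float64).ravel()
    b = patch_right.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b) + eps
    return float(np.dot(a, b) / denom)

def ncc_cost_volume(left: np.ndarray, right: np.ndarray,
                    max_disp: int, half: int = 3) -> np.ndarray:
    """Brute-force cost volume: cost[y, x, d] = 1 - NCC of the window at
    (y, x) in the left image and the window at (y, x - d) in the right
    image, so lower is better."""
    h, w = left.shape
    cost = np.ones((h, w, max_disp + 1), dtype=np.float64)
    for y in range(half, h - half):
        for x in range(half, w - half):
            pl = left[y - half:y + half + 1, x - half:x + half + 1]
            for d in range(min(max_disp, x - half) + 1):
                pr = right[y - half:y + half + 1, x - d - half:x - d + half + 1]
                cost[y, x, d] = 1.0 - ncc(pl, pr)
    return cost

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    left = rng.random((40, 60))
    right = np.roll(left, -2, axis=1)      # synthetic pair with 2-pixel disparity
    vol = ncc_cost_volume(left, right, max_disp=5)
    disp = vol.argmin(axis=2)              # winner-take-all disparity map
    print(disp[10:12, 10:14])              # expected: disparity 2 in the interior
```

A fixed square window like this is also what produces the fattening effect near depth discontinuities that the abstract mentions; ANCC's adaptive per-pixel weighting is the paper's remedy for that, alongside its explicit color formation model.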


