Conference: Biomedical Image Registration

Normalized Measures of Mutual Information with General Definitions of Entropy for Multimodal Image Registration



Abstract

Mutual information (MI) was introduced for use in multi-modal image registration over a decade ago [1,2,3,4]. The MI between two images is based on their marginal and joint/conditional entropies. The most common versions of entropy used to compute MI are the Shannon and differential entropies; however, many other definitions of entropy have been proposed as competitors. In this article, we show how to construct normalized versions of MI using any of these definitions of entropy. The resulting similarity measures are analogous to normalized mutual information (NMI), entropy correlation coefficient (ECC), and symmetric uncertainty (SU), which have all been shown to be superior to MI in a variety of situations. We use publicly available CT, PET, and MR brain images with known ground truth transformations to evaluate the performance of the normalized measures for rigid multimodal registration. Results show that for a number of different definitions of entropy, the proposed normalized versions of mutual information provide a statistically significant improvement in target registration error (TRE) over the non-normalized versions.
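For the common Shannon-entropy case, the measures named in the abstract can be computed from a joint intensity histogram of the two images. The sketch below is illustrative only (the bin count and function names are assumptions, not the authors' implementation): MI = H(A) + H(B) - H(A,B), NMI = (H(A) + H(B)) / H(A,B), and ECC = 2·MI / (H(A) + H(B)), which for Shannon entropy coincides with the usual form of symmetric uncertainty.

```python
import numpy as np

def shannon_entropies(a, b, bins=32):
    """Marginal and joint Shannon entropies (in bits) estimated
    from a joint intensity histogram of images a and b."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p_ab = joint / joint.sum()          # joint probability table
    p_a = p_ab.sum(axis=1)              # marginal of image a
    p_b = p_ab.sum(axis=0)              # marginal of image b

    def H(p):
        p = p[p > 0]                    # 0·log 0 is taken as 0
        return -np.sum(p * np.log2(p))

    return H(p_a), H(p_b), H(p_ab.ravel())

def similarity_measures(a, b, bins=32):
    """MI and its Shannon-entropy normalized variants."""
    h_a, h_b, h_ab = shannon_entropies(a, b, bins)
    mi = h_a + h_b - h_ab
    return {
        "MI":  mi,
        "NMI": (h_a + h_b) / h_ab,      # normalized mutual information
        "ECC": 2.0 * mi / (h_a + h_b),  # entropy correlation coefficient
    }
```

A rigid registration would maximize one of these measures over translation and rotation parameters; note that for an image compared with itself, NMI reaches its maximum of 2 and ECC its maximum of 1, which is what makes the normalized forms insensitive to overlap size.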


