BioMed Research International

A Registration Method Based on Contour Point Cloud for 3D Whole-Body PET and CT Images


Abstract

The PET/CT fusion image, combining anatomical and functional information, has important clinical meaning. An effective registration of PET and CT images is the basis of image fusion. This paper presents a multithreaded registration method based on contour point clouds for 3D whole-body PET and CT images. First, a geometric feature-based segmentation (GFS) method and a dynamic threshold denoising (DTD) method are proposed to preprocess the CT and PET images, respectively. Next, a new automated trunk-slice extraction method is presented for extracting feature point clouds. Finally, a multithreaded Iterative Closest Point (ICP) algorithm is adopted to drive an affine transform. We compare our method with a multiresolution registration method based on Mattes Mutual Information on 13 pairs (246-286 slices per pair) of 3D whole-body PET and CT data. Experimental results demonstrate the effectiveness of our method, which achieves a lower negative normalized correlation on feature images (NC = −0.933) and a smaller Euclidean distance error on landmark points (ED = 2.826), outperforming both the source data (NC = −0.496, ED = 25.847) and the compared method (NC = −0.614, ED = 16.085). Moreover, our method is about ten times faster than the compared one.
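
The core step described above is point-cloud registration: ICP alternates between finding closest-point correspondences and re-estimating an affine transform, and the result is then scored by negative normalized correlation (NC) on feature images and Euclidean distance (ED) on landmark points. Below is a minimal NumPy/SciPy sketch of that idea, not the authors' implementation: the GFS/DTD preprocessing, trunk-slice extraction, and multithreading are omitted, and all function names, iteration counts, and tolerances are illustrative assumptions.

```python
# Sketch of affine ICP between two contour point clouds, plus the NC/ED metrics
# named in the abstract. Assumes (N, 3) float arrays of contour points.
import numpy as np
from scipy.spatial import cKDTree

def estimate_affine(src, dst):
    """Least-squares 3D affine (A, t) mapping src points onto dst points."""
    src_h = np.hstack([src, np.ones((len(src), 1))])      # homogeneous coordinates
    params, *_ = np.linalg.lstsq(src_h, dst, rcond=None)  # (4, 3) solution
    return params[:3].T, params[3]                        # A is 3x3, t is length 3

def icp_affine(moving, fixed, iters=50, tol=1e-6):
    """Iterative Closest Point driving an affine transform (moving -> fixed)."""
    tree = cKDTree(fixed)
    A, t = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        warped = moving @ A.T + t
        dist, idx = tree.query(warped)            # closest-point correspondences
        A, t = estimate_affine(moving, fixed[idx])
        err = dist.mean()
        if abs(prev_err - err) < tol:             # stop when alignment stabilizes
            break
        prev_err = err
    return A, t

def negative_nc(a, b):
    """Negative normalized correlation between two feature images (lower is better)."""
    a, b = a.ravel() - a.mean(), b.ravel() - b.mean()
    return -float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def mean_ed(p, q):
    """Mean Euclidean distance between corresponding landmark points."""
    return float(np.linalg.norm(p - q, axis=1).mean())
```

In this sketch the affine estimate is refit from the original moving points at every iteration, using correspondences found with the current warp; composing incremental transforms would be an equally valid design.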
