Future Generation Computer Systems

A unified two-parallel-branch deep neural network for joint gland contour and segmentation learning



Abstract

Existing state-of-the-art gland segmentation methods usually extract different high-level features from shared low-level layers in a deep framework, learning gland segmentation and contour prediction separately and then fusing the results. Such an architecture does not fully respect the complementary relationship between the two tasks, nor the independence of the two kinds of task-specific features, which are meant to depict different parts of gland objects. To address these issues, we propose a new unified, end-to-end trainable deep neural network. It consists of two parallel branches, each of which extracts high-level features from separate low-level feature maps for a specific task under deep supervision. Gland segmentation and contour learning are performed jointly on the combined features of the two branches, while their correlations are explored through feature propagation. Moreover, the proposed architecture better facilitates transfer learning, which alleviates the problem of insufficient training data and eases the learning process by migrating weights from multiple task-specific pre-trained models. Experiments on the benchmark dataset of the 2015 MICCAI Gland Segmentation Challenge show that the proposed method delivers superior performance over state-of-the-art approaches. (C) 2019 Elsevier B.V. All rights reserved.
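The abstract describes the architecture only at a high level; the sketch below is a minimal, hypothetical PyTorch illustration of the two-parallel-branch idea, not the authors' implementation. The channel widths, the `conv_block` helper, concatenation-based fusion, and the auxiliary deep-supervision heads are all assumptions made for the example.

```python
# Hypothetical sketch of a two-parallel-branch network for joint gland
# segmentation and contour learning. Layer sizes and the fusion scheme are
# illustrative assumptions, not the paper's actual design.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with batch norm and ReLU."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class TwoBranchGlandNet(nn.Module):
    """Two parallel branches extract task-specific features from separate
    low-level maps; the combined features drive joint segmentation and
    contour prediction, with deeply supervised auxiliary outputs per branch."""

    def __init__(self, in_ch=3, base=32):
        super().__init__()
        # Separate low-level feature maps, one per branch.
        self.low_seg = conv_block(in_ch, base)
        self.low_ctr = conv_block(in_ch, base)
        # Task-specific high-level branches.
        self.high_seg = conv_block(base, base * 2)
        self.high_ctr = conv_block(base, base * 2)
        # Auxiliary heads for deep supervision of each branch.
        self.aux_seg = nn.Conv2d(base * 2, 1, kernel_size=1)
        self.aux_ctr = nn.Conv2d(base * 2, 1, kernel_size=1)
        # Joint heads operating on the combined (concatenated) features.
        self.fuse = conv_block(base * 4, base * 2)
        self.out_seg = nn.Conv2d(base * 2, 1, kernel_size=1)
        self.out_ctr = nn.Conv2d(base * 2, 1, kernel_size=1)

    def forward(self, x):
        f_seg = self.high_seg(self.low_seg(x))   # segmentation-specific features
        f_ctr = self.high_ctr(self.low_ctr(x))   # contour-specific features
        fused = self.fuse(torch.cat([f_seg, f_ctr], dim=1))
        return {
            "seg": self.out_seg(fused),          # joint gland segmentation map
            "contour": self.out_ctr(fused),      # joint gland contour map
            "aux_seg": self.aux_seg(f_seg),      # deep-supervision outputs
            "aux_ctr": self.aux_ctr(f_ctr),
        }


if __name__ == "__main__":
    model = TwoBranchGlandNet()
    out = model(torch.randn(1, 3, 128, 128))
    print({k: tuple(v.shape) for k, v in out.items()})
```

For training, one would typically attach a pixel-wise loss (e.g. binary cross-entropy) to each of the four outputs, so that both branches receive deep supervision while the fused heads learn the joint prediction.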
