Cervical nuclear segmentation based on two-path features

Affiliation: School of Information Science and Engineering, Shenyang University of Technology, Shenyang 110870, China

CLC number: TP391



    Abstract:

    The segmentation of cervical cell nuclei is of great significance for cervical cancer screening and diagnosis, but blurred edges and interfering objects pose major challenges to the segmentation task. To address this problem, a nuclear segmentation method based on the DeepLabV3+ network is proposed. First, the outputs of the backbone network are fully exploited for multi-scale feature fusion, and an attention mechanism is introduced to build a cell-mass segmentation model that reduces the effect of background interferences on nuclear segmentation. On this basis, a two-path feature extraction module combining a Transformer with ResNet50 is designed, which balances the model's capture of global information with its sensitivity to low-level contextual features and improves its ability to distinguish nuclei from interfering structures. Experimental results show that the algorithm achieves good segmentation performance on cervical cell nuclei, with a mean intersection over union (MIoU) of 0.832 9, a 2.33% improvement over the DeepLabV3+ model, and better performance indicators than the other compared methods.
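The two-path module described in the abstract can be illustrated with a minimal sketch: one branch gathers global context (a single self-attention layer standing in for the Transformer path) while the other keeps low-level local detail (a 3×3 mean filter standing in for the ResNet50 path), and the two outputs are fused. All shapes, function names, and the fusion-by-concatenation step are assumptions for illustration only, not the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_branch(feat):
    """Self-attention over all spatial positions (global context)."""
    h, w, c = feat.shape
    tokens = feat.reshape(h * w, c)              # flatten to a token sequence
    attn = softmax(tokens @ tokens.T / np.sqrt(c))
    return (attn @ tokens).reshape(h, w, c)

def local_branch(feat):
    """3x3 average filter with zero padding (low-level local context)."""
    h, w, c = feat.shape
    padded = np.pad(feat, ((1, 1), (1, 1), (0, 0)))
    out = np.zeros_like(feat)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + 3, j:j + 3].mean(axis=(0, 1))
    return out

def two_path_features(feat):
    """Fuse both branches along the channel axis."""
    return np.concatenate([global_branch(feat), local_branch(feat)], axis=-1)

feat = np.random.default_rng(0).standard_normal((8, 8, 16))
print(two_path_features(feat).shape)  # channel dim doubles after fusion: (8, 8, 32)
```

In a real network the two branches would be learned ResNet50 and Transformer stages; the sketch only shows why fusing them yields features that carry both global and local information.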
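The reported metric, mean intersection over union (MIoU), averages the per-class overlap between predicted and ground-truth masks. A minimal reference implementation (class ids and toy masks are made up for illustration) might look like:

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean IoU over classes that appear in either mask."""
    ious = []
    for c in range(num_classes):
        p, t = pred == c, target == c
        union = np.logical_or(p, t).sum()
        if union == 0:           # class absent from both masks: skip it
            continue
        inter = np.logical_and(p, t).sum()
        ious.append(inter / union)
    return float(np.mean(ious))

# Toy 2-class masks: 0 = background, 1 = nucleus
pred   = np.array([[0, 1, 1],
                   [0, 1, 0]])
target = np.array([[0, 1, 1],
                   [0, 0, 0]])
print(round(mean_iou(pred, target, 2), 4))  # (3/4 + 2/3) / 2 -> 0.7083
```

The paper's MIoU of 0.832 9 would be this quantity averaged over its test masks, with background and nucleus as the classes.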

Cite this article:

Cui Wencheng, Yang Dan, Shao Hong. Cervical nuclear segmentation based on two-path features [J]. Electronic Measurement Technology, 2023, 46(6): 129-136.

History
  • Online publication date: 2024-02-19