Cable identification and location based on YOLACT in complex environments

Affiliation: College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211100, China

CLC number: TP2

Fund project: Supported by the Innovation Program of Nanjing University of Aeronautics and Astronautics (xcxjh20210304)




    Abstract:

    At present, cable maintenance at power companies is carried out manually. Manual maintenance is not only labor-intensive and inefficient but also poses serious safety risks. With the rapid development of machine vision and the wide application of robotics across industries, applying these technologies to automatic cable maintenance has become an inevitable trend. This paper presents a binocular cable recognition and localization method based on the YOLACT model. First, an improved YOLACT network recognizes and segments dense cables in complex environments; next, the edges of the cable segmentation masks are optimized and extracted; finally, the extracted edge features are used to match the same targets across the two binocular images, thereby recognizing and locating cables in complex environments. Compared with the traditional YOLACT model, the candidate-box correlation computation proposed in this paper effectively resolves the missed and false detections that arise when identifying dense cables, improving the accuracy of cable recognition.
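The abstract credits the accuracy gain to a correlation computation between cable candidate boxes, but does not publish the formula. A minimal sketch of the general idea follows, assuming plain IoU as the correlation measure in place of the paper's unpublished one: score the overlap between candidate boxes, then greedily suppress near-duplicate detections of the same cable. All function names and the threshold value are illustrative, not the authors'.

```python
def box_correlation(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2).

    Stands in for the paper's unpublished correlation measure:
    disjoint boxes score 0, near-duplicates score close to 1.
    """
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def filter_dense_boxes(boxes, scores, threshold=0.5):
    """Greedy suppression over dense detections: keep the
    highest-scoring box, drop every remaining box whose
    correlation with it exceeds the threshold, and repeat.
    Returns the indices of the kept boxes."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order
                 if box_correlation(boxes[best], boxes[i]) <= threshold]
    return keep
```

For densely packed parallel cables, adjacent true cables also overlap heavily, which is why a plain-IoU suppression like this one produces the missed and false detections the abstract describes; the paper's contribution is a correlation measure tuned to keep distinct neighboring cables while still merging duplicates.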

Cite this article:

李瑾, 范佳能, 刘屹然. Cable identification and location based on YOLACT in complex environments [J]. 电子测量技术 (Electronic Measurement Technology), 2023, 46(4): 114-120.

History
  • Online publication date: 2024-02-22