Real-time detection of helmet wearing based on improved YOLOX
DOI:
Authors: DING Tian, CHEN Xiangyang, ZHOU Qiang, XIAO Haoliang
Affiliations:

1. School of Computer Science and Engineering, Wuhan Institute of Technology, Wuhan 430205, China; 2. Hubei Key Laboratory of Intelligent Robot, Wuhan Institute of Technology, Wuhan 430205, China

Author biography:

Corresponding author:

CLC number: TP391

Fund project: Supported by the Education Innovation Fund of Wuhan Institute of Technology (CX2021273)




Abstract:

In the construction industry, safety accidents caused by not wearing safety helmets account for a large proportion of the total. To address the strong interference and the low accuracy on small targets in helmet detection, an improved algorithm based on YOLOX is proposed. First, an ECA-Net attention mechanism is added to the enhanced feature extraction network to perform cross-channel interaction: the generated per-channel weights suppress interfering information and strengthen the model's attention to target features, and the recalibrated feature maps are then fused at greater depth to improve the representation of target features. Second, CIoU is used to compute the bounding-box loss; the distance between the two boxes' center points and their aspect ratio are included in the penalty term, so the loss is continually adjusted during training and model convergence is accelerated. Finally, a small-target helmet dataset captured in real construction scenes is constructed. Experimental results show that the improved algorithm reaches 91.7% mAP, 1.2% higher than the original YOLOX; the average precision is 93.9% for workers wearing helmets and 89.5% for those not wearing helmets, and the detection speed reaches 71.9 frames/s, achieving real-time detection of helmet wearing with high accuracy.
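The ECA-Net block referred to above is a lightweight channel-attention module: global average pooling, a 1D convolution across channels, and a sigmoid gate that re-weights each channel. The following is a minimal PyTorch sketch assuming the standard ECA-Net formulation; the class name and the way it would be placed in the YOLOX neck are illustrative, not the authors' exact implementation.

import math
import torch.nn as nn

class ECA(nn.Module):
    """Efficient Channel Attention: GAP -> 1D conv over channels -> sigmoid gate."""
    def __init__(self, channels, gamma=2, b=1):
        super().__init__()
        # Kernel size of the 1D conv adapts to the channel count (ECA-Net rule).
        t = int(abs((math.log2(channels) + b) / gamma))
        k = t if t % 2 else t + 1
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):                                    # x: (B, C, H, W)
        y = self.avg_pool(x)                                 # (B, C, 1, 1)
        y = self.conv(y.squeeze(-1).transpose(-1, -2))       # cross-channel interaction, (B, 1, C)
        y = self.sigmoid(y).transpose(-1, -2).unsqueeze(-1)  # channel weights, (B, C, 1, 1)
        return x * y                                         # recalibrate the feature map

In the improved network described by the abstract, one such block would typically be applied to each feature level of the FPN/PAN neck before the deeper feature fusion step.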
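For reference, the CIoU loss mentioned above takes the standard form from Zheng et al., which augments the IoU term with a normalized center-point distance and an aspect-ratio consistency penalty:

L_{CIoU} = 1 - IoU + \frac{\rho^2(b, b^{gt})}{c^2} + \alpha v, \qquad
v = \frac{4}{\pi^2}\left(\arctan\frac{w^{gt}}{h^{gt}} - \arctan\frac{w}{h}\right)^2, \qquad
\alpha = \frac{v}{(1 - IoU) + v}

where b and b^{gt} are the centers of the predicted and ground-truth boxes, \rho is the Euclidean distance, c is the diagonal length of the smallest box enclosing both, and w, h (w^{gt}, h^{gt}) are the predicted (ground-truth) width and height. Including the center distance and aspect ratio in the penalty is what the abstract credits for the faster convergence.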

Cite this article

DING Tian, CHEN Xiangyang, ZHOU Qiang, XIAO Haoliang. Real-time detection of helmet wearing based on improved YOLOX[J]. Electronic Measurement Technology, 2022, 45(17): 72-78.

History
  • Online publication date: 2024-04-02