Object detection based on Transformer with prefiltered attention

College of Automation and Electronic Engineering, Qingdao University of Science and Technology, Qingdao 266061, China

CLC number: TP391.9


    Abstract:

    Transformer-based object detectors proposed in recent years simplify the model structure and demonstrate competitive performance. However, most of these models suffer from slow convergence and poor small-object detection because of the way the Transformer attention module processes feature maps. To address these issues, this study proposes a Transformer detection model built on a pre-filtered attention module. Taking the target point as a reference, the module samples only a subset of the feature points near the target point for interaction, which shortens training and improves detection accuracy. A newly proposed directional relative position encoding is also integrated into the module; it compensates for the relative position information lost in the module's weight computation and supplies precise positional information, which is particularly helpful for detecting small objects. Experiments on the COCO 2017 dataset show that our model shortens training time by nearly a factor of 10 and achieves better performance, reaching 26.8 APs on small-object detection.
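The mechanism the abstract describes, that each query attends only to feature points pre-filtered by proximity to its target (reference) point, with a directional relative-position term added to the attention weights, can be sketched as follows. This is a minimal single-head illustration under stated assumptions (values equal keys, Euclidean pre-filtering, a toy directional bias); the function name, the value of `k`, and the bias form are hypothetical, not the paper's implementation.

```python
import numpy as np

def prefiltered_attention(queries, feat, coords, ref_points, k=8):
    """Sketch of pre-filtered attention (illustrative, not the paper's code).

    queries:    (Q, d) query vectors, one per target point
    feat:       (N, d) flattened feature-map vectors
    coords:     (N, 2) spatial coordinates of each feature point
    ref_points: (Q, 2) reference (target) point for each query
    """
    d = queries.shape[-1]
    out = np.zeros_like(queries)
    for i, (q, ref) in enumerate(zip(queries, ref_points)):
        # Pre-filter: keep only the k feature points closest to the
        # reference point, instead of attending over all N points.
        dist = np.linalg.norm(coords - ref, axis=-1)
        idx = np.argsort(dist)[:k]
        keys = feat[idx]                      # (k, d)
        # Directional relative position: signed offsets preserve direction,
        # not just distance; here a toy bias favors nearer points.
        offsets = coords[idx] - ref           # (k, 2)
        dir_bias = -np.abs(offsets).sum(-1)   # (k,) assumed bias form
        logits = keys @ q / np.sqrt(d) + dir_bias
        w = np.exp(logits - logits.max())     # numerically stable softmax
        w /= w.sum()
        out[i] = w @ keys                     # values = keys in this sketch
    return out
```

Because each query touches only k points rather than the full feature map, the per-query cost drops from O(N) to O(k) after filtering, which is the source of the claimed training-time savings.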

Cite this article

Wang Qi, Zhao Wencang. Object detection based on Transformer with prefiltered attention[J]. Electronic Measurement Technology, 2022, 45(24): 145-152.

History
  • Published online: 2024-03-08