A Method of Adaptive Feature Extraction with Indirect Attention

CLC number: TP391

Affiliation: Automation College, Nanjing University of Science and Technology, Nanjing 210094, China

Abstract:

Fine-grained image recognition algorithms based on the ViT model suffer from incomplete feature extraction and from parameter choices that do not generalize across inputs. To address these problems, this paper proposes an Adaptive Feature Extraction method with Indirect Attention (AFEIA). First, for feature extraction on the target object, an improved natural breaks classification algorithm divides the features into three classes: most relevant, less relevant, and irrelevant. This allows the most discriminative features to be extracted adaptively for each input sample, ensuring the accuracy of feature extraction. Second, the attention weight matrix is used to recover, from the otherwise ignored features, those that are indirectly related to the target object, capturing subtle differences between objects and ensuring the comprehensiveness of feature extraction. Experiments show that a ViT model using AFEIA reaches prediction accuracies of 91.6% and 91.5% on the fine-grained datasets CUB-200-2011 and Stanford Dogs, respectively; visualization and ablation experiments further verify the effectiveness of AFEIA.
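The tiering step described above (classifying per-feature relevance into most relevant, less relevant, and irrelevant tiers with a natural breaks algorithm) can be sketched in a few lines. The paper uses an improved variant of natural breaks whose details are not reproduced here; the sketch below implements only the standard Jenks objective (minimizing within-class variance) over hypothetical per-patch attention scores, such as the CLS row of a ViT attention map. Function and variable names are illustrative.

```python
import numpy as np

def natural_breaks_3(scores):
    """Split 1-D relevance scores into 3 tiers (0 = irrelevant,
    1 = less relevant, 2 = most relevant) by choosing the two break
    points that minimize the total within-class sum of squared
    deviations -- the Jenks natural breaks objective."""
    order = np.argsort(scores)
    s = np.asarray(scores, dtype=float)[order]          # ascending
    n = len(s)

    def ssd(seg):                                       # within-class spread
        return ((seg - seg.mean()) ** 2).sum()

    best, breaks = np.inf, (1, 2)
    for i in range(1, n - 1):                           # s[:i] | s[i:j] | s[j:]
        for j in range(i + 1, n):
            cost = ssd(s[:i]) + ssd(s[i:j]) + ssd(s[j:])
            if cost < best:
                best, breaks = cost, (i, j)

    i, j = breaks
    tiers = np.concatenate([np.zeros(i, int), np.ones(j - i, int),
                            np.full(n - j, 2, int)])
    labels = np.empty(n, dtype=int)
    labels[order] = tiers                               # undo the sort
    return labels

# Toy example: CLS-to-patch attention scores for 8 patches.
scores = np.array([0.01, 0.02, 0.03, 0.10, 0.12, 0.30, 0.32, 0.10])
print(natural_breaks_3(scores))   # -> [0 0 0 1 1 2 2 1]
```

The brute-force search over break points is cubic in the number of tokens, which is tolerable for the roughly 200 patch tokens of a standard ViT; production Jenks implementations use dynamic programming instead.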

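The second step, mining the attention weight matrix for features indirectly related to the target, might look like the sketch below. This is one plausible reading of the abstract rather than the paper's exact rule: a patch from the irrelevant tier is rescued when the most-relevant patches attend to it strongly. The layout (CLS token at index 0), the mean aggregation, and the median cutoff are all assumptions made for illustration.

```python
import numpy as np

def indirect_attention(attn, labels):
    """Rescue "irrelevant"-tier patches that the most-relevant patches
    attend to strongly, i.e. patches indirectly related to the target.

    attn   : (1+P, 1+P) attention weight matrix with the CLS token at
             index 0; row i holds how much token i attends to each token.
    labels : length-P tier labels (0/1/2) for the patch tokens.
    Returns the patch indices promoted to "indirectly relevant".
    """
    most = np.where(labels == 2)[0] + 1        # +1 skips the CLS token
    ignored = np.where(labels == 0)[0] + 1
    if len(most) == 0 or len(ignored) == 0:
        return np.array([], dtype=int)
    # Mean attention the most-relevant patches pay to each ignored patch.
    indirect = attn[np.ix_(most, ignored)].mean(axis=0)
    # Promote ignored patches scoring above the median (arbitrary cutoff).
    return ignored[indirect > np.median(indirect)] - 1

# Toy example: 1 CLS token + 6 patches, rows normalized to sum to 1.
rng = np.random.default_rng(0)
attn = rng.random((7, 7))
attn /= attn.sum(axis=1, keepdims=True)
labels = np.array([0, 0, 1, 1, 2, 2])          # e.g. from the tiering step
print(indirect_attention(attn, labels))
```

In a full pipeline the rescued patches would be fed to the classifier together with the most-relevant tier, which is how the abstract's claim of comprehensive feature extraction can be read.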
Cite this article:

张书恒, 李军, 张礼轩, 王子文. A method of adaptive feature extraction with indirect attention [J]. 电子测量技术 (Electronic Measurement Technology), 2022, 45(21): 75-81.

History:
  • Online publication date: 2024-03-19