Research on gait recognition of human body based on ICNet model
Author: 曾维, 何刚强, 罗伟洋, 郭翼凌
Affiliation: School of Mechanical and Electrical Engineering, Chengdu University of Technology, Chengdu 610059


CLC number: TP75

Fund project: Supported by the National Key Research and Development Program of China (2018YFC1505102)



Abstract:

Gait recognition is the technology of verifying a pedestrian's identity from his or her walking posture. Unlike physiological characteristics such as fingerprints and palmprints, which require close contact, gait is a behavioral feature that is highly non-invasive, hard to disguise, and recognizable at long range, so gait recognition has broad application prospects. In practice, however, it is easily disturbed by environmental factors and its recognition rate is low. This paper proposes a gait recognition method based on a capsule network: a spatial attention mechanism is introduced into the capsule network to increase the weight of effective gait features in the capsules, and a feedback weight matrix is designed to update the input image, thereby improving the performance of the network. The proposed model was evaluated extensively on the CASIA-B dataset, where the average recognition rates under three walking conditions (normal walking, walking with a bag, and walking in a coat) reach 93%, 85%, and 67%, respectively. A multi-view gait recognition experiment on the OU-MVLP dataset achieves an average recognition rate of 85%.
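The paper's code is not included on this page, but the mechanism the abstract describes (convolutional gait features re-weighted by a spatial attention map before being grouped into capsules, plus a feedback weight matrix whose mask updates the input silhouette) can be illustrated with a minimal sketch. The PyTorch code below is an assumption-laden illustration, not the authors' ICNet implementation: the module names (SpatialAttention, PrimaryCapsules, GaitCapsNetSketch), the layer sizes, the 64x64 silhouette input, and the 74-way classifier are all placeholders chosen for the example.

```python
# Minimal sketch (PyTorch): spatial attention re-weights convolutional gait
# features before they are grouped into capsules, and a "feedback" 1x1
# convolution produces a mask that updates the input silhouette.
# All sizes and module names are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpatialAttention(nn.Module):
    """Predict one attention weight per pixel from channel-pooled features."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg_pool = x.mean(dim=1, keepdim=True)             # (B, 1, H, W)
        max_pool = x.max(dim=1, keepdim=True).values       # (B, 1, H, W)
        attn = torch.sigmoid(self.conv(torch.cat([avg_pool, max_pool], dim=1)))
        return x * attn                                     # re-weighted features


class PrimaryCapsules(nn.Module):
    """Group convolutional features into capsule vectors and squash them."""
    def __init__(self, in_ch: int = 64, caps_dim: int = 8, n_maps: int = 8):
        super().__init__()
        self.caps_dim = caps_dim
        self.conv = nn.Conv2d(in_ch, caps_dim * n_maps, 3, stride=2, padding=1)

    def forward(self, x):
        u = self.conv(x).view(x.size(0), -1, self.caps_dim)   # (B, n_caps, caps_dim)
        sq_norm = (u ** 2).sum(dim=-1, keepdim=True)
        return u * sq_norm / (1.0 + sq_norm) / torch.sqrt(sq_norm + 1e-8)  # squash


class GaitCapsNetSketch(nn.Module):
    def __init__(self, n_ids: int = 74):                   # 74 subjects is an assumption
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 64, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2))
        self.attention = SpatialAttention()
        self.capsules = PrimaryCapsules()
        self.classifier = nn.LazyLinear(n_ids)
        # Stand-in for the abstract's "feedback weight matrix": a 1x1 conv whose
        # output, after upsampling, masks the input silhouette for the next pass.
        self.feedback = nn.Conv2d(64, 1, kernel_size=1)

    def forward(self, silhouette):
        feats = self.attention(self.backbone(silhouette))   # boost informative regions
        caps = self.capsules(feats)
        logits = self.classifier(caps.flatten(1))
        mask = torch.sigmoid(F.interpolate(self.feedback(feats),
                                           size=silhouette.shape[-2:],
                                           mode="bilinear", align_corners=False))
        return logits, silhouette * mask                    # predictions, updated input


if __name__ == "__main__":
    model = GaitCapsNetSketch()
    x = torch.rand(2, 1, 64, 64)                            # two 64x64 gait silhouettes
    logits, x_next = model(x)
    print(logits.shape, x_next.shape)                       # (2, 74) and (2, 1, 64, 64)
```

The attention module here follows the common channel-pooling recipe (average and max pooling, a small convolution, a sigmoid); the actual attention design, capsule routing scheme, and feedback update rule used in the paper may differ.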

Cite this article:

曾维, 何刚强, 罗伟洋, 郭翼凌. 基于ICNet模型的人体步态识别研究[J]. 电子测量技术, 2022, 45(4): 120-125.

History
  • Online publication date: 2024-06-12