Hybrid summary generation method based on HRAGS model

Authors: 岳琳 (Yue Lin), 杨风暴 (Yang Fengbao), 王肖霞 (Wang Xiaoxia)

Affiliation: School of Information and Communication Engineering, North University of China, Taiyuan 030051, China

CLC number: TP391.1



    Abstract:

Traditional extractive and abstractive methods lack readability and accuracy in automatic summarization tasks, so a hybrid summary generation method based on the HRAGS (Hybrid Guided Summarization with Redundancy-Aware) model was proposed. First, the method used the BERT pre-trained language model to obtain contextual sentence representations and combined them with a redundancy-aware method to construct the extractive model. Then, the two trained BERT encoders were united with a randomly initialized Transformer decoder containing two encoder-decoder attention modules to construct the abstractive model, and a two-stage fine-tuning strategy was adopted to resolve the training imbalance between the encoders and the decoder. Finally, an Oracle greedy algorithm selected key sentences as external guidance, and the source document and the guidance were fed into the abstractive model to acquire the summary. The method was verified on the LCSTS dataset. Experimental results show that, compared with other benchmark models, the HRAGS model generates more readable and accurate summaries with higher ROUGE scores.
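
To make the pipeline concrete, the sketches below illustrate each stage under stated assumptions; they are illustrative readings of the abstract, not the authors' released code, and every name in them is hypothetical. First, the redundancy-aware extraction step can be read as salience scoring plus a redundancy penalty. A minimal MMR-style selection sketch in Python, assuming each sentence already has a BERT-derived salience score and sentence vector:

```python
# Hedged sketch: redundancy-aware sentence selection in the spirit of the
# extractive step. mmr_select and the lambda trade-off are illustrative
# assumptions, not the paper's exact formulation.
from typing import List
import math

def cosine(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a)) or 1e-9
    nb = math.sqrt(sum(x * x for x in b)) or 1e-9
    return dot / (na * nb)

def mmr_select(salience: List[float], vecs: List[List[float]],
               k: int = 3, lam: float = 0.7) -> List[int]:
    """Greedily pick k sentences, trading salience against redundancy
    with respect to the sentences already selected."""
    selected: List[int] = []
    candidates = set(range(len(vecs)))
    while candidates and len(selected) < k:
        def score(i: int) -> float:
            redundancy = max((cosine(vecs[i], vecs[j]) for j in selected),
                             default=0.0)
            return lam * salience[i] - (1 - lam) * redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return sorted(selected)  # return in document order
```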
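
The abstractive model's decoder attends to two encodings. A minimal PyTorch sketch of a decoder layer with two encoder-decoder attention modules, one over the source encoding and one over the guidance encoding; the post-norm ordering and sequential fusion are assumptions, and DualCrossAttnDecoderLayer is a hypothetical name:

```python
# Hedged sketch of a Transformer decoder layer with dual encoder-decoder
# attention, as the abstract describes: self-attention, then attention
# over the source memory, then attention over the guidance memory.
import torch
import torch.nn as nn

class DualCrossAttnDecoderLayer(nn.Module):
    def __init__(self, d_model: int = 768, nhead: int = 8):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.src_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.guide_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.ReLU(),
                                 nn.Linear(4 * d_model, d_model))
        self.norms = nn.ModuleList([nn.LayerNorm(d_model) for _ in range(4)])

    def forward(self, tgt, src_mem, guide_mem, tgt_mask=None):
        x = self.norms[0](tgt + self.self_attn(tgt, tgt, tgt,
                                               attn_mask=tgt_mask)[0])
        x = self.norms[1](x + self.src_attn(x, src_mem, src_mem)[0])
        x = self.norms[2](x + self.guide_attn(x, guide_mem, guide_mem)[0])
        return self.norms[3](x + self.ffn(x))

# Usage with random tensors standing in for BERT encoder outputs:
layer = DualCrossAttnDecoderLayer()
tgt = torch.randn(2, 10, 768)    # (batch, target_len, d_model)
src = torch.randn(2, 50, 768)    # encoded source document
guide = torch.randn(2, 20, 768)  # encoded guidance sentences
out = layer(tgt, src, guide)     # -> (2, 10, 768)
```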
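
The two-stage fine-tuning strategy addresses the mismatch between pretrained encoders and a randomly initialized decoder. A sketch of one plausible schedule, assuming stage one trains only the decoder and stage two unfreezes the encoders at a much smaller learning rate; the paper's actual schedule and hyperparameters are not reproduced here:

```python
# Hedged sketch of two-stage fine-tuning for the encoder-decoder training
# imbalance. The staging and learning rates are illustrative assumptions.
import torch
import torch.nn as nn

def two_stage_finetune(encoder: nn.Module, decoder: nn.Module,
                       stages=((True, 0.0, 1e-4), (False, 2e-5, 5e-5))):
    """Each stage is (freeze_encoder, encoder_lr, decoder_lr).
    Stage 1 trains only the randomly initialized decoder; stage 2
    unfreezes the pretrained encoder with a smaller learning rate."""
    for freeze_enc, enc_lr, dec_lr in stages:
        for p in encoder.parameters():
            p.requires_grad = not freeze_enc
        groups = [{"params": decoder.parameters(), "lr": dec_lr}]
        if not freeze_enc:
            groups.append({"params": encoder.parameters(), "lr": enc_lr})
        optimizer = torch.optim.AdamW(groups)
        # training loop for this stage goes here (omitted in this sketch)
```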
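
Finally, Oracle greedy selection of guidance sentences is commonly implemented by greedily adding the sentence that most increases a ROUGE proxy against the reference summary. A self-contained sketch; greedy_oracle and the unigram-plus-bigram F1 proxy are illustrative choices, not the paper's exact metric:

```python
# Hedged sketch of Oracle greedy key-sentence selection: keep adding the
# sentence that most improves n-gram overlap with the reference summary,
# and stop when no remaining sentence improves the score.
from typing import List, Set, Tuple

def ngrams(tokens: List[str], n: int) -> Set[Tuple[str, ...]]:
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def overlap_f1(cand: List[str], ref: List[str], n: int) -> float:
    c, r = ngrams(cand, n), ngrams(ref, n)
    if not c or not r:
        return 0.0
    hit = len(c & r)
    p, rec = hit / len(c), hit / len(r)
    return 2 * p * rec / (p + rec) if p + rec else 0.0

def greedy_oracle(sents: List[List[str]], ref: List[str],
                  max_sents: int = 3) -> List[int]:
    chosen: List[int] = []
    best = 0.0
    while len(chosen) < max_sents:
        gains = []
        for i in range(len(sents)):
            if i in chosen:
                continue
            cand = [t for j in sorted(chosen + [i]) for t in sents[j]]
            score = overlap_f1(cand, ref, 1) + overlap_f1(cand, ref, 2)
            gains.append((score, i))
        if not gains:
            break
        score, i = max(gains)
        if score <= best:  # no sentence improves the proxy any further
            break
        best, chosen = score, chosen + [i]
    return sorted(chosen)
```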

Cite this article:

岳琳 (Yue Lin), 杨风暴 (Yang Fengbao), 王肖霞 (Wang Xiaoxia). Hybrid summary generation method based on HRAGS model [J]. 电子测量技术 (Electronic Measurement Technology), 2022, 45(15): 75-83.

History:
  • Online publication date: 2024-04-08