IZOA-Transformer-BiGRU short-term wind power prediction based on decomposition technique
Affiliation:

Guizhou University

CLC number:

TN91, TM614

Fund projects:

National Natural Science Foundation of China (62261005); Guizhou Provincial Science and Technology Support Program (Qiankehe [2022] General 017, Qiankehe ZK [2022] 135); Guizhou Electric Power Research Institute 2023 project on key technologies and demonstration of single-phase capacity-regulating transformers considering significant seasonal loads and photovoltaic volatility (K23-0109-014)


Abstract:

Accurate wind power prediction is crucial for ensuring the stable operation of power grids and improving the efficiency of wind resource utilization. To address the non-stationary and intermittent characteristics of wind power data, this paper proposes a combined IZOA-Transformer-BiGRU prediction model based on data decomposition techniques to enhance the accuracy and reliability of short-term wind power forecasting. First, the energy difference method is employed to determine the number of modes for variational mode decomposition (VMD), which decomposes the original wind power series, with its strong random fluctuations, into a set of relatively stable sub-series, enabling more effective extraction of temporal features. Next, a Transformer-BiGRU model is constructed: a multi-head attention mechanism processes the interactions among multiple features in parallel, while the BiGRU component captures the forward and backward dependencies within the time series, thereby improving prediction performance. To further optimize the model, an improved zebra optimization algorithm (IZOA), which integrates Singer chaotic mapping, lens refraction opposition-based learning, and the simplex method, is used to tune four key hyperparameters of the Transformer-BiGRU model: the number of hidden-layer neurons, the initial learning rate, the regularization coefficient, and the number of attention heads. Finally, the IZOA-Transformer-BiGRU model predicts each sub-series obtained from VMD, and the final prediction is reconstructed by superimposing the sub-series forecasts. Experimental results show that, compared with a standalone BiGRU model, the proposed model improves the coefficient of determination by 5.10% and reduces the mean absolute error, root mean square error, and mean absolute percentage error by 56.17%, 54.58%, and 54.55%, respectively, demonstrating its high prediction accuracy.
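As a reading aid, the decompose-predict-reconstruct workflow described in the abstract can be sketched in a few lines. The code below is an illustration rather than the authors' implementation: it assumes the third-party `vmdpy` package for VMD, uses a hypothetical `train_and_predict` callable in place of the IZOA-tuned Transformer-BiGRU predictor, and interprets the energy difference method as stopping once the total mode energy stabilizes (the paper's exact criterion is not stated in the abstract).

```python
import numpy as np
from vmdpy import VMD  # assumed third-party VMD implementation


def select_K_by_energy_difference(signal, K_max=10, tol=0.01, alpha=2000, tau=0.0):
    """Pick the VMD mode number K where the total mode energy stops changing.

    This is one common reading of the 'energy difference method'; the paper's
    exact criterion and threshold are not given in the abstract.
    """
    prev_energy = None
    for K in range(2, K_max + 1):
        u, _, _ = VMD(signal, alpha, tau, K, DC=0, init=1, tol=1e-7)
        energy = float(np.sum(u ** 2))
        if prev_energy is not None and abs(energy - prev_energy) / prev_energy < tol:
            return K - 1
        prev_energy = energy
    return K_max


def forecast_wind_power(series, train_and_predict, K=None):
    """Decompose -> predict each sub-series -> superimpose the forecasts.

    `train_and_predict(sub_series)` is a hypothetical stand-in for the
    IZOA-tuned Transformer-BiGRU predictor; it should return the forecast
    of one sub-series over the test horizon.
    """
    if K is None:
        K = select_K_by_energy_difference(series)
    modes, _, _ = VMD(series, alpha=2000, tau=0.0, K=K, DC=0, init=1, tol=1e-7)
    sub_forecasts = [train_and_predict(mode) for mode in modes]
    return np.sum(sub_forecasts, axis=0)  # reconstruction by superposition
```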
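Of the three IZOA strategies named above, the Singer chaotic map and lens refraction (lens-imaging) opposition-based learning have widely used closed forms. A minimal sketch follows, with the control parameter mu and the lens scaling factor k set to commonly used values rather than the paper's (unstated) settings, and with the simplex-method refinement omitted.

```python
import numpy as np


def singer_map_population(pop_size, dim, lb, ub, mu=1.07, seed=0):
    """Initialize a population with the Singer chaotic map instead of uniform sampling.

    x_{n+1} = mu * (7.86 x_n - 23.31 x_n^2 + 28.75 x_n^3 - 13.302875 x_n^4),
    iterated in (0, 1) and then mapped onto the search range [lb, ub].
    """
    rng = np.random.default_rng(seed)
    x = rng.random(dim)                    # chaotic seed in (0, 1)
    pop = np.empty((pop_size, dim))
    for i in range(pop_size):
        x = mu * (7.86 * x - 23.31 * x**2 + 28.75 * x**3 - 13.302875 * x**4)
        x = np.clip(x, 1e-6, 1 - 1e-6)     # keep the iterate inside (0, 1)
        pop[i] = lb + x * (ub - lb)
    return pop


def lens_opposition(pop, lb, ub, k=1.2):
    """Lens refraction opposition-based learning: reflect each candidate
    through the midpoint of the bounds, scaled by the lens factor k."""
    mid = (lb + ub) / 2.0
    return mid + mid / k - pop / k


def refined_initialization(fitness, pop_size, dim, lb, ub):
    """Keep, for each individual, the better of the chaotic candidate and its
    lens-opposition counterpart (a common way to combine the two strategies)."""
    pop = singer_map_population(pop_size, dim, lb, ub)
    opp = np.clip(lens_opposition(pop, lb, ub), lb, ub)
    keep_opp = np.array([fitness(o) < fitness(p) for p, o in zip(pop, opp)])
    return np.where(keep_opp[:, None], opp, pop)
```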
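The comparison figures in the last sentence follow from the standard definitions of the four metrics. The sketch below, using only NumPy, shows how such percentage improvements over a baseline (here, the standalone BiGRU) would be computed; it is illustrative and not taken from the paper.

```python
import numpy as np


def evaluation_metrics(y_true, y_pred):
    """MAE, RMSE, MAPE and R^2 as reported in the abstract (MAPE in percent)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    err = y_pred - y_true
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    mape = np.mean(np.abs(err / y_true)) * 100.0   # assumes y_true has no zeros
    r2 = 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    return {"MAE": mae, "RMSE": rmse, "MAPE": mape, "R2": r2}


def relative_change(baseline, proposed):
    """Percent change of the proposed model relative to a baseline:
    reductions for the error metrics, improvement for R^2."""
    out = {k: 100.0 * (baseline[k] - proposed[k]) / baseline[k]
           for k in ("MAE", "RMSE", "MAPE")}
    out["R2"] = 100.0 * (proposed["R2"] - baseline["R2"]) / baseline["R2"]
    return out
```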

History
  • Received: 2024-10-17
  • Revised: 2024-12-06
  • Accepted: 2024-12-06
Journal: Electronic Measurement Technology (《电子测量技术》)