

Shorter long-sequence time series forecasting model

徐泽鑫 1, 杨磊 1, 李康顺 2

  1. College of Mathematics and Informatics, South China Agricultural University
    2. College of Informatics, South China Agricultural University
  • Received: 2023-06-21  Revised: 2023-08-10  Online: 2023-08-21  Published: 2023-08-21
  • Corresponding author: 杨磊
  • Supported by:
    National Natural Science Foundation of China; Natural Science Foundation of Guangdong Province; Agricultural Science and Technology Commissioner Project of Guangzhou City

Abstract: Most existing studies treat short-sequence and long-sequence time series forecasting separately, which leads to poor forecasting accuracy on shorter long sequences. To address this problem, a Shorter Long-sequence Time Series Forecasting Model (SLTSFM) was proposed. First, a Sequence-To-Sequence (S2S) structure was built from a Convolutional Neural Network (CNN) and a new PBUSM (Probsparse Based on Uniform Selection Mechanism) self-attention mechanism, and used to extract the features of the long-sequence input. Then, a newly designed "far light, near heavy" strategy reallocated the per-period features extracted by multiple Long Short-Term Memory (LSTM) modules, which are better suited to extracting features from short-sequence inputs. Finally, the reallocated features were used to enhance the extracted long-sequence input features, improving forecasting accuracy and producing the forecast. Four publicly available time series datasets were used to verify the effectiveness of the model. Experimental results show that, compared with the best-performing baseline model, Gated Recurrent Unit (GRU), SLTSFM reduces the Mean Absolute Error (MAE) on the four datasets by 61.54%, 13.48%, 0.92% and 19.58% for univariate time series forecasting, and by 17.01%, 18.13%, 3.24% and 6.73% for multivariate time series forecasting. This demonstrates that the model is effective in improving the accuracy of shorter long-sequence time series forecasting.
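The abstract describes the architecture only at a high level; the exact PBUSM formulation, segment lengths, and reallocation weights are not specified here. Below is a minimal, hypothetical PyTorch sketch of that layout, using standard multi-head self-attention as a stand-in for PBUSM and an illustrative increasing weight vector for the "far light, near heavy" reallocation; the class name, dimensions, and fusion step are assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class SLTSFMSketch(nn.Module):
    """Illustrative layout only: S2S (CNN + self-attention) long-sequence branch,
    segment-wise LSTM short-sequence branch with "far light, near heavy" weights."""
    def __init__(self, n_features, d_model=64, n_heads=4, n_segments=4, horizon=24):
        super().__init__()
        # Long-sequence branch: CNN for local patterns + self-attention (PBUSM stand-in)
        self.conv = nn.Conv1d(n_features, d_model, kernel_size=3, padding=1)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Short-sequence branch: one LSTM per sub-sequence (segment) of the long input
        self.n_segments = n_segments
        self.lstms = nn.ModuleList(
            nn.LSTM(n_features, d_model, batch_first=True) for _ in range(n_segments)
        )
        # "Far light, near heavy": fixed increasing weights, recent segments weigh more
        w = torch.linspace(0.1, 1.0, n_segments)
        self.register_buffer("seg_weights", w / w.sum())
        self.head = nn.Linear(d_model, horizon)

    def forward(self, x):  # x: (batch, seq_len, n_features), seq_len divisible by n_segments
        # Long-sequence features over the whole input
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)   # (batch, seq_len, d_model)
        h, _ = self.attn(h, h, h)
        long_feat = h.mean(dim=1)                          # (batch, d_model)

        # Per-segment LSTM features, reallocated ("far light, near heavy") by weight
        segments = torch.chunk(x, self.n_segments, dim=1)  # oldest segment first
        short_feat = torch.zeros_like(long_feat)
        for lstm, seg, w in zip(self.lstms, segments, self.seg_weights):
            _, (h_n, _) = lstm(seg)                        # last hidden state per segment
            short_feat = short_feat + w * h_n[-1]

        # Enhance long-sequence features with the reallocated short-sequence features
        return self.head(long_feat + short_feat)           # (batch, horizon)

# Usage sketch: forecast 24 steps from a 96-step, 7-variable input
# model = SLTSFMSketch(n_features=7)
# y_hat = model(torch.randn(8, 96, 7))   # y_hat: (8, 24)

In this sketch the two branches are simply summed before a linear output head; the paper's actual feature-enhancement step and decoder may differ.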

Key words: shorter long-sequence time series forecasting, Sequence-To-Sequence (S2S), Long Short-Term Memory (LSTM), self-attention mechanism, feature reallocation

CLC number: