Journal of Computer Applications ›› 2024, Vol. 44 ›› Issue (9): 2739-2746. DOI: 10.11772/j.issn.1001-9081.2023091320
Liehong REN1, Lyuwen HUANG1, Xu TIAN1, Fei DUAN2
Received: 2023-09-26
Revised: 2023-11-14
Accepted: 2023-11-20
Online: 2023-12-01
Published: 2024-09-10
Contact: Lyuwen HUANG
About author: REN Liehong, born in 1997, M. S. candidate, CCF member. His research interests include time series prediction and deep learning.
Abstract:
In multivariate long-term time series forecasting, time-domain analysis alone usually fails to capture long-range dependencies fully, which leads to underused information and insufficient prediction accuracy. Therefore, combining frequency-domain with time-domain analysis, a Frequency-Sensitive Dual-branch Transformer multivariate long-term series forecasting method based on the Discrete Fourier Transform (DFT), named FSDformer, was proposed. First, DFT was used to convert between the time and frequency domains, decomposing a complex time series into three structurally simple components: a low-frequency trend term, a mid-frequency seasonal term, and a high-frequency remainder term. Then, a dual-branch structure was adopted: for the mid- and high-frequency components, an Encoder-Decoder structure with a specially designed periodicity-enhanced attention mechanism was applied; for the low-frequency trend component, a MultiLayer Perceptron (MLP) structure was used. Finally, the predictions of the mid-/high-frequency branch and the low-frequency branch were summed to obtain the final forecast of the multivariate long-term series. FSDformer was compared with five classic algorithms on two datasets. On the Electricity dataset, with a history length of 96 and a prediction length of 336, FSDformer reduces the Mean Absolute Error (MAE) by 11.5%-29.1% and the Mean Squared Error (MSE) by 20.9%-43.7% relative to the comparison algorithms such as Autoformer, achieving the best prediction accuracy. Experimental results show that FSDformer captures the dependencies of long-term series effectively, and improves the stability of model predictions while raising prediction accuracy and computational efficiency.
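The DFT-based decomposition step described in the abstract can be sketched with NumPy's real FFT. This is a minimal illustration, not the paper's implementation: the band thresholds `low_cut` and `high_cut` are hypothetical hyperparameters, and the two prediction branches (attention Encoder-Decoder and MLP) are omitted.

```python
import numpy as np

def dft_decompose(x, low_cut=3, high_cut=24):
    """Split a 1-D series into low-/mid-/high-frequency components via the DFT.

    low_cut and high_cut are illustrative frequency-bin thresholds
    (hyperparameters in this sketch, not values from the paper).
    """
    X = np.fft.rfft(x)                 # one-sided spectrum of length T//2 + 1
    bins = np.arange(X.shape[0])
    trend = np.fft.irfft(np.where(bins <= low_cut, X, 0), n=len(x))
    seasonal = np.fft.irfft(
        np.where((bins > low_cut) & (bins <= high_cut), X, 0), n=len(x))
    remainder = np.fft.irfft(np.where(bins > high_cut, X, 0), n=len(x))
    return trend, seasonal, remainder

t = np.arange(256)
x = 0.01 * t + np.sin(2 * np.pi * t / 16) + 0.1 * np.random.randn(256)
trend, seasonal, remainder = dft_decompose(x)
# The three bands partition the spectrum, so the components sum back to the input.
assert np.allclose(trend + seasonal + remainder, x)
```

Because the inverse transform is linear and the three masks partition the frequency bins, the decomposition is lossless; each branch then only has to model a structurally simpler component.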
Liehong REN, Lyuwen HUANG, Xu TIAN, Fei DUAN. Multivariate long-term series forecasting method with DFT-based frequency-sensitive dual-branch Transformer[J]. Journal of Computer Applications, 2024, 44(9): 2739-2746.
| Dataset | Sampling interval/h | Dimensions | Time steps |
| --- | --- | --- | --- |
| Electricity | 1 | 321 | 26 304 |
| Traffic | 1 | 862 | 17 544 |

Tab. 1 Basic information of datasets
| Dataset | Lpred | FSDformer | | Autoformer | | Informer | | LogTrans | | Pyraformer | | LST-Net | |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| | | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE |
| Electricity | 96 | 0.180 | 0.288 | 0.206 | 0.321 | 0.324 | 0.423 | 0.262 | 0.358 | 0.386 | 0.449 | 0.283 | 0.357 |
| | 192 | 0.205 | 0.315 | 0.250 | 0.356 | 0.355 | 0.438 | 0.271 | 0.371 | 0.386 | 0.443 | 0.329 | 0.385 |
| | 336 | 0.223 | 0.332 | 0.288 | 0.375 | 0.396 | 0.468 | 0.282 | 0.382 | 0.378 | 0.443 | 0.357 | 0.391 |
| | 720 | 0.262 | 0.364 | 0.282 | 0.384 | 0.792 | 0.687 | 0.285 | 0.376 | 0.376 | 0.445 | 0.442 | 0.442 |
| Traffic | 96 | 0.597 | 0.383 | 0.658 | 0.411 | 0.797 | 0.452 | 0.685 | 0.384 | 2.085 | 0.468 | 1.107 | 0.668 |
| | 192 | 0.634 | 0.403 | 0.652 | 0.412 | 0.946 | 0.535 | 0.688 | 0.392 | 0.867 | 0.467 | 1.159 | 0.712 |
| | 336 | 0.647 | 0.404 | 0.652 | 0.407 | 1.382 | 0.760 | 0.735 | 0.408 | 0.869 | 0.469 | 1.220 | 0.733 |
| | 720 | 0.677 | 0.428 | 0.684 | 0.423 | 1.152 | 0.649 | 0.717 | 0.397 | 0.881 | 0.473 | 1.482 | 0.807 |

Tab. 2 Prediction results of multivariate time series with different prediction lengths
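The improvement ranges quoted in the abstract can be checked directly against the Electricity rows of Tab. 2 at Lpred = 336. The following is a quick sanity-check script over those table values, not code from the paper:

```python
# Reproduce the improvement ranges quoted in the abstract from Tab. 2
# (Electricity dataset, Lpred = 336).
fsd = {"MSE": 0.223, "MAE": 0.332}
baselines = {
    "Autoformer": {"MSE": 0.288, "MAE": 0.375},
    "Informer":   {"MSE": 0.396, "MAE": 0.468},
    "LogTrans":   {"MSE": 0.282, "MAE": 0.382},
    "Pyraformer": {"MSE": 0.378, "MAE": 0.443},
    "LST-Net":    {"MSE": 0.357, "MAE": 0.391},
}
for metric in ("MAE", "MSE"):
    # Relative reduction of FSDformer's error versus each baseline, in percent.
    drops = [100 * (b[metric] - fsd[metric]) / b[metric] for b in baselines.values()]
    print(f"{metric} drop: {min(drops):.1f}%-{max(drops):.1f}%")
# MAE drop: 11.5%-29.1%
# MSE drop: 20.9%-43.7%
```

The smallest MAE reduction (11.5%) is against Autoformer and the largest (29.1%) against Informer, matching the range stated in the abstract.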
| Lin | FSDformer | | | | Autoformer | | | | Informer | | | |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE |
| 72 | 0.157 | 0.274 | 0.262 | 0.367 | 0.165 | 0.285 | 0.279 | 0.376 | 0.285 | 0.385 | 0.370 | 0.439 |
| 96 | 0.162 | 0.278 | 0.262 | 0.369 | 0.174 | 0.291 | 0.282 | 0.384 | 0.299 | 0.394 | 0.389 | 0.454 |
| 120 | 0.146 | 0.262 | 0.252 | 0.359 | 0.167 | 0.288 | 0.260 | 0.363 | 0.337 | 0.419 | 0.394 | 0.456 |
| 144 | 0.144 | 0.258 | 0.250 | 0.360 | 0.168 | 0.287 | 0.264 | 0.366 | 0.391 | 0.445 | 0.400 | 0.463 |
| 168 | 0.137 | 0.251 | 0.250 | 0.357 | 0.161 | 0.279 | 0.239 | 0.348 | 0.318 | 0.410 | 0.396 | 0.458 |
| 192 | 0.129 | 0.242 | 0.232 | 0.342 | 0.165 | 0.284 | 0.265 | 0.368 | 0.307 | 0.399 | 0.414 | 0.472 |
| 336 | 0.126 | 0.241 | 0.216 | 0.326 | 0.170 | 0.291 | 0.247 | 0.351 | 0.354 | 0.440 | 0.424 | 0.477 |
| 504 | 0.123 | 0.237 | 0.225 | 0.332 | 0.180 | 0.300 | 0.249 | 0.355 | 0.358 | 0.443 | 0.423 | 0.475 |
| 672 | 0.124 | 0.237 | 0.229 | 0.338 | 0.188 | 0.304 | 0.273 | 0.371 | 0.327 | 0.416 | 0.378 | 0.443 |
| 720 | 0.121 | 0.234 | 0.220 | 0.327 | 0.186 | 0.302 | 0.254 | 0.356 | 0.355 | 0.432 | 0.401 | 0.456 |
| 960 | 0.121 | 0.235 | 0.221 | 0.330 | 0.194 | 0.309 | 0.275 | 0.368 | 0.411 | 0.457 | 0.400 | 0.451 |

Tab. 3 Prediction results of multivariate time series with different input lengths on Electricity dataset
| Lin | FSDformer | | FSDformer-Full | |
| --- | --- | --- | --- | --- |
| | MSE | MAE | MSE | MAE |
| 72 | 0.157 | 0.274 | 0.176 | 0.293 |
| 96 | 0.162 | 0.278 | 0.169 | 0.288 |
| 168 | 0.146 | 0.262 | 0.149 | 0.266 |
| 192 | 0.144 | 0.258 | 0.140 | 0.258 |
| 336 | 0.137 | 0.251 | 0.139 | 0.257 |
| 504 | 0.129 | 0.242 | 0.135 | 0.252 |
| 672 | 0.126 | 0.241 | 0.132 | 0.247 |
| 720 | 0.123 | 0.237 | 0.127 | 0.242 |
| 960 | 0.124 | 0.237 | 0.132 | 0.249 |

Tab. 4 Model prediction results with different attention mechanisms on Electricity dataset
| Lin | FSDformer | | Exchange | |
| --- | --- | --- | --- | --- |
| | MSE | MAE | MSE | MAE |
| 72 | 0.157 | 0.274 | 0.306 | 0.406 |
| 96 | 0.162 | 0.278 | 0.295 | 0.399 |
| 168 | 0.146 | 0.262 | 0.265 | 0.376 |
| 192 | 0.144 | 0.258 | 0.266 | 0.375 |
| 336 | 0.137 | 0.251 | 0.292 | 0.393 |
| 504 | 0.129 | 0.242 | 0.286 | 0.389 |
| 672 | 0.126 | 0.241 | 0.253 | 0.364 |
| 720 | 0.123 | 0.237 | 0.244 | 0.362 |
| 960 | 0.124 | 0.237 | 0.267 | 0.372 |

Tab. 5 Prediction results before and after frequency-sensitive dual-branch structure exchange on Electricity dataset
[1] LI D S, SUN Y W, RUAN J H. Time series prediction algorithm based on GOSSA and HMM [J]. Acta Electronica Sinica, 2023, 51(9): 2492-2503.
[2] SHIH S Y, SUN F K, LEE H Y. Temporal pattern attention for multivariate time series forecasting [J]. Machine Learning, 2019, 108(8/9): 1421-1441.
[3] WU J, XU K, CHEN X, et al. Price graphs: utilizing the structural information of financial time series for stock prediction [J]. Information Sciences, 2022, 588: 405-424.
[4] PEREIRA D F, LOPES F D C, WATANABE E H. Nonlinear model predictive control for the energy management of fuel cell hybrid electric vehicles in real time [J]. IEEE Transactions on Industrial Electronics, 2021, 68(4): 3213-3223.
[5] SUN P, ALJERI N, BOUKERCHE A. Machine learning-based models for real-time traffic flow prediction in vehicular networks [J]. IEEE Network, 2020, 34(3): 178-185.
[6] KAREVAN Z, SUYKENS J A K. Transductive LSTM for time-series prediction: an application to weather forecasting [J]. Neural Networks, 2020, 125: 1-9.
[7] LIU Y, GONG C, YANG L, et al. DSTP-RNN: a dual-stage two-phase attention-based recurrent neural network for long-term and multivariate time series prediction [J]. Expert Systems with Applications, 2020, 143: No.113082.
[8] SAGHEER A, KOTB M. Time series forecasting of petroleum production using deep LSTM recurrent networks [J]. Neurocomputing, 2019, 323: 203-213.
[9] ZHOU H, ZHANG S, PENG J, et al. Informer: beyond efficient Transformer for long sequence time-series forecasting [C]// Proceedings of the 35th AAAI Conference on Artificial Intelligence. Palo Alto: AAAI Press, 2021: 11106-11115.
[10] ILHAN F, KARAAHMETOGLU O, BALABAN I, et al. Markovian RNN: an adaptive time series prediction network with HMM-based switching for nonstationary environments [J]. IEEE Transactions on Neural Networks and Learning Systems, 2023, 34(2): 715-728.
[11] XU X, YONEDA M. Multitask air-quality prediction based on LSTM-autoencoder model [J]. IEEE Transactions on Cybernetics, 2021, 51(5): 2577-2586.
[12] LI X, MA X, XIAO F, et al. Time-series production forecasting method based on the integration of Bidirectional Gated Recurrent Unit (Bi-GRU) network and Sparrow Search Algorithm (SSA) [J]. Journal of Petroleum Science and Engineering, 2022, 208(Pt A): No.109309.
[13] LAI G, CHANG W C, YANG Y, et al. Modeling long- and short-term temporal patterns with deep neural networks [C]// Proceedings of the 41st International ACM SIGIR Conference on Research and Development in Information Retrieval. New York: ACM, 2018: 95-104.
[14] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need [C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook: Curran Associates Inc., 2017: 6000-6010.
[15] LIU S, YU H, LIAO C, et al. Pyraformer: low-complexity pyramidal attention for long-range time series modeling and forecasting [EB/OL]. (2023-02-14) [2023-05-23].
[16] ZHOU T, MA Z, WEN Q, et al. FEDformer: frequency enhanced decomposed Transformer for long-term series forecasting [C]// Proceedings of the 39th International Conference on Machine Learning. New York: JMLR.org, 2022: 27268-27286.
[17] TAY Y, DEHGHANI M, ABNAR S, et al. Long range arena: a benchmark for efficient Transformers [EB/OL]. (2020-11-08) [2023-03-22].
[18] XIONG T, LI C, BAO Y. Seasonal forecasting of agricultural commodity price using a hybrid STL and ELM method: evidence from the vegetable market in China [J]. Neurocomputing, 2018, 275: 2831-2844.
[19] WEN Q, ZHANG Z, LI Y, et al. Fast RobustSTL: efficient and robust seasonal-trend decomposition for time series with complex patterns [C]// Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: ACM, 2020: 2203-2213.
[20] GUO L, FANG W, ZHAO Q, et al. The hybrid PROPHET-SVR approach for forecasting product time series demand with seasonality [J]. Computers and Industrial Engineering, 2021, 161: No.107598.
[21] SEN R, YU H F, DHILLON I. Think globally, act locally: a deep neural network approach to high-dimensional time series forecasting [C]// Proceedings of the 33rd International Conference on Neural Information Processing Systems. Red Hook: Curran Associates Inc., 2019: No.435.
[22] WU H, XU J, WANG J, et al. Autoformer: decomposition Transformers with auto-correlation for long-term series forecasting [EB/OL]. [2023-04-14].
[23] JIN C H, DONG T R, CHEN T Y, et al. Spatio-temporal convolutional forecasting based on time-series decomposition strategy [J]. Acta Electronica Sinica, 2021, 49(2): 233-238.
[24] ASADI R, REGAN A C. A spatio-temporal decomposition based deep neural network for time series forecasting [J]. Applied Soft Computing, 2020, 87: No.105963.
[25] XIA J, WANG Z Q, ZHU S M. Traffic flow prediction model based on time series decomposition [J]. Journal of Computer Applications, 2023, 43(4): 1129-1135.