
Statistical Feature-based Search for Multivariate Time Series Forecasting

PAN Jinwei, WANG Yiqiao, ZHONG Bo, WANG Xiaoling

Citation: PAN Jinwei, WANG Yiqiao, ZHONG Bo, WANG Xiaoling. Statistical Feature-based Search for Multivariate Time Series Forecasting[J]. Journal of Electronics & Information Technology, 2024, 46(8): 3276-3284. doi: 10.11999/JEIT231264


doi: 10.11999/JEIT231264
Funds: The National Natural Science Foundation of China (61972155)
Details
    About the authors:

    PAN Jinwei: Male, Master. His research interest is multivariate time series analysis

    WANG Yiqiao: Female, Master's student. Her research interests include multivariate time series forecasting and classification

    ZHONG Bo: Male, Master's student. His research interest is self-supervised learning for multivariate time series

    WANG Xiaoling: Female, Professor, Ph.D. supervisor. Her research interests include distributed graph data processing, knowledge graphs, sequential recommendation, and sequence data analysis

    Corresponding author:

    WANG Xiaoling, xlwang@cs.ecnu.edu.cn

  • CLC number: TN911.7; TP391

  • Abstract: Time series contain long-range dependencies such as long-term trends, seasonality, and periodicity, and the span of these dependencies can be on the order of months, so existing methods cannot explicitly model such ultra-long-term dependencies when applied directly. This paper proposes a forecasting method based on statistical feature search (SFSF) to model the long-term dependencies in time series explicitly. First, statistical features such as smoothing, variance, and interval-standardized features are extracted from the multivariate time series, sharpening the search's sensitivity to trend, periodicity, and seasonality. These features are then used to retrieve similar sequences from the history, and an attention mechanism fuses the current and historical sequence information to produce reliable forecasts. Experiments on five real-world datasets show that the proposed method outperforms six state-of-the-art methods.
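As a reading aid, the following minimal Python sketch illustrates what the abstract's three statistical feature families could look like under rolling-window definitions. It is an assumption for illustration, not the paper's code; the function name, window size, and exact formulas are hypothetical.

```python
import numpy as np

def statistical_features(x: np.ndarray, window: int = 24) -> np.ndarray:
    """Illustrative extraction of the three feature families named in the
    abstract from one multivariate window x of shape (L, M): smoothing,
    variance, and interval-standardized features. Rolling-window
    definitions are an assumption; the paper's formulas may differ."""
    kernel = np.ones(window) / window
    # Smoothing feature: per-variable rolling mean, a common proxy for trend.
    smooth = np.stack(
        [np.convolve(x[:, m], kernel, mode="same") for m in range(x.shape[1])],
        axis=1,
    )
    # Variance feature: rolling E[x^2] - (E[x])^2, sensitive to local volatility.
    mean_sq = np.stack(
        [np.convolve(x[:, m] ** 2, kernel, mode="same") for m in range(x.shape[1])],
        axis=1,
    )
    var = np.maximum(mean_sq - smooth ** 2, 0.0)
    # Interval standardization: z-score within the window so that series on
    # different scales stay comparable during the similarity search.
    zscore = (x - smooth) / (np.sqrt(var) + 1e-8)
    return np.concatenate([smooth, var, zscore], axis=1)  # shape (L, 3M)
```

Under this reading, the z-scored channel lets historical windows with a similar shape but a different level still match during the search, which appears to be the point of the abstract's interval standardization.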
  • Figure 1  Framework of statistical feature search based multivariate time series forecasting

    Figure 2  Ablation study

    Figure 3  Parameter sensitivity study

    Table 1  Dataset statistics

    Dataset          Variables   Length   Time granularity
    ETTh1 & ETTh2    7           17420    1 h
    ETTm1 & ETTm2    7           69680    15 min
    Exchange-Rate    8           7588     1 d

    Table 2  Comparison of the main techniques used by each model (rows: TCN, LSTNet, LogTrans, Informer, TS2Vec, Autoformer, SFSF; columns: convolutional neural network, recurrent neural network, self-attention, self-supervised learning, feature decomposition)

    Table 3  Results on the ETTh1 dataset (MSE/MAE at prediction lengths 24–720)

    Model        24             48             168            336            720
                 MSE    MAE     MSE    MAE     MSE    MAE     MSE    MAE     MSE    MAE
    TCN          0.763  0.742   0.848  0.948   1.128  1.227   1.383  1.294   1.645  1.783
    LSTNet       1.181  1.134   1.177  1.390   1.540  1.568   2.178  1.904   2.593  1.985
    LogTrans     0.637  0.612   0.823  0.724   0.952  0.917   1.356  0.988   1.335  1.315
    Informer     0.565  0.532   0.675  0.643   0.821  0.747   1.096  0.837   1.184  0.873
    TS2Vec       0.576  0.462   0.698  0.654   0.766  0.747   1.068  0.791   1.153  0.917
    Autoformer   0.427  0.415   0.437  0.474   0.493  0.531   0.522  0.536   0.548  0.563
    SFSF-ED      0.363  0.351   0.376  0.412   0.453  0.524   0.515  0.531   0.553  0.541

    Table 4  Results on the ETTh2 dataset (MSE/MAE at prediction lengths 24–720)

    Model        24             48             168            336            720
                 MSE    MAE     MSE    MAE     MSE    MAE     MSE    MAE     MSE    MAE
    TCN          1.328  0.894   1.357  0.999   1.895  1.509   2.207  1.496   3.498  1.539
    LSTNet       1.403  1.461   1.610  1.644   2.260  1.813   2.592  2.628   3.610  3.784
    LogTrans     0.865  0.766   1.842  1.031   4.124  1.697   3.901  1.714   3.882  1.594
    Informer     0.636  0.628   1.475  1.002   3.509  1.528   2.745  1.372   3.517  1.434
    TS2Vec       0.450  0.534   0.625  0.558   1.940  1.093   2.329  1.257   2.690  1.326
    Autoformer   0.338  0.366   0.374  0.373   0.481  0.463   0.525  0.472   0.536  0.489
    SFSF-ED      0.318  0.322   0.347  0.332   0.378  0.413   0.474  0.572   0.570  0.449

    Table 5  Results on the ETTm1 dataset (MSE/MAE at prediction lengths 24–720)

    Model        24             48             168            336            720
                 MSE    MAE     MSE    MAE     MSE    MAE     MSE    MAE     MSE    MAE
    TCN          0.350  0.419   0.531  0.390   0.661  0.692   1.306  1.307   1.426  1.424
    LSTNet       1.973  1.212   2.067  1.218   2.793  1.618   1.295  2.098   1.890  2.920
    LogTrans     0.446  0.385   0.554  0.684   0.701  0.820   1.422  1.263   1.679  1.439
    Informer     0.319  0.318   0.361  0.454   0.564  0.537   1.345  0.852   3.396  1.323
    TS2Vec       0.402  0.437   0.567  0.481   0.556  0.564   0.745  0.675   0.795  0.633
    Autoformer   0.377  0.412   0.484  0.446   0.528  0.482   0.619  0.532   0.653  0.617
    SFSF-ED      0.335  0.348   0.351  0.314   0.374  0.413   0.513  0.496   0.547  0.506

    Table 6  Results on the ETTm2 dataset (MSE/MAE at prediction lengths 24–720)

    Model        24             48             168            336            720
                 MSE    MAE     MSE    MAE     MSE    MAE     MSE    MAE     MSE    MAE
    TCN          1.271  3.110   3.034  1.323   3.069  1.391   3.120  1.314   3.106  1.405
    LSTNet       1.280  3.086   3.175  1.324   3.125  1.374   3.118  1.434   3.212  1.344
    LogTrans     0.693  0.497   0.757  0.587   0.980  0.753   1.344  0.898   3.073  1.308
    Informer     0.288  0.361   0.362  0.415   0.602  0.558   1.322  0.861   3.375  1.364
    TS2Vec       0.260  0.225   0.371  0.279   0.375  0.387   0.569  0.425   0.648  0.436
    Autoformer   0.228  0.271   0.243  0.347   0.291  0.369   0.334  0.381   0.447  0.441
    SFSF-ED      0.214  0.256   0.234  0.293   0.252  0.319   0.308  0.374   0.371  0.468

    Table 7  Results on the Exchange-Rate dataset (MSE/MAE at prediction lengths 24–720)

    Model        24             48             168            336            720
                 MSE    MAE     MSE    MAE     MSE    MAE     MSE    MAE     MSE    MAE
    TCN          0.323  0.432   2.968  1.473   2.981  1.401   3.089  1.476   3.139  1.472
    LSTNet       0.384  0.439   1.575  1.065   1.521  1.003   1.513  1.085   2.250  1.234
    LogTrans     0.253  0.300   0.967  0.811   1.084  0.890   1.614  1.126   1.920  1.128
    Informer     0.387  0.375   0.880  0.723   1.173  0.872   1.727  1.077   2.475  1.361
    TS2Vec       0.315  0.277   0.301  0.369   0.759  0.717   1.199  0.914   1.596  1.043
    Autoformer   0.158  0.273   0.195  0.464   0.333  0.456   0.869  0.890   1.298  0.927
    SFSF-ED      0.189  0.204   0.353  0.490   0.618  0.592   0.780  0.870   1.235  0.873
    SFSF-DTW     0.155  0.237   0.257  0.319   0.349  0.396   0.795  0.758   1.011  0.824

    Table 8  Time complexity of the self-attention based models

    Method       Time complexity
    LogTrans     O(L log₂ L)
    Informer     O(L log₂ L)
    Autoformer   O(L log₂ L)
    SFSF         O(KL²M)
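The O(KL²M) row for SFSF is what one would expect from attending over K retrieved windows of length L for each of M variables. Below is a minimal, hypothetical sketch of such a retrieve-then-fuse step, assuming Euclidean ranking in feature space and per-variable scaled dot-product attention; the function names and design choices are illustrative and not taken from the paper.

```python
import numpy as np

def softmax(scores: np.ndarray, axis: int = -1) -> np.ndarray:
    e = np.exp(scores - scores.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def retrieve_and_fuse(query, history, feats_q, feats_h, k=5):
    """query: (L, M) current window; history: (N, L, M) historical windows;
    feats_q: (F,) query feature vector; feats_h: (N, F) per-window features.
    Step 1 ranks historical windows by feature-space distance (Euclidean
    here; the SFSF-DTW variant in Table 7 suggests DTW as an alternative).
    Step 2 fuses the query with the top-k windows via scaled dot-product
    attention per variable: K windows, an (L x L) score matrix each, over
    M variables -- which is where the O(KL^2 M) entry in Table 8 comes from.
    """
    L, M = query.shape
    dists = np.linalg.norm(feats_h - feats_q, axis=1)   # (N,) distances
    top = np.argsort(dists)[:k]                         # K most similar windows
    fused = np.zeros((L, M))
    for idx in top:
        key = history[idx]                              # (L, M)
        for m in range(M):                              # per-variable attention
            scores = np.outer(query[:, m], key[:, m]) / np.sqrt(L)  # (L, L)
            fused[:, m] += softmax(scores, axis=1) @ key[:, m]      # (L,)
    return fused / k
```

For example, with K = 5 windows, L = 168 steps, and M = 7 variables, the inner attention loop evaluates 5 × 168² × 7 ≈ 10⁶ score entries, matching the O(KL²M) scaling in Table 8.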
  • [1] ORESHKIN B N, CARPOV D, CHAPADOS N, et al. N-BEATS: Neural basis expansion analysis for interpretable time series forecasting[C]. International Conference on Learning Representations, Addis Ababa, Ethiopia, 2020: 1–31.
    [2] SALINAS D, FLUNKERT V, GASTHAUS J, et al. DeepAR: Probabilistic forecasting with autoregressive recurrent networks[J]. International Journal of Forecasting, 2020, 36(3): 1181–1191. doi: 10.1016/j.ijforecast.2019.07.001.
    [3] BAI Shaojie, KOLTER J Z, and KOLTUN V. An empirical evaluation of generic convolutional and recurrent networks for sequence modeling[EB/OL]. https://arxiv.org/abs/1803.01271, 2018.
    [4] LAI Guokun, CHANG Weicheng, YANG Yiming, et al. Modeling long- and short-term temporal patterns with deep neural networks[C]. The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, Ann Arbor, USA, 2018: 95–104. doi: 10.1145/3209978.3210006.
    [5] ZHOU Jie, CUI Ganqu, HU Shengding, et al. Graph neural networks: A review of methods and applications[J]. AI Open, 2020, 1: 57–81. doi: 10.1016/j.aiopen.2021.01.001.
    [6] WU Zonghan, PAN Shirui, LONG Guodong, et al. Connecting the dots: Multivariate time series forecasting with graph neural networks[C]. The 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2020: 753–763. doi: 10.1145/3394486.3403118.
    [7] SHAO Zezhi, ZHANG Zhao, WANG Fei, et al. Pre-training enhanced spatial-temporal graph neural network for multivariate time series forecasting[C]. The 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Washington, USA, 2022: 1567–1577.
    [8] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]. The 31st International Conference on Neural Information Processing Systems, Long Beach, USA, 2017: 6000–6010.
    [9] YUAN Li, CHEN Yunpeng, WANG Tao, et al. Tokens-to-token ViT: Training vision transformers from scratch on ImageNet[C]. The IEEE/CVF International Conference on Computer Vision, Montreal, Canada, 2021: 538–547. doi: 10.1109/ICCV48922.2021.00060.
    [10] HUANG Siteng, WANG Donglin, WU Xuehan, et al. DSANet: Dual self-attention network for multivariate time series forecasting[C]. The 28th ACM International Conference on Information and Knowledge Management, Beijing, China, 2019: 2129–2132. doi: 10.1145/3357384.3358132.
    [11] LI Shiyang, JIN Xiaoyong, XUAN Yao, et al. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting[C]. The 33rd International Conference on Neural Information Processing Systems, Vancouver, Canada, 2019, 32: 471.
    [12] ZHOU Haoyi, ZHANG Shanghang, PENG Jieqi, et al. Informer: Beyond efficient transformer for long sequence time-series forecasting[C]. The 35th AAAI Conference on Artificial Intelligence, Palo Alto, USA, 2021: 11106–11115. doi: 10.1609/aaai.v35i12.17325.
    [13] WU Haixu, XU Jiehui, WANG Jianmin, et al. Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting[C]. The 35th International Conference on Neural Information Processing Systems, Red Hook, USA, 2021: 1717.
    [14] ZHOU Tian, MA Ziqing, WEN Qingsong, et al. FEDformer: Frequency enhanced decomposed transformer for long-term series forecasting[C]. International Conference on Machine Learning, Baltimore, USA, 2022: 27268–27286.
    [15] LIU Shizhan, YU Hang, LIAO Cong, et al. Pyraformer: Low-complexity pyramidal attention for long-range time series modeling and forecasting[C]. The Tenth International Conference on Learning Representations, Vienna, Austria, 2022: 1–20.
    [16] YUE Zhihan, WANG Yujing, DUAN Juanyong, et al. TS2Vec: Towards universal representation of time series[C]. The 36th AAAI Conference on Artificial Intelligence, Palo Alto, USA, 2022: 8980–8987. doi: 10.1609/aaai.v36i8.20881.
    [17] ZENG Ailing, CHEN Muxi, ZHANG Lei, et al. Are transformers effective for time series forecasting?[C]. The 37th AAAI Conference on Artificial Intelligence, Washington, USA, 2023: 11121–11128. doi: 10.1609/aaai.v37i9.26317.
Publication history
  • Received: 2023-11-15
  • Revised: 2024-07-14
  • Published online: 2024-07-29
  • Issue published: 2024-08-30
