Volume 46 Issue 8
Aug. 2024
PAN Jinwei, WANG Yiqiao, ZHONG Bo, WANG Xiaoling. Statistical Feature-based Search for Multivariate Time Series Forecasting[J]. Journal of Electronics & Information Technology, 2024, 46(8): 3276-3284. doi: 10.11999/JEIT231264

Statistical Feature-based Search for Multivariate Time Series Forecasting

doi: 10.11999/JEIT231264 cstr: 32379.14.JEIT231264
Funds: The National Natural Science Foundation of China (61972155)
  • Received Date: 2023-11-15
  • Rev Recd Date: 2024-07-14
  • Available Online: 2024-07-29
  • Publish Date: 2024-08-30
  • Time series exhibit long-term dependencies, such as trends, seasonality, and periodicity, which may span several months. Existing methods are insufficient for modeling these long-term dependencies explicitly. To address this issue, this paper proposes Statistical Feature-based Search for multivariate time series Forecasting (SFSF). First, statistical features, including smoothed values, variance, and interval-standardized values, are extracted from the multivariate time series to enhance the perception of its trends and periodicity. These statistical features are then used to search historical sequences for similar series. Finally, the current and retrieved historical sequence information is blended using attention mechanisms to produce the prediction. Experimental results show that SFSF outperforms six state-of-the-art methods.
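The abstract's pipeline (extract statistical features, retrieve similar historical windows, blend with attention) can be illustrated with a minimal sketch. The window length, the exact feature set, and the softmax-over-distance attention form below are assumptions for illustration, not the authors' published design.

```python
import numpy as np

def window_features(w):
    """Statistical features of one window (hypothetical choices):
    a smoothed mean, the variance, and an interval-standardized
    (min-max) mean, echoing the features named in the abstract."""
    smooth = np.convolve(w.mean(axis=1), np.ones(3) / 3, mode="valid").mean()
    var = w.var()
    rng = w.max() - w.min()
    std_mean = ((w - w.min()) / rng).mean() if rng > 0 else 0.0
    return np.array([smooth, var, std_mean])

def sfsf_forecast(series, window=24, top_k=5):
    """Retrieve the top-k historical windows whose statistical features
    are closest to the current window, then blend their successor steps
    with softmax attention weights to form a next-step forecast."""
    T, D = series.shape
    query = window_features(series[-window:])
    keys, values = [], []
    for s in range(T - 2 * window):           # leave room for a successor
        keys.append(window_features(series[s:s + window]))
        values.append(series[s + window])     # step right after the window
    dist = np.linalg.norm(np.array(keys) - query, axis=1)
    idx = np.argsort(dist)[:top_k]            # k most similar windows
    att = np.exp(-dist[idx])                  # attention over similarity
    att /= att.sum()
    return att @ np.array(values)[idx]        # (D,) next-step prediction

# Toy usage: a noisy 2-variate periodic series with period 24.
rng = np.random.default_rng(0)
t = np.arange(400)
X = np.stack([np.sin(2 * np.pi * t / 24), np.cos(2 * np.pi * t / 24)], axis=1)
X += 0.05 * rng.standard_normal(X.shape)
pred = sfsf_forecast(X)
print(pred.shape)
```

The retrieval step makes the periodic structure explicit: windows one full period back score as near-duplicates of the current window, so their successors dominate the attention-weighted forecast.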
