[1]
YE H, CHEN J, GONG S, et al. ATFNet: adaptive time-frequency ensembled network for long-term time series forecasting[EB/OL]. 2024-04-08/2025-04-30.
[2]
ZHOU H, ZHANG S, PENG J, et al. Informer: beyond efficient transformer for long sequence time-series forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2021, 35(12): 11106-11115.
[3]
OGASAWARA E, MARTINEZ L C, OLIVEIRA D, et al. Adaptive normalization: a novel data normalization approach for non-stationary time series[C]//The 2010 International Joint Conference on Neural Networks (IJCNN). IEEE, 2010: 1-8.
[4]
PASSALIS N, TEFAS A, KANNIAINEN J, et al. Deep adaptive input normalization for time series forecasting[J]. IEEE Transactions on Neural Networks and Learning Systems, 2019, 31(9): 3760-3765.
[5]
KIM T, KIM J, TAE Y, et al. Reversible instance normalization for accurate time-series forecasting against distribution shift[C]//International Conference on Learning Representations (ICLR), 2022.
[6]
LIU Y, WU H, WANG J, et al. Non-stationary transformers: exploring the stationarity in time series forecasting[J]. Advances in Neural Information Processing Systems, 2022, 35: 9881-9893.
[7]
ZENG A, CHEN M, ZHANG L, et al. Are transformers effective for time series forecasting?[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2023, 37(9): 11121-11128.
[8]
WU H, XU J, WANG J, et al. Autoformer: decomposition transformers with auto-correlation for long-term series forecasting[C]//Advances in Neural Information Processing Systems, 2021, 34: 22419-22430.
[9]
WU H, XU J, WANG J, et al. TimesNet: temporal 2D-variation modeling for general time series analysis[J]. arXiv preprint arXiv:2210.02186, 2022.
[10]
GIBBS J W. Fourier's series[J]. Nature, 1899, 59(1539): 606.
[11]
VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Advances in Neural Information Processing Systems, 2017, 30.
[12]
DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2019: 4171-4186.
[13]
HU J, SHEN L, ALBANIE S, et al. Squeeze-and-excitation networks[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2018: 7132-7141.
[14]
JIANG M, ZENG P, WANG K, et al. FECAM: frequency enhanced channel attention mechanism for time series forecasting[J]. Advanced Engineering Informatics, 2023, 58: 102158.
[15]
CHAVES S S, LYNFIELD R, LINDEGREN M L, et al. The US influenza hospitalization surveillance network[J]. Emerging Infectious Diseases, 2015, 21(9): 1543.
[16]
KITAEV N, KAISER L, LEVSKAYA A, et al. Reformer: the efficient transformer[J]. arXiv preprint arXiv:2001.04451, 2020: 1-12.
[17]
LAI G, CHANG W, YANG Y, et al. Modeling long- and short-term temporal patterns with deep neural networks[C]//The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, 2018: 95-104.
[18]
LI S, JIN X, XUAN Y, et al. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting[J]. Advances in Neural Information Processing Systems, 2019, 32: 5243-5253.