Document Type: Research Article
School of Intelligent Systems, University of Tehran, Tehran, Iran.
| [1] | Han, Z. and Zhao, J. and Leung, H. and Ma, K. F. and Wang, W. A Review of Deep Learning Models for Time Series Prediction. IEEE Sensors Journal. 1--8, 2019. |
| [2] | Salinas, D. and Flunkert, V. and Gasthaus, J. and Januschowski, T. DeepAR: Probabilistic Forecasting with Autoregressive Recurrent Networks. International Journal of Forecasting. 2019. |
| [3] | Brown, T. B. and Mann, B. and Ryder, N. and Subbiah, M. and Kaplan, J. and Dhariwal, P. and Neelakantan, A. and Shyam, P. and Sastry, G. and Askell, A. and Agarwal, S. and Herbert-Voss, A. and Krueger, G. and Henighan, T. and Child, R. and others. Language Models are Few-Shot Learners. Advances in Neural Information Processing Systems. 2020. |
| [4] | Zhou, H. and Zhang, S. and Peng, J. and Zhang, S. and Li, J. and Xiong, H. and Zhang, W. Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. AAAI Conference on Artificial Intelligence. 2021. |
| [5] | Nie, Y. and Nguyen, N. H. and Sinthong, P. and Kalagnanam, J. A Time Series is Worth 64 Words: Long-term Forecasting with Transformers. International Conference on Learning Representations (ICLR). 2023. |
| [6] | Zeng, A. and Chen, M. and Zhang, L. and Xu, Q. Are Transformers Effective for Time Series Forecasting? AAAI Conference on Artificial Intelligence. 2023. |
| [7] | Touvron, H. and Lavril, T. and Izacard, G. and Martinet, X. and Lachaux, M.-A. and Lacroix, T. and Rozière, B. and Goyal, N. and Hambro, E. and Azhar, F. and Rodriguez, A. and Joulin, A. and Grave, E. and Lample, G. LLaMA: Open and Efficient Foundation Language Models. arXiv preprint arXiv:2302.13971. 2023. |
| [8] | Lee, J. and Youn, H. L. and Poon, J. and Han, S. C. StockEmotions: Discover Investor Emotions for Financial Sentiment Analysis and Multivariate Time Series. AAAI Bridge Conference on AI for Financial Services. 2023. |
| [9] | Liu, X.-Y. and Wang, G. and Yang, H. and Zha, D. FinGPT: Democratizing Internet-scale Data for Financial Large Language Models. NeurIPS Instruction Workshop. 2023. |
| [10] | Das, A. and Kong, W. and Sen, R. and Zhou, Y. A Decoder-Only Foundation Model for Time-Series Forecasting. International Conference on Machine Learning (ICML). 2024. |
| [11] | Li, X. and Shen, X. and Zeng, Y. and Xing, X. and Xu, J. FinReport: Explainable Stock Earnings Forecasting via News Factor Analyzing Model. The Web Conference. 2024. |
| [12] | Jia, F. and Wang, K. and Zheng, Y. and Cao, D. and Liu, Y. GPT4MTS: Prompt-based Large Language Model for Multimodal Time-series Forecasting. AAAI Conference on Artificial Intelligence. 2024. |
| [13] | Rasul, K. and Ashok, A. and Williams, A. R. and Ghonia, H. and Bhagwatkar, R. and Khorasani, A. and Darvishi Bayazi, M. J. and Adamopoulos, G. and Riachi, R. and Hassen, N. and Biloš, M. and Garg, S. and Schneider, A. and Chapados, N. Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting. arXiv preprint. 2024. |
| [14] | Li, T. and Liu, Z. and Shen, Y. and Wang, X. and Chen, H. and Huang, S. MASTER: Market-Guided Stock Transformer for Stock Price Forecasting. AAAI Conference on Artificial Intelligence. 2024. |
| [15] | Fan, J. and Shen, Y. StockMixer: A Simple yet Strong MLP-Based Architecture for Stock Price Forecasting. AAAI Conference on Artificial Intelligence. 2024. |
| [16] | Cao, D. and Jia, F. and Arik, S. Ö. and Pfister, T. and Zheng, Y. and Ye, W. and Liu, Y. TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting. International Conference on Learning Representations (ICLR). 2024. |
| [17] | Jin, M. and Wang, S. and Ma, L. and Chu, Z. and Zhang, J. Y. and Shi, X. and Chen, P.-Y. and Liang, Y. and Li, Y.-F. and Pan, S. and Wen, Q. Time-LLM: Time Series Forecasting by Reprogramming Large Language Models. International Conference on Learning Representations (ICLR). 2024. |
| [18] | Ekambaram, V. and Jati, A. and Dayama, P. and Mukherjee, S. and Nguyen, N. H. and Gifford, W. M. and Reddy, C. and Kalagnanam, J. Tiny Time Mixers (TTMs): Fast Pre-trained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series. arXiv preprint arXiv:2401.03955. 2024. |