SCIENTIFIC JOURNAL BULLETIN OF THE VORONEZH INSTITUTE OF HIGH TECHNOLOGIES
Online media
ISSN 2949-4443

Transformers and Attention Mechanisms for Financial Time Series Forecasting: Application in Investment Analysis of the Energy Sector

Koshelev N.M., Tarlykov A.V., Preobrazhenskiy A.P.

UDC 004.8+519.72

  • Abstract
  • List of references
  • About authors

This paper investigates attention-based architectures applied to financial time-series forecasting in the context of investment analysis of energy-sector equities. The evolution from autoregressive models and LSTM to transformers is traced, and the mathematics of scaled dot-product attention and the Temporal Fusion Transformer (TFT) architecture are analyzed. Comparative analysis shows that TFT outperforms baseline models on MAE and SMAPE, with a 36% error reduction vs. ARIMA and 17% vs. LSTM, while maintaining forecast interpretability. The paper also considers the quadratic computational complexity of full attention, the Informer architecture as a solution for long sequences, and the fundamental limitations of the transformer approach, including evidence that a properly configured linear model outperforms complex transformer architectures on a number of standard benchmarks.
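The scaled dot-product attention analyzed in the paper follows the standard formulation Attention(Q, K, V) = softmax(QKᵀ/√d_k)V from Vaswani et al. (ref. 4). A minimal NumPy sketch of this computation (an illustration of the general mechanism, not the authors' specific implementation) is given below; the matrix sizes are arbitrary toy values:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention:
    Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (n_q, n_k) similarity scores
    # numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 4 time steps, model dimension 8
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

Note that the `scores` matrix has one entry per pair of time steps, which is the source of the quadratic complexity in sequence length discussed above and the motivation for the sparse-attention approach of Informer (ref. 6).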

1. Are Transformers Effective for Time Series Forecasting? / A. Zeng, M. Chen, L. Zhang, Q. Xu // Thirty-Seventh AAAI Conference on Artificial Intelligence, AAAI 2023, Thirty-Fifth Conference on Innovative Applications of Artificial Intelligence, IAAI 2023, Thirteenth Symposium on Educational Advances in Artificial Intelligence, EAAI 2023, Washington, DC, USA, 07–14 February 2023. – AAAI Press, 2023. – P. 11121–11128.

2. Ozbayoglu A.M. Deep Learning for Financial Applications: A Survey / A.M. Ozbayoglu, M.U. Gudelek, O.B. Sezer // Applied Soft Computing. – 2020. – Vol. 93. – URL: https://doi.org/10.1016/j.asoc.2020.106384 (accessed: 14.02.2026).

3. Hochreiter S. Long Short-Term Memory / S. Hochreiter, J. Schmidhuber // Neural Computation. – 1997. – Vol. 9, No. 8. – P. 1735–1780.

4. Attention Is All You Need / A. Vaswani, N. Shazeer, N. Parmar [et al.] // Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, 04–09 December 2017, Long Beach, CA, USA. – 2017. – P. 5998–6008.

5. Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting / B. Lim, S.Ö. Arık, N. Loeff, T. Pfister // International Journal of Forecasting. – 2021. – Vol. 37, Iss. 4. – P. 1748–1764.

6. Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting / H. Zhou, Sh. Zhang, J. Peng [et al.] // Thirty-Fifth AAAI Conference on Artificial Intelligence, AAAI 2021, Thirty-Third Conference on Innovative Applications of Artificial Intelligence, IAAI 2021, The Eleventh Symposium on Educational Advances in Artificial Intelligence, EAAI 2021, Virtual Event, 02–09 February 2021. – AAAI Press, 2021. – P. 11106–11115.

Koshelev Nikita Mikhailovich

Voronezh Institute of High Technologies

Voronezh, Russia

Tarlykov Alexander Vyacheslavovich

Voronezh Institute of High Technologies

Voronezh, Russia

Preobrazhenskiy Andrey Petrovich
Doctor of Engineering Sciences, Full Professor

Voronezh Institute of High Technologies

Voronezh, Russia

Keywords: transformer, attention mechanism, time series, forecasting, energy sector, interpretability, investment analysis

For citation: Koshelev N.M., Tarlykov A.V., Preobrazhenskiy A.P. Transformers and Attention Mechanisms for Financial Time Series Forecasting: Application in Investment Analysis of the Energy Sector. Bulletin of the Voronezh Institute of High Technologies. 2026;20(1). Available from: https://vestnikvivt.ru/ru/journal/pdf?id=1467 (In Russ.)


Full text in PDF

Received 11.03.2026

Revised 27.03.2026

Accepted 27.03.2026

Published 31.03.2026