Full Text

Abstract

Financial time-series prediction is an important topic in deep learning, and accurate forecasts matter to investors, commercial banks and regulators. This paper proposes a model based on a multiplex attention mechanism and a linear transformer structure to predict financial time series. The linear transformer trains more efficiently and supports long-horizon forecasting: it reduces the complexity of the original transformer while preserving the multiplex attention mechanism in the decoder. The results show that the proposed method effectively improves prediction accuracy, increases inference speed and reduces the number of operations, which has new implications for financial time-series prediction.
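
As context for the abstract's claim about reduced complexity, the following is a minimal sketch of kernelized (linear) attention in PyTorch, in the spirit of the linear transformer of Katharopoulos et al. (2020). The feature map, tensor shapes and function names here are illustrative assumptions for exposition, not the authors' implementation.

    import torch
    import torch.nn.functional as F

    def elu_feature_map(x):
        # Positive feature map phi(x) = elu(x) + 1, a common choice in linear attention.
        return F.elu(x) + 1.0

    def linear_attention(q, k, v, eps=1e-6):
        # q, k: (batch, n, d); v: (batch, n, e). Instead of softmax(Q K^T) V,
        # compute phi(Q) (phi(K)^T V), so the n x n attention matrix is never
        # formed and the cost grows linearly in the sequence length n.
        q, k = elu_feature_map(q), elu_feature_map(k)
        kv = torch.einsum('bnd,bne->bde', k, v)                         # sum_n phi(k_n) v_n^T
        z = 1.0 / (torch.einsum('bnd,bd->bn', q, k.sum(dim=1)) + eps)   # row normalizer
        return torch.einsum('bnd,bde,bn->bne', q, kv, z)

    # Example: a batch of 8 price windows, each 256 steps long with 64-dim features.
    q = k = v = torch.randn(8, 256, 64)
    out = linear_attention(q, k, v)  # (8, 256, 64), computed without any 256 x 256 matrix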

Details

Title
A Financial Time-Series Prediction Model Based on Multiplex Attention and Linear Transformer Structure
Author
Xu, Caosen 1; Li, Jingyuan 1; Feng, Bing 1; Lu, Baoli 2

1 School of Management, Wuhan Institute of Technology, Wuhan 430205, China; [email protected] (C.X.)
2 Institute of Semiconductors, Chinese Academy of Sciences, Beijing 100083, China; School of Computing, University of Portsmouth, Portsmouth PO1 3HE, UK
First page
5175
Publication year
2023
Publication date
2023
Publisher
MDPI AG
e-ISSN
2076-3417
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2806475141
Copyright
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.