
Abstract

Traffic-flow prediction plays an important role in the construction of intelligent transportation systems (ITS). To improve the accuracy of short-term traffic flow prediction, a prediction model (GWO-attention-LSTM) is proposed that combines an optimized attention mechanism with long short-term memory (LSTM). The model is based on LSTM and uses the attention mechanism to assign individual weights to the feature information extracted by the LSTM, which increases the prediction model's focus on important information. The initial weight parameters of the attention mechanism are optimized using the grey wolf optimizer (GWO). By simulating the hunting process of grey wolves, the GWO algorithm computes the hunting positions of the wolves and maps them to the initial weight parameters of the attention mechanism. In this way, the short-term traffic flow prediction model is constructed. Traffic flow data from trunk roads in the center of Qingdao (China) are used as the research object, and multiple sets of comparison models are set up for prediction analysis. The results show that the GWO-attention-LSTM model has clear advantages over the other models: its prediction error (MAE) decreased by 7.32% and 14.35% on average compared with the attention-LSTM and LSTM models, respectively. It is concluded that the GWO-attention-LSTM model performs better and can provide effective support for traffic management and control and for traffic flow theory research.
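The abstract does not give the authors' exact formulation, but the GWO mechanics it describes (a pack of candidate solutions guided toward the three best wolves, alpha, beta, and delta, with positions mapped to the attention mechanism's initial weights) can be sketched roughly as follows. This is a minimal illustration against a toy weight-fitting objective, not the paper's implementation; the objective, bounds, and hyperparameters are all assumptions.

```python
import numpy as np

def gwo_minimize(loss, dim, n_wolves=10, n_iters=50, bounds=(-1.0, 1.0), seed=0):
    """Minimal Grey Wolf Optimizer: each wolf is a candidate weight vector,
    updated toward the three best wolves (alpha, beta, delta)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    wolves = rng.uniform(lo, hi, size=(n_wolves, dim))
    for t in range(n_iters):
        fitness = np.array([loss(w) for w in wolves])
        order = np.argsort(fitness)
        alpha, beta, delta = wolves[order[:3]]   # three best wolves lead the pack
        a = 2.0 * (1 - t / n_iters)              # control parameter decays from 2 to 0
        for i in range(n_wolves):
            new_pos = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2 * a * r1 - a               # encircling coefficient
                C = 2 * r2                       # exploration coefficient
                D = np.abs(C * leader - wolves[i])
                new_pos += leader - A * D
            wolves[i] = np.clip(new_pos / 3.0, lo, hi)  # average of the three pulls
    fitness = np.array([loss(w) for w in wolves])
    return wolves[np.argmin(fitness)]

# Stand-in objective: distance of candidate "attention weights" to a target vector.
# In the paper, the fitness would instead be the prediction error of the
# attention-LSTM model initialized with these weights.
target = np.array([0.2, -0.5, 0.7, 0.1])
best = gwo_minimize(lambda w: float(np.sum((w - target) ** 2)), dim=4)
```

The returned `best` vector would then serve as the initial weights of the attention layer before end-to-end training, which is the role the GWO plays in the proposed model.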

Details

Title
Short-Term Traffic Flow Prediction Based on the Optimization Study of Initial Weights of the Attention Mechanism
Author
Lan, Tianhe 1 ; Zhang, Xiaojing 2 ; Qu, Dayi 1 ; Yang, Yufeng 1 ; Chen, Yicheng 1 

 1 School of Mechanical and Automotive Engineering, Qingdao University of Technology, Qingdao 266520, China 
 2 Journal Editorial Department, Qingdao University of Technology, Qingdao 266520, China 
First page
1374
Publication year
2023
Publication date
2023
Publisher
MDPI AG
e-ISSN
2071-1050
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2767295324
Copyright
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.