Abstract

Because of their structure, traditional recurrent neural network models are prone to problems such as exploding gradients and over-fitting, while deep GRU neural networks suffer from low update efficiency and poor information processing across multiple hidden layers. To address this, this paper proposes an optimized gated recurrent unit (OGRU) neural network. The proposed OGRU model improves information processing capability and learning efficiency by optimizing the unit structure and learning mechanism of the GRU, and prevents the update gate from being interfered with by the current forgetting information. In the experiments, prediction models based on the LSTM, GRU, and OGRU neural networks are built with the TensorFlow framework and their prediction accuracy is compared. The results show that the OGRU model achieves the highest learning efficiency and better prediction accuracy.
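The record does not include the paper's OGRU equations, so only the baseline can be shown here. As orientation for the abstract's claim, a minimal sketch of one forward step of a standard GRU cell (the Cho et al. formulation that OGRU modifies) is given below; all names and initialization choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One forward step of a standard GRU cell.

    In the standard cell, the reset gate r scales the previous state h
    before it enters the candidate h_tilde; per the abstract, OGRU
    restructures this gating so the update gate z is not interfered
    with by the current forgetting information (exact change not
    given in this record).
    """
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)   # candidate state
    return (1.0 - z) * h + z * h_tilde              # new hidden state

def init_params(n_in, n_h, rng):
    """Small random weights for the three (W, U, b) gate triples."""
    def mat(rows, cols):
        return rng.standard_normal((rows, cols)) * 0.1
    return tuple(p for _ in range(3)
                 for p in (mat(n_h, n_in), mat(n_h, n_h), np.zeros(n_h)))

rng = np.random.default_rng(0)
params = init_params(4, 8, rng)
h = np.zeros(8)
h = gru_step(np.ones(4), h, params)  # state after one input step
```

Because the new state is a convex combination of the old state and a tanh candidate, a state started at zero always stays in (-1, 1).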

Details

Title
OGRU: An Optimized Gated Recurrent Unit Neural Network
Author
Wang, Xin 1 ; Xu, Jiabing 2 ; Shi, Wei 3 ; Liu, Jiarui 4 

 1 College of Computer Science and Technology, Harbin Engineering University, Harbin, Heilongjiang, 150001, China 
 2 Yantai Research Institute, China Agricultural University, Yantai, 264670, China 
 3 Beijing Aerospace Smart Technology Development Co., Ltd, Beijing, 100039, China 
 4 School of Economics and Management, Harbin Engineering University, Harbin, Heilongjiang, 150001, China 
Publication year
2019
Publication date
Oct 2019
Publisher
IOP Publishing
ISSN
1742-6588
e-ISSN
1742-6596
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2568002544
Copyright
© 2019. This work is published under http://creativecommons.org/licenses/by/3.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.