When constructing a mathematical knowledge graph for a knowledge question-answering system or a course recommendation system, Named Entity Recognition (NER) is indispensable, and the accuracy of its recognition directly affects the performance of these downstream tasks. To improve the accuracy of mathematical knowledge entity recognition and provide effective support for subsequent functionalities, this paper adopts the recent pre-trained language model LERT, combined with a Bidirectional Gated Recurrent Unit (BiGRU), Iterated Dilated Convolutional Neural Networks (IDCNNs), and Conditional Random Fields (CRFs), to construct the LERT-BiGRU-IDCNN-CRF model. First, LERT provides context-dependent word vectors; then the BiGRU captures both long-distance and short-distance information, the IDCNN extracts local features, and finally the CRF decodes and outputs the corresponding labels. Experimental results show that when recognizing mathematical concept and theorem entities, this model achieves an accuracy of 97.22%, a recall of 97.47%, and an F1 score of 97.34%. The model can accurately recognize the required entities, and comparison shows that it outperforms current state-of-the-art entity recognition models.
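As a rough illustration of the pipeline described in the abstract, the following is a minimal sketch (not the authors' code) of a LERT-BiGRU-IDCNN-CRF tagger in PyTorch. It assumes the Hugging Face `transformers` and `pytorch-crf` packages; the checkpoint name `hfl/chinese-lert-base`, the layer sizes, and the dilation schedule are illustrative assumptions rather than the paper's reported configuration.

```python
# Sketch of the LERT -> BiGRU -> IDCNN -> CRF architecture (assumed hyperparameters).
import torch
import torch.nn as nn
from transformers import AutoModel        # LERT checkpoint loaded via AutoModel
from torchcrf import CRF                  # pytorch-crf package


class LertBiGruIdcnnCrf(nn.Module):
    def __init__(self, num_labels: int,
                 lert_name: str = "hfl/chinese-lert-base",   # assumed checkpoint name
                 gru_hidden: int = 256, idcnn_filters: int = 128):
        super().__init__()
        # LERT: context-dependent token representations
        self.encoder = AutoModel.from_pretrained(lert_name)
        hidden = self.encoder.config.hidden_size
        # BiGRU: bidirectional recurrence for long- and short-distance context
        self.bigru = nn.GRU(hidden, gru_hidden, batch_first=True, bidirectional=True)
        # IDCNN: stacked dilated 1-D convolutions for local features
        self.idcnn = nn.Sequential(
            nn.Conv1d(2 * gru_hidden, idcnn_filters, kernel_size=3, padding=1, dilation=1),
            nn.ReLU(),
            nn.Conv1d(idcnn_filters, idcnn_filters, kernel_size=3, padding=2, dilation=2),
            nn.ReLU(),
            nn.Conv1d(idcnn_filters, idcnn_filters, kernel_size=3, padding=4, dilation=4),
            nn.ReLU(),
        )
        self.classifier = nn.Linear(idcnn_filters, num_labels)
        # CRF: decodes the label sequence from per-token emission scores
        self.crf = CRF(num_labels, batch_first=True)

    def _emissions(self, input_ids, attention_mask):
        x = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        x, _ = self.bigru(x)                                  # (batch, seq, 2*gru_hidden)
        x = self.idcnn(x.transpose(1, 2)).transpose(1, 2)     # conv over the sequence axis
        return self.classifier(x)                             # (batch, seq, num_labels)

    def forward(self, input_ids, attention_mask, labels):
        emissions = self._emissions(input_ids, attention_mask)
        # Negative log-likelihood of the gold tag sequence under the CRF
        return -self.crf(emissions, labels, mask=attention_mask.bool())

    def decode(self, input_ids, attention_mask):
        emissions = self._emissions(input_ids, attention_mask)
        return self.crf.decode(emissions, mask=attention_mask.bool())
```

In this sketch the CRF layer supplies both the training loss (negative log-likelihood) and Viterbi decoding at inference time; the BIO (or similar) label inventory and tokenization scheme would come from the mathematical-entity dataset, which is not specified here.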
Details
; He, Zheng 1; Ma, Shuaiqi 1; Zhang, Mingze 2; Guo, Wei 3; Ning, Keqing 1
1 School of Information Science and Technology, North China University of Technology, Beijing 100144, China;
2 State Grid Jilin Electric Power Research Institute, Changchun 130015, China;
3 School of Electrical and Control Engineering, North China University of Technology, Beijing 100144, China