Abstract
Comments posted during the evolution of network public opinion events not only reflect netizens' attitudes toward the events themselves but also serve as a key basis for tracking public opinion dynamics. Based on comment data collected during event evolution, an event feature vector pre-training model, NL2ER-Transformer, is constructed to extract event features automatically and in real time. First, a semi-supervised multi-label curriculum learning model is proposed to generate comment words, event word vectors, event words, and event sentences, so that a public opinion event is mapped into a sequence analogous to vectorized natural language. Second, based on the Transformer architecture, a training method is proposed that simulates the evolution process of events, enabling the event vector generation model to learn the laws of event evolution and the characteristics of reversal events. Finally, the event vectors generated by the proposed NL2ER-Transformer are compared with those generated by current mainstream models such as XLNet and RoBERTa. The pre-trained NL2ER-Transformer and three pre-trained benchmark models are evaluated on four downstream classification models. The experimental results show that downstream models trained with vectors generated by NL2ER-Transformer achieve accuracy, recall, and F1 scores that are 16.66%, 44.44%, and 19% higher, respectively, than those of the best downstream model trained with vectors from the other pre-trained benchmarks. Meanwhile, in the evolution capability analysis, only four events exhibit partial errors. As for the semi-supervised component, the proposed semi-supervised multi-label curriculum learning model outperforms mainstream models on four indicators by 6%, 23%, 8%, and 15%, respectively.
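The abstract describes mapping an event's comment words into word vectors and then into a single event vector via a Transformer-style model. The following is a minimal, hypothetical sketch of that idea only: a single self-attention pass (with identity projections, no learned weights, no multi-head or feed-forward layers) that contextualizes event-word vectors and mean-pools them into one event vector. The function names, dimensions, and pooling choice are illustrative assumptions, not the paper's actual NL2ER-Transformer architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over attention scores
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # X: (seq_len, d) event-word vectors; scaled dot-product attention
    # with identity Q/K/V projections (a simplification for illustration)
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)
    return softmax(scores, axis=-1) @ X

def event_vector(word_vecs):
    # contextualize the event words, then mean-pool into one event vector,
    # analogous to collapsing a vectorized "event sentence" into a fixed vector
    H = self_attention(word_vecs)
    return H.mean(axis=0)

rng = np.random.default_rng(0)
words = rng.standard_normal((12, 64))  # 12 event words, 64-dim word vectors
vec = event_vector(words)
print(vec.shape)  # (64,)
```

The resulting fixed-size vector is the kind of representation the paper feeds to downstream classification models; in the actual system the attention weights would be learned and the pooling strategy may differ.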