Abstract
Sentiment classification, a crucial task in natural language processing, plays a significant role in deciphering sentiment expressions within textual data. Previous work has largely ignored the impact of text preprocessing on sentiment classification and has not preserved both local and global context dependencies. To overcome these challenges, we investigate the impact of text preprocessing techniques and introduce a novel sentiment classification framework, the CNN-BiLSTM Multi-Attention Fusion Mechanism (CBMAFM), which preserves local and global context dependencies. The proposed CBMAFM uses a multi-attention fusion mechanism to leverage the synergistic power of convolutional neural networks (CNN) and bidirectional long short-term memory networks (BiLSTM). Its multi-attention mechanism attends to multiple levels of granularity within the input text. This fine-grained attention enables the model to focus on sentiment-bearing words and phrases, capturing the nuances of sentiment expression while avoiding information loss due to text length variations. By combining CNN and BiLSTM modules, CBMAFM capitalizes on the strengths of both architectures, effectively capturing local patterns and contextual dependencies, respectively. Experiments on benchmark datasets, namely Electronics reviews, STS-Gold, Twitter reviews, and Movie reviews, yield accuracy improvements of 2.54%, 1.65%, 2.26%, and 2.14% on the respective datasets. The results demonstrate that the proposed CBMAFM outperforms other state-of-the-art methods.
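
The sketch below illustrates the kind of architecture the abstract describes: parallel CNN and BiLSTM branches over shared embeddings, per-branch attention, and fusion of the attended summaries before classification. It is a minimal illustration only; the kernel sizes, hidden dimensions, number of attention components, and the concatenation-based fusion are assumptions, not the exact configuration reported in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CBMAFM(nn.Module):
    """Minimal sketch of a CNN-BiLSTM model with multi-attention fusion (assumed hyperparameters)."""
    def __init__(self, vocab_size, embed_dim=100, num_filters=100,
                 kernel_sizes=(3, 5, 7), lstm_hidden=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # CNN branch: parallel 1-D convolutions capture local n-gram patterns.
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, k, padding=k // 2) for k in kernel_sizes
        )
        # BiLSTM branch: captures global, bidirectional context dependencies.
        self.bilstm = nn.LSTM(embed_dim, lstm_hidden, batch_first=True, bidirectional=True)
        # One attention component per branch ("multi-attention"); each scores tokens
        # and produces a weighted summary of its branch's features.
        self.cnn_attn = nn.Linear(num_filters * len(kernel_sizes), 1)
        self.lstm_attn = nn.Linear(2 * lstm_hidden, 1)
        # Fusion: concatenate the two attended summaries and classify.
        self.classifier = nn.Linear(num_filters * len(kernel_sizes) + 2 * lstm_hidden,
                                    num_classes)

    @staticmethod
    def _attend(features, scorer):
        # features: (batch, seq_len, dim) -> attention-weighted sum over seq_len
        weights = torch.softmax(scorer(features), dim=1)       # (batch, seq_len, 1)
        return (weights * features).sum(dim=1)                  # (batch, dim)

    def forward(self, token_ids):
        x = self.embedding(token_ids)                            # (batch, seq, embed)
        # Conv1d expects (batch, channels, seq); odd kernels + k//2 padding keep length.
        conv_out = [F.relu(conv(x.transpose(1, 2))) for conv in self.convs]
        conv_out = torch.cat(conv_out, dim=1).transpose(1, 2)    # (batch, seq, filters*K)
        lstm_out, _ = self.bilstm(x)                             # (batch, seq, 2*hidden)
        fused = torch.cat([self._attend(conv_out, self.cnn_attn),
                           self._attend(lstm_out, self.lstm_attn)], dim=-1)
        return self.classifier(fused)

# Usage: a batch of 8 reviews, each padded/truncated to 50 token ids.
model = CBMAFM(vocab_size=20000)
logits = model(torch.randint(1, 20000, (8, 50)))
print(logits.shape)  # torch.Size([8, 2])
```

In this reading, the CNN attention emphasizes sentiment-bearing local phrases while the BiLSTM attention weights tokens by their global context, and the fusion step combines both views so that neither short local cues nor long-range dependencies are lost across varying text lengths.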






