
Abstract

This paper proposes a hybrid deep learning model for robust and interpretable sentiment classification of Twitter data. The model integrates Bidirectional Encoder Representations from Transformers (BERT)-based contextual embeddings, a Bidirectional Long Short-Term Memory (BiLSTM) network, and a custom attention mechanism to classify tweets into four sentiment categories: Positive, Negative, Neutral, and Irrelevant. To address the challenges of noisy and multilingual social media content, the model incorporates a comprehensive preprocessing pipeline and data augmentation strategies, including back-translation and synonym replacement. An ablation study demonstrates that combining BERT with BiLSTM improves the model’s sensitivity to sequence dependencies, while the attention mechanism enhances both classification accuracy and interpretability. Empirical results show that the proposed model outperforms BERT-only and BERT+BiLSTM baselines, achieving F1-scores above 0.94 across all sentiment classes. Attention weight visualizations further reveal the model’s ability to focus on sentiment-bearing tokens, providing transparency in decision-making. The proposed framework is well suited for deployment in real-time sentiment monitoring systems and offers a scalable solution for multilingual, multi-class sentiment analysis in dynamic social media environments. The Methods section also includes an exploratory data analysis that characterizes the dataset.
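To make the described architecture concrete, the following is a minimal PyTorch sketch of a BERT + BiLSTM + additive-attention classifier over four sentiment classes. The checkpoint name (bert-base-multilingual-cased), hidden sizes, and the exact attention formulation are illustrative assumptions and are not taken from the paper.

# Minimal sketch of a BERT -> BiLSTM -> attention classifier (PyTorch + Hugging Face).
# Layer sizes, checkpoint, and attention form are assumptions, not the authors' configuration.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class BertBiLstmAttention(nn.Module):
    def __init__(self, checkpoint="bert-base-multilingual-cased",
                 lstm_hidden=128, num_classes=4):
        super().__init__()
        self.bert = AutoModel.from_pretrained(checkpoint)          # contextual token embeddings
        self.bilstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                              batch_first=True, bidirectional=True)
        self.att_proj = nn.Linear(2 * lstm_hidden, 2 * lstm_hidden)
        self.att_vector = nn.Linear(2 * lstm_hidden, 1, bias=False)  # additive attention score
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        # BERT token representations: (batch, seq_len, hidden)
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # BiLSTM re-encodes the sequence to capture forward/backward dependencies
        seq, _ = self.bilstm(hidden)                                 # (batch, seq_len, 2*lstm_hidden)
        # Additive attention over tokens; padding positions are masked before softmax
        scores = self.att_vector(torch.tanh(self.att_proj(seq))).squeeze(-1)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)                      # (batch, seq_len)
        # Attention-weighted sum of token states gives the sentence vector
        pooled = torch.bmm(weights.unsqueeze(1), seq).squeeze(1)
        return self.classifier(pooled), weights                      # class logits + weights for inspection

# Usage sketch
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = BertBiLstmAttention()
batch = tokenizer(["Loving the new update!", "Server down again..."],
                  padding=True, truncation=True, return_tensors="pt")
logits, att = model(batch["input_ids"], batch["attention_mask"])
print(logits.shape, att.shape)   # (2, 4) logits and per-token attention weights

Returning the attention weights alongside the logits mirrors the interpretability claim in the abstract: the weights can be plotted per token to show which words drive each prediction.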

Details

Title
A BERT–LSTM–Attention Framework for Robust Multi-Class Sentiment Analysis on Twitter Data
Author
Zhang, Xinyu 1; Liu, Yang 2; Zhang, Tianhui 3; Hou, Lingmin 1; Liu, Xianchen 4; Guo, Zhen 5; Mulati, Aliya 6

1 Department of Computer Science, Rochester Institute of Technology, Rochester, NY 14623, USA; [email protected]
2 College of Arts & Sciences, University of Miami, Miami, FL 33124, USA; [email protected]
3 College of Engineering, Northeastern University, Boston, MA 02115, USA; [email protected]
4 Department of Computer Engineering, Florida International University, Miami, FL 33199, USA; [email protected]
5 Department of Material Engineering, Florida International University, Miami, FL 33199, USA; [email protected]
6 Department of Politics and International Relations, Florida International University, Miami, FL 33199, USA; [email protected]
Publication title
Systems; Basel
Volume
13
Issue
11
First page
964
Number of pages
20
Publication year
2025
Publication date
2025
Publisher
MDPI AG
Place of publication
Basel
Country of publication
Switzerland
e-ISSN
2079-8954
Source type
Scholarly Journal
Language of publication
English
Document type
Journal Article
Publication history
Online publication date
2025-10-30
Milestone dates
2025-09-23 (Received); 2025-10-24 (Accepted)
First posting date
30 Oct 2025
ProQuest document ID
3275564978
Document URL
https://www.proquest.com/scholarly-journals/bert-lstm-attention-framework-robust-multi-class/docview/3275564978/se-2?accountid=208611
Copyright
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Last updated
2025-11-26
Database
2 databases
  • Coronavirus Research Database
  • ProQuest One Academic