Abstract
The integration of artificial intelligence into educational environments has the potential to revolutionize teaching and learning by enabling real-time analysis of students’ emotions, which are crucial determinants of engagement, motivation, and learning outcomes. Accurately detecting and responding to these emotions, however, remains a significant challenge, particularly in online and remote learning settings where direct teacher-student interaction is limited. Traditional educational approaches often fail to account for students’ emotional states, which can lead to disengagement and reduced learning effectiveness. This study addresses the problem by developing a refined convolutional neural network (CNN) model that detects students’ emotions with high accuracy, using the FER2013 facial expression recognition dataset. The methodology involved preprocessing the dataset, including normalization and augmentation, to ensure the robustness and generalizability of the model. The CNN architecture combines multiple convolutional, batch normalization, and dropout layers to classify seven basic emotions: anger, disgust, fear, happiness, sadness, surprise, and neutral. The model was trained and validated on an 80-20 split of the dataset, with learning-rate reduction and early stopping applied to improve performance and prevent overfitting. The model achieved a test accuracy of 95%, with consistently high precision and recall across all emotion categories. This level of accuracy indicates that the model can recognize subtle differences in facial expressions, making it suitable for real-time use in educational settings.
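The abstract describes a CNN with convolutional, batch normalization, and dropout layers, trained with learning-rate reduction and early stopping. The following is a minimal Keras sketch of that kind of pipeline, not the authors' exact architecture: the number of blocks, filter counts, dropout rates, and callback patience values are illustrative assumptions, and the 48×48 grayscale input shape reflects the standard FER2013 image format.

```python
# Illustrative sketch of a CNN for FER2013-style emotion recognition:
# 48x48 grayscale face images classified into 7 basic emotions.
# Layer depths, filter sizes, and hyperparameters are assumptions,
# not the exact configuration reported in the paper.
import tensorflow as tf
from tensorflow.keras import layers, models, callbacks

NUM_CLASSES = 7  # anger, disgust, fear, happiness, sadness, surprise, neutral

def build_model(input_shape=(48, 48, 1)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        # Convolutional blocks with batch normalization and dropout,
        # mirroring the layer types named in the abstract.
        layers.Conv2D(32, 3, padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.MaxPooling2D(),
        layers.Dropout(0.25),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.MaxPooling2D(),
        layers.Dropout(0.25),
        # Classifier head over the 7 emotion categories.
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Training measures mentioned in the abstract: learning-rate reduction
# and early stopping, both monitored on validation loss.
training_callbacks = [
    callbacks.ReduceLROnPlateau(monitor="val_loss", factor=0.5, patience=3),
    callbacks.EarlyStopping(monitor="val_loss", patience=8,
                            restore_best_weights=True),
]

model = build_model()
```

In practice the model would be fitted with something like `model.fit(x_train, y_train, validation_split=0.2, callbacks=training_callbacks)` after normalizing pixel values to [0, 1] and applying data augmentation, matching the 80-20 train-validation split described above.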




