Virtual simulation laboratories have emerged as a critical component of modern STEM education, offering immersive and safe platforms for experiential learning. However, existing systems often lack cohesive frameworks for integrating real-time data processing, multimodal visualization, and adaptive instructional feedback, which limits their responsiveness and pedagogical value. To address these limitations, the authors propose Integration–Real-time–Visualization with Feedback and Learning (IRV-FL), a layered framework that semantically fuses heterogeneous sensor streams, performs real-time rendering, and delivers context-aware feedback to improve learner engagement and accuracy. The system unifies physical and virtual environments through synchronous signal alignment, adaptive visualization pipelines, and dynamic feedback loops. Extensive experiments across eight evaluation dimensions compare IRV-FL against traditional virtual laboratories and six representative baseline models.
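The abstract does not include an implementation, so the following Python sketch is only an illustration of the kind of layered pipeline it describes: heterogeneous sensor streams are resampled onto a shared clock (a crude nearest-sample stand-in for synchronous signal alignment), fused into frames, and turned into learner-facing feedback. Every name below (Sample, align_streams, feedback) is hypothetical and is not part of IRV-FL.

```python
# Hypothetical sketch of a layered sensing-to-feedback pipeline; not the
# authors' IRV-FL code. Names and logic are illustrative assumptions.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Sample:
    timestamp: float  # seconds on the sensor's local clock
    value: float


def align_streams(streams: Dict[str, List[Sample]], period: float) -> List[Dict[str, float]]:
    """Resample heterogeneous streams onto a shared clock grid
    (a simple stand-in for 'synchronous signal alignment')."""
    if not streams:
        return []
    start = max(s[0].timestamp for s in streams.values())
    end = min(s[-1].timestamp for s in streams.values())
    frames, t = [], start
    while t <= end:
        # Pick the sample nearest to the shared clock tick for each stream.
        frames.append({
            name: min(samples, key=lambda s: abs(s.timestamp - t)).value
            for name, samples in streams.items()
        })
        t += period
    return frames


def feedback(frame: Dict[str, float], target: float) -> str:
    """Context-aware feedback reduced to a toy rule: compare the fused
    reading against an expected value and advise the learner."""
    fused = sum(frame.values()) / len(frame)
    error = fused - target
    if abs(error) < 0.05:
        return "Reading on target - proceed to the next step."
    return f"Reading off by {error:+.2f}; re-check the probe placement."


if __name__ == "__main__":
    streams = {
        "temperature": [Sample(0.00 + 0.1 * i, 24.8 + 0.01 * i) for i in range(20)],
        "pressure":    [Sample(0.03 + 0.1 * i, 25.1 - 0.01 * i) for i in range(20)],
    }
    for frame in align_streams(streams, period=0.5):
        print(frame, "->", feedback(frame, target=25.0))
```

A production system would presumably interpolate rather than take nearest samples, and the rendering and adaptive-visualization layers omitted here would consume the same aligned frames.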
Index Terms
Laboratories; Experiments; Time; Responsiveness; Simulation; Critical components; Visualization; Virtual reality; Experiential learning; Data processing; Science education; Frame analysis; Role models; Real time; Learning; STEM education; Semantics; Virtual environments; Pedagogy; User behavior; Feedback loops; Mathematics education; Technology education; Artificial intelligence; Digital twins; Sensors; Pipelines; Integrated approach; Digital technology
1 Changsha Normal University, China
2 Xiangtan University, China
3 Linköping University, Sweden
4 Hunan University of Information Technology, China
