Abstract
Emerging technologies and innovative instructional methods have revolutionized education, making blended learning the new standard in the artificial intelligence era. However, poor integration of online and face-to-face learning has led to challenges such as superficial student engagement. This study developed a Community of Inquiry (CoI)-based blended learning model and evaluated its effectiveness with 92 college students. Over 16 weeks, the experimental group (n = 48) adopted the blended learning model, while the control group (n = 44) learned under traditional conditions. Learning effectiveness and deep learning perceptions were assessed; the blended learning group outperformed the traditional group in learning effectiveness (d = 0.83) and reported more positive deep learning perceptions (η2 = .05–.072). These results offer valuable guidance for educators aiming to design CoI-based blended learning that fosters deep learning and improves overall learning effectiveness.
Introduction
The widespread adoption of digital technologies has transformed education, leading to the rise of diverse instructional methods such as online learning and personalized instruction, which have spurred the growth of blended learning (Perera et al., 2020). Blended learning, which combines the strengths of traditional and online education, addresses the limitations of both methods and plays a crucial role in enhancing teaching quality and talent development (Han, 2023). In such environments, educators guide students in cultivating creativity, problem-solving, and deep learning skills, all essential for higher education reform and the 21st-century workforce (Qi et al., 2020).
The Community of Inquiry (CoI) framework is an effective model for designing blended environments, promoting learning engagement through active participation, meaningful discussion, and timely instructor feedback (Armellini et al., 2021; Liu & Deris, 2022). The three key elements of the CoI framework—teaching presence, social presence, and cognitive presence—are vital for enhancing students’ knowledge acquisition, skill development, and motivation (Zhang, 2020).
Empirical research has suggested that students in blended learning environments tend to achieve superior learning outcomes and report more positive learning experiences compared to those in traditional classroom settings (Vallée et al., 2020). The CoI framework further supports blended course design by fostering collaboration, critical thinking, and meaningful knowledge construction (Zhang, 2020). Despite these promising findings, further research is required to determine whether CoI-based blended learning is more effective than traditional pedagogical methods in enhancing both learning outcomes and deep learning. This study addressed this gap by investigating the effectiveness of a CoI-based blended learning model, facilitated through the Superstar Learning platform (https://www.chaoxing.com/), in promoting high-quality learning experiences. By examining the impact of this model, the study contributes to the growing body of research on evidence-based frameworks for blended learning, offering valuable insights for educators and instructional designers seeking to optimize student engagement and academic achievement.
The Blended Learning Effect
Blended learning is widely recognized for improving learning outcomes, especially when using diverse information and communications technology tools and resources that create immersive, student-centered environments (Bizami et al., 2023; Shamir-Inbal & Blau, 2021). These environments effectively enhance learning by integrating traditional and digital methods. For instance, Li and Cao’s (2020) blended learning model for undergraduate English students combined virtual reality, online content, and interactive classroom sessions, significantly improving test scores. Similarly, Antonelli et al. (2023) found that virtual reality labs reduced student errors and enhanced practical skills. Through a systematic review, Almusaed et al. (2023) confirmed that blended learning environments enabled with artificial intelligence (AI) have the potential to enhance educational quality. Technologies such as chatbots, intelligent tutoring systems, and personalized learning platforms can increase student engagement and support motivation.
Blended learning also fosters deep learning competencies. Innovative models, such as micro-learning and computer-based collaborative learning, have improved students’ self-learning abilities and goal achievement (Astiwardhani & Sobandi, 2024). Courses incorporating video lectures and online materials have boosted English learners’ motivation, autonomy, and satisfaction (Wang et al., 2021). Zhao (2022) leveraged the Rain Classroom platform’s network and mobile technologies (https://www.yuketang.cn/en) to create a blended model that increased learning quality and interest. Similarly, AI-powered blended courses have been more effective than traditional teaching in promoting student engagement, deep learning, critical thinking, and independent and cooperative learning (He et al., 2023).
Deep Learning Dimensions
Unlike surface learning, which focuses on memorization driven by external motivation, deep learning fosters meaningful knowledge construction and intrinsic motivation (Darling-Hammond & Oakes, 2021). Deep learning encourages learners to engage deeply with content, develop critical thinking and problem-solving skills, and enhance knowledge transfer (Shen & Chang, 2023). It also promotes skills such as learning how to learn, complex problem-solving, and effective communication and collaboration (Mthethwa-Kunene et al., 2022).
At the core of deep learning is the development of higher-order cognitive abilities, including problem-solving, decision-making, and critical thinking (Kurniawan, 2021). This process empowers students to creatively solve real-world problems (Pan et al., 2023). Mobile technology has been shown to support higher-order thinking by providing timely access to information and facilitating active cooperation and engagement (Hye et al., 2020; Yaniawati et al., 2022). Hu and Hwang (2024) proposed a problem-posing approach that was mobile, self-adapted, and based on concept mapping within a virtual museum context and found that it significantly enhanced learners’ critical thinking and problem-solving abilities. In this study, learning behaviors involving application, analysis, evaluation, and creation were categorized as higher-order cognition.
Deep learning also emphasizes communication and collaboration, essential 21st-century skills (Mthethwa-Kunene et al., 2022). Effective communication involves organizing learning content through presentations, group work, and interactive projects, while collaboration focuses on learner-centered activities such as sharing, interaction, and discussion (Islam et al., 2022; Mthethwa-Kunene et al., 2022). Interactive learning is supported by flipped classroom models that promote active communication through online and in-person discussions and timely feedback (Shen & Chang, 2023). Blended learning, incorporating diverse computer-mediated collaboration strategies, enhances participation and student satisfaction (Belda-Medina, 2021; Vlachopoulos & Makri, 2019). Intelligent computers can simulate interactions between learners and the external environment by performing tasks and providing timely feedback based on individual mistakes. Additionally, collaboration between AI and learners has the potential to optimize learning outcomes (Weber et al., 2025).
Reflective thinking is another key aspect of deep learning, focusing on self-reflection in academic research and practice (Rogers et al., 2019). It plays a crucial role in shaping students’ professional identity and ensuring sustainable learning (Annansingh, 2019). Reflective learning involves both self-reflection and reflection on others’ learning behavior (Priddis & Rogers, 2018). Intelligent learning tools—such as ChatGPT, Apple’s Shortcuts, and LINE—have been shown to facilitate this process by enabling students to monitor their progress and develop reflective thinking skills (Wu et al., 2023). In this study, reflective learning primarily involved learning under supervision and guidance.
Lastly, emotional experience is a hallmark of deep learning (Pellegrino & Hilton, 2012). Self-efficacy, motivation, and positive attitudes toward learning significantly impact engagement and effectiveness (Meyer et al., 2018). Well-designed online and face-to-face activities in blended courses can foster positive attitudes, while clear goals and challenging tasks can enhance self-efficacy and motivation (Zhao & Song, 2022; Zhao et al., 2021). In this study, emotional experiences included learning attitudes, self-efficacy, and motivation.
Research Hypotheses
This study operationally defined four core features of deep learning: higher-order cognition, interactive learning, reflective learning, and emotional experience. It measured students’ deep learning perceptions in these four dimensions to verify the effectiveness of the CoI-based blended learning model. Furthermore, this study, using the Digital Film and Television Directing and Production course as an example, applied a CoI-blended learning model and compared the blended (BL) and traditional (TL) classroom learning outcomes represented by three scores: online quizzes, a final exam, and film and television assignments.
While prior research has generally supported the efficacy of blended learning, some studies have reported no significant differences in learning outcomes between BL and TL students (e.g., Müller & Mildenberger, 2021). These discrepancies may stem from inadequate instructional design, insufficient scaffolding, superficial integration of online and offline learning, and underuse of the potential benefits of blended learning. The CoI framework, which focuses on enhancing students’ learning presence, can effectively guide the design of blended learning activities, fostering a more seamless integration of online and face-to-face learning modes (Armellini et al., 2021). CoI-based blended courses are expected to promote deep learning and improve overall learning effectiveness by balancing online and offline instruction (Tabassum & Mohd Saad, 2024).
Thus, the following hypotheses were proposed:
H1: BL students will achieve significantly better overall learning effectiveness than TL students.
H1-1a: BL students will achieve higher scores on lower-order online quiz questions.
H1-1b: BL students will achieve higher scores on higher-order online quiz questions.
H1-2a: BL students will achieve higher scores on lower-order final exam questions.
H1-2b: BL students will achieve higher scores on higher-order final exam questions.
H1-3: BL students will score significantly higher on film and television assignments.
H2: BL students will report significantly more positive deep learning perceptions than TL students in four dimensions:
H2-1: Higher-order cognition
H2-2: Interactive learning
H2-3: Reflective learning
H2-4: Emotional experience
Community of Inquiry-Based Blended Learning Model Design
This study examined the implementation of a blended learning model in the Digital Film and Television Directing and Production course, using the Superstar Learning platform developed by Beijing Century Superstar Information. The model integrated three key elements of the community of inquiry (CoI) framework: teaching, social, and cognitive presence, within three learning stages—face-to-face learning, online learning, and face-to-face reporting (Figure 1). The 16-week course was divided into three phases: early creation, mid-stage shooting, and post-editing, with the initial week dedicated to orientation and the remaining time equally distributed across the three teaching modules.
Figure 1
The CoI-Based Blended Learning Model
[Image omitted]
Note. CoI = community of inquiry.
The course design and implementation effectively addressed the three elements of the Community of Inquiry (CoI) framework. In terms of teaching presence, the course featured clearly defined learning objectives, detailed syllabi, and well-organized materials, while instructors actively provided timely feedback and facilitated problem-solving discussions. The intelligent learning platform further enabled tailored instructional strategies by monitoring student progress and adjusting content delivery accordingly.
Regarding social presence, the curriculum incorporated collaborative activities, such as online discussion forums, group projects, and peer reviews, which fostered a sense of community among students. Real-time communication tools and virtual breakout sessions were employed to enhance interaction and support, while integrated social networking features promoted informal exchanges of ideas beyond formal class sessions.
In addressing cognitive presence, the course used inquiry-based tasks, case studies, and problem-solving assignments to encourage critical analysis and reflective thinking. Interactive modules and scenario-based exercises were implemented to deepen understanding and facilitate the application of theoretical concepts, while continuous reflection activities, including reflective journals and self-assessment quizzes, supported the development of critical thinking and knowledge construction.
Face-to-Face Learning Stage
In this stage, in-class learning was paired with the Superstar Learning platform. The first two weeks of each module followed a structured five-step process:
- The teacher introduced new concepts, encouraging active participation.
- Students formed self-selected groups and established behavioral norms for participation.
- Inquiry tasks were assigned, prompting group discussions and collaborative exploration. Results were uploaded to the platform.
- Groups conducted peer evaluations, refining their work based on feedback, with guidance from the teacher.
- Online quizzes assessed learning outcomes, followed by teacher feedback and content summaries.
Online Learning Stage
This stage featured three online discussion questions and one practice task, each spanning 2 weeks and following five steps:
- The teacher provided learning materials (video, PowerPoints, quizzes) on the platform for students to engage with independently.
- Students completed online quizzes with immediate feedback, and the teacher provided support when needed.
- The teacher initiated discussions, with students tackling two discussion questions in the first week and one in the second.
- Group leaders supervised discussions, fostering deeper engagement and collaboration, while the teacher facilitated reflection and knowledge co-construction.
- Groups completed tasks based on discussion outcomes and uploaded their work to the platform.
Face-to-Face Reporting Stage
This stage focused on reporting the outcomes of the online learning tasks and summarizing content. The last week of each learning module followed a five-step process:
- The teacher shared evaluation criteria, and students familiarized themselves with the guidelines.
- Group representatives presented their work, showcasing innovative ideas and the application of new concepts. Other groups and the teacher provided feedback.
- Intergroup evaluations were conducted, with students providing feedback based on the evaluation criteria and engaging in on-site discussions.
- Groups posted online ratings and comments on other groups’ work, and the teacher stored the evaluation results for post-class review.
- The teacher guided students through content summarization, encouraging critical thinking and integration of the new material.
Participants
The participants were 92 sophomore educational technology students enrolled in the two compulsory Digital Film and Television Directing and Production classes at a university in China in the spring semester of 2023. Both classes lasted for 16 weeks and were taught by the same teacher. One class was randomly assigned as the experimental group (n = 48), while the other class was the control group (n = 44). The male-to-female ratios for the experimental and control groups were 1:2 (16 males and 32 females) and 1:2.4 (13 males and 31 females), respectively. All students were between 19 and 21 years old, owned smartphones, and were proficient Superstar Learning platform users.
The final exam scores from the prerequisite course Fundamentals of Photography Skills served as the prior knowledge assessment criteria. Independent sample t-test results indicated that there were no statistically significant prior knowledge differences (t = 0.86, p > .05) between the experimental group, M(SD) = 81.38(7.47), and the control group, M(SD) = 80.11(6.59).
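The prior-knowledge comparison above can be reproduced from the reported summary statistics alone. A minimal sketch using scipy (the means, standard deviations, and group sizes are those reported in this section):

```python
from scipy.stats import ttest_ind_from_stats

# Reported prior-knowledge statistics (Fundamentals of Photography Skills exam).
t, p = ttest_ind_from_stats(mean1=81.38, std1=7.47, nobs1=48,   # experimental
                            mean2=80.11, std2=6.59, nobs2=44,   # control
                            equal_var=True)
print(round(t, 2), p > 0.05)   # reproduces the reported t = 0.86, p > .05
```

This confirms the two groups were statistically comparable in prior knowledge before the intervention began.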
Experimental Procedure
This study focused on learning effectiveness and students’ deep learning perceptions in the CoI-based blended learning course (Figure 2). In the pre-intervention stage, the participants took the pretest and were introduced to the basic functions of the Superstar Learning platform. In the intervention stage, while both groups participated in face-to-face learning with some Superstar Learning activities done in class, only the experimental group took part in the online learning stage. The control group students listened to the teacher’s lectures, participated in class discussions, collaborated in groups, completed online quizzes, and completed the same post-class learning tasks. As the Digital Film and Television Directing and Production course focused on both theory and practice, students in both groups were required to take the final exam and submit comprehensive film and television production assignments. At the end of the course, both groups completed the perception posttest.
Figure 2
Experimental Procedure
[Image omitted]
Instruments
This study used various assessments to compare learning effectiveness between students in blended and traditional learning environments: online quizzes, a final exam, and group assignments. Additionally, pre- and posttests were conducted to measure deep learning perceptions in both groups.
Online Quizzes
On the Superstar Learning platform, both groups completed identical online quizzes, worth 280 points in total, consisting of 64 single-choice questions (128 points), 26 multiple-choice questions (104 points), and 6 essay questions (48 points). Based on Bloom’s revised taxonomy (Krathwohl, 2002), the quizzes were divided into lower-order (knowledge and comprehension) and higher-order (application, analysis, and evaluation) cognitive levels. Lower-order questions totaled 186 points, while higher-order questions totaled 94 points.
Final Exam
Both groups took the same final exam, categorized into lower-order and higher-order cognitive questions. For instance, a lower-order question required students to briefly describe the fundamental principles of shot assembly, whereas a higher-order question asked them to provide an example of how the close-up method is applied in effect sound processing. The exam was worth 100 points, with 10 fill-in-the-blank questions (20%), 15 multiple-choice questions (30%), 5 brief questions (35%), and 1 design question (15%). Each cognitive level accounted for 50 points. Content validity was reviewed by three educational technology experts with over 10 years of experience.
Film and Television Production Assignments
Students worked in groups to produce 10-minute films on self-chosen themes. Each class was divided into six groups. To assess individual contributions, students provided task descriptions in their final reports. The total score (100 points) comprised intragroup peer scores (30%), intergroup scores (30%), and the teacher’s evaluation (40%).
Deep Learning Perception Scale
The questionnaire comprised two sections. The first section collected students’ basic information, including student number, gender, and experience with the Superstar Learning platform. The second section contained the 20-item Deep Learning Perception Questionnaire (DLPQ), designed to measure students’ perceptions of deep learning. Items related to higher-order cognition, interactive learning, and reflective learning were adapted from Shen and Chang (2022), while emotional experience items were synthesized from Alqurashi (2019) and Zimmerman and Kulikowich (2016). Responses were scored on a 5-point Likert scale, ranging from 1 (strongly disagree) to 5 (strongly agree). To ensure content validity, three experts with over 10 years of experience in educational technology reviewed the scale, refining item structure, clarifying meanings, and offering revision suggestions. The finalized questionnaire was pretested with 156 students informed of the study’s purpose, yielding 122 valid responses and a return rate of 78.2%.
Exploratory factor analysis (EFA) was conducted using IBM SPSS Statistics (Version 25.0) to evaluate the DLPQ’s factor structure. Results are shown in Table 1. The Kaiser-Meyer-Olkin value was 0.83, and Bartlett’s chi-square value was 1573.67 (df = 190, p < .001), indicating the scale’s suitability for factor analysis. Principal component analysis with varimax rotation was employed to analyze the pretest data. Factors were retained if their eigenvalues exceeded 1, and items were retained if their factor loadings exceeded 0.5. The analysis revealed four dimensions explaining a cumulative 66.78% of the variance (see Table 1). The interactive learning dimension (5 items; e.g., “I can offer useful suggestions for my peers”) accounted for 44.19% of the variance, with factor loadings from 0.64 to 0.86. The reflective learning dimension (5 items; e.g., “I can analyze the reasons for failure to solve problems”) explained a further 9.13%, with loadings from 0.51 to 0.83. The higher-order cognition dimension (5 items; e.g., “I can find logical relationships between knowledge points”) accounted for 7.62%, with loadings from 0.64 to 0.75. Finally, the emotional experience dimension (5 items; e.g., “I am willing to adjust my learning style to meet course requirements”) explained 5.84%, with loadings from 0.60 to 0.83. These findings indicate strong structural validity for the scale. Additionally, Cronbach’s alpha values for the four dimensions (0.83–0.88) demonstrated good reliability.
Table 1
The Reliability and Validity of the DLP Scale
| Item | Interactive learning | Reflective learning | Higher-order cognition | Emotional experience |
|---|---|---|---|---|
| HC1 | | | 0.69 | |
| HC2 | | | 0.75 | |
| HC3 | | | 0.75 | |
| HC4 | | | 0.73 | |
| HC5 | | | 0.64 | |
| IL6 | 0.64 | | | |
| IL7 | 0.74 | | | |
| IL8 | 0.86 | | | |
| IL9 | 0.69 | | | |
| IL10 | 0.77 | | | |
| RL11 | | 0.62 | | |
| RL12 | | 0.83 | | |
| RL13 | | 0.72 | | |
| RL14 | | 0.72 | | |
| RL15 | | 0.51 | | |
| AE16 | | | | 0.67 |
| AE17 | | | | 0.67 |
| AE18 | | | | 0.68 |
| AE19 | | | | 0.60 |
| AE20 | | | | 0.83 |
| Cronbach’s α | 0.88 | 0.86 | 0.83 | 0.87 |
| Eigenvalues | 8.84 | 1.83 | 1.52 | 1.17 |
| Variance (%) | 44.19 | 9.13 | 7.62 | 5.84 |
| Cumulative variance (%) | 44.19 | 53.32 | 60.94 | 66.78 |
Note. DLP = deep learning perception; HC = higher-order cognition; IL = interactive learning; RL = reflective learning; AE = emotional experience.
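The scale-validation workflow above (varimax-rotated factor extraction, a loading-above-0.5 retention rule, and Cronbach’s alpha per dimension) can be sketched in Python. This is an illustration only: the data below are a synthetic stand-in for the 122 valid pretest responses, and sklearn’s FactorAnalysis with varimax rotation is used as a close analogue of the principal component analysis the study ran in SPSS; the cronbach_alpha helper is the standard formula, not code from the study.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(42)

# Synthetic stand-in for 122 respondents x 20 Likert items: four latent
# traits with five items each, mirroring the DLPQ's intended structure.
latent = rng.normal(size=(122, 4))
noise = rng.normal(scale=0.6, size=(122, 20))
X = np.repeat(latent, 5, axis=1) + noise   # items 0-4 load on factor 1, etc.

# Varimax-rotated four-factor solution (analogue of PCA + varimax in SPSS).
fa = FactorAnalysis(n_components=4, rotation="varimax").fit(X)
loadings = fa.components_.T                # shape: (20 items, 4 factors)

# Retention rule from the study: keep items whose loading exceeds 0.5.
retained = np.abs(loadings).max(axis=1) > 0.5

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

alpha_dim1 = cronbach_alpha(X[:, 0:5])     # reliability of the first 5-item dimension
print(loadings.shape, int(retained.sum()), round(alpha_dim1, 2))
```

With real questionnaire data, the loading matrix and per-dimension alphas would be compared against the thresholds reported in Table 1 (loadings > 0.5, alpha ≥ 0.7).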
Data Collection and Analysis
Learning Effectiveness
The learning effectiveness data came from online quizzes (30%), film and television assignments (30%), and a final exam (40%). For online quizzes, the objective questions (single-choice, multiple-choice) were automatically scored by the Superstar Learning platform, while the teacher scored the subjective questions (essay questions). The assignment scores comprised the weighted average of the students’ and teacher’s scores. The final exam papers were graded solely by the teacher. An independent sample t-test was used to explore the differences between the two groups in the lower-order cognitive and higher-order cognitive questions on the online quiz and final exam scores, as well as in the assignment scores.
A significance level of p < .05 was adopted, and Cohen’s d was used to measure the effect size to evaluate further differences between the experimental and the control groups. Cohen’s d values of 0.2∼0.5 represented a small effect size, 0.5∼0.8 denoted a medium effect size, and values greater than 0.8 indicated a large effect size (Cohen, 1988).
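The pooled-SD Cohen’s d and the matching independent-samples t statistic can be recomputed directly from the group summary statistics in Table 2. A minimal sketch (small rounding differences arise because the means and SDs are reported to two decimals):

```python
from math import sqrt

# Reported summary statistics for overall learning effectiveness (Table 2).
m_exp, sd_exp, n_exp = 84.54, 5.35, 48   # experimental group
m_ctl, sd_ctl, n_ctl = 80.51, 4.34, 44   # control group

# Pooled standard deviation across the two independent groups.
sp = sqrt(((n_exp - 1) * sd_exp**2 + (n_ctl - 1) * sd_ctl**2)
          / (n_exp + n_ctl - 2))

d = (m_exp - m_ctl) / sp                              # Cohen's d
t = (m_exp - m_ctl) / (sp * sqrt(1 / n_exp + 1 / n_ctl))  # t statistic
print(round(d, 2), round(t, 2))   # close to the reported d = 0.83, t = 3.95
```

By the thresholds above (d > 0.8 = large), the overall learning-effectiveness difference qualifies as a large effect.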
Deep Learning Perception
Formal questionnaires were administered to investigate the pretest and posttest deep learning perceptions in both the experimental and the control groups. All 92 students completed the questionnaire. In the experimental group, 46 of 48 students submitted valid questionnaires (a return rate of 95.8%); in the control group, 41 of 44 did (93.2%).
Descriptive statistics were used to analyze the pre- and posttest data on students’ higher-order cognition, interactive learning, reflective learning, and emotional experience; means and standard deviations were used to assess students’ deep learning perceptions in the four dimensions. Subsequently, analysis of covariance (ANCOVA) was conducted to examine deep learning perception differences between the experimental and the control groups. η2 was reported as the effect size measure, with values of .01∼.07 indicating a small effect size, values of .07∼.14 denoting a moderate effect size, and values greater than .14 signifying a large effect size (Cohen, 1992).
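The covariance-analysis logic, testing the group effect after partialling out the pretest, can be sketched with plain numpy via a model comparison. The data here are synthetic and hypothetical; only the sample sizes (46 + 41 = 87 valid responses, residual df = 84) mirror the study, and the effect size is computed as partial η2 (SSeffect / (SSeffect + SSerror)), which is consistent with the values reported in Tables 4 through 7.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical pre/post perception scores for 46 experimental + 41 control students.
n_exp, n_ctl = 46, 41
pre = rng.normal(3.5, 0.5, n_exp + n_ctl)
group = np.r_[np.ones(n_exp), np.zeros(n_ctl)]          # 1 = experimental
post = 0.6 * pre + 0.35 * group + rng.normal(0, 0.3, n_exp + n_ctl)

def sse(X, y):
    """Residual sum of squares from an ordinary-least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(((y - X @ beta) ** 2).sum())

ones = np.ones_like(post)
full = np.column_stack([ones, pre, group])   # covariate + group effect
reduced = np.column_stack([ones, pre])       # covariate only

sse_full, sse_reduced = sse(full, post), sse(reduced, post)
df_resid = len(post) - full.shape[1]         # 87 - 3 = 84, as in Tables 4-7
ss_group = sse_reduced - sse_full            # SS for the group (experimental) effect
F = (ss_group / 1) / (sse_full / df_resid)
eta_p = ss_group / (ss_group + sse_full)     # partial eta-squared
print(round(F, 2), round(eta_p, 3))
```

Statistical packages wrap this same comparison; the homogeneity-of-regression-slopes check reported in the Results adds a group-by-pretest interaction term to the full model and tests it the same way.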
Research Results
Learning Effectiveness
There was a significant difference in the learning effectiveness between the experimental and the control groups (Table 2). The experimental group achieved significantly better learning outcomes than the control group, with a large effect size. These findings support hypothesis H1.
Table 2
Learning Effectiveness Differences
| Score source | Experimental (n = 48) M | SD | Control (n = 44) M | SD | t | Cohen’s d |
|---|---|---|---|---|---|---|
| Lower-order questions (online quizzes) | 179.63 | 5.74 | 177.41 | 7.04 | 1.66 | 0.35 |
| Higher-order questions (online quizzes) | 86.67 | 6.21 | 82.41 | 8.22 | 2.78** | 0.58 |
| Lower-order questions (final exam) | 36.88 | 6.98 | 34.14 | 6.36 | 1.96 | 0.41 |
| Higher-order questions (final exam) | 36.54 | 6.82 | 32.39 | 5.09 | 3.29*** | 0.69 |
| Film and television assignments | 88.81 | 3.94 | 86.86 | 4.14 | 2.31* | 0.48 |
| Learning effectiveness | 84.54 | 5.35 | 80.51 | 4.34 | 3.95*** | 0.83 |
Note. *p < .05. **p < .01. ***p < .001.
As also shown in Table 2, there were no significant differences between the experimental and control groups in lower-order question scores on either the online quizzes or the final exam, indicating that hypotheses H1-1a and H1-2a were not supported. However, the experimental group scored significantly higher than the control group on higher-order questions in the online quizzes, with a moderate effect size. Similarly, on the final exam, the experimental group outperformed the control group on higher-order questions, thereby supporting hypotheses H1-1b and H1-2b.
In terms of film and television assignment scores, Table 2 shows a significant difference. The mean assignment score for the experimental group was significantly higher than that of the control group. Although the effect size was small, these findings suggest that students in the experimental group achieved significantly better performance in practical tasks compared to those in the control group, thus supporting hypothesis H1-3.
Deep Learning Perceptions
This study assessed deep learning perceptions across the dimensions of higher-order cognition, interactive learning, reflective learning, and emotional experience before and after the blended course. The findings are shown in Table 3. Before the course, students’ scores in all four dimensions were above average, ranging from 3 to 4. After the course, the experimental group’s scores approached 4, reflecting a marked improvement in perceived deep learning. The control group showed smaller posttest gains, suggesting that the blended learning model played a role in enhancing deep learning competencies.
Table 3
Deep Learning Perceptions in the Experimental and Control Groups
| Dimension | Test | Experimental M | SD | Control M | SD |
|---|---|---|---|---|---|
| Higher-order cognition | Pretest | 3.59 | 0.57 | 3.60 | 0.46 |
| | Posttest | 3.96 | 0.55 | 3.76 | 0.46 |
| Interactive learning | Pretest | 3.52 | 0.50 | 3.59 | 0.56 |
| | Posttest | 3.90 | 0.56 | 3.73 | 0.44 |
| Reflective learning | Pretest | 3.48 | 0.49 | 3.50 | 0.49 |
| | Posttest | 3.89 | 0.49 | 3.73 | 0.41 |
| Emotional experience | Pretest | 3.49 | 0.49 | 3.54 | 0.49 |
| | Posttest | 3.95 | 0.53 | 3.77 | 0.38 |
Analysis of covariance was performed to further assess the model’s effectiveness and to elucidate the differences in higher-order cognition, interactive learning, reflective learning, and emotional experience between the two groups. The within-group regression coefficient homogeneity test revealed that none of the dimensions reached the significance level: for higher-order cognition, F = 3.23, p > .05; for interactive learning, F = 0.04, p > .05; for reflective learning, F = 0.89, p > .05; and for emotional experience, F = 0.26, p > .05. The assumption of homogeneity of regression slopes was therefore satisfied, justifying the covariance analysis.
The covariance analyses revealed significant perception differences between the two groups. Specifically, significant differences were found in higher-order cognition, interactive learning, reflective learning, and emotional experience, with small effect sizes in the first three dimensions (Tables 4, 5, and 6) and a moderate effect size in the last (Table 7). These findings suggest that students enrolled in the CoI-based blended course reported significantly more positive higher-order cognition, interactive learning, reflective learning, and emotional experience perceptions than their traditional classroom counterparts, thereby supporting hypotheses H2-1, H2-2, H2-3, and H2-4.
Table 4
Higher-Order Cognition Perception Differences
| Source | SS | df | MS | F | η2 |
|---|---|---|---|---|---|
| Intervention | 8.99 | 1 | 8.99 | 61.32*** | .42 |
| Experimental effect | 0.62 | 1 | 0.62 | 4.25* | .05 |
| Residuals | 12.32 | 84 | 0.15 | | |
| Corrected total | 21.88 | 86 | | | |
Note. R2 = .44 (Adjusted R2 = .42).
*p < .05. ***p < .001.
Table 5
Interactive Learning Perception Differences
| Source | SS | df | MS | F | η2 |
|---|---|---|---|---|---|
| Intervention | 8.76 | 1 | 8.76 | 59.26*** | .41 |
| Experimental effect | 0.69 | 1 | 0.69 | 4.64* | .05 |
| Residuals | 12.41 | 84 | 0.15 | | |
| Corrected total | 21.56 | 86 | | | |
Note. R2 = .42 (Adjusted R2 = .41).
*p < .05. ***p < .001.
Table 6
Reflective Learning Perception Differences
| Source | SS | df | MS | F | η2 |
|---|---|---|---|---|---|
| Intervention | 7.91 | 1 | 7.91 | 77.57*** | .48 |
| Experimental effect | 0.44 | 1 | 0.44 | 4.31* | .05 |
| Residuals | 8.57 | 84 | 0.10 | | |
| Corrected total | 16.84 | 86 | | | |
Note. R2 = .49 (Adjusted R2 = .48).
*p < .05. ***p < .001.
Table 7
Emotional Experience Perception Differences
| Source | SS | df | MS | F | η2 |
|---|---|---|---|---|---|
| Intervention | 6.47 | 1 | 6.47 | 45.43*** | .35 |
| Experimental effect | 0.93 | 1 | 0.93 | 6.54* | .072 |
| Residuals | 11.96 | 84 | 0.14 | | |
| Corrected total | 19.15 | 86 | | | |
Note. R2 = .38 (Adjusted R2 = .36).
*p < .05. ***p < .001.
Discussion
Students’ Learning Effectiveness
The findings support hypothesis H1, indicating that students in the blended learning (BL) group demonstrated significantly better learning effectiveness than those in the traditional learning (TL) group. This aligns with previous studies, such as Vallée et al. (2020), which found that students in blended learning environments exhibited superior learning outcomes compared to their traditional counterparts. Similarly, Yin and Yuan (2021, 2022) concluded that CoI-based blended learning models not only increased learning interest but also positively impacted academic performance, suggesting that the structure of blended learning fosters deeper engagement and critical thinking, which can enhance learning quality beyond traditional methods (Vo et al., 2017).
For hypotheses H1-1a and H1-2a, no significant difference was observed between the BL and TL groups on lower-order cognitive questions. This finding may be attributed to the independent learning ability required for mastering lower-order cognitive content, where learning methods appear to have less impact. This result is consistent with Shen and Chang (2023), who found that the instructional approach had limited influence on students’ mastery of lower-order cognitive knowledge. Similarly, Lozano-Lozano et al. (2020) reported no significant differences between BL and TL students in their comprehension of theoretical knowledge.
In contrast, hypotheses H1-1b and H1-2b were supported, with BL students scoring significantly higher on higher-order cognitive questions compared to their TL peers. This confirms that blended learning has a substantial impact on students’ understanding of higher-order cognitive knowledge, consistent with prior research on CoI-based models. Blended learning promotes deeper cognitive engagement, allowing students to move beyond superficial learning and engage in higher-order thinking, leading to improved academic performance and greater understanding of complex concepts (Chen, 2022; Guo et al., 2021). The integration of inquiry-based learning and open communication within the CoI framework enhances social presence, which, in turn, positively affects students’ performance on higher-order cognitive tasks (Tan, 2021). These findings emphasize the importance of designing blended learning activities that foster teaching, cognitive, and social presence to enhance higher-order cognitive skills (Kurniawan, 2021). Given the distinct advantage of BL in fostering higher-order thinking, educators should carefully consider the cognitive complexity of the content when designing such courses.
Hypothesis H1-3 was also supported, as BL students demonstrated significantly better practical skills compared to TL students. The substantial impact of blended learning on students’ application abilities can be attributed to the opportunities for hands-on practice and experimentation that BL environments provide. Virtual simulations, online laboratories, and other digital tools in blended settings offer students interactive experiences that aid in mastering practical skills (Antonelli et al., 2023). For instance, combining offline and virtual experiments has been shown to improve student performance by enabling faster knowledge acquisition and exploration (Kerimbayev et al., 2023). Virtual laboratories offer safe, interactive platforms for students to observe phenomena that may not be possible in physical settings (Seifan et al., 2020). These immersive environments, along with enhanced communication and teamwork facilitated by blended learning, promote collaboration, knowledge sharing, and innovative problem-solving (Li & Cao, 2020). Furthermore, Liu et al. (2021) found that a blended model incorporating computer-assisted self-study, group discussion, and simulation training significantly improved students’ practical application skills.
Overall, the findings suggest that blended learning positively impacts students’ ability to apply knowledge in practical contexts, fostering the development of essential operational skills through enriched learning environments and collaborative opportunities.
Students’ Deep Learning Perception
The research findings robustly support hypotheses H2-1, H2-2, H2-3, and H2-4, indicating that students in the BL group had significantly higher perceptions across the four dimensions of higher-order cognition, interactive learning, reflective learning, and emotional experience compared to their TL counterparts. This is consistent with previous studies showing that CoI-based blended courses facilitate deeper learning (Tabassum & Mohd Saad, 2024; Zhang, 2020).
For hypothesis H2-1, the results indicated that BL students exhibited significantly better perceptions in the higher-order cognitive dimension. This finding highlights the model’s effectiveness in promoting engagement in advanced learning activities, such as application, analysis, evaluation, and creation. Yaniawati et al. (2022) similarly observed significant improvements in students’ creative thinking and problem-solving skills within mobile-based blended learning environments. A smart pedagogy framework that integrates various learning approaches has likewise been shown to enhance students’ higher-order thinking capabilities (Meng et al., 2020). Notably, during the COVID-19 pandemic, blended learning became a widely adopted approach that effectively supported higher-order thinking skills. Well-designed blended learning models leverage both online and offline modalities, enhancing cognitive abilities and fostering deeper engagement in learning activities.
In terms of hypothesis H2-2, BL students reported significantly higher perceptions of interactive learning. This can be attributed to the flexible nature of online components within blended courses, which provided opportunities for interaction and allowed students to engage with in-depth discussions at their own pace (Wut et al., 2022). Castro (2019) emphasized the role of innovative technologies in facilitating effective student interaction in higher education blended learning. Enhanced interactive opportunities in blended learning environments improve communication, collaboration, and knowledge sharing, thereby enriching students’ cognitive and interactive capabilities (Islam et al., 2022).
Regarding hypothesis H2-3, the BL group exhibited enhanced reflective learning perceptions, suggesting improved recognition of learning challenges and engagement in reflective practices. This improvement may stem from the facilitated problem presentation and discussion typical of blended environments, which foster self-regulation and reflection (Bizami et al., 2023; Wu et al., 2023). Zhu and Bonk (2019) noted that feedback mechanisms and reflective questions significantly promote self-monitoring and reflection. Additionally, real-time technology-supported interactions likely motivate learners and enhance engagement, thereby improving reflective practices and overall learning quality (Zhu et al., 2021).
For hypothesis H2-4, the BL group demonstrated superior perceptions of emotional experiences, with a moderate effect size, indicating the positive impact of blended learning on the affective domain. This enhancement can be linked to personalized learning interventions that improve students’ attitudes, motivation, and self-efficacy (Zhang et al., 2020). Ballouk et al. (2022) highlighted the role of teaching guidance in fostering active learning and motivation in blended environments. A supportive, user-friendly blended learning context—augmented by accessible digital tools—transforms students’ perceptions of learning, leading to positive emotional experiences (Masadeh, 2021). The availability of diverse learning resources, such as quizzes and case studies, further enhances self-efficacy (Prifti, 2022). Consequently, blended learning environments shift away from traditional teaching paradigms, providing enriched emotional experiences that contribute to improved learning outcomes.
Conclusions
This study examined the effectiveness of a Community of Inquiry (CoI)-based blended learning model implemented through an intelligent learning platform, revealing significant improvements in both students’ learning outcomes and their deep learning perceptions. These findings affirm the model’s efficacy. Educators designing such models should capitalize on the strengths of both online and offline learning to foster robust teaching, social, and cognitive presences. Using digital resources enhances students’ access to information and supports creative learning. Additionally, providing targeted guidance through online platforms can facilitate student reflection on their learning strategies and outcomes. Incorporating both online and offline discussions fosters a more engaging and satisfying interactive learning experience, enhancing emotional engagement.
These insights are valuable for future AI-driven blended course design, underscoring the importance of strategically integrating online and offline components to optimize learning outcomes. The effectiveness of CoI-based blended learning models in promoting deep learning among college students is clearly demonstrated. However, several limitations warrant consideration. First, the sample comprised sophomores from specific educational technology classes at a university in southern China, which limits the generalizability of the findings. Second, the instructional design requirements may differ for courses outside Digital Film and Television Directing and Production, meaning the proposed model’s effectiveness could vary across courses and disciplines. Third, the CoI-based blended learning model was only compared to traditional learning; its efficacy against other blended models remains to be explored.
Future research could address these limitations by expanding the sample to include learners from various majors and courses. Comparative analyses could assess the differential impacts of blended learning across different disciplines. Additionally, investigating the relationship between students’ sense of presence, learning effectiveness, and deep learning perceptions in blended environments would provide deeper insights into the mechanisms that contribute to successful blended learning experiences. Furthermore, future research should examine the implementation differences between the CoI framework in traditional blended learning and AI-driven distance education. Specifically, exploring how AI technologies can collaborate with the CoI framework to optimize instructional design, enhance student engagement, and improve personalized learning experiences will be a key area of future investigation. In conclusion, while this study highlights the benefits of CoI-based blended learning, future research should aim to address its limitations and explore the broader applicability and underlying mechanisms that enhance blended learning effectiveness.
Acknowledgments
This research was financially supported by The Philosophy and Social Science Planning Project of Guangdong Province (GD24XJY39), The Characteristic Innovation Project of Colleges and Universities in Guangdong Province (2024WTSCX041), The Education Science Planning Project of Guangdong Province (2018JKZ022, 2024GXJK331), and The Education and Teaching Achievement Award Cultivation Project of Lingnan Normal University (LSGJ2314).
References
Almusaed, A., Almssad, A., Yitmen, I., & Homod, R. Z. (2023). Enhancing student engagement: Harnessing “AIED”’s power in hybrid education—A review analysis. Education Sciences, 13(7), Article 632. https://doi.org/10.3390/educsci13070632
Alqurashi, E. (2019). Predicting student satisfaction and perceived learning within online learning environments. Distance Education, 40(1), 133-148. https://doi.org/10.1080/01587919.2018.1553562
Annansingh, F. (2019). Mind the gap: Cognitive active learning in virtual learning environment perception of instructors and students. Education and Information Technologies, 24(6), 3669-3688. https://doi.org/10.1007/s10639-019-09949-5
Antonelli, D., Christopoulos, A., Laakso, M.-J., Dagienė, V., Juškevičienė, A., Masiulionytė-Dagienė, V., Mądziel, M., Stadnicka, D., & Stylios, C. (2023). A virtual reality laboratory for blended learning education: Design, implementation and evaluation. Education Sciences, 13(5), Article 528. https://doi.org/10.3390/educsci13050528
Armellini, A., Teixeira Antunes, V., & Howe, R. (2021). Student perspectives on learning experiences in a higher education active blended learning context. TechTrends, 65(4), 433-443. https://doi.org/10.1007/s11528-021-00593-w
Astiwardhani, W., & Sobandi, A. (2024). Transforming educational paradigms: How micro learning shapes student understanding, retention, and motivation? Journal of Education Action Research, 8(2), 300-309. https://doi.org/10.23887/jear.v8i2.77711
Ballouk, R., Mansour, V., Dalziel, B., McDonald, J., & Hegazi, I. (2022). Medical students’ self-regulation of learning in a blended learning environment: A systematic scoping review. Medical Education Online, 27(1), Article 2029336. https://doi.org/10.1080/10872981.2022.2029336
Belda-Medina, J. (2021). Enhancing multimodal interaction and communicative competence through task-based language teaching (TBLT) in synchronous computer-mediated communication (SCMC). Education Sciences, 11(11), Article 723. https://doi.org/10.3390/educsci11110723
Bizami, N. A., Tasir, Z., & Kew, S. N. (2023). Innovative pedagogical principles and technological tools capabilities for immersive blended learning: A systematic literature review. Education and Information Technologies, 28(2), 1373-1425. https://doi.org/10.1007/s10639-022-11243-w
Castro, R. (2019). Blended learning in higher education: Trends and capabilities. Education and Information Technologies, 24(4), 2523-2546. https://doi.org/10.1007/s10639-019-09886-3
Chen, R. H. (2022). Effects of deliberate practice on blended learning sustainability: A community of inquiry perspective. Sustainability, 14(3), Article 1785. https://doi.org/10.3390/su14031785
Cohen, J. (1988). Statistical power analysis for the behavioral sciences. L. Erlbaum Associates.
Cohen, J. (1992). Quantitative methods in psychology: A power primer. Psychological Bulletin, 112(1), 155-159. https://doi.org/10.1037/0033-2909.112.1.155
Darling-Hammond, L., & Oakes, J. (2021). Preparing teachers for deeper learning. Harvard Education Press.
Guo, P., Saab, N., Wu, L., & Admiraal, W. (2021). The Community of Inquiry perspective on students’ social presence, cognitive presence, and academic performance in online project‐based learning. Journal of Computer Assisted Learning, 37(5), 1479-1493. https://doi.org/10.1111/jcal.12586
Han, F. (2023). Relations between students’ study approaches perceptions of the learning environment, and academic achievement in flipped classroom learning: Evidence from self-reported and process data. Journal of Educational Computing Research, 61(6), 1252-1274. https://doi.org/10.1177/07356331231162823
He, J., Ma, T., & Zhang, Y. (2023). Design of blended learning mode and practice community using intelligent cloud teaching. Education and Information Technologies, 28(8), 10593-10615. https://doi.org/10.1007/s10639-023-11606-x
Hu, Y., & Hwang, G.-J. (2024). Promoting students’ higher order thinking in virtual museum contexts: A self-adapted mobile concept mapping-based problem posing approach. Education and Information Technologies, 29(3), 2741-2765. https://doi.org/10.1007/s10639-023-11930-2
Hye, J. K., Yi, P., & Hong, J. I. (2020). Students’ academic use of mobile technology and higher-order thinking skills: The role of active engagement. Education Sciences, 10(3), Article 47. https://doi.org/10.3390/educsci10030047
Islam, M. K., Sarker, M. F. H., & Islam, M. S. (2022). Promoting student-centered blended learning in higher education: A model. E-Learning and Digital Media, 19(1), 36-54. https://doi.org/10.1177/20427530211027721
Kerimbayev, N., Umirzakova, Z., Shadiev, R., & Jotsov, V. (2023). A student-centered approach using modern technologies in distance learning: A systematic review of the literature. Smart Learning Environments, 10(1), Article 61. https://doi.org/10.1186/s40561-023-00280-8
Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory into Practice, 41(4), 212-218. https://www.jstor.org/stable/1477405
Kurniawan, S. (2021). The effect of blended learning of student teams achievement division (STAD) and jigsaw towards higher-order thinking skills. EDUTEC: Journal of Education and Technology, 5(1), 121-133. https://doi.org/10.29062/edu.v5i1.229
Li, X.-D., & Cao, H.-H. (2020). Research on VR-supported flipped classroom based on blended learning. A case study in “Learning English through news.” International Journal of Information and Education Technology, 10(2), 104-109. https://doi.org/10.18178/ijiet.2020.10.2.1347
Liu, W., Wang, J., Zhang, H., Yu, C., Liu, S., Zhang, C., Yu, J., Liu, Q., & Yang, B. (2021). Determining the effects of blended learning using the community of inquiry on nursing students’ learning gains in sudden patient deterioration module. Nursing Open, 8(6), 3635-3644. https://doi.org/10.1002/nop2.914
Liu, X., & Deris, F. D. (2022). CoI-based teaching practices to promote EFL learners’ online discussion in China’s blended learning context. Asian Journal of University Education, 18(2), 477-488. https://files.eric.ed.gov/fulltext/EJ1347554.pdf
Lozano-Lozano, M., Fernández-Lao, C., Cantarero-Villanueva, I., Noguerol, I., Álvarez-Salvago, F., Cruz-Fernández, M., Arroyo-Morales, M., & Galiano-Castillo, N. (2020). A blended learning system to improve motivation, mood state, and satisfaction in undergraduate students: Randomized controlled trial. Journal of Medical Internet Research, 22(5), Article e17101. https://doi.org/10.2196/17101
Masadeh, T. S. Y. (2021). Blended learning: Issues related to successful implementation. International Journal of Scientific Research and Management, 9(10), 1897-1907. https://doi.org/10.18535/ijsrm/v9i10.el02
Meng, Q., Jia, J., & Zhang, Z. (2020). A framework of smart pedagogy based on the facilitating of high order thinking skills. Interactive Technology and Smart Education, 17(3), 251-266. https://doi.org/10.1108/ITSE-11-2019-0076
Meyer, O., Coyle, D., Imhof, M., & Connolly, T. (2018). Beyond CLIL: Fostering student and teacher engagement for personal growth and deeper learning. In Martínez Agudo, J. D. D. (Ed.), Emotions in second language teaching: Theory, research and teacher education (pp. 277-297). Springer. https://doi.org/10.1007/978-3-319-75438-3_16
Mthethwa-Kunene, K., Rugube, T., & Maphosa, C. (2022). Rethinking pedagogy: Interrogating ways of promoting deeper learning in higher education. European Journal of Interactive Multimedia and Education, 3(1), Article e02204. https://doi.org/10.30935/ejimed/11439
Müller, C., & Mildenberger, T. (2021). Facilitating flexible learning by replacing classroom time with an online learning environment: A systematic review of blended learning in higher education. Educational Research Review, 34, Article 100394. https://doi.org/10.1016/j.edurev.2021.100394
Pan, Q., Zhou, J., Yang, D., Shi, D., Wang, D., Chen, X., & Liu, J. (2023). Mapping knowledge domain analysis in deep learning research of global education. Sustainability, 15(4), Article 3097. https://doi.org/10.3390/su15043097
Pellegrino, J. W., & Hilton, M. L. (Eds.). (2012). Education for life and work: Developing transferable knowledge and skills in the 21st century. National Academies Press. https://www.fsba.org/wp-content/uploads/2014/04/Education-for-life-and-work.pdf
Perera, C. J., Zainuddin, Z., Piaw, C. Y., Cheah, K. S. L., & Asirvatham, D. (2020). The pedagogical frontiers of urban higher education: Blended learning and co-lecturing. Education and Urban Society, 52(9), 1305-1329. https://doi.org/10.1177/0013124519894966
Priddis, L., & Rogers, S. L. (2018). Development of the reflective practice questionnaire: Preliminary findings. Reflective Practice, 19(1), 89-104. https://doi.org/10.1080/14623943.2017.1379384
Prifti, R. (2022). Self-efficacy and student satisfaction in the context of blended learning courses. Open Learning: The Journal of Open, Distance and e-Learning, 37(2), 111-125. https://doi.org/10.1080/02680513.2020.1755642
Qi, D., Zhang, M., & Zhang, Y. (2020). Resource integration, value co-creation and continuance intention in MOOCs learning process. Interactive Learning Environments, 31(2), 701-713. https://doi.org/10.1080/10494820.2020.1802299
Rogers, S. L., Priddis, L. E., Michels, M., Tieman, M., & Van Winkle, L. J. (2019). Applications of the reflective practice questionnaire in medical education. BMC Medical Education, 19, Article 47. https://doi.org/10.1186/s12909-019-1481-6
Seifan, M., Robertson, N., & Berenjian, A. (2020). Use of virtual learning to increase key laboratory skills and essential non-cognitive characteristics. Education for Chemical Engineers, 33, 66-75. https://doi.org/10.1016/j.ece.2020.07.006
Shamir-Inbal, T., & Blau, I. (2021). Facilitating emergency remote K–12 teaching in computing-enhanced virtual learning environments during COVID-19 pandemic—Blessing or curse? Journal of Educational Computing Research, 59(7), 1243-1271. https://doi.org/10.1177/0735633121992781
Shen, D.-D., & Chang, C.-S. (2022). Exploring college students’ deeper learning perceptions in the blended learning environment: Scale development, validation, and experimental comparison. International Journal of Technology and Human Interaction, 18(1), 1-21. https://doi.org/10.4018/IJTHI.313184
Shen, D., & Chang, C.-S. (2023). Implementation of the flipped classroom approach for promoting college students’ deeper learning. Educational Technology Research and Development, 71, 1323-1347. https://doi.org/10.1007/s11423-023-10186-4
Tabassum, Z., & Mohd Saad, M. R. B. (2024). A decadal examination of community of inquiry and blended learning in EFL/ESL development: A systematic review. Arab World English Journals, 15(1), 401-422. https://dx.doi.org/10.24093/awej/vol15no1.25
Tan, C. (2021). The impact of COVID-19 pandemic on student learning performance from the perspectives of Community of Inquiry. Corporate Governance: The International Journal of Business in Society, 21(6), 1215-1228. https://doi.org/10.1108/CG-09-2020-0419
Vallée, A., Blacher, J., Cariou, A., & Sorbets, E. (2020). Blended learning compared to traditional learning in medical education: Systematic review and meta-analysis. Journal of Medical Internet Research, 22(8), Article e16504. https://doi.org/10.2196/16504
Vlachopoulos, D., & Makri, A. (2019). Online communication and interaction in distance higher education: A framework study of good practice. International Review of Education, 65(4), 605-632. https://doi.org/10.1007/s11159-019-09792-3
Vo, H. M., Zhu, C., & Diep, N. A. (2017). The effect of blended learning on student performance at course-level in higher education: A meta-analysis. Studies in Educational Evaluation, 53, 17-28. https://doi.org/10.1016/j.stueduc.2017.01.002
Wang, N., Chen, J., Tai, M., & Zhang, J. (2021). Blended learning for Chinese university EFL learners: Learning environment and learner perceptions. Computer Assisted Language Learning, 34(3), 297-323. https://doi.org/10.1080/09588221.2019.1607881
Weber, F., Wambsganss, T., & Söllner, M. (2025). Enhancing legal writing skills: The impact of formative feedback in a hybrid intelligence learning environment. British Journal of Educational Technology, 56(2), 650-677. https://doi.org/10.1111/bjet.13529
Wu, T.-T., Lee, H.-Y., Li, P.-H., Huang, C.-N., & Huang, Y.-M. (2023). Promoting self-regulation progress and knowledge construction in blended learning via ChatGPT-based learning aid. Journal of Educational Computing Research, 61(8), 1539-1567. https://doi.org/10.1177/07356331231191125
Wut, T. M., Xu, J. (B.), Lee, S. W., & Lee, D. (2022). University student readiness and its effect on intention to participate in the flipped classroom setting of hybrid learning. Education Sciences, 12(7), Article 442. https://doi.org/10.3390/educsci12070442
Yaniawati, P., Maat, S. M., Supianti, I. I., & Fisher, D. (2022). Mathematics mobile blended learning development: Student-oriented high order thinking skill learning. European Journal of Educational Research, 11(1), 69-81. https://doi.org/10.12973/eu-jer.11.1.69
Yin, B., & Yuan, C.-H. (2021). Precision teaching and learning performance in a blended learning environment. Frontiers in Psychology, 12, Article 631125. https://doi.org/10.3389/fpsyg.2021.631125
Yin, B., & Yuan, C.-H. (2022). Blended learning performance influence mechanism based on community of inquiry. Asia Pacific Journal of Education, 44(4), 977-992. https://doi.org/10.1080/02188791.2022.2061912
Zhang, J.-H., Zou, L.-c., Miao, J.-j., Zhang, Y.-X., Hwang, G.-J., & Zhu, Y. (2020). An individualized intervention approach to improving university students’ learning performance and interactive behaviors in a blended learning environment. Interactive Learning Environments, 28(2), 231-245. https://doi.org/10.1080/10494820.2019.1636078
Zhang, R. (2020). Exploring blended learning experiences through the community of inquiry framework. Language Learning & Technology, 24(1), 38-53. https://doi.org/10125/44707
Zhao, J., Hwang, G.-J., Chang, S.-C., Yang, Q.-f., & Nokkaew, A. (2021). Effects of gamified interactive e-books on students’ flipped learning performance, motivation, and meta-cognition tendency in a mathematics course. Educational Technology Research and Development, 69, 3255-3280. https://doi.org/10.1007/s11423-021-10053-0
Zhao, S., & Song, J. (2022). Unpacking the emotional experiences of learners in a blended learning context. Frontiers in Psychology, 13, Article 879696. https://doi.org/10.3389/fpsyg.2022.879696
Zhao, W. (2022). An empirical study on blended learning in higher education in “Internet+” era. Education and Information Technologies, 27(6), 8705-8722. https://doi.org/10.1007/s10639-022-10944-6
Zhu, M., Berri, S., & Zhang, K. (2021). Effective instructional strategies and technology use in blended learning: A case study. Education and Information Technologies, 26(5), 6143-6161. https://doi.org/10.1007/s10639-021-10544-w
Zhu, M., & Bonk, C. J. (2019). Designing MOOCs to facilitate participant self-monitoring for self-directed learning. Online Learning Journal, 23(4), 106-134. https://doi.org/10.24059/olj.v23i4.2037
Zimmerman, W. A., & Kulikowich, J. M. (2016). Online learning self-efficacy in students with and without online learning experience. The American Journal of Distance Education, 30(3), 180-191. https://doi.org/10.1080/08923647.2016.1193801
Dandan Shen, School of Computer Science and Intelligence Education, Lingnan Normal University, Guangdong, China
Dandan Shen is an Associate Professor of the School of Computer Science and Intelligence Education at Lingnan Normal University, China. Her research interests focus on deeper learning, blended learning design, and technology integrated into teaching.
Chiung-Sui Chang, Tamkang University, Taiwan
Chiung-Sui Chang is a Professor of the Department of Educational Technology at Tamkang University in Taiwan. Her scholarly interests involve game-based learning, online learning, emerging learning technologies, and innovative teaching strategies.
Junjie Yang, School of Computer Science and Intelligence Education, Lingnan Normal University, Guangdong, China
Junjie Yang is an Associate Professor of the School of Computer Science and Intelligence Education at Lingnan Normal University, China. His research interests are in the use of computer technology in education.
© 2025. This work is published under the Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).