Abstract
Peer learning is a promising instructional strategy, particularly in higher education, where growing class sizes limit teachers' ability to support students' learning effectively. However, traditional implementations are not always effective, owing in part to students' lack of familiarity with strategies such as peer feedback. Recent advancements in educational technologies, including learning analytics and artificial intelligence (AI), offer new pathways to support and enhance peer learning. This editorial introduces a special issue that examines how emerging educational technologies, specifically learning analytics, AI, and multimodal tools, can be thoughtfully integrated into peer learning to improve its effectiveness and outcomes. The six studies featured in this issue present key innovations, including AI-supported peer assessment systems, multimodal learning analytics for analyzing collaborative gestures and discourse, gamified online platforms, social comparison feedback tools and dashboards, group awareness tools for collaborative learning, and behavioral indicators of peer feedback literacy. Collectively, the studies show how such technologies can scaffold peer learning processes, enrich the quality and uptake of peer feedback, foster engagement through gamification, promote reflective and collaborative learning, and address peer feedback literacy. However, the issue also identifies underexplored gaps, such as the short-term nature of many interventions, insufficient focus on the role of teachers, limited cultural and equity considerations, and a need for deeper theoretical integration. This editorial argues for a more pedagogically grounded, inclusive, and context-sensitive approach to technology-enhanced peer learning, one that foregrounds student agency, long-term impact, and interdisciplinary collaboration. The contributions of this special issue provide insights to guide future research, design, and practice in advancing peer learning through educational technologies.
Introduction
As class sizes in higher education continue to grow each year (Shi, 2019), it becomes increasingly difficult for teachers to provide individualized feedback. For large classes, offering detailed feedback is often impractical due to the heavy workload (Er et al., 2021). In such cases, peer learning in the broad sense (e.g., including peer feedback, peer assessment, peer interaction, and peer dialogue) can serve as an effective alternative instructional strategy. When used as a process-oriented pedagogical activity, peer learning also fosters a collaborative and motivating environment in which students explore learning materials, engage in critical discussions, and learn from one another (Kollar & Fischer, 2010). Peer learning involves students critically reviewing peers’ work, identifying gaps, suggesting improvements, and co-constructing knowledge (Liu & Carless, 2006; Topping, 2009).
While peer learning offers clear benefits, research highlights several challenges across its various instantiations. For instance, in peer feedback, students—especially novices—often provide surface-level rather than in-depth, constructive comments, largely due to a lack of training in giving critical feedback (Cho & Schunn, 2007; Wu & Schunn, 2020a). In peer assessment, concerns persist around its reliability and validity compared to teacher assessment, as students typically lack the expertise to evaluate peers’ work objectively and consistently (Liu & Carless, 2006; Cho & Schunn, 2007). This can lead to distrust in peer assessment, negatively affecting learning outcomes and fostering unfavorable attitudes and emotional responses (Cheng et al., 2014; Noroozi et al., 2016). Peer interaction also brings emotional and psychological challenges: critical dialogues, for example, may trigger anxiety, a fear of losing face, or the perception of critiques as personal attacks rather than constructive input (Rourke & Kanuka, 2007; Noroozi et al., 2016). Furthermore, when peer learning activities are associated with grading, students may experience emotional stress that hinders their ability to provide meaningful feedback (Liu & Carless, 2006). Last but not least, individual differences in cognitive, metacognitive, cultural, and contextual factors can influence students’ ability to interpret and act on peer feedback effectively (Nelson & Schunn, 2009; Wu & Schunn, 2020b). These challenges highlight that successful peer learning does not occur automatically (King, 2002; Kollar & Fischer, 2010); instead, targeted support and guidance are crucial to realizing its full potential in educational settings.
The rapid advancement of Technology-Enhanced Learning (TEL) environments, along with the growth of educational technologies, offers significant opportunities to address some, though not all, of the challenges in peer learning. Over the past two decades, various TEL tools and platforms have been developed with functionalities specifically designed to support peer learning (Darvishi et al., 2022; Huang et al., 2019; Keppell et al., 2006; Noroozi et al., 2016; Tsai et al., 2002). For example, computer-supported collaborative learning (CSCL) environments have been shown to enhance students' social and cognitive awareness and promote deeper engagement in peer assessment (Phielix et al., 2011; Prins et al., 2005). Other studies have explored the use of AI and virtual reality to strengthen trust and improve the quality of peer assessment (Chang et al., 2020; Darvishi et al., 2022). Learning analytics is another promising innovation in this domain (Banihashem et al., 2022), and multimodal learning analytics, which use sensing technology to capture the quality of students' interactions, are increasingly common (Schneider et al., 2021). Student-centered analytics dashboards, for instance, can visualize peer interaction patterns and provide students with actionable insights using methods such as Social Network Analysis (Lee & Bonk, 2016; Rienties et al., 2013). A shared strength across these technologies is their capacity to embed targeted instructional supports that help overcome peer learning challenges and guide students toward effective collaborative learning outcomes (Noroozi et al., 2021).
Despite such theoretical and practical innovations for facilitating peer learning processes and outcomes, in-depth research on this topic is limited, particularly regarding new technological innovations (e.g., AI, learning analytics, augmented reality, virtual reality, and multimodality). The literature in those areas predominantly emphasizes teacher- and student-centered approaches, with limited focus on peer-centered technology-enhanced learning. Consequently, there is insufficient guidance on how to align educational technologies with pedagogical strategies that meaningfully enhance peer learning outcomes. For example, learning analytics has mainly been framed as a means of improving large-scale personalized learning in higher education, with dashboards that are either teacher-centered or student-centered, while its potential for enhancing peer learning activities through peer-centered dashboards has not been sufficiently explored (Banihashem et al., 2022; Ryan et al., 2019). More importantly, the recent advancement of AI, particularly generative AI (GenAI), has demonstrated significant potential to support teaching and learning (Debets et al., 2025; Gašević et al., 2023; Ogunleye et al., 2024). With its human-like text generation capabilities, GenAI has been explored in recent studies as a tool for providing feedback (Banihashem et al., 2024, 2025; Er et al., 2024). However, its potential to support peer learning, especially in the context of peer feedback, remains largely underexplored.
In light of the current challenges and recent advancements in educational technologies, this special issue invited contributions that investigate how emerging technologies, such as AI, GenAI, learning analytics, augmented and virtual reality, and multimodal environments, can be thoughtfully designed and applied to enhance peer learning and improve educational outcomes.
About this special issue
The special issue titled “Technological Innovations for Facilitation of Peer Learning Processes and Outcomes” focuses on leveraging advanced educational technologies to enhance peer learning in higher education. This special issue includes six studies.
The study by Sung and Nathan (2025), titled “Unraveling Temporally Entangled Multimodal Interactions: Investigating Verbal and Nonverbal Contributions to Collaborative Construction of Embodied Math Knowledge,” investigates the ways in which verbal and nonverbal interactions—such as speech and gestures—contribute to peer learning in online, technology-enhanced environments. Using an embodied math activity, undergraduate students in math education collaborated via Zoom with support from a motion-based tool (THV-CE) while their upper body movements were tracked using PoseNet. Two analytical approaches were compared: the triangulating approach, which looks at cumulative patterns across modalities, and the interleaving approach, which captures the temporal interplay of those interactions. Results showed that students with greater physical movement produced more gestures and verbal contributions, especially co-thought gestures, indicating deeper engagement. The interleaving analysis revealed distinct discourse patterns tied to students’ movement levels—those with more movement were more likely to initiate embodied ideas, while others tended to build on peers’ contributions. The study shows that analyzing the temporal dynamics of multimodal interactions can provide richer insight into collaborative learning, highlighting the value of combining both analytical approaches in technology-enhanced learning research.
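As a rough illustration of how "physical movement" might be quantified from tracked keypoints of the kind a PoseNet-style tracker exports, the following minimal Python sketch computes a simple movement index from per-frame upper-body coordinates. The data format, joint count, and index are illustrative assumptions, not the measures or pipeline used by Sung and Nathan (2025).

```python
import numpy as np

def movement_index(keypoints: np.ndarray) -> float:
    """Sum of frame-to-frame joint displacement as a crude movement index.

    keypoints: array of shape (frames, joints, 2) holding (x, y) coordinates
    of upper-body joints (e.g., wrists, elbows, shoulders) already exported
    from a pose tracker. Larger values indicate more physical movement.
    """
    deltas = np.diff(keypoints, axis=0)      # displacement between frames
    step = np.linalg.norm(deltas, axis=-1)   # Euclidean distance per joint
    return float(step.sum())

# Illustrative comparison of two synthetic participants' movement levels.
rng = np.random.default_rng(0)
low_mover = rng.normal(0, 0.5, size=(300, 6, 2)).cumsum(axis=0)
high_mover = rng.normal(0, 2.0, size=(300, 6, 2)).cumsum(axis=0)
print(movement_index(low_mover) < movement_index(high_mover))  # expected: True
```

Such an index could then be cross-referenced with coded gestures and verbal turns, which is the spirit of the triangulating and interleaving analyses described above.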
Topping et al. (2025), in their study “Enhancing Peer Assessment with Artificial Intelligence,” provide an in-depth examination of the ways in which AI can address the limitations of traditional peer assessment in higher education. The article is structured around three main components: a theoretical framework for AI integration, a scoping review of 79 relevant studies, and a case study of the AI-powered RiPPLE platform. The framework highlights six domains where AI can support peer assessment: assigning assessors, enhancing individual reviews, deriving grades and feedback, analyzing student responses, facilitating instructor oversight, and developing assessment systems. The review reveals that most existing research focuses on grading and feedback, with less attention given to areas such as assessor assignment and teacher involvement. The RiPPLE case study illustrates how AI can be applied in practice, offering real-time feedback, reliability-based reviewer assignments, and analytics for teachers. In addition, the authors emphasize that AI should complement, not replace, human judgment in learning environments.
The study by Lu et al. (2024), titled “Social Comparison Feedback in Online Teacher Training and Its Impact on Asynchronous Collaboration,” investigates the ways in which social comparison feedback influences asynchronous online collaboration among in-service teachers during a project-based learning course. Using a randomized controlled trial with 95 participants, the study compared an experimental group that received social comparison feedback—based on behavioral, cognitive, interactional, and emotional indicators—with a control group that received only self-referential feedback. The intervention was delivered via an online platform and analyzed using Epistemic Network Analysis, Lag Sequential Analysis, and Social Network Analysis. The results showed that social comparison feedback significantly improved group-regulated learning, encouraging more focused task discussions, deeper negotiation, and greater monitoring behaviors. It also led to higher posting frequency, stronger interaction patterns, and improved knowledge construction, especially in higher-order phases like co-construction of meaning. Humor and questioning behaviors were more frequent in the experimental group, helping create a more motivating and engaging learning environment. While interaction balance did not differ significantly between groups, the experimental group showed better overall collaboration quality. The study showed that integrating social comparison feedback in asynchronous learning can enhance engagement, reflection, and collaboration. It also highlights the value of multimodal, comparative feedback for online teacher training.
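To give a concrete sense of the kind of interaction-pattern indicators a Social Network Analysis of an asynchronous discussion can yield, here is a minimal sketch using the networkx library on hypothetical who-replied-to-whom records. It is not the authors' pipeline or data, only an illustration of the technique named above.

```python
import networkx as nx

# Hypothetical reply records from an asynchronous forum:
# (author_of_reply, author_of_the_post_being_replied_to)
replies = [
    ("T01", "T02"), ("T02", "T01"), ("T03", "T01"),
    ("T01", "T03"), ("T04", "T02"), ("T02", "T04"),
]

g = nx.DiGraph()
g.add_edges_from(replies)

# Simple structural indicators of the kind an SNA report might use.
density = nx.density(g)                       # how interconnected the group is
in_centrality = nx.in_degree_centrality(g)    # who receives the most replies
out_centrality = nx.out_degree_centrality(g)  # who contributes the most replies

print(f"network density: {density:.2f}")
print("most replied-to participant:", max(in_centrality, key=in_centrality.get))
print("most active replier:", max(out_centrality, key=out_centrality.get))
```

Comparing such indicators between an experimental and a control group is one way the "stronger interaction patterns" reported above could be made visible.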
The study by Moon et al. (2024), titled “Using Learning Analytics to Explore Peer Learning Patterns in Asynchronous Gamified Environments,” investigates how students engage in peer learning through online discussions enhanced with gamification. The researchers utilized a mixed-methods approach combining epistemic network analysis, sequence pattern mining, and automated content coding using a large language model tool. The study involved over a thousand business students from undergraduate and graduate programs participating in asynchronous online discussions enhanced with gamification elements such as points, badges, leaderboards, and quests. These elements were integrated into Microsoft Teams using the ClassCred plugin. The researchers collected and analyzed a large number of discussion posts and responses, focusing on discourse dimensions like epistemic activity, argumentation, and social interaction, based on Weinberger and Fischer’s (2006) framework. The findings revealed that students frequently externalized ideas and built conceptual understanding through structured peer discourse. Patterns involving open-ended questions, debates, and social consensus-building were most effective for knowledge co-construction. The study shows that gamified environments, when paired with learning analytics, can enhance peer interaction, foster critical thinking, and support meaningful collaborative learning.
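To illustrate the flavor of sequence pattern mining over coded discussion moves, the sketch below counts adjacent code transitions in one hypothetical thread. The code labels are placeholders loosely inspired by argumentative and social dimensions; they do not reproduce the study's coding scheme, LLM-based coding step, or mining method.

```python
from collections import Counter

# Hypothetical sequence of coded moves within a single discussion thread.
thread = ["externalization", "open_question", "counterargument",
          "externalization", "integration", "consensus",
          "open_question", "externalization", "consensus"]

# Count first-order transitions (which move tends to follow which).
transitions = Counter(zip(thread, thread[1:]))
for (a, b), n in transitions.most_common(3):
    print(f"{a} -> {b}: {n}")
```

Frequent transitions such as an open question followed by externalization are the kind of pattern that can then be compared across groups or linked to knowledge co-construction.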
Zhang et al. (2024), in their study “What does it mean to be good at peer reviewing? A multidimensional scaling and cluster analysis study of behavioral indicators of peer feedback literacy,” explore the behavioral dimensions of peer feedback literacy, shifting the focus from students’ attitudes and knowledge to the actual quality of feedback they produce. The authors propose a comprehensive framework of six dimensions of peer review quality: reviewing process, rating accuracy, feedback amount, perceived comment quality, actual comment quality, and feedback content. Using data from over 800 students in a university-level writing course who engaged in online peer review via the Peerceptiv platform, the study investigates five of these dimensions (excluding feedback content) through 18 specific behavioral indicators. Using correlation analysis, multidimensional scaling, and cluster analysis, the study finds that peer review quality is multidimensional. While rating accuracy, feedback amount, and process measures form clear clusters, perceived and actual comment quality overlap and are better described as “initial impact” (e.g., helpfulness, length) and “ultimate impact” (e.g., usefulness for revision). The findings highlight the importance of both effort and expertise in effective peer feedback. The study concludes with practical recommendations: teachers and platform designers should scaffold four areas—accuracy, volume, initial impact, and ultimate impact—while using accessible indicators like comment length and helpfulness when deeper analysis is not feasible.
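To illustrate the analytical approach named above, multidimensional scaling and cluster analysis over the inter-correlations of behavioral indicators, the following sketch uses scikit-learn and SciPy on synthetic data. The indicator matrix, its dimensionality, and the cluster count are placeholders; this is not the Peerceptiv dataset or the authors' exact procedure.

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# Synthetic stand-in for a students-by-indicators matrix
# (e.g., 18 behavioral indicators per reviewer).
indicators = rng.normal(size=(200, 18))

# Convert indicator inter-correlations to dissimilarities so that
# highly correlated indicators end up close together on the map.
corr = np.corrcoef(indicators, rowvar=False)
dissimilarity = 1 - np.abs(corr)

# Two-dimensional MDS configuration of the indicators.
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissimilarity)

# Hierarchical clustering on the same dissimilarities to group indicators.
condensed = squareform(dissimilarity, checks=False)
clusters = fcluster(linkage(condensed, method="average"),
                    t=4, criterion="maxclust")
print(coords.shape, clusters)
```

In a real analysis, inspecting which indicators fall into the same cluster (or sit close together in the MDS space) is what supports claims such as perceived and actual comment quality overlapping into "initial" and "ultimate" impact.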
Strauss et al. (2025), in their study “Promoting Collaborative Reflection to Foster Interprofessional Collaboration Skills: Putting Laboratory Findings to the Test in the Field,” examine whether a technology-supported collaborative reflection activity can improve students’ knowledge of effective collaboration and enhance the quality of their group interactions. Building on the same pedagogical design principles, the study compares two settings: a laboratory experiment with university students and a field trial involving civil engineering students. Both groups participated in two collaborative problem-solving phases. In between these phases, they engaged in a guided collaborative reflection using a digital group awareness tool, which visualized self-assessed ratings of collaboration quality across dimensions like mutual understanding, information pooling, and consensus-building. Students then discussed and created plans for improving their future collaboration based on these reflections. Results showed that in both settings, the reflection activity improved participants’ explicit knowledge about effective collaboration. However, while knowledge gains were observed, the quality of collaboration in the second task did not significantly improve. Interestingly, students’ self-assessments aligned well with expert evaluations, suggesting that students can accurately reflect on their group processes when properly supported. The study shows that collaborative reflection—especially when supported by technology—can foster metacognitive awareness and promote knowledge about effective collaboration. However, translating that knowledge into improved interaction quality may require more time or additional support.
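As a rough sketch of how a group awareness tool might aggregate self-assessed ratings into a display that prompts reflection, the pandas snippet below summarizes hypothetical ratings across the dimensions mentioned above. The data, scale, and disagreement rule are illustrative assumptions, not the tool used in the study.

```python
import pandas as pd

# Hypothetical self-assessments (1-5) from one three-member group.
ratings = pd.DataFrame({
    "member": ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "dimension": ["mutual_understanding", "information_pooling", "consensus",
                  "mutual_understanding", "information_pooling", "consensus",
                  "mutual_understanding", "information_pooling", "consensus"],
    "rating": [4, 3, 5, 2, 3, 4, 3, 2, 4],
})

# A group awareness display typically contrasts the group mean with the
# spread of individual ratings and flags dimensions worth discussing.
summary = ratings.groupby("dimension")["rating"].agg(["mean", "min", "max"])
summary["discuss"] = (summary["max"] - summary["min"]) >= 2  # large disagreement
print(summary)
```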
Meta-review of the collection: key contributions
The six studies in this special issue provide a timely and multifaceted exploration of how educational technologies are being purposefully leveraged to advance peer learning. Moving beyond generic applications of technology, these contributions investigate intelligent and targeted interventions that address persistent challenges in peer assessment, interaction, and feedback.
A meta-review of these studies (summarized in Table 1) captures current trends at the intersection of technology and peer learning. The contributions encompass a wide range of methodological approaches, such as experimental research, quantitative and mixed-methods designs, theoretical models, and scoping reviews. While students are the primary focus, one study (Lu et al., 2024) uniquely centers on teacher training, highlighting the essential role teachers play in facilitating and sustaining peer learning. The technologies examined—ranging from AI and learning analytics to feedback platforms and group awareness tools—reflect a growing reliance on intelligent, data-driven strategies to support peer engagement.
Each study targets a specific aspect of peer learning. Four emphasize peer interaction, while others focus on feedback and assessment, reflecting a broader shift toward collaborative learning, co-regulation, and joint knowledge construction. Interventions include analyses of verbal and nonverbal communication in collaborative tasks (Sung & Nathan, 2025), AI-enhanced peer assessment systems (Topping et al., 2025), and behavioral indicators that shape feedback literacy development (Zhang et al., 2024). Moreover, the integration of gamified platforms (Moon et al., 2024) and comparative feedback tools (Lu et al., 2024) highlights the increasing importance of motivation and engagement in peer learning environments.
The studies collectively extend our understanding of peer learning through the purposeful use of educational technology. They highlight specific innovations, such as the integration of multimodal analytics to uncover embodied aspects of collaboration (e.g., Sung & Nathan, 2025), the theoretical and empirical grounding of AI-supported peer assessment tools (e.g., Topping et al., 2025), and the use of learning analytics and social comparison metrics to foster teacher reflection and interaction (e.g., Lu et al., 2024). These contributions go beyond reporting results; they propose new frameworks, validate novel tools and analytical approaches (e.g., RiPPLE, SNA, ENA), and surface critical design implications for enhancing peer feedback, motivation, and collaboration across diverse learning contexts.
Altogether, this collection signals a meaningful shift toward peer-centered, theory-informed, and scalable technological interventions. It highlights a growing commitment to designing peer learning environments that are not only data-driven and pedagogically sound but also inclusive and adaptable to diverse learners and settings in higher education.
Table 1. Meta-review of the collection
| Study | Method | Target group | Technology use | Peer learning strategy | Intervention focus | Key contribution |
|---|---|---|---|---|---|---|
| Sung and Nathan (2025) | Mixed-methods | Student | Multimodal learning analytics | Peer interaction | Role of verbal and nonverbal modalities in collaborative knowledge construction | Highlighting the role of gesture and discourse in embodied learning |
| Topping et al. (2025) | Theoretical framework, scoping review, case study | Student | AI, peer assessment tool | Peer assessment | Grounding AI use for peer learning in a theoretical framework and supporting it with a review and case study | A framework and review for AI in peer assessment |
| Lu et al. (2024) | Experimental (randomized controlled trial) | Teacher | Learning analytics | Peer feedback and interaction | Role of the social comparison feedback tool on online collaboration | The role of social comparison feedback in teacher training |
| Moon et al. (2024) | Mixed-methods | Student | Learning analytics, AI | Peer interaction | Exploring peer learning patterns in gamified environments | Approach to integrating gamified online peer learning and GenAI |
| Strauss et al. (2025) | Experimental design (lab and field study) | Student | CSCL, group awareness tools | Peer interaction | Improving collaborative reflection and interprofessional collaboration skills | Strategy for supporting group reflection to boost collaboration skills |
| Zhang et al. (2024) | Quantitative (correlation, cluster analysis, MDS) | Student | Peer feedback tool | Peer feedback | Exploring behavioral indicators of peer feedback literacy | Empirically derived structure of dimensions for peer feedback literacy |
Implications, critical observations, and future research directions
Based on the insights derived from these studies, the following implications can be identified.
Augmenting Peer Learning with Emerging Technologies: Emerging technologies, such as AI-based assessment systems, multimodal learning analytics, and group awareness tools, can generally enhance peer learning processes and outcomes. For instance, technologies that capture both verbal and nonverbal interactions (e.g., gestures and discourse) or offer targeted feedback (e.g., through AI or social comparison) provide more nuanced insights and personalized support, fostering deeper cognitive engagement and addressing diverse student needs in peer learning settings.
The Necessity of Pedagogical Alignment: While technological capabilities are expanding rapidly, some studies emphasize that technology alone is insufficient. Effective peer learning requires intentional alignment between pedagogical design and technological capabilities. For example, Moon et al. (2024) applied Weinberger and Fischer’s (2006) framework to guide their exploration of peer learning patterns, while Zhang et al. (2024) demonstrated that peer feedback literacy depends on explicit training (e.g., rubrics, exemplars) rather than platform features alone.
Designing for Contextual and Learner Diversity: Targeted, context-sensitive designs are necessary to effectively integrate technologies such as multimodal analytics, AI-driven support, and socially comparative feedback for both students and teachers. This need arises from the clear inadequacy of one-size-fits-all solutions – effective implementations must adapt to disciplinary, cultural, and cognitive differences. For instance, Sung and Nathan (2025) demonstrate the importance of modality-specific approaches, showing how gesture analysis benefits embodied learning domains like mathematics but proves less relevant for text-focused disciplines. Similarly, Lu et al. (2024) emphasize cultural considerations, revealing that while social comparison feedback can motivate students in competitive environments, it may disadvantage those from collectivist cultures unless carefully framed to emphasize collaboration.
The following critical observations emerge from the meta-review of the included studies.
Limited Longitudinal Evidence: A majority of the studies rely on short-term interventions. Only Strauss et al. (2025) attempted a field validation, and even there, no sustained improvements in collaborative performance were observed over time.
Overemphasis on Student Populations: Except for Lu et al. (2024), the studies focus exclusively on students. The role of teachers as orchestrators of peer learning and co-designers of feedback environments is underexplored, despite its acknowledged importance.
Technology-Centric Bias: While the studies are rich in technological sophistication, there is a risk of instrumentalizing technology at the expense of deeper pedagogical integration. These efforts could benefit from more critical reflection on the pedagogical assumptions guiding the design and use of these tools.
Limited Diversity and Equity Considerations: Only Lu et al. (2024) and Strauss et al. (2025) hint at contextual and cultural variability, yet a broader investigation into how these tools serve (or marginalize) different learner populations remains lacking.
Fragmented Theoretical Integration: While some studies (e.g., Moon et al., 2024; Topping et al., 2025) draw on established theoretical frameworks, others lack clear alignment with such frameworks. A more coherent theoretical grounding could strengthen the field’s progression.
These studies lay the groundwork for future investigations in the following areas:
Repurposing Technologies as Scaffolds: We call for future research to continue exploring how emerging technologies—particularly GenAI, learning analytics, and multimodal tools—can be used not only as analytical or feedback mechanisms but also as scaffolds that empower students and teachers to take more active roles in peer learning.
Advancing Inclusive and Equitable Design: There is a pressing need to investigate how these tools can improve outcomes across different contexts and how they can support diversity, equity, and inclusion within specific collaborative learning environments, ensuring that diverse student populations benefit from such innovations. In particular, research should explore how to design inclusive peer learning environments that reduce bias and support all students equitably.
Clarifying Individual Differences: To better understand for whom these technologies work, researchers should focus on developing and validating integrated frameworks that bridge cognitive, emotional, and social aspects of peer learning with the technical affordances of learning analytics and AI.
Developing Scalable and Adaptive Systems: Researchers should focus on exploring the design of scalable frameworks for adaptive peer learning technologies, such as discipline-specific AI feedback (e.g., visual analytics for math, discourse analysis for literature), culturally customizable dashboards (e.g., toggling competitive/collaborative comparison modes), and real-time modality switching (text/voice/gesture) based on students’ needs.
Prioritizing Long-Term Impact: Additionally, longitudinal studies are needed to better understand the long-term impacts of these technologies on students’ collaborative competencies and academic outcomes.
Addressing Ethical Considerations: Finally, evaluating the ethical implications of AI-driven interventions in peer learning—including concerns about bias, transparency, and human oversight—will also be essential in shaping responsible future applications.
Conclusion
This special issue offers a timely contribution to the evolving discourse on peer learning in higher education by showcasing how advanced technologies can be harnessed to address its challenges. Moving beyond isolated case studies, the featured research provides a cohesive and empirically grounded narrative on the integration of AI, learning analytics, and multimodal tools into peer-based pedagogies. This special issue thus serves as both a reflection of current progress and a roadmap for future research and practice in the evolving field of technology-enhanced peer learning. The added value of this collection lies in its shift toward peer-centered technological design, highlighting the importance of empowering students as active agents in collaborative learning, rather than as passive recipients. By introducing novel analytical frameworks, validating emerging tools in diverse settings, and critically reflecting on practical implications, this special issue deepens our understanding of how technology can support effective peer learning. It not only fills key gaps in the literature but also sets a forward-looking agenda for interdisciplinary research and pedagogical innovation.
Author contributions
All the authors listed contributed to the writing of the paper.
Competing interests
The authors have no competing interests.
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
Banihashem, SK; Noroozi, O; Van Ginkel, S; Macfadyen, LP; Biemans, HJ. A systematic review of the role of learning analytics in enhancing feedback practices in higher education. Educational Research Review; 2022; 37, 100489. [DOI: https://dx.doi.org/10.1016/j.edurev.2022.100489]
Banihashem, SK; Kerman, NT; Noroozi, O; Moon, J; Drachsler, H. Feedback sources in essay writing: peer-generated or AI-generated feedback?. International Journal of Educational Technology in Higher Education; 2024; 21,
Banihashem, S. K., Noroozi, O., Khosravi, H., Schunn, C. D., & Drachsler, H. (2025). Pedagogical framework for hybrid intelligent feedback. Innovations in Education and Teaching International, 1–17. https://doi.org/10.1080/14703297.2025.2499174
Chang, SC; Hsu, TC; Jong, MSY. Integration of the peer assessment approach with a virtual reality design system for learning Earth science. Computers & Education; 2020; 146, 103758. [DOI: https://dx.doi.org/10.1016/j.compedu.2019.103758]
Cheng, KH; Hou, HT; Wu, SY. Exploring students’ emotional responses and participation in an online peer assessment activity: A case study. Interactive Learning Environments; 2014; 22,
Cho, K; Schunn, CD. Scaffolded writing and rewriting in the discipline: A web-based reciprocal peer review system. Computers & Education; 2007; 48,
Darvishi, A., Khosravi, H., Sadiq, S., & Gašević, D. (2022). Incorporating AI and learning analytics to build trustworthy peer assessment systems. British Journal of Educational Technology, 844–875. https://doi.org/10.1111/bjet.13233
Debets, T., Banihashem, S. K., Joosten-Ten Brinke, D., Vos, T. E., de Buy Wenniger, G. M., & Camp, G. (2025). Chatbots in education: A systematic review of objectives, underlying technology and theory, evaluation criteria, and impacts. Computers & Education, 105323. https://doi.org/10.1016/j.compedu.2025.105323
Er, E; Dimitriadis, Y; Gašević, D. Collaborative peer feedback and learning analytics: Theory-oriented design for supporting class-wide interventions. Assessment & Evaluation in Higher Education; 2021; 46,
Er, E., Akçapınar, G., Bayazıt, A., Noroozi, O., & Banihashem, S. K. (2024). Assessing student perceptions and use of instructor versus AI-generated feedback. British Journal of Educational Technology, 1–18. https://doi.org/10.1111/bjet.13558
Gašević, D; Siemens, G; Sadiq, S. Empowering learners for the age of artificial intelligence. Computers and Education: Artificial Intelligence; 2023; 4, 100130. [DOI: https://dx.doi.org/10.1016/j.caeai.2023.100130]
Huang, K., Bryant, T., & Schneider, B. (2019). Identifying collaborative learning states using unsupervised machine learning on eye-tracking, physiological and motion sensor data. In Proceedings of the 12th International Conference on Educational Data Mining (EDM) (pp. 318–323). Montreal, Canada.
Keppell, M; Au, E; Ma, A; Chan, C. Peer learning and learning-oriented assessment in technology‐enhanced environments. Assessment & Evaluation in Higher Education; 2006; 31,
King, A. Structuring peer interaction to promote high-level cognitive processing. Theory into Practice; 2002; 41,
Kollar, I; Fischer, F. Peer assessment as collaborative learning: A cognitive perspective. Learning and Instruction; 2010; 20,
Lee, J; Bonk, CJ. Social network analysis of peer relationships and online interactions in a blended class using blogs. The Internet and Higher Education; 2016; 28, pp. 35-44. [DOI: https://dx.doi.org/10.1016/j.iheduc.2015.09.001]
Liu, NF; Carless, D. Peer feedback: The learning element of peer assessment. Teaching in Higher Education; 2006; 11,
Lu, Y; Ma, N; Yan, WY. Social comparison feedback in online teacher training and its impact on asynchronous collaboration. International Journal of Educational Technology in Higher Education; 2024; 21,
Moon, J; McNeill, L; Edmonds, CT; Banihashem, SK; Noroozi, O. Using learning analytics to explore peer learning patterns in asynchronous gamified environments. International Journal of Educational Technology in Higher Education; 2024; 21,
Nelson, MM; Schunn, CD. The nature of feedback: How different types of peer feedback affect writing performance. Instructional Science; 2009; 37,
Noroozi, O; Biemans, H; Mulder, M. Relations between scripted online peer feedback processes and quality of written argumentative essay. The Internet and Higher Education; 2016; 31, pp. 20-31. [DOI: https://dx.doi.org/10.1016/j.iheduc.2016.05.002]
Noroozi, O; Weinberger, A; Kirschner, PA. Technological and pedagogical innovations for facilitation of students’ collaborative argumentation-based learning. Innovations in Education and Teaching International; 2021; 58,
Ogunleye, B; Zakariyyah, KI; Ajao, O; Olayinka, O; Sharma, H. A systematic review of generative AI for teaching and learning practice. Education Sciences; 2024; 14,
Phielix, C; Prins, FJ; Kirschner, PA; Erkens, G; Jaspers, J. Group awareness of social and cognitive performance in a CSCL environment: Effects of a peer feedback and reflection tool. Computers in Human Behavior; 2011; 27,
Prins, FJ; Sluijsmans, DM; Kirschner, PA; Strijbos, JW. Formative peer assessment in A CSCL environment: A case study. Assessment & Evaluation in Higher Education; 2005; 30,
Rienties, B; Héliot, Y; Jindal-Snape, D. Understanding social learning relations of international students in a large classroom using social network analysis. Higher Education; 2013; 66,
Rourke, L; Kanuka, H. Barriers to online critical discourse. International Journal of Computer-Supported Collaborative Learning; 2007; 2,
Ryan, T., Gašević, D., & Henderson, M. (2019). Identifying the impact of feedback over time and at scale: Opportunities for learning analytics. In M. Henderson, R. Ajjawi, D. Boud, & E. Molloy (Eds.), The impact of feedback in higher education (pp. 207–223). Palgrave Macmillan. https://doi.org/10.1007/978-3-030-25112-3_12
Schneider, B; Sung, G; Chng, E; Yang, S. How can high-frequency sensors capture collaboration? A review of the empirical links between multimodal metrics and collaborative constructs. Sensors (Basel, Switzerland); 2021; 21,
Shi, M. The effects of class size and instructional technology on student learning performance. The International Journal of Management Education; 2019; 17,
Sung, H; Nathan, MJ. Unraveling temporally entangled multimodal interactions: Investigating verbal and nonverbal contributions to collaborative construction of embodied math knowledge. International Journal of Educational Technology in Higher Education; 2025; 22,
Strauß, S., Tunnigkeit, I., Eberle, J., et al. (2025). Promoting collaborative reflection to foster interprofessional collaboration skills: Putting laboratory findings to the test in the field. International Journal of Educational Technology in Higher Education, 22, 54. https://doi.org/10.1186/s41239-025-00553-x
Topping, KJ. Peer assessment. Theory into Practice; 2009; 48,
Topping, KJ; Gehringer, E; Khosravi, H; Gudipati, S; Jadhav, K; Susarla, S. Enhancing peer assessment with artificial intelligence. International Journal of Educational Technology in Higher Education; 2025; 22,
Tsai, CC; Lin, SS; Yuan, SM. Developing science activities through a networked peer assessment system. Computers & Education; 2002; 38,
Weinberger, A; Fischer, F. A framework to analyze argumentative knowledge construction in computer-supported collaborative learning. Computers & Education; 2006; 46,
Wu, Y; Schunn, CD. When peers agree, do students listen? The central role of feedback quality and feedback frequency in determining uptake of feedback. Contemporary Educational Psychology; 2020; 62, 101897. [DOI: https://dx.doi.org/10.1016/j.cedpsych.2020.101897]
Wu, Y; Schunn, CD. From feedback to revisions: Effects of feedback features and perceptions. Contemporary Educational Psychology; 2020; 60, 101826. [DOI: https://dx.doi.org/10.1016/j.cedpsych.2019.101826]
Zhang, Y; Schunn, CD; Wu, Y. What does it mean to be good at peer reviewing? A multidimensional scaling and cluster analysis study of behavioral indicators of peer feedback literacy. International Journal of Educational Technology in Higher Education; 2024; 21,
© The Author(s) 2025. This article is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).