1. Introduction
The distinctive context of higher education, characterised by advanced qualifications such as undergraduate degrees and postgraduate programmes (postgraduate diplomas, master’s degrees, and doctoral studies), necessitates a tailored approach to data analytics within Learning Management System (LMS) platforms to effectively improve the user experience (UX) and support academic activities [1,2]. LMSs are digital platforms designed to facilitate online learning and educational administration [3]. Their features, which include discussion boards, tools for submitting assignments, gradebooks, and multimedia integration, make these platforms a hub for the delivery of course content, communication, assessment, and student engagement [1,3]. LMS platforms in higher education must meet the complex requirements of students who frequently collaborate on group projects, perform independent research, and complete specialised coursework [4]. LMS platforms must, therefore, be reliable, adaptable, and user-friendly to effectively support students’ academic endeavours and enhance their UX [4,5,6].
UX, in this context, refers to the overall interaction and perception of learners when using an LMS [2,5]. It encompasses factors such as ease of navigation, clarity of information, efficiency of task completion, and the overall satisfaction derived from the platform. A positive UX is crucial for learners, as it can significantly impact their engagement, motivation, and, ultimately, their academic performance [2,5,6]. For instance, an LMS with a cluttered interface or a convoluted assignment submission process can lead to frustration and wasted time, detracting from the learner’s focus on their research or coursework. Conversely, an LMS with a streamlined, intuitive design can enhance productivity, facilitate seamless communication, and foster a more positive learning environment. In this regard, higher education institutions (HEIs) can better understand how learners use their LMS, pinpoint areas for development, and tailor the learning process to each student’s unique requirements and preferences by utilising learning analytics (LA) [2,3,4,5]. In LA, data about learners and their contexts is measured, gathered, analysed, and reported in order to better understand and optimise learning and the settings in which it takes place [7,8]. In essence, LA turns the raw data produced by learner interactions within the LMS into useful insights. This involves monitoring a range of student behaviour indicators, including how often students log in [4], how much time they spend on particular learning resources, how often they participate in discussion boards, how often they submit assignments, and how well they perform on tests and quizzes [9,10,11]. By examining these data, HEIs can obtain a detailed picture of how students interact with their LMS and recognise both successful and challenging trends [9,10]. In particular, the rapid evolution of digital learning environments calls for a deeper understanding of how LA can be adapted to the particular difficulties encountered by students, who frequently balance rigorous schedules, varied learning preferences, and intricate research goals [11,12]. The complex requirements of this group are frequently not met by the conventional one-size-fits-all approach to LMS design, which results in less-than-ideal engagement and may even impede academic achievement [2,12,13]. Additionally, the growing amount of data available within LMS platforms presents an opportunity to optimise and customise the learning process [1,14,15].
Even though data analytics has the potential to improve academic performance, HEIs confront numerous obstacles and constraints when attempting to adopt it effectively [16,17]. In particular, there is a significant gap in the way LA are standardised and integrated into LMS platforms to maximise learners’ UX [3,18,19]. This lack of best practices leaves HEIs uncertain about data-gathering tactics, efficient analysis methods, and the application of data-driven insights for programme enhancement within the LMS context [16,20]. The resultant inconsistency in LA implementation across programmes and institutions can lead to variable impacts on student learning outcomes and UX [10,11]. Furthermore, the extant literature on LA often focuses on isolated analytical techniques, such as descriptive analytics or predictive modelling, rather than exploring the integrated application of diverse approaches within the LMS environment [18,19]. This integrated approach, which could offer nuanced insights into student learning behaviour and performance within the digital learning platform [20], remains under-explored, potentially limiting the development of targeted interventions and improved academic achievement through an enhanced LMS user experience.
In addition, despite the widespread adoption of LMS in education [3], a significant knowledge gap persists regarding the effective integration of LA to enhance UX. This gap impedes the development of LMS platforms optimised for specific requirements and learning styles [16]. Consequently, this systematic review aimed to synthesise the empirical literature to identify and evaluate LA approaches that have demonstrated efficacy in improving the UX of LMS platforms for students. Furthermore, this study assessed the impact of these approaches on academic outcomes, student engagement, and satisfaction, thereby producing an evidence-based synthesis to inform future research and practice. Without a systematic evaluation of effective analytical methods, institutions risk misinterpreting or underutilising the potential of LA [7]. This review therefore addresses this critical gap by synthesising the methodological rigour of existing research, establishing best practices in the application of LA for UX enhancement, and providing practical insights for educators, instructional designers, and LMS developers. These insights will facilitate the creation of more user-friendly, engaging, and productive learning environments for students. In this regard, this review was guided by the following research objectives:

1. Identify and evaluate learning analytics approaches utilised within Learning Management Systems to enhance the user experience of students.
2. Examine the challenges hindering the successful integration of learning analytics for user experience improvement in Learning Management System environments.
2. Materials and Methods
In performing this systematic review, the researchers followed the guidelines provided in the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2020 statement (Supplementary Materials) to guarantee integrity and methodological rigour [21]. To bolster the methodological robustness and clarity of our review process, we proactively registered the study protocol on the Open Science Framework (OSF) [22]. This preregistration created a publicly available, time-stamped account of our intended aims, eligibility criteria, search approach, data extraction techniques, and planned synthesis, thereby mitigating the risk of biases emerging after data collection and enhancing trust in the reliability of this review’s conclusions [21].
2.1. Literature Retrieval, Screening, and Eligibility Criteria
The literature retrieval was conducted across five prevalent academic databases: Web of Science, ERIC, PsycINFO, Scopus, and the ACM Digital Library. The original search captured a broad range of scholarly outputs, including peer-reviewed journal articles, conference papers, and book chapters, that dealt with the nexus between data analytics and higher education. In particular, the search approach was carefully crafted to find research that concentrated on applying LA in LMS to improve students’ UX. This approach combined keywords and subject headings [23] pertaining to LA, higher education, LMS platforms, and UX. The following syntax was used for each database:

ACM Digital Library: ((“undergraduate” OR “postgraduate education” OR “masters” OR “doctoral” OR “graduate education”) AND (“learning analytics” OR “educational data mining” OR “data analytics”) AND (“learning management system” OR “LMS” OR “user experience” OR “UX” OR “student engagement” OR “academic performance”)).

ERIC: (TI = (“undergraduate” OR “postgraduate education” OR “masters” OR “doctoral” OR “graduate education”) OR AB = (“learning analytics” OR “educational data mining” OR “data analytics”)) AND (KW = (“learning management system” OR “LMS” OR “user experience” OR “UX” OR “student engagement” OR “academic performance”)).

PsycINFO: ((TX = “undergraduate” OR TX = “postgraduate education” OR TX = “masters” OR TX = “doctoral” OR TX = “graduate education”) AND (TX = “learning analytics” OR TX = “educational data mining” OR TX = “data analytics”) AND (TX = “learning management system” OR TX = “LMS” OR TX = “user experience” OR TX = “UX” OR TX = “student engagement” OR TX = “academic performance”)).

Scopus: TITLE-ABS-KEY((“undergraduate” OR “postgraduate education” OR “masters” OR “doctoral” OR “graduate education”) AND (“learning analytics” OR “educational data mining” OR “data analytics”) AND (“learning management system” OR “LMS” OR “user experience” OR “UX” OR “student engagement” OR “academic performance”)).

Web of Science: TS = ((“undergraduate” OR “postgraduate education” OR “masters” OR “doctoral” OR “graduate education”) AND (“learning analytics” OR “educational data mining” OR “data analytics”) AND (“learning management system” OR “LMS” OR “user experience” OR “UX” OR “student engagement” OR “academic performance”)).

To enhance search recall and precision, Boolean operators (AND, OR) were employed to combine these search terms consistently across all databases [23].
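To make the shared structure of these five strings explicit, the sketch below (our illustration, not part of the published protocol) composes the three synonym blocks once and applies the database-specific wrappers; the term lists are copied verbatim from the syntax above.

```python
# Sketch: build the common Boolean query once, then wrap it per database.
population = ["undergraduate", "postgraduate education", "masters",
              "doctoral", "graduate education"]
intervention = ["learning analytics", "educational data mining", "data analytics"]
context = ["learning management system", "LMS", "user experience", "UX",
           "student engagement", "academic performance"]

def or_block(terms):
    """Join synonyms with OR, quoting each phrase."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

# Three OR-blocks combined with AND, as in the ACM and Scopus strings above.
query = " AND ".join(or_block(b) for b in (population, intervention, context))
print("TITLE-ABS-KEY(" + query + ")")  # Scopus wrapper
print("TS = (" + query + ")")          # Web of Science wrapper
```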
Subsequently, a targeted and pertinent literature search was ensured through the development of a structured search strategy and the establishment of eligibility criteria, guided by the Population, Intervention, Comparison, Outcome (PICO) framework [24]. Specifically, the PICO framework streamlined the initial stages of the study selection process, facilitating the identification of eligible articles [24]. Accordingly, the following eligibility criteria were formulated based on the PICO framework:

Population (P): Higher education learners (undergraduate and postgraduate programmes) utilising an LMS. Inclusion: studies explicitly identifying undergraduate and postgraduate learners within an LMS context. Exclusion: studies focusing on pre-tertiary education or general populations without LMS-specific data.

Intervention (I): Enhancement of UX via LA in the LMS. Inclusion: studies applying LA to improve the learner experience within an LMS. Exclusion: purely technical LA studies without an LMS UX focus.

Comparison (C): Comparison of LA interventions within the LMS. Preference: studies comparing LA effectiveness within an LMS. Screening: comparative studies within the LMS were prioritised; non-comparative studies were not.

Outcome (O): UX and learner success in the LMS. Inclusion: studies measuring UX and learner outcomes (academic, engagement, satisfaction) within an LMS. Exclusion: studies without UX or learner outcome data within an LMS [24].
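As a minimal sketch of how such rules can be operationalised during screening (the record fields and their names are hypothetical, chosen purely for illustration):

```python
# Hedged sketch: encode the PICO-based inclusion rules as a screening filter.
def meets_pico(record: dict) -> bool:
    higher_ed = record.get("population") in {"undergraduate", "postgraduate"}
    lms_context = record.get("uses_lms", False)
    la_for_ux = record.get("applies_la", False) and record.get("ux_focus", False)
    has_outcomes = bool(record.get("outcomes"))  # academic, engagement, satisfaction
    return higher_ed and lms_context and la_for_ux and has_outcomes

example = {"population": "postgraduate", "uses_lms": True,
           "applies_la": True, "ux_focus": True, "outcomes": ["engagement"]}
print(meets_pico(example))  # True
```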
Building on the foundational PICO framework, the following additional eligibility requirements were established:

Language: studies published in English, to ensure accessibility and consistency in analysis.

Publication period: studies published from January 2015 to February 2025, reflecting the rapid evolution of LA technologies and their application within LMS environments. This timeframe ensures the inclusion of recent advancements relevant to UX enhancement.

Methodological rigour and conceptual clarity: research articles demonstrating a robust understanding of LA concepts and their application within LMS environments. Studies had to employ rigorous research methodologies, clearly define the learner sample within the LMS context, and present findings that contribute to the understanding of how LA enhances UX and student success in this specific digital learning environment.

The PRISMA flow diagram (Figure 1) illustrates the systematic search and study selection process.
As shown in Figure 1, an initial search across the designated databases retrieved 3107 records. To manage this volume of data and ensure a rigorous selection process, a two-stage screening methodology was implemented, utilising Rayyan software version 1.5.0 (a web-based deduplication and screening platform) and ASReview software version 2.0 (an open-source machine learning-assisted screening tool).
2.1.1. Stage 1: Automated Deduplication and Machine Learning-Assisted Prioritisation
Rayyan’s deduplication functionality was employed to identify and remove redundant and non-English records [26]. Using algorithms that compare attributes such as title, authors, abstract, and publication details, 1547 records were excluded, reducing the dataset to 1560 records for subsequent evaluation.
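As a minimal sketch of this attribute-based matching (our simplification for illustration; Rayyan’s actual algorithm is more sophisticated), duplicates can be detected by normalising bibliographic attributes into a comparison key:

```python
import re

def key(record):
    """Normalise the title and pair it with the year as a comparison key."""
    title = re.sub(r"[^a-z0-9 ]", "", record["title"].lower()).strip()
    return (title, record.get("year"))

def deduplicate(records):
    seen, unique = set(), []
    for r in records:
        k = key(r)
        if k not in seen:   # keep only the first record with this key
            seen.add(k)
            unique.append(r)
    return unique

records = [{"title": "Learning Analytics in LMS", "year": 2021},
           {"title": "learning analytics in LMS!", "year": 2021}]
print(len(deduplicate(records)))  # 1
```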
Subsequently, ASReview was utilised to prioritise the remaining 1560 records for full-text review [27]. ASReview’s machine learning algorithms predicted each record’s relevance by comparing abstracts and titles against the predetermined inclusion and exclusion criteria, allowing the reviewers to concentrate on the most promising records first. This automated prioritisation substantially accelerated the screening process [26,27].
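The prioritisation idea can be sketched as follows (a simplified stand-in using scikit-learn, not ASReview’s implementation; titles and labels are invented for illustration): a classifier trained on a handful of labelled seed records scores the unlabelled remainder, and reviewers screen the highest-scoring records first.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

seed_texts = ["learning analytics dashboard improves LMS user experience",
              "crop rotation effects on soil nitrogen"]
seed_labels = [1, 0]  # relevant / irrelevant, as judged by the reviewers

unlabelled = ["predicting student engagement from Moodle log data",
              "irrigation scheduling in arid climates"]

vec = TfidfVectorizer()
clf = LogisticRegression().fit(vec.fit_transform(seed_texts), seed_labels)

# Rank unlabelled records by predicted probability of relevance.
scores = clf.predict_proba(vec.transform(unlabelled))[:, 1]
for score, title in sorted(zip(scores, unlabelled), reverse=True):
    print(f"{score:.2f}  {title}")
```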
2.1.2. Stage 2: Independent Full-Text Review and Selection
Following the ASReview prioritisation, two independent reviewers conducted a full-text review using Rayyan. Each reviewer independently applied the predetermined eligibility criteria to the prioritised records.
Through this rigorous application of the eligibility criteria, 1490 records were excluded owing to a lack of peer review or non-compliance with the research objectives, leaving 70 records directly relevant to the application of LA within LMS environments to enhance UX. These 70 records were retained for the methodological rigour and validity assessment. The final step in the selection process involved applying the quality criteria.
2.2. Methodological Rigour and Validity Assessment
A thorough assessment of study quality was conducted to ensure the epistemic integrity and robustness of this systematic review. This study used the Critical Appraisal Skills Programme (CASP) checklist, a validated tool that provides an organised and systematic framework for evaluating research articles [28]. By enabling a uniform approach across all included research, the CASP checklist reduced subjective bias and promoted transparency in the selection process [28]. The checklist directed attention to four cardinal dimensions of study quality and reliability:

Conceptual clarity and theoretical foundation: The reviewers carefully considered how LA fundamentals were articulated and understood, as well as how they were applied in higher education. This required a thorough examination of the introduction and literature review sections to determine the authors’ familiarity with pertinent theoretical frameworks, current research, and the unique educational possibilities and challenges present in this field. This component addressed the construct validity of the examined literature.

Methodological soundness: Each article’s methodology sections were carefully examined. The reviewers assessed the suitability of the data analysis methods, the validity and reliability of the data collection tools (such as surveys and interviews), the researchers’ attempts to address potential sources of bias, and the appropriateness of the research design with respect to the stated research questions. Prioritising studies that demonstrated well-reasoned and methodologically sound procedures highlighted the research’s internal validity.

Sampling adequacy and generalisability: As the CASP checklist emphasises, clearly defined sampling frames are important. The reviewers evaluated how well the authors defined the target group of students, the sampling procedure they used, and the rationale behind the sample size and representativeness. Studies with clearly defined and representative sampling frames were given greater credibility, addressing the external validity and transferability of the results.

Importance and interpretive depth of results: The results and discussion sections were analysed to assess the coherence, clarity, and applicability of the findings to the study objectives and methodology. The reviewers evaluated how well the authors articulated the implications of their findings and how deeply they interpreted them. Articles that offered fresh perspectives and useful suggestions about the use of LA in HEIs were particularly noteworthy. This element concerned the research’s ecological validity and practical usefulness.
The methodical implementation of the CASP checklist facilitated a thorough assessment of each article’s strengths and weaknesses against these four quality dimensions [28]. Following the application of the pre-established quality criteria, 29 studies were deemed ineligible, resulting in the selection of 41 studies that formed the evidentiary base of this review [28].
2.3. Inter-Rater Reliability and Consistency
After the first independent screening of articles using the Rayyan and ASReview software, the reviewers worked together to settle any disagreements that surfaced throughout the selection process. This cooperative strategy guaranteed a fair and impartial assessment of each study’s applicability. Inter-rater reliability (IRR) tests were used to determine the screening process’s consistency and reduce inter-reviewer variability [29]. As a first metric, the reviewers’ percentage of agreement produced a high concordance rate of 80%. Additionally, Cohen’s kappa (κ), a more robust metric that takes chance agreement into consideration, was calculated [29,30]. The resulting kappa coefficient of 0.75 indicated substantial agreement among reviewers, further supporting the dependability of the screening procedure [30]. These quantitative measurements of IRR increased the research’s credibility by demonstrating the rigour used to guarantee a thorough and objective selection of studies for this review [29,30].
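The relationship between these two figures can be checked mechanically. The short sketch below (illustrative labels only; the actual screening decisions are not reproduced here) shows how observed agreement, chance agreement, and kappa relate, and how kappa is computed from two reviewers’ include/exclude decisions:

```python
# kappa = (p_o - p_e) / (1 - p_e); with p_o = 0.80 and kappa = 0.75,
# the implied chance agreement is p_e = 0.20.
p_o, kappa = 0.80, 0.75
p_e = (p_o - kappa) / (1 - kappa)
print(round(p_e, 2))  # 0.2

# Computing kappa directly from two reviewers' decisions (1 = include):
from sklearn.metrics import cohen_kappa_score
reviewer_a = [1, 1, 0, 0, 1, 0, 1, 0, 1, 1]
reviewer_b = [1, 0, 0, 0, 1, 0, 1, 0, 1, 1]
print(cohen_kappa_score(reviewer_a, reviewer_b))  # 0.8 for these sample labels
```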
2.4. Extraction and Synthesis of Data
To ensure adherence to best practices in systematic review methodology, data extraction was conducted in compliance with the PRISMA checklist. The researchers used CADIMA, an open-access web application, to establish a consistent data extraction procedure [31]. This promoted consistency and efficiency by making it easier to systematically collect relevant information from the chosen research [31]. Key components of the data extraction form included the following:

Bibliographic details, such as the author, the year of publication, and the place of publication.

Methodological design.

LA tools and methods used.

Significant findings about the benefits and difficulties of LA in HEIs.
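A compact sketch of this form as a typed record (our illustration; CADIMA’s internal schema is not reproduced here, and the field names simply mirror the list above):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ExtractionRecord:
    authors: str                       # bibliographic details
    year: int
    place_of_publication: str
    methodological_design: str
    la_tools_and_methods: List[str] = field(default_factory=list)
    key_findings: str = ""             # benefits and difficulties of LA in HEIs

row = ExtractionRecord(
    authors="Example et al.", year=2021,
    place_of_publication="Journal of Learning Analytics",
    methodological_design="Quantitative analysis of LMS log data",
    la_tools_and_methods=["Moodle log data", "regression"],
    key_findings="Log-derived indicators predicted engagement.")
print(row.authors, row.year)
```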
Manual data coding involved extracting and classifying crucial information, including the authors, the research design, the place of publication, and the main conclusions. Appendix A [32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72] offers a thorough audit trail of the data extraction and coding procedure.
3. Results
The subsequent section delineates the analysed results, with a specific focus on addressing the core objectives of this study. These objectives entail, firstly, the identification and evaluation of diverse LA approaches employed within LMS to enhance student UX; and secondly, the examination of challenges that hinder the successful integration of LA for the explicit purpose of UX improvement within LMS environments. As such, this analysis sought to provide a comprehensive understanding of how LA are currently being utilised to optimise the student learning experience, while also critically assessing the obstacles that impede their effective implementation. Table 1 provides a structured overview of the LA tools and techniques identified across the reviewed studies.
Table 1 demonstrates a diverse application of LA tools in LMS research. Studies commonly employ general log data analysis, statistical methods (regression, correlation), and machine learning for predictive modelling. The table also shows that visualisation dashboards aid data interpretation, while mixed-methods approaches integrate behavioural and survey data. Further trends include the use of Moodle-specific tools, advanced artificial intelligence (AI) techniques, and cross-platform data integration for holistic learner analysis.
In addition to analysing the LA techniques and tools, the reviewed studies also demonstrate a focus on specific LMS platforms. Table 2 presents a summary of these LMS platforms.
Table 2 highlights a dual focus in LMS research: a platform-centric approach, analysing specific systems like Moodle and Canvas with data-driven methods, and a broader analysis of general LMS usage and educational technology impacts. This dichotomy reveals a field concerned with both granular system analysis and the overarching theoretical and practical implications of technology integration in education, including innovative frameworks like RiPPLE.
To address the second research objective, an examination of the challenges impeding the efficacious integration of LA for UX enhancement within LMS environments was conducted. Table 3 presents the findings of this analysis.
Table 3 presents six key challenges in leveraging LA for UX improvement: ethical data handling, implementation complexity, limited generalisability, difficulty in deriving actionable insights, personalisation conflicts, and system quality issues. These challenges, collectively, hinder the effective translation of LA data into meaningful UX enhancements.
4. Discussion
Table 1 highlights the diverse methodological approaches prevalent within LA research [16], illustrating a broad adoption of various tools and techniques aimed at comprehending and improving learning environments [2,3]. A fundamental component of this methodological landscape is the widespread utilisation of general LMS log data, which offers substantial behavioural insights into student interactions within these platforms [4,6]. Although this quantitative approach provides valuable data, it is crucial to acknowledge its inherent limitations, as it may not fully encapsulate the nuanced and intricate nature of learning processes. Consequently, statistical analyses, such as regression and correlation, are frequently applied to discern relationships and generate predictions, signifying a clear inclination towards quantifying the influence of diverse factors on student learning outcomes [9,10,12].
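As a concrete illustration of this quantitative pattern, the short sketch below (synthetic data and variable names are our assumptions, not drawn from any reviewed study) correlates a single LMS log feature with a performance outcome and fits the corresponding regression:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
logins = rng.poisson(30, 100)                       # log-ins per term (synthetic)
grades = 50 + 0.8 * logins + rng.normal(0, 5, 100)  # synthetic final marks

r, p = stats.pearsonr(logins, grades)               # correlation
slope, intercept, *_ = stats.linregress(logins, grades)
print(f"r = {r:.2f} (p = {p:.3g}); grade ≈ {intercept:.1f} + {slope:.2f}·logins")
```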
The growing adoption of machine learning (ML) marks a significant transition towards more data-centric and automated methodologies within LA, facilitating sophisticated predictive modelling and pattern identification (Table 1). While ML presents considerable potential for the development of personalised interventions, it also introduces critical ethical considerations pertaining to data privacy and algorithmic bias, thereby underscoring the necessity for transparency and explainability in its application [3,73,74]. Visualisation dashboards are also playing an increasingly vital role in rendering complex data comprehensible and actionable, emphasising the importance of effective data representation for facilitating informed decision making [16]. Complementing these quantitative approaches, the integration of survey data analysis and qualitative methods, such as interviews and questionnaires, underscores the recognised value of incorporating subjective perspectives and contextual insights into LA research [33], thus moving beyond purely quantitative analyses [10,12,16]. Furthermore, as evidenced in Table 1, the specific utilisation of Moodle-centric tools highlights the significance of context-sensitive analyses and the recognition of the unique affordances offered by different LMS platforms [75]. The increasing application of advanced AI techniques, including deep learning and Bayesian networks, further demonstrates the escalating sophistication of LA methodologies [18], enabling the modelling of intricate temporal dependencies and probabilistic inferences [7,9]. Moreover, the emphasis on cross-platform data integration indicates a progressive movement towards achieving a holistic understanding of learner behaviour across disparate learning systems [18]. Taken together, these observations underscore the inherently multifaceted nature of LA research [16], advocating for the synergistic combination of quantitative and qualitative methods alongside advanced analytical techniques to attain a comprehensive understanding of student learning, while concurrently emphasising critical ethical considerations and the imperative for transparent and responsible data utilisation [38,73,74].
Table 2 delineates the diverse landscape of LMS research, revealing a dual focus encompassing both platform-specific analyses and broader examinations of LMS usage [18]. Studies centred around Moodle demonstrate a significant interest in capitalising on its platform-specific features and plugins, thereby underscoring the importance of comprehending the unique functionalities inherent in individual LMS environments [3,35]. Similarly, research investigating Canvas explores its influence on course design and delivery, highlighting a focus on the practical ramifications of implementing particular LMS systems [34,43]. The analysis of StarC, employing clickstream data, further exemplifies the increasing application of quantitative methodologies to elucidate user interaction patterns within specific LMS platforms [55]. Conversely, a substantial body of research transcends these platform-specific boundaries, examining general LMS usage [9], educational data more broadly, and underlying theoretical concepts. These studies offer a wider perspective on the impact of technology on education without restricting their analyses to a singular LMS [6,7]. The inclusion of RiPPLE, a personalised peer-learning environment, signals a growing interest in innovative pedagogical frameworks that operate within or in conjunction with traditional LMS structures [65]. Collectively, these findings suggest a multifaceted research landscape, encompassing both granular analyses of specific LMS platforms and macro-level investigations into the broader implications of technology integration within educational contexts [15,18]. This dual emphasis underscores the necessity for both specialised expertise in individual LMS platforms and a comprehensive understanding of the theoretical and practical implications of LMS usage across diverse learning environments [17].
Table 3 articulates several critical challenges that impede the effective utilisation of LA for UX enhancement. Initially, concerns surrounding data privacy and ethical considerations [3,73], including a lack of clarity regarding data usage protocols and anxieties about data sharing, significantly erode stakeholder trust and restrict data accessibility. Consequently, this hinders the development of personalised and impactful UX improvements that could otherwise be informed by comprehensive data insights [72]. Secondly, the inherent complexity and practical implementation issues [40,48], such as the steep learning curve faced by faculty and difficulties encountered in LMS integration, present substantial obstacles to user adoption and the creation of seamless, intuitive UX solutions [5,13]. Furthermore, the limited transferability and generalisability of LA models across diverse educational contexts restricts the scalability and widespread applicability of UX enhancements, potentially confining effective solutions to specific environments [41,63]. The challenges associated with data interpretation and the derivation of actionable insights further compound these issues [46,74]. These challenges are often exacerbated by inadequate visualisations and the essential yet time-consuming necessity for qualitative assessments, leading to inefficient LA utilisation and impeding the translation of raw data into meaningful UX improvements that demonstrably enhance learning outcomes [2,4,5]. Navigating the inherent complexities of user preferences and personalisation [48], particularly in striking a balance between individualised customisation and existing data constraints alongside diverse user desires, poses a significant design challenge for effective UX implementation [5,13]. Moreover, technology and system quality issues, encompassing aspects such as system stability, data accuracy, and interoperability across different platforms, can erode user confidence and compromise the reliability of the information intended to inform UX improvements [49,52]. Collectively, these multifaceted challenges underscore the imperative for the adoption of rigorous, ethically sound, and contextually aware approaches to LA implementation. Such approaches are crucial to effectively realise the inherent potential of LA in meaningfully enhancing learning experiences and ensuring responsible innovation in educational technology [73,74].
This study, while providing a comprehensive overview of LA tools and challenges within LMS environments, is inherently limited by its reliance on the existing literature, which may introduce biases based on study selection and database limitations. The generalisability of findings could be constrained by the diverse contexts and populations represented in the reviewed studies, and a potential overemphasis on technical aspects might overlook crucial pedagogical considerations. The subjective nature of data interpretation and the absence of original empirical data further contribute to potential limitations.
5. Conclusions
This study examined the application of LA within LMS to enhance UX, revealing a diverse methodological landscape encompassing both quantitative and qualitative approaches, including log data analysis, statistical modelling, machine learning, and visualisation. It also highlighted a dual research focus on platform-specific analyses and broader LMS usage, demonstrating the field’s concern for both granular systems understanding and the wider impact of educational technology. However, significant challenges, including ethical concerns, implementation complexities, limited generalisability, data interpretation difficulties, personalisation conflicts, and system quality issues, hinder the effective integration of LA for UX improvement, necessitating rigorous, ethical, and contextually aware implementation strategies to realise its full potential. Given the limitations inherent in the study, future research should prioritise empirical investigations to validate findings in real-world settings, address ethical concerns related to data privacy and algorithmic bias, develop practical implementation strategies, and explore personalised UX solutions. Further studies should also focus on enhancing the generalisability of LA models, integrating qualitative insights, conducting longitudinal studies, and investigating the integration of emerging technologies and the development of interoperability standards.
P.N. and M.M.N. jointly conceived the study and developed its methodological framework. P.N. and M.M.N. were responsible for software development, undertaking formal analysis, investigation, and meticulous data curation, and also crafted the initial manuscript and created visual representations to illustrate key findings. Also, both P.N. and M.M.N. collaborated to validate the results, ensuring the study’s conclusions were rigorously tested and substantiated. P.N. successfully secured essential resources and funding, providing overarching supervision for the project’s duration, and reviewed and edited the manuscript to refine its content and clarity. The final manuscript received the approval of all authors, confirming their collective endorsement of the research outcomes and conclusions. All authors have read and agreed to the published version of the manuscript.
Not applicable.
Not applicable.
No new data were created or analysed in this study.
The authors acknowledge two postdoctoral fellows from UNISA for their support in checking the data coding.
The authors declare no conflicts of interest.
The following abbreviations are used in this manuscript:
AI | Artificial Intelligence |
CASP | Critical Appraisal Skills Programme |
DA | Data Analytics |
HEIs | Higher Education Institutions |
IRR | Inter-Rater Reliability |
LA | Learning Analytics |
LMS | Learning Management Systems |
ML | Machine Learning |
OSF | Open Science Framework |
PICO | Population, Intervention, Comparison, Outcome |
PRISMA | Preferred Reporting Items for Systematic Reviews and Meta-Analyses |
UX | User Experience |
Footnotes
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Figure 1 Flowchart and description of the literature search and study selection protocol (adapted from the PRISMA 2020 statement, Page et al., 2021 [21]).
Table 1. Learning analytics tools and techniques used.
Tool/Technique Category | Key Observations | Studies |
---|---|---|
LMS Log Data Analysis (General) | Core of many studies; fundamental data for LA. | [ |
Statistical Analysis (Regression, Correlation, etc.) | Used to identify relationships and generate predictions. | [
Survey Data Analysis | Combines behavioural data with self-reported experiences. | [ |
Machine Learning (ML) | Increasingly used for predictive modelling and pattern recognition. | [ |
Deep Learning (Long Short-Term Memory networks, etc.) | Demonstrates the application of advanced AI techniques. | [
Qualitative Analysis (Interviews, Questionnaires) | Provides context and deeper insights into user experiences. | [ |
Visualisation Dashboards | Aids in data interpretation and decision making. | [ |
Moodle Specific Tools | Highlights the use of specific LMS features for analytics. | [ |
Cross-Platform Data Integration | Focuses on combining data from different learning systems. | [ |
Bayesian Networks | Utilises probabilistic models for learning status inference. | [ |
Table 2. Focus on Learning Management System specificity.
LMS Platform/Category | Key Research Themes/Observations | Studies |
---|---|---|
General LMS Usage/Data | Studies examining broad LMS usage patterns, general educational data, or theoretical concepts without specifying a particular LMS platform. | [ |
Moodle | Investigations into the utilisation of specific Moodle features and plugins, emphasising platform-specific functionality. | [ |
Canvas | Analyses of Canvas functionalities and the impact of Canvas system implementations on course design and delivery. | [ |
StarC | Focused analysis of clickstream data generated within the StarC LMS, highlighting user interaction patterns. | [ |
RiPPLE | Application-focused study of RiPPLE, a personalised peer-learning environment, emphasising its unique pedagogical implementation. | [ |
Table 3. Challenges hindering successful LA integration for LMS UX improvement.
Challenge Category | Specific Challenges Identified | Impact on UX Improvement | Studies |
---|---|---|---|
Complexity and Implementation Issues | Learning curve for faculty; LMS integration difficulties; complexities of multimodal learning analytics (MMLA) systems and in implementing complex adaptive systems. | Creates barriers to user adoption and hinders the development of intuitive and seamless UX solutions. | [ |
Data Interpretation and Actionable Insights | Difficulty in deriving actionable insights from data; lack of effective visualisations; need for qualitative assessments and for translating data into applicable improvements. | Leads to inefficient use of LA; reduces the ability to transform data into meaningful UX enhancements that positively impact learning outcomes. | [
Data Privacy and Ethical Concerns | Lack of clarity on data usage; concerns about data sharing, ensuring compliance with ethical guidelines, as well as maintaining transparency, data accuracy, and privacy. | Limits stakeholder trust in LA systems; inhibits data sharing, thus impacting the ability to create personalised and effective UX improvements. | [ |
Lack of Transferability and Generalisability | Models not readily transferable between courses, variability in course design affecting learning patterns and context-specific results. | Restricts scalability and limits the creation of generalised UX improvements that benefit diverse user populations. | [ |
User Preferences and Personalisation | Balancing user desires for personalised dashboards with concerns about data sharing, defining desired engagement and system feature requests, and content curation vs. monitoring. | Challenges in designing personalised UX solutions that resonate with diverse user preferences and address the complexities around individual customisation within data constraints. | [ |
Technology and System Quality | System output and quality issues regarding integration and interoperability between LMSs, data accuracy, and system stability. | Impacts user confidence in the systems, and reduces reliable information used to provide UX improvements. | [ |
Supplementary Materials
The following supporting information can be downloaded at:
Appendix A
Author(s) | Research Title | Research Design | Place of Publication | Learning Analytics Tools Used | Major Findings |
[ | Using log variables in a learning management system to evaluate learning activity using the lens of activity theory | Quantitative analysis of LMS log data | Assessment and Evaluation in Higher Education | Moodle log data analysis, statistical analysis | Low overall LMS usage; significant variation in activity patterns across courses and colleges. Contradictions within the activity system hinder effective LMS use. |
[ | Towards actionable learning analytics using dispositions | Quantitative analysis, incorporating self-reported data | IEEE Transactions on Learning Technologies | Demographic, trace (LMS), and self-reported data analysis | Incorporation of dispositional data (e.g., procrastination, boredom) into LA models enhances understanding of student behaviour and enables more actionable interventions. |
[ | Predicting time-management skills from learning analytics | Quantitative (linear and multilevel regression) | Journal of Computer-Assisted Learning | Canvas LMS trace data, questionnaire data | LMS trace data can predict self-reported time-management skills, but models are not readily transferable between courses. Further research is needed to improve portability. |
[ | Evaluation of usability in Moodle learning management system through analytics graphs: University of Applied Sciences teacher’s perspective in Finland | Quantitative analysis of LMS log data | International Journal of Education and Development using Information and Communication Technology | Moodle log data, analytics graphs plugin | Analytics graphs in Moodle provide insights into student activity and enable identification of student profiles. This aids teachers and management in tracking and improving student performance. |
[ | Analytics-informed design: Exploring visualisation of learning management systems recorded data for learning design | Educational design research, qualitative interviews, dashboard pilot evaluation | SAGE Open | Visualisation dashboard development, LMS data visualisation | Educational design research can effectively develop user-friendly LA visualisation dashboards to support data-informed learning design. Preliminary design principles were identified. |
[ | Using analytics to predict students’ interactions with learning management systems in online courses | Quantitative, Multiple Linear Regression (MLR) and Decision Tree (DT) | Education and Information Technologies | LMS analytics and log data analysis | MLR and DT models can effectively predict learner–LMS interactions. Key predictors include submission, content access, and assessment access metrics. |
[ | Learning analytics and data ethics in performance data management: A benchlearning exercise involving six European universities | Qualitative, benchlearning exercise | Quality in Higher Education | Analysis of institutional data management models, ethical review | Learning analytics are present in European universities but are primarily based on traditional data. Ethical risks are generally covered by regulations. Learning analytics offers opportunities for improved data and quality management. |
[ | Pre-class learning analytics in flipped classroom: Focusing on resource management strategy, procrastination, and repetitive learning | Quantitative, log data, survey, and exam data analysis. | Journal of Computer-Assisted Learning | Learning analytics of pre-class video viewing data, statistical analysis | Resource management strategies (time, study environment) significantly influence pre-class video engagement and learning achievement in flipped classrooms. Procrastination significantly decreases video engagement. |
[ | From data to action: Faculty experiences with a university-designed learning analytics system | Qualitative case study, surveys, and focus groups | International Journal on E-Learning | University-designed LA system, cloud-based data collection, dashboards, alert emails | Faculty will use LA to make data-driven changes to teaching, including feedback and communication. Implementation challenges include learning curves and LMS integration issues. Ongoing training and clear policies are needed. |
[ | The mediating role of learning analytics: Insights into student approaches to learning and academic achievement in Latin America | Quantitative, analysis of LMS trace data | Journal of Learning Analytics | LMS trace data analysis, statistical analysis | Most LA indicators do not mediate the effect between learning approaches and performance, but fine-grained indicators can. Organised learning approaches are effective in Latin American higher education. |
[ | Using learning analytics and student perceptions to explore student interactions in an online construction management course | Case study, learning analytics, surveys | Journal of Civil Engineering Education | Canvas LMS analytics, survey data | Student interactions with course materials decreased after the midterm. Students found lecture videos and slides most helpful. LA can inform course design. |
[ | Student device usage, learning management system tool usage, and self-regulated learning | Non-intervention descriptive research design, LMS data logs, surveys | ProQuest Dissertations and Theses Global (University of Nevada, Las Vegas) | LMS data logs, Online Self-Regulated Learning Questionnaire (OSLQ) | Device usage varies; low-performing students report similar or higher SRL but use LMS tools less. SRL instruction and tool/device effectiveness are crucial. |
[ | Leveraging complexity science to promote learning analytics adoption in higher education: An embedded case study | Embedded case study | ProQuest Dissertations and Theses Global (University of Maryland, College Park) | Analysis of learning analytics practices, application of CAS framework | Learning analytics implementation requires consideration of higher education institutions as complex adaptive systems. Emergent, ground-up approaches are more effective than top-down. |
[ | Beyond learning management systems: Teaching digital fluency | Pedagogical reflections, qualitative analysis | Journal of Political Science Education | Analysis of pedagogical approaches, platforms beyond LMS | Teaching digital fluency requires platforms beyond LMS. Innovative assignments improve digital skills and content retention for Generation Z learners. |
[ | Increasing student engagement with course content in graduate public health education: A pilot randomised trial of behavioral nudges | Pilot randomised controlled trial | Education and Information Technologies | LMS data analysis, behavioural nudges | Behavioural nudges based on LA did not significantly change student engagement. Future work should focus on qualitative assessment of motivations and richer analysis of learning behaviours. |
[ | Learning management systems for higher education: A brief comparison | Comparative analysis | Discover Education | Evaluation criteria based on SQTL (Software Quality and Teaching–Learning tools) | Paradiso and Moodle are the top-rated LMS based on SQTL criteria, with high scores in interoperability, accessibility, and learning tools. |
[ | Learners’ needs in online learning environments and third-generation learning management systems (LMS 3.0) | Qualitative, open-ended questionnaire, semi-structured interviews | Technology, Knowledge and Learning | Content analysis of questionnaire and interview data | Learners desire entertaining, self-monitoring LMS environments with gamification. Needs align with LMS 3.0, which can be developed using data mining and LA. |
[ | Examining learning management system success: A multiperspective framework | Quantitative, survey, structural equation modelling (SEM) | Education and Information Technologies | TAM3 and ISS framework, SEM analysis | LMS success depends on content, system, and output quality, leading to student satisfaction and perceived usefulness. User satisfaction negatively impacts system and output quality. |
[ | LearnSphere: A learning data and analytics cyberinfrastructure | Use-driven design, case studies | Journal of Educational Data Mining | LearnSphere, Tigris | LearnSphere facilitates discoveries about learning (active learning vs. passive, discussion board quality), supports research reproducibility, and enables workflow combinations for analytics. |
[ | Pragmatic monitoring of learning recession using log data of learning management systems and machine learning techniques | Analysis of system log data, machine learning application | International Journal of Education and Development using Information and Communication Technology | Machine learning techniques (unspecified) applied to LMS log data. | System log data can be used for machine learning-based monitoring of students’ learning recession. Proposed indicators and visualisations for proactive intervention. |
[ | Evidence-based multimodal learning analytics for feedback and reflection in collaborative learning | Two-year longitudinal study, survey, Evaluation Framework for Learning Analytics | British Journal of Educational Technology | Multimodal Learning Analytics (MMLA) system | MMLA solution enhances feedback and reflection in collaborative learning. Positive perceptions from teachers and students, but complexity and qualitative measures need improvement. Importance of data accuracy, transparency, and privacy. |
[ | Students’ use of learning management systems and desired e-learning experiences: Are they ready for next generation digital learning environments? | Survey | Higher Education Research and Development | Analysis of LMS usage data | Students’ LMS usage for content and discussion correlates with desired engagement in student-centred e-learning. Students desire systems supporting content curation, group management, and mobile interoperability. |
[ | Architecting analytics across multiple e-learning systems to enhance learning design | Cross-platform architecture development, regression and classification techniques | IEEE Transactions on Learning Technologies | Cross-platform architecture for integrating data from multiple e-learning systems. | Combining data across multiple e-learning systems improves classification accuracy. Cross-platform analytics provides broader insights into learner behaviour. |
[ | Utilizing clickstream data to reveal the time management of self-regulated learning in a higher education online learning environment | Analysis of clickstream data, learning analytics | Interactive Learning Environments | StarC system log analysis | Clickstream data reveals time management aspects of self-regulated learning (SRL). Differences in time management among students with varying academic performance. |
[ | Effectiveness of machine learning algorithms on predicting course level outcomes from learning management system data | Quantitative comparative study, machine learning algorithm comparison | ProQuest Dissertations and Theses Global (Doctoral Dissertation) | Naive Bayes, decision tree, neural network, support vector machine | Decision tree effectively predicts students with poor course outcomes using LMS data. Decision trees outperformed other algorithms. |
[ | Detecting learning strategies with analytics: Links with self-reported measures and academic performance | Analysis of trace data, correlation with self-reported measures | Journal of Learning Analytics | Analysis of trace data from a flipped classroom environment. | Learning strategies extracted from trace data correlate with deep and surface approaches to learning. Deep approach to learning correlates with higher academic performance. |
[ | Individual differences related to college students’ course performance in Calculus II | Dominance analysis, correlation analysis | Journal of Learning Analytics | Analysis of LMS data, discussion forum data, quiz attempts. | Math importance, approximate number system (ANS) ability, discussion forum posting, and workshop submission time are significant predictors of final grades in Calculus II. |
[ | How flexible is your data? A comparative analysis of scoring methodologies across learning platforms in the context of group differentiation | Comparative analysis of scoring methodologies, resampling approach | Journal of Learning Analytics | Analysis of ASSISTments and Cognitive Tutor data. | Partial credit scoring offers more efficient group differentiation than binary accuracy measures in learning platforms. Partial credit increases analytic power. |
[ | Designing Analytics for Collaboration Literacy and Student Empowerment | Survey | Journal of Learning Analytics | BLINC (collaborative analytics tool) | Student collaboration concerns fall into seven dimensions: Climate, Compatibility, Communication, Conflict, Context, Contribution, and Constructive. These dimensions should inform collaboration analytics design. |
[ | A Novel Deep Learning Model for Student Performance Prediction Using Engagement Data | Deep learning model development and evaluation | Journal of Learning Analytics | ASIST (Attention-aware convolutional Stacked BiLSTM network) | ASIST, a deep learning model, predicts student performance using engagement data from VLEs. It outperforms baseline models. |
[ | Utilizing Student Time Series Behaviour in Learning Management Systems for Early Prediction of Course Performance | Deep learning approach, comparison with machine learning classifiers | Journal of Learning Analytics | LSTM (Long Short-Term Memory) networks | LSTM networks effectively predict course performance using LMS time series data, outperforming traditional machine learning classifiers. |
[ | The Positive Impact of Deliberate Writing Course Design on Student Learning Experience and Performance | Analysis of LMS data, correlation with course design decisions | Journal of Learning Analytics | Analysis of LMS usage data | Course design influences learner interaction patterns. Discussion entry length predicts final grades, highlighting the impact of writing practice. |
[ | Privacy-driven Design of Learning Analytics Applications—Exploring the Design Space of Solutions for Data Sharing and Interoperability | Conceptual model development | Journal of Learning Analytics | Learning Analytics Design Space model | Privacy-driven design is crucial for learning analytics systems. The Learning Analytics Design Space model aids in designing privacy-conscious solutions. |
[ | RiPPLE: A Crowdsourced Adaptive Platform for Recommendation of Learning Activities | Platform development and pilot study | Journal of Learning Analytics | RiPPLE (Recommendation in Personalised Peer-Learning Environments) | RiPPLE, a crowdsourced adaptive platform, recommends personalised learning activities and shows measurable learning gains. |
[ | Leveraging learning analytics for student reflection and course evaluation | Faculty utilisation of learning analytics, curriculum evaluation | Journal of Applied Research in Higher Education | Learning analytics tools within LMS | Learning analytics enables student reflection, remediation, and curriculum evaluation, providing detailed data for stakeholders. |
[ | Applying learning analytics for the early prediction of students’ academic performance in blended learning | Predictive modelling, principal component regression | Educational Technology and Society | Analysis of LMS data, video-viewing, practice behaviours, homework, quizzes | Learning analytics predicts student performance in blended learning. Online and traditional factors contribute to prediction accuracy. |
[ | Learning management system and course influences on student actions and learning experiences | Comparative study of LMS and course influences | Educational Technology Research and Development | Analysis of LMS usage data | Course type and LMS design influence student actions and experiences. Discussion-focused systems increase perceived learning support. |
[ | Toward Precision Education: Educational Data Mining and Learning Analytics for Identifying Students’ Learning Patterns with Ebook Systems | Clustering approach, analysis of ebook system data | Educational Technology and Society | Analysis of ebook system data | Clustering identifies subgroups of students with different learning patterns. Learning patterns correlate with learning outcomes. |
[ | A Bayesian Classification Network-based Learning Status Management System in an Intelligent Classroom | System development and experiment | Educational Technology and Society | Bayesian classification network, sensor technology, image recognition | Learning status management system using sensors and image recognition. Bayesian network infers student learning status, with feedback to teachers and students. |
[ | Student perceptions of privacy principles for learning analytics | Exploratory study, survey | Educational Technology Research and Development | Analysis of student perceptions | Students desire adaptive, personalised dashboards in learning analytics systems but are conservative about data sharing. Stakeholder involvement is crucial for successful implementation. |
[ | Fostering evidence-based education with learning analytics: Capturing teaching-learning cases from log data | System development, case studies | Educational Technology and Society | Learning analytics framework, statistical modelling of learning logs | Automated capture of teaching–learning cases (TLCs) using learning analytics. Statistical modelling identifies intervention effectiveness. |
1. Marks, A.; Al-Ali, M. Analytics within UAE higher education context. Proceedings of the 2016 3rd MEC International Conference on Big Data and Smart City (ICBDSC); Muscat, Oman, 15–16 March 2016; pp. 1-6. [DOI: https://dx.doi.org/10.1109/ICBDSC.2016.7460396]
2. Saleh, A.M.; Abuaddous, H.Y.; Alansari, I.S.; Enaizan, O. The Evaluation of User Experience on Learning Management Systems Using UEQ. Int. J. Emerg. Technol. Learn.; 2022; 17, pp. 145-162. [DOI: https://dx.doi.org/10.3991/ijet.v17i07.29525]
3. Mohd Kasim, N.N.; Khalid, F. Choosing the Right Learning Management System (LMS) for the Higher Education Institution Context: A Systematic Review. Int. J. Emerg. Technol. Learn.; 2016; 11, pp. 55-61. [DOI: https://dx.doi.org/10.3991/ijet.v11i06.5644]
4. de Kock, E.; van Biljon, J.; Botha, A. User Experience of Academic Staff in the Use of a Learning Management System Tool. Proceedings of the SAICSIT ‘16: Proceedings of the Annual Conference of the South African Institute of Computer Scientists and Information Technologists; Johannesburg, South Africa, 26–28 September 2016; pp. 1-10. [DOI: https://dx.doi.org/10.1145/2987491.2987514]
5. Maslov, I.; Nikou, S.; Hansen, P. Exploring user experience of learning management system. Int. J. Inf. Learn. Technol.; 2021; 38, pp. 344-363. [DOI: https://dx.doi.org/10.1108/IJILT-03-2021-0046]
6. Arqoub, M.A.; El-Khalili, N.; Hasan, M.A.-S.; Banna, A.A. Extending Learning Management System for Learning Analytics. Proceedings of the 2022 International Conference on Business Analytics for Technology and Security (ICBATS); Dubai, United Arab Emirates, 16–17 February 2022; pp. 1-6. [DOI: https://dx.doi.org/10.1109/ICBATS54253.2022.9759070]
7. Hernández-Leo, D.; Martinez-Maldonado, R.; Pardo, A.; Muñoz-Cristóbal, J.A.; Rodríguez-Triana, M.J. Analytics for learning design: A layered framework and tools. Br. J. Educ. Technol.; 2019; 50, pp. 139-152. [DOI: https://dx.doi.org/10.1111/bjet.12645]
8. Society for Learning Analytics Research. What Is Learning Analytics?. 2024; Available online: https://www.solaresearch.org/about/what-is-learning-analytics/ (accessed on 19 November 2024).
9. Ismail, S.N.; Hamid, S.; Ahmad, M.; Alaboudi, A.; Jhanjhi, N. Exploring Students Engagement Towards the Learning Management System (LMS) Using Learning Analytics. Comput. Syst. Sci. Eng.; 2021; 37, pp. 73-87. [DOI: https://dx.doi.org/10.32604/csse.2021.015261]
10. Viberg, O.; Hatakka, M.; Bälter, O.; Mavroudi, A. The current landscape of learning analytics in higher education. Comput. Hum. Behav.; 2018; 89, pp. 98-110. [DOI: https://dx.doi.org/10.1016/j.chb.2018.07.027]
11. Goode, C.; Terry, A.; Harlow, H.; Cash, R. Mining for Gold: Learning Analytics and Design for Learning: A Review. Scope Teach. Learn.; 2021; 10. [DOI: https://dx.doi.org/10.34074/scop.4010004]
12. Tzimas, D.; Demetriadis, S.N. The Impact of Learning Analytics on Student Performance and Satisfaction in a Higher Education Course. Proceedings of the International Conference on Educational Data Mining (EDM 2021); Paris, France, 29 June–2 July 2021; Available online: https://api.semanticscholar.org/CorpusID:247321827 (accessed on 18 November 2024).
13. Mkpojiogu, E.O.C.; Okeke-Uzodike, O.E.; Emmanuel, E.I. Quality Attributes for an LMS Cognitive Model for User Experience Design and Evaluation of Learning Management Systems. Proceedings of the 3rd International Conference on Integrated Intelligent Computing Communication & Security (ICIIC 2021); Bangalore, India, 6–7 August 2021; Atlantis Highlights in Computer Sciences; Atlantis Press: Dordrecht, The Netherlands, 2021; Volume 4.
14. Almusharraf, A.I. An Investigation of University Students’ Perceptions of Learning Management Systems: Insights for Enhancing Usability and Engagement. Sustainability; 2024; 16, 10037. [DOI: https://dx.doi.org/10.3390/su162210037]
15. Maluleke, A.F. Enhancing Learning Analytics through Learning Management Systems Engagement in African Higher Education. J. Educ. Learn. Technol.; 2024; 5, pp. 130-149. [DOI: https://dx.doi.org/10.38159/jelt.2024565]
16. Ncube, M.M.; Ngulube, P. Optimising Data Analytics to Enhance Postgraduate Student Academic Achievement: A Systematic Review. Educ. Sci.; 2024; 14, 1263. [DOI: https://dx.doi.org/10.3390/educsci14111263]
17. El Alfy, S.; Marx Gómez, J.; Dani, A. Exploring the benefits and challenges of learning analytics in higher education institutions: A systematic literature review. Inf. Discov. Deliv.; 2019; 47, pp. 25-34. [DOI: https://dx.doi.org/10.1108/IDD-06-2018-0018]
18. Samuelsen, J.; Chen, W.; Wasson, B. Integrating multiple data sources for learning analytics—Review of literature. Res. Pract. Technol. Enhanc. Learn.; 2019; 14, 11. [DOI: https://dx.doi.org/10.1186/s41039-019-0105-4]
19. Pan, Z.; Biegley, L.; Taylor, A.; Zheng, H. A Systematic Review of Learning Analytics: Incorporated Instructional Interventions on Learning Management Systems. J. Learn. Anal.; 2024; 11, pp. 52-72. [DOI: https://dx.doi.org/10.18608/jla.2023.8093]
20. Adeniran, I.A.; Efunniyi, C.P.; Osundare, O.S.; Abhulimen, A.O. Integrating data analytics in academic institutions: Enhancing research productivity and institutional efficiency. Int. J. Sch. Res. Multidiscip. Stud.; 2024; 5, pp. 77-87. [DOI: https://dx.doi.org/10.56781/ijsrms.2024.5.1.0041]
21. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E. et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ; 2021; 372, n71. [DOI: https://dx.doi.org/10.1136/bmj.n71]
22. Pieper, D.; Rombey, T. Where to prospectively register a systematic review. Syst. Rev.; 2022; 11, 8. [DOI: https://dx.doi.org/10.1186/s13643-021-01877-1]
23. MacFarlane, A.; Russell-Rose, T.; Shokraneh, F. Search Strategy Formulation for Systematic Reviews: Issues, Challenges and Opportunities. Intell. Syst. Appl.; 2022; 15, 200091. [DOI: https://dx.doi.org/10.1016/j.iswa.2022.200091]
24. Methley, A.M.; Campbell, S.; Chew-Graham, C.; McNally, R.; Cheraghi-Sohi, S. PICO, PICOS and SPIDER: A Comparison Study of Specificity and Sensitivity in Three Search Tools for Qualitative Systematic Reviews. BMC Health Serv. Res.; 2014; 14, 579. [DOI: https://dx.doi.org/10.1186/s12913-014-0579-0]
25. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E. et al. PRISMA 2020 explanation and elaboration: Updated guidance and exemplars for reporting systematic reviews. BMJ; 2021; 372, n160. [DOI: https://dx.doi.org/10.1136/bmj.n160]
26. Rayyan. Faster Systematic Reviews. 2024; Available online: https://www.rayyan.ai/ (accessed on 7 December 2024).
27. ASReview. Join the Movement Towards Fast, Open, and Transparent Systematic Reviews. 2024; Available online: https://asreview.nl/ (accessed on 9 December 2024).
28. Critical Appraisal Skills Programme (CASP). CASP Checklists. 2024; Available online: https://casp-uk.net/casp-tools-checklists/ (accessed on 13 December 2024).
29. Li, M.; Gao, Q.; Yu, T. Kappa Statistic Considerations in Evaluating Inter-Rater Reliability between Two Raters: Which, When and Context Matters. BMC Cancer; 2023; 23, 799. [DOI: https://dx.doi.org/10.1186/s12885-023-11325-z]
30. Mandrekar, J.N. Measures of Interrater Agreement. J. Thorac. Oncol.; 2011; 6, pp. 6-7. [DOI: https://dx.doi.org/10.1097/JTO.0b013e318200f983] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/21178713]
31. CADIMA. Evidence Synthesis Tool and Database. 2025; Available online: https://www.cadima.info/ (accessed on 17 January 2025).
32. Park, Y.; Jo, I.-H. Using log variables in a learning management system to evaluate learning activity using the lens of activity theory. Assess. Eval. High. Educ.; 2017; 42, pp. 531-547. [DOI: https://dx.doi.org/10.1080/02602938.2016.1158236]
33. Tempelaar, D.T.; Rienties, B.; Nguyen, Q. Towards actionable learning analytics using dispositions. IEEE Trans. Learn. Technol.; 2017; 10, pp. 6-17. [DOI: https://dx.doi.org/10.1109/TLT.2017.2662679]
34. Sluijs, M.; Matzat, U. Predicting time-management skills from learning analytics. J. Comput. Assist. Learn.; 2024; 40, pp. 525-537. [DOI: https://dx.doi.org/10.1111/jcal.12893]
35. Olaleye, S.; Agjei, R.; Jimoh, B.; Adoma, P. Evaluation of usability in Moodle learning management system through analytics graphs: University of Applied Sciences teacher’s perspective in Finland. Int. J. Educ. Dev. Using Inf. Commun. Technol.; 2023; 19, pp. 85-107. Available online: http://files.eric.ed.gov/fulltext/EJ1413526.pdf (accessed on 11 October 2024).
36. Liu, Q.; Gladman, T.; Muir, J.; Wang, C.; Grainger, R. Analytics-informed design: Exploring visualization of learning management systems recorded data for learning design. SAGE Open; 2023; 13, pp. 1-10. [DOI: https://dx.doi.org/10.1177/21582440231193590]
37. Alshammari, A. Using analytics to predict students’ interactions with learning management systems in online courses. Educ. Inf. Technol.; 2024; 29, pp. 20587-20612. [DOI: https://dx.doi.org/10.1007/s10639-024-12709-9]
38. Rosa, M.J.; Williams, J.; Claeys, J.; Kane, D.; Bruckmann, S.; Costa, D.; Rafael, J.A. Learning analytics and data ethics in performance data management: A bench learning exercise involving six European universities. Qual. High. Educ.; 2022; 28, pp. 65-81. [DOI: https://dx.doi.org/10.1080/13538322.2021.1951455]
39. Doo, M.Y.; Park, Y. Pre-class learning analytics in flipped classroom: Focusing on resource management strategy, procrastination and repetitive learning. J. Comput. Assist. Learn.; 2024; advance online publication [DOI: https://dx.doi.org/10.1111/jcal.12946]
40. Fuller, J.; Lokey-Vega, A. From data to action: Faculty experiences with a university-designed learning analytics system. Int. J. E-Learn.; 2024; 23, pp. 471-487. Available online: https://www.learntechlib.org/primary/p/225169/ (accessed on 11 November 2024). [DOI: https://dx.doi.org/10.70725/922708xxcbru]
41. Villalobos, E.; Hilliger, I.; Gonzalez, C.; Celis, S.; Pérez-Sanagustín, M.; Broisin, J. The mediating role of learning analytics: Insights into student approaches to learning and academic achievement in Latin America. J. Learn. Anal.; 2024; 11, pp. 6-20. Available online: http://files.eric.ed.gov/fulltext/EJ1423426.pdf (accessed on 11 November 2024). [DOI: https://dx.doi.org/10.18608/jla.2024.8149]
42. West, P.; Paige, F.; Lee, W.; Watts, N.; Scales, G. Using learning analytics and student perceptions to explore student interactions in an online construction management course. J. Civ. Eng. Educ.; 2022; 148, 05022001. [DOI: https://dx.doi.org/10.1061/(ASCE)EI.2643-9115.0000066]
43. Webb, N.L. Student Device Usage, Learning Management System Tool Usage, and Self-Regulated Learning (Publication No. 30310232). Doctoral Dissertation; University of Nevada, Las Vegas: Las Vegas, NV, USA, 2023; ProQuest Dissertations & Theses Global. Available online: http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqm&rft_dat=xri:pqdiss:29996357 (accessed on 11 November 2024).
44. Moses, P.S. Leveraging Complexity Science to Promote Learning Analytics Adoption in Higher Education: An Embedded Case Study (Publication No. 30814981). Doctoral Dissertation; University of Maryland: College Park, MD, USA, 2023; ProQuest Dissertations & Theses Global. Available online: https://www.proquest.com/docview/3113526225 (accessed on 10 November 2024).
45. Le, D.; Pole, A. Beyond learning management systems: Teaching digital fluency. J. Polit. Sci. Educ.; 2023; 19, pp. 134-153. [DOI: https://dx.doi.org/10.1080/15512169.2022.2139268]
46. Garbers, S.; Crinklaw, A.D.; Brown, A.S.; Russell, R. Increasing student engagement with course content in graduate public health education: A pilot randomized trial of behavioral nudges. Educ. Inf. Technol.; 2023; 28, pp. 13405-13421. [DOI: https://dx.doi.org/10.1007/s10639-023-11709-5]
47. Sanchez, L.; Penarreta, J.; Poma, X.S. Learning management systems for higher education: A brief comparison. Discov. Educ.; 2024; 3, 58. [DOI: https://dx.doi.org/10.1007/s44217-024-00143-5]
48. Sahin, M.; Yurdugül, H. Learners’ needs in online learning environments and third generation learning management systems (LMS 3.0). Technol. Knowl. Learn.; 2022; 27, pp. 33-48. [DOI: https://dx.doi.org/10.1007/s10758-020-09479-x]
49. Becirovic, S. Examining learning management system success: A multiperspective framework. Educ. Inf. Technol.; 2024; 29, pp. 11675-11699. [DOI: https://dx.doi.org/10.1007/s10639-023-12308-0]
50. Stamper, J.; Moore, S.; Rosé, C.P.; Pavlik, P.I., Jr.; Koedinger, K. LearnSphere: A learning data and analytics cyberinfrastructure. J. Educ. Data Min.; 2024; 16, pp. 141-163.
51. Kalegele, K. Pragmatic monitoring of learning recession using log data of learning management systems and machine learning techniques. Int. J. Educ. Dev. Using Inf. Commun. Technol.; 2023; 19, pp. 177-190.
52. Yan, L.; Echeverria, V.; Jin, Y.; Fernandez-Nieto, G.; Zhao, L.; Li, X.; Alfredo, R.; Swiecki, Z.; Gašević, D.; Martinez-Maldonado, R. Evidence-based multimodal learning analytics for feedback and reflection in collaborative learning. Br. J. Educ. Technol.; 2024; 55, pp. 1900-1925. [DOI: https://dx.doi.org/10.1111/bjet.13498]
53. Koh, J.H.L.; Kan, R.Y.P. Students’ use of learning management systems and desired e-learning experiences: Are they ready for next generation digital learning environments?. High. Educ. Res. Dev.; 2021; 40, pp. 995-1010. [DOI: https://dx.doi.org/10.1080/07294360.2020.1799949]
54. Mangaroska, K.; Vesin, B.; Kostakos, V.; Brusilovsky, P.; Giannakos, M.N. Architecting analytics across multiple e-learning systems to enhance learning design. IEEE Trans. Learn. Technol.; 2021; 14, pp. 173-188. [DOI: https://dx.doi.org/10.1109/TLT.2021.3072159]
55. Cao, T.; Zhang, Z.; Chen, W.; Shu, J. Utilizing clickstream data to reveal the time management of self-regulated learning in a higher education online learning environment. Interact. Learn. Environ.; 2023; 31, pp. 6555-6572. [DOI: https://dx.doi.org/10.1080/10494820.2022.2042031]
56. Ashby, M.W. Effectiveness of Machine Learning Algorithms on Predicting Course Level Outcomes from Learning Management System Data (Publication No. 30182602). Doctoral Dissertation; National University: San Diego, CA, USA, 2022; ProQuest Dissertations & Theses Global. Available online: https://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqm&rft_dat=xri:pqdiss:31241309 (accessed on 18 November 2024).
57. Gašević, D.; Jovanović, J.; Pardo, A.; Dawson, S. Detecting learning strategies with analytics: Links with self-reported measures and academic performance. J. Learn. Anal.; 2017; 4, pp. 10-27. [DOI: https://dx.doi.org/10.18608/jla.2017.42.10]
58. Hart, S.; Daucourt, M.; Ganley, C. Individual differences related to college students’ course performance in Calculus II. J. Learn. Anal.; 2017; 4, pp. 28-44. [DOI: https://dx.doi.org/10.18608/jla.2017.42.11]
59. Ostrow, K.S.; Wang, Y.; Heffernan, N.T. How flexible is your data? A comparative analysis of scoring methodologies across learning platforms in the context of group differentiation. J. Learn. Anal.; 2017; 4, pp. 1-9. [DOI: https://dx.doi.org/10.18608/jla.2017.42.9]
60. Worsley, M.; Anderson, K.; Melo, N.; Jang, J.Y. Designing Analytics for Collaboration Literacy and Student Empowerment. J. Learn. Anal.; 2021; 8, pp. 30-48. [DOI: https://dx.doi.org/10.18608/jla.2021.7242]
61. Fazil, M.; Rísquez, A.; Halpin, C. A Novel Deep Learning Model for Student Performance Prediction Using Engagement Data. J. Learn. Anal.; 2024; 11, pp. 23-41. [DOI: https://dx.doi.org/10.18608/jla.2024.7985]
62. Chen, F.; Cui, Y. Utilizing Student Time Series Behaviour in Learning Management Systems for Early Prediction of Course Performance. J. Learn. Anal.; 2020; 7, pp. 1-17. [DOI: https://dx.doi.org/10.18608/jla.2020.72.1]
63. Lancaster, A.; Moses, P.S.; Clark, M.; Masters, M.C. The Positive Impact of Deliberate Writing Course Design on Student Learning Experience and Performance. J. Learn. Anal.; 2020; 7, pp. 48-63. [DOI: https://dx.doi.org/10.18608/jla.2020.73.5]
64. Hoel, T.; Chen, W. Privacy-driven Design of Learning Analytics Applications—Exploring the Design Space of Solutions for Data Sharing and Interoperability. J. Learn. Anal.; 2016; 3, pp. 139-158. [DOI: https://dx.doi.org/10.18608/jla.2016.31.9]
65. Khosravi, H.; Kitto, K.; Williams, J.J. RiPPLE: A Crowdsourced Adaptive Platform for Recommendation of Learning Activities. J. Learn. Anal.; 2019; 6, pp. 91-105. [DOI: https://dx.doi.org/10.18608/jla.2019.63.12]
66. Ozdemir, D.; Opseth, H.M.; Taylor, H. Leveraging learning analytics for student reflection and course evaluation. J. Appl. Res. High. Educ.; 2020; 12, pp. 27-37. [DOI: https://dx.doi.org/10.1108/JARHE-11-2018-0253]
67. Lu, O.H.T.; Huang, A.Y.Q.; Lin, A.J.Q.; Ogata, H.; Yang, S.J.H. Applying learning analytics for the early prediction of students’ academic performance in blended learning. Educ. Technol. Soc.; 2018; 21, pp. 220-232. Available online: https://www.jstor.org/stable/26388400 (accessed on 22 November 2024).
68. Epp, C.D.; Phirangee, K.; Hewitt, J.; Perfetti, C.A. Learning management system and course influences on student actions and learning experiences. Educ. Technol. Res. Dev.; 2020; 68, pp. 3263-3297. [DOI: https://dx.doi.org/10.1007/s11423-020-09821-1]
69. Yang, C.C.Y.; Chen, I.Y.L.; Ogata, H. Toward Precision Education: Educational Data Mining and Learning Analytics for Identifying Students’ Learning Patterns with Ebook Systems. Educ. Technol. Soc.; 2021; 24, pp. 152-163.
70. Chiu, C.-K.; Tseng, J.C.R. A Bayesian Classification Network-based Learning Status Management System in an Intelligent Classroom. Educ. Technol. Soc.; 2021; 24, pp. 256-267.
71. Ifenthaler, D.; Schumacher, C. Student perceptions of privacy principles for learning analytics. Educ. Technol. Res. Dev.; 2016; 64, pp. 923-938. [DOI: https://dx.doi.org/10.1007/s11423-016-9477-y]
72. Kuromiya, H.; Majumdar, R.; Ogata, H. Fostering evidence-based education with learning analytics: Capturing teaching-learning cases from log data. Educ. Technol. Soc.; 2020; 23, pp. 14-29.
73. Jin, Y.; Echeverria, V.; Yan, L.; Zhao, L.; Alfredo, R.; Tsai, Y.-S.; Gašević, D.; Martinez-Maldonado, R. FATE in MMLA: A Student-Centred Exploration of Fairness, Accountability, Transparency, and Ethics in Multimodal Learning Analytics. arXiv; 2024; arXiv:2402.19071 [DOI: https://dx.doi.org/10.18608/jla.2024.8351]
74. Kasun, M.; Ryan, K.; Paik, J.; Lane-McKinley, K.; Bodin Dunn, L.; Weiss Roberts, L.; Paik Kim, J. Academic machine learning researchers’ ethical perspectives on algorithm development for health care: A qualitative study. J. Am. Med. Inform. Assoc.; 2024; 31, pp. 563-573. [DOI: https://dx.doi.org/10.1093/jamia/ocad238] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/38069455]
75. Tan, F.Z.; Lim, J.Y.; Chan, W.H.; Idris, M.I.T. Computational intelligence in learning analytics: A mini review. ASEAN Eng. J.; 2024; 14, pp. 121-129. [DOI: https://dx.doi.org/10.11113/aej.v14.21375]
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
This systematic review examines the application of learning analytics to enhance user experience within Learning Management Systems in higher education institutions. Addressing a salient knowledge gap regarding the optimal integration of learning analytics for diverse learner populations, this study identifies analytical approaches and delineates implementation challenges that contribute to data misinterpretation and underutilisation. The absence of a systematic evaluation of analytical methodologies impedes the capacity of higher education institutions to tailor learning processes to individual student needs. Adhering to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a search was conducted across five academic databases. Studies employing learning analytics within Learning Management System environments to improve user experience in higher education institutions were included, while purely theoretical or non-higher-education studies were excluded, resulting in a final corpus of 41 studies. Methodological rigour was assessed using the Critical Appraisal Skills Programme Checklist. The review revealed diverse learning analytics methodologies and a dual research focus: on specific platforms and on the broader impact of Learning Management Systems. However, challenges relating to ethics, implementation, generalisability, interpretation, personalisation, and system quality impede the effective integration of learning analytics for user experience improvement, demanding rigorous and contextually aware strategies. This study’s reliance on existing literature introduces potential selection and database biases; future research should therefore prioritise empirical validation and cross-institutional studies to address these limitations.
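References [29,30] above concern the kappa statistic, the standard measure of inter-rater reliability when two reviewers independently screen or appraise studies, as in the appraisal process described in this abstract. The following is a minimal illustrative sketch with hypothetical include/exclude decisions, using scikit-learn's cohen_kappa_score; it is not drawn from this review's actual screening data.

    # Illustrative sketch: Cohen's kappa for two reviewers' include/exclude
    # decisions during study screening, per [29,30]. Decisions are hypothetical.
    from sklearn.metrics import cohen_kappa_score

    reviewer_a = ["include", "exclude", "include", "include", "exclude",
                  "exclude", "include", "exclude", "include", "exclude"]
    reviewer_b = ["include", "exclude", "include", "exclude", "exclude",
                  "exclude", "include", "exclude", "include", "include"]

    # Kappa corrects raw percent agreement for agreement expected by chance.
    kappa = cohen_kappa_score(reviewer_a, reviewer_b)
    print(f"Cohen's kappa: {kappa:.2f}")  # prints 0.60 for these hypothetical data

Commonly cited benchmarks, as discussed in [30], treat values of 0.41–0.60 as moderate agreement and 0.61–0.80 as substantial agreement, so a kappa of 0.60 would typically prompt discussion and reconciliation of the discordant decisions.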