Abstract
Critical thinking, an essential skill in contemporary education, has been the subject of research focused on creating pedagogical strategies to promote and strengthen this skill in students. This study focused on designing an Intelligent Tutoring System to develop this type of higher-order thinking in university students. It was divided into two phases: the design of the expert module, which devised learning paths and explored an algorithm, and the design of the planning module, which covered automated planning, plan customization, outcome evaluation and testing. The findings include a didactic model of the learning path and the validation of a planning algorithm, called the Learning Progress Algorithm, as a formal, automated approach to improving critical thinking skills. The conclusion highlights the ability of the planning module, supported by the algorithm, to adjust to individual learner progress, which enables educational personalization and has practical implications for optimizing critical thinking in challenging contexts. This integrative approach reflects the synergy between educational technology and critical skills development within educational research.
Introduction
The COVID-19 pandemic caused a change in the dynamics of interaction in educational environments and accelerated the transition towards the use of Information and Communication Technologies (ICT) as indispensable tools in teaching–learning processes. This underlines the importance of designing and implementing active methodologies that foster the development of high-level competences in students. From this perspective, strengthening the skills that result from critical thinking (CT), as well as individual characteristics, thinking habits and self-regulation of knowledge, contributes to establishing a positive connection between academic habits and research skills. This approach is therefore crucial in an educational paradigm centred on the needs of the learner.
In this context, CT is considered one of the essential skills of the twenty-first century [26] and plays a fundamental role in the holistic education of children and young people, as it helps to shape their interaction with their environment and their way of interpreting the world. Furthermore, CT has a multidisciplinary dimension and is therefore the subject of study in fields such as social sciences, philosophy and pedagogy [29].
Critical Thinking
CT is the way in which any content, issue or problem is approached, subjecting it to intellectual standards (clarity, accuracy, relevance, depth, breadth, logic, meaning and fairness) that facilitate understanding of the structures inherent in the thinking process [28]. Furthermore, it involves responsible, reasonable and self-reflective thinking that leads to informed judgement by considering context, and helps determine what to believe and how to act. It also encourages reflection on one's own thinking.
Similarly, CT is seen as a system that allows the individual to monitor his or her own performance, to avoid inaccurate, biased, superficial or illogical judgements, solutions or conclusions (Grajeda 2009). It also seeks to ensure that the individual uses knowledge appropriately to achieve a goal, giving them the ability to make more informed decisions and solve problems more effectively [31]. This approach involves analysing and evaluating thoughts in everyday situations, as well as substantiating claims through arguments [29].
To adequately develop CT skills, the subject must have self-direction, self-discipline, self-regulation and the capacity for self-correction [28, 33]. In this sense, CT represents the possibility of reflecting on one's own thinking, constantly revising it to make it more accurate and precise [23].
By circumscribing CT to the educational setting, a variety of authors recognise the importance of cultivating both basic and advanced cognitive skills in children, adolescents and teachers [13], to encourage students to cultivate creativity and improve analytical and decision-making skills in their daily lives. The school, as the environment where teachers and students interact through a bidirectional teaching–learning process, therefore plays a crucial role in promoting critical thinking from early childhood, addressing argumentation and the development of basic thinking skills as a foundation for individuals to reach levels of metacognition. As has been pointed out, from childhood and youth people must learn to argue in order, as members of a group in democratic coexistence, to decide the course of human praxis in interpersonal and political relations; from this need arise both the pedagogy of argumentation and the didactics of critical thinking [19], which are means through which CT is fostered in the school environment. This enables students to reflect on their environment, identify and understand its characteristics, and seek continuous improvements in learning processes. Enquiry also helps students explore concepts, analyse feedback from their teachers, examine how they learn, correct their study techniques, make connections between ideas, be curious and challenge themselves [25].
Dimensions of Critical Thinking
CT delves into the process by which a person develops his or her critical capacity when faced with diverse contextual elements [22]. This metacognitive process therefore encompasses the following dimensions: observation, comparison, classification, description, comprehension, explanation, problem solving and evaluation, which, when used appropriately, increase the possibility of producing logical conclusions and arguments with a critical sense [12], as shown in Table 1.
Table 1. Dimensions of high and low critical thinking performance
Level of critical thinking | Dimension | Conceptualisation |
|---|---|---|
Critical thinking high | Evaluating | Value concepts aimed at determining the appropriation of knowledge through strategies that identify, analyse, interpret and decode to establish strategies aimed at continuous improvement [38] |
Critical thinking high | Problem solving | Recognition and analysis of a contextual situation to establish a pathway that allows for objective choices and alternatives towards a relevant solution [21, 35] |
Critical thinking high | Explain | Directed towards argumentative and reflective reasoning, in search of concrete and objective realities [13] |
Critical thinking high | Understanding | Clarifying concepts towards the understanding of knowledge to appropriate the conceptual contents proposed in the theories to be deciphered [32, 2] |
Critical thinking low | Description | Accounts for the characteristics of the object in a precise and ordered way through deductive and inductive thinking [9, 39] |
Critical thinking low | Classification | The mental process by which an individual can identify differences and similarities between observed objects to understand and apply concepts through mental representations [9, 39] |
Critical thinking low | Comparison | An extension of observation; it consists of establishing identifications and relationships of objects through mental representations that allow differences to be abstracted [9, 39] |
Critical thinking low | Observation | Mental process that allows identifying and making known the characteristics of observed objects to distinguish conceptual representations at a given moment [9, 39] |
Intelligent Tutoring Systems (ITS)
An important advance in the implementation of artificial intelligence in education has been the development of Intelligent Tutoring Systems [27, 10], which have been used by researchers in education, psychology and artificial intelligence. ITSs provide learners with the benefits of individualised instruction (personalised education); that is, they adapt to learners' needs and make recommendations on the next activities to be performed to guide their learning process and help them achieve learning outcomes. With this type of system it is possible to implement active-participatory methodologies in the learning process [34], as it allows learners to practise their skills by performing tasks in highly interactive learning environments.
ITSs incorporate built-in expert systems to monitor learner performance and personalise instruction by adapting to learners' learning styles, current knowledge level and appropriate teaching strategies in e-Learning systems [34]. However, they are still limited in enabling learning through experimentation.
The main functionalities include systems that propose exercises, drawn from a repository, for students to solve; the system then assesses the quality of each solution and suggests new exercises. These exercises are chosen according to criteria such as the student's progress in the learning process or the results obtained in previous exercises [1], as seen in Fig. 1.
Fig. 1 [Images not available. See PDF.]
ITS architecture, adapted from [6]
A second type of functionality comprises systems that introduce students to an exercise and show how to solve it step by step (solved examples). These systems then create similar exercises, usually by changing data, which can be solved by following the steps learned. In subsequent stages, the number of steps in the worked solution can also be gradually faded out (faded solved examples), so that learners can explain how they arrived at the final solution [20].
Research within both approaches focuses on adapting rules that recommend learning actions rather than giving the learner the freedom to experiment [8]. Thus, an ITS evaluates the actions of each learner in an interactive environment and develops a model of their knowledge, skills and experience. Based on the learner's model, the ITS can adapt instructional strategies, both in terms of content and style, and provides explanations, suggestions, examples, demonstrations and practice problems relevant to each learner. To provide this instruction, these systems have an architecture that is composed of three types of knowledge, organised into four separate software modules [6], as shown in Fig. 1.
The domain module (in this study, the Planner) is a computer representation of the declarative and procedural knowledge of an expert in the domain. This knowledge allows the ITS to compare the learner's actions and choices with those of an expert to assess their level of understanding.
The student module reflects the student's level of knowledge while interacting with the tutoring system. This module evaluates each learner’s performance by analysing their behaviour during the interaction. The purpose is to determine their knowledge, perceptual skills and reasoning skills. It generates evidence and uses inference to provide relevant instructions for each user.
The tutor module (in this study, the Expert) specifies the knowledge necessary to make decisions about instructional tactics. It relies on the diagnostic processes of the student module to determine what, when and how to present information. For example, if a learner is assessed as a beginner in a specific procedure, this module will show step-by-step demonstrations before asking the user to perform the procedure themselves. As the learner gains experience, the system may decide to present more complex scenarios. In addition, it can select topics, simulations and examples relevant to the learner's level of knowledge.
The personalization of teaching processes fosters more effective learning experiences for students by tailoring content to their individual needs [3]. Additionally, the design of intelligent tutoring systems (ITS) should focus on managing cognitive load and creating contextualized and relevant content for students [30]. In this context, the use of educational technology to develop higher-order cognitive skills has revealed multifactorial findings [4], highlighting the importance of the present research.
To this end, a didactic model has been designed and validated that allows customizing the activities and exercises necessary to guide students in their learning by means of personalized thinking routines [37]. To obtain an initial diagnosis of the student's level of thinking, a critical thinking test is integrated into the ITS, which each user must take when logging in for the first time [17].
Finally, the Graphical User Interface module, which is the bridge that allows interaction between the end-user and the ITS, is designed in such a way that it is reasonably tailored to the needs of the individual learner.
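To make the division of responsibilities across the four modules concrete, the following is a minimal Python sketch of how they could be composed. It is purely illustrative: the class names, fields and the simple decision rule in the tutor module are our assumptions, not the implementation of the system described here.

```python
from dataclasses import dataclass, field

@dataclass
class DomainModule:
    """'Planner' in this study: expert declarative/procedural knowledge of the domain."""
    expert_solutions: dict = field(default_factory=dict)

    def matches_expert(self, exercise_id: str, learner_answer: str) -> bool:
        # Compare the learner's answer with the expert solution for that exercise.
        return self.expert_solutions.get(exercise_id) == learner_answer

@dataclass
class StudentModule:
    """Model of the learner's current knowledge, updated from observed behaviour."""
    knowledge: dict = field(default_factory=dict)

    def record(self, exercise_id: str, correct: bool) -> None:
        self.knowledge[exercise_id] = correct

class TutorModule:
    """'Expert' in this study: decides what, when and how to present next."""
    def next_action(self, student: StudentModule) -> str:
        mastered = sum(1 for ok in student.knowledge.values() if ok)
        # Beginners see step-by-step demonstrations; more experienced learners practise.
        return "step_by_step_demonstration" if mastered < 3 else "practice_problem"

class GraphicalUserInterface:
    """Bridge between the end-user and the ITS."""
    def present(self, action: str) -> None:
        print(f"Presenting: {action}")
```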
Methodology
Research Objective
To develop the Expert and Planner modules of an intelligent tutoring system for the development of critical thinking in university students.
Design of the Algorithms for the Expert and Planning modules
This study used a non-experimental approach in its design, framed within the paradigms characteristic of applied research. Furthermore, it was carried out through a sequential process composed of two clearly defined phases.
Phase 1: Pedagogical Design of the Expert Module
This phase focused on the design of pedagogical strategies to foster the development of critical thinking in university students from socially deprived contexts. To this end, learning routes were devised that were derived from an original pedagogical model specifically conceived to achieve this purpose [37]. An algorithm that allowed the implementation and adaptation of these learning paths in this module was also explored.
The design was carried out using tools for the construction of flowcharts and Unified Modelling Language (UML), which enabled a visual and structured representation of the ideas and processes involved in the development of the pedagogical strategies.
Phase 2: Design of the Planner module
The development of this module was subdivided into five stages that were executed consecutively, as shown below:
Automated Planning
Automated planning algorithms were implemented that considered the domain representation, student data and previously defined rules. These algorithms took responsibility for generating personalised action plans for each learner in an efficient manner.
Personalisation of Plans
It was ensured that the plans generated by the Planner were highly personalised. This involved carefully tailoring them to the individual needs of each learner, considering their level of progress and the specific areas requiring improvement.
Evaluation of Expected Results
Criteria and metrics were established to evaluate the effectiveness of the plans generated, which allowed us to accurately measure whether the plans contributed to the students' progress in their critical thinking skills.
Integration with the Main Algorithm
The Planner module was effectively integrated with the main Learning Progress Algorithm. This ensured that the plans generated were applied coherently and consistently within the overall process of monitoring and optimising student progress.
Testing and Validation
Finally, the Planner module was tested using student data and simulated scenarios, created by generating synthetic student profiles that represented a wide range of cognitive abilities and levels of socio-cultural engagement. These tests made it possible to evaluate the module's ability to generate effective, personalised plans that help students improve their critical thinking skills, and its successful integration into the main algorithm contributed significantly to the overall effectiveness of the system. The profiles included demographic information, historical academic performance data and initial learning objectives.
The evaluation focused on comparing the performance of personalised learning paths generated by the Intelligent Tutoring System with traditional non-personalised approaches. Metrics such as learning value, cognitive achievement and socio-cultural knowledge were used to assess the effectiveness of the system. In addition, qualitative feedback was collected from students and educators to measure user satisfaction and system viability.
Results
Phase 1: Pedagogical Design of the Expert Module
The design of the expert module included, in turn, the design of a didactic model, which was an integral part of a study conducted prior to this work. This model focuses on the development of thinking routines as essential elements of didactic design [37, 15], identifying two didactic sequences that define the routes that students must follow to acquire basic skills in their initial phase. Higher-order skills are then integrated into the cognitive framework of critical thinking, especially in problematic and challenging situations [37].
To carry out the implementation of this model, an algorithm was designed that encompasses various dimensions of critical thinking and instructional planning. These were adapted from a variety of taxonomies from the most representative tests available on the market that measure this type of thinking [17].
As the learner progresses through the developmental levels, they are assigned a reward, with the aim of establishing a gamified stimulus–reward methodology. This approach aims to foster student engagement and motivation as they progress through the process of acquiring critical thinking skills in a socio-cultural or environmental context, as seen in Fig. 2. The model in the figure illustrates the lower levels of thinking; for the higher levels, the functionality and programming logic related to effort and rewards are equivalent, but applied at a higher cognitive level.
Fig. 2 [Images not available. See PDF.]
Flowchart of the didactic model for the student learning pathway
Next, the design of the algorithms corresponding to the calculation of critical thinking levels and the conditions that will determine the placement in a thinking routine according to the calculated level are presented in formal language.
Calculation of the Level of Critical Thinking from Point Ranges
Let P be the number of points obtained.
P ≤ 24 represents the unsatisfactory level, where the points are less than or equal to 24.
25 ≤ P ≤ 48 represents the satisfactory level, where the points are greater than or equal to 25 and less than or equal to 48.
49 ≤ P ≤ 72 represents the superior level, where the points are greater than or equal to 49 and less than or equal to 72.
In summary, the expression establishes ranges of points that determine the levels: unsatisfactory, satisfactory and superior.
Conditions that Determine the Routine to be Performed Based on the Student's Level
Let S be a variable representing the level the student is at, and R a variable representing the routine to be performed.
Then, it is expressed as:
If S ∈ {Unsatisfactory, Satisfactory}; then, R = Routine of observed skills.
If S = Superior, then, R = Superior skill routine.
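A compact Python sketch of these two rules is given below; the thresholds come directly from the point ranges above, while the function names are illustrative rather than taken from the system's source code.

```python
def critical_thinking_level(points: int) -> str:
    """Map the test score P (0 <= P <= 72) to a critical thinking level."""
    if points <= 24:
        return "Unsatisfactory"
    elif points <= 48:
        return "Satisfactory"
    else:
        return "Superior"

def assign_routine(level: str) -> str:
    """Place the student in a thinking routine according to the calculated level."""
    if level in ("Unsatisfactory", "Satisfactory"):
        return "Routine of observed skills"
    return "Superior skill routine"

# Example: a student scoring 31 points is placed in the lower routine.
print(assign_routine(critical_thinking_level(31)))  # Routine of observed skills
```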
Phase 2: Planner Module Design
Planning algorithm
Formal Description of Planning Algorithm
Domain description: ("Critical-Thinking-Reasoning")
Domain: Critical-Thinking-Reasoning.
Prerequisites: STRIPS, fluents, typing, negative preconditions, and ADL.
Types:
Student: Set of students.
Step: Set of steps or stages in critical reasoning.
Thinkingskill: Set of thinking skills.
Dimension: Set of dimensions or aspects of critical reasoning.
Predicate:
step-link(?from, ?to): Represents a link between two steps, where step ?from leads to step ?to.
student-evidences-step(?x, ?y): Indicates that student ?x provides evidence for step ?y.
Functions:
learning-path-time(): Returns the total time spent on the learning path.
learning-path-reward(): Returns the cumulative reward on the learning path.
time-required-step-link(?from, ?to): Calculates the time required to complete the link between the ?from and ?to steps.
reward-step-link(?from, ?to): Calculates the reward associated with the connection between the ?from and ?to steps.
Action: "Learning-Progress".
Parameters:
?x: Student involved in the progress.
?from: Source step.
?to: Destination step.
Precondition: The action is applicable if learner ?x provides evidence at step ?from but not at step ?to.
Effect: Learner ?x provides evidence in the ?to step, and the reward in the learning path increases as a function of the connection between the ?from and ?to steps.
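The complete domain is encoded in PDDL in the project repository linked below. Purely as an illustration of the structure just described, the same elements can be sketched in Python as follows, with the precondition and effect of learning-progress made explicit; the class, field and function names are ours, not those of the repository.

```python
from dataclasses import dataclass, field

@dataclass
class LearningState:
    evidenced: set = field(default_factory=set)     # pairs (student, step) already evidenced
    step_links: dict = field(default_factory=dict)  # (from, to) -> {"time": ..., "reward": ...}
    path_time: float = 0.0                          # learning-path-time
    path_reward: float = 0.0                        # learning-path-reward

def learning_progress(state: LearningState, student: str, frm: str, to: str) -> bool:
    """Apply the learning-progress action if its precondition holds."""
    # Precondition: the student evidences ?from but not yet ?to, and a link ?from -> ?to exists.
    if (student, frm) not in state.evidenced or (student, to) in state.evidenced:
        return False
    link = state.step_links.get((frm, to))
    if link is None:
        return False
    # Effect: the student now evidences ?to; time and reward accumulate along the link.
    state.evidenced.add((student, to))
    state.path_time += link["time"]
    state.path_reward += link["reward"]
    return True
```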
The algorithm is called the Learning Progress Algorithm, which aims to monitor and optimise students' progress in developing critical thinking skills. It considers a domain that includes students, learning steps, thinking skills and dimensions of critical thinking.
Key Components
Planning Domain
The domain is formally defined, and sets out the relationships between learners, learning steps and thinking skills. It uses predicates to represent the connections between the elements of the domain, such as the relationships between steps and the evidence provided by learners.
Given a domain D, it is defined as a set of elements:
D = {e_1, e_2, …, e_n}, where each element represents a learner, a learning step or a thinking skill.
Relationships R_1, R_2, …, R_m are established connecting these elements in the domain.
For example, R_1(e_i, e_j) could represent a relationship between learner e_i and learning step e_j, and R_2(e_j, e_k) is represented as a relationship between learning step e_j and thinking skill e_k.
To formally represent these connections, predicates are used:
P_1, P_2, …, P_m, where each predicate P_i is a function that takes one or more elements of the domain as arguments and returns a logical value indicating whether the relation specified by the predicate is true or false.
Functions and Rewards
Functions are used to perform calculations related to the time and reward accumulated in the learning process: the learning-path-time function tracks the total time spent on the learning path, the learning-path-reward function tracks the reward accumulated in the learning path, and other functions calculate the time required to complete connections between steps and the reward associated with those connections.
A set of mathematical functions are also defined to perform calculations related to the time and reward accumulated in the learning process. These functions are expressed as follows:
Learning-path-time is a function that assigns a time value (Time) to a learning path (Path). Mathematically, this is defined as: learning-path-time: Path → Time.
Learning-path-reward is a function that assigns a reward value (Reward) to a learning path. Mathematically, it can be expressed as: learning-path-reward: Path → Reward.
In addition, other functions calculate the time required to complete connections between steps and the reward associated with those connections:
Time-for-connection takes two learning steps (Step_1 and Step_2) as arguments and returns the time (Time) required to complete the connection between those steps: time-for-connection: Step × Step → Time.
Reward-for-connection takes two learning steps as arguments and returns the reward (Reward) associated with the connection between those steps: reward-for-connection: Step × Step → Reward.
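Under one natural reading of these definitions, the reward of a whole learning path is the sum of the rewards of its consecutive connections. The following hypothetical helper illustrates that composition; the function name and the toy reward table are assumptions for the example only.

```python
def path_reward(path: list[str], reward_for_connection) -> float:
    """Sum reward-for-connection over consecutive pairs of steps in the path."""
    return sum(reward_for_connection(a, b) for a, b in zip(path, path[1:]))

# Illustrative use with a toy reward table (values are hypothetical):
rewards = {("observe", "compare"): 2.0, ("compare", "classify"): 3.0}
print(path_reward(["observe", "compare", "classify"], lambda a, b: rewards[(a, b)]))  # 5.0
```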
Learning Progress Action
The main action of the algorithm is called learning-progress, which is executed when a learner provides evidence at a specific step, but not at the next step. The effect of this action is that the learner provides evidence in the next step, and the cumulative reward is increased based on the connection between the steps.
The above is described as follows:
If "learning-progress" is executed at step i, it is guaranteed to be executed at step (i + 1): learning-progress(i) → learning-progress(i + 1).
The cumulative reward increases as a function of the connection between the steps: reward(i, i + 1) increases.
Algorithm Operation
Initial state: denoted S_0, it represents the state in which learners have progressed to certain learning steps.
Objective: to maximise the cumulative reward. This can be expressed as a maximisation function: maximise learning-path-reward().
Progress monitoring: the algorithm constantly monitors learners' progress and makes decisions based on predefined conditions and rewards. This involves a decision-making process based on predefined conditions and on evaluation of the rewards associated with the connections between the steps.
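Continuing the illustrative Python sketch introduced above (it reuses LearningState and learning_progress), a simple greedy version of this operation repeatedly applies the applicable learning-progress action with the highest connection reward until none remains. The greedy selection is a simplifying assumption for illustration, not necessarily the search strategy of the actual planner.

```python
def plan_greedily(state: LearningState, student: str) -> list[tuple[str, str]]:
    """Monitor progress and repeatedly apply the highest-reward applicable action."""
    plan = []
    while True:
        applicable = [
            (frm, to) for (frm, to) in state.step_links
            if (student, frm) in state.evidenced and (student, to) not in state.evidenced
        ]
        if not applicable:
            return plan  # no further learning-progress action is applicable
        # Pick the connection that contributes the most to learning-path-reward.
        frm, to = max(applicable, key=lambda link: state.step_links[link]["reward"])
        learning_progress(state, student, frm, to)
        plan.append((frm, to))
```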
The ITS functional architecture refers to the structure and operation of the Intelligent Tutoring System, which consists of a Planner and an Expert module that collaborate in managing the learning process and making decisions to maximise reward. Together, the objective of maximising cumulative reward, the constant monitoring of student progress and the functional architecture of the ITS serve to personalise learning. The functional architecture is shown in Fig. 3, and the source code can be viewed at the link: https://github.com/APIP-CECAR/planner.
Fig. 3 [Images not available. See PDF.]
Personalised learning pathway for each student
Evaluation Results
Configuration of the Assessment Test
Synthetic student profiles representing cognitive abilities, including critical thinking, and levels of environmental engagement were created for the assessment. The profiles included demographic information, historical performance data, and initial learning goals. Cognitive and ecological goal refinement models were trained using historical data to predict refined goals. The recommender system used collaborative filtering and content-based approaches to suggest initial learning paths and a reinforcement learning agent employed the learning planner module to refine the paths.
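As an illustration of how the collaborative-filtering and content-based signals mentioned above could be combined when ranking candidate activities for an initial learning path, a minimal sketch follows. The weighting scheme, function names and toy scores are assumptions for the example, not the system's documented formula.

```python
def hybrid_score(cf_score: float, content_score: float, weight_cf: float = 0.5) -> float:
    """Blend a collaborative-filtering score with a content-based score."""
    return weight_cf * cf_score + (1.0 - weight_cf) * content_score

# Rank candidate activities for one student (scores are purely illustrative):
candidates = {"argument-analysis": (0.8, 0.6), "data-interpretation": (0.4, 0.9)}
ranking = sorted(candidates, key=lambda a: hybrid_score(*candidates[a]), reverse=True)
print(ranking)  # ['argument-analysis', 'data-interpretation']
```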
Synthetic Learner Profiles
Synthetic learner profiles were generated with various attributes to simulate real situations. Table 2 presents an example of the synthetic learner profiles used in the evaluation, along with the corresponding cognitive and environmental targets.
Table 2. Synthetic student profiles and goals
Student ID | Cognitive goal | Environmental goal |
|---|---|---|
1 | Problem solving | Climate awareness |
2 | Critical thinking | Biodiversity conservation |
3 | Problem solving | Waste reduction |
4 | Critical thinking | Energy efficiency |
5 | Critical thinking | Climate awareness |
6 | Problem solving | Biodiversity conservation |
7 | Critical thinking | Waste reduction |
8 | Problem solving | Energy efficiency |
9 | Problem solving | Climate awareness |
10 | Critical thinking | Biodiversity conservation |
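Profiles with the structure of Table 2 can be produced in a few lines; the sketch below generates them with randomly assigned goals drawn from the categories that appear in the table. The attribute names and the use of a fixed random seed are illustrative assumptions.

```python
import random

COGNITIVE_GOALS = ["Problem solving", "Critical thinking"]
ENVIRONMENTAL_GOALS = ["Climate awareness", "Biodiversity conservation",
                       "Waste reduction", "Energy efficiency"]

def synthetic_profiles(n: int, seed: int = 42) -> list[dict]:
    """Create n synthetic student profiles with cognitive and environmental goals."""
    rng = random.Random(seed)
    return [
        {
            "student_id": i + 1,
            "cognitive_goal": rng.choice(COGNITIVE_GOALS),
            "environmental_goal": rng.choice(ENVIRONMENTAL_GOALS),
        }
        for i in range(n)
    ]

for profile in synthetic_profiles(10):
    print(profile)
```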
The Intelligent Tutoring System successfully personalized students' learning paths, which improved their cognitive growth and environmental awareness, whereas non-personalized approaches did not have this effect. Furthermore, the evaluation test conducted with synthetic student profiles also demonstrated the system's ability to optimize personalized learning paths and enhance the educational experience. Finally, the combination of quantitative metrics and qualitative feedback provided a comprehensive assessment of system effectiveness and user satisfaction [7].
Environmental objectives are associated with various types of cognitive objectives because addressing environmental challenges requires a set of cognitive skills ranging from basic understanding to critical analysis and complex problem solving. This responds to the multifaceted and systemic nature of environmental problems, as well as the need for students to make informed and responsible decisions to effectively address them.
The learning path configuration involved selecting the number of recommended activities, the balance between cognitive complexity and environmental impact (controlled by λ), and the exploration factor (ϵ) for the Q-learning agent. Table 3 illustrates the learning path configuration settings used in the evaluation.
Table 3. Learning path configuration
ID | Recommended activities | λ (Balance) | ϵ (Exploration) |
|---|---|---|---|
1 | 8 | 0.7 | 0.2 |
2 | 10 | 0.5 | 0.1 |
3 | 6 | 0.8 | 0.3 |
4 | 9 | 0.6 | 0.2 |
5 | 7 | 0.6 | 0.1 |
6 | 10 | 0.7 | 0.3 |
7 | 8 | 0.5 | 0.2 |
8 | 9 | 0.8 | 0.2 |
9 | 7 | 0.7 | 0.1 |
10 | 6 | 0.5 | 0.3 |
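To illustrate how the λ and ϵ parameters of Table 3 might enter the reinforcement-learning agent, the following is a minimal ε-greedy Q-learning sketch in which λ blends a cognitive-complexity reward with an environmental-impact reward. The reward blending, the learning rate and the update rule are assumptions made for illustration, not the published implementation.

```python
import random

def blended_reward(cognitive: float, environmental: float, lam: float) -> float:
    """Balance cognitive complexity against environmental impact with lambda."""
    return lam * cognitive + (1.0 - lam) * environmental

def epsilon_greedy(q_values: dict, state: str, actions: list, epsilon: float,
                   rng: random.Random) -> str:
    """Explore with probability epsilon, otherwise exploit the best known action."""
    if rng.random() < epsilon:
        return rng.choice(actions)
    return max(actions, key=lambda a: q_values.get((state, a), 0.0))

def q_update(q_values: dict, state: str, action: str, reward: float, next_state: str,
             actions: list, alpha: float = 0.1, gamma: float = 0.9) -> None:
    """Standard one-step Q-learning update."""
    best_next = max(q_values.get((next_state, a), 0.0) for a in actions)
    old = q_values.get((state, action), 0.0)
    q_values[(state, action)] = old + alpha * (reward + gamma * best_next - old)

# Example with the configuration of student 1 in Table 3 (lambda = 0.7, epsilon = 0.2):
rng = random.Random(0)
q = {}
activities = ["activity_a", "activity_b"]
action = epsilon_greedy(q, "start", activities, epsilon=0.2, rng=rng)
q_update(q, "start", action, blended_reward(0.6, 0.4, lam=0.7), "next", activities)
```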
Potential Applications
This algorithm has significant applications in the field of education and the development of higher-order thinking skills, as it can be used in automated learning systems to optimize individual student progress, provide personalized feedback, and ensure that critical thinking goals are achieved.
Discussion and Conclusions
The Learning Progress Algorithm is a formal, automated approach to monitor and enhance the learning of CT skills. By modeling domains, relationships, and rewards in detail, this algorithm has the potential to facilitate informed decision-making in educational and learning support contexts. In a broader context, in terms of fostering autonomy, ITSs emerge as key players, because these systems can provide specific pathways that are conducive to the development of learners' cognitive abilities [14, 18].
This holistic approach highlights the synergy between educational technologies and the promotion of critical skills and underlines the importance of implementing pedagogical strategies that maximize the development of these skills in students. This approach aligns with studies that highlight the continuous need to explore and develop technological solutions to improve education and learning, create enriching didactic experiences that contribute to students' CT development [24], and design personalized didactic models that use intelligent tutoring as a pedagogical mediation to provide learners with individualized learning strategies that help them develop and refine their cognitive processes [16].
Likewise, [5] proposes the development of learning methodologies based on the use of neuroscience to foster the development of higher skills in students. These methodologies take as their premise four stages or dimensions: information gathering, reflection, creation and application. This contrasts with the algorithm designed in the present study (see Fig. 2), which defines dimensions, didactic routines and contexts that allow progress in the development of activities and student progress from one CT level to another. However, when designing ITSs, researchers have taken teaching strategies and methodologies for granted, since many of these studies focus the expert module on the application of traditional teaching methodologies or do not use pedagogical didactic designs as mediators for the development of students' skills [36, 40].
In addition, ITSs incorporate pedagogical agents to support students’ self-regulated learning through prompts and feedback that promote the monitoring and regulation of cognitive, affective, metacognitive, and motivational processes to achieve the proposed objectives [11]. In that sense, the functional architecture of the ITS of this study, designed to integrate the Expert and Planner modules, offers a comprehensive approach to the management of the learning process, through the Learning Progress Algorithm that has the potential for automated and adaptive decision-making. In that sense, the planning domain, by establishing essential relationships, provides a solid basis for data-driven decision-making. This finding enriches the theoretical knowledge on CT development and presents a practical and adaptable solution in the university educational environment. Thus, the ability of the Planner module to adjust to individual student progress, supported by an algorithm, enables personalization in education. This research contributes to educational theory and offers practical implications for CT optimization in a variety of challenging contexts, cementing its relevance in the academic landscape and its potential to drive future research.
Funding
This study was funded by Corporación Universitaria del Caribe – CECAR (grant number: ACTA07/2).
Data Availability
Source Code of Intelligent Algorithms: This repository contains the source code for each of the algorithms described in the results of the study: https://github.com/APIP-CECAR/planner/blob/main/domain.pddl, https://github.com/APIP-CECAR/planner/blob/main/problem.pddl. Algorithm Validation: To assess the effectiveness of the proposed intelligent tutoring system in personalized learning path generation, an evaluation was conducted in a secondary education setting: https://doi.org/10.1109/C358072.2023.10436195. Didactic Model Designed to Promote the Development of Critical Thinking: The following study was the first phase of the research, which served as the conceptual basis for the implementation of the algorithms: https://doi.org/10.1016/j.ijedro.2022.100207.
Declarations
Conflict of Interest
The authors declare that they have no other conflicts of interest.
Informed Consent
Informed consent was obtained from all individual participants included in the study.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
1. Aleven, V; McLaren, BM; Sewall, J; van Velsen, M; Popescu, O; Demi, S; Ringenberg, M; Koedinger, KR. Example-tracing tutors: intelligent tutor development for non-programmers. Int J Artif Intell Educ; 2016; 26,
2. Anderson, & Krathwohl. (2016). Understanding the New Version of Bloom’s Taxonomy. 1–7. https://quincycollege.edu/wp-content/uploads/Anderson-and-Krathwohl_Revised-Blooms-Taxonomy.pdf
3. Behrooz, M; Tiffany, B. Evolution of an intelligent deductive logic tutor using data-driven elements. Int J Artif Intell Educ; 2017; 27,
4. Woolf, B. P., Aïmeur, E., Nkambou, R., & Lajoie, S. (Eds.). Intelligent Tutoring Systems: 9th International Conference, ITS 2008, Montreal, Canada, June 23–27, 2008: proceedings.
5. Caicedo López, H. Neuroeducación: una propuesta educativa en el aula de clase; 2016; Ediciones de la U.
6. Carbonell, J. (1970). AI in CAI: An Artificial-Intelligence Approach to Computer-Assisted Instruction. IEEE Transactions on Man-Machine Systems, 11(4), 190–202. https://stacks.stanford.edu/file/druid:xr633ts6369/xr633ts6369.pdf
7. Caro, M. F., Quitian, L., Giraldo, J. C., & Lengua-Cantero, C. (2023). A Formal Model for Personalized Learning Path using Artificial Intelligence for Instructional Planning with a Focus on 21st-Century Skills and Environmental Awareness. 2023 IEEE Colombian Caribbean Conference (C3), 1–6. https://doi.org/10.1109/C358072.2023.10436195
8. D’Aniello, G; Gaeta, A; Gaeta, M; Tomasiello, S. Self-regulated learning with approximate reasoning and situation awareness. J Ambient Intell Humaniz Comput; 2018; 9,
9. De Sánchez, M. A. (1995). Desarrollo de habilidades de pensamiento: procesos básicos del pensamiento (Trillas, Ed.).
10. Dermeval, D; Paiva, R; Bittencourt, II; Vassileva, J; Borges, D. Authoring tools for designing intelligent tutoring systems: a systematic review of the literature. Int J Artif Intell Educ; 2018; 28,
11. Dever, DA; Wiedbusch, MD; Romero, SM; Azevedo, R. Investigating pedagogical agents’ scaffolding of self-regulated learning in relation to learners’ subgoals. Br J Edu Technol; 2024; [DOI: https://dx.doi.org/10.1111/bjet.13432]
12. Dwyer, CP; Hogan, MJ; Stewart, I. An integrated critical thinking framework for the 21st century. Think Skills Creat; 2014; 12, pp. 43-52. [DOI: https://dx.doi.org/10.1016/j.tsc.2013.12.004]
13. Facione, P. (1990). Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction: Vol. ED 315 423 (The California Academic Press, Ed.). https://www.qcc.cuny.edu/socialSciences/ppecorino/CT-Expert-Report.pdf
14. Ferro, LS; Sapio, F; Mecella, M; Temperini, M; Terracina, A. Intelligent pedagogic agents (IPAs) in GEA2, an educational game to teach STEM topics; 2021; Springer: pp. 226-236.
15. Feuerstein, R; Miller, R; Hoffman, MB; Rand, Y; Mintzker, Y; Jensen, MR. Cognitive modifiability in adolescence: cognitive structure and the effects of intervention. J Spec Educ; 1981; 15,
16. Gao, H; Zeng, Y; Ma, B; Pan, Y. Improving knowledge learning through modelling students’ practice-based cognitive processes. Cogn Comput; 2024; 16,
17. García Medina, M., Acosta Meza, D., Lengua-Cantero Claudia, Anaya Herrera, J. (2023). PCB test to assess critical thinking in students from disadvantaged contexts. Migration Letters, 20(S9), 1666–1692. https://migrationletters.com/index.php/ml/article/view/6248/4229
18 Gómez, A; Márquez, L; Zapa, H; Florez, M. GDA-based tutor module of an intelligent tutoring system for the personalization of pedagogic strategies; 2021; Springer: pp. 742-750.
19. González Cástulo, Y; Vargas Garduño, MDL; del Campo, Gómez; del Paso, MI; Méndez Puga, AM. Estrategias que favorecen el aprendizaje autónomo en estudiantes universitarios. Caleidoscopio—Revista Semestral de Ciencias Sociales y Humanidades; 2018; 21,
20 Green, N; Di Eugenio, B; Harsley, R; Fossati, D; AlZoubi, O. Behavior and learning of students using worked-out examples in a tutoring system; 2016; Springer: pp. 389-395.
21. Halpern, D. (2014). Thought and knowledge: an introduction to critical thinking (Psychology Press, Ed.; 5th ed). https://ia801301.us.archive.org/9/items/Thought_and_Knowledge_An_Introduction_to_Critical_Thinking_by_Diane_F._Halpern/Thought_and_Knowledge_An_Introduction_to_Critical_Thinking_by_Diane_F._Halpern.pdf
22. Herrero, J. (2018). Elementos de Pensamiento Crítico (Marcial Pons, Ed.; 2nd ed.). https://www.marcialpons.es/media/pdf/9788491234951.pdf
23. Joanne, K. (1987). The reasoning-centered classroom: approaches that work. ERIC RIE, 39(8), 1–79. https://n9.cl/8d0tp
24. Lengua-Cantero, C; Oviedo, GB; Barboza, WF; Feria, MV. Emerging technologies in the teaching-learning process: towards the critical thinking development. Revista Electronica Interuniversitaria de Formacion Del Profesorado; 2020; 23,
25. López, G. (2012). Pensamiento crítico en el aula. Docencia e Investigación, 22, 41–60. https://www.educacion.to.uclm.es/pdf/revistaDI/3_22_2012.pdf
26. Maggio, M. (2018). XIII Foro Latinoamericano de Educación Habilidades del siglo xxi. Cuando el futuro es hoy (Fundación Santillana, Ed.; 1st ed.). http://www.codajic.org/sites/default/files/sites/www.codajic.org/files/XIII-Foro-Documento-Basico-WEB.pdf
27. Murray, T. (2003). Authoring Tools for Advanced Technology Learning Environments (T. Murray, S. B. Blessing, & S. Ainsworth, Eds.). Springer Netherlands. https://doi.org/10.1007/978-94-017-0819-7
28. Naessens, H. (2015a). Comparación entre dos autores del pensamiento crítico: Jacques Boisvert y Richard Paul-Linda Elde. Temas de Historia y Discontinuidad Sociocultural En México, 1, 207–225. http://ri.uaemex.mx/handle/20.500.11799/57993
29. Naessens, H. (2015b). Comparación entre dos autores del pensamiento crítico: Jacques Boisvert y Richard Paul-Linda Elder. Temas de Historia y Discontinuidad Sociocultural En México, 207–225. http://ri.uaemex.mx/bitstream/handle/20.500.11799/57993/CAP 10 COMPARACION.pdf?sequence=1&isAllowed=y
30. Ahuja, N. J., & Sille, R. (2012). A Critical Review of Development of Intelligent Tutoring Systems: Retrospect, Present and Prospect.
31. Nosich, G. (2011). Learning to think things through: a guide to critical thinking across the curriculum (Pearson Prentice Hall, Ed.; 4th ed.).
32. Orejudo, S. (2006). Reseña de “CALIDAD DEL APRENDIZAJE UNIVERSITARIO” de J. Biggs. Revista Interuniversitaria de Formación Del Profesorado, 20, 327–331. https://www.redalyc.org/pdf/274/27411311022.pdf
33. Paul, R., & Elder, L. (2005). La mini-guía para el Pensamiento crítico Conceptos y herramientas (Foundation for Critical Thinking, Ed.; Vol. 1). https://www.criticalthinking.org/resources/PDF/SP-Comp_Standards.pdf
34. Phobun, P; Vicheanpanya, J. Adaptive intelligent tutoring systems for e-learning systems. Procedia Soc Behav Sci; 2010; 2,
35. Roca Llobet, J. (2013). El desarrollo del Pensamiento Crítico a través de diferentes metodologías docentes en el Grado en Enfermería [Universitat Autònoma de Barcelona ]. https://dialnet.unirioja.es/servlet/tesis?codigo=84936
36 Rodríguez Chávez, MH. Sistemas de tutoría inteligente y su aplicación en la educación superior. RIDE Revista Iberoamericana Para La Investigación y El Desarrollo Educativo; 2021; [DOI: https://dx.doi.org/10.23913/ride.v11i22.848]
37. Rodriguez Sandoval, MT; Bernal Oviedo, GM; Rodriguez-Torres, MI. From preconceptions to concept: the basis of a didactic model designed to promote the development of critical thinking. Int J Educ Res Open; 2022; 3, [DOI: https://dx.doi.org/10.1016/j.ijedro.2022.100207]
38. Saiz, C. (2017). Pensamiento crítico y cambio (Pirámide (Grupo Anaya S.A.), Ed.; Vol. 1).
39. Velásquez Burgos, BM; Remolina de Cleves, N; Calle Márquez, MG. Habilidades de pensamiento como estrategia de aprendizaje para los estudiantes universitarios. Revista de Investigaciones UNAD; 2013; 12,
40. Zhu, M; Qiu, L; Zhou, J. Meta-path structured graph pre-training for improving knowledge tracing in intelligent tutoring. Expert Syst Appl; 2024; 254, [DOI: https://dx.doi.org/10.1016/j.eswa.2024.124451]
© The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd. 2024. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.