1. Introduction
With the arrival of ChatGPT 4.0, people have felt the power and convenience of natural language processing technology. The popularity and advancement of artificial intelligence technology have been significantly aided by the public release and promotion of ChatGPT [1]. Because ChatGPT is a large language model based on deep learning technology, it is regarded as a significant advance in artificial intelligence techniques. Moreover, since robots themselves belong to the broad realm of AI applications, they naturally benefit from this technological progress [2]. Robots designed to develop students’ analytical, creative, and practical skills in structured, hands-on disciplines such as Science, Technology, Engineering, and Mathematics (STEM) are called educational robots. These robots are characterized by teaching applicability, openness, scalability, and friendly human–computer interaction [3]. Their introduction into the educational field, which focuses on structured learning and the development of practical skills, can substantially contribute to the development and transformation of education [4] and help promote sustainable learning [5]. For example, in English classes, robots are used to help students practice speaking, listening, and grammar, and to provide personalized learning advice [6]; Cozmo robots, designed by Anki, provide students with an open programming environment [7]; and Google Classroom has begun to use educational robotics to provide students with online learning tools and resources [8]. These robots can provide a more personalized learning experience by automatically adapting the learning material and level of difficulty based on students’ performance and needs as they interact with them. This approach not only makes it easier for students to acquire knowledge but also enhances their ability to learn independently and continuously.
Social education has long promoted a commitment to developing people’s willingness and ability to learn, so that they can continually adapt to new demands for knowledge and skills and improve themselves. The integration of technology through artificial intelligence can bring new opportunities and challenges to the education field, which focuses on structured learning and developing practical skills [9], offering new possibilities for the development of sustainable learning and helping society to support the growth of the younger generation.
At the same time, it is important to note that while educational robots can enhance students’ interest and ability to learn, they cannot completely replace the human interaction and emotional communication provided by human teachers [10]. Educational robots excel at providing personalized learning support, helping students understand concepts, answering questions, and automating assessment and feedback. However, the scenarios in which robots can be used are limited when it comes to complex emotional support, ethics and moral education, and other domains [10]. These domains typically require deep human interaction and emotional communication from human teachers, as well as understanding and respect for individual differences.
Since they make up the majority of robots used in educational applications, assistive teaching robots receive most of the attention from users and makers [11]. Assistive teaching robots are robots that assist teachers with classroom support or repetitive tasks [12]. Humanoid robots such as NAO, SAYA, Bioloid, ASIMO, Eddie, and RoboTutor play a significant role as teaching assistants in academic environments [13,14]. At present, robots come in many varieties, with a wide range of products on the market and very different areas of application. From the point of view of robot manufacturing, although the market shows a flourishing trend, there is obvious homogenization in terms of underlying technology, modeling, and functions [15,16], which leads users to wonder which robot to use in teaching practice, or whether they should simply follow the trends. As a result, the market’s diverse needs cannot be met effectively. This inability to meet different demands hampers the development and proper use of educational robots, and it is not conducive to enhancing learners’ capacity for sustainable learning. At the same time, the growing use of assistive teaching robots in education means more options are available today [17]. While the use of educational robots in relatively structured and explicit educational environments, such as STEM fields and basic subjects, can have a positive impact, we still need to be aware of their limitations. When evaluating the use of educational robots, we need to consider both their positive and potential negative impacts. An evaluation framework that supports efficient development and assessment is therefore needed to help with the selection.
However, the lack of a framework with extensive coverage and a high degree of application and the fragmented functional evaluation criteria in the current literature have prevented this issue from being fully addressed [12,18,19].
This study proposes a functional evaluation paradigm with broad applicability to close this research gap. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) method and the Analytic Network Process (ANP), combined as the DANP method, form this study’s hybrid multi-criteria decision-making (MCDM) model, which captures the interdependencies between evaluation dimensions and criteria [20,21,22]. The use of robots in the classroom has been studied in the past: the NAO robot has been used to help children learn English [23]; Zhexenova et al. (2020) [24] found that students’ writing skills and confidence improved after interacting with the CoWriter robot; and other studies have introduced the SAYA robot into the classroom to support a teleclassroom system [25]. In general, these studies have focused on specific applications, and methods to support robot development and evaluation are still lacking. At the same time, because continuous learning is key to personal growth and development, and is closely related to the sustainable development of society and the long-term goals of social education, the ability of educational robots to promote continuous learning has become a major concern. It is therefore necessary to organize and develop a set of functional evaluation metrics to direct the development of assistive teaching robot products and to guide the selection of robots in schools.
The following sections make up the remainder of this study. Section 2 reviews the pertinent literature and describes the creation of the evaluation metrics. Section 3 explains how the influential network relationship map (INRM) captures influence relationships and how criterion weights were generated using the DANP method. The data are analyzed in Section 4. The relevant discussion is found in Section 5, and concluding remarks are presented in Section 6.
2. Literature Review
The advent of the digital era and developments in information technology and engineering science have led to the widespread use of robots and related products in various industries. Robots can perform tedious, complicated, and even dangerous tasks more accurately than people [26]. In recent years, robots have also begun to be progressively used in education [27]. There is a clear intersection between robotics and the sustainable development of education [28]: by providing personalized learning support [8], assessment of learning outcomes, and massive learning resources [29,30], robots improve students’ learning effectiveness in a shorter period of time [31], thus actively promoting the sustainable development of education across society. Correspondingly, the improvement in educational effectiveness also drives the continuous iteration and rapid development of robotics [32]. These two elements mutually stimulate and reinforce each other, creating a symbiotic relationship that drives progress.
2.1. Current Status of Research
Educational robots are intelligent systems or agents that integrate information technology and innovative technologies such as computing, sensing, networking, and artificial intelligence to achieve digital modeling and computation of knowledge systems, educational participants, educational scenarios, and educational processes [33]. With the ability to assist teaching, manage teaching, deliver teaching, and even lead teaching [34], educational robots can create an engaging and interactive learning environment by offering students intriguing activities and unique opportunities for class participation [35]. Interacting with educational robots can meet students’ different learning needs, as the robots can provide personalized educational content and support [8]. As a result, educational robots are seen as a valuable tool for raising students’ motivation, interest, and academic performance [36].
With the rapid development of artificial intelligence and natural language processing technologies, educational robots in teaching can be broadly classified into three categories: assistive teaching robots, companion robots, and teaching tools [37].
Robots can perform on a par with or even better than humans in perception, memory, computation, continuous work, and concentration [38]. By storing student learning data in the system, robots can pace teaching based on a student’s proficiency, relieving human teachers of labor-intensive repetitive tasks. Robots are thus most frequently used as teaching assistants in educational settings. Using assistive robots in the teaching and learning process helps take over some of the work teachers typically do. When we look at students’ access to knowledge, it is easy to see that opportunities for one-to-one interaction between students and teachers are quite limited in the traditional classroom environment. This limitation restricts students’ individuality and their ability to develop a deeper understanding of knowledge. In a class with many students, the teacher has to cover a large amount of content in a limited amount of time, making it difficult to provide enough individualized instruction and interaction for each student. Assistive teaching robots can help alleviate teacher shortages [38]. According to Robaczewski and other scholars, NAO robots have been widely employed in education and can aid novices in learning through several built-in programs [39]. They also observed that NAO robots possess motor, functional, and emotional qualities enabling them to perform well as assistive robots. Some researchers used the android robot SAYA and the Madatech robot to compare the effectiveness of robot-assisted teaching within the same course [40]. They ultimately discovered that, with the robot’s help, most students could actively participate in learning activities and interact with the robot assistants throughout the course. By employing the ASIMO robot to teach storytelling to students, Costa and other scholars discovered that the humanoid robot’s gaze behavior significantly aided the development of the students’ narrative abilities [41].
Some scholars use robots as a tool to assist classroom teaching, using the robot’s language recognition and facial expression recognition functions to help students and teachers complete classroom interactions.
In education, the nature of teaching and learning is much more than the transfer of knowledge. One of the aims of education is to develop students’ generic skills, which include critical thinking, creative thinking, problem-solving [42], and a sense of self-directed learning. Although robots excel in perception, memory, persistence, and concentration, they rarely cover all dimensions of education, such as emotions, morality, and social interaction [43]. This is precisely why robots can only be used as tools to assist teachers in the teaching and learning process, where teachers are able to impart human values, inspire students to think creatively, and guide them to become socially responsible citizens. Assistive teaching robots support teachers in terms of knowledge transfer and practice correction, but their greater significance lies in freeing up teachers’ time and energy to focus on those aspects of education that require the involvement of human emotion, intelligence, and flexibility. Robots do not exist to replace teachers [44], but rather to play to their strengths in education and provide more support to teachers, enabling them to guide the development of their students in a more profound way.
Meanwhile, in modern society, knowledge is updated at a rapid pace and students need to have the ability to learn continuously and adapt to changes [45]. Assistive teaching robots help to develop students’ sense of independent learning and continuous learning ability through personalized learning plans, continuous learning tracking, and resource support [46]. This is a gradual process toward lifelong learning, enabling students to adapt to future changes and challenges. Therefore, the use of assistive teaching robots in education is not only to improve efficiency, but also to better meet the diversity of education and the comprehensive needs of students. As part of educational technology, they work with teachers to create a more dynamic and interactive learning environment, helping to cultivate future talents with comprehensive skills.
Robotics facilitates sustainable learning [47], and the number of situations in which assistive teaching robots are used in the classroom is growing, necessitating suitable methods to create and assess these robots [19]. However, research on systems for evaluating the functions of robots provides little information regarding the options for assistive teaching robots; most of the existing work concentrates on industrial robots [48]. Therefore, it is hoped that by drawing on the development of evaluation systems for industrial robots, a practical evaluation system appropriate for educational robots can be explored.
It is not easy to choose the best industrial robot because of the increasing complexity of robotic systems and the growing diversity of robots with varied roles, features, and specifications on the market [49]. Most selection techniques in the literature on industrial robots create functional evaluation metrics using a Multi-Criteria Decision-Making (MCDM) model. MCDM refers to a situation with a finite or infinite set of alternatives that must be evaluated against multiple, often conflicting criteria. The fundamental goal of this approach is to arrive at a value judgment based on consensus [50,51]. Numerous factors, including positioning accuracy, cost, flexibility, load capacity, human–machine interface, and vendor service quality [48], must be taken into account during the industrial robot selection process [52,53]. Therefore, the selection of industrial robots can in general be considered a multi-criteria decision problem, and manufacturers and business organizations can use the MCDM approach to resolve various robot selection and evaluation problems [54]. The Analytic Hierarchy Process (AHP), first put forth by Saaty (1980) [55] for assessing, rating, ranking, and evaluating decision alternatives, is one of the most well-known and widely used MCDM techniques. The fundamental steps of the AHP method include laying out a complex decision problem as a hierarchy, using pairwise comparison techniques to estimate the relative importance of each element at each level, and finally integrating these judgments to create an overall evaluation system for choosing a decision solution [56].
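As a brief illustration of the pairwise comparison step, the sketch below derives AHP priority weights from a hypothetical 3 × 3 comparison matrix on Saaty’s 1–9 scale using the principal eigenvector; the matrix values are invented for demonstration only.

```python
import numpy as np

# Hypothetical pairwise comparison matrix on Saaty's 1-9 scale:
# entry (i, j) states how much more important criterion i is than j,
# with reciprocal entries by construction.
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# The priority weights are the principal right eigenvector of P,
# normalized so the weights sum to one.
eigvals, eigvecs = np.linalg.eig(P)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
w = w / w.sum()
print(w)
```

Normalizing by the (possibly negative) eigenvector sum also fixes the arbitrary sign returned by the eigensolver.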
By determining the weights of criteria at various levels through AHP to assess the performance of various robot designs, Goh (1997) [57] and Geng (2013) [58] evaluated the value of various industrial robot alternatives. An integrated model combining the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD) was presented to determine, from the standpoint of needs, whether the functional deployment of robots in industry improves performance [56]. Wang et al. (2022) [59] used a linguistic assessment scale based on a spherical fuzzy set, SF-AHP, to allow decision makers to freely express their judgments in decision making. Kapoor and Tak (2005) [60] proposed a method of using “fuzzy linguistic variables” instead of numbers to solve the industrial robot selection problem by combining the Analytic Hierarchy Process (AHP) and Multi-Attribute Utility Theory (MAUT), two conventional multi-criteria decision-making techniques. They used a fuzzification process, tying linguistic variables to the values of membership functions and developing suitable decision rules. Defuzzification, the final step in the process, converts fuzzy outputs into exact values and produces results as fuzzy scores.
Although AHP is frequently utilized as an MCDM technique for evaluating industrial robot functions, it is predicated on the assumption that the dimensions and criteria of a system are independent [61]. In practice, the dimensions and criteria of an evaluation system rarely function independently; instead, they frequently interact. The Analytic Network Process (ANP), a development of the AHP methodology, has successfully resolved several real-world decision problems, including project selection, product planning, supply chain management, and optimal scheduling. A well-organized functional evaluation and selection metrics system is needed to choose the best robot for a particular domain and to analyze fuzzy selection choices. Combined with the ANP approach, DEMATEL can serve producers, decision makers, and users as a field decision-making guide.
2.2. The Functional Evaluation Index System of Assistive Teaching Robots
In order to select suitable robots for teaching and to evaluate the effectiveness of robots in facilitating sustainable learning, it is necessary to consider robotics in different educational scenarios [30] and the corresponding teaching standards, objectives, content, and environments. Even though earlier studies have emphasized educational level and learning subjects as two crucial factors in selecting educational robots, an implementable evaluation framework to guide future research is still lacking [21,22,23,24,25,26,27]. Most studies have concentrated on theory, design, development, practice, and reflection, but there has been little academic research on the functional analysis and evaluation of educational robots. Some scholars have pointed out that, given constrained educational budgets, the significance of the educational purpose is the aspect that needs the most attention for assistive teaching robots [38]; Chang et al. (2010) [62] argue for focusing on the appearance and body construction of teaching assistant robots, with movable robotic arms or suitable display positions designed to enable a variety of interactions between teachers and students; Tsiakas et al. (2018) [63] divided the functional evaluation metrics for assistive teaching robots into four categories, perceptual functions, behavioral control, classroom assistance, and personalization, to assess the effects of robot appearance, verbal and nonverbal behavior, service, and communication style on educational instruction. To further improve the autonomous capabilities of the assistive teaching robot, Cooney and Leister (2019) [38] built a more comprehensive functional evaluation system through an exploratory experimental study of the robot’s capabilities.
Other researchers used a structural equation modeling approach to design a quantitative model of the instructional design process and proposed a functional design framework for robot-assisted instruction based on constructivism [64]. However, related research has proposed only a few surface-level evaluation dimensions and has not taken into consideration the network relationships of the assessment system or the influence relationships between evaluation indicators.
The quality evaluation index system for the assistive teaching robot was developed in this study based on a substantial amount of domestic and international literature on the test evaluation and related design of the assistive teaching robot in four aspects: system structure, appearance interface, teaching function, and auxiliary support. The 17 secondary indicators under the dimension level are as follows:
2.2.1. System Structure (X1)
A robotic system comprises hardware, operating system software, and a physical body. The robot is a typical complex system, and although it should be highly reliable in theory, Merlet (2009) claimed that variations in how the robot’s mechanical components are made inherently cause uncertainty in the robot. As a result, analytical methods are required to guarantee that the robot is reliable in a pedagogical use environment [65]. Functional diversity and ease of use can improve teaching effectiveness and expand the range of services teaching assistants offer, while a manageable, structurally compact design makes the robot easier to deploy. According to Yoshino and Zhang (2018) [66], the reliability and security of the system should be the primary considerations when evaluating the indicators, because the user experience improves when the robot system operates stably, the authentication service system is comprehensive, and the firewall provides security protection. How quickly system functions respond also impacts the user experience, according to Yu et al. (2020) [67] and a related article published by the China Education Equipment Industry Association.
Furthermore, easy-to-use systems and programs can maintain their appeal to students and sustain their interest and motivation to learn. Using the ROBOSEM robot as an example, Park et al. (2011) [68] emphasized the relevance of switching between the remote-operation and autonomous-control features offered by assistive teaching robots, underlining that system flexibility may lead to a superior qualitative experience. In summary, according to the literature mentioned above, the system structure dimension’s metrics include system reliability, security, flexibility, functional diversity, and operational convenience.
2.2.2. Appearance Interface (X2)
Fong and other scholars have divided robots into four categories based on their appearance, namely, anthropomorphic, exaggerated, caricatured, and functional [12]. It turns out that children under the age of 9 pay close attention to the appearance of robots. Children’s attitudes toward robots that resemble humans are better than their attitudes toward robots that are simply machines [69], but support for robots that too closely resemble humans declines dramatically. Ryu et al. (2007) [69] explored the design function of assistive teaching robots and found that the robot’s physical characteristics and role models significantly impacted learners. As a result, an esthetically pleasing, comprehensive, and compact design of the outer structure is a critical assessment criterion. According to Yang and Wang (2020) [70], the interactive interface is the element that people find most intuitive in judging service quality and user satisfaction, and the ease with which an interface’s features can be used directly impacts the user experience. According to a related article by Tsiakas et al. (2018) [63], custom extension plugins can increase the diversity of the educational process. According to the Technical Specification for Teaching Robots in Primary and Secondary Schools, teaching assistants should have a modular structure and flexibly mountable and expandable external device interfaces. To better assess the appearance interface, the degree of equipment expandability is also taken into account in this study.
2.2.3. Teaching Function (X3)
Huijnen et al. (2017) [71] suggest that assistive robots need to provide students with content that is compatible with their learning methods and study habits, improve student learning through personalized instruction, and get them more actively involved in learning. Appropriate use of robotics in the teaching and learning process is expected to improve the effectiveness of teaching and learning, enhance students’ motivation and interest in learning [72], and ultimately contribute to the promotion of lifelong sustainable learning. Therefore, the teaching function of assistive teaching robots has become one of the most important dimensions for measuring their functionality. Many schools have introduced educational robots as teaching aids, such as the Pepper robot, which is able to interact with students, introduce course content, and answer questions [73]. Together with the appearance interface and auxiliary support, the teaching function is one of the main factors that determine the quality of an assistive teaching robot. Consequently, the instructional design content that the assistive teaching robot holds must follow the rules of effective teaching and learning, such as standardization, applicability, and completeness. According to Wu et al. (2015) [74], because educational robots must have higher levels of intelligence and a greater variety of knowledge, assistive teaching robots must be able to create various lesson plans to suit curriculum variations. To be an effective teaching assistant, robots should assist teachers in the classroom by adjusting the pace of teaching and learning to create a more efficient and effective teaching environment [75]. They should also be able to meet the needs and preferences of different students according to their diverse characteristics. According to the literature listed above, the teaching function dimension includes teaching content standardization, teaching content applicability, teaching content integrity, and teaching process control.
2.2.4. Auxiliary Support (X4)
According to Hong and Huang (2016) [76], the RALL system framework, which offers teacher aids, scripted materials for writing, scripted materials that can be easily imported into multimedia resources, and scripted presentations for anthropomorphic robots, can improve the resource management capabilities of the assistive teaching robot. According to Cross et al. (2009) [77], a robot’s expression, language, actions, emotions, and ability to communicate are distinctive and dynamic in human–robot interaction; the ability to express verbal engagement, in particular, affects the entire classroom. According to Louie et al. (2017) [75], assistive teaching robots are created to address the need for teacher support in schools, particularly in circumstances where teachers struggle to handle things on their own and where teaching assistants are required to perform repetitive tasks and manage sizable amounts of redundant data resources in the teachers’ stead. Park et al. (2011) [68] used ROBOSEM assistive teaching robots to pronounce words and assess pronunciation, identify students with RFID tags or markers, and record student credits while retrieving portfolios and voices using microphones. The robot also provides interaction and real-time feedback with students to enhance the learning experience. As a result, regulatory capability, voice interaction capability, resource management capability, and learning support capability all form part of auxiliary support.
Table 1 below lists the sources of this article’s dimensions and indicators.
3. Methods
To provide a system of robotic solutions and selection indicators that meets different educational needs and is oriented toward sustainable learning, this study draws on the relevant research results of Hsu (2012) [80] and Govindan (2015) [81] in this section. The DEMATEL method is extended by incorporating the Analytic Network Process (ANP), yielding the DANP method. This approach allows for a more detailed quantification of factor weights using the total influence matrix. Additionally, the study investigates the relationships among the elements within the system, thereby providing a deeper understanding of their interconnections. The network relationships between different assessment dimensions are made explicit, successfully extending the analytical reach of the DEMATEL methodology while simplifying the pairwise comparisons required by the ANP method. The steps of the computation process are described as follows.
3.1. The DEMATEL Method
This study uses the DEMATEL method to determine the interdependence of variables/criteria and to reduce the dependency on features representing the underlying system and development patterns. The interrelationships between the elements that went into creating the INRM can also be determined through it [82,83]. The steps of the method are described as follows:
Step 1: Calculate the average direct-relation matrix. Respondents to the expert questionnaire were asked to rate the degree of direct influence between each pair of criteria on a 0–4 scale corresponding to “no impact”, “low impact”, “average impact”, “strong impact”, and “very strong impact”. The ratings for the same criterion pair were then averaged across all H respondents to produce the average direct-relation matrix A in Equation (1), where n is the number of criteria and a_{ij} is the average influence of criterion i on criterion j. Thus

A = [a_{ij}]_{n×n}, a_{ij} = (1/H) Σ_{h=1}^{H} a_{ij}^{(h)} (1)
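The element-wise averaging of Equation (1) can be sketched as follows; the three 4 × 4 expert rating matrices (on the 0–4 scale, with a zero diagonal) are hypothetical.

```python
import numpy as np

# Hypothetical 0-4 influence ratings from H = 3 experts over n = 4
# criteria; the diagonal (self-influence) is zero by convention.
expert_matrices = [
    np.array([[0, 3, 2, 1],
              [2, 0, 3, 2],
              [1, 2, 0, 3],
              [2, 1, 2, 0]], dtype=float),
    np.array([[0, 4, 2, 2],
              [3, 0, 2, 1],
              [1, 3, 0, 2],
              [2, 2, 3, 0]], dtype=float),
    np.array([[0, 2, 3, 1],
              [2, 0, 4, 2],
              [2, 2, 0, 2],
              [3, 1, 2, 0]], dtype=float),
]

# Equation (1): element-wise mean over the expert panel.
A = np.mean(expert_matrices, axis=0)
```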
Step 2: The product of A and the scaling factor s yields the normalized initial direct-influence matrix D = [d_{ij}]_{n×n}, where n represents the total number of influencing factors in the system and d_{ij} represents the normalized strength of factor i’s influence on factor j.

D = sA (2)

s = min{ 1 / max_i Σ_{j=1}^{n} a_{ij}, 1 / max_j Σ_{i=1}^{n} a_{ij} } (3)
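A minimal sketch of the normalization in Equations (2) and (3), using a hypothetical average direct-relation matrix A.

```python
import numpy as np

# Hypothetical average direct-relation matrix A from Equation (1).
A = np.array([[0.0, 3.0, 2.3, 1.3],
              [2.3, 0.0, 3.0, 1.7],
              [1.3, 2.3, 0.0, 2.3],
              [2.3, 1.3, 2.3, 0.0]])

# Equation (3): scaling factor from the largest row and column sums.
s = min(1.0 / A.sum(axis=1).max(), 1.0 / A.sum(axis=0).max())

# Equation (2): normalized initial direct-influence matrix.
D = s * A
```

After scaling, every row sum and column sum of D is at most 1, which guarantees that the power series used in the next step converges.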
Step 3: Equation (4) is used to generate the total influence matrix T = [t_{ij}]_{n×n}, where element t_{ij} represents the direct and indirect influence of criterion i on criterion j. Therefore

T = D + D^2 + D^3 + ⋯ = D(I − D)^{−1} (4)
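Equation (4) can be computed in closed form rather than by summing the power series; the matrix D below is hypothetical, with all row sums below one so that the series converges.

```python
import numpy as np

# Hypothetical normalized direct-influence matrix D (row sums < 1,
# so the geometric series D + D^2 + ... converges).
D = np.array([[0.00, 0.35, 0.30, 0.20],
              [0.30, 0.00, 0.35, 0.25],
              [0.20, 0.30, 0.00, 0.30],
              [0.30, 0.20, 0.30, 0.00]])

# Equation (4): T = D (I - D)^{-1} sums all direct and indirect
# influence effects in closed form.
I = np.eye(D.shape[0])
T = D @ np.linalg.inv(I - D)
```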
where lim_{m→∞} D^m = [0]_{n×n} and I is the identity matrix. Step 4: The row sums (r) and column sums (c) of the matrix T are calculated as follows:

r = (r_i)_{n×1} = [ Σ_{j=1}^{n} t_{ij} ]_{n×1} (5)

c = (c_j)_{n×1} = ( [ Σ_{i=1}^{n} t_{ij} ]_{1×n} )′ (6)
The overall influence that criterion j receives from the other criteria is represented by element c_j in vector c. Similarly, r_i indicates how strongly factor i impacts the other criteria, both directly and indirectly.
Step 5: The total influence matrix T is expressed at the criterion level and at the dimension level, yielding matrices T_C and T_D, respectively. The dimension-level matrix T_D is produced by averaging the criterion-level influence values within each dimension block of T_C:

T_D = [t_D^{ij}]_{m×m}, where t_D^{ij} is the average of the elements in block (i, j) of T_C (7)
Step 6: Obtain the INRM. The sum r_i + c_i depicts the total intensity with which factor i gives and receives influence, while r_i − c_i displays the net influence of factor i on the other components. Factor i is a causal component if r_i − c_i is positive, and an affected component if r_i − c_i is negative. The data set (r_i + c_i, r_i − c_i) can therefore be mapped to create the INRM.
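Steps 4–6 can be sketched together: the row and column sums of a hypothetical total influence matrix T give the (r_i + c_i, r_i − c_i) coordinates used to draw the INRM.

```python
import numpy as np

# Hypothetical total influence matrix T for four criteria.
T = np.array([[0.18, 0.25, 0.22, 0.20],
              [0.21, 0.17, 0.24, 0.23],
              [0.15, 0.22, 0.16, 0.19],
              [0.20, 0.18, 0.21, 0.15]])

# Equations (5)-(6): influence given (row sums) and received (column sums).
r = T.sum(axis=1)
c = T.sum(axis=0)

# Step 6: INRM coordinates for each criterion.
prominence = r + c   # r_i + c_i: total involvement of the criterion
relation = r - c     # r_i - c_i: positive -> cause, negative -> effect
```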
3.2. Getting ANP Weights Using the DEMATEL Method
This study uses the DEMATEL approach to establish the interrelationships between the elements required to produce the INRM; the steps are briefly listed below. ANP is the procedure for creating unweighted supermatrices and determining factor weights. ANP, as opposed to AHP, considers the connections and interdependencies between numerous aspects or criteria [84]. The original ANP strategy, however, suffers from three major problems. First, before utilizing the network analysis method, one must assume a relational structure for the evaluation system. Second, the complexity of the questionnaire makes it difficult to compare two indicators simultaneously [85], which may make the results challenging to grasp. Third, giving each cluster exactly the same weight is implausible, given that the level of influence varies between dimensions or clusters [86].
It should be noted that the DEMATEL-based ANP approach can be used to address these three problems. The unweighted supermatrix is then normalized using the ANP method after the DEMATEL method determines the extent of influence between dimensions. These are the precise steps:
Step 7: Obtain the unweighted supermatrix. A matrix $T_{C}^{\alpha}$ is obtained by normalizing $T_{C}$:
$T_{C}^{\alpha} = \left[ T_{C}^{\alpha ij} \right]_{m \times m}$(8)
A submatrix $T_{C}^{11}$ of $T_{C}$, for instance, can be normalized to $T_{C}^{\alpha 11}$ as follows:
$d_{i}^{11} = \sum_{j=1}^{m_{1}} t_{C}^{11ij}$(9)
$T_{C}^{\alpha 11} = \left[ t_{C}^{11ij} / d_{i}^{11} \right]$(10)
where $i = 1, 2, \ldots, m_{1}$.
The unweighted supermatrix $W$ is obtained by transposing the normalized matrix $T_{C}^{\alpha}$ as follows:
$W = (T_{C}^{\alpha})'$(11)
$W = \left[ W^{ij} \right]_{m \times m}, \quad W^{ij} = (T_{C}^{\alpha ji})'$(12)
In Equations (11) and (12), the rows and columns of $W$ are indexed by the dimensions and their criteria, where $D_{i}$ denotes the $i$th dimension and $c_{ij}$ denotes the $j$th criterion of the $i$th dimension.
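Step 7 can be sketched as follows. The 4 × 4 criteria matrix T_C and the two-criteria-per-dimension grouping are toy assumptions, not the study's 17-indicator system; each dimension submatrix is row-normalized (Equations (9) and (10)) and the result is transposed (Equation (11)).

```python
import numpy as np

# Toy criteria-level total influence matrix: 4 criteria in two
# dimensions of 2 criteria each (illustrative values only).
T_C = np.array([
    [0.2, 0.3, 0.1, 0.4],
    [0.3, 0.2, 0.2, 0.3],
    [0.1, 0.1, 0.2, 0.2],
    [0.2, 0.2, 0.3, 0.1],
])
blocks = [(0, 2), (2, 4)]   # (start, stop) index range of each dimension

T_alpha = T_C.copy()
for (ri, rj) in blocks:          # row block of one dimension
    for (ci, cj) in blocks:      # column block of one dimension
        sub = T_C[ri:rj, ci:cj]
        # Equations (9)-(10): divide each row of the submatrix by its sum.
        T_alpha[ri:rj, ci:cj] = sub / sub.sum(axis=1, keepdims=True)

# Equation (11): the unweighted supermatrix is the transpose of T_C^alpha.
W = T_alpha.T
```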
Step 8: A weighted supermatrix is derived. As in Step 5, the matrix $T_{D}$ is generated by averaging the degree of influence of the criteria on each dimension:
$t_{D}^{ij} = \frac{1}{m_{i} \times m_{j}} \sum_{u=1}^{m_{i}} \sum_{v=1}^{m_{j}} t_{C}^{ij,uv}$(13)
where $m_{i}$ is the number of criteria in dimension $i$ and $m_{j}$ is the number in dimension $j$. The matrix $T_{D}$ is then normalized to $T_{D}^{\alpha}$ by dividing each element by its row sum $d_{i} = \sum_{j=1}^{m} t_{D}^{ij}$:
$T_{D}^{\alpha} = \left[ t_{D}^{ij} / d_{i} \right]_{m \times m} = \left[ t_{D}^{\alpha ij} \right]_{m \times m}$(14)
The weighted supermatrix $W^{\alpha}$ is created by multiplying the unweighted supermatrix $W$ by $T_{D}^{\alpha}$ as follows:
$W^{\alpha} = T_{D}^{\alpha} W$(15)
$W^{\alpha} = \left[ t_{D}^{\alpha ji} \times W^{ij} \right]_{m \times m}$(16)
Step 9: Determine the DANP weights. The weighted supermatrix $W^{\alpha}$ is raised to successively higher powers until it converges to a stable limit; the columns of the limit supermatrix give the DANP weights:
$\lim_{z \to \infty} (W^{\alpha})^{z}$(17)
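Steps 8 and 9 can likewise be sketched with toy numbers. W, T_D_alpha, and the block layout below are assumptions chosen so that the weighted supermatrix is column-stochastic and its powers converge; none of the values come from the study.

```python
import numpy as np

# Toy unweighted supermatrix W (Step 7 output): two dimensions of two
# criteria each; within every column, each dimension block sums to 1.
W = np.array([
    [0.4, 0.6, 0.5, 0.3],
    [0.6, 0.4, 0.5, 0.7],
    [0.3, 0.5, 0.2, 0.6],
    [0.7, 0.5, 0.8, 0.4],
])
# Toy row-normalized dimension influence matrix T_D^alpha (Equation (14));
# each row sums to 1.
T_D_alpha = np.array([
    [0.5, 0.5],
    [0.4, 0.6],
])
blocks = [(0, 2), (2, 4)]

# Equations (15)-(16): weight block (i, j) of W by t_D^{alpha ji}.
W_alpha = W.copy()
for bi, (ri, rj) in enumerate(blocks):
    for bj, (ci, cj) in enumerate(blocks):
        W_alpha[ri:rj, ci:cj] *= T_D_alpha[bj, bi]

# Equation (17): raise the weighted supermatrix to a high power; every
# column of the limit converges to the same DANP weight vector.
limit = np.linalg.matrix_power(W_alpha, 100)
danp_weights = limit[:, 0]
```

Because the weighted supermatrix is column-stochastic, its powers converge to a matrix with identical columns, and any column can be read off as the stable DANP weights.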
4. Data Analysis
4.1. Relationship between Metric Dimensions and Indicators
According to Table A1 (see Appendix A), a questionnaire was developed to obtain the initial direct influence matrix by comparing the level of influence between any two indicators. A total of 10 experts, including university professors, specialists in robotics-related fields, and primary and secondary school teachers familiar with assistive robotics, were invited to form an advisory group to score the influencing-factor indicators at each level. Using the principle of maximum membership, this study aggregated the results to determine the level of influence between each pair of indicators. After compiling the questionnaire responses from the 10 experts, the direct influence matrix was produced by applying Equation (2) to convert the triangular fuzzy numbers of the impact factors of the functional assessment indicators into crisp values, as shown in Table A1. Even though the 10 experts did not represent all relevant beneficiaries, the results show good consistency and can, to some extent, reflect reality.
The normalized direct relation matrix is obtained from Equations (2) and (3), and the total influence matrix $T$ is calculated using Equation (4) (refer to Table 2). The influence matrix within each dimension is then averaged to yield the total influence matrix for each dimension, as shown in Table 3. Equations (5) and (6) of Step 4 were then applied to produce the sums of given and received influence between criteria and dimensions, as shown in Table 4.
4.2. Identification and Statistics of Functional Evaluation Indicators for Assistive Teaching Robots
The entire influence matrix of the functional assessment impact indicators can be created using the calculation in Step 5. The influence, influenced, centrality, and cause degrees of each factor at all levels are calculated, as seen in Table 5.
The main influencing factors for the functional evaluation of the assistive teaching robot can be initially identified based on the centrality ranking in Table 5 (continued), including functional diversity (X14), ease of use (X15), device expansion (X24), learning assistance capability (X44), teaching process control (X34), and resource management capability (X43). To more intuitively demonstrate the mutual influence and influenced relationship of each influencing factor in the functional evaluation of the assistive teaching robot, this study combines the data in Table 5, projects the statistical results of the cause degree and centrality of each key influencing factor into a two-dimensional coordinate system, and obtains the cause–effect diagram of the influencing factor of the functional evaluation of the assistive teaching robots as shown in Figure 1. Meanwhile, the dimensional and systematic influence network relationship map (INRM) can be drawn from Table 5, as shown in Figure 2.
According to the four-quadrant diagram of causality, the factors in the first quadrant are the Driving Factors of the entire evaluation model, having the most significant impact on the functional evaluation index system and playing an important role in the whole system [87]. The factors in the second quadrant are known as Supporting factors (Voluntariness), as they play a supporting role in the model. The factors in the third quadrant are known as Independent factors, as they are relatively independent and exert little influence on the model; none of the factors in this study are located in the third quadrant, demonstrating the interdependence of all the factors in the functional index system. The factors in the fourth quadrant are known as Threshold factors (core problems). Table 6 presents the association between the variables in the four quadrants.
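The quadrant rule described above (sign of the cause degree, centrality threshold of 7, per Table 6) can be written as a small helper. The function name `quadrant` is our own illustrative choice; the sample values used below come from Table 5.

```python
def quadrant(centrality, cause_degree, threshold=7.0):
    """Classify a factor by the four-quadrant rule of Table 6."""
    if cause_degree > 0:
        # Positive cause degree: driving if central enough, else supporting.
        return "Driving" if centrality >= threshold else "Supporting"
    # Negative cause degree: core problem if central enough, else independent.
    return "Core Problem" if centrality >= threshold else "Independent"

# Examples with (r + c, r - c) values from Table 5:
# X11 (8.171, 1.481) -> Driving; X21 (5.006, 0.345) -> Supporting;
# X34 (8.492, -1.158) -> Core Problem.
```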
The unweighted supermatrix of the 17 secondary influencing factors was constructed in this study and is displayed in Table 7 to more precisely quantify the weights of the influencing factors for the functional evaluation of the assistive teaching robot.
Based on this, Equation (6) was used to calculate and weigh the comprehensive influence matrix of four first-level influence factors. The weights of every part of the system were then derived by performing limit operations on the weighted supermatrix of the influence factors to assess the assistive teaching robot’s function, as shown in Table 8.
Although most other research constructed robot indicators using the ANP approach [88,89], this study opted to employ the DANP method to determine the significance of indicators. The DANP technique utilizes the DEMATEL and ANP results, in addition to examining the interactions between internal elements of the system, to calculate the criteria weights. The degree of influence between each indicator and dimension is obtained from DEMATEL and is normalized by considering the intensity of the effect of each indicator within its dimension. As a result, the weighted supermatrix (Step 8 of Section 3) takes into account each criterion's share within its dimension and the extent to which it influences the other dimensions. Therefore, the weights of the indicators can be obtained first; the weight of each dimension is then determined by summing the weights of the indicators within that dimension. Consistent results can be obtained while avoiding the time-consuming pairwise comparisons of the original ANP. Table 9 and Table 10 show the component weight values and the total of each component's influences. This study has now quantified the weights of the factors impacting the functional evaluation of the assistive teaching robot.
5. Discussion
5.1. Cause Factor Analysis
An analysis of how the influencing factors interact with one another and of their positions within the system (Table 5 and Figure 1) shows that the appearance interface (X2) has the largest cause degree ($r - c$) value (0.160) and is consequently the most influential dimension. This indicates the importance of the appearance interface in the operation of the educational robot, aligning with Ryu et al.'s (2007) [69] assertion that a robot's external features significantly influence learners. At the same time, the appearance interface will dramatically impact the other dimensions and could have the most significant impact on the overall functional evaluation system [12]. The main cause factors (cause degree values greater than 0) among the functional evaluation influencing factors of the assistive teaching robot are system reliability (X11), clarity of interface presentation (X22), degree of equipment expansion (X24), system security (X12), system flexibility (X13), simplicity of interface interaction (X23), functional diversity (X14), esthetics of the exterior structure (X21), and voice interaction capability (X42). These findings indicate that the above factors are more likely to influence the other factors in the system during the application of the assistive teaching robot, playing a fundamental and safeguarding role.
Additionally, the system structure (X1) has the highest centrality ($r + c$) (1.941), which indicates that it has the greatest overall degree of impact among the dimensions and the strongest ability to influence the other dimensional indicators. Furthermore, Yoshino and other researchers (2018) noted that a system's general stability enhances the user's experience [66,67], which is consistent with the findings of this study. When selecting a robot that is not yet fully understood, the attractiveness of its appearance can affect people's initial willingness to use it, which is referred to as the primacy effect. As the length of use increases and usage deepens, the stable experience that the robot itself provides plays a crucial role in long-term use [66]. As a result, the system structure will influence the user experience brought by other dimensions in the process. System security and flexibility are necessary for delivering high-quality services; device expansion, interface presentation clarity, and functional diversity offer crucial hardware and software guarantees for the assistive teaching robot; and whether or not the user will accept the assistive teaching robot depends on the esthetic appeal of the exterior structure and its voice interaction capability. User acceptance of the assistive teaching robot is also directly correlated with system reliability and the clarity of interface interaction. The decision makers' and users' choice of robot will ultimately be influenced most by the robot's appearance interface and its system structure.
5.2. Outcome Factor Analysis
Similarly, it can be deduced that resource management capability (X43), regulatory capability (X41), operational convenience (X15), teaching content applicability (X32), teaching content standardization (X31), learning support capability (X44), teaching content integrity (X33), and control of the teaching process (X34) are the outcome factors (cause degree value less than 0) among the factors influencing the functional evaluation of the assistive teaching robot. These components, which primarily concern the teaching function and auxiliary support, are directly impacted by the system structure and appearance interface components of the evaluation system. The size of the interface or the system design will influence the richness of teaching-related functions that can be provided on the operating interface. In a related study, Yang and Wang (2020) [70] also suggested that the neatness of the interface features affects the range of functions and assistive support that assistive teaching robots can offer users. The teaching function and auxiliary support are two crucial aspects of the functional evaluation system of assistive robots and are key elements in promoting sustainable learning.
5.3. Importance Analysis of Influencing Factors
According to the weight-value ranking of the factors influencing the functional evaluation of the assistive teaching robot in Table 10, from highest to lowest, the dimensions are teaching function (X3), auxiliary support (X4), system structure (X1), and appearance interface (X2). Among the criteria, five indicators, namely control of the teaching process (X34), learning support capability (X44), regulatory capability (X41), teaching content applicability (X32), and resource management capability (X43), have a significant influence on the application evaluation of assistive teaching robots. This finding is in line with the need for assistive teaching robots to have a higher level of intelligence and a wide range of knowledge for the teaching field [74], while the esthetics of the exterior structure (X21), clarity of interface presentation (X22), and system security (X12) are the least important indicators in the system.
These results support the findings of the DEMATEL analysis, which showed that the teaching function and auxiliary support are two crucial parts of an assistive teaching robot's functional evaluation system. The teaching function and auxiliary support are vital to the functional evaluation of an assistive teaching robot, yet these two dimensions are easily influenced by both the system structure and the appearance interface; although this contrasts with the centrality ranking of the factors presented in Table 5, it reflects the same underlying relationships. The finding that robots with diverse teaching resources, standardized content settings, personalized learning for teachers and students, and convenient support are readily preferred by users is also consistent with the expectations of customized teaching that teaching assistants can provide in related studies [71,75]; the four dimensions intertwine to influence the selection of an assistive teaching robot as a whole. Given that teaching is the main application scenario for teaching robots, the design of teaching functions and the reserve of teaching resources require special consideration during the design and development process, as shown by the ranked factors in Table 5 and the index weights computed in Table 9. For the assistive teaching robot to be a valuable tool for teachers and a helpful learning partner for students, it is also important to consider how the system is built and how the interface is designed during development.
5.4. Theoretical and Practical Implications
In this study, based on integrating the existing literature and verifying the validity of the evaluation indicators, we successfully constructed a functional evaluation indicator system for assistive teaching robots to promote sustainable learning. By applying the hybrid multi-criteria decision-making DANP method, we explored in depth the associations among the indicator dimensions and their mutual influence relationships within the complex evaluation system. This not only provides an implementable and widely adaptable evaluation framework for related research, but also provides a useful reference for the analysis and evaluation of educational robot functions in educational environments that emphasize structure and operability, presenting the network associations of the evaluation system and the complex influences among evaluation indicators from a new perspective.
In practical applications, this study provides a practical evaluation tool within a specific educational field, especially for relatively structured and explicit educational environments such as STEM fields and basic subjects. In these educational environments, the functionality of assistive teaching robots is compatible with the nature of the subject, and their features such as personalized support, content delivery, and learning pace adjustment can significantly enhance teaching and learning.
According to the data analysis findings, it is evident that auxiliary support and teaching functions are the main criteria for judging the functionality of the assistive teaching robot. The design and development of assistive teaching robots can be guided by the core issues of evaluation systems to meet the needs of teachers and students in educational fields that emphasize structured learning and the development of practical skills. The teaching function includes the teaching methods, techniques, and strategies that the assistive robot can provide to the teacher, as well as the ability to personalize the teaching to the student’s needs. Auxiliary support focuses on the support and assistance that the robot can provide in the teaching process beyond what is necessary for the course to proceed, such as answering questions and solving problems, and recommending learning resources. The robot assistant should be able to obtain the latest teaching resources and knowledge in a timely manner and integrate them into the teaching content to meet the evolving learning needs and knowledge updating requirements. Therefore, the evaluation system can provide guidance for the design and development of assistive robots in terms of focusing attention, as well as a basis for selecting robots that promote sustainable competence enhancement to meet the different learning needs of teachers and students, and to support the whole society in developing sustainable learning competence of the next generation of young people.
The final results of the study may, in certain cases, provide targeted recommendations for the development of appropriate robotic products to support the use of assistive teaching robots in instruction and education. The relevant results from the data analysis can address the homogenization problem in educational robot production, and provide a more robust logical basis for the robots selected by manufacturers, decision makers, and users. When developing and designing assistive teaching robots, the emphasis should be on providing practical teaching functions and tailored assistive support to meet the needs of teachers and students for sustainable learning in education. Even though an assistive teaching robot's operator interface, appearance, and practicality impact the user experience, a greater focus should be placed on its individualized assistance support and practical teaching functions. If assistive teaching robots are to be successful in the classroom, they must have access to a wide range of teaching resources, standardized teaching topic environments, and the ability to support individualized learning for teachers and students. Robot manufacturers should concentrate on improving teaching resources, monitoring the quality of teaching content, and improving personalized learning support capabilities during the development process. At the same time, decision makers and users can also use these factors as the primary basis for selecting robots.
6. Conclusions
This study has built an assistive teaching robot functional evaluation indicator system aimed at sustainable learning that can direct product development and serve as a guide for schools in selecting robots. Relevant functional evaluation is increasingly crucial as artificial intelligence technology continuously upgrades and educational robot applications in real-world settings increase in number. This study built a functional evaluation indicator system for assistive teaching robots using 17 indicators, including resource management capability, system reliability, and degree of equipment expansion, and four dimensions: system structure, appearance interface, teaching function, and auxiliary support. The weights of each dimension and indicator that influence the functional evaluation were then determined using the DANP method. Finally, the pertinent influence, outcome, and cause factors were examined. According to the findings, the teaching function is the key evaluation factor that significantly impacts deciding which assistive teaching robot functions to use.
The research results can provide the following contributions: (1) identify suitable functional evaluation indicators for assistive teaching robots; (2) determine the weights of each dimension and indicator in the evaluation system by using advanced models; and (3) provide targeted suggestions for the development and selection of assistive teaching robots based on expert judgments.
6.1. Suggestions for Future Research and Practice
By analyzing cause–effect relationships and influencing factors, and considering the ranking of influencing factors, this study makes the following recommendations for the design and selection of future assistive teaching robots:
6.1.1. Enrich the Teaching Resource Library and Enhance the Ability to Assist in Teaching
Robots with an extensive library of teaching resources can aid in improving teaching in educational settings. The teaching function is the primary determining factor when choosing assistive teaching robots. Teaching robots should follow national or regional education and teaching standards, ensure that the lesson content satisfies the requirements, and provide a library of teaching resources that may include various educational applications, interactive multimedia teaching materials, online courses, and so on, to meet the needs of multiple learners and promote their participation. The integrated teaching resource library should also cover the fundamental and advanced knowledge of the pertinent subjects in-depth and methodically. Teachers can fully utilize the courseware and teaching resources in the teaching resource library to prepare lessons, saving them time and allowing them to focus more on assisting students with their learning. Assistive teaching robots must also offer teaching materials that can be individually adjusted and optimized according to the needs of various learners to promote students’ learning progress and improve their learning effectiveness. Assistive teaching robots should also have the function of effectively managing learning resources, including collecting, classifying, storing and sharing learning resources, ensuring that students can conveniently access the required learning materials, and providing a diversified choice of resources to meet the learning needs of different students. With the rapid development of information technology and the rapid updating and innovation of educational resources, educational robots, as tools capable of providing rich educational resources, have the potential to create a richer educational experience for learners. 
By providing students with diverse learning materials, real-time feedback, and personalized support, educational robots are expected to effectively enhance their motivation and interest in learning, thus helping to strengthen their ability to learn in a sustainable way.
6.1.2. Combine Core Technologies and Facilitate Supportive Assistance
Teaching assistants’ supportive assistance in the classroom can impact the entire teaching process. Robots must have sensors that can provide a variety of application scenarios, natural language processing capabilities, and potent perceptual information processing technologies since the diversity of robot functions must rely on integrating numerous essential technologies. For instance, the Alpha Egg from China’s iFlytek combines speech recognition and sound source localization technology to enable face-to-face communication with users. Due to the variety of data encountered in the practical application of assistive teaching robots in education, there is a need for collaborative efforts among more sophisticated technologies to enhance the pedagogical suitability of educational robots. At the same time, in case of potential misbehavior, the assistive robot needs to have a monitoring and warning function to ensure the order of the educational process and the safety of the learning environment. In practice, assistive teaching robots can use advanced recommendation algorithms and machine learning technologies to recommend learning content and resources tailored to students’ learning needs and interests. Through in-depth analysis of students’ learning history and behavior patterns, it understands and grasps students’ learning tendencies and knowledge needs, and provides them with personalized learning suggestions, thus stimulating students’ active learning interests and sustained motivation. Currently, technologies that can emulate human perception are still in the development stage, and one important technology is multimodal fusion perception. This technology enables robots to obtain more comprehensive and accurate information to support more efficient scientific decision making by assistive teaching robots.
6.1.3. Pay Attention to the Design of Robot Appearance and Interaction Interface
The appearance of the robot has a long-term, direct impact on the user's in-use behavior toward the robot. To lower the barriers to human–computer interaction and to enhance the learners' sense of affinity, it is important to tailor the robot's appearance, expressions, and movements to the user's age group. The interactive interface is the first barrier to human–robot interaction, and smooth interface animations and clear instructions greatly impact the user's experience. However, the design of robot products is frequently uniform, both domestically and internationally, in terms of appearance and user interface, and robots from large manufacturers have a similar appearance and the same operation methods, quickly making users feel fatigued. An emphasis on appearance and interactive interface design will help to resolve this issue and enable learners to interact with the robot more thoroughly and productively. Additionally, robots must be more appealing and targeted to learners' unique requirements and cognitive variations at various levels. In addition, as related artificial intelligence technologies, like bionic technology and intelligent manufacturing technology, mature, the anthropomorphic appearance design of assistive teaching robots will become a significant development trend in the future.
6.2. Limitations
A hybrid of DEMATEL and ANP methods was used to build and analyze the indicators in this study. The DEMATEL method, based on expert analysis and selection of indicator relationships, was used to confirm the interdependence of the variables/criteria, so the data obtained have some subjectivity. This limitation can be addressed if future research incorporates fuzzy-set comparative analysis.
Conceptualization, P.-Y.S.; methodology, P.-Y.S. and Q.-G.S.; investigation, P.-Y.S. and Z.-Y.Z.; data curation, Z.-Y.Z., Q.-G.S. and P.-Y.L.; writing—original draft preparation, P.-Y.S. and Z.-Y.Z.; writing—review and editing, P.-Y.S., Z.-Y.Z. and Z.L.; supervision, Q.-G.S. and Z.L.; funding acquisition, P.-Y.S. All authors have read and agreed to the published version of the manuscript.
The authors are thankful for the help and technical support of other authors and appreciate the financial support provided by the foundations.
The authors declare no conflict of interest.
Figure 2. Network of systematic impact and the four aspects’ relationship diagram.
Literature sources for the assistive teaching robots' functional evaluation index system.
Dimensions | Indicators | Literature Sources |
---|---|---|
System Structure (X1) | System Reliability (X11) | […] |
 | System Security (X12) | […] |
 | System Flexibility (X13) | […] |
 | Functional Diversity (X14) | […] |
 | Operational Convenience (X15) | […] |
Appearance interface (X2) | Esthetics of the Exterior Structure (X21) | […] |
 | Clarity of Interface Presentation (X22) | […] |
 | Simplicity of Interface Interaction (X23) | […] |
 | Degree of Equipment Expansion (X24) | […] |
Teaching function (X3) | Teaching Content Standardization (X31) | […] |
 | Teaching Content Applicability (X32) | […] |
 | Teaching Content Integrity (X33) | […] |
 | Control of the Teaching Process (X34) | […] |
Auxiliary Support (X4) | Regulatory Capability (X41) | […] |
 | Voice Interaction Capability (X42) | […] |
 | Resource Management Capability (X43) | […] |
 | Learning Support Capability (X44) | […] |
Total influence matrix.
Z | X 11 | X 12 | X 13 | X 14 | X 15 | X 21 | X 22 | X 23 | X 24 | X 31 | X 32 | X 33 | X 34 | X 41 | X 42 | X 43 | X 44 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
X 11 | 0.20 | 0.28 | 0.31 | 0.31 | 0.35 | 0.17 | 0.18 | 0.24 | 0.31 | 0.30 | 0.30 | 0.30 | 0.35 | 0.34 | 0.28 | 0.31 | 0.34 |
X 12 | 0.24 | 0.16 | 0.22 | 0.26 | 0.28 | 0.13 | 0.14 | 0.20 | 0.26 | 0.25 | 0.26 | 0.26 | 0.30 | 0.30 | 0.23 | 0.26 | 0.27 |
X 13 | 0.22 | 0.21 | 0.18 | 0.30 | 0.32 | 0.15 | 0.15 | 0.23 | 0.27 | 0.26 | 0.26 | 0.25 | 0.30 | 0.29 | 0.23 | 0.26 | 0.29 |
X 14 | 0.26 | 0.24 | 0.29 | 0.26 | 0.35 | 0.17 | 0.19 | 0.27 | 0.31 | 0.29 | 0.31 | 0.30 | 0.35 | 0.33 | 0.27 | 0.31 | 0.34 |
X 15 | 0.22 | 0.21 | 0.25 | 0.28 | 0.24 | 0.15 | 0.16 | 0.24 | 0.25 | 0.24 | 0.26 | 0.25 | 0.31 | 0.28 | 0.23 | 0.27 | 0.29 |
X 21 | 0.12 | 0.11 | 0.14 | 0.17 | 0.22 | 0.08 | 0.16 | 0.19 | 0.16 | 0.17 | 0.17 | 0.17 | 0.19 | 0.17 | 0.13 | 0.16 | 0.17 |
X 22 | 0.16 | 0.15 | 0.17 | 0.22 | 0.26 | 0.19 | 0.11 | 0.23 | 0.20 | 0.21 | 0.23 | 0.22 | 0.24 | 0.22 | 0.18 | 0.21 | 0.23 |
X 23 | 0.21 | 0.19 | 0.25 | 0.28 | 0.32 | 0.19 | 0.19 | 0.18 | 0.23 | 0.25 | 0.27 | 0.25 | 0.31 | 0.26 | 0.22 | 0.25 | 0.28 |
X 24 | 0.27 | 0.27 | 0.29 | 0.35 | 0.36 | 0.17 | 0.17 | 0.26 | 0.24 | 0.27 | 0.30 | 0.29 | 0.33 | 0.32 | 0.28 | 0.30 | 0.34 |
X 31 | 0.15 | 0.16 | 0.16 | 0.21 | 0.23 | 0.12 | 0.13 | 0.18 | 0.18 | 0.17 | 0.25 | 0.24 | 0.26 | 0.23 | 0.18 | 0.24 | 0.25 |
X 32 | 0.14 | 0.14 | 0.16 | 0.20 | 0.23 | 0.10 | 0.11 | 0.18 | 0.18 | 0.23 | 0.18 | 0.24 | 0.26 | 0.24 | 0.18 | 0.24 | 0.27 |
X 33 | 0.16 | 0.15 | 0.18 | 0.22 | 0.25 | 0.11 | 0.13 | 0.19 | 0.21 | 0.25 | 0.26 | 0.19 | 0.27 | 0.25 | 0.19 | 0.26 | 0.28 |
X 34 | 0.16 | 0.16 | 0.19 | 0.22 | 0.26 | 0.11 | 0.13 | 0.19 | 0.20 | 0.25 | 0.26 | 0.26 | 0.22 | 0.28 | 0.21 | 0.26 | 0.29 |
X 41 | 0.19 | 0.21 | 0.20 | 0.26 | 0.27 | 0.12 | 0.13 | 0.20 | 0.22 | 0.26 | 0.27 | 0.26 | 0.30 | 0.22 | 0.23 | 0.26 | 0.29 |
X 42 | 0.20 | 0.19 | 0.22 | 0.28 | 0.31 | 0.12 | 0.14 | 0.22 | 0.25 | 0.24 | 0.26 | 0.25 | 0.29 | 0.28 | 0.18 | 0.25 | 0.28 |
X 43 | 0.22 | 0.22 | 0.24 | 0.29 | 0.30 | 0.13 | 0.15 | 0.22 | 0.25 | 0.26 | 0.27 | 0.27 | 0.28 | 0.28 | 0.22 | 0.22 | 0.30 |
X 44 | 0.20 | 0.20 | 0.22 | 0.27 | 0.28 | 0.13 | 0.14 | 0.20 | 0.24 | 0.24 | 0.26 | 0.25 | 0.27 | 0.27 | 0.22 | 0.26 | 0.22 |
Influence matrix by dimension.
T D | X 1 | X 2 | X 3 | X 4 |
---|---|---|---|---|
X 1 | 0.26 | 0.21 | 0.28 | 0.29 |
X 2 | 0.23 | 0.18 | 0.24 | 0.23 |
X 3 | 0.19 | 0.15 | 0.24 | 0.24 |
X 4 | 0.24 | 0.18 | 0.26 | 0.25 |
Comprehensive influence matrix.
INRM | X 11 | X 12 | X 13 | X 14 | X 15 | X 21 | X 22 | X 23 | X 24 | X 31 | X 32 | X 33 | X 34 | X 41 | X 42 | X 43 | X 44 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
r | 4.83 | 4.04 | 4.15 | 4.81 | 4.14 | 2.68 | 3.42 | 4.14 | 4.81 | 3.31 | 3.27 | 3.54 | 3.67 | 3.88 | 3.96 | 4.14 | 3.86 |
c | 3.35 | 3.27 | 3.61 | 4.36 | 4.82 | 2.33 | 2.49 | 3.62 | 3.97 | 4.12 | 4.37 | 4.24 | 4.82 | 4.54 | 3.67 | 4.31 | 4.74 |
r + c | 8.17 | 7.31 | 7.77 | 9.17 | 8.96 | 5.01 | 5.91 | 7.76 | 8.77 | 7.44 | 7.63 | 7.78 | 8.49 | 8.42 | 7.63 | 8.46 | 8.59 |
r − c | 1.48 | 0.77 | 0.54 | 0.45 | −0.68 | 0.34 | 0.92 | 0.52 | 0.84 | −0.81 | −1.10 | −0.70 | −1.16 | −0.66 | 0.29 | −0.17 | −0.88 |
Influence index and ranking of each dimension.
Dimensions | Influence | Influenced | Centrality | Cause Degree |
---|---|---|---|---|
System Structure (X1) | 1.034(1) | 0.907(3) | 1.941(1) | 0.128(2) |
Appearance Interface (X2) | 0.884(2) | 0.724(4) | 1.607(4) | 0.160(1) |
Teaching Function (X3) | 0.815(3) | 1.026(1) | 1.841(3) | −0.211(3) |
Auxiliary Support (X4) | 0.930(4) | 1.007(2) | 1.938(2) | −0.077(4) |
System Reliability (X11) | 4.826(1) | 3.345(14) | 8.171(8) | 1.481(1) |
System Security (X12) | 4.042(8) | 3.270(15) | 7.312(15) | 0.772(4) |
System Flexibility (X13) | 4.152(4) | 3.615(13) | 7.767(10) | 0.537(5) |
Functional Diversity (X14) | 4.811(2) | 4.361(6) | 9.172(1) | 0.449(7) |
Operational Convenience (X15) | 4.138(7) | 4.821(2) | 8.959(2) | −0.683(12) |
Esthetics of the Exterior Structure (X21) | 2.675(17) | 2.331(17) | 5.006(17) | 0.345(8) |
Clarity of Interface Presentation (X22) | 3.418(14) | 2.493(16) | 5.910(16) | 0.925(2) |
Simplicity of Interface Interaction (X23) | 4.142(6) | 3.621(12) | 7.764(11) | 0.521(6) |
Degree of Equipment Expansion (X24) | 4.807(3) | 3.966(10) | 8.773(3) | 0.841(3) |
Teaching Content Standardization (X31) | 3.314(15) | 4.123(9) | 7.437(14) | −0.809(14) |
Teaching Content Applicability (X32) | 3.266(16) | 4.367(5) | 7.632(12) | −1.101(16) |
Teaching Content Integrity (X33) | 3.541(13) | 4.243(8) | 7.785(9) | −0.702(13) |
Control of the Teaching Process (X34) | 3.667(12) | 4.825(1) | 8.492(5) | −1.158(17) |
Regulatory Capability (X41) | 3.880(10) | 4.544(4) | 8.424(7) | −0.665(11) |
Voice Interaction Capability (X42) | 3.960(9) | 3.666(11) | 7.625(13) | 0.294(9) |
Resource Management Capability (X43) | 4.143(5) | 4.313(7) | 8.456(6) | −0.170(10) |
Learning Support Capability (X44) | 3.857(11) | 4.735(3) | 8.592(4) | −0.879(15) |
Four-quadrant index factor relationship chart.
Quadrants | Title | Factors | Features |
---|---|---|---|
1 | Driving Factor | X11 X12 X13 X14 X23 X24 X42 | The cause degree is positive; the centrality is above 7 |
2 | Voluntariness | X21 X22 | The cause degree is positive; the centrality is below 7 |
3 | Independent | / | The cause degree is negative; the centrality is below 7 |
4 | Core Problem | X15 X31 X32 X33 X34 X41 X43 X44 | The cause degree is negative; the centrality is above 7 |
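The quadrant assignment follows mechanically from the sign of the cause degree and the centrality threshold of 7. A minimal sketch, with the (r + c, r − c) pairs copied from the influence index table above:

```python
# (r + c, r - c) per factor, copied from the influence index table above
factors = {
    "X11": (8.171, 1.481), "X12": (7.312, 0.772), "X13": (7.767, 0.537),
    "X14": (9.172, 0.449), "X15": (8.959, -0.683), "X21": (5.006, 0.345),
    "X22": (5.910, 0.925), "X23": (7.764, 0.521), "X24": (8.773, 0.841),
    "X31": (7.437, -0.809), "X32": (7.632, -1.101), "X33": (7.785, -0.702),
    "X34": (8.492, -1.158), "X41": (8.424, -0.665), "X42": (7.625, 0.294),
    "X43": (8.456, -0.170), "X44": (8.592, -0.879),
}

def quadrant(centrality, cause, threshold=7.0):
    """1: driving factor, 2: voluntariness, 3: independent, 4: core problem."""
    if cause >= 0:
        return 1 if centrality >= threshold else 2
    return 4 if centrality >= threshold else 3

groups = {q: [] for q in (1, 2, 3, 4)}
for name, (cen, cau) in factors.items():
    groups[quadrant(cen, cau)].append(name)

print(groups)
```

Running this reproduces the groupings in the table: quadrant 3 ends up empty, and the seven "core problem" factors all combine a negative cause degree with high centrality.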
Unweighted supermatrix.
Z | X 11 | X 12 | X 13 | X 14 | X 15 | X 21 | X 22 | X 23 | X 24 | X 31 | X 32 | X 33 | X 34 | X 41 | X 42 | X 43 | X 44 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
X 11 | 0.14 | 0.20 | 0.19 | 0.22 | 0.25 | 0.19 | 0.20 | 0.27 | 0.34 | 0.24 | 0.24 | 0.24 | 0.28 | 0.27 | 0.22 | 0.24 | 0.27 |
X 12 | 0.21 | 0.14 | 0.19 | 0.22 | 0.24 | 0.18 | 0.19 | 0.27 | 0.36 | 0.23 | 0.25 | 0.24 | 0.28 | 0.28 | 0.22 | 0.25 | 0.26 |
X 13 | 0.18 | 0.17 | 0.15 | 0.24 | 0.26 | 0.19 | 0.19 | 0.29 | 0.33 | 0.24 | 0.24 | 0.23 | 0.28 | 0.27 | 0.22 | 0.24 | 0.27 |
X 14 | 0.19 | 0.17 | 0.20 | 0.18 | 0.25 | 0.18 | 0.20 | 0.29 | 0.33 | 0.23 | 0.25 | 0.24 | 0.28 | 0.26 | 0.21 | 0.25 | 0.27 |
X 15 | 0.19 | 0.17 | 0.21 | 0.23 | 0.20 | 0.19 | 0.20 | 0.30 | 0.31 | 0.23 | 0.25 | 0.24 | 0.29 | 0.26 | 0.22 | 0.25 | 0.27 |
X 21 | 0.16 | 0.15 | 0.18 | 0.22 | 0.28 | 0.14 | 0.27 | 0.32 | 0.27 | 0.24 | 0.25 | 0.24 | 0.27 | 0.27 | 0.21 | 0.25 | 0.27 |
X 22 | 0.17 | 0.16 | 0.17 | 0.23 | 0.27 | 0.26 | 0.15 | 0.32 | 0.28 | 0.24 | 0.25 | 0.24 | 0.27 | 0.26 | 0.21 | 0.25 | 0.28 |
X 23 | 0.17 | 0.15 | 0.20 | 0.23 | 0.26 | 0.24 | 0.24 | 0.23 | 0.29 | 0.23 | 0.25 | 0.23 | 0.29 | 0.26 | 0.21 | 0.25 | 0.28 |
X 24 | 0.17 | 0.17 | 0.19 | 0.23 | 0.23 | 0.20 | 0.21 | 0.31 | 0.28 | 0.22 | 0.25 | 0.25 | 0.28 | 0.26 | 0.23 | 0.24 | 0.27 |
X 31 | 0.17 | 0.18 | 0.17 | 0.23 | 0.25 | 0.19 | 0.21 | 0.30 | 0.30 | 0.18 | 0.27 | 0.27 | 0.28 | 0.25 | 0.20 | 0.26 | 0.28 |
X 32 | 0.17 | 0.16 | 0.18 | 0.23 | 0.26 | 0.18 | 0.19 | 0.31 | 0.32 | 0.25 | 0.20 | 0.27 | 0.29 | 0.26 | 0.20 | 0.26 | 0.29 |
X 33 | 0.17 | 0.16 | 0.18 | 0.23 | 0.26 | 0.18 | 0.20 | 0.30 | 0.32 | 0.25 | 0.27 | 0.19 | 0.28 | 0.25 | 0.20 | 0.26 | 0.28 |
X 34 | 0.16 | 0.16 | 0.19 | 0.22 | 0.26 | 0.17 | 0.20 | 0.30 | 0.32 | 0.25 | 0.27 | 0.26 | 0.22 | 0.27 | 0.20 | 0.25 | 0.28 |
X 41 | 0.17 | 0.19 | 0.18 | 0.23 | 0.24 | 0.17 | 0.19 | 0.30 | 0.34 | 0.24 | 0.25 | 0.24 | 0.27 | 0.22 | 0.23 | 0.26 | 0.29 |
X 42 | 0.17 | 0.16 | 0.19 | 0.23 | 0.26 | 0.17 | 0.19 | 0.30 | 0.34 | 0.24 | 0.25 | 0.24 | 0.28 | 0.28 | 0.18 | 0.25 | 0.29 |
X 43 | 0.17 | 0.17 | 0.19 | 0.23 | 0.23 | 0.17 | 0.20 | 0.29 | 0.34 | 0.24 | 0.25 | 0.25 | 0.26 | 0.28 | 0.22 | 0.21 | 0.30 |
X 44 | 0.17 | 0.17 | 0.19 | 0.23 | 0.24 | 0.18 | 0.20 | 0.28 | 0.34 | 0.24 | 0.25 | 0.25 | 0.27 | 0.28 | 0.23 | 0.27 | 0.23 |
Limit supermatrix.
Z | X 11 | X 12 | X 13 | X 14 | X 15 | X 21 | X 22 | X 23 | X 24 | X 31 | X 32 | X 33 | X 34 | X 41 | X 42 | X 43 | X 44 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
X 11 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 |
X 12 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 |
X 13 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 |
X 14 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 |
X 15 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 |
X 21 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 |
X 22 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 |
X 23 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 |
X 24 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 |
X 31 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 |
X 32 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 |
X 33 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 |
X 34 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 |
X 41 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 |
X 42 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 |
X 43 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 |
X 44 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 | 0.08 |
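The identical columns above are characteristic of a limit supermatrix: the weighted supermatrix is raised to successive powers until every column converges to the same priority vector, which gives the global weights. A minimal sketch of this mechanism, using a small hypothetical 3 × 3 column-stochastic matrix rather than the study's actual 17 × 17 supermatrix:

```python
import numpy as np

# Hypothetical 3x3 column-stochastic weighted supermatrix (illustration only)
W = np.array([
    [0.2, 0.5, 0.3],
    [0.5, 0.2, 0.4],
    [0.3, 0.3, 0.3],
])

# Raise W to a high power; for an irreducible, aperiodic column-stochastic
# matrix every column converges to the same stationary vector.
L = np.linalg.matrix_power(W, 50)

assert np.allclose(L, L @ W)          # converged: further multiplication changes nothing
assert np.allclose(L[:, 0], L[:, 1])  # all columns identical
print(np.round(L[:, 0], 3))
```

The shared column is then read off as the vector of global weights, exactly as in the factor-weight table that follows.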
Functional evaluation impact factor weights.
Factor | X 34 | X 44 | X 41 | X 32 | X 43 | X 33 | X 31 | X 24 | X 15 | X 42 | X 23 | X 14 | X 13 | X 11 | X 12 | X 22 | X 21 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Weight | 0.077 | 0.076 | 0.072 | 0.070 | 0.069 | 0.068 | 0.066 | 0.063 | 0.061 | 0.058 | 0.058 | 0.056 | 0.046 | 0.042 | 0.041 | 0.039 | 0.036 |
Rank | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 |
Overall weighting table for functional evaluation impact factors.
Dimension | Local Weight | Ranking | Factor | Local Weight | Ranking | Global Weight |
---|---|---|---|---|---|---|
X 1 | 0.246 | 3 | X 11 | 0.172 | 14 | 0.042 |
| | | X 12 | 0.168 | 15 | 0.041 |
| | | X 13 | 0.186 | 13 | 0.046 |
| | | X 14 | 0.226 | 12 | 0.056 |
| | | X 15 | 0.248 | 9 | 0.061 |
X 2 | 0.196 | 4 | X 21 | 0.186 | 17 | 0.036 |
| | | X 22 | 0.201 | 16 | 0.039 |
| | | X 23 | 0.294 | 11 | 0.058 |
| | | X 24 | 0.320 | 8 | 0.063 |
X 3 | 0.281 | 1 | X 31 | 0.235 | 7 | 0.066 |
| | | X 32 | 0.249 | 4 | 0.070 |
| | | X 33 | 0.243 | 6 | 0.068 |
| | | X 34 | 0.273 | 1 | 0.077 |
X 4 | 0.276 | 2 | X 41 | 0.262 | 3 | 0.072 |
| | | X 42 | 0.211 | 10 | 0.058 |
| | | X 43 | 0.251 | 5 | 0.069 |
| | | X 44 | 0.275 | 2 | 0.076 |
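As a quick consistency check (an illustrative sketch, with values copied from the tables above), each global weight is the product of the dimension's local weight and the factor's local weight within that dimension; deviations beyond the third decimal are rounding:

```python
# dimension weight, factor local weight, reported global weight (from the table)
rows = [
    ("X11", 0.246, 0.172, 0.042),
    ("X24", 0.196, 0.320, 0.063),
    ("X34", 0.281, 0.273, 0.077),
    ("X41", 0.276, 0.262, 0.072),
]

for name, dim_w, local_w, reported in rows:
    computed = dim_w * local_w
    # agrees with the reported global weight to within rounding
    assert abs(computed - reported) < 0.001, name
    print(f"{name}: {dim_w} * {local_w} = {computed:.3f} (reported {reported})")
```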
Appendix A
Questionnaire on the impact of each indicator.
Level of Influence | X 11 | X 12 | X 13 | X 14 | X 15 | X 21 | X 22 | X 23 | X 24 | X 31 | X 32 | X 33 | X 34 | X 41 | X 42 | X 43 | X 44 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
X 11 | |||||||||||||||||
X 12 | |||||||||||||||||
X 13 | |||||||||||||||||
X 14 | |||||||||||||||||
X 15 | |||||||||||||||||
X 21 | |||||||||||||||||
X 22 | |||||||||||||||||
X 23 | |||||||||||||||||
X 24 | |||||||||||||||||
X 31 | |||||||||||||||||
X 32 | |||||||||||||||||
X 33 | |||||||||||||||||
X 34 | |||||||||||||||||
X 41 | |||||||||||||||||
X 42 | |||||||||||||||||
X 43 | |||||||||||||||||
X 44 |
Note: X11: System Reliability, X12: System Security, X13: System Flexibility, X14: Functional Diversity, X15: Operational Convenience, X21: Esthetics of the Exterior Structure, X22: Clarity of Interface Presentation, X23: Simplicity of Interface Interaction, X24: Degree of Equipment Expansion, X31: Teaching Content Standardization, X32: Teaching Content Applicability, X33: Teaching Content Integrity, X34: Control of the Teaching Process, X41: Regulatory Capability, X42: Voice Interaction Capability, X43: Resource Management Capability, X44: Learning Support Capability. Black cells indicate that there is no need for the questionnaire to ask about the impact of the indicator on itself.
References
1. Dwivedi, Y.K.; Kshetri, N.; Hughes, L.; Slade, E.L.; Jeyaraj, A.; Kar, A.K.; Baabdullah, A.M.; Koohang, A.; Raghavan, V.; Ahuja, M. et al. “So what if ChatGPT wrote it?” Multidisciplinary perspectives on opportunities, challenges, and implications of generative conversational AI for research, practice, and policy. Int. J. Inf. Manag.; 2023; 71, 102642. [DOI: https://dx.doi.org/10.1016/j.ijinfomgt.2023.102642]
2. Ghatrehsamani, S.; Jha, G.; Dutta, W.; Molaei, F.; Nazrul, F.; Fortin, M.; Bansal, S.; Debangshi, U.; Neupane, J. Artificial Intelligence Tools and Techniques to Combat Herbicide Resistant Weeds—A Review. Sustainability; 2023; 15, 1843. [DOI: https://dx.doi.org/10.3390/su15031843]
3. Lu, V.N.; Wirtz, J.; Kunz, W.H.; Paluch, S.; Gruber, T.; Martins, A.; Patterson, P.G. Service robots, customers, and service employees: What can we learn from the academic literature and where are the gaps?. J. Serv. Theory Pract.; 2020; 30, pp. 361-391. [DOI: https://dx.doi.org/10.1108/JSTP-04-2019-0088]
4. Hwang, G.J.; Xie, H.; Wah, B.W.; Gašević, D. Vision, challenges, roles and research issues of Artificial Intelligence in Education. Comput. Educ. Artif. Intell.; 2020; 1, 100001. [DOI: https://dx.doi.org/10.1016/j.caeai.2020.100001]
5. Hsieh, Y.Z.; Lin, S.S.; Luo, Y.C.; Jeng, Y.L.; Tan, S.W.; Chen, C.R.; Chiang, P.Y. ARCS-assisted teaching robots based on anticipatory computing and emotional big data for improving sustainable learning efficiency and motivation. Sustainability; 2020; 12, 5605. [DOI: https://dx.doi.org/10.3390/su12145605]
6. Chen, Y.L.; Hsu, C.C.; Lin, C.Y.; Hsu, H.H. Robot-assisted language learning: Integrating artificial intelligence and virtual reality into English tour guide practice. Educ. Sci.; 2022; 12, 437. [DOI: https://dx.doi.org/10.3390/educsci12070437]
7. Touretzky, D.S.; Gardner-McCune, C. Calypso for Cozmo: Robotic AI for everyone. Proceedings of the 49th ACM Technical Symposium on Computer Science Education; Baltimore, MD, USA, 21–24 February 2018; 1110.
8. Chassignol, M.; Khoroshavin, A.; Klimova, A.; Bilyatdinova, A. Artificial Intelligence trends in education: A narrative overview. Procedia Comput. Sci.; 2018; 136, pp. 16-24. [DOI: https://dx.doi.org/10.1016/j.procs.2018.08.233]
9. Khan, M.A.; Khojah, M.; Vivek. Artificial intelligence and big data: The advent of new pedagogy in the adaptive e-learning system in the higher educational institutions of Saudi Arabia. Educ. Res. Int.; 2022; 2022, 1263555. [DOI: https://dx.doi.org/10.1155/2022/1263555]
10. Woo, H.; LeTendre, G.K.; Pham-Shouse, T.; Xiong, Y. The use of social robots in classrooms: A review of field-based studies. Educ. Res. Rev.; 2021; 33, 100388. [DOI: https://dx.doi.org/10.1016/j.edurev.2021.100388]
11. Papadopoulos, I.; Lazzarino, R.; Miah, S.; Weaver, T.; Thomas, B.; Koulouglioti, C. A systematic review of the literature regarding socially assistive robots in pre-tertiary education. Comput. Educ.; 2020; 155, 103924. [DOI: https://dx.doi.org/10.1016/j.compedu.2020.103924]
12. Huang, S. Design and Development of Educational Robot Teaching Resources Using Artificial Intelligence Technology. Int. J. Emerg. Technol. Learn.; 2021; 16, pp. 116-129. [DOI: https://dx.doi.org/10.3991/ijet.v16i05.20311]
13. Kühnlenz, K.; Sosnowski, S.; Buss, M. Impact of animal-like features on emotion expression of robot head eddie. Adv. Robot.; 2010; 24, pp. 1239-1255. [DOI: https://dx.doi.org/10.1163/016918610X501309]
14. Shaner, D.L.; Beckie, H.J. The future for weed control and technology. Pest Manag. Sci.; 2014; 70, pp. 1329-1339. [DOI: https://dx.doi.org/10.1002/ps.3706]
15. Levitt, T. The globalization of markets. McKinsey Q.; 1983; 2, pp. 69-81.
16. Hiller, J.; Lipson, H. Automatic design and manufacture of soft robots. IEEE Trans. Robot.; 2011; 28, pp. 457-466. [DOI: https://dx.doi.org/10.1109/TRO.2011.2172702]
17. Kessler, G. Technology and the future of language teaching. Foreign Lang. Ann.; 2018; 51, pp. 205-218. [DOI: https://dx.doi.org/10.1111/flan.12318]
18. Tzafestas, C.S.; Palaiologou, N.; Alifragis, M. Virtual and remote robotic laboratory: Comparative experimental evaluation. IEEE Trans. Educ.; 2006; 49, pp. 360-369. [DOI: https://dx.doi.org/10.1109/TE.2006.879255]
19. Lin, Y.; Qu, Q.; Lin, Y.; He, J.; Zhang, Q.; Wang, C.; Jiang, Z.; Guo, F.; Jia, J. Customizing robot-assisted passive neurorehabilitation exercise based on teaching training mechanism. BioMed Res. Int.; 2021; 2021, 9972560. [DOI: https://dx.doi.org/10.1155/2021/9972560] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/34195289]
20. Hsu, C.C.; Liou, J.J.; Chuang, Y.C. Integrating DANP and modified grey relation theory for the selection of an outsourcing provider. Expert Syst. Appl.; 2013; 40, pp. 2297-2304. [DOI: https://dx.doi.org/10.1016/j.eswa.2012.10.040]
21. Huang, C.N.; Liou, J.J.; Chuang, Y.C. A method for exploring the interdependencies and importance of critical infrastructures. Knowl.-Based Syst.; 2014; 55, pp. 66-74. [DOI: https://dx.doi.org/10.1016/j.knosys.2013.10.010]
22. Hung, Y.H.; Huang, T.L.; Hsieh, J.C.; Tsuei, H.J.; Cheng, C.C.; Tzeng, G.H. Online reputation management for improving marketing by using a hybrid MCDM model. Knowl.-Based Syst.; 2012; 35, pp. 87-93. [DOI: https://dx.doi.org/10.1016/j.knosys.2012.03.004]
23. Crompton, H.; Gregory, K.; Burke, D. Humanoid robots supporting children’s learning in an early childhood setting. Br. J. Educ. Technol.; 2018; 49, pp. 911-927. [DOI: https://dx.doi.org/10.1111/bjet.12654]
24. Zhexenova, Z.; Amirova, A.; Abdikarimova, M.; Kudaibergenov, K.; Baimakhan, N.; Tleubayev, B.; Asselborn, T.; Johal, W.; Dillenbourg, P.; CohenMiller, A. et al. A comparison of social robot to tablet and teacher in a new script learning context. Front. Robot. AI; 2020; 7, 99. [DOI: https://dx.doi.org/10.3389/frobt.2020.00099] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/33501266]
25. Hashimoto, T.; Kato, N.; Kobayashi, H. Development of educational system with the android robot SAYA and evaluation. Int. J. Adv. Robot. Syst.; 2011; 8, 28. [DOI: https://dx.doi.org/10.5772/10667]
26. Billard, A.; Kragic, D. Trends and challenges in robot manipulation. Science; 2019; 364, eaat8414. [DOI: https://dx.doi.org/10.1126/science.aat8414] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/31221831]
27. Cheng, Y.W.; Sun, P.C.; Chen, N.S. The essential applications of educational robot: Requirement analysis from the perspectives of experts, researchers, and instructors. Comput. Educ.; 2018; 126, pp. 399-416. [DOI: https://dx.doi.org/10.1016/j.compedu.2018.07.020]
28. Dias, M.B.; Mills-Tettey, G.A.; Nanayakkara, T. Robotics, education, and sustainable development. Proceedings of the 2005 IEEE International Conference on Robotics and Automation; Barcelona, Spain, 18–22 April 2005; pp. 4248-4253.
29. Alam, A. Employing adaptive learning and intelligent tutoring robots for virtual classrooms and smart campuses: Reforming education in the age of artificial intelligence. Advanced Computing and Intelligent Technologies: Proceedings of ICACIT 2022; Springer: Berlin/Heidelberg, Germany, 2022; pp. 395-406.
30. Belpaeme, T.; Kennedy, J.; Ramachandran, A.; Scassellati, B.; Tanaka, F. Social robots for education: A review. Sci. Robot.; 2018; 3, eaat5954. [DOI: https://dx.doi.org/10.1126/scirobotics.aat5954]
31. Nugent, G.; Barker, B.; Grandgenett, N.; Adamchuk, V.I. Impact of robotics and geospatial technology interventions on youth STEM learning and attitudes. J. Res. Technol. Educ.; 2010; 42, pp. 391-408. [DOI: https://dx.doi.org/10.1080/15391523.2010.10782557]
32. Chen, L.; Chen, P.; Lin, Z. Artificial intelligence in education: A review. IEEE Access.; 2020; 8, pp. 75264-75278. [DOI: https://dx.doi.org/10.1109/ACCESS.2020.2988510]
33. Pan, Y. Heading toward artificial intelligence 2.0. Engineering; 2016; 2, pp. 409-413. [DOI: https://dx.doi.org/10.1016/J.ENG.2016.04.018]
34. Peng, S.D. On Robotics Education (above). E-Educ. Res.; 2002; 6, pp. 3-7. [DOI: https://dx.doi.org/10.3102/0013189X006011003]
35. Alimisis, D. Educational robotics: Open questions and new challenges. Themes Sci. Technol. Educ.; 2013; 6, pp. 63-71.
36. Mitnik, R.; Recabarren, M.; Nussbaum, M.; Soto, A. Collaborative robotic instruction: A graph teaching experience. Comput. Educ.; 2009; 53, pp. 330-342. [DOI: https://dx.doi.org/10.1016/j.compedu.2009.02.010]
37. Scaradozzi, D.; Screpanti, L.; Cesaretti, L. Towards a definition of educational robotics: A classification of tools, experiences and assessments. Smart Learning with Educational Robotics: Using Robots to Scaffold Learning Outcomes; Springer: Berlin/Heidelberg, Germany, 2019; pp. 63-92.
38. Cooney, M.; Leister, W. Using the engagement profile to design an engaging robotic teaching assistant for students. Robotics; 2019; 8, 21. [DOI: https://dx.doi.org/10.3390/robotics8010021]
39. Robaczewski, A.; Bouchard, J.; Bouchard, K.; Gaboury, S. Socially assistive robots: The specific case of the NAO. Int. J. Soc. Robot.; 2021; 13, pp. 795-831. [DOI: https://dx.doi.org/10.1007/s12369-020-00664-7]
40. Verner, I.M.; Polishuk, A.; Krayner, N. Science class with RoboThespian: Using a robot teacher to make science fun and engage students. IEEE Robot. Autom. Mag.; 2016; 23, pp. 74-80. [DOI: https://dx.doi.org/10.1109/MRA.2016.2515018]
41. Costa, S.; Brunete, A.; Bae, B.C.; Mavridis, N. Emotional storytelling using virtual and robotic agents. Int. J. Hum. Robot.; 2018; 15, 1850006. [DOI: https://dx.doi.org/10.1142/S0219843618500068]
42. Garrison, D.R. Critical thinking and adult education: A conceptual model for developing critical thinking in adult learners. Int. J. Lifelong Educ.; 1991; 10, pp. 287-303. [DOI: https://dx.doi.org/10.1080/0260137910100403]
43. Sharkey, A.J. Should we welcome robot teachers?. Ethics Inf. Technol.; 2016; 18, pp. 283-297. [DOI: https://dx.doi.org/10.1007/s10676-016-9387-z]
44. Smakman, M.; Berket, J.; Konijn, E.A. The impact of social robots in education: Moral considerations of dutch educational policymakers. Proceedings of the 2020 29th IEEE International Conference on Robot and Human Interactive Communication; Naples, Italy, 31 August–4 September 2020; pp. 647-652.
45. Murdoch-Eaton, D.; Whittle, S. Generic skills in medical education: Developing the tools for successful lifelong learning. Med. Educ.; 2012; 46, pp. 120-128. [DOI: https://dx.doi.org/10.1111/j.1365-2923.2011.04065.x]
46. Drigas, A.; Papanastasiou, G.; Skianis, C. The School of the Future: The Role of Digital Technologies, Metacognition and Emotional Intelligence. Int. J. Emerg. Technol. Learn.; 2023; 18, 65. [DOI: https://dx.doi.org/10.3991/ijet.v18i09.38133]
47. Ahmed, H.; La, H.M. Education-robotics symbiosis: An evaluation of challenges and proposed recommendations. Proceedings of the 2019 IEEE Integrated STEM Education Conference (ISEC); Princeton, NJ, USA, 16 March 2019; pp. 222-229.
48. Koulouriotis, D.E.; Ketipi, M.K. Robot evaluation and selection Part A: An integrated review and annotated taxonomy. Int. J. Adv. Manuf. Technol.; 2014; 71, pp. 1371-1394. [DOI: https://dx.doi.org/10.1007/s00170-013-5525-5]
49. Matheson, E.; Minto, R.; Zampieri, E.G.; Faccio, M.; Rosati, G. Human–robot collaboration in manufacturing applications: A review. Robotics; 2019; 8, 100. [DOI: https://dx.doi.org/10.3390/robotics8040100]
50. Büyüközkan, G.; Çifçi, G. A novel hybrid MCDM approach based on fuzzy DEMATEL, fuzzy ANP, and fuzzy TOPSIS to evaluate green suppliers. Expert Syst. Appl.; 2012; 39, pp. 3000-3011. [DOI: https://dx.doi.org/10.1016/j.eswa.2011.08.162]
51. Sakthivel, G.; Ilangkumaran, M.; Gaikwad, A. A hybrid multi-criteria decision modeling approach for the best biodiesel blend selection based on ANP-TOPSIS analysis. Ain Shams Eng. J.; 2015; 6, pp. 239-256. [DOI: https://dx.doi.org/10.1016/j.asej.2014.08.003]
52. Liu, H.C.; Ren, M.L.; Wu, J.; Lin, Q.L. An interval 2-tuple linguistic MCDM method for robot evaluation and selection. Int. J. Prod. Res.; 2014; 52, pp. 2867-2880. [DOI: https://dx.doi.org/10.1080/00207543.2013.854939]
53. Parameshwaran, R.; Kumar, S.P.; Saravanakumar, K. An integrated fuzzy MCDM based approach for robot selection considering objective and subjective criteria. Appl. Soft Comput.; 2015; 26, pp. 31-41. [DOI: https://dx.doi.org/10.1016/j.asoc.2014.09.025]
54. Sen, D.K.; Datta, S.; Mahapatra, S.S. Extension of PROMETHEE for robot selection decision making: Simultaneous exploration of objective data and subjective (fuzzy) data. Benchmarking; 2016; 23, pp. 983-1014. [DOI: https://dx.doi.org/10.1108/BIJ-08-2015-0081]
55. Saaty, T. The analytic hierarchy process (AHP) for decision making. Kobe Jpn.; 1980; 1, 69.
56. Bhattacharya, A.; Sarkar, B.; Mukherjee, S.K. Integrating AHP with QFD for robot selection under requirement perspective. Int. J. Prod. Res.; 2005; 43, pp. 3671-3685. [DOI: https://dx.doi.org/10.1080/00207540500137217]
57. Goh, C.H. Analytic hierarchy process for robot selection. J. Manuf. Syst.; 1997; 16, pp. 381-386. [DOI: https://dx.doi.org/10.1016/S0278-6125(97)88467-1]
58. Geng, W.L.; Hu, Y.S. Performance evaluation of robot design based on AHP. Int. J. Database Theory Appl.; 2013; 6, pp. 79-88.
59. Wang, C.N.; Nguyen, N.A.T.; Dang, T.T. Offshore wind power station (OWPS) site selection using a two-stage MCDM-based spherical fuzzy set approach. Sci. Rep.; 2022; 12, 4260. [DOI: https://dx.doi.org/10.1038/s41598-022-08257-2]
60. Kapoor, V.; Tak, S.S. Fuzzy application to the analytic hierarchy process for robot selection. Fuzzy Optim. Decis. Mak.; 2005; 4, pp. 209-234. [DOI: https://dx.doi.org/10.1007/s10700-005-1890-3]
61. Wu, H.Y.; Chen, J.K.; Chen, I.S.; Zhuo, H.H. Ranking universities based on performance evaluation by a hybrid MCDM model. Measurement; 2012; 45, pp. 856-880. [DOI: https://dx.doi.org/10.1016/j.measurement.2012.02.009]
62. Chang, C.W.; Lee, J.H.; Chao, P.Y.; Wang, C.Y.; Chen, G.D. Exploring the possibility of using humanoid robots as instructional tools for teaching a second language in primary school. J. Educ. Technol. Soc.; 2010; 13, pp. 13-24.
63. Tsiakas, K.; Karkaletsis, V.; Makedon, F. A taxonomy in robot-assisted training: Current trends, needs, and challenges. Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference; Corfu, Greece, 26–29 June 2018; pp. 208-213.
64. Khaksar, S.M.S.; Khosla, R.; Chu, M.T.; Shahmehr, F.S. Service innovation using social robot to reduce social vulnerability among older people in residential care facilities. Technol. Forecast. Soc. Change; 2016; 113, pp. 438-453. [DOI: https://dx.doi.org/10.1016/j.techfore.2016.07.009]
65. Merlet, J.P. Interval analysis and reliability in robotics. Int. J. Reliab. Saf.; 2009; 3, pp. 104-130. [DOI: https://dx.doi.org/10.1504/IJRS.2009.026837]
66. Yoshino, K.; Zhang, S. Construction of assistive teaching robot in programming class. Proceedings of the 2018 7th International Congress on Advanced Applied Informatics; Yonago, Japan, 8–13 July 2018; pp. 215-220.
67. Yu, M.; Zhou, R.; Cai, Z.; Tan, C.W.; Wang, H. Unravelling the relationship between response time and user experience in mobile applications. Internet Res.; 2020; 30, pp. 1353-1382. [DOI: https://dx.doi.org/10.1108/INTR-05-2019-0223]
68. Park, S.J.; Han, J.H.; Kang, B.H.; Shin, K.C. Assistive teaching robot, ROBOSEM, in English class and practical issues for its diffusion. Proceedings of the Advanced Robotics and Its Social Impacts; Menlo Park, CA, USA, 2–4 October 2011; pp. 8-11.
69. Ryu, H.J.; Song, M.J.; Choi, J.G.; Kim, M.S. Visualization of assistive teaching robot’s image based on child’s mental model. Arch. Des. Res.; 2007; 20, pp. 177-188.
70. Yang, D.; Oh, E.S.; Wang, Y. Hybrid physical education teaching and curriculum design based on a voice interactive artificial intelligence educational robot. Sustainability; 2020; 12, 8000. [DOI: https://dx.doi.org/10.3390/su12198000]
71. Huijnen, C.A.; Lexis, M.A.; Jansens, R.; de Witte, L.P. How to implement robots in interventions for children with autism? A co-creation study involving people with autism, parents, and professionals. J. Autism Dev. Disord.; 2017; 47, pp. 3079-3096. [DOI: https://dx.doi.org/10.1007/s10803-017-3235-9] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/28685411]
72. Hsia, C.H.; Lai, C.F.; Su, Y.S. Impact of using ARCS model and problem-based learning on human interaction with robot and motivation. Libr. Hi Tech; 2022; 40, pp. 963-975. [DOI: https://dx.doi.org/10.1108/LHT-07-2020-0182]
73. Sonderegger, S. Enhancing learning processes by integrating social robots with learning management systems. Proceedings of the 2022 31st IEEE International Conference on Robot and Human Interactive Communication; Napoli, Italy, 29 August–2 September 2022; pp. 365-370.
74. Wu, W.C.V.; Wang, R.J.; Chen, N.S. Instructional design using an in-house built teaching assistant robot to enhance elementary school English-as-a-foreign-language learning. Interact. Learn. Environ.; 2015; 23, pp. 696-714. [DOI: https://dx.doi.org/10.1080/10494820.2013.792844]
75. Louie, W.Y.G.; Nejat, G. A social robot learning to facilitate an assistive group-based activity from non-expert caregivers. Int. J. Soc. Robot.; 2020; 12, pp. 1159-1176. [DOI: https://dx.doi.org/10.1007/s12369-020-00621-4]
76. Hong, Z.W.; Huang, Y.M.; Hsu, M.; Shen, W.W. Authoring robot-assisted instructional materials for improving learning performance and motivation in EFL classrooms. J. Educ. Technol. Soc.; 2016; 19, pp. 337-349.
77. Cross, E.S.; Hortensius, R.; Wykowska, A. From social brains to social robots: Applying neurocognitive insights to human-robot interaction. Philos. Trans. R. Soc. B; 2019; 374, 20180024. [DOI: https://dx.doi.org/10.1098/rstb.2018.0024]
78. Lauridsen, K.; Christensen, P.; Kongsø, H.E. Assessment of the reliability of robotic systems for use in radiation environments. Reliab. Eng. Syst. Saf.; 1996; 53, pp. 265-276. [DOI: https://dx.doi.org/10.1016/S0951-8320(96)00056-7]
79. Jung, J.G.; Choi, J.H.; Han, J.H. Analysis on children’s response depending on teaching assistant robots’ styles. J. Korean Assoc. Inf. Educ.; 2007; 11, pp. 195-203.
80. Hsu, C.H.; Wang, F.K.; Tzeng, G.H. The best vendor selection for conducting the recycled material based on a hybrid MCDM model combining DANP with VIKOR. Resour. Conserv. Recycl.; 2012; 66, pp. 95-111. [DOI: https://dx.doi.org/10.1016/j.resconrec.2012.02.009]
81. Govindan, K.; Kannan, D.; Shankar, M. Evaluation of green manufacturing practices using a hybrid MCDM model combining DANP with PROMETHEE. Int. J. Prod. Res.; 2015; 53, pp. 6344-6371. [DOI: https://dx.doi.org/10.1080/00207543.2014.898865]
82. Hung, Y.H.; Chou, S.C.T.; Tzeng, G.H. Knowledge management adoption and assessment for SMEs by a novel MCDM approach. Decis. Support Syst.; 2011; 51, pp. 270-291. [DOI: https://dx.doi.org/10.1016/j.dss.2010.11.021]
83. Tzeng, G.H.; Chiang, C.H.; Li, C.W. Evaluating intertwined effects in e-learning programs: A novel hybrid MCDM model based on factor analysis and DEMATEL. Expert Syst. Appl.; 2007; 32, pp. 1028-1044. [DOI: https://dx.doi.org/10.1016/j.eswa.2006.02.004]
84. Ordoobadi, S.M. Application of ANP methodology in evaluation of advanced technologies. J. Manuf. Technol. Manag.; 2012; 23, pp. 229-252. [DOI: https://dx.doi.org/10.1108/17410381211202214]
85. Shao, Q.G.; Liou, J.J.; Weng, S.S.; Chuang, Y.C. Improving the green building evaluation system in China based on the DANP method. Sustainability; 2018; 10, 1173. [DOI: https://dx.doi.org/10.3390/su10041173]
86. James, G.M.; Sugar, C.A. Clustering for sparsely sampled functional data. J. Am. Stat. Assoc.; 2003; 98, pp. 397-408. [DOI: https://dx.doi.org/10.1198/016214503000189]
87. DiPasquale, D.; Wheaton, W.C. The markets for real estate assets and space: A conceptual framework. Real Estate Econ.; 1992; 20, pp. 181-198. [DOI: https://dx.doi.org/10.1111/1540-6229.00579]
88. Yang, C.L.; Chuang, S.P.; Huang, R.H. Manufacturing evaluation system based on AHP/ANP approach for wafer fabricating industry. Expert Syst. Appl.; 2009; 36, pp. 11369-11377. [DOI: https://dx.doi.org/10.1016/j.eswa.2009.03.023]
89. Chung, S.H.; Lee, A.H.; Pearn, W.L. Product mix optimization for semiconductor manufacturing based on AHP and ANP analysis. J. Adv. Manuf. Technol.; 2005; 25, pp. 1144-1156. [DOI: https://dx.doi.org/10.1007/s00170-003-1956-8]
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
A typical example of a robot used in education is the assistive teaching robot, which has much potential to enhance teaching and learning as well as to promote sustainable learning. However, formalized selection and evaluation procedures for robotic teaching assistants are still lacking. To address this need, this paper presents a function evaluation system framework for assistive teaching robots comprising four dimensions: system structure, appearance interface, teaching function, and auxiliary support. The study employed the DANP method to examine the extent of influence of the various indicators, focusing on the critical components of the function evaluation system for assistive teaching robots. The analysis identified two crucial factors in this evaluation system: teaching function and auxiliary support, both of which are key elements in promoting sustainable learning. Moreover, recommendations are made for designing and selecting suitable assistive teaching robot products, aiming to serve as an exemplary framework for future product development and for implementing educational activities in school settings, further contributing to the realization of sustainable learning.
Details

1 Faculty of Education, Fujian Normal University, Fuzhou 350117, China;
2 School of Economics and Management, Xiamen University of Technology, Xiamen 361024, China;