1. Introduction
Advances in intelligent and mobile technologies have given rise to smart classrooms (SCs)—technology-driven learning environments that enhance teaching quality, student engagement, and personalized learning (Gambo & Shakir, 2023; Zhan et al., 2021). SCs integrate information technology, the internet, and digital tools to create intelligent, personalized, and digitized educational spaces (Lycurgus et al., 2024; Ma et al., 2024; Mugruza-Vassallo, 2023; Shaw & Patra, 2022). However, fully harnessing digital technologies requires a shift toward a student-centered, personalized learning culture (C.-M. Chen et al., 2005; S. Y. Chen et al., 2016; Walkington & Bernacki, 2020). In this study, SCs are understood as technology-assisted, enclosed learning environments that support personalized learning and innovative pedagogy (Major et al., 2021; Zhang et al., 2020a, 2020b). We define personalized learning in smart classrooms as an approach that tailors learning experiences to individual differences—including prior knowledge, cognitive and non-cognitive traits, interests, needs, and goals—through intelligent technologies and systematic instructional design (Hu et al., 2022; Z. Liu et al., 2022; Wang et al., 2023). Our aim is to develop a personalized learning measurement scale for smart classrooms, enabling the assessment of students’ experiences, the optimization of teaching strategies, and the advancement of personalized learning research.
1.1. Personalized Learning in Smart Classrooms
Personalized learning (PL), rooted in a learner-centered philosophy, respects individual differences by offering flexible methods that help students master knowledge and develop key skills at their own pace (C.-M. Chen et al., 2005; S. Y. Chen et al., 2016; Walkington & Bernacki, 2020). Historically, PL traces back to John Dewey’s early 20th-century learner-centered philosophy (Keefe & Jenkins, 2008; Redding, 2016). PL challenges the standardized, large-scale education model inherited from the Industrial Revolution and has become a driver of modern educational reform (Basham et al., 2016). It acknowledges differences in students’ cognitive and non-cognitive characteristics (Sampson & Karagiannidis, 2002), where cognitive skills involve problem-solving and critical thinking, while non-cognitive traits influence personal growth, social engagement, and achievement (Farkas, 2003; Kell, 2018). S. Y. Chen and Macredie (2010) further identified factors such as prior knowledge, cognitive styles, gender, and learning approaches that shape PL. In brief, given its emphasis on individual differences, PL is central to student-centered instructional design.
Like many key concepts in the humanities and social sciences, PL lacks a unified definition. Researchers have principally viewed it through the lens of learner individuality, emphasizing a learner-centered philosophy that promotes active participation and choice in what, how, when, and where students learn (C.-M. Chen et al., 2005; S. Y. Chen et al., 2016; Walkington & Bernacki, 2020). Some define PL from a teaching perspective that accounts for individual differences (Patrick et al., 2013; U.S. Department of Education, 2010). Others emphasize the role of technology, defining technology-enhanced personalized learning (TEL) as adaptive learning experiences based on real-time assessments (e.g., Bulger, 2016; Major et al., 2021; Zhang et al., 2022). In sum, PL is designed around respect for students’ individual differences, attends to their cognitive, thinking, and emotional characteristics, and maximizes learners’ autonomy and agency (Bulger, 2016; Major et al., 2021; Patrick et al., 2013; U.S. Department of Education, 2010; Zhang et al., 2022).
While teachers can address personalized needs through direct guidance (Bulger, 2016; Holmes et al., 2018), TEL provides scalable solutions for overburdened educators and overcrowded classrooms (Fake & Dabbagh, 2020). As a result, PL approaches and techniques have gained widespread attention in education (Major et al., 2021; Pane et al., 2015; Van Schoors et al., 2021; Xie et al., 2019). Researchers increasingly integrate intelligent technologies into SCs to develop PL systems that deliver tailored recommendations and individualized experiences (Niknam & Thulasiraman, 2020; Ouf et al., 2017; Sungkur & Maharaj, 2021; Troussas et al., 2020). Nevertheless, while technology has shown promise in enhancing PL, its impact in SC environments remains underexplored.
A key challenge is the lack of effective tools to measure PL support, limiting both theoretical research and practical implementation. Although some researchers have developed personalized learning support instruments (PLSIs) based on the universal design for learning (UDL) framework (Zhang et al., 2022), these tools are designed for traditional classrooms and are unsuitable for smart learning environments (Chatti et al., 2010; Cheung et al., 2021; Kloos et al., 2020; Price, 2015; Sharma et al., 2018; Toivonen et al., 2018). Additionally, our literature review shows that many researchers have explored technology-driven solutions to enhance PL. Wang et al. (2023) describe SCs as an advanced form of online and multimedia-based learning, emphasizing that technology is the key differentiator between smart and traditional classrooms. Hu et al. (2022) introduced a learning analytics dashboard (LAD) in an AI-supported smart learning environment (SLE), allowing students to track and optimize their learning behaviors, leading to significant academic improvement. Kaur and Bhatia (2022) further highlight the role of information and communication technology (ICT) components such as the Internet of Things (IoT), cloud computing, and edge computing in realizing this vision. Niknam and Thulasiraman (2020) created a learning path recommendation system using an ant colony optimization algorithm, significantly improving knowledge retention. Yu et al. (2024) developed an intelligent Q&A system for entrepreneurship education, utilizing natural language processing (NLP) to enhance knowledge retrieval and comprehension. Overall, however, current research centers on specific systems or algorithms, whose effectiveness and generalizability across environments and broader populations remain unclear (Hu et al., 2022; Kaur & Bhatia, 2022; Niknam & Thulasiraman, 2020; Wang et al., 2023; Yu et al., 2024). Understanding students’ experiences with these innovative solutions therefore becomes essential, and we adopted a constructivist theoretical perspective to do so (Radcliffe, 2009).
1.2. Constructivism
Constructivism holds that individuals actively construct their own understanding of the world through their experiences and interactions with the environment, and that this process of construction continues throughout the lifespan (Charmaz, 2021; Glaser & Strauss, 1964). Social constructionism/constructivism is “a theoretical perspective that assumes that people create social reality(ies) through individual and collective actions” (Charmaz, 2006, p. 189). This perspective focuses on how people in a particular time and place create their own perceptions of reality. Instead of assuming that reality exists independently of us, social constructionists take the “view that social properties are constructed through interactions between people, rather than having a separate existence” (Robson & McCartan, 2016, p. 24). On this basis, Radcliffe (2009) proposed the pedagogy–space–technology (PST) theoretical framework to foster an understanding of student-centered constructivist pedagogies in education.
1.3. Pedagogy–Space–Technology Framework
After launching the “next generation learning spaces” project, David Radcliffe proposed a framework for designing and evaluating learning spaces (Radcliffe, 2009). The framework holds that a learning space comprises three elements—pedagogy, the learning space itself, and technology—distinguishing among objective technological objects (technology), designed physical objects (the learning space), subjective objects (pedagogy), and other contextual factors. According to this three-dimensional model for learning environment design and evaluation, students engaging in PL interact with teaching methods, the effective integration of technology, and the design of learning spaces (Reushle, 2012). Below, we examine the rationale for developing the scale along each of the PST dimensions.
1.3.1. Pedagogy
What kind of teaching and learning do we aim to promote, and for what reasons? (Radcliffe, 2009). This question forms the basis of the “pedagogy” dimension. The constantly evolving nature of education and the (so-called) new generation of students place considerable emphasis on adaptive teaching and teacher characteristics (Göncz, 2017). According to Huber and Seidel (2018), our understanding of the interplay between teachers and students has only recently begun to improve. Compared to teacher-centered pedagogies, which overemphasize the teacher’s dominant role and neglect students’ individual differences and active learning needs, student-centered pedagogies foster creativity and critical thinking by promoting student autonomy (Bulger, 2016; Major et al., 2021; Patrick et al., 2013; U.S. Department of Education, 2010; Zhang et al., 2022). Furthermore, Basham et al. (2016) found that designing PL experiences requires a restructured educational model that emphasizes individual growth in various areas. Cheng and Carolyn Yang (2023) summarize three types of student-centered pedagogies—team-based learning, problem-based learning, and project-based learning—and clarify the differences among them. In addition, precision teaching (Lindsley, 1992) and flipped classrooms are also considered effective student-centered teaching methods (Fan, 2024; C. Liu & Wang, 2018). Since SCs and student-centered pedagogies are interdependent (Koh et al., 2020; Malekigorji & Hatahet, 2020), we identified the pedagogy dimension as the first rationale.
1.3.2. Learning Spaces
The key question for this dimension is “Which elements of space design and the arrangement of furniture and fixtures influence pedagogical approaches?” (Radcliffe, 2009). A learning space naturally influences how people interact within it, shaped by interactions between students and their environments (Blumer, 1966; Bryant & Charmaz, 2007; Charmaz, 2021; Dai & Chen, 2013; Dai et al., 2011; Lemert, 1974; Mead, 1934/1972; Morse et al., 2021; Robson & McCartan, 2016; Sternberg, 2022; Wirthwein et al., 2019; Ziegler & Bicakci, 2023; Ziegler & Stoeger, 2012; Ziegler et al., 2012). A smart learning space is essentially an intelligent physical space equipped with interactive whiteboards, projectors, automatic assessment and feedback tools, cameras for recording and storing lectures, and sensors that control temperature, humidity, air quality, and acoustics (Saini & Goel, 2019). This is distinct from the commonly discussed “learning environment”, which refers solely to the social, psychological, or conceptual environment (Reushle, 2012; Zhan et al., 2021).
The intelligent systems of SCs are considered another element of the learning environment, with four interaction types: (a) teacher–student, (b) student–student, (c) teacher–media, and (d) student–media (Zhan et al., 2021). Hence, compared to traditional multimedia classroom (TMC) environments, the infrastructural resources of SCs have the potential to significantly influence individuals’ educational pathways and experiences (cf. Ziegler, 2005; Ziegler & Stoeger, 2017; Ziegler et al., 2012). Taken together, students’ experiences are shaped by their individual perceptions of the learning space (Ketscher et al., 2025; Merton, 1995; Thomas & Thomas, 1928). Accordingly, we identified learning spaces as the second rationale.
1.3.3. Technology
In what ways can technology be incorporated into space design to reinforce and improve the desired teaching and learning approaches? (Radcliffe, 2009). Technology supports learning and has a demonstrably positive impact on promoting students’ PL. Some researchers have found that artificial intelligence (AI) technology can not only provide personalized feedback and expert guidance to learners (Yu et al., 2024), but also enhance engagement, offer PL experiences, and ultimately increase motivation and foster autonomous learning (Santhosh et al., 2023). That is to say, the technological infrastructure of SCs can yield distinct pedagogical improvements compared to traditional and/or technology-integrated classrooms (Chatti et al., 2010; Cheung et al., 2021; Kloos et al., 2020; Price, 2015; Sharma et al., 2018; Toivonen et al., 2018). How learners use smart devices (such as smart tablets and smartphones) flexibly and effectively in a smart classroom is a key factor in promoting their PL. Technology therefore constitutes the third rationale.
1.4. Summary of Research Rationale
The (so-called) new generation of students perceives technology as an integral part of learning and may struggle in technology-limited environments (Tapscott, 2009). Nevertheless, while Basham et al. (2016) find that PL environments demand more than technology, smart classrooms offer this “more” (Yang & Huang, 2015). SCs enhance adaptability and interactivity through advanced technology and AI, offering a more dynamic learning environment than TMCs, which rely on fixed resources and conventional software (Zhan et al., 2021). “Smart classroom is one direction in smart learning environment research” (Yang & Huang, 2015, p. 157). Hence, our aim in this study is to develop a scale to measure PL experiences in SC environments.
2. Method
2.1. Development Process of the Smart Classroom Environment–Personalized Learning Scale (SCE-PL)
This study adopted a deductive approach to scale development, beginning with a comprehensive literature review to define the target construct of PL for middle-school students in SC settings (Hinkin, 1995). Existing scales and questionnaires were evaluated to identify gaps and areas for refinement, focusing particularly on five key instruments: Fraser’s (1998) What Is Happening in This Class? (WIHIC) questionnaire, Zhang et al.’s (2022) personalized learning experience support instrument (PLSI), the EDUCAUSE Learning Initiative’s (2021) learning space rating system, Timothy et al.’s (2010) self-directed learning with technology scale (SDLTS), and Yang and Huang’s (2015) classroom environment evaluation scale (CEES). From these sources, an initial pool of 48 items was generated, covering topics such as teaching methods and learning spaces. Subsequently, two Chinese information technology experts conducted three rounds of discussions to ensure clarity, exclusivity, and alignment with the scale’s objectives (DeVellis, 1991; Hambleton, 2005; Hinkin, 1995). Through this process, four items were removed for failing to meet relevance or clarity requirements (e.g., “The view outside the classroom window relaxes me and enhances my learning atmosphere”). This phase concluded with 44 initial items.
Content validity—how effectively an instrument’s items capture the intended construct (Polit & Beck, 2006)—was assessed by 11 experts (8 university professors and 3 secondary school teachers) in information technology (IT). After two rounds of expert feedback, five additional items were eliminated due to redundancy or ambiguity (DeVellis, 1991; Hambleton, 2005), leaving 39 items in the initial scale (Appendix A, Table A1). The questionnaire comprised two sections: (1) demographic information about the respondents, and (2) the 39-item SCE-PL, rated on a 5-point Likert scale (1 = “strongly disagree”, 5 = “strongly agree”).
2.2. Sampling Frame and Participants
A combination of stratified and convenience sampling was employed to capture diverse educational contexts. Three provinces in China (Guangdong, Shaanxi, and Inner Mongolia) were selected to reflect variations in economic development and educational practices. Within each province, one representative school was purposively chosen, and students from multiple grade levels (7th, 8th, 9th) were included to enhance generalizability. All participating schools had adopted technology-enhanced PL strategies, providing a suitable environment for investigating the proposed scale.
The research targeted Chinese middle-school students with experience in SC settings. An SC was defined as a learning space equipped with interactive whiteboards, recording systems, VR/AR devices, and student terminal devices such as tablets and laptops (Lycurgus et al., 2024; Ma et al., 2024; Mugruza-Vassallo, 2023; Shaw & Patra, 2022). Prior to data collection, approval was obtained from local school administrators, superintendents or principals, and class teachers. Furthermore, to avoid ethical dilemmas, we prepared our research under the guidance of ethical scholarship such as Kropotkin (1922/1924) and Resnik (1998), as well as ethical codes including those of the American Educational Research Association (AERA, 2011), the American Psychological Association (APA, 2017), “The Nuremberg Code (1947)” (The Nuremberg Code (1947), 1996), and the Multidisciplinary Digital Publishing Institute’s research and publication ethics (MDPI, n.d.).
2.3. Data Collection
The questionnaire was distributed online through Wenjuanxing (https://www.wjx.cn), a widely used Chinese survey platform. Two samples were collected: Sample 1 (n = 440 before data purification; 424 retained) was used for the exploratory analyses, and Sample 2 (n = 584) for the confirmatory analyses. The demographic composition of both samples is summarized in Table 1.
2.4. Data Preparation
Two researchers jointly managed the statistical analyses to ensure consistency and accuracy. SPSS 26.0 was employed to conduct item analysis using the critical ratio (CR) method and reliability tests (Hair et al., 2010). The decision rules were: (a) retain items with significant CR values (p < 0.05) and t > 3; and (b) ensure that Cronbach’s α remained above 0.70, that the corrected item-total correlation (CITC) exceeded 0.35, and that deleting an item did not raise the dimension’s α (Yuan et al., 2021; Zhao et al., 2023). Based on these criteria, five items (CRG3, FMM4, SLV3, LD2, and DA3) were eliminated, leaving 34 items. Further reliability checks confirmed that all dimensions retained α coefficients above 0.70, that removing any remaining item would lower its dimension’s reliability, and that item-total correlations exceeded 0.40 (Tabachnick & Fidell, 2007).
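To make these decision rules concrete, the following minimal sketch (in Python with pandas and SciPy, rather than the SPSS 26.0 procedure actually used) illustrates how the critical ratio, Cronbach’s α, and the corrected item-total correlation can be computed; the file name, the conventional 27% high/low grouping, and the helper names are illustrative assumptions.

```python
# Illustrative sketch of the item-analysis decision rules (not the SPSS run).
import pandas as pd
from scipy import stats

df = pd.read_csv("sample1_items.csv")  # hypothetical file: one column per item, 1-5 Likert

def critical_ratio(items: pd.DataFrame, item: str):
    """CR method: t-test of one item between high- and low-scoring groups
    (conventional top/bottom 27% split on the total score)."""
    total = items.sum(axis=1)
    high = items.loc[total >= total.quantile(0.73), item]
    low = items.loc[total <= total.quantile(0.27), item]
    return stats.ttest_ind(high, low, equal_var=False)

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for the items of one dimension."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def citc(items: pd.DataFrame, item: str) -> float:
    """Corrected item-total correlation: item vs. sum of the remaining items."""
    return items[item].corr(items.drop(columns=item).sum(axis=1))

# Decision rules from the text: keep an item if t > 3 with p < 0.05,
# CITC > 0.35, and dropping it does not raise the dimension's alpha.
for col in df.columns:
    t, p = critical_ratio(df, col)
    print(col, round(t, 2), round(p, 4), round(citc(df, col), 3))
```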
2.5. Data Analysis
EFA is routinely employed to assess the construct validity of an instrument. Prior to factor extraction, the data’s suitability was evaluated using the Kaiser–Meyer–Olkin (KMO) measure, for which values above 0.70 indicate adequate inter-item correlations, and Bartlett’s test of sphericity, whose significance confirms the presence of common factors (Huck et al., 1974; Zhao et al., 2023). Principal component analysis (PCA) with varimax rotation was then used to extract factors with eigenvalues greater than 1 (DeVellis, 1991).
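Outside SPSS, the same suitability checks and extraction can be approximated with the open-source factor_analyzer package; the sketch below is a minimal illustration assuming the Sample 1 responses sit in a DataFrame df1 (the input file is hypothetical), and it approximates rather than reproduces the SPSS routine.

```python
# Hedged sketch: KMO, Bartlett's test, and PCA-style extraction with varimax.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (calculate_bartlett_sphericity,
                                             calculate_kmo)

df1 = pd.read_csv("sample1_items.csv")  # hypothetical file of Sample 1 responses

chi_square, p_value = calculate_bartlett_sphericity(df1)  # expect p < 0.001
kmo_per_item, kmo_total = calculate_kmo(df1)              # expect KMO > 0.70

# Principal-component extraction, varimax rotation, nine factors retained
# under the eigenvalue-greater-than-1 rule.
fa = FactorAnalyzer(n_factors=9, method="principal", rotation="varimax")
fa.fit(df1)
loadings = pd.DataFrame(fa.loadings_, index=df1.columns)
eigenvalues, _ = fa.get_eigenvalues()  # basis for the scree plot (Figure 1)
```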
CFA was conducted using AMOS 23.0 on the second dataset (n = 584) to verify whether the model derived from EFA aligned with theoretical expectations. Model fit was assessed via multiple indices: χ2, RMSEA, CFI, TLI, RMR, GFI, AGFI, NFI, and IFI. Common acceptance criteria include χ2/df < 5, RMR ≤ 0.05, RMSEA < 0.08, GFI and AGFI > 0.80, and CFI, NFI, IFI, and TLI > 0.90. After confirming model fit, reliability and validity tests (including convergent and discriminant validity) were performed to finalize the scale structure.
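Although the CFA reported here was run in AMOS, the measurement model can be written in lavaan-style syntax for illustration. The sketch below uses the open-source semopy package and the retained item names from Table 3; it is an illustrative re-specification under stated assumptions (including the hypothetical input file), not the authors’ AMOS setup.

```python
# Hedged sketch: nine first-order factors under three second-order dimensions.
import pandas as pd
import semopy

MODEL_DESC = """
CRG =~ CRG1 + CRG2 + CRG4 + CRG5
FMM =~ FMM1 + FMM2 + FMM3 + FMM5
TPM =~ TPM1 + TPM2 + TPM3 + TPM4
SLV =~ SLV1 + SLV2 + SLV4
EQ =~ EQ1 + EQ2 + EQ3 + EQ4
LD =~ LD1 + LD3 + LD4 + LD5
DA =~ DA1 + DA2 + DA4
AR =~ AR1 + AR2 + AR3 + AR4
IDS =~ IDS1 + IDS2 + IDS3 + IDS4
Pedagogy =~ CRG + FMM + TPM + SLV
Space =~ EQ + LD
Technology =~ DA + AR + IDS
"""

df2 = pd.read_csv("sample2_items.csv")  # hypothetical file: Sample 2 (n = 584)
model = semopy.Model(MODEL_DESC)
model.fit(df2)
print(semopy.calc_stats(model).T)   # chi2, CFI, TLI, GFI, AGFI, NFI, RMSEA
print(model.inspect(std_est=True))  # standardized loadings per item
```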
3. Results
EFA was conducted on the first dataset (n = 424) to identify the latent structure of the initial scale. The KMO value (0.889) and Bartlett’s test of sphericity (χ2 = 10,258.843, df = 561, p < 0.001) confirmed the adequacy of the data for factor analysis (Table 2). PCA with varimax rotation extracted nine factors with eigenvalues exceeding 1, which together accounted for 78.12% of the total variance, surpassing typical benchmarks in educational and psychological research (Gruijters, 2019; Härdle & Simar, 2015; Tabachnick & Fidell, 2007). The scree plot (Figure 1) shows a sharp drop in eigenvalues over the first nine factors, followed by a distinct “elbow”, after which the remaining factors level out with eigenvalues below 1. Factor loadings exceeded 0.60 for all retained items, and no additional deletions were necessary, yielding a robust nine-factor structure for subsequent validation.
The second dataset (n = 584), with a ratio of approximately 17:1 respondents per item, was used to verify the nine-factor structure. AMOS 24.0 estimated a model comprising nine first-order factors—Timely Progress Monitoring (TPM), Intelligent Diagnosis and Services (IDS), Layout and Display (LD), Flexible Instructional Methods and Materials (FMM), Clear and Relevant Goals (CRG), Environmental Quality (EQ), Access to Resources (AR), Supporting Learner Variability (SLV), and Device Access (DA)—organized under three broader dimensions (pedagogy, space, and technology). Model fit was excellent (χ2/df = 1.515, RMSEA = 0.030, GFI = 0.929, AGFI = 0.918, NFI = 0.946, TLI = 0.979, and CFI = 0.981), and all factor loadings were significant (p < 0.001). No item showed cross-loadings, verifying that the 34 retained items accurately measured their intended constructs (Huck et al., 1974). Figure 2 illustrates the final model, while further analyses confirmed that variables were correctly assigned to the nine factors, each aligning with the theoretical dimensions established through expert review.
The rotated component matrix (Table 3) shows how each survey item clusters onto nine distinct factors, with each column (Components 1–9) corresponding to one factor. For example, items TPM1–TPM4 load strongly on Component 1 (all above 0.80), indicating they measure a single underlying construct (Timely Progress Monitoring). Similarly, items IDS1–IDS4 load on Component 2 (Intelligent Diagnosis and Services), LD1–LD5 on Component 3 (Layout and Display), etc. Consequently, this table supports the conclusion that the final 34-item questionnaire reliably measures nine distinct constructs, each capturing a unique aspect of PL in a SC environment.
Cronbach’s alpha was used to assess internal consistency. The overall alpha coefficient for the scale was 0.938, while the subscale alphas ranged from 0.887 to 0.924, all well above the commonly accepted threshold of 0.70. Table 4 presents the means, standard deviations, and correlation coefficients among the nine latent constructs (CRG, FMM, TPM, SLV, EQ, LD, DA, AR, and IDS). The bold-faced diagonal entries (e.g., 0.820, 0.839) are the square roots of the AVE for each factor.
The Fornell–Larcker criterion supported discriminant validity: the square roots of the AVE for each factor surpassed the correlations with other factors, indicating that the nine constructs are empirically distinct. As shown in Table 4, the EFA and CFA findings, alongside favorable reliability and validity metrics, affirm that the nine-factor, 34-item SCE-PL is both psychometrically sound and theoretically coherent.
Table 5 offers a detailed overview of the convergent validity indices for the SCE-PL. Convergent validity was established by factor loadings (>0.50), composite reliability (>0.70), and average variance extracted (>0.50) for each factor, confirming that items within each subscale converge on a single underlying dimension.
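For transparency, the convergent and discriminant indices reported above follow standard formulas, sketched below in Python: composite reliability and AVE are computed from standardized loadings, and the Fornell–Larcker check compares each factor’s √AVE with its inter-factor correlations. The example loadings are copied from Table 5 for TPM only (other factors would be filled in analogously), and the helper names are hypothetical.

```python
# Hedged sketch of CR, AVE, and the Fornell-Larcker comparison.
import numpy as np
import pandas as pd

def composite_reliability(lam: np.ndarray) -> float:
    """CR = (sum(lambda))^2 / ((sum(lambda))^2 + sum(1 - lambda^2))."""
    s = lam.sum()
    return s**2 / (s**2 + (1 - lam**2).sum())

def ave(lam: np.ndarray) -> float:
    """AVE = mean of the squared standardized loadings."""
    return (lam**2).mean()

tpm = np.array([0.811, 0.895, 0.841, 0.805])  # TPM loadings from Table 5
print(round(composite_reliability(tpm), 3), round(ave(tpm), 3))  # 0.905 0.704

def fornell_larcker(ave_by_factor: pd.Series, phi: pd.DataFrame) -> pd.Series:
    """True where sqrt(AVE) exceeds the factor's largest correlation with
    any other factor; `phi` is the factor correlation matrix."""
    off_diagonal = phi.where(~np.eye(len(phi), dtype=bool))
    return np.sqrt(ave_by_factor) > off_diagonal.abs().max()
```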
4. Discussion
This study developed and validated the smart classroom environment–personalized learning scale (SCE-PL), a measurement instrument designed to capture middle-school students’ PL experiences in SC environments. Drawing on literature reviews, expert consultations, and iterative item refinement, we generated a final instrument comprising 34 items and nine factors aligned with three overarching dimensions: pedagogy, learning space, and technology (Radcliffe, 2009). EFA indicated that these nine factors accounted for a substantial portion of the total variance, while confirmatory factor analysis confirmed strong model fit and factor loadings. Reliability tests (Cronbach’s alpha and composite reliability) met or exceeded standard criteria, and convergent and discriminant validity were supported through factor loadings, AVE, and the Fornell–Larcker criterion. These findings underscore that the nine-factor structure and strong psychometric properties of the SCE-PL scale corroborate the claim that PL in SCs emerges from a dynamic interplay of student-centered pedagogy, thoughtfully designed learning spaces, and well-integrated technology (Basham et al., 2016; Bulger, 2016; Chatti et al., 2010; S. Y. Chen et al., 2016; Fraser, 1998; Gambo & Shakir, 2023; Z. Liu et al., 2022; Mead, 1934/1972; Radcliffe, 2009; Yang & Huang, 2015; Zhan et al., 2021; Ziegler, 2005; Ziegler & Stoeger, 2017).
The results enrich the theoretical dialogue surrounding PL and SC research by systematically bridging pedagogy, space, and technology factors under a single measurement framework (Ouf et al., 2017; Pane et al., 2015; Price, 2015; Radcliffe, 2009; Reushle, 2012; Saini & Goel, 2019; Van Schoors et al., 2021; Walkington & Bernacki, 2020; Xie et al., 2019; Yang & Huang, 2015; Zhan et al., 2021; Zhang et al., 2020a). Earlier research often centered on specific technologies (e.g., learning analytics dashboards and algorithm-based recommendation systems) or conceptual models lacking robust validation in varied contexts (e.g., Hu et al., 2022; Kaur & Bhatia, 2022; Niknam & Thulasiraman, 2020; Wang et al., 2023; Yu et al., 2024). By integrating Radcliffe’s (2009) PST framework with the conceptual underpinnings of PL, this study contributes a nuanced perspective that acknowledges both the instructional design (teacher strategies, student autonomy, and flexible methods) and environmental design (physical layout and device access) critical for personalizing students’ learning processes. Consequently, the SCE-PL may serve as a foundational tool for future empirical inquiries into how these dimensions interact to shape adaptive learning experiences.
From a constructivist perspective, learning is a process in which students actively construct meaning based on their experiences, interactions, and reflections. This study emphasizes the situated nature of PL by confirming that each of the nine subfactors is distinct yet correlated. Students are not merely passive recipients of knowledge; they engage with instructional materials, classroom technologies, peers, and teachers in ways profoundly shaped by individual differences and contextual factors. Such a view aligns with social constructionism, wherein reality—and, by extension, learning experiences (Charmaz, 2021; Glaser & Strauss, 1964; Robson & McCartan, 2016)—are co-constructed in specific social and technological contexts. The robust loadings of subfactors related to pedagogy further highlight that PL is not only about customizing content or pacing; it also depends on the interactive, learner-centered strategies that spark motivation and autonomy (Fraser, 1998; Tapscott, 2009; Yang & Huang, 2015; Zhan et al., 2021). The strong internal consistency of these constructs suggests that educators who adopt learner-oriented pedagogies guided by constructivist principles significantly influence students’ PL experiences.
The findings validate prior arguments that student-centered instructional methods (e.g., team-based learning and flipped classrooms) are essential for personalizing the learning experience (Basham et al., 2016; Bulger, 2016; Cheng & Carolyn Yang, 2023). That Timely Progress Monitoring (TPM) loaded strongly under pedagogy underscores the importance of continuous, data-driven feedback loops, aligning with constructivist theories that posit real-time interaction as a core facilitator of deeper learning. A smart learning space includes elements like interactive whiteboards, flexible seating, and environmental controls (Saini & Goel, 2019); hence, the significant loadings for Environmental Quality (EQ) and Layout and Display (LD) confirm the critical role of physical and infrastructural design in creating effective learning spaces (Blumer, 1966; Reushle, 2012). The data suggest that these factors are not merely aesthetic or comfort-related; they tangibly shape students’ perceptions of autonomy, engagement, and collaboration.
Strong factor loadings for Access to Resources (AR), Device Access (DA), and Intelligent Diagnosis and Services (IDS) reinforce the claim that smart technology is integral to PL, notably when it offers adaptive feedback, varied resources, and real-time assessments (Hu et al., 2022; Niknam & Thulasiraman, 2020; Yu et al., 2024). Such findings echo emerging stances on how technology mediates learning by providing interactive experiences that allow students to co-construct knowledge rather than passively receive it. Moreover, the evidence that these technological subfactors are empirically distinct yet interrelated with pedagogy and space underscores Radcliffe’s (2009) assertion that technology must be purposefully embedded within pedagogical strategies and physical infrastructure to maximize its educational impact. Furthermore, rather than seeing technology as a standalone enabler, the data illustrate that technology’s efficacy in personalizing education is contingent upon compatible pedagogical methods (such as immediate feedback and flexible pacing) and well-designed spaces (supportive seating, adjustable lighting, and interactive displays). Consequently, this study provides a more granular, empirically grounded articulation of how the synergy among pedagogy, space, and technology contributes to student-centered, adaptive learning cultures.
Finally, the validated structure of the SCE-PL enriches ongoing scholarly discourse on PL by merging classic learner-centered theories (Dewey, Piaget, Vygotsky) with modern, technology-integrated approaches (Bulger, 2016; Major et al., 2021; Zhang et al., 2022). The high internal consistency of subscales like Flexible Instructional Methods and Materials (FMM) or Supporting Learner Variability (SLV) highlights that personalizing education is not a single-pronged strategy but a holistic endeavor. Each dimension—be it real-time progress monitoring, physical layout, or AI-driven diagnostics—contributes uniquely to the learner’s constructivist process of active knowledge building (S. Y. Chen & Macredie, 2010). Moreover, mapping these subscales onto the pedagogy–space–technology matrix illustrates that personalization extends beyond simply providing digital resources or customizing tasks; it requires cohesive pedagogical design, flexible spatial arrangements, and sophisticated technological infrastructure. This integrated view resonates with the broader notion of “smart learning environments”, wherein learning transcends mere digitization to become an adaptive, interactive, and profoundly learner-centered experience (Ma et al., 2024; Mugruza-Vassallo, 2023; Yuan et al., 2021; Zhan et al., 2021).
Practitioners and policymakers can leverage the SCE-PL to diagnose and optimize PL initiatives within SCs. By pinpointing strengths and weaknesses across nine distinct factors—such as Timely Progress Monitoring (TPM), Device Access (DA), or Layout and Display (LD)—teachers and administrators gain actionable insights for targeted improvements. For instance, if students rate “Layout and Display” lower, schools might invest in rearranging seating or upgrading interactive displays to boost engagement. Likewise, higher scores in “Intelligent Diagnosis and Services” (IDS) but moderate scores in “Flexible Instructional Methods and Materials” (FMM) could imply that while real-time data analytics are robust, the range of pedagogical materials needs expansion. Overall, these diagnostic applications can foster continuous quality enhancement in SCs, aligning with broader global shifts toward data-informed educational practice (Kaur & Bhatia, 2022; C. Liu & Wang, 2018; Radcliffe, 2009; Reushle, 2012; Sharma et al., 2018; Zhan et al., 2021).
Despite its rigorous methodology, this study has several limitations. First, generalizability may be constrained by the sampling strategy, which, although stratified across three Chinese provinces, remains situated in specific cultural and educational contexts. Future validations in diverse regions—especially outside East Asia—are encouraged to ascertain the scale’s broader applicability. Second, self-reported data can be susceptible to social desirability bias, as students may overstate or understate their experiences (Diefes-Dux, 2019; Zhang et al., 2022). Incorporating observational or performance-based measures could yield a more holistic evaluation of PL (Sampson & Karagiannidis, 2002). Third, while the SCE-PL captures key dimensions aligned with the PST framework, other latent variables—such as motivation, self-efficacy, or teacher autonomy support—may also influence PL experiences (Fraser, 1998; Yang & Huang, 2015). Integrating these constructs could offer richer insights.
Building on the SCE-PL, researchers might explore longitudinal designs to assess how students’ PL experiences evolve over time, particularly as SC infrastructure and pedagogical innovations mature. Cross-cultural replication could illuminate whether the nine-factor structure holds under varying educational policies, socio-economic conditions, or technological advancements. Additionally, experimental or quasi-experimental designs could investigate causal relationships between improved scores on specific subscales and measurable outcomes such as academic performance, engagement, or self-regulated learning skills. Finally, further integration with learning analytics—for instance, correlating SCE-PL scores with real-time data from learning management systems—could refine our understanding of how different aspects of the scale map onto students’ actual learning behaviors.
To conclude, by uniting pedagogy, space, and technology elements into a validated nine-factor instrument, this study addresses a critical gap in measuring PL experiences within SCs. The SCE-PL offers a theoretically grounded and empirically substantiated tool for both researchers and practitioners, enabling more precise identification of strengths and challenges in technology-enriched instructional settings. Doing so lays the groundwork for data-driven refinements to PL strategies, ultimately advancing the broader goal of placing learner individuality at the heart of educational innovation.
Conceptualization, A.Z., B.Z., P.T. and M.B.; methodology, P.T., A.Z. and M.B.; software, P.T.; validation, P.T.; formal analysis, P.T.; investigation, P.T. and M.B.; resources, P.T. and M.B.; data curation, P.T.; writing—original draft preparation, P.T. and M.B.; writing—review and editing, M.B., P.T. and A.Z.; visualization, P.T. and M.B.; supervision, A.Z. and B.Z. All authors have read and agreed to the published version of the manuscript.
The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of Faculty of Education, Shaanxi Normal University (date of 20 February 2025).
Informed consent was obtained from all subjects involved in the study.
The raw data supporting the conclusions of this article will be made available by the authors upon request.
The authors declare no conflicts of interest.
The following abbreviations are used in this manuscript:
AERA | American Educational Research Association |
APA | American Psychological Association |
AR | Access to Resources |
AVE | Average Variance Extracted |
CFA | Confirmatory Factor Analysis |
CR | Composite Reliability |
CRG | Clear and Relevant Goals |
DA | Device Access |
EFA | Exploratory Factor Analysis |
EQ | Environmental Quality |
FMM | Flexible Instructional Methods and Materials |
IDS | Intelligent Diagnosis and Services |
KMO | Kaiser–Meyer–Olkin |
LD | Layout and Display |
MDPI | Multidisciplinary Digital Publishing Institute |
PST | Pedagogy–Space–Technology |
SC | Smart Classroom |
SCE-PL | Smart Classroom Environment–Personalized Learning |
SLV | Supporting Learner Variability |
TEL | Technology-Enhanced Personalized Learning |
TMC | Traditional Multimedia Classrooms |
TPM | Timely Progress Monitoring |
Footnotes
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Figure 1 Scree plot.
Figure 2 Model of confirmatory factor analysis.
Table 1. The demographic composition of the samples.
Participant Demographics | Sample 1 | Sample 2 |
---|---|---|
Grade level | n (%) | n (%) |
Seventh grade | 159 (36.14%) | 225 (38.53%) |
Eighth grade | 135 (30.68%) | 173 (29.62%) |
Ninth grade | 146 (33.18%) | 186 (31.85%) |
Gender | ||
Male | 253 (57.5%) | 281 (48.12%) |
Female | 187 (42.5%) | 303 (51.88%) |
* Before data purification.
Table 2. KMO test and Bartlett’s test of sphericity.
KMO | Bartlett’s Test of Sphericity: χ2 | df | Sig.
---|---|---|---
0.889 | 10,258.843 | 561 | 0.000 |
Table 3. Rotated component matrix.
Items | Component | ||||||||
---|---|---|---|---|---|---|---|---|---|
1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | |
TPM3 | 0.845 | 0.013 | 0.079 | 0.145 | 0.173 | 0.048 | 0.065 | 0.159 | 0.054 |
TPM2 | 0.839 | 0.034 | 0.103 | 0.236 | 0.215 | 0.007 | 0.019 | 0.172 | 0.027 |
TPM4 | 0.819 | 0.064 | 0.028 | 0.221 | 0.124 | −0.008 | 0.008 | 0.122 | 0.025 |
TPM1 | 0.808 | 0.005 | 0.042 | 0.206 | 0.218 | 0.039 | 0.002 | 0.149 | −0.010 |
IDS3 | 0.028 | 0.892 | 0.054 | 0.033 | 0.075 | 0.057 | 0.161 | 0.060 | 0.167 |
IDS2 | −0.009 | 0.861 | 0.099 | 0.079 | 0.049 | 0.066 | 0.194 | 0.045 | 0.162 |
IDS1 | 0.066 | 0.853 | 0.109 | 0.058 | 0.061 | 0.064 | 0.174 | 0.014 | 0.167 |
IDS4 | 0.034 | 0.814 | 0.088 | 0.087 | 0.067 | 0.070 | 0.212 | 0.005 | 0.160 |
LD1 | 0.052 | 0.064 | 0.868 | 0.055 | 0.060 | 0.209 | 0.075 | 0.032 | 0.036 |
LD5 | 0.074 | 0.097 | 0.855 | 0.059 | 0.134 | 0.264 | 0.100 | 0.095 | 0.067 |
LD3 | 0.066 | 0.095 | 0.835 | 0.139 | 0.054 | 0.202 | 0.049 | 0.059 | 0.116 |
LD4 | 0.070 | 0.116 | 0.831 | 0.136 | 0.081 | 0.281 | 0.136 | 0.098 | 0.078 |
FMM5 | 0.257 | 0.041 | 0.100 | 0.848 | 0.166 | 0.058 | 0.007 | 0.153 | 0.036 |
FMM2 | 0.182 | 0.092 | 0.102 | 0.824 | 0.154 | 0.072 | 0.034 | 0.176 | 0.072 |
FMM1 | 0.204 | 0.150 | 0.122 | 0.794 | 0.246 | 0.117 | 0.025 | 0.180 | 0.066 |
FMM3 | 0.300 | 0.010 | 0.096 | 0.696 | 0.217 | 0.122 | 0.109 | 0.184 | −0.011 |
CRG5 | 0.123 | 0.022 | 0.093 | 0.111 | 0.839 | 0.027 | 0.016 | 0.107 | 0.123 |
CRG4 | 0.130 | 0.097 | 0.068 | 0.198 | 0.790 | 0.053 | 0.106 | 0.141 | 0.015 |
CRG2 | 0.268 | 0.105 | 0.058 | 0.192 | 0.790 | 0.126 | 0.019 | 0.161 | −0.021 |
CRG1 | 0.304 | 0.052 | 0.108 | 0.235 | 0.774 | 0.057 | 0.062 | 0.207 | −0.004 |
EQ4 | −0.018 | 0.137 | 0.210 | 0.025 | 0.059 | 0.825 | 0.067 | 0.036 | 0.048 |
EQ2 | 0.017 | 0.017 | 0.153 | 0.116 | 0.083 | 0.822 | 0.037 | −0.008 | 0.040 |
EQ3 | 0.044 | 0.045 | 0.226 | 0.034 | 0.014 | 0.820 | 0.083 | 0.096 | 0.062 |
EQ1 | 0.047 | 0.047 | 0.263 | 0.117 | 0.063 | 0.806 | 0.097 | 0.025 | 0.016 |
AR2 | 0.029 | 0.195 | 0.071 | 0.034 | 0.001 | 0.097 | 0.830 | 0.043 | 0.133 |
AR1 | 0.066 | 0.160 | 0.083 | 0.067 | 0.116 | 0.017 | 0.819 | −0.019 | 0.140 |
AR4 | −0.003 | 0.174 | 0.108 | 0.047 | 0.087 | 0.033 | 0.814 | −0.091 | 0.165 |
AR3 | −0.002 | 0.175 | 0.058 | −0.013 | −0.024 | 0.138 | 0.763 | 0.092 | 0.153 |
SLV2 | 0.281 | 0.065 | 0.053 | 0.224 | 0.150 | 0.060 | −0.024 | 0.816 | 0.051 |
SLV4 | 0.190 | −0.006 | 0.131 | 0.203 | 0.230 | 0.009 | 0.029 | 0.807 | 0.032 |
SLV1 | 0.153 | 0.059 | 0.077 | 0.192 | 0.190 | 0.081 | 0.016 | 0.807 | −0.010 |
DA1 | 0.047 | 0.231 | 0.066 | 0.056 | 0.032 | 0.076 | 0.198 | −0.002 | 0.811 |
DA4 | 0.024 | 0.203 | 0.064 | 0.055 | 0.056 | 0.058 | 0.179 | 0.029 | 0.808 |
DA2 | 0.011 | 0.184 | 0.122 | 0.019 | 0.022 | 0.025 | 0.203 | 0.034 | 0.794 |
Eigenvalue | 3.344 | 3.310 | 3.261 | 3.058 | 3.048 | 3.042 | 2.950 | 2.342 | 2.205 |
% of Variance | 9.835 | 9.734 | 9.591 | 8.994 | 8.965 | 8.947 | 8.677 | 6.888 | 6.484 |
Cumulative % | 9.835 | 19.569 | 29.160 | 38.155 | 47.120 | 56.066 | 64.743 | 71.631 | 78.116 |
Table 4. Discriminant validity.
Mean | SD | CRG | FMM | TPM | SLV | EQ | LD | DA | AR | IDS | |
---|---|---|---|---|---|---|---|---|---|---|---|
CRG | 3.59 | 1.06 | 0.820 | ||||||||
FMM | 3.73 | 0.96 | 0.505 ** | 0.839 | |||||||
TPM | 3.55 | 1.02 | 0.491 ** | 0.524 ** | 0.839 | ||||||
SLV | 3.63 | 1.05 | 0.489 ** | 0.477 ** | 0.497 ** | 0.830 | |||||
EQ | 3.60 | 0.98 | 0.241 ** | 0.258 ** | 0.163 ** | 0.234 ** | 0.816 | ||||
LD | 3.67 | 1.07 | 0.374 ** | 0.335 ** | 0.274 ** | 0.321 ** | 0.540 ** | 0.886 | |||
DA | 3.69 | 1.01 | 0.275 ** | 0.225 ** | 0.247 ** | 0.226 ** | 0.235 ** | 0.326 ** | 0.802 | ||
AR | 3.79 | 0.88 | 0.249 ** | 0.238 ** | 0.268 ** | 0.219 ** | 0.136 ** | 0.271 ** | 0.454 ** | 0.806 | |
IDS | 3.72 | 0.93 | 0.221 ** | 0.258 ** | 0.235 ** | 0.224 ** | 0.207 ** | 0.287 ** | 0.504 ** | 0.431 ** | 0.873 |
** Statistically significant at the p < 0.01 level.
Table 5. Convergent validity.
Variable | Item | Factor Loading | Composite Reliability (CR) | Average Variance Extracted (AVE)
---|---|---|---|---
Smart Classroom Environment–Personalized Learning Scale (SCE-PL) | Pedagogy | 0.719 | 0.752 | 0.503
Space | 0.720 | |||
Technology | 0.688 | |||
Pedagogy | CRG | 0.766 | 0.836 | 0.560 |
FMM | 0.741 | |||
TPM | 0.750 | |||
SLV | 0.735 | |||
Space | EQ | 0.800 | 0.747 | 0.597 |
LD | 0.744 | |||
Technology | DA | 0.798 | 0.767 | 0.525 |
AR | 0.662 | |||
IDS | 0.707 | |||
Clear and Relevant Goals (CRG) | CRG1 | 0.883 | 0.891 | 0.672 |
CRG2 | 0.864 | |||
CRG4 | 0.765 | |||
CRG5 | 0.758 | |||
Flexible Instructional Methods and Materials (FMM) | FMM1 | 0.871 | 0.905 | 0.704 |
FMM2 | 0.832 | |||
FMM3 | 0.750 | |||
FMM5 | 0.896 | |||
Timely Progress Monitoring (TPM) | TPM1 | 0.811 | 0.905 | 0.704 |
TPM2 | 0.895 | |||
TPM3 | 0.841 | |||
TPM4 | 0.805 | |||
Supporting Learner Variability (SLV) | SLV1 | 0.772 | 0.868 | 0.688 |
SLV2 | 0.883 | |||
SLV4 | 0.830 | |||
Environmental Quality (EQ) | EQ1 | 0.857 | 0.888 | 0.665 |
EQ2 | 0.766 | |||
EQ3 | 0.837 | |||
EQ4 | 0.799 | |||
Layout and Display (LD) | LD1 | 0.833 | 0.936 | 0.785 |
LD3 | 0.855 | |||
LD4 | 0.923 | |||
LD5 | 0.929 | |||
Device Access (DA) | DA1 | 0.842 | 0.844 | 0.644 |
DA2 | 0.778 | |||
DA4 | 0.785 | |||
Access to Resources (AR) | AR1 | 0.810 | 0.881 | 0.650 |
AR2 | 0.835 | |||
AR3 | 0.754 | |||
AR4 | 0.824 | |||
Intelligent Diagnosis and Services (IDS) | IDS1 | 0.864 | 0.928 | 0.762 |
IDS2 | 0.879 | |||
IDS3 | 0.910 | |||
IDS4 | 0.837 |
Appendix A
Table A1. The 39 items of the SCE-PL developed in the first stage.
Items | 1 | 2 | 3 | 4 | 5 |
---|---|---|---|---|---|
1. The teacher sets personalized learning goals for me based on my learning data. | |||||
2. I clearly understand the learning objectives of each smart classroom lesson. | |||||
3. The smart platform or smart devices (such as tablets, laptops, etc.) recommend suitable learning tasks to help me achieve the learning goals. | |||||
4. The teacher provides feedback on my progress toward achieving learning goals through the smart platform. | |||||
5. The smart classroom offers learning content and resources that I am interested in. | |||||
6. The teacher provides multiple learning content options through the smart platform, and I can choose the most suitable content based on my interests and needs. | |||||
7. The teacher uses smart tools (such as online resource libraries and learning path recommendation systems) to recommend diverse learning materials (such as videos, articles, exercises, etc.) for me. | |||||
8. I can participate in the course in different ways (such as online discussions, virtual experiments, group collaboration, etc.) through the smart platform or smart devices. | |||||
9. The teacher offers various ways to showcase learning outcomes (such as text, charts, videos, etc.) via the smart platform, allowing me to choose the most appropriate method. | |||||
10. The smart platform or smart devices provide abundant learning resources and materials, allowing me to flexibly choose the most suitable learning approach. | |||||
11. The teacher monitors my learning progress in real time through the smart platform and adjusts the teaching content and methods accordingly. | |||||
12. When I make mistakes in class, the teacher provides more accurate guidance based on the diagnostic feedback from the smart platform. | |||||
13. The teacher checks my learning progress in real time through the smart platform and provides timely feedback. | |||||
14. The teacher regularly checks my progress on completing learning tasks through the smart platform. | |||||
15. The teacher designs personalized learning tasks based on my learning data to help me better understand the course topics. | |||||
16. The teacher provides me with challenging tasks suited to my learning level through the smart platform. | |||||
17. The teacher uses smart tools (such as group functions) to help me collaborate with classmates to meet my personalized learning needs. | |||||
18. The teacher adjusts teaching strategies based on learning analysis reports from the smart platform to ensure that each student learns in the most suitable way. | |||||
19. The classroom has sufficient natural light, allowing me to see the blackboard and textbooks more clearly, which enhances my learning experience. | |||||
20. I can adjust the temperature of the smart classroom according to my needs, ensuring comfort without distractions. | |||||
21. I can adjust the brightness of the classroom lighting as needed, making the learning environment more comfortable. | |||||
22. The classroom’s sound system ensures that I can clearly hear the teacher’s explanations. | |||||
23. I find the seating in the classroom very comfortable, which helps me focus on learning. | |||||
24. The seating arrangement in the classroom is flexible, allowing me to move in and out easily and interact with others. | |||||
25. I can easily adjust the position of smart devices to participate in different types of learning activities. | |||||
26. The desk size is just right, providing enough space for textbooks, smart devices, and other materials. | |||||
27. The layout of the classroom’s central control system, interactive whiteboard, and smart projectors is well-organized, suitable for both teaching and learning. | |||||
28. I can easily use the smart devices provided by the school for learning. | |||||
29. The smart devices provided by the school are high-quality and capable of supporting a variety of learning tasks. | |||||
30. I can continue learning at home using the smart devices or smart platform provided by the school. | |||||
31. The school regularly updates and maintains the smart devices to ensure they are functioning properly. | |||||
32. I can easily access the learning resources recommended by the teacher (such as videos, articles, exercises, etc.) through the smart platform or smart devices. | |||||
33. I can access supplementary learning resources related to the course content through the smart platform or smart devices. | |||||
34. I can share learning resources with other classmates through the smart platform or smart devices. | |||||
35. The smart platform generates a personalized list of learning resources to help me complete learning tasks more effectively. | |||||
36. The smart platform or smart devices automatically analyze my learning weaknesses based on my learning data (such as quiz results, study time, etc.). | |||||
37. When I encounter learning difficulties, the smart platform or smart devices recommend suitable learning resources or solutions to assist me. | |||||
38. The smart platform or smart devices generate a personalized learning report to help me track my learning progress. | |||||
39. After completing learning tasks, the smart platform or smart devices provide detailed error analysis and improvement suggestions. |
AERA. AERA code of ethics: American educational research association approved by the AERA council February 2011. Educational Researcher; 2011; 40,
APA. Ethical principles of psychologists and code of conduct; American Psychological Association: 2017; Available online: https://www.apa.org/ethics/code/ethics-code-2017.pdf (accessed on 20 February 2025).
Basham, J. D.; Hall, T. E.; Carter, R. A.; Stahl, W. M. An operationalized understanding of personalized learning. Journal of Special Education Technology; 2016; 31,
Blumer, H. Sociological implications of the thought of George Herbert mead. American Journal of Sociology; 1966; 71, pp. 535-544. [DOI: https://dx.doi.org/10.1086/224171]
Bryant, A.; Charmaz, K. Grounded theory research: Methods and practices. The SAGE handbook of grounded theory; Bryant, A.; Charmaz, K. Sage: 2007; pp. 1-28.
Bulger, M. Personalized learning: The conversations we’re not having. Data and Society; 2016; 22,
Charmaz, K. Constructing grounded theory: A practical guide through qualitative analysis; Sage: 2006.
Charmaz, K. Morse, J. M.; Bowers, B. J.; Charmaz, K.; Clarke, A. E.; Corbin, J.; Porr, C. J.; Stern, P. N. The genesis, grounds, and growth of constructivist grounded theory. Developing grounded theory: The second generation revisited; 2nd ed. Routledge: 2021; pp. 153-187. [DOI: https://dx.doi.org/10.4324/9781315169170]
Chatti, M. A.; Agustiawan, M. R.; Jarke, M.; Specht, M. Toward a personal learning environment framework. International Journal of Virtual and Personal Learning Environments (IJVPLE); 2010; 1,
Chen, C.-M.; Lee, H.-M.; Chen, Y.-H. Personalized e-learning system using item response theory. Computers & Education; 2005; 44,
Chen, S. Y.; Huang, P.-R.; Shih, Y.-C.; Chang, L.-P. Investigation of multiple human factors in personalized learning. Interactive Learning Environments; 2016; 24,
Chen, S. Y.; Macredie, R. Web-based interaction: A review of three important human factors. International Journal of Information Management; 2010; 30,
Cheng, C. C.; Carolyn Yang, Y. T. Impact of smart classrooms combined with student-centered pedagogies on rural students’ learning outcomes: Pedagogy and duration as moderator variables. Computers and Education; 2023; 207, 104911. [DOI: https://dx.doi.org/10.1016/j.compedu.2023.104911]
Cheung, S. K. S.; Kwok, L. F.; Phusavat, K.; Yang, H. H. Shaping the future learning environments with smart elements: Challenges and opportunities. International Journal of Educational Technology in Higher Education; 2021; 18,
Dai, D. Y.; Chen, F. Three paradigms of gifted education: In search of conceptual clarity in research and practice. Gifted Child Quarterly; 2013; 57,
Dai, D. Y.; Swanson, J. A.; Cheng, H. State of research on giftedness and gifted education: A survey of empirical studies published during 1998–2010 (April). Gifted Child Quarterly; 2011; 55,
DeVellis, R. F. Scale development: Theory and applications; Sage Publications, Inc.: 1991.
Diefes-Dux, H. A. Student self-reported use of standards-based grading resources and feedback. European Journal of Engineering Education; 2019; 44,
EDUCAUSE Learning Initiative. Learning space rating system; 2021; Available online: https://www.educause.edu/focus-areas-and-initiatives/teaching-and-learning-program/initiatives/learning-space-rating-system (accessed on 21 February 2025).
Fake, H.; Dabbagh, N. Personalized learning within online workforce learning environments: Exploring implementations, obstacles, opportunities, and perspectives of workforce leaders. Technology, Knowledge and Learning; 2020; 25,
Fan, J. Construction and application of English smart classroom teaching model integrating MOOC and flipped classroom. Applied Mathematics and Nonlinear Sciences; 2024; 9,
Farkas, G. Cognitive skills and noncognitive traits and behaviors in stratification processes. Annual Review of Sociology; 2003; 29, pp. 541-562. [DOI: https://dx.doi.org/10.1146/annurev.soc.29.010202.100023]
Fraser, B. J. Classroom environment instruments: Development, validity and applications. Learning Environments Research; 1998; 1, pp. 7-34. [DOI: https://dx.doi.org/10.1023/A:1009932514731]
Gambo, Y.; Shakir, M. Z. Evaluating students’ experiences in self-regulated smart learning environment. Education and Information Technologies; 2023; 28,
Glaser, B. G.; Strauss, A. L. Awareness contexts and social interaction. American Sociological Review; 1964; 29,
Göncz, L. Teacher personality: A review of psychological research and guidelines for a more comprehensive theory in educational psychology. Open Review of Educational Research; 2017; 4,
Gruijters, S. L. K. Using principal component analysis to validate psychological scales: Bad statistical habits we should have broken yesterday II. The European Health Psychologist; 2019; 20,
Hair, J. F.; Black, W. C.; Babin, B. J.; Anderson, R. E.; Tatham, R. L. Multivariate data analysis; Prentice-Hall: 2010; Vol. 5.
Hambleton, R. K. Issues, designs, and technical guidelines for adapting tests into multiple languages and cultures. Adapting educational and psychological tests for cross-cultural assessment; Hambleton, R. K.; Merenda, P. F.; Spielberger, C. D. Lawrence Erlbaum Associates Inc. Publishers: 2005.
Härdle, W. K.; Simar, L. Applied multivariate statistical analysis; 4th ed. Springer: 2015; [DOI: https://dx.doi.org/10.1007/978-3-662-45171-7]
Hinkin, T. R. A review of scale development practices in the study of organizations. Journal of Management; 1995; 21,
Holmes, W.; Anastopoulou, S.; Schaumburg, H.; Mavrikis, E. Technology-enhanced personalised learning: Untangling the evidence; Robert Bosch Stiftung: 2018.
Hu, Y.; Huang, J.; Kong, F. College students’ learning perceptions and outcomes in different classroom environments: A community of inquiry perspective. Frontiers in Psychology; 2022; 13, 1047027. [DOI: https://dx.doi.org/10.3389/fpsyg.2022.1047027]
Huber, S. A.; Seidel, T. Comparing teacher and student perspectives on the interplay of cognitive and motivational-affective student characteristics. PLoS ONE; 2018; 13,
Huck, S. W.; Cormier, W. H.; Bounds, W. G. Reading statistics and research; Harper & Row: 1974.
Kaur, A.; Bhatia, M. Smart classroom: A review and research agenda. IEEE Transactions on Engineering Management; 2022; 71, pp. 2430-2446. [DOI: https://dx.doi.org/10.1109/TEM.2022.3176477]
Keefe, J. W.; Jenkins, J. M. Personalized instruction: The key to student achievement; 2nd ed. Rowman & Littlefield Education: 2008.
Kell, H. J. Noncognitive proponents’ conflation of “cognitive skills” and “cognition” and its implications. Personality and Individual Differences; 2018; 134,
Ketscher, L.; Stoeger, H.; Vialle, W.; Ziegler, A. Same classroom, different reality: Secondary school students’ perceptions of STEM lessons—A pioneering study. Education Sciences; 2025; 15,
Kloos, C. D.; Alario-Hoyos, C.; Muñoz-Merino, P. J.; Ibáñez, M. B.; Estévez-Ayres, I.; Fernández-Panadero, C. Educational technology in the age of natural interfaces and deep learning. IEEE Revista Iberoamericana de Tecnologias del Aprendizaje; 2020; 15,
Koh, Y. Y. J.; Schmidt, H. G.; Low-Beer, N.; Rotgans, J. I. Team-based learning analytics: An empirical case study. Academic Medicine; 2020; 95,
Kropotkin, P. Ethics: Origins and development; Friedland, L. S.; Piroshnikoff, J. R. George G. Harrap and Co., Ltd.: 1924; (Original work published 1922) Available online: https://theanarchistlibrary.org/library/petr-kropotkin-ethics-origin-and-development (accessed on 20 February 2025).
Lemert, E. M. Beyond Mead: The societal reaction to deviance. Social Problems; 1974; 21,
Lindsley, O. R. Precision teaching: Discoveries and effects. Journal of Applied Behavior Analysis; 1992; 25,
Liu, C.; Wang, X. Application of smart education in college English flipped classroom. 8th International Conference on Social Network, Communication and Education (SNCE 2018); Shenyang, China, March 2–4; 2018.
Liu, Z.; Zhang, N.; Peng, X.; Liu, S.; Yang, Z.; Peng, J.; Su, Z.; Chen, J. Exploring the relationship between social interaction, cognitive processing and learning achievements in a MOOC discussion forum. Journal of Educational Computing Research; 2022; 60,
Lycurgus, T.; Kilbourne, A.; Almirall, D. Approaches to statistical efficiency when comparing the embedded adaptive interventions in a SMART. Journal of Educational and Behavioral Statistics; 2024; 10769986241251419. [DOI: https://dx.doi.org/10.3102/10769986241251419]
Ma, X.; Xie, Y.; Yang, X.; Wang, H.; Li, Z.; Lu, J. Teacher-student interaction modes in smart classroom based on lag sequential analysis. Education and Information Technologies; 2024; 29.
Major, L.; Francis, G. A.; Tsapali, M. The effectiveness of technology-supported personalised learning in low- and middle-income countries: A meta-analysis. British Journal of Educational Technology; 2021; 52.
Malekigorji, M.; Hatahet, T. Classroom response system in a super-blended learning and teaching model: Individual or team-based learning? Pharmacy; 2020; 8.
MDPI. Research and publication ethics; MDPI: n.d.; Available online: https://www.mdpi.com/ethics (accessed on 22 February 2025).
Mead, G. H. Mind, self, and society: From the standpoint of a social behaviorist; University of Chicago Press: 1972; (Original work published 1934).
Merton, R. K. The Thomas theorem and the Matthew effect. Social Forces; 1995; 74.
Morse, J. M.; Bowers, B. J.; Clarke, A. E.; Charmaz, K.; Corbin, J.; Porr, C. J. The challenges to and future(s) of grounded theory. Developing grounded theory: The second generation revisited; Morse, J. M.; Bowers, B. J.; Charmaz, K.; Clarke, A. E.; Corbin, J.; Porr, C. J.; Stern, P. N.; 2nd ed. Routledge: 2021; pp. 289-319. [DOI: https://dx.doi.org/10.4324/9781315169170]
Mugruza-Vassallo, C. A. A “fractal” expander-compressor-supplier formative research method on array processing. Education and Information Technologies; 2023; 28.
Niknam, M.; Thulasiraman, P. LPR: A bio-inspired intelligent learning path recommendation system based on meaningful learning theory. Education and Information Technologies; 2020; 25.
Ouf, S.; Abd Ellatif, M.; Salama, S. E.; Helmy, Y. A proposed paradigm for smart learning environment based on semantic web. Computers in Human Behavior; 2017; 72, pp. 796-818. [DOI: https://dx.doi.org/10.1016/j.chb.2016.08.030]
Pane, J. F.; Steiner, E. D.; Baird, M. D.; Hamilton, L. S. Promising evidence on personalized learning; RAND Corporation: 2015; Available online: https://eric.ed.gov/ED571009 (accessed on 22 February 2025).
Patrick, S.; Kennedy, K.; Powell, A. K. Mean what you say: Defining and integrating personalized, blended and competency education; International Association for K-12 Online Learning: 2013; Available online: http://files.eric.ed.gov/fulltext/ED561301.pdf (accessed on 22 February 2025).
Polit, D. F.; Beck, C. T. The content validity index: Are you sure you know what’s being reported? Critique and recommendations. Research in Nursing & Health; 2006; 29.
Price, J. K. Transforming learning for the smart learning environment: Lessons learned from the Intel education initiatives. Smart Learning Environments; 2015; 2.
Radcliffe, D. A pedagogy-space-technology (PST) framework for designing and evaluating learning spaces. Learning spaces in higher education—Positive outcomes by design: Proceedings of the Next Generation Learning Spaces 2008 Colloquium; Radcliffe, D.; Wilson, H.; Powell, D.; Tibbets, B.; University of Queensland: 2009.
Redding, S. Competencies and personalized learning. Handbook on personalized learning for states, districts, and schools; Murphy, M.; Redding, S.; Twyman, J. Centeril: 2016; pp. 3-18.
Resnik, D. B. The ethics of science: An introduction; Routledge: 1998.
Reushle, S. Designing and evaluating learning spaces: PaSsPorT and design-based research. Physical and virtual learning spaces in higher education: Concepts for the modern learning environment; Keppell, M.; Souter, K.; Riddle, M.; IGI Global: 2012; pp. 87-101. [DOI: https://dx.doi.org/10.4018/978-1-60960-114-0.ch006]
Robson, C.; McCartan, K. Real world research: A resource for users of social research methods in applied settings; 4th ed. Wiley: 2016.
Saini, M. K.; Goel, N. How smart are smart classrooms? A review of smart classroom technologies. ACM Computing Surveys; 2019; 52.
Sampson, D. G.; Karagiannidis, C. Personalised learning: Educational, technological and standardisation perspective. Digital Education Review; 2002; pp. 24-39.
Santhosh, J.; Dzsotjan, D.; Ishimaru, S. Multimodal assessment of interest levels in reading: Integrating eye-tracking and physiological sensing. IEEE Access; 2023; 11, pp. 93994-94008. [DOI: https://dx.doi.org/10.1109/ACCESS.2023.3311268]
Sharma, B. N.; Nand, R.; Naseem, M.; Reddy, E.; Narayan, S. S.; Reddy, K. Smart learning in the pacific: Design of new pedagogical tools. 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE); Wollongong, NSW, Australia, December 4–7; 2018.
Shaw, R.; Patra, B. K. Classifying students based on cognitive state in flipped learning pedagogy. Future Generation Computer Systems; 2022; 126.
Sternberg, R. J. Giftedness as trait vs. state. Roeper Review; 2022; 44.
Sungkur, R. K.; Maharaj, M. S. Design and implementation of a SMART learning environment for the upskilling of cybersecurity professionals in Mauritius. Education and Information Technologies; 2021; 26.
Tabachnick, B. G.; Fidell, L. S. Using multivariate statistics; 5th ed. Allyn & Bacon/Pearson Education: 2007.
Tapscott, D. Grown up digital: How the net generation is changing the world; McGraw Hill: 2009.
The Nuremberg Code (1947). BMJ; 1996; 313.
Thomas, W. I.; Thomas, D. S. The child in America: Behavior problems and programs; Knopf: 1928.
Teo, T.; Tan, S. C.; Lee, C. B.; Chai, C. S.; Koh, J. H. L.; Chen, W. L.; Cheah, H. M. The self-directed learning with technology scale (SDLTS) for young students: An initial development and validation. Computers & Education; 2010; 55.
Toivonen, T.; Jormanainen, I.; Montero, C. S.; Alessandrini, A. Innovative maker movement platform for K-12 education as a smart learning environment. Challenges and solutions in smart learning; Chang, M.; Popescu, E.; Kinshuk; Chen, N.-S.; Jemni, M.; Huang, R.; Spector, J. M.; Springer: Singapore, 2018.
Troussas, C.; Krouska, A.; Sgouropoulou, C. Collaboration and fuzzy-modeled personalization for mobile game-based learning in higher education. Computers & Education; 2020; 144, 103698. [DOI: https://dx.doi.org/10.1016/j.compedu.2019.103698]
U.S. Department of Education. Transforming American education: Learning powered by technology; Office of Educational Technology: 2010; Available online: https://files.eric.ed.gov/fulltext/ED512681.pdf (accessed on 23 December 2024).
Van Schoors, R.; Elen, J.; Raes, A.; Depaepe, F. An overview of 25 years of research on digital personalised learning in primary and secondary education: A systematic review of conceptual and methodological trends. British Journal of Educational Technology; 2021; 52.
Walkington, C.; Bernacki, M. L. Appraising research on personalized learning: Definitions, theoretical alignment, advancements, and future directions. Journal of Research on Technology in Education; 2020; 52.
Wang, J.; Xie, K.; Liu, Q.; Long, T.; Lu, G. Examining the effect of seat location on students’ real-time social interactions in a smart classroom using experience sampling method. Journal of Computers in Education; 2023; 10.
Wirthwein, L.; Bergold, S.; Preckel, F.; Steinmayr, R. Personality and school functioning of intellectually gifted and nongifted adolescents: Self-perceptions and parents’ assessments. Learning and Individual Differences; 2019; 73, pp. 16-29. [DOI: https://dx.doi.org/10.1016/j.lindif.2019.04.003]
Xie, H.; Chu, H.-C.; Hwang, G.-J.; Wang, C.-C. Trends and development in technology-enhanced adaptive/personalized learning: A systematic review of journal publications from 2007 to 2017. Computers & Education; 2019; 140, 103599. [DOI: https://dx.doi.org/10.1016/j.compedu.2019.103599]
Yang, J.; Huang, R. Development and validation of a scale for evaluating technology-rich classroom environment. Journal of Computers in Education; 2015; 2.
Yu, H.; Wang, E.; Lang, Q.; Wang, J. Intelligent retrieval and comprehension of entrepreneurship education resources based on semantic summarization of knowledge graphs. IEEE Transactions on Learning Technologies; 2024; 17, pp. 1210-1221. [DOI: https://dx.doi.org/10.1109/TLT.2024.3364155]
Yuan, X.; Yu, L.; Wu, H. Awareness of sustainable development goals among students from a Chinese senior high school. Education Sciences; 2021; 11.
Zhan, Z.; Wu, Q.; Lin, Z.; Cai, J. Smart classroom environments affect teacher-student interaction: Evidence from a behavioural sequence analysis. Australasian Journal of Educational Technology; 2021; 37.
Zhang, L.; Basham, J. D.; Carter, R. A. Measuring personalized learning through the Lens of UDL: Development and content validation of a student self-report instrument. Studies in Educational Evaluation; 2022; 72, 101121. [DOI: https://dx.doi.org/10.1016/j.stueduc.2021.101121]
Zhang, L.; Basham, J. D.; Yang, S. Understanding the implementation of personalized learning: A research synthesis. Educational Research Review; 2020a; 31, 100339. [DOI: https://dx.doi.org/10.1016/j.edurev.2020.100339]
Zhang, L.; Yang, S.; Carter, R. A. Personalized learning and ESSA: What we know and where we go. Journal of Research on Technology in Education; 2020b; 52.
Zhao, J.; Zhang, L.; Yao, X. Developing and validating a scale for university teacher’s caring behavior in online teaching. Education Sciences; 2023; 13.
Ziegler, A. The actiotope model of giftedness. Conceptions of giftedness; 2nd ed. Cambridge University Press: 2005; pp. 411-436. [DOI: https://dx.doi.org/10.1017/CBO9780511610455.024]
Ziegler, A.; Bicakci, M. Labeling the gifted: An overview of four core challenges [Keynote address]. Gifted Students: Nomen est Omen [Nadaný žák: Nomen est Omen]; Zlin, Czechia, 2023.
Ziegler, A.; Stoeger, H. Shortcomings of the IQ-based construct of underachievement. Roeper Review; 2012; 34.
Ziegler, A.; Stoeger, H. Systemic gifted education: A theoretical introduction. Gifted Child Quarterly; 2017; 61.
Ziegler, A.; Stoeger, H.; Vialle, W. Giftedness and gifted education: The need for a paradigm change. Gifted Child Quarterly; 2012; 56.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
Smart classrooms leverage intelligent and mobile technologies to create highly interactive, student-centered environments conducive to personalized learning. However, measuring students’ personalized learning experiences in these technologically advanced spaces remains a challenge. This study addresses the gap by developing and validating a Smart Classroom Environment–Personalized Learning Scale (SCE-PL). Drawing on a comprehensive literature review, content-expert feedback, and iterative item refinement, an initial pool of 48 items was reduced to 39 and subsequently to 34 following item-level analyses. Two datasets were collected from Chinese middle-school students across three provinces, capturing diverse socio-economic contexts and grade levels (7th, 8th, and 9th). Exploratory factor analysis (EFA) on the first dataset (n = 424) revealed a nine-factor structure collectively explaining 78.12% of the total variance. Confirmatory factor analysis (CFA) on the second dataset (n = 584) indicated an excellent model fit. Internal consistency indices (Cronbach’s α > 0.87, composite reliability > 0.75) and strong convergent and discriminant validity evidence (based on AVE and inter-factor correlations) further support the scale’s psychometric soundness. The SCE-PL thus offers researchers, policymakers, and practitioners a robust, theory-driven instrument for assessing personalized learning experiences in smart classroom environments, paving the way for data-informed pedagogy, optimized learning spaces, and enhanced technological integration.
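For readers interpreting the reliability and validity figures above, composite reliability (CR) and average variance extracted (AVE) are conventionally derived from the standardized CFA loadings. The following is a reference sketch of the standard Fornell–Larcker formulas, assuming a factor measured by k indicators with standardized loadings λᵢ and error variances εᵢ = 1 − λᵢ², not necessarily the authors’ exact computation:

CR = (∑λᵢ)² / [(∑λᵢ)² + ∑εᵢ]
AVE = (∑λᵢ²) / k

By these conventions, CR above 0.70 and AVE above 0.50 support convergent validity, and discriminant validity is supported when the square root of each factor’s AVE exceeds that factor’s correlations with the other factors, which is the logic behind the AVE and inter-factor-correlation evidence reported in the abstract.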
1 Faculty of Education, Shaanxi Normal University, Xi’an 710062, China; [email protected]
2 Department of Psychology, Educational Psychology and Research on Excellence, Faculty of Humanities, Social Sciences, and Theology, Friedrich-Alexander-University of Erlangen-Nürnberg, 90478 Nürnberg, Germany; [email protected] (M.B.);