Abstract
The current study tested the effectiveness of a computer-supported collaborative learning tool, HyLighter, on undergraduate students' learning, affect for learning, and complex cognitive skills. The HyLighter group (N = 23) (1) digitally highlighted and annotated a reading article, and (2) reviewed and expanded on peers' and the instructor's highlights and annotations. The control group (N = 27) read the article in hard copy without using HyLighter or practicing its learning activities. The dependent variables included (a) performance on a reading quiz, (b) a number of affective variables related to the reading assignment, and (c) students' cognitive modeling of the article's content. Although students reported high rates of satisfaction with the HyLighter tool, performance on the reading comprehension quiz did not differ significantly between the two groups. Students using HyLighter tended to score higher on both the positive-valence and negative-valence emotions. However, these students also showed significant and substantial superiority in mental model similarity indices. Thus, the effect of HyLighter use on the learning process was apparent in the learners' content conceptualization more so than in quiz performance. These findings have significant implications for both instructional and research purposes.
Keywords
social annotation, distance education, computer-supported collaborative learning, complex thinking skills, HIMATT, HyLighter, text summaries, reference models
Introduction
Collaborative learning is characterized by groups of learners actively communicating and interacting with each other to set up a shared focus aimed at reaching a common goal (Akkerman et al., 2007; Beers, Boshuizen, Kirschner, & Gijselaers, 2006). In fact, "the widespread and increasing use of collaborative learning has been a success story" (Johnson & Johnson, 2009, p. 365). Amongst the most important characteristics of collaborative learning are the following: (1) positive mutual dependence, where the learners depend on each other to complete a shared task; (2) individual accountability, where each learner in the group shares responsibility for the end result; and (3) social skills, where the members develop social skills to work together successfully (see van der Meijden, 2007). The success of collaborative learning can be attributed to its allowing increased active engagement in the learning process and longer retention of the learned material (Morgan, Whorton, & Gunsalus, 2000), improving students' complex thinking skills (Sloffer, Dueber, & Duffy, 1999), and overall enabling more self-regulation in students (Van Boxtel, Van der Linden, & Kanselaar, 2000). Several concepts inherent in different disciplines can help at least partially explain why students, given the right circumstances, learn more optimally from actively communicating with one another.
According to the classical cognitive developmental perspective, collaborative learning works efficiently because it promotes learning within a zone of proximal development (see Vygotsky, 1978). Hence, the scaffolding of the learning material by the more capable learner transfers to the less capable learner. Moreover, collaborative learning works because it supports the development of new cognitive schemas (Fawcett & Garton, 2005); hence, more advanced representations are formed and implemented by the learner. Finally, collaborative learning works because it creates social cohesion, resulting in group members' desire to help each other in the learning task (see Janssen, Kirschner, Erkens, Kirschner, & Paas, 2010). This said, collaborative learning environments may differ in size, group characteristics, learning focus, learning goals, and supportive technologies (Kirschner, Paas, & Kirschner, 2009).
Within technologically supported learning environments, computer-supported collaborative learning has drawn ample research (Gress, Fior, Hadwin, & Winne, 2008). As for its origins, computer-supported collaborative learning takes root in the expansion of computer-supported cooperative work (Wang, 2009). Within computer-supported collaborative learning platforms, Web-based collaborative learning systems draw learners to participate in collaborative learning activities (Jones, Blackey, Fitzgibbon, & Chew, 2010). Consequently, the overarching realm of research into computer-supported collaborative learning includes designing software tools and creating collaborative learning environments to promote the social construction of knowledge (Hadwin, Gress, & Page, 2006). Computer-supported collaborative learning combines collaborative learning and the use of information and communication technologies, resulting in a number of educational, social, and motivational gains (see Janssen, Erkens, & Kanselaar, 2007). Comparative accounts of computer-supported collaborative learning and face-to-face learning environments reveal, for instance, that students within computer-supported collaborative learning environments (1) report larger learning gains (Hertz-Lazarowitz & Bar-Natan, 2002); (2) deliver more complete assignments, make higher quality decisions, and perform more optimally on tasks that require brainstorming (Fjermestad, 2004); (3) engage in more complex and challenging discussions (Benbunan-Fich, Hiltz, & Turoff, 2003), with more equal participation of group members (Fjermestad, 2004); and finally, (4) report higher levels of satisfaction (Fjermestad, 2004).
This said, not all computer-supported collaborative learning settings are effective (Dewiyanti, Brand-Gruwel, & Jochems, 2005). Participants in computer-supported collaborative learning settings often encounter interaction problems, communication problems, or both (Janssen, Erkens, Jaspers, & Broeken, 2005). Specifically, within computer-supported collaborative learning platforms, participants may perceive group discussions as perplexing (Thompson & Coovert, 2003), need more time to arrive at a consensus (Fjermestad, 2004), and take longer to solve problems (Baltes, Dickson, Sherman, Bauer, & LaGanke, 2002). The effectiveness of these platforms could also be hindered because group members in computer-supported collaborative learning settings seldom recognize the task as a group task (i.e., they perform individually and not collaboratively) (Carroll, Neale, Isenhour, Rosson, & McCrickard, 2003). A probable explanation for these difficulties is that most computer-supported collaborative learning environments focus primarily on the support of cognitive processes, whereas social processes including positive affective relationships, group cohesiveness, trust, and a sense of community are most needed for a group to reach its full potential (Kreijns & Kirschner, 2004). In fact, given that these processes do not automatically develop by merely gathering learners together (Fischer, Bruhn, Gräsel, & Mandl, 2002), researchers suggest that for optimal effects, computer-supported collaborative learning settings should provide an environment that develops and promotes social processes as well as cognitive ones (Kreijns, Kirschner, & Jochems, 2003).
To promote the social processes within a computer-supported collaborative learning environment, a social annotation model-learning system may prove ideal. The social annotation model-learning system (SAM-LS) is a technology-enhanced shared annotation tool that improves social collaboration on electronic documents, where multiple writers can proofread and provide corrective feedback and reviews (Lebow, Lick, & Hartman, 2009). The social annotation model-learning system further supports social processes by blending features of Web 2.0 social networking tools (e.g., Facebook, MySpace, Ning) with more basic annotation practices, including the reviewing features inherent in Microsoft Word or Adobe Acrobat (Mendenhall & Johnson, 2010).
Preliminary findings associated with the use of the social annotation model-learning system in a community college identified the following: (1) improved participation, engagement, accountability, and completion rates for assignments; (2) enhanced active reading skills and learning from text; and (3) better writing skills for the students who used the program (Lebow, Lick, & Hartman, 2004). Additional pilot studies also identified that in undergraduate settings, the social annotation model-learning system increased learning gains, promoted academic achievement, and improved positive affective responses to learning (Razon, Turner, Arsal, Johnson, & Tenenbaum, 2012). Further empirical data also indicate that college students using the system improved reading comprehension and meta-cognitive skills in general (Johnson, Archibald, & Tenenbaum, 2010).
The social annotation model-learning system is thought to improve learning and performance through the use of relevant activities including the following: (1) offering examples, (2) attempting practice, and (3) engaging in reflection and collaboration (Mendenhall & Johnson, 2010). Despite the promising initial findings associated with the social annotation model-learning system, there is only scant literature on its use (see Johnson et al., 2010; Mendenhall & Johnson, 2010; Razon et al., 2012). The present study aimed at testing the effectiveness of a social annotation model-learning tool on undergraduate students' learning, affect toward learning, and development of complex thinking skills, given that in the current system students largely lack reading comprehension and learning skills (Hammond, Linton, Smink, & Drew, 2007), positive affect for learning (Willingham, 2009), and complex abilities for critical thinking and reflective learning (Joseph, 2010).
This research effort is significant in that it can help address the social process-related shortcomings of computer-supported collaborative learning. Within the current study, undergraduate students were instructed to use the social annotation model-learning tool HyLighter toward the completion of a reading assignment. HyLighter is designed to allow users to highlight and annotate digital texts, and to communicate with one another to further elaborate on the text. Specifically, students in the current study (1) highlighted and annotated a reading article, (2) reviewed peers' and the instructor's highlights and annotations, and (3) digitally commented on the reviewed highlights and annotations. The control group in the current study read the article without using HyLighter or performing the learning activities inherent in its use. Of interest were the students' (1) performance on a reading quiz, (2) affective responses to the reading assignment, and (3) the structure of their text summaries.
One way of assessing the effectiveness of a social annotation model-learning tool on reading comprehension and learning is by comparing mental model measures of student work with an expert or reference model. Mental models are internal symbols or representations of how one understands and views a phenomenon and the world (Jonassen & Cho, 2008). Johnson, Pirnay-Dummer, Ifenthaler, Mendenhall, Karaman, and Tenenbaum (2011) posit that a student's mental model representation (a text summary or concept map representing his or her understanding of an article or text) is an operable way of evaluating the student's comprehension of the learning materials (e.g., article, textbook chapter). Mental models are very complex, and while there are multiple ways of assessing them, there is not a single method that is agreed upon (Jonassen & Cho, 2008).
Pirnay-Dummer, Ifenthaler, and Spector (2010) describe a reliable Web-based tool, HIMATT (Highly Integrated Model Assessment Technology and Tools), that is used to assess mental models in academic or workplace settings. HIMATT analyzes external representations (i.e., text summaries) for structural (i.e., surface, graphical, gamma, and structural matching) and semantic similarities (i.e., concept, propositional, and balanced propositional matching) between the students' representations and a reference model (i.e., teacher or expert summary) (Johnson et al., 2011; Pirnay-Dummer & Ifenthaler, 2010; Pirnay-Dummer, Ifenthaler, & Spector, 2010). Johnson et al. (2011) conducted a study that validated the use of text summaries as a way to capture students' mental models, and, therefore, an appropriate way of assessing student comprehension and understanding of learning materials.
The purpose of this study was to test the effectiveness of a computer-supported collaborative learning tool (HyLighter) on students' learning, affect for learning, and complex cognitive skills within an undergraduate setting. Thus, it was expected that (1) HyLighter use would improve reading comprehension, i.e., students using HyLighter would score higher on a reading comprehension quiz; (2) HyLighter use would help generate more positive affect toward learning, i.e., students using HyLighter would report increased rates of positive-valence emotions; and finally, (3) HyLighter use would enhance overall thinking skills in learners, i.e., students using HyLighter would display higher order thinking skills in short summaries of text reading.
Method
Sampling
Participants included 50 undergraduate students majoring in Education at a large southeastern university in the United States. Of these participants, 65% were female, and 35% were male; 70% were Caucasian, 13% Hispanic (Latino), 13% African-American, and 4% Asian/Pacific Islander. Prior to this research, students registered in a college-wide human subject pool and selected this study to participate in for course credit. Participants were randomly assigned into two groups: (1) HyLighter group (i.e., Experimental) and (2) No-HyLighter group (i.e., Control). Students in the HyLighter group (N = 23) used the social annotation tool. Students in the No-HyLighter group (N = 27) did not use the social annotation tool. Participation lasted one session of approximately two hours for each student, and one researcher provided the instructions and administered the scales for the study.
Instrumentation
Reading Comprehension Quiz. A team of three experts in the field prepared the quiz items. The quiz was designed in a multiple choice assessment format, and included three items aimed at assessing reading comprehension on the assigned article. Responses to each item were rated as either correct or incorrect. Total quiz scores ranged between 0 and 3, and corresponded to the total number of correctly answered items on the quiz.
Text Summary; Content-Conceptualization (HIMATT; Pirnay-Dummer, Ifenthaler, & Spector, 2010). Students were asked to briefly summarize (350 words) and provide views on the reading content. The open-ended question was designed to gauge their complex thinking skills. Students' complex thinking skills were evaluated via mental model measures (i.e., similarity scores between student text summaries and an expert/reference model) using an automated web-based technology, HIMATT (Highly Integrated Model Assessment Technology and Tools). HIMATT was used to analyze structural and conceptual components of students' content-related representations (i.e., text summaries), and then to compare students' mental conceptualization with the reference content-conceptualization (i.e., teacher/expert model) for similarities (Mendenhall, Kim, & Johnson, 2011). HIMATT analyzes the text, "tracks" the relationships between concepts within the texts, and creates a graphical representation of similarities (Johnson et al., 2011). Pirnay-Dummer (2011), Ifenthaler (2010), and Mendenhall et al. (2011) describe the similarity measures, which include four structural indices and three semantic indices used for the quantitative comparison of textual representations. These are the following:
Structural Indices
(1) Surface Structure - compares the number of vertices within the text. This represents the complexity of the model (i.e., how large the model is).
(2) Graphical Matching - compares the diameters of the spanning trees of the mental models. This is an indicator for the breadth of conceptual knowledge.
(3) Structural Matching - compares the complete structure of two models without regard to their content.
(4) Gamma Matching - also called 'density of vertices' - describes the quotient of links per vertex within a model. Weak-density mental models connect only pairs of concepts, while a medium density is "expected for most good working models."
Semantic Indices
(1) Concept Matching - counts similar concepts (vertices) between two mental models. This measures the differences in language use between the comparison models.
(2) Propositional Matching - compares identical propositions between two mental models.
(3) Balanced Semantic Matching - combines propositional matching and concept matching to control for the dependency of propositional matching on concept matching: propositions may match only if their concepts match.
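To make the semantic indices concrete, the sketch below computes simple set-overlap analogues of concept and propositional matching between a hypothetical student summary and a reference model. This is an illustration only, not HIMATT's implementation: the function, the concept sets, and the proposition triples are invented, and the actual similarity measures are defined by Pirnay-Dummer, Ifenthaler, and Spector (2010).

```python
# Toy sketch of concept and propositional matching as set overlap.
# NOT the HIMATT algorithm; all names and data below are hypothetical.

def overlap_similarity(a, b):
    """Similarity in [0, 1]: size of the intersection over the union."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Concepts (vertices) extracted from a student summary and a reference model.
student_concepts = {"annotation", "collaboration", "reading", "feedback"}
expert_concepts = {"annotation", "collaboration", "comprehension", "feedback"}

# Propositions: (concept, relation, concept) triples linking concepts.
student_props = {("annotation", "supports", "collaboration"),
                 ("feedback", "improves", "reading")}
expert_props = {("annotation", "supports", "collaboration"),
                ("feedback", "improves", "comprehension")}

concept_match = overlap_similarity(student_concepts, expert_concepts)
prop_match = overlap_similarity(student_props, expert_props)

print(f"concept matching: {concept_match:.2f}")    # 3 shared of 5 -> 0.60
print(f"propositional matching: {prop_match:.2f}")  # 1 shared of 3 -> 0.33
```

Note how propositional matching is necessarily bounded by concept matching, which motivates the balanced semantic index described above.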
Learning-related Affect Questionnaire. The Learning Affect Questionnaire included seven self-report adjectives aimed at assessing students' affect toward the reading assignment. Each self-report item was rated on a Likert-type scale with anchors ranging from 0 (not at all) to 6 (extremely). The seven self-report items included three positive-valence emotions (i.e., excited, optimistic, and happy; Cronbach's α = .80) and four negative-valence emotions (i.e., worried, distressed, uncertain, and pessimistic; Cronbach's α = .60). The Learning Affect Questionnaire was similar to the Profile of Mood States (POMS; McNair, Lorr, & Droppleman, 1971), except for a better balance with regard to hedonic tone. The Learning Affect Questionnaire included two additional items aimed at assessing students' level of motivation for reading the article and desire to read further articles. The motivational items were rated on an identically formatted Likert-type scale with anchors ranging from 0 (not at all) to 6 (extremely). The two motivation items shared a Cronbach's α of .56. All three scales (i.e., positive emotions, negative emotions, and motivation) share high ecological and face validities.
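The internal-consistency coefficients reported above follow the standard formula for Cronbach's α: α = k/(k−1) × (1 − Σ item variances / variance of the summed scale), for k items. The sketch below computes α for an invented set of ratings on the three positive-valence items; the ratings are hypothetical and are not the study's data.

```python
# Illustrative computation of Cronbach's alpha, the internal-consistency
# index reported for the questionnaire scales. The ratings are hypothetical.

def cronbach_alpha(items):
    """items: one list of ratings per item, all of equal length."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(var(item) for item in items)
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Hypothetical 0-6 ratings from five students on the three positive items.
excited = [5, 4, 6, 3, 5]
optimistic = [4, 4, 5, 3, 5]
happy = [5, 3, 6, 2, 4]

alpha = cronbach_alpha([excited, optimistic, happy])
print(f"Cronbach's alpha = {alpha:.2f}")  # 0.92 for this invented sample
```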
HyLighter Questionnaire. The HyLighter Questionnaire included 18 items aimed at assessing students' comfort using the HyLighter program. Amongst the items were the following: 'I will use HyLighter only when told to do so,' and 'HyLighter makes it possible to learn more productively.' Each item was rated on a 5-point Likert-type scale ranging from 1 (strongly disagree) to 5 (strongly agree). The internal consistency (Cronbach's α) of the HyLighter Questionnaire approximated .86, and its items shared high face and ecological validities.
Instructional Tool
This study used HyLighter as the instructional tool. HyLighter is a web-based social annotation tool that allows readers to digitally highlight and annotate a text. HyLighter facilitates spacious display of text-related commentary, and allows an unlimited number of contributors to highlight and add comments. To insert comments, readers first block the fragment of text to comment upon and then insert their comment in the comment box that appears. Once a fragment is annotated, HyLighter displays color codes on top of a web page linked to comments in the margins. This feature aids readers as they interact with selected fragments without burdening the margins. Figure 1 graphically illustrates the color map for the entries from a group of readers. An area highlighted by 'you' (the logged-in user) is displayed in yellow. Areas not highlighted by 'you,' but highlighted by one or more other readers, are displayed in shades of blue (the darker the shade, the more readers have highlighted the fragment). Finally, areas highlighted by both 'you' and others are displayed in shades of green. Of importance also, text-relevant comments are displayed in the margin or comment field. Finally, the system provides different 'views' that help the reader review the group input and/or assess highlighted fragments and relevant comments via a range of searching and sorting options (e.g., by username, recommended changes, or date modified).
Instructional Method
The instructional method included the completion of an instructional activity with or without the HyLighter tool. For the purposes of the instructional activity, students in the HyLighter group first read an article using HyLighter, i.e., digitally highlighting, annotating, and interacting with other readers and the researcher. Prior to the students' reading of the article, the researcher highlighted and annotated selected fragments of the text. These inputs reflected the researcher's thoughts about the selected fragments, and at times included text-relevant web links. Most of these came in the form of open-ended statements to prompt additional follow-up comments from the students (e.g., 'check this [website] out'; 'Is this really surprising?'). Students in the No-HyLighter group completed the reading assignment in conventional format using a paper copy of the article without use of the HyLighter tool.
Procedures
Prior to starting the study, three researchers selected a reading article and developed the instruments. Prior to participation, participants signed an institutional review board-approved consent form and completed a demographic questionnaire. Students in the HyLighter group also received HyLighter training. The HyLighter training lasted approximately 30 minutes and focused on how to use the system for best reading practices on the assigned article. Students in both groups first read the article. Approximately 45 minutes was allocated for completing the reading, including the reading activities. At the completion of the reading, students submitted the article to the researcher and completed the reading comprehension quiz. Next, students wrote their text summaries of the article. Then, students answered the Learning Affect Questionnaire, and lastly, the HyLighter group responded to the HyLighter-specific questionnaire.
Results
Manipulation Check
HyLighter-Specific Questionnaire. Students using HyLighter strongly disagreed (i.e., ratings < 2) with statements including 'Given the opportunity to use HyLighter in the future, I am afraid that I might have trouble navigating through it,' 'I would avoid learning a topic if it involves HyLighter,' 'I would hesitate to use HyLighter in case I make mistakes,' 'I feel uneasy using HyLighter,' and 'I need an experienced person nearby when I'm using HyLighter.' Conversely, students tended to agree (i.e., ratings > 3.8) with statements including 'HyLighter does not get me anxious at all,' 'HyLighter provides more interesting and innovative ways for learning,' 'Using HyLighter is comfortable,' and 'HyLighter can make it possible to learn more productively.' In view of these ratings, students regarded HyLighter as useful and valuable for the learning experience.
Main Findings
Reading Comprehension. A one-way ANOVA was performed to test the effect of HyLighter use on reading comprehension (i.e., reading comprehension quiz; RCQ) scores. The results indicated non-significant mean differences between the two learning conditions, F(1, 48) = .68, p = .41, ηp² = .014. Figure 2 illustrates reading comprehension levels for the two conditions (i.e., HyLighter vs. No-HyLighter). The No-HyLighter group scored slightly higher on the RCQ than the HyLighter group (M = 2.11, SD = .801 vs. M = 1.91, SD = .90; Cohen's d = .24).
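The reported effect size can be reproduced from the group statistics given in the text using the pooled-standard-deviation form of Cohen's d. The sketch below is illustrative arithmetic only, not the authors' analysis script; the group sizes (27 and 23) are taken from the Method section.

```python
# Reproducing the reported effect size for the reading comprehension quiz
# from the group statistics given in the text (pooled-SD Cohen's d).
import math

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d with the pooled standard deviation."""
    pooled_var = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(pooled_var)

# Group statistics as reported in the text.
d = cohens_d(m1=2.11, sd1=0.801, n1=27,   # No-HyLighter group
             m2=1.91, sd2=0.90, n2=23)    # HyLighter group
print(f"d = {d:.2f}")  # prints d = 0.24, matching the reported value
```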
Learning Affect
Three separate multivariate analyses of variance (MANOVAs) were conducted for the learning-related affect measures (i.e., positive-valence emotion, negative-valence emotion, and motivation to learn). The two conditions did not differ significantly on positive-valence emotions, Wilk's λ = .97, F(3, 45) = .39, p = .75, ηp² = .026, negative-valence emotions, Wilk's λ = .86, F(4, 44) = 1.84, p = .14, ηp² = .14, or motivation to learn, Wilk's λ = .93, F(2, 47) = 1.66, p = .20, ηp² = .07. Though the learning method effects were non-significant, students using HyLighter reported descriptively more positive emotions (Cohen's d range: .19 to .32) and negative emotions (Cohen's d range: .33 to .60) than students not exposed to HyLighter. Mean differences between the two conditions are presented in Table 1.
Complex Thinking Skills: Content - Conceptualization
A MANOVA pertaining to the HIMATT coefficients (i.e., the seven mental model similarity indices) revealed significant differences between the two instructional methods, Wilk's λ = .73, F(7, 42) = 2.25, p = .04. Follow-up tests of between-subjects effects revealed that the students using HyLighter obtained significantly higher mental model similarity (i.e., content-conceptualization) scores than students not using HyLighter on surface matching, F(1, 48) = 8.05, p = .007, ηp² = .14, d = .78, graphical matching, F(1, 48) = 7.87, p = .007, ηp² = .14, d = .74, structural matching, F(1, 48) = 14.03, p < .001, ηp² = .23, d = .89, and gamma matching, F(1, 48) = 8.68, p = .005, ηp² = .15, d = .82. They were also higher on the concept (d = .33), propositional (d = .50), and balanced semantic matching (d = .46) indices, but these differences were non-significant. These findings indicate that the use of HyLighter during the learning process resulted in a substantial enhancement of content-conceptualization in these students.
Discussion
Collaborative learning is characterized by groups of learners communicating and interacting toward one common learning goal (Akkerman et al., 2007). In the era of technology-supported education, computer-supported collaborative learning provides a unique platform for web-based collaborative learning (Jones et al., 2010). The efficiency and productivity of collaborative learning are due to its increasing active participation in the learning process (Morgan et al., 2000), aiding students' complex thinking skills (Sloffer et al., 1999), and overall enabling more self-regulation in students (Van Boxtel et al., 2000).
This said, to the extent that computer-supported collaborative learning merges collaborative learning and the use of information and communication technologies, it might fall short of generating the most optimal learning outcomes as measured by quiz performance (Dewiyanti et al., 2005). Recent literature addressing the effectiveness of computer-supported collaborative learning reveals that, as opposed to conventional face-to-face settings, learners within computer-supported collaborative learning settings may experience interaction problems (Janssen et al., 2005), which could cause a lack of understanding of the group discussion (Thompson & Coovert, 2003), as well as inefficiency in decision making (Fjermestad, 2004) and problem solving (Baltes et al., 2002). These difficulties may be due to collaborative learning settings primarily supporting cognitive processes, i.e., to the detriment of the socio-emotional processes otherwise inherent in group success (Kreijns & Kirschner, 2004).
To address this problem, the present study aimed at testing the effectiveness of a novel computer-supported collaborative learning tool: HyLighter. HyLighter is a social annotation model-learning system, hence its potential for facilitating social processes within a computer-supported collaborative learning framework. Specifically, the present study aimed at testing the effects of HyLighter use on undergraduate students' learning, affect toward learning, and development of complex thinking skills. The latter is of vital importance because students in the current system are known to suffer from a lack of reading comprehension and learning skills (Hammond et al., 2007), positive affect and motivation for learning (Willingham, 2009), and complex thinking capacities and higher order processes (Joseph, 2010). It was expected that the implementation of the HyLighter tool would improve reading comprehension and retention rates, generate positive affect toward learning, and enhance overall thinking skills in learners. Thus, this study contended that students using HyLighter, relative to students not using HyLighter, would score higher on a reading comprehension quiz, display higher order thinking skills in a reading summary, and report increased rates of positive-valence emotions at the completion of the assignment.
These contentions were consistent with previous findings attesting to the positive effects of social annotation model-learning systems. Specifically, past inquiries revealed that in a community college sample, the use of a social annotation model-learning system enhanced participation and completion rates, and improved reading and writing skills in learners (Lebow et al., 2004). Similarly, more recent inquiries showed that in college undergraduate samples, the use of a social annotation model-learning system supported learning gains, academic achievement, and positive affective responses to learning (Razon et al., 2012), as well as reading comprehension and higher order thinking skills (Johnson et al., 2010).
Contrary to the initial predictions, however, students using HyLighter and students who did not use HyLighter did not significantly differ on reading comprehension achievement measured by a three-item quiz, although the HyLighter group did show substantially stronger effect sizes on the content-conceptualization (i.e., mental model similarity) indices. The groups also did not significantly differ in perceived positive and negative affect for learning, though HyLighter engagement resulted in higher ratings for both positive and negative affective variables. A plausible explanation for these findings is the short-term nature of the instructional task in this study. Students in the present study used the HyLighter tool during one session of approximately two hours of reading practice. Furthermore, students who participated in this study were drawn from a college-wide subject pool; thus most of the students participating in the study did not know one another prior to the study. The latter is important because computer-supported collaborative learning best succeeds in groups that are cohesive, where the members share a sense of trust and community (Kreijns & Kirschner, 2004). The short-term nature of this study, and the exposure to an innovative instructional tool, may not have allowed these processes to fully develop, resulting in similar quiz performance across groups and, among students using HyLighter, both more excitement, optimism, and happiness on the one hand, and more worry, distress, uncertainty, and pessimism on the other. This explanation is further supported by the fact that the previous positive findings were associated with longer-term implementations: Lebow et al. (2004), Johnson et al. (2010), and Razon et al. (2012) implemented the social annotation model-learning system for at least one semester throughout courses in which students participated once or twice a week.
To our knowledge, however, the use of HIMATT within the social annotation framework and the results revealed here hold substantial importance for future learning and research.
HIMATT was used to test how students' mental model representations (i.e., text summaries) compare to a reference model (i.e., an expert/teacher text summary) when learning under regular and HyLighter-enhanced instruction. The findings indicate that the students using HyLighter conceptualized the reading material significantly better than the No-HyLighter students, as indicated by the structural indicators (i.e., surface, graphical, structural, and gamma matching). Significance in the structural measures indicates that the complexity (e.g., breadth of concepts, conceptual knowledge, and density) of student and expert mental model representations was more similar when learners exchanged notes and ideas about the learned contents than when they refrained from doing so. When students collaborate with their peers and critique their work, they evaluate and compare their own mental models with those of their peers; this requires more cognitive processing, which deepens their conceptualizations and in turn allows them to modify their own mental models to be more similar to that of the expert (Merrill & Gilbert, 2008).
Students using HyLighter further reported slightly higher rates of negative-valence emotions, including 'worry,' 'distress,' 'uncertainty,' and 'pessimism' (Cohen's ds = -.29 to -.56). These results are somewhat surprising in view of these students' high rates of satisfaction with the HyLighter tool. Returning to the importance of socio-emotional processes within learning communities (Kreijns & Kirschner, 2004), given the short manipulation format used here, students may not have felt fully confident in their ability to communicate and/or learn online; alternatively, given the non-graded nature of the assignment, they may not have acquired a clear understanding of what was required to succeed in the task. In fact, students are known to feel more positively about online learning as they gain confidence in online communication and acquire a clear understanding of what is expected for success (Palmer & Holt, 2009).
It is further known that in groups where a sense of community is not established, students may not fully recognize the task as a group task and hence may work individually rather than collaboratively (Carroll et al., 2003). This possible confusion and sense of isolation may have contributed to these students' feelings of 'worry,' 'distress,' 'uncertainty,' and 'pessimism.' Furthermore, consistent with these results, students using HyLighter also tended to report less motivation to complete additional reading assignments than students not using HyLighter. To the extent that motivation for learning increases as a result of consistent positive emotional experiences (Meyer & Turner, 2006), these students' motivation for future reading may have been reduced by their somewhat negative affective experiences.
That said, students using HyLighter in the present study also reported slightly higher rates of positive-valence emotions, including 'excitement,' 'optimism,' and 'happiness' (Cohen's ds = -.18 to -.30). These results, although consistent with the high satisfaction rates these students reported for the HyLighter tool, remain surprising in view of the aforementioned negative-valence emotions. It seems that regardless of the challenges the learning platform may present, students may still have perceived online learning as 'enjoyable' and 'fun.' Specifically, for students who generally feel autonomous and competent in computer-assisted settings (Zhao, Lu, Wang, & Huang, 2011; see Deci, Connell, & Ryan, 1989, for review), this task, despite its specific inconveniences, may still have been perceived as more exciting and fun than a conventional reading assignment.
Taken together, the results of this study confirm that to promote optimal learning outcomes, any computer-supported collaborative learning tool should facilitate the development of social as well as cognitive processes. An important implication of these findings is, first, to prioritize the social aspects of such systems and, second, to allow time for these processes to develop properly, because short-term interventions may not lead to optimal learning gains, positive affect, and motivation for learning. Notably, however, students in the collaborative group demonstrated gains in content-conceptualization, as indicated by the HIMATT mental model measures, despite their short exposure to the HyLighter technology.
Discussion Questions
1. Based on the article you read, name one potential benefit and one potential shortcoming of Web-based computer-assisted systems in educational settings.
2. Provided that complex thinking skills training (i.e., critical thinking, reading comprehension, etc.) is a must in higher education, debate which system would be more likely to help achieve that goal: a conventional (paper-and-pencil) education system or a Web-based computer-assisted education system.
3. If you were to design an online distance-education course, would you use the HyLighter system? Please explain why or why not.
To Cite this Article
Razon, S., Mendenhall, A., Yesiltas, G. G., Johnson, T. E., & Tenenbaum, G. (2012, Spring). Evaluation of a computer-supported collaborative learning tool: Effects on quiz performance, content-conceptualization, affect, and motivation. Journal of Multidisciplinary Research, 4(1), 61-78.
References
Akkerman, S., Van den Bossche, P., Admiraal, W., Gijselaers, W., Segers, M., Simons, R. J., & Kirschner, P. A. (2007). Reconsidering group cognition: From conceptual confusion to a boundary area between cognitive and socio-cultural perspectives? Educational Research Review, 2, 39-63.
Baltes, B. B., Dickson, M. W., Sherman, M. P., Bauer, C. C., & LaGanke, J. (2002). Computer-mediated communication and group decision making: A meta-analysis. Organizational Behavior and Human Decision Processes, 87, 156-179.
Beers, P. J., Boshuizen, H. P. A., & Kirschner, P. A. (2007). The analysis of negotiation of common ground in CSCL. Learning and Instruction, 17, 427-435.
Benbunan-Fich, R., Hiltz, S. R., & Turoff, M. (2003). A comparative content analysis of face-to-face vs. asynchronous group decision making. Decision Support Systems, 34, 45-469.
Carroll, J. M., Neale, D. C., Isenhour, P. L., Rosson, M. B., & McCrickard, D. S. (2003). Notification and awareness: Synchronizing task-oriented collaborative activity. International Journal of Human Computer Studies, 58, 605-632.
Deci, E. L., Connell, J. P., & Ryan, R. M. (1989). Self-determination in a work organization. Journal of Applied Psychology, 74, 580-590.
Dewiyanti, S., Brand-Gruwel, S., & Jochems, W. (2005, August). Learning together in an asynchronous computer-supported collaborative learning environment: The effect of reflection on group processes in distance education. Paper presented at the 11th Biennial Conference of the European Association for Research in Learning and Instruction, Nicosia, Cyprus.
Fawcett, L. M., & Garton, A. F. (2005). The effect of peer collaboration on children's problem-solving ability. British Journal of Educational Psychology, 75, 157-169.
Fischer, F., Bruhn, J., Gräsel, C., & Mandl, H. (2002). Fostering collaborative knowledge construction with visualization tools. Learning and Instruction, 12, 213-232.
Fjermestad, J. (2004). An analysis of communication mode in group support systems research. Decision Support Systems, 37, 239-263.
Gress, C. L. Z., Fior, M., Hadwin, A. F., & Winne, P. H. (2010). Measurement and assessment in computer-supported collaborative learning. Computers in Human Behavior, 26, 806-814.
Hadwin, A. F., Gress, C. L. Z., & Page, J. (2006). Toward standards for reporting research: A review of the literature on computer-supported collaborative learning. Paper presented at the 6th IEEE International Conference on Advanced Learning Technologies, Kerkrade, Netherlands.
Hammond, C., Linton, D., Smink, J., & Drew, S. (2007). Dropout risk factors and exemplary programs: A technical report. Clemson, SC: National Dropout Prevention Center. Available at http://maec.ceee.gwu.edu/files/Dropout_Risk_Factors_and_Exemplary_Programs_Clemson_2007.pdf
Hertz-Lazarowitz, R., & Bar-Natan, I. (2002). Writing development of Arab and Jewish students using cooperative learning (CL) and computer-mediated communication (CMC). Computers & Education, 39, 19-36.
Ifenthaler, D. (2010). Scope of graphical indices in educational diagnostics. In D. Ifenthaler, P. Pirnay-Dummer, & N. Seel (Eds.) Computer-based diagnostics and systematic analysis of knowledge (pp. 77-114). New York: Springer Science + Business Media, LLC.
Janssen, J., Erkens, G., Kanselaar, G., & Jaspers, J. (2007). Visualization of participation: Does it contribute to successful computer-supported collaborative learning? Computers and Education, 49, 1037-1065.
Janssen, J., Erkens, G., Jaspers, J., & Broeken, M. (2005, August). Effects of visualizing participation in computer-supported collaborative learning. Paper presented at the 11th Biennial Conference of the European Association for Research in Learning and Instruction, Nicosia, Cyprus.
Janssen, J., Kirschner, F., Erkens, G., Kirschner, P., & Paas, F. (2010). Making the black box of collaborative learning transparent: Combining process-oriented and cognitive load approaches. Educational Psychology Review, 22(2), 139-154.
Jonassen, D., & Cho, Y. H. (2008). Externalizing mental models with mind tools. In D. Ifenthaler, P. Pirnay-Dummer, & J. M. Spector (Eds.), Understanding models for learning and instruction: Essays in honor of Norbert M. Seel (pp. 145-159). New York: Springer Science + Business Media, LLC.
Johnson, T. E., Archibald, T. N., & Tenenbaum, G. (2010). Individual and team annotation effects on students' reading comprehension, critical thinking, and meta-cognitive skills. Computers in Human Behavior, 26, 1496-1507.
Johnson, T. E., Pirnay-Dummer, P. N., Ifenthaler, D., Mendenhall, A., Karaman, S., & Tenenbaum, G. (2011). Text summaries or concept maps: Which better represents reading text conceptualization? Technology, Instruction, Cognition and Learning, 8, 297-312.
Johnson, D. W., & Johnson, R. T. (2009). An educational psychology success story: Social interdependence theory and cooperative learning. Educational Researcher, 38, 365-379.
Jones, N., Blackey, H., Fitzgibbon, K., & Chew, E. (2010). Get out of myspace! Computers and Education, 54, 776-782.
Joseph, N. (2010). Metacognition needed: Teaching middle and high school students to develop strategic learning skills. Preventing School Failure, 54, 99-103.
Kirschner, F., Paas, F., & Kirschner, P. A. (2009). A cognitive-load approach to collaborative learning: United brains for complex tasks. Educational Psychology Review, 21, 31-42.
Kreijns, K., & Kirschner, P. A. (2004). Determining sociability, social space and social presence in (a)synchronous collaborating teams. Cyberpsychology and Behavior, 7, 155-172.
Kreijns, K., Kirschner, P. A., & Jochems, W. (2003). Identifying the pitfalls for social interaction in computer-supported collaborative learning environments: A review of the research. Computers in Human Behavior, 19, 335-353.
Lebow, D. G., Lick, D. W., & Hartman, H. J. (2004). HyLighter and interactive annotation: New technology to develop higher-order thinking skills. Inquiry: Critical Thinking Across the Curriculum, 23, 69-79.
Lebow, D. G., Lick, D. W., & Hartman, H. (2009). New technology for empowering virtual communities. In M. Pagani (Ed.), Encyclopedia of Multimedia and Technology (2nd ed.) (pp. 1066-1071). Hershey, PA: IGI Global.
Mendenhall, A., & Johnson, T. E. (2010). Fostering the development of critical thinking skills, and reading comprehension of undergraduates using a Web 2.0 tool coupled with a learning system. Interactive Learning Environments, 18, 263-276.
Mendenhall, A., Kim, C., & Johnson, T. E. (2011). Implementation of an online social annotation tool in a college English course. In D. Ifenthaler, P. Isaías, J. M. Spector, Kinshuk, & D. G. Sampson (Eds.), Multiple perspectives on problem solving and learning in the digital age (pp. 313-323). New York: Springer Science + Business Media, LLC.
Merrill, M. D., & Gilbert, C. G. (2008). Effective peer interaction in a problem-centered instructional strategy. Distance Education, 29(2), 199-207.
Meyer, D., & Turner, J. (2006). Re-conceptualizing emotion and motivation to learn in classroom contexts. Educational Psychology Review, 18, 377-390.
Morgan, R. L., Whorton, J. E., & Gunsalus, C. (2000). A comparison of short-term and longterm retention: Lecture combined with discussion versus cooperative learning. Journal of Instructional Psychology, 27, 53-58.
McNair, D. M., Lorr, M., & Droppleman, L. F. (1971). Manual for the profile of mood states. San Diego, CA: Educational and Industrial Testing Service.
Palmer, S. R., & Holt, D. M. (2009). Examining student satisfaction with wholly online learning. Journal of Computer Assisted Learning, 25, 101-113.
Pirnay-Dummer, P. (2011). Comparison measures of T-MITOCAR, HIMATT, and AKOVIA. Retrieved March 30, 2011, from http://www.pirnay-dummer.de/research/comparison_measures_2011-30-30.pdf
Pirnay-Dummer, P., Ifenthaler, D., & Spector, J. (2010). Highly integrated model assessment technology and tools. Educational Technology Research and Development, 58(1), 3-18. doi:10.1007/s11423-009-9119-8
Pirnay-Dummer, P., & Ifenthaler, D. (2010). Automated knowledge visualization and assessment. In D. Ifenthaler, P. Pirnay-Dummer, & N. Seel (Eds.), Computer-based diagnostics and systematic analysis of knowledge (pp. 77-114). New York: Springer Science + Business Media, LLC.
Razon, S., Turner, J., Arsal, G., Johnson, T. E., & Tenenbaum, G. (2012). Effects of a collaborative annotation method on students' learning and learning-related motivation and affect. Computers in Human Behavior, 28, 350-359.
Sloffer, S. J., Dueber, B., & Duffy, T. M. (1999). Using asynchronous conferencing to promote critical thinking: Two implementations in higher education (CRLT Technical Report No. 8-99). Bloomington, IN: Indiana University.
Van Boxtel, C. A. M., Van der Linden, J., & Kanselaar, G. (2000). Collaborative learning tasks and the elaboration of conceptual knowledge. Learning and Instruction, 10, 311-330.
van der Meijden, H. (2007). Samenwerkend leren in gameprojecten [Collaborative learning in game projects] [Electronic version]. Retrieved January 07, 2012, from http://www.deonderwijsvernieuwingscooperatie.nl/wpcontent/uploads/2011/02/74-Samenwerken-leren-in-gameprojecten.pdf
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
Wang, Q. (2009). Design and evaluation of a collaborative learning environment. Computers and Education, 53, 1138-1146.
Willingham, D. (2009). Why don't students like school? San Francisco, CA: Wiley.
Zhao, L., Lu, Y., Wang, B., & Huang, W. (2011). What makes them happy and curious online? An empirical study on high school students' Internet use from a self-determination theory perspective. Computers and Education, 56, 346-356.
Selen Razon, Anne Mendenhall, Gonca Gul Yesiltas, Tristan E. Johnson, and Gershon Tenenbaum
Selen Razon, Florida State University.
Anne Mendenhall, Florida State University.
Gonca Gul Yesiltas, Florida State University.
Tristan E. Johnson, Florida State University.
Gershon Tenenbaum, Florida State University.
About the Authors
Selen Razon is a doctoral candidate in Sport and Exercise Psychology at the Florida State University. She received her Bachelor of Science degree in Psychology from Université Paris I Sorbonne. Her research interests include investigating the relationships between emotions, cognitions, and motions during performance and physical activity, perception of acute and chronic pain in athletic and medical contexts, biopsychosocial components of pain, the use of mental imagery to cope with pain, and the development of complex thinking skills in sports and educational settings.
Anne Mendenhall is a doctoral candidate in Instructional Systems at the Florida State University. She works as an Instructional Designer and Research Assistant at the Learning Systems Institute at the Florida State University. Her research interests include the use and validation of instructional design theories and models, team-based and collaborative learning, automated assessment and feedback systems, mobile learning, distance learning, and international learning systems.
Gonca Gul Yesiltas is a Ph.D. student in Measurement and Statistics at the Florida State University. She received her Bachelor of Science degree in Mathematics from Ankara University and Master of Science degree in Measurement and Statistics from Florida State University. Her research interests include applications of differential item functioning for polytomous items and comparison of differential item functioning procedures under different reliability values.
Dr. Tristan E. Johnson is a research associate in the Learning Systems Institute at Florida State University. He is also on the faculty of the Instructional Systems Program in the Department of Educational Psychology and Learning Systems in the College of Education. He recently completed a three-year term as President of the Training and Performance Division of the Association of Educational Communications and Technologies. Tristan's primary technical expertise is in training, curriculum, and instruction. He has 18 years of experience designing, developing, and evaluating curriculum and training systems for business, military, industry, and academia. He has worked on the creation of university courses, educational simulations, electronic performance support systems, workshops, certificates, and programs, as well as numerous technology-based instructional materials. He has expertise in the application of instructional strategies including case studies, team-based learning, gaming, problem solving, task-centered instruction, and many other strategies that have strong levels of learning effectiveness and engagement. His research focuses on team cognition and its links to team performance. Specifically, he studies how mental models are created and shared, and their effect on performance. Dr. Johnson has published papers related to team cognition, the effect of visual representations on learning, and technology as a tool to represent internal mental models and schemas.
Gershon Tenenbaum, Ph.D., a graduate of Tel-Aviv University and the University of Chicago in research methodology and statistics, is the Benjamin S. Bloom Professor of Educational Psychology at the Florida State University, in the U.S.A. He is a former director of the Ribstein Center for Research and Sport Medicine at the Wingate Institute in Israel, and coordinator of the Graduate Program in sport psychology at the University of Southern Queensland in Australia. From 1997 to 2001, he was the President of the International Society of Sport Psychology, and from 1996 to 2008, he was the Editor of the International Journal of Sport and Exercise Psychology. He has published extensively in psychology and sport psychology in the areas of expertise and decision-making, psychometrics, and coping with physical effort experiences. Gershon has received several distinguished awards for his academic and scientific achievements, and he is a member of several scientific and professional forums and societies.
Copyright St. Thomas University Spring 2012