ABSTRACT
In this study, graduate students and postdoctoral scholars participated in a hybrid, multi-institutional workshop series about course design. Trainees developed college courses based on their research expertise, posting works-in-progress to a shared, online drive for peer review and collaboration. Learners also met weekly with local facilitators at their institution. The program produced learning outcomes similar to those of a previous face-to-face-only offering at a single institution. However, the multi-institutional design led to additional benefits, especially for leaders at each institution, who described a rich learning community in their collaborative work.
KEYWORDS
graduate student, postdoctoral scholar, hybrid, learning community, program assessment
INTRODUCTION
Preparation of graduate students and postdoctoral scholars for future faculty roles is an important function of graduate training (Adams 2002; Austin 2002; Austin et al. 2009; Connolly, Lee, and Savoy 2018; Connolly et al. 2016). Training opportunities for developing teaching skills are often limited compared with training for research skills, writing, and creative arts (Austin et al. 2009; Golde and Dore 2001; Shortlidge and Eddy 2018). Graduate-oriented development efforts may be focused on the immediate needs of trainees for their roles as teaching assistants (Marincovich, Prostko, and Stout 1998; Palmer 2011; Tice 1997). This type of just-in-time training may vary in content and delivery depending on program needs (Palmer 2011).
One important teaching skill for new faculty is course design. New faculty may be tasked with creating meaningful learning experiences even when they are unprepared to do so (Fairweather and Rhoads 1995; Felder, Brent, and Prince 2011; Fink and Ganus 2009). Learning course design as a new faculty member can be stressful: new courses may have to be developed in very short time frames, and this work and skill development may conflict with other priorities, like establishing a research program (Austin 2002; Boice 1991). Developing these skills during graduate or postdoctoral training may lessen this burden and allow the new faculty member to engage with course design work quickly and confidently. However, in many disciplines, training in the skills of course design is not widely available to graduate students or postdoctoral scholars, since course development is not a common responsibility of these individuals, particularly in the STEM disciplines (Fleet et al. 2006; Hill et al. 2019). Course design skills are achievable by learners at this level (Fink and Ganus 2009), and course design work engages creative and critical thinking, values shared by research and creative arts programs (Feldon et al. 2011).
Recently, a workshop series was created to develop course design skills in graduate students and postdoctoral scholars (Hoffmann and Lenoch 2013). The elective workshop series described in Hoffmann and Lenoch's study guided learners through a backwards design process while they developed courses based on their research expertise area. Self-reported learning outcomes were strong; however, project quality was mixed depending on the learner population. While the program was low-cost and accessible for participants, facilitators experienced a burden in providing regular project feedback.
Timely feedback is an important part of improving learning and supporting quality projects (Irons 2008). Feedback can be the responsibility of an instructor, or it can be shared among the members of the class. In fact, peer feedback has impacts similar to those of feedback from teaching staff (Huisman et al. 2018; Yang, Badger, and Zhen 2006). In addition, learners collaborating through feedback contribute to a supportive classroom environment. This positive, flattened hierarchy further encourages diverse perspectives, an asset for creative work. While there is great value in feedback from colleagues with generalist perspectives and/or pedagogical expertise, learners also benefit from receiving feedback from colleagues familiar with their specific content and disciplinary pedagogies. However, finding a peer who has enough shared disciplinary knowledge to provide content-specific feedback can be challenging in a multi-disciplinary cohort at one institution.
In this study, we modified the design of the workshop described by Hoffmann and Lenoch (2013) to enable more project feedback from a broader group of leaders and peers. We adapted didactic and project development content into online videos, which included short lectures, reflection prompts, and weekly assignments. We also facilitated face-to-face (f2f) workshops focused on application and feedback at each institution. To increase access to disciplinary peers, we expanded the classroom through a multi-institutional approach. This combination of hybrid and multi-institutional approaches allowed participants to: 1) work with didactic content in their own time; 2) use f2f time for discussion and practicing skills; 3) share works-in-progress online and receive feedback from a broader pool of peers; and 4) review others' works-in-progress for inspiration.
Here, we summarize our scholarly investigation of learning outcomes and product quality generated by graduate students and postdoctoral scholars participating in a hybrid, multi-institutional context. In addition, we compare the outcomes in this format to those in the previous f2f format. We also studied the impact of different components (videos, f2f meetings, peer review) on participants' and leaders' experiences in the workshop series. We share our results, reflections, and recommendations with faculty advisors and staff of centers for teaching and learning: members of the higher education community who mentor graduate students and postdoctoral scholars in professional and teaching development. In this contribution to scholarship about teaching and learning, we relate educational development practices to the preparation of future faculty (Adams 2002; Boyer 1990; Condon et al. 2016; Trigwell 2013).
We hypothesized that this hybrid, multi-institutional approach would create a broad learning community that would support the development of both participants and workshop leaders at each institution. This hypothesis was expanded into several sub-hypotheses: 1) this approach would enable application of course design principles at levels similar to those demonstrated in the traditional f2f-only format; 2) the broader learning community would enable participants' professional interactions with colleagues outside their institutions and empower trainees for future teaching careers; and 3) the learning community of course leaders would create opportunities for workshop leaders at each institution to learn with and from others across the network.
METHODS
We used a mixed-methods approach to evaluate the many aspects of the program and multiple hypotheses. An observational study was conducted to assess dynamics of the workshop series and participant engagement in the online space. We also employed a quasi-experimental design with a historical control cohort to assess skills through project quality and to compare the hybrid, multi-institutional approach to a f2f local version of this program. Finally, a phenomenological qualitative approach was used to describe common participant and leader experiences and relevant contexts through surveys and participant observation (Creswell 2013). This combination of approaches addressed levels 1 and 2 of Kirkpatrick's model of training program evaluation (Kirkpatrick and Kirkpatrick 2006). In this model for understanding degree of impact in program evaluation, level one assesses learner reaction/satisfaction and level two assesses achievement of program learning outcomes. Level three assesses behavioral impact after the training experience, typically in the workplace of the trainee, and level four assesses the broader impacts on the societies or systems to which the trainees belong.
Design rationale for Transforming Your Research into Teaching (TYRIT)
The design of the Transforming Your Research into Teaching (TYRIT) workshop series follows several common pedagogical threads in student/faculty development. This program has many features in common with the "course design institute," a type of program frequently offered to faculty working on a course transformation project (Palmer 2011; Palmer, Streifer, and Williams-Duncan 2016). These programs are often multi-day events with time split between learning didactic principles of course design and applying those principles to a course development project. In contrast with the multi-day approach, this TYRIT program uses a serial workshop format to establish flexibility for participants who may be unable to attend full-day sessions over multiple days due to busy schedules or time conflicts with funded research work. The program also makes use of flipped classroom pedagogy, delivering didactic content online for learners to complete prior to attending the f2f sessions. Time in f2f sessions is spent on application and practice of learned concepts. The approach emphasized in this program is backward design, a well-established course design approach (Wiggins and McTighe 2005).
Workshop series structure and administration
TYRIT was a nine-session course design workshop series for graduate students and postdoctoral scholars, hereafter called participants or learners, across multiple disciplines. TYRIT was first offered in this multi-institutional, hybrid format in summer 2019 (May-August), and the data reported here were gathered from that cohort of participants. TYRIT was based on a previously described graduate student development program delivered at a single institution with all activities taking place in f2f sessions (Hoffmann and Lenoch 2013). The content and scope of the workshop in the current study were identical to the previously described program. In short, participants learn a different aspect of course design in one session each week and develop a course design project based on some aspect of their research area. In the final session, learners present their course design projects (completed syllabus and focused plans for one unit) in a course pitch session emulating a faculty meeting or academic job interview.
In this iteration of the program, the TYRIT workshop was adapted from a live-only format to a hybrid design. It included in-class, mini-lectures and project development activities converted into short videos, which learners reviewed on their own prior to each weekly workshop at their home institution following principles of flipped pedagogy (Abeysekera and Dawson 2015). The online content was created by the workshop director (author Hoffmann). Files were hosted on a shared Google Drive folder and videos were hosted on YouTube. Each weekly f2f workshop focused on application and practice of course design skills, as well as peer review of developing projects.
The workshop series was adapted to enable participants at five institutions to participate in the program simultaneously. These five institutions were large, public research-intensive universities: IUB (Indiana University Bloomington), ISU (Iowa State University), CUB (University of Colorado Boulder), UI (The University of Iowa), and UNL (University of Nebraska - Lincoln). Local group leaders were identified through a multi-disciplinary network of graduate student developers (CIRTL Network, Center for the Integration of Research, Teaching and Learning, www.cirtl.net). Each local leader was responsible for recruiting local participants and arranging logistics for each weekly f2f session (described in Appendix). All leaders had access to a compendium of f2f activities that authors Hoffmann and Kelly at The University of Iowa had previously created for their traditional format of the TYRIT program. The group leaders shared plans for f2f activities through email and a shared folder prior to each weekly session, as well as the results of their programs each week. However, each group planned activities that they determined would be most effective for their population.
Participants viewed the same video content prior to attending local f2f workshops and shared their projects in development on the drive at several points throughout the program. They were also encouraged to review one another's work online for inspiration and to provide feedback to peers at distant institutions. Students were not given specific instructions on how to provide feedback to a cross-institution peer. The same folder used for posting projects in development was also used to distribute workshop materials (handouts, slides) and links to mini-lecture and course development videos.
The workshop leaders had planned to hold the final session (project presentations) synchronously at all five institutions and have students present their work live across the network. However, the logistics of finding a mutually available time slot across three time zones made this challenging, so each institution held its own final course pitch session.
Workshop participants
Eighty participants started the workshop series by attending the first f2f session. These 80 participants were spread roughly evenly across five institutions in the United States Midwest region (table 1). Thirty-nine learners completed the program by attending the final workshop and/or completing their course design project (program completion rate = 48.75%). Attrition occurred gradually throughout the program and was mirrored by a gradual reduction in online activity. For example, in week 1, an average of 83 unique users viewed the videos, and this number progressively decreased to 61.25, 42.5, and 34.2 in weeks 3, 5, and 7, respectively. Of the 39 learners who completed the program, 31 consented to have their surveys and course design projects included in this research project. Of these, 27 completed both pre- and post-program surveys (27/39: response rate = 69.23%), and 22 learners completed course design projects that could be evaluated at the conclusion of the workshop series. Others decided to postpone the completion of their project.
At all participating institutions, the workshop series was not offered for academic credit; however, at most schools this program was advertised as part of a set of learning experiences associated with the CIRTL Network. At CIRTL Network institutions, completion of local programming about teaching skills is a common mechanism for achieving Associate-level recognition with the CIRTL Network. A graduate student or postdoctoral scholar achieving CIRTL Associate-level recognition should be able to write measurable learning goals and choose and use instructional practices and assessments that align with these goals. Each institution determines the type and amount of programming necessary for this recognition. In most cases, a student who completed the TYRIT workshop received credit toward achievement of Associate level recognition in the CIRTL network (described further in Appendix).
Leader participants
The nine leaders provided support for graduate student development at their respective institutions through different campus roles and academic ranks. These leaders included one faculty member, one postdoctoral scholar, two staff members in graduate schools, and five staff members in centers for teaching and learning. Of the nine leaders from the five institutions, seven completed the post-course leader survey. Because of the small sample size, respondents were not asked to identify their institutions.
Workshop data collection and data analysis
Three survey instruments were used to collect data from learners and leaders: 1) A pre-program participant survey collected pre-program attitudes about teaching and teaching skills. 2) A post-program participant survey collected post-program attitudes about the same skills and gathered data on perception of skill development and value of workshop series components. Post-program surveys also used free response items to qualitatively assess the learner experience and value of workshop components. 3) A post-program survey of leaders identified components of the course that the leaders perceived to be influential for participant experience and characterized phenomenologically the experience of working as a workshop leader within a multi-institutional learning community. Consent was administered in the final class session, and all consent and research processes were approved with exempt status by The University of Iowa Institutional Review Board (IRB# 201907754).
Pre- and post-program surveys were developed based upon instruments used in the prior publication on this program (Hoffmann and Lenoch 2013). The survey items are listed in their entirety in the results tables below. Surveys were designed in Qualtrics (Provo, UT, USA) and reviewed by leaders from all institutions to ensure face and construct validity. The pre-program survey established baseline attitudes and self-perception of knowledge/skills in course design. The post-program survey evaluated the same objectives as the pre-program survey and explored self-perception of learning through a retrospective pre-post self-assessment of skills (Skeff, Stratos, and Bergen 1992). Pre-program surveys were sent to participants via email two weeks prior to the program, and post-program surveys were sent to participants during the week of the final workshop session with several email reminders over the following week. Pre-program surveys were completed with identifying information, which allowed for exclusion of pre-program data from those participants who did not complete the workshop series or consent to participate in the program evaluation research. This ensured that pre-post comparisons evaluated changes within the same learner population. However, because post-program surveys were conducted anonymously, it was impossible to make paired, pre-post comparisons. Scaled response items were summarized using means, standard deviations, and percent agreement with statement prompts. Depending on the nature of the data, paired and unpaired t-tests were used along with one-way ANOVA and Kruskal-Wallis tests. In all analyses, p < 0.05 was considered significant.
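To make the quantitative comparisons above concrete, the sketch below shows how such tests could be run in Python with SciPy. This is a minimal illustration under stated assumptions, not the study's actual analysis scripts; all variable names and data values are hypothetical.

```python
# Illustrative sketch (not the study's analysis code) of the comparisons
# described above; all responses below are hypothetical 1-5 Likert data.
import numpy as np
from scipy import stats

# Pre-program surveys were identified, post-program surveys were anonymous,
# so pre/post comparisons of survey items are unpaired.
pre = np.array([3, 2, 4, 3, 3, 2, 4, 3])
post = np.array([4, 4, 5, 4, 3, 5, 4, 4])
t_unpaired, p_unpaired = stats.ttest_ind(pre, post)

# Retrospective pre/post self-assessments come from the same respondent,
# so those comparisons can be paired.
retro_pre = np.array([2, 2, 3, 2, 3, 2, 2, 3])
retro_post = np.array([4, 4, 4, 3, 5, 4, 4, 4])
t_paired, p_paired = stats.ttest_rel(retro_pre, retro_post)

# Comparing rubric category subscores (or component ratings): one-way ANOVA,
# or Kruskal-Wallis when a non-parametric test is preferred.
goals = [4, 5, 4, 4]
methods = [4, 4, 3, 5]
resources = [5, 4, 4, 4]
f_stat, p_anova = stats.f_oneway(goals, methods, resources)
h_stat, p_kw = stats.kruskal(goals, methods, resources)

print(p_unpaired, p_paired, p_anova, p_kw)  # significance threshold: p < 0.05
```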
The post-program survey for leaders utilized primarily open-response items to probe their observations of the students' experiences, as well as contexts that appeared to influence those experiences. Specifically, we asked for leader perspectives on student attrition and participation, and observed value of f2f and online activities. We also asked leaders about their own experiences as facilitators of their local learning communities, including essential resources for managing the course and the meaning they derived from their roles in the workshop.
We used a phenomenological qualitative approach to understand the commonality of the experiences of participants and leaders in the course (Creswell 2013). This approach allowed us to track individual narratives within the small study populations across multiple survey questions. Each survey response was numbered in the order it was recorded, and the respondent numbers are reported with representative quotations in the results section below. Responses to open-ended survey questions were grouped into clusters by author Kearns, which were reviewed by author Hoffmann. Representative quotations are provided in the Results as exemplars.
Level of online interaction was assessed using data collected from the Google Drive. Numbers of posts and comments were identified for each week of the workshop series. Feedback comments made by local leaders were recorded separately from feedback comments posted by student peers. YouTube analytics were also collected after the conclusion of the workshop series to evaluate the level of engagement with the video content. Total views, unique users, and percent view duration were extracted from the video analytics data, and averages for each week were calculated. Videos were posted as unlisted videos, so only individuals with links to the videos could access them, ensuring that usage data were not affected by viewers not associated with the program.
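As an illustration of the weekly engagement aggregation described above, the following sketch computes per-week averages from an exported analytics table. It is a hypothetical example rather than the study's code; the column names and values are invented for demonstration.

```python
# Illustrative sketch: aggregating exported video analytics by workshop week.
# Column names ("week", "views", "unique_users", "pct_view_duration") and all
# values are hypothetical placeholders for an analytics export.
import pandas as pd

analytics = pd.DataFrame({
    "week":              [1, 1, 3, 3, 5, 5],
    "views":             [120, 98, 75, 70, 52, 48],
    "unique_users":      [85, 81, 63, 60, 44, 41],
    "pct_view_duration": [72.0, 68.5, 70.1, 66.0, 64.2, 61.8],
})

# Average views, unique users, and percent view duration for each week
weekly_means = analytics.groupby("week")[
    ["views", "unique_users", "pct_view_duration"]
].mean()
print(weekly_means)
```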
Two weeks after the conclusion of the program, participants' projects (course syllabi) were downloaded from the shared drive folder for those participants who consented to share their projects for this study. All feedback, comments, and identifying information (for both individual and institution) were redacted, and the syllabi were then subjected to rubric analysis by two reviewers to evaluate project quality. The two reviewers were calibrated using seven syllabus projects from a prior iteration of the program and a rubric that has previously been shown to discriminate quality of learning outcomes between and within groups (Hoffmann and Lenoch 2013). During calibration, the two reviewers each reviewed the seven sample syllabus projects independently and then compared scores. When scores differed, the reviewers discussed their rationales and agreed on additional language for the rubric that would bring their evaluations into closer alignment. This expanded rubric was used for evaluating the syllabus projects in this study. The rubric captured key syllabus elements related to the course objectives divided into five categories: course goals, instructional methods, course resources, schedule/course outline, and other. The "other" category included key course policies (attendance, grading, office hours) and consistent formatting and style. Using this instrument, inter-rater reliability was fairly strong (Pearson r = 0.738).
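For readers unfamiliar with this reliability measure, the sketch below computes a Pearson correlation between two reviewers' total rubric scores. The scores shown are hypothetical and do not reproduce the study's data.

```python
# Illustrative sketch: inter-rater reliability between two rubric reviewers,
# expressed as a Pearson correlation of total project scores (out of 25).
# The score lists are hypothetical.
import numpy as np
from scipy import stats

reviewer_1 = np.array([21, 18, 23, 20, 24, 19, 22])
reviewer_2 = np.array([20, 19, 23, 21, 25, 18, 21])

r, p = stats.pearsonr(reviewer_1, reviewer_2)
print(f"Pearson r = {r:.3f} (p = {p:.3f})")
```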
Historical control for project quality comparison
To ensure that adapting workshop content into videos did not diminish the quality of learning outcomes, a set of 24 course design projects from two f2f-only iterations of the workshop series (conducted in 2014 and 2015) were also analyzed by the same two reviewers, and the project scores were compared directly. The population of the f2f-only historical control cohort was similar to the multi-institutional cohort in this study (table 1). The content, sequence, and duration of the workshop series in the control group were identical, and the program was led by the same director. However, in this group, learners did not have access to each other's projects outside of class time, and the learners received project feedback each week from the instructional team. Thus, this group serves as a historical control for delivery mechanism (f2f vs. online) with co-existing variables of feedback provider (instructor vs. peer and instructor) and access to peer projects (classroom access vs. free online access).
RESULTS
The multi-institutional, hybrid approach leads to similar outcomes as the f2f-only program
Learners demonstrated significant improvements in project quality, self-reported attitudes, and skills
Learning outcomes in the hybrid, multi-institutional cohort (Kirkpatrick level 2) were assessed directly through project quality analysis and through self-reported attitude and skills development. As the most direct measure of course learning outcomes, syllabus projects of participants were first evaluated by rubric analysis (table 2). Each category was scored out of five points by two independent reviewers using a grading rubric. Category and total score means were compared by unpaired t-test, and no significant differences were detected between the groups. Category subscores were compared within groups by one-way ANOVA, and no significant differences were detected between the category subscores. Learners demonstrated a level of mastery similar to that of participants in the f2f-only cohort over each of the five domains of the final project: course goals, instructional methods, course resources, schedule/course outline, and other factors (common required elements of syllabi such as grading/attendance policies, office hours, formatting/style, etc.). The average total score was 20.64/25 points (82.56%). To ensure that online delivery of content did not negatively impact learning outcomes, these results were directly compared against the historical control projects from a f2f-only workshop series. None of the category subscores or total scores were significantly different between the hybrid, multi-institutional cohort and the f2f single-institution cohort. Importantly, these results show consistently high performance across all domains of the rubric, at similar or higher levels than was described in the prior study (Hoffmann and Lenoch 2013).
Responses to survey prompts about comfort with teaching and course design work were directly compared between pre- and post-program (table 3). Participants responded to these prompts in surveys completed before and after the program using a 1-5 Likert agreement scale (1 = Strongly disagree, 2 = Somewhat disagree, 3 = Neither agree nor disagree, 4 = Somewhat agree, 5 = Strongly agree). Percent agreement indicates those who selected "Somewhat agree" or "Strongly agree." The p value is the result of a two-tailed, unpaired t-test (n = 27-28); **p < 0.005 pre vs. post. The most substantial impact of the TYRIT program was on knowledge of basic principles of course design and comfort with course design work. Learners reported small but non-significant increases in comfort with teaching various student populations. Confidence in teaching ability was unchanged by this intervention.
Learners evaluated their self-perceived skill level for each of the major program objectives before and after the workshop series (retrospective, pre-post design). Students self-assessed their pre- and post-program skill level with each objective on a 1-5 scale (table 4). Percent competent describes the percentage of students responding with 4 (I am quite competent at this skill) or 5 (I am highly competent at this skill). ***p < 0.0005 before vs. after, result of paired t-test. Learners reported significant improvements on all of the key objectives, with the greatest level of change on objectives related to organizing and sequencing content, writing learning objectives, and constructing the course syllabus.
Learners valued course design knowledge and skills; course products; and confidence gained from the program
Following the program, participants responded to a series of prompts describing possible impact areas (table 5), and they reported their perceived level of impact of the program using a 1-5 Likert agreement scale (1 = Strongly disagree to 5 = Strongly agree). These items address Kirkpatrick's level 1. Percent agree is the percent of participants who responded either "Strongly agree" or "Somewhat agree." Learners' qualitative responses to the post-program survey reflected three values they perceived in the TYRIT program: 1) content knowledge and skills they developed in course design; 2) value participants associated with their course products; and 3) confidence gained and creativity expressed in producing their course design products. Participants' comments about the program's value reflected several of the emphasized content and skill lessons about course design principles. Aspects of backward course design and alignment of learning objectives, assessment methods, and teaching approaches figured prominently in participants' open-ended comments about important program lessons. Participant 20 stated, "The biggest lesson I learned is the importance of defining the end goal first and then designing every part of the course around that. I'd never been exposed to that kind of process before, but it makes perfect sense." Learners agreed that their projects would be useful in future job interviews (table 5; rated 4.74 ± 0.53 out of 5). In addition, they agreed that they would like to teach their course or course components in the future (4.59 ± 0.75 out of 5). However, learners on average perceived little positive relationship between their course design products and their preparation of dissertations or grants. Finally, several participants commented on how important the program was for building their confidence and supporting their creativity around course design. This program made the process of syllabus design less intimidating and more enjoyable for participants. Participant 7 stated, "The overall process of developing a course is a daunting task that I did not know how to tackle or where to even begin. The biggest take away from this program was just the overall idea of what to keep in mind when developing a course, how to begin such a task, and that it really is a feasible task to start from scratch!"
Learners perceived workshop components with varying levels of importance for their learning
Students perceived different levels of importance for the various workshop components (table 6). Trainee participants evaluated each program component on an importance scale (1 = Not at all important, 2 = Slightly important, 3 = Moderately important, 4 = Very important, 5 = Extremely important) in response to the following prompt: Evaluate the program components as to their importance for your learning. Percent importance indicates selection of "Extremely important" or "Very important." Means were compared between program components using the Kruskal-Wallis test with Dunn's test for multiple comparisons. *Adjusted p < 0.05, **Adjusted p < 0.005.
The most important program elements for students' learning were reported to be the online videos, handouts, f2f sessions, and feedback from leaders. Components related to project sharing with peers received significantly lower ratings. Participants reflected on the value of specific program elements in open-ended questions about important lessons from the program and general comments/suggestions. Students appreciated that the program emphasized opportunities for feedback on their course design components from both their facilitator(s) and their peers. Additionally, participants commented on the integration among the program components (videos, f2f sessions, and feedback) to support their final course design project. Participant 11 stated, "I loved the hybrid nature of the workshop. The videos were great and very informative, mainly because of the examples given by the instructor. And then the f2f modules provided a place to share opinions, receive feedback, and learn from the others. This program itself was a great example of how to organize and prepare an excellent course."
Hybrid design of the workshop was effective and efficient: Leader perspectives
Online videos modeled backward course design and allowed time for valuable f2f activities
Leaders commented on several valuable aspects of the online videos, including both the characteristics of the videos themselves and the ways the videos were integrated with the f2f components. Leaders appreciated that videos were short; modeled backward course design; had accompanying notes and slides that matched up with the lecture content; and included activities and reflections. Videos allowed leaders to use limited f2f time for feedback, activities, Q&A, and discussions. For example, Leader 4 said, "I think there would have needed to be many more f2f sessions in order to address everything in the videos and still give them peer review time, which proved valuable." Leader 9 echoed these sentiments: "With the short amount of in-person time, we could not have given feedback or answered questions." There was interest among the leaders in contributing to videos in future iterations so that online content reflected the diversity of institutions, leaders, perspectives, and approaches contributing to the program.
F2f sessions improved participant products, supported community-building, and held participants accountable
Local leaders' rationales and insights into the design decisions for their f2f local communities are captured in their reflections in Appendix 1. This appendix features a series of case studies in which leaders illustrate how they used these shared resources and networks to make lesson plan decisions that were contextual to the participants' motivations and in-the-moment concerns. Common themes across the f2f programs include: 1) incorporation of substantial and guided peer feedback sessions on components of the course design project; 2) progression from more to less structure over the nine-week program; and 3) focus on participants' questions and concerns. Unique features include the application contexts for learners' final projects, as well as the professional roles of the leaders of the f2f sessions.
Leaders perceived several benefits to their participants of the f2f components of this program. According to survey feedback from leaders, peer-review activities incorporated into the f2f sessions gave students more feedback and promoted quality products. Leaders remarked how important it was to provide guidance and structure on peer review procedures and to incorporate spacing between feedback sessions to allow learners time to make substantial progress on their course design projects. Leaders also said that small group work and informal discussions in f2f sessions were helpful for addressing timely questions and concerns such as: teaching at predominantly undergraduate institutions; preparing for faculty interviews; and designing teaching demonstrations. F2f sessions were most valuable when leaders did not overplan each session, when they kept the number of activities limited, and when they mixed up the activities from week to week to avoid repetition. Finally, according to leader survey responses, f2f sessions kept participants accountable, provided affirmation and motivation, and were important for participant retention and project completion. For example, Leader 5 said, "...the f2f part was important for keeping them interested, on track, and just not depressed over the summer." Leader 4 summarized the major benefits of the f2f sessions: "I think the f2f helped keep some of them on track and gave them more feedback than I could do by myself. I think fewer would have finished if it was fully online and self-paced."
Benefits and limitations of the multi-institutional approach
The cross-network approach offered unrealized community possibilities for participants
The cross-network approach to TYRIT afforded participants "connection with other students' work - examples, feedback, and just networking possibilities" (Leader 6). In addition, Leader 7 mentioned the opportunity for participants to see "how other people in their discipline were creating courses." However, this feature was underused for much of the program. Leaders continued to use the online space to provide feedback to their local students, but cross-network interactions among participants were minimal beyond week 1 (table 7).
Leaders recognized that benefits to students of the cross-network space were not achieved with regularity. Leaders remarked that the multi-institutional community for students was underused, likely because it was not incorporated as an integral component and because learners were not held specifically accountable for engaging in the online community. Four of the seven leaders responding to the survey recommended placing participants into structured disciplinary groups across institutions with prompts for feedback.
The cross-network approach supported leaders' professional development, community-building, and interdependence
In comparison to the learner experience, leaders said that the multi-institutional approach to the TYRIT program had substantial benefits for them as professionals. Leaders commented on how their engagement in leading a f2f local community within the structure of the hybrid, multi-institutional program supported their own learning and development as educational developers. As members of an interdependent leader community, they collaborated with other leaders to share materials, learn new ideas about course design, and acquire new strategies for facilitating group meetings. As Leader 7 said, "I really enjoyed the communication among all the instructors and seeing how different groups focused on different things. The insights about what worked and what was difficult was helpful." Drawing from these collective resources, local leaders had autonomy to adapt and plan experiences that were meaningful for the participants in their own communities. Leaders also expressed that working in the multi-institutional leader community imparted a strong sense of belonging. As Leader 8 said, "I had way more access to other ideas and expertise both leading up to and after each session. I was learning from other developers. I felt like the work happening in my classroom was part of something bigger, and that was very motivating."
DISCUSSION
The results of this study describe a positive learning experience: participants demonstrated achievement of the learning objectives through their project work and reported strong perceptions of learning gains across the program objectives. Course designs were unique and innovative, and reflected contemporary disciplinary approaches and values. In addition, the hybrid program design enabled more f2f time focused on application, practice, and peer review. The multi-institutional aspect created a welcome learning community among leaders. Below we provide a detailed assessment of program strengths and weaknesses.
Learning outcomes and learner perceptions
Achievement of the learning objectives was strong, with direct evidence of quality in project work and indirect, self-reported evidence of knowledge gains and comfort with designing courses. However, this program did not have an impact on participants' self-perception of teaching ability or confidence with teaching. This outcome was somewhat expected because the program focused on course design and did not offer opportunities for actual teaching. Further, baseline confidence in teaching skills was already quite high in this population of learners compared to the prior study (Hoffmann and Lenoch 2013).
Objectives linked to course design tasks (content selection, learning objectives, sequencing, syllabus construction) were rated much higher than tasks such as assessment and instructional methods. It is possible that the syllabus and unit lesson plan projects did not allow for deep exploration of these tasks. Finally, perception of program impact was very positive; a majority expressed pride in their course designs and indicated desire to teach designed courses in the future. Most participants indicated that their project work impacted the way they understand their research area, echoing the observation that teaching experiences improve research skills (Feldon et al. 2011; Shortlidge and Eddy 2018). Participants were split on whether this would translate into tangible impact on future grants or dissertation work. These results suggest that training in basic teaching skills, even in the absence of actual teaching experiences, can impact trainees' development as researchers.
The hybrid design of this workshop series ensured efficient delivery (primary content was delivered once via videos) and learner flexibility (videos and assignments could be done at any time). This approach also allowed f2f work to be focused on application, practice, and peer review instead of primary content delivery, following established flipped pedagogy and reported benefits of this method (Abeysekera and Dawson 2015; Yarbro et al. 2014). In this program, both online (asynchronous) and f2f (synchronous) elements were evaluated by participants as similarly important for learning. Participant survey commentary indicated awareness of the value of both aspects of the workshop series in supporting an efficient and effective learning experience.
Multi-institutional workshop approach: An extension of MOOC pedagogy
Combining student cohorts across multiple institutions was a unique addition to this program, similar in design to a Massive Open Online Course (MOOC). The TYRIT program differs from the traditional MOOC in two ways: 1) there is a structured f2f component at each participating institution; and 2) f2f groups were linked through a leader network. Other examples of this type of blended MOOC approach have been described (Bruff et al. 2013; Li et al. 2014). Combining a tutorial or learning community with online delivery elements draws upon the strengths of both online and f2f pedagogies and mitigates attrition (Jordan 2013; Kizilcec, Piech, and Schneider 2013; Macleod et al. 2014; Onah, Sinclair, and Boyatt 2014; Parr 2013; Seaton et al. 2014). Since MOOCs are often not associated with academic credit and there is no physical community to encourage social bonds, student persistence in these programs can be low. Indeed, in the workshop series described in this study, attrition was an issue; however, the completion rate (49%) was much higher than what would be expected in MOOCs without a f2f community and not much lower than what was seen for a f2f-only iteration of this program (59%). Reasons for attrition in this program were not formally collected through the research study. While there was variation in completion rates among institutions, there were no clear distinctions in program administration or learner profiles at the different institutions that could correlate with varied completion rates. Further, the cohort sizes at each institution were small, making it difficult to draw conclusions from these data.
To support persistence, we communicated the anticipated weekly participant time commitment upfront. We had a realistic view that there would be attrition since the workshop was not required and offered no academic credit at any participating institution. We noticed that participants who missed at least two f2f sessions were unlikely to get back on track and tended to drop out. One possible evidence-based approach to further support student persistence is purposeful grouping of learners based on disciplinary area, demographic, or geographic characteristics (Bishop-Williams et al. 2017; Kulkarni et al. 2016; Zheng, Vogelsang, and Pinkwart 2015).
In the TYRIT program, participants were encouraged to interact with peers at distant institutions by reviewing and providing feedback on each other's emerging projects. Learners did engage with peer feedback in week one of the workshop series, but this dropped off for the remainder of the program. Learners might have perceived low value in the activity, so they decided not to continue doing it. Alternatively, learners might have felt comfortable giving feedback in week one when the project designs were abstract (course scope and purpose), but less comfortable giving feedback in later weeks when participants were posting content-rich materials such as course sequences, assessment items, or instructional plans. Finally, participants might have had a hard time finding closely related projects in a rather large pool of student projects. Learners were not grouped into disciplinary areas online, so it may have been effortful to search for a project in their disciplinary area to review. Interestingly, feedback from leaders was rated much higher than feedback from learners. It is not clear whether this is due to perceived expertise of the leaders, or the quality or consistency of feedback. Nonetheless, the low participation in this aspect of the program underscores an important lesson: it is not enough to simply make peer review and feedback available. Participation and perception of value likely depend on more specific assignments or feedback tasks and consistent feedback availability.
Intersecting communities of learners and leaders
This study exemplifies how multiple learning communities overlap in a multi-institutional workshop series approach. There is abundant evidence for the value of learning communities in higher education, and this can be applied to development of professional skills in graduate students and postdoctoral scholars (Bowden 2012; Kabes, Lamb, and Engstrom 2010). In this program, learners participated in two communities: a local f2f learning community and a multi-institutional, online learning community. In this instance, engagement with the online learning community was low, with learners instead choosing to focus their engagement on local peers. Learning communities for professional developers are less well studied but are nonetheless important. Certainly, communities of faculty educators are widespread in universities (Cox and McDonald 2017; Wenger 1999).
The leader community in this workshop program emerged as a notable achievement, and all leaders cited the importance of making connections outside of their institutions and collaborating on a work of shared value. While each member of the workshop leadership team was individually committed to professional development at their institution, this program gave workshop leaders opportunities to be committed to a larger learner population, and to one another. Our cross-institution leadership team of local f2f community leaders included graduate students, postdoctoral scholars, teaching center professional staff, faculty, and graduate school administrative staff. That variety of roles was a valuable contribution to our collaborative design and sharing of f2f activities, as well as to our camaraderie and collaboration. The shared vision of this group enabled a consistent experience across multiple institutions and a community of practice for professional development. Resources for f2f activities could be shared, and lessons learned at each local group meeting could inform work across the learning community. These findings represent important observations of the specific value of learning communities for professional developers.
Limitations
This study has some limitations. Survey response rate was strong (67% of students who completed the program). However, since approximately half of the participants did not complete the program, these survey results are not representative of all program participants. Another limitation is in the use of the prior f2f cohort as a historical control. These groups were similar in academic level; however, the f2f cohort had a higher percentage of biomedical science trainees. Since the biomedical population at the institution in that study had fewer prior teaching experiences in their graduate training (Hoffmann and Lenoch 2013), it is possible that the baseline level of teaching skills/experience in these two populations was different. Nonetheless, learning outcomes were very similar by rubric analysis of course design projects, suggesting that regardless of learner background, a similar achievement level could be attained using the two methods. Finally, the study aims to address program outcomes at Kirkpatrick's levels 1 and 2 (reaction and learning outcomes) (Kirkpatrick and Kirkpatrick 2006), but is not able to measure the potential impact of this training on the future workplace of these learners.
Conclusion
This study describes a unique graduate professional development program that served as a learning community for both student participants and course leaders. The successful results of this program encourage the creation of graduate development programs in the multi-institutional space. Future work aims to improve online interactions through structured guidance and disciplinary groups that provide a specific purpose for online interactions. Further, it will be important to evaluate the long-term impact of this program on readiness for course design work in the learners' professional lives. The observation of distinct benefits for course leaders across institutions highlights that the learning community is larger than the student population alone; deliberate leadership team engagement should be pursued in the design of multi-institutional programs. Together, this study demonstrates that collaboration around project work is a fruitful space for future faculty development and that broadening our communities across institutions has rewards for learners and leaders.
ACKNOWLEDGMENTS
The authors would like to acknowledge the support and encouragement of the CIRTL Network staff and leadership. This project arose from a dinner event at a meeting of CIRTL Network leaders in Irvine, California (2019), and the entire CIRTL team has supported this work through opportunities to share the results of the study and recruit leaders from within the organization.
Darren S. Hoffmann studies anatomy education and development of graduate students as Assistant Professor in the Department of Anatomy and Cell Biology in the University of Iowa College of Medicine (USA). https://orcid.org/0000-0003-1081-5222.
Katherine Kearns supports development of graduate students and postdoctoral trainees as Assistant Vice Provost for Student Development in the University Graduate School at Indiana University Bloomington (USA). https://orcid.org/0000-0003-1487-3550.
Karen M. Bovenmyer is the Program Coordinator for the Preparing Future Faculty and Graduate Student Teaching Certificate programs at the Center for Excellence in Learning and Teaching at Iowa State University (USA).
W. F. Preston Cumming, Professional Development Lead in the Center for Teaching & Learning at the University of Colorado Boulder (USA), focuses on graduate student and postdoctoral fellow teacher development. https://orcid.org/0000-0002-8854-8112.
Leslie E. Drane supports the teaching practices of graduate student and postdoctoral instructors as an Instructional Consultant at Indiana University Bloomington (USA). https://orcid.org/0000-0003-0466-427X.
Madeleine Gonin helps instructors create more inclusive classes, particularly in STEM disciplines, as an Instructional Technology Consultant and STEM Specialist at Indiana University Bloomington (USA).
Lisa Kelly, CIRTL Coordinator at the University of Iowa (USA), facilitates improved pedagogy training for graduate students and consults on career exploration and preparation.
Lisa Rohde supports graduate teaching assistants as the Associate Director of Teaching and Research Development at the University of Nebraska-Lincoln (USA). https://orcid.org/0000-0003-0999-3486.
Shawana Tabassum studies sensors, micro/nano-optics, microfluidic devices, and applications in biomedicine/agriculture as Assistant Professor in Electrical Engineering at the University of Texas at Tyler (USA). https://orcid.org/0000-0001-6812-0437.
Riley Blay is a Master's of Clinical Anatomy graduate student from the University of Iowa (USA).
ETHICS
All consent and research processes were approved with exempt status by The University of Iowa Institutional Review Board (IRB# 201907754). Participation in the research study was voluntary and all students and leaders consented to participate.
REFERENCES
Abeysekera, Lakmal, and Phillip Dawson. 2015. "Motivation and Cognitive Load in the Flipped Classroom: Definition, Rationale and a Call for Research." Higher Education Research and Development 34, no. 1: 1-14. https://doi.org/10.1080/07294360.2014.934336.
Adams, Kathrynn A. 2002. What Colleges and Universities Want in New Faculty [Occasional paper]. Number 7. Washington, D.C.: Association of American Colleges and Universities.
Austin, Ann E. 2002. "Preparing the Next Generation of Faculty: Graduate School as Socialization to the Academic Career." Journal of Higher Education 73, no. 1: 94-122. https://doi.org/10.1080/00221546.2002.11777132.
Austin, Ann E., Henry Campa, Christine Pfund, Donald L. Gillian-Daniel, Robert Mathieu, and Judith Stoddart. 2009. "Preparing STEM Doctoral Students for Future Careers." New Directions for Teaching and Learning, no. 117: 83-95. https://doi.org/10.1002/tl.346.
Bishop-Williams, Katherine E., Kaitlin Roke, Erin Aspenlieder, and Meagan Troop. 2017. "Graduate Student Perspectives of Interdisciplinary and Disciplinary Programming for Teaching Development." Canadian Journal for the Scholarship of Teaching and Learning 8, no. 3: 11. https://doi.org/10.5206/cjsotl-rcacea.2017.3.11.
Boice, Robert. 1991. "New Faculty as Teachers." The Journal of Higher Education 62, no. 2: 150-73. https://doi.org/10.1080/00221546.1991.11774113.
Bowden, Randall. 2012. "Online Graduate Education: Developing Scholars through Asynchronous Discussion." International Journal of Teaching and Learning in Higher Education 24, no. 1: 42-64.
Boyer, Ernest L. 1990. Scholarship Reconsidered: Priorities of the Professoriate. San Francisco, CA: Jossey-Bass.
Bruff, Derek O., Douglas H. Fisher, Kathryn E. McEwen, and Blaine E. Smith. 2013. "Wrapping a MOOC: Student Perceptions of an Experiment in Blended Learning." Journal of Online Learning and Teaching 9, no. 2: 187-99. https://jolt.merlot.org/vol9no2/bruff_0613.htm.
Condon, William, Ellen R. Iverson, Cathryn A. Manduca, Carol Rutz, and Gudrun Willett. 2016. Faculty Development and Student Learning: Assessing the Connections. Bloomington: Indiana University Press.
Connolly, Mark R., You-Geon Lee, and Julia N. Savoy. 2018. "The Effects of Doctoral Teaching Development on Early-Career STEM Scholars' College Teaching Self-Efficacy." CBE-Life Sciences Education 17, no. 1. https://doi.org/10.1187/cbe.17-02-0039.
Connolly, Mark R., Julia N. Savoy, You-Geon Lee, and Lucas B. Hill. 2016. Building a Better Future STEM Faculty: How Doctoral Teaching Programs Can Improve Undergraduate Education. Madison, WI: Wisconsin Center for Education Research, University of Wisconsin-Madison.
Cox, Milton D., and Jacquie McDonald. 2017. "Faculty Learning Communities and Communities of Practice: Dreamers, Schemers, and Seamers." In Communities of Practice, edited by J. McDonald and A. Cater-Steel. Singapore: Springer.
Creswell, John W. 2013. Qualitative Inquiry & Research Design: Choosing among Five Approaches, 77-83. Thousand Oaks, CA: SAGE Publications, Inc.
Fairweather, James S., and Robert A. Rhoads. 1995. "Teaching and the Faculty Role: Enhancing the Commitment to Instruction in American Colleges and Universities." Educational Evaluation and Policy Analysis 17, no. 2: 179-94. https://doi.org/10.3102/01623737017002179.
Felder, Richard M., Rebecca Brent, and Michael J. Prince. 2011. "Engineering Instructional Development: Programs, Best Practices, and Recommendations." Journal of Engineering Education 100, no. 1: 89-122. https://doi.org/10.1002/j.2168-9830.2011.tb00005.x.
Feldon, David F., James Peugh, Briana E. Timmerman, Michelle A. Maher, Melissa Hurst, Denise Strickland, Joanna A. Gilmore and Cindy Stiegelmeyer. 2011. "Graduate Students' Teaching Experiences Improve Their Methodological Research Skills." Science 333, no. 6045: 1037-39. https://doi.org/10.1126/science.1204109.
Fink, Dee, and Melissa Ganus. 2009. "Designing Significant Learning Experiences." In Aspiring Academics: A Resource Book for Graduate Students and Early Career Faculty, edited by Michael Solem, Kenneth Foote, and Janice Monk, 70-85. Upper Saddle River, NJ: Pearson.
Fleet, Christine M., Meredith F. N. Rosser, Rebecca A. Zufall, Marney C. Pratt, Tracy S. Feldman, and Paula P. Lemons. 2006. "Hiring Criteria in Biology Departments of Academic Institutions." BioScience 56, no. 5: 430-36. https://doi.org/10.1641/0006-3568(2006)056[0430:HCIBDO]2.0.CO;2.
Golde, Chris M., and Timothy M. Dore. 2001. At Cross Purposes: What the Experiences of Today's Doctoral Students Reveal about Doctoral Education. Philadelphia: Pew Charitable Trusts.
Hill, Lucas B., Julia N. Savoy, Ann E. Austin, and Bipana Bantawa. 2019. "The Impact of Multi-Institutional STEM Reform Networks on Member Institutions: A Case Study of CIRTL." Innovative Higher Education 44: 187-202. https://doi.org/10.1007/s10755-019-9461-7.
Hoffmann, Darren S., and Susan Lenoch. 2013. "Teaching Your Research: A Workshop to Teach Curriculum Design to Graduate Students and Post-doctoral Fellows." Medical Science Educator 23, no. 3: 336-45. https://doi.org/10.1007/BF03341645.
Huisman, Bart, Nadira Saab, Jan van Driel, and Paul van den Broek. 2018. "Peer Feedback on Academic Writing: Undergraduate Students' Peer Feedback Role, Peer Feedback Perceptions and Essay Performance." Assessment & Evaluation in Higher Education 43, no. 6: 955-68, https://doi.org/10.1080/02602938.2018.1424318.
Irons, Alastair. 2008. Enhancing Learning through Formative Assessment and Feedback. Abingdon, UK: Routledge. https://doi.org/10.4324/9780203934333.
Jordan, Katy. 2013. "MOOC Completion Rates: The Data." [Website]. Retrieved from: http://www.katyjordan.com/MOOCproject.html.
Kabes, Sharon, Dennis Lamb, and John Engstrom. 2010. "Graduate Learning Communities: Transforming Educators." Journal of College Teaching & Learning (TLC) 7, no. 5. https://doi.org/10.19030/tlc.v7i5.121.
Kirkpatrick, Donald L., and James D. Kirkpatrick. 2006. Evaluating Training Programs: The Four Levels. 3rd ed. San Francisco, CA: Berrett-Koehler Publishers.
Kizilcec, René F., Chris Piech, and Emily Schneider. 2013. "Deconstructing Disengagement: Analyzing Learner Subpopulations in Massive Open Online Courses." In Proceedings of the Third International Conference on Learning Analytics and Knowledge, 170-79.
Kulkarni, Chinmay, Julia Cambre, Yasmine Kotturi, Michael S. Bernstein, and Scott Klemmer. 2016. "Talkabout: Making Distance Matter with Small Groups in Massive Classes." Design Thinking Research, 67-92. https://dl.acm.org/doi/pdf/10.1145/2675133.2675166.
Li, Nan, Himanshu Verma, Afroditi Skevi, Guillaume Zufferey, Jan Blom, and Pierre Dillenbourg. 2014. "Watching MOOCs Together: Investigating Co-located MOOC Study Groups." Distance Education 35, no. 2: 217-33. https://doi.org/10.1080/01587919.2014.917708.
Macleod, Hamish, Jeff Haywood, Amy Woodgate, and Mubarak Alkhatnai. 2014. "Emerging Patterns in MOOCs: Learners, Course Designs and Directions." TechTrends 59, no. 1: 56-63. https://doi.org/10.1007/s11528-014-0821-y.
Marincovich, Michele, Jack Prostko, and Frederic Stout. (eds.). 1998. The Professional Development of Graduate Teaching Assistants. Bolton, MA: Anker.
Onah, Daniel F. O., Jane Sinclair, and Russell Boyatt. 2014. "Dropout Rates of Massive Open Online Courses: Behavioral Patterns." EDULEARN14 Proceedings 8525-34. https://doi.org/10.13140/RG.2.1.2402.0009.
Palmer, Michael S. 2011. "Graduate Student Professional Development: A Decade After Calls for National Reform." In Studies in Graduate and Professional Student Development 14, edited by Laura L. B. Border, 1-18. Stillwater, OK: New Forums Press.
Palmer, Michael S., Adriana C. Streifer, and Stacy Williams-Duncan. 2016. "Systematic Assessment of a High-Impact Course Design Institute." To Improve the Academy 35, no. 2: 339-61. https://doi.org/10.1002/tia2.20041.
Parr, Chris. 2013. "MOOC Completion Rates 'Below 7%'." Times Higher Education. Retrieved from: http://www.timeshighereducation.co.uk/news/mooc-completion-rates-below-7/2003710.article.
Seaton, Daniel T., Yoav Bergner, Isaac Chuang, Piotr Mitros, and David E. Pritchard. 2014. "Who Does What in a Massive Open Online Course?" Communications of the ACM 57, no. 4: 58-65. https://doi.org/10.1145/2500876.
Shortlidge, Erin E., and Sarah L. Eddy. 2018. "The Trade-Off between Graduate Student Research and Teaching: A Myth?" PLOS ONE 13, no. 6: e0199576. https://doi.org/10.1371/journal.pone.0199576.
Skeff, Kelley M., Georgette A. Stratos, and Merlynn R. Bergen. 1992. "Evaluation of a Medical Faculty Development Program: A Comparison of Traditional Pre/Post and Retrospective Pre/Post Self-Assessment Ratings." Evaluation & the Health Professions 15, no. 3: 350-66. https://doi.org/10.1177/016327879201500307.
Tice, Stacey Lane. 1997. The Relationships between Faculty Preparation Programs and Teaching Assistant Development Programs. Occasional Paper Number 4. Washington, D.C.: Association of American Colleges and Universities.
Trigwell, Keith. 2013. "Evidence of the Impact of Scholarship of Teaching and Learning Purposes." Teaching and Learning Inquiry 1, no. 1: 95-105. https://doi.org/10.2979/teachlearninqu.1.1.95.
Wenger, Etienne. 1999. Communities of Practice: Learning, Meaning, and Identity. Cambridge, UK: Cambridge University Press.
Wiggins, Grant, and Jay McTighe. 2005. Understanding by Design. 5th ed. Alexandria, VA: Association for Supervision and Curriculum Development.
Yang, Miao, Richard Badger, and Yu Zhen. 2006. "A Comparative Study of Peer and Teacher Feedback in a Chinese EFL Writing Class." Journal of Second Language Writing 15, no. 3: 179-200. https://doi.org/10.1016/j.jslw.2006.09.004.
Yarbro, Jessica, Kari M. Arfstrom, Katherine McKnight, and Patrick McKnight. 2014. Extension of a Review of Flipped Learning. 1st ed. Fairfax, VA: Flipped Learning Network, Pearson, George Mason University. https://flippedlearning.org/wp-content/uploads/2016/07/Extension-of-FLipped-Leaming-LIt-ReviewJune-2014.pdf.
Zheng, Zhilin, Tim Vogelsang, and Niels Pinkwart. 2015. "The Impact of Small Learning Group Composition on Student Engagement and Success in a MOOC." In Proceedings of the 8th International Conference on Educational Data Mining, 500-503.
APPENDIX: CASE STUDIES OF LOCAL FACE-TO-FACE GROUP DESIGN
Below, leaders of the f2f sessions at each participating TYRIT program institution describe the design of their local programs and the rationales behind their design decisions. In these reflections, leaders explain how they made lesson-planning decisions based on participants' motivations and in-the-moment concerns.
Indiana University Bloomington: Leslie Drane and Madeleine Gonin
Our graduate student and postdoctoral scholar participants joined with several different concrete application contexts in mind. Several participants were planning to teach their proposed courses the following fall semester. Other participants intended to submit their course proposals as part of competitive applications for graduate students to teach courses of their own design at our institution (e.g., in their programs; Collins Living Learning Center; Global Village Living Learning Center; and the Future Faculty Teaching Fellowship). Finally, many participants were preparing to apply for academic jobs and wanted to include a course proposal in their application materials. All learners who completed the workshop series received recognition through our certificate-like Graduate Teaching Apprenticeship Program, which aligns with the CIRTL Program.
We focused our f2f activities on participant-led feedback. We recognize that our participants arrive at our learning communities with prized and unique insights, knowledges, and worldviews that can help make their colleagues' products stronger and more inclusive. In centering participant voices, we wanted to build their confidence in providing valuable knowledge. Thus, the majority of our f2f activities focused on our goal of giving participants space and time to get feedback on their products from other participants. As facilitators coming from a center for teaching and learning, we provided general meeting outlines, resources on course design and teaching strategies, and training for participants to learn how to give valuable feedback. Participants were able to spend more time in groups of four to six sharing portions of their syllabi, providing written and verbal feedback, and having in-person conversations about their thoughts. Because participants reported that they enjoyed staying in the same groups, since they did not have to re-explain their syllabus topics, we rotated the groups only a couple of times throughout the learning community.
Iowa State University: Karen Bovenmyer and Shawana Tabassum
Iowa State University graduate students and postdoctoral scholars participated to develop a course proposal for a teaching demonstration for an academic interview. In addition, participants were interested in how to use this course design to apply for national grant programs. Participants received course credit and earned certificates from both the course and the CIRTL and PFF programs for their effort and involvement. Some students attended via video conferencing software for all or some of the sessions, while many attended regularly in person. The sessions were facilitated by a postdoctoral associate, Dr. Tabassum, with invited institutional faculty members, Drs. Ratnesh Kumar and Mani Mina, as co-leaders. Dr. Tabassum focused on sharing exercises, while Dr. Kumar focused on the broader picture of how what participants were creating in the workshop would be used on the job. Dr. Mina discussed how these materials could be used to secure educational grants such as NSF RED. In general, f2f activities from the traditional TYRIT design were used, with a high level of participation. Dr. Tabassum made some modifications, chiefly because many students wanted to lead discussions during our sessions. For example, one participant developed a presentation for others in the program explaining what grants were available and how to apply for them. Dr. Tabassum used formative assessment strategies as the workshop series progressed to reshape the structure based on student feedback. For example, students wanted more time to complete some of the activities, so later activities were restructured to create that time. Several times, group members stayed beyond the scheduled hour to help each other.
University of Colorado Boulder: Preston Cumming
Some graduate students and postdoctoral fellows joined the workshop series to help develop an upcoming course, while others joined in order to better understand the process for future faculty positions. Having several graduate students who had worked with the Graduate Teacher Program and subsequent Center for Teaching & Learning in the group, along with several postdoctoral scholars who had taught, was a huge advantage in the ultimate design of the f2f meetings. Many participants were teaching their own courses in the fall of 2019, whether on the University of Colorado Boulder campus or as early career faculty and postdoctoral fellows on other campuses. It made me proud to see where we started and how well it ended for so many of them. All participants received workshop credits toward either a Certificate in College Teaching or a Future Faculty Development Certificate in our CTL.
In terms of our f2f meetings, we began by following up on the online workshop content. We gave each other space to ask questions and offer feedback, and we discussed the general ideas of the online work. We then moved into our own discussions and modeled themes used at other institutions: an elevator speech with peer feedback during week one, then Bloom's taxonomy, building on the types of materials used and on assessment. After the first four weeks, we moved to more organic discussions in which we would talk about the issues participants were running into while designing their courses, how we could collectively help them, and what methodologies and pedagogies were used across the diverse disciplines. The participants began to truly mentor each other through the process, and I was merely there to guide and facilitate the discussions.
University of Nebraska-Lincoln: Lisa Rohde
As our CIRTL program is located in the Office of Graduate Studies and not a teaching center, we were able to draw potential participants from individuals who had never been involved in teaching programming but wanted to learn more for their future career aspirations, especially postdocs. In advertising the program, we noted that it would be part of our local CIRTL offerings, as we try to connect all teaching programming to CIRTL. The class could be counted toward CIRTL Associate status at our campus, which requires completion of one pedagogy class among other workshops and activities. I facilitated the f2f sessions of Transforming Your Research into Teaching in my role as our institution's CIRTL co-lead and as part of my responsibilities as Associate Director of Teaching and Research Development within the Office of Graduate Studies.
The activities for f2f classes were largely focused on peer review or on helping students address particular concerns that had emerged that week or the previous week. While I initially provided more structure and a small amount of content in the f2f meetings, it quickly became apparent that students benefited from a little more freedom. Since many had never taught previously, thinking through the aspects of course design, and even some of the pedagogical concepts, was entirely new to them. By giving them more freedom while also creating a supportive environment in the classroom, I gave them a place to ask questions about teaching that they often did not have a forum for within their departments. While none of the participants have yet used this class toward CIRTL certification, several were grateful for the opportunity to develop teaching skills and design a course that they could take on the job market, even if they never have a chance to teach it.
The University of Iowa: Lisa Kelly
Participants from The University of Iowa were recruited from a pool of individuals who had previously participated in workshops on teaching or demonstrated an interest in developing teaching skills during graduate or postdoctoral training. These individuals expressed a strong interest in having a tangible product to take away from the program (the syllabus project). Learners also received CIRTL Associate-level recognition from The University of Iowa for their participation in the program.
Each week our two instructors, Drs. Darren Hoffmann (faculty in the College of Medicine) and Lisa Kelly (CIRTL Program Coordinator in the Office of Graduate Student Success), along with our graduate teaching assistant, Dr. Sarah Sapouckey (graduate student in Molecular Medicine), would plan activities or discussion prompts and alternate facilitation leadership. Our f2f sessions focused largely on peer review and discussion. Some weeks we engaged students in more organized activities that could help them learn and practice concepts, such as creating effective learning objectives or deciding which kinds of learning activities were most appropriate for teaching different types of content or skills. We also learned to be flexible in order to let strong discussions run their course rather than keeping to our original plan. Students remarked throughout the process that having the opportunity to discuss their course ideas with other students, both close to their discipline and across a variety of disciplines, was helpful. They took discussion prompts seriously and focused their attention on helping their peers improve their weekly assignments and course plans. We chose to change the groups each week for the first few weeks so that students would get feedback from a variety of people with different disciplinary backgrounds. By the second half of class, students often worked with the same people so that there was more familiarity with each other's projects. We feel that there is benefit both to randomized groups and to groups that stay together over time, and it was fortunate that our students experienced both kinds of group work over the duration of the summer. It seemed to work well to get feedback from a broader peer group at the beginning of the workshop, when students were working with big questions about their course and what to include. As courses became more specialized and students moved into assessments and learning activities, more discipline-specific expertise helped them choose effective techniques.