Content area
Purpose
Researchers piloted a problem-based learning (PBL) activity in a master’s degree-granting strategic studies program to explore how students apply knowledge and skills learned from the curriculum to their formulation of a strategy addressing a real-world global security scenario.
Design/methodology/approach
This mixed-methods pilot study used ethnographic observation, participant feedback, document analysis and surveys to assess the learning and engagement of multinational postgraduate students in the context of a PBL environment.
Findings
Findings revealed gaps in students’ causal logic and literacy, as well as student discomfort with ambiguity and reliance upon heuristic frameworks over willingness to conduct substantive, current and relevant research. Additionally, observed group dynamics reflected a lack of inclusive collaboration in mixed gender and multinational teams. These findings suggest foundational issues with the curriculum, teaching methodologies and evaluation practices of the studied institution.
Originality/value
This study highlights the need to include explicit instruction in problem-solving and causal literacy (i.e. logical reasoning) in postgraduate programs for national and global security professionals, as well as authentic opportunities for those students to practice interpersonal communication.
Introduction
Postgraduate strategic studies programs are prime learning environments for the integration of problem-based learning (PBL) pedagogy, as faculty and staff are charged with preparing students to adapt to an ever-changing global security landscape. As in other learning environments where PBL is effectively used, such as medicine and law (Houghton, 2023; Liu et al., 2023), the stakes are high. Graduates of strategic studies programs advance to leadership roles with increasing levels of influence and responsibility for national security in their respective countries. This kind of education necessitates a focus on simulations and on historical and contemporary case studies – both widely used at the undergraduate and graduate levels (Hunzeker and Harkness, 2014) – in which students are asked to confront complex real-world problems.
Problem-based learning both motivates students to seek out a deeper understanding of disciplinary concepts and requires students to make and defend decisions (Duch et al., 2001). Though studied in a variety of professional contexts – particularly business and medicine (Brownell and Jameson, 2004; Duch et al., 2001; Martyn et al., 2014; van den Bossche et al., 2004) – PBL had not previously been rigorously studied in the context of postgraduate strategic studies programs.
During a ten-month master’s degree-granting academic program in the United States, we conducted a pilot study to explore how implementing a problem-based learning activity in a required foundational course related to multinational students’ ability to apply what they learned in the curriculum to their formulation of a strategy addressing a real-world global security scenario. In strategic studies, strategy, or strategic thinking, exists to address difficult national security and military challenges. It entails the creation of new courses of action and their associated frameworks. Finally, strategy is an activity that, in professional national security settings, is done with others; therefore, strategic performance must be evaluated not only individually, but in its full interpersonal dimensions (Perez, 2018). Strategy is also problem-solving and, as Carr (2024) describes, this means strategists must “first diagnose them [problems] and identify and interrogate their key dispositions through probing, sensemaking, and seeking feedback” (124). These are all activities that happen in the classroom, oftentimes together with classmates.
This pilot study allowed for an unprecedented look at strategic performance for postgraduate students while also providing an initial framework for assessing one program’s approach to strategic education. Two research questions, “How do students translate knowledge of strategy into performance?” and “How does the completion of the core curriculum influence students’ performance in problem-solving exercises?” drove this effort, which was framed as a pilot study to allow for experimentation in methods, design and socialization of PBL activities among the school community.
PBL as a way of knowing, assessing and communicating causal stories
First introduced in the 1970s in medical education as an instructional strategy to teach the process of patient diagnosis, PBL has now been adopted throughout all levels of education as a curriculum design framework and, in some cases, a philosophy of education (Boud and Feletti, 1997; Savery, 2006). Our study was guided by Savery’s (2006) foundational definition of PBL as “an instructional (and curricular) learner-centered approach that empowers learners to conduct research, integrate theory and practice, and apply knowledge and skills to develop a viable solution to a defined problem” (9). Pedagogically, our framing is helped by Ssemugenyi’s (2023) conceptualization of PBL “as a process through which an instructional strategy stimulates learners to actively construct knowledge as they try to comprehend learning tasks” (4). The inclusion of PBL in the studied school’s curriculum, which is otherwise highly lecture- and discussion-based, promised a unique opportunity to assess students’ ability to align theory and practice while solving a realistic problem via formulating an original strategy, something they would be expected to do in their professional roles after graduation. Additionally, students were expected to effectively collaborate with international peers, another requirement for success in their work performance after graduation. In this way, the PBL activities married components of performance-based and authentic assessment.
Performance-based assessment can capture students’ development in problem-solving and analytical thinking, as well as their comfort with ambiguity (Tkatchov et al., 2020). In performance-based assessment, students demonstrate mastery of knowledge and skill through actual performance of an activity, versus through other means such as a multiple-choice test or essay. Recent scholarship explores the positive relationship between performance-based active learning and assessment strategies and employment outcomes (Perez-Encinas and Berbegal-Mirabent, 2023).
Authentic assessment further distils this understanding of assessment to include fidelity of environmental context and resulting limitations and opportunities. Authentic assessment demands of the student what might be demanded of them if they were performing the same activity in real life, whereas, as Wiggins (1990) writes, other forms of assessment often rely “on indirect or proxy ‘items’—efficient, simplistic substitutes from which we think valid inferences can be made about the student’s performance” (1). Both performance-based and authentic assessment feedback practices should likewise reflect the style of feedback experienced in reality (Dawson et al., 2021). At the intersection of performance-based and authentic assessment, PBL is the embodiment of situated cognition. The process of knowing, sense-making and communicating that PBL requires is thereby all at once a “product of the activity, context, and culture in which it is developed and used” (Brown et al., 1989, p. 32). This framework of PBL as a situated way of knowing and as an authentic, performance-based way of assessing makes it uniquely promising for workforce and professional development programs, such as the one we studied.
Ideally, PBL activities require students to forge connections, make substantiated arguments and consider causal logic with their recommendations and theories of action (Jonassen, 2011); in short, they require students to create causal stories that link proposed actions to desired outcomes. Autesserre (2017) finds that a failure to think through a theory of action is common among those who intervene in crisis situations, as national security professionals – the students we studied – often do. These practitioners’ theories of action “often include massive leaps in logic” and “rely on a tenuous causal chain” (Autesserre, 2017, pp. 6–7). Within the sphere of national security and military affairs, Rapport (2012) finds that practitioners fail to think through the feasibility of long-term desired outcomes. Meiser (2016) observes a similar phenomenon in military classrooms, which he names “PowerPoint nirvana”: so long as students link activities, or groups of activities, to a desired outcome on a PowerPoint slide, tight causal reasoning is absent. Finally, Hoffman (2020a, b) argues that causal logic should be an integral part of postgraduate education in strategic studies and the strategic practice that follows, while teaching via PBL is itself, as Ssemugenyi (2023) asserts, one type of teaching and learning interaction in a “causal chain” (14). Our pilot study provided an opportunity to consider how a PBL intervention can influence students’ overt demonstration of causal logic in their strategy formulation and strategic thinking.
Materials and methods
We used a mixed-methods approach to study the learning and engagement of adult students at an in-residence master’s degree-granting college in the United States. Our design followed Deignan’s (2009) framework of school/research site as subject, which applies PBL “as an educational tool to work on the object, i.e. the learners, with the intended outcome being the learners’ development within and beyond the university activity system” (14). This pilot study provided valuable data on what the process of student learning looks like, how student learning changes over time and whether students learn better due to both the intervention of the problem-solving exercise and the core curriculum itself.
Setting and participants
We submitted our research protocol, including data collection and analysis methods, to the studied institution’s Office of Institutional Assessment for ethical review. It was determined by the Human Protections Administrator to be human subject research but exempt under the provisions of 32 CFR §219.101(b), paragraph 1. With their consent, two seminars of students at the school were selected to participate in a PBL activity in a required Introduction to Strategic Studies course, the foundational course for 378 students in the program. The student body includes students from 73 countries. Many of the students already held graduate degrees in other fields and had approximately 15–20 years of professional experience when they entered the program. Most of the instruction for this population occurred in small-group seminars of 16 students each. Each seminar was taught by a three-person faculty team of experts in national security, leadership and military operations. Two seminars (one test seminar and one control seminar) of 16 students each were selected from the 24 total seminars. Purposeful sampling was used to select seminars that each included at least two female students and at least three international students. All three authors were faculty members at the school at the time of the study, although not responsible for teaching the course in the observed seminars.
Data and procedures
Data sources for this study included: ethnographic observations from the research team during the problem-based learning activities; feedback from the teaching teams of both the test and control groups; student artefacts/products; student feedback from a final survey administered after the last problem-based learning activity; and completed grading rubrics containing both numerical evaluations and comments from faculty on the final student presentations. The grading rubrics were identical to those used throughout the entire strategic studies program in every course and are included in the Appendix. We sacrificed control over the rubric criteria and standards of performance in order to maintain consistency with the assessment instrument used across the entire school for every part of the curricula. Sample criteria in the rubrics included: oral communication (organization, clear communication of relevance to audience, substantiating evidence, nonverbal and verbal delivery) and strategic thinking (integration of concepts within and between courses, reflexive challenging of assumptions, demonstration of critical and creative thinking skills, identification of implications, application of ethical perspectives and application of historical insights).
The study began at the very start of the first course in the core curriculum. The control group completed the five-day foundational course, made up of five lessons, with no variations from the syllabus and in alignment with the rest of the resident education program cohort. The test group completed the first four regular lessons from the course and an accelerated version of the fifth lesson. On the afternoon of the fourth day of the course, the teaching team for the test group introduced an interdisciplinary real-world problem oriented on a historical case study (the 1990 Persian Gulf War) that had already been discussed in the course. The prompt, developed by the research team, was: Develop a regional strategy to reinforce regional stability and prevent proliferation of weapons of mass destruction during the 1991–2000 timeframe.
The test group students then had three hours to collaborate with their seminar classmates in small groups (two groups of eight students each) to address this problem. They chose their own method of self-organization within the small groups. This choice was a valuable data point in understanding how students navigate group dynamics and the process of problem-solving. The role of the teaching team was observational, although they did clarify the problem statement, as needed.
Students were advised that they could create relevant products to aid in their final briefing, such as:
Creating a graphic and narrative description of the relevant environment(s) necessary for a comprehensive understanding of the real-world problem. These descriptions could include those factors and the relations among them that compose the strategic situation.
Creating a list of desired outcomes that any adequate strategy would realize.
Creating a list of strategic interventions, integrative of all the elements of national power, whose aim is to realize the desired outcomes.
Specifying how the military instrument of power, integrative of joint and multinational considerations, contributes to the proposed strategy.
Creating a strategy to build a coalition among those stakeholders whose approval is necessary for changing the proposed strategy into the chosen, official strategy.
On the morning of the fifth and final day of the course, students had 90 min to finalize their work prior to their delivery of a final oral presentation to the teaching team. This bounded timeframe was an important part of the authentic assessment, as decision-makers typically have a set amount of time in which to formulate such strategies. The presentations were formatively assessed by the teaching team via a rubric that evaluated oral communication and strategic thinking and that is commonly used for multiple assessments throughout the college, including oral comprehensive exams.
From August until March, both the test and control groups completed the core curriculum as standardized across the college, with no changes. At the end of the core curriculum, both the control and test groups met separately and concurrently for three hours. Within both the control and test groups, the students self-organized into two groups of eight students to complete a problem-solving exercise. The PBL prompt for this second iteration was created by the research team and intended to address a contemporary global security issue: Assess the United States’ position in Syria in light of the president’s decision to leave a small number (approximately 200 to 400) troops in the country. Outline a wholly new strategy, both to compete and cooperate, to further U.S. interests in the region.
Following principles of curriculum design aligned with realities of authentic practice (McMillan et al., 2018; Strobel and van Barneveld, 2009), on the following day, both the control and test groups briefed their findings to an audience that included not only their teaching teams, but also senior administrative leaders of the school such as the university’s president and the provost. This audience evaluated each of the four groups with the same rubric used for the initial assessment of the test group. The audience members were blind to student assignment to either the control or test group. Faculty instructors from the teaching teams from both the test and control groups simultaneously evaluated all student groups for the purposes of interrater reliability.
At the conclusion of the exercise, students completed a short, optional, anonymous survey that asked them to:
Briefly describe the working relationships within your briefing group. Were specific work roles assigned within the group? Did a single leader emerge? Did you find the group to be collaborative?
Describe the types of sources your group used to inform your strategy development (e.g. policy documents, doctrine, scholarly articles, etc.).
Which theories or concepts from the core curriculum did you draw on most heavily in your work?
Please share any additional comments about this exercise.
The survey was administered via SurveyMonkey, and all responses were included in our analysis.
Analysis
During all iterations of the problem-solving exercise, we assumed an ethnographic posture to capture the following details via descriptive and structural coding:
What conceptual categories (frameworks, theories, models, etc.) do students employ in their work?
What open-source (i.e. unclassified) resources do the students consult, and how extensive is this consultation?
How rich and extensive is the students’ work?
How many and what type of factors do the students consider in their description of the environment and their proposed strategies?
How many explicit causal claims do the students specify regarding the environment, their proposed interventions and their strategy to build a coalition among stakeholders?
How many and what type of factors do the students consider as they develop their list of desired outcomes?
To what degree do the students participate in collaboration and dialogue?
How often and over what topics do these students engage in debate, and how do the students respond to these instances?
Finally, changes in learning between the fall and the spring for the test group were analyzed based on comparing the test group’s first iteration rubrics to its final rubrics. We compared the rubrics for the spring iteration of the exercise for both the test and control groups to illuminate any differences and potential advantages the test group may have experienced due to having undergone the PBL intervention in both the fall and the spring. This was done using numerical analysis of the rubric scores as well as textual analysis of narrative comments included on the rubrics.
Results
Assessing students’ performance
Evaluators awarded only one score above “Meets Standards” in the fall iteration of the task, with average rubric scores between 2.75 and 3.10 out of 5.0. Students from the test group scored higher on the spring version of the task, with average rubric scores between 3.00 and 3.30. However, it is worth noting that the control group performed comparably to or slightly better than the test group (rubric scores between 3.2 and 3.5) in the spring iteration, which may be due to the student composition of the seminar, student motivation, a different faculty team or other confounding variables. Therefore, the test group’s improvement between the fall and spring iterations cannot be attributed solely to its prior experience with a PBL intervention [1].
Application of curricular materials
Students largely did not apply concepts or materials from their coursework in their problem-solving activities. Although by the time of the intervention the students had completed coursework on the topics in question, students neither referenced this work nor applied concepts associated with it. In the spring exercise, students settled comfortably into discussion about well-known components of the problem statement but neglected to explore possible strategies in ways that would indicate a comprehensive scan of the environment had been conducted, as they had been taught. In general, we observed that the students’ own intuition and experience, relatively uninformed by the curriculum or consideration of the aforementioned topics, most informed their discussions and briefings.
Lack of substantive research to address the problem
Students conducted very little substantive research as they progressed through the exercise in both the fall and spring iterations. Sixty percent of the students who completed final surveys indicated that they looked to official “Policy Documents” as a guide. However, no group proceeded from this research to a granular study of the environment in question. Several students reported relying only on “News Articles” and “Articles from the internet” in the research phase of the task. In addition, three students reported a primary reliance on “Personal Experience, Professional Judgment, and Common Knowledge” in their research process. Only three students reported using “Scholarly Articles” as sources in their research.
Reliance on problem-solving heuristics
Although the students generally did not integrate substantive curricular materials or seek out relevant scholarly materials to integrate into their work, they did summon three template approaches to thinking about national security or military affairs that were a part of the curriculum, including an in-house framework for understanding the elements that affect strategy formulation in the United States called the Strategy Formulation Framework. Each of the four groups in the spring exercise immediately raised these three template approaches as options for organizing their work; however, in no case was the application of a template approach anything more than superficial. In each case, students evinced a lack of confidence in the approach they should be using for the wholesale revision of policy and strategy demanded by the prompt. For instance, a student in one group asked, “Are we just using the Strategy Formulation Framework?”, which reveals a failure to understand that the framework is not a procedure for problem-solving.
Each group ultimately decided upon a different combination of approaches. Students also seemed to misunderstand how these frameworks and tools applied to the problem. For instance, one group attempted to use the Strategy Formulation Framework as a stepwise problem-solving procedure; however, the framework is merely a depiction of factors that feed into the creation of strategy. The framework is not a decision-making procedure, but a collection of elements relevant to a given body of enquiry (Ostrom, 2010). One lone student in the group did note the misapplication of the framework, asking, “How will this [the Strategy Formulation Framework] get us where we want to be?” Given that this capstone exercise occurred at the end of the year, the students’ difficulty in organizing for problem-solving work, to include selecting a method to follow, is notable.
There is some indication that the students learned and used terminology from their coursework or from the broader community of national security; however, the students did not display a sophisticated or critical employment of these terms to advance their thinking or strategic work. Students would merely mention these terms, but they would neither define them nor describe how the concept helped provide insight into the problem and thereby their strategy recommendation. All groups also used non-specific boilerplate language in their processes and presentations (e.g. “We should prevent terrorism,” “Use humanitarian aid,” and “We need to shape the narrative”).
As expected (since these concepts had not yet been formally introduced in the curriculum), the groups from the fall iteration of the task used few core course concepts, while the groups in the spring iteration of the task employed a much wider array of course concepts. At the same time, some of the students in the spring term displayed confusion about course concepts, which is of great concern considering the late date of task administration within the school year (seven months into the 10-month program).
Discomfort with ambiguity and strategy formulation
Students were visibly stymied when asked to create a “wholly new strategy” via the spring prompt, despite their education up to that point. Students in the fall iteration of the task were challenged by what and how to prioritize within strategy formulation, while students in the spring task iteration in both the test and control groups were challenged by which course concepts to use and how to begin formulating a strategy. We observed students repeatedly turning to their faculty in the classroom for clearer guidance on not only desired outcomes and products, but also on process questions such as, “Where do we start?”, corroborating O’Brien et al.’s (2022) findings.
Students’ analyses were limited to real-world policy guidance, which – in this exercise – should not have been considered, given that both the fall and spring prompts requested newly revised policy and strategy unlimited by current guidance. Although students did not consult scholarly or curricular literature to inform their work, they did tend to consult statements from U.S. policymakers or policy and strategic documents. Despite the global security context of the prompts and the multinational makeup of the student body, equivalent documents from nations other than the U.S. were not considered. Although the students were given wide latitude to do strategic work unencumbered by previously issued policy and strategic guidance, our findings indicate that they seemed to require a hierarchically provided handrail to guide their efforts. This habit of mind is notable because, in real-world strategy formulation, official guidance leaves latitude in selecting strategic options and therefore still demands a fine-grained study of the environment. The trend evident among all groups and in both the fall and spring iterations was that students looked instinctively to domestic strategic guidance while they neglected a granular, causal analysis of the problem in a global context.
Causal understanding
Students’ neglect of causal understanding was evident when examining how their proposed actions linked to a desired effect or end. For example, students often called for the need to “broker peace with our partners,” but the detailed understanding of how such negotiations might play out was simply ignored. An understanding of various countries’ and organizations’ bargaining positions and aims is necessary for diplomatic interventions to work. So, too, is an understanding of the likelihood of success, by country and organization, for each of these endeavors. Students neglected these positions, with the result being that the causal link between diplomatic efforts (“broker peace with partners”) and real-world agreement among stakeholders over how to achieve a negotiated settlement was never considered.
Although each group failed to specify the causal linkages between its proposed interventions and real-world outcomes, one group’s efforts especially illustrated this problem. This group’s proposed strategy consisted of four interventions under the heading “ways to cooperate,” including “shape narrative, support coalition, reassure partners, and diplomatic efforts.” The group’s four interventions under a second heading, “ways to compete,” included “shape narrative, military presence, provide aid, and respond to violations.” Students did not provide any causal story linking these actions to the achievement of U.S. objectives.
Group dynamics
Twenty out of 34 students (59%) responded to the optional survey at the conclusion of the task. The bulk of the student respondents described their group’s dynamics as collegial. Unfortunately, we were unable to collect demographic information from student respondents and so we do not know if a subset of students either saw the dynamics as neutral or negative, or failed to respond entirely. However, we did observe several areas of concern regarding group dynamics.
Among the groups, dominant personalities emerged within the first 10–20 min of interaction. This was consistent across test and control groups and in both fall and spring iterations of the exercise. In the initial stages of brainstorming, the emergent and self-appointed leaders in all groups delegated tasks and occasionally dismissed ideas from other group members with whom they did not agree. In one instance, a student who had a different opinion than the group leader physically removed themselves from the group (moving to another side of the room entirely) and instead worked individually on a task. This dynamic severely limited productive debate over ideas and possible solutions among the students and was likely one of the factors that resulted in shallow strategic thinking.
Another concern was that we observed women and international students being side-lined to silent and/or clerical roles. This dynamic was observed in both the fall and spring and to some extent in all four of the student groups. In only one case was an international student asked explicitly for input by their peers and encouraged to participate, and that student was from the region in question. Only one time across all groups was a woman student explicitly asked for input. Overall, a lack of appreciation for a multiplicity of perspectives was evident.
Task motivation
There is evidence that the students in each of the four groups were unmotivated to expend the energy necessary to complete the task. In the fall, one observer shared, “The students approach this effort as mainly a discussion, not as rigorous work.” Similarly, in the spring iteration, a different observer noted the same trend: “There’s a sense that since this assignment is ungraded, only minimal effort is needed.” The faculty teams’ original instructions, which oriented students to the task as low stakes – a framing particularly noted in the test group – may partially explain the limited strategic thinking demonstrated in the task.
Discussion
In this section, we discuss our findings together with complementary recommendations for educators, higher education administrators and policymakers.
Optimization of curriculum
Although students from the test group markedly improved in their observable implementation of heuristics, course concepts and knowledge of strategy from fall to spring, there was still some confusion over basic terms and models. To address this, we recommend instituting the following reforms for not only the studied site, but comparable security studies programs and indeed any graduate and executive education program seeking to develop students’ critical thinking and communication skills: (1) implement frequent formative and summative assessment of specific models/terms/knowledge in the curriculum, (2) create opportunities for students to think across courses to analyze, synthesize and create strategic products, (3) facilitate formal instruction in research, writing and presentation early and frequently within the curriculum and (4) continue to use this and other problem-based learning tasks to give students practice formulating strategy under conditions as authentic as possible, resulting in realistic strategic products. Further, additional practice taking on the perspectives of stakeholders (analytic perspective-taking) to think through the implications of actions would yield substantive benefits both for a task like this and for professional performance after graduation.
Frequent practice and formative assessment of strategic performance
Problem-based learning assessments are essentially situational judgment tests, wherein the students’ experience-based or tacit knowledge is evaluated in addition to their “non-academic abilities” related to interpersonal interactions (Webster et al., 2020). This has implications for environmental assessment and the formulation of appropriate courses of action (Cianciolo et al., 2006, 2011). As students were not regularly asked throughout the school’s core curriculum to outline or create a wholly new strategy and present it to leaders, the task may be regarded as unfair. However, it is indeed representative of the type of strategic task that students should graduate from the program ready to perform, and so is in line with outcomes-based education (Mulready-Stone, 2024).
Indeed, PBL is such a vital instructional strategy in professional development programs precisely because of its authenticity. When instituting PBL activities, it is vital that the task(s) be scaffolded in such a way that students are at least familiar with some of the concomitant tasks prior to PBL instruction, and that sufficient curricular time be allotted for the activity. At the time of the study, students in this program received little instruction in conducting and using research outside of formal academic research papers. This study indicates that faculty and administrators may be doing students a disservice by forgoing formal instruction on the uses of research in problem-solving tasks.
Savin-Baden (2020) explains this type of PBL instruction as a constellation “for critical understanding” (5), and the marriage of research and problem-solving tasks, as well as communication skills development, could work as what she calls a curricular assemblage. To address the skills gap revealed by this study’s findings, professional development leaders could ensure their programs and curricula are instructionally sound as a cohesive, scaffolded system (i.e. an assemblage). Desired research competencies such as “know how to elaborate a contextual framework” (Lopez, 2022), for example, could be deliberately included in multiple lessons across not only a course, but the entire program, with associated formative and summative assessment strategies.
Emphasis on normative and ethical evaluation
Students did not engage in much substantive debate about their proposed resolution to each of the two assigned tasks, i.e. the strategic ends the United States should pursue. When such discussion did arise, the substance of the students’ proposed solutions revealed a lack of familiarity with real-world matters related to the topic. Relatedly, we also noted that no group engaged in sustained discussion of the ethical aspects of the spring prompt (whether in terms of the effect of the civil war on the Syrian people or in terms of the application of the just-war framework of criteria to U.S. involvement in the Syrian war). This lack of normative and ethical exploration is related to, and accommodated by, the students’ lack of research.
To maintain PBL as an authentic assessment tool, it is important that students be evaluated on the feasibility of their proposed solutions. There are multiple extant feasibility analysis frameworks across diverse fields of study (Dando et al., 2022; Jewell and Cherp, 2022; Ssegawa and Muzinda, 2021), especially in the business and scientific communities. For graduate and executive education students, it could be especially valuable to task them with evaluating these existing frameworks and/or creating their own based on their knowledge of the curriculum before applying this feasibility framework to their strategy formulation.
Strategy development
There is a clear need for more rigor in crafting causally literate strategic interventions. To this end, the PBL activities revealed some weaknesses in the curriculum. Namely, students who had gone through the curriculum still relied, by the spring, almost solely on heuristics and were stymied by the challenge of creating an original strategy that reflected their understanding of the local dynamics of the given environment. The ability to think through cause and effect during strategy development is vital for this program’s students. In this case, the students failed to move beyond boilerplate interventions or to examine the probable efficacy of these efforts. As Wang and Hoffman (2020) explain, students should “understand the emic perspectives of the people they work with” to benefit their global citizenship education (449). With this in mind, it may be necessary for the educators of the studied program and programs like it to be more deliberate in teaching perspective-taking skills to their students. Unfortunately, this recommendation is at odds with recent resourcing decisions in certain strategic and security studies programs (Hoffman, 2020a, b).
Group dynamics
As we witnessed in this study, students’ first instincts to delegate, separate, regroup and then norm in the performance of team-based PBL tasks are reductionist and at odds with the spirit of interaction and emergence often found in complex adaptive systems, of which the adult seminar environment is one. Cohort formation is extremely influential for learning, and success is tied to diverse perspectives shared – not delegated – within that cohort. As Mason (2009) explains, when this complex adaptive system is working well within the learning environment, individual student assessment becomes challenging because few salient factors of impact can be objectively identified and attributed to individual students. Faculty within PBL environments must therefore be comfortable with creative assessment practices for these sorts of exercises, and student teams should resist their natural urge to divide, concur, then regroup.
Students should also learn to embrace productive conflict in PBL environments, as absence of conflict prevents disruption in complex adaptive systems and therefore inhibits the emergence of new ideas (Zimitat and Miflin, 2003). Finally, PBL can only work as intended to help students appreciate multiple perspectives if those perspectives are allowed to arise (Brownell and Jameson, 2004). Students may need guidance to appreciate and make space for the unique perspectives that their classmates bring to a problem and, as previously stated, faculty must be appropriately trained to help them do so.
Research
The importance of evidence-based research and resultant policymaking competencies for those working in strategic studies and related fields is well-documented (Göçoğlu and Demirkol, 2023), underscoring our finding that most students did not consult research-based resources. This finding also aligns with Jonassen’s (2011) caution that the most common weaknesses in student arguments arising during PBL experiences include lack of counter-argumentation, selective use of evidence that supports claims, greater conviction to personal belief than to evidence, overgeneralization from a single source of evidence and the use of unsupported assertions. In our study, we found that when given the opportunity to avail themselves of an unlimited variety of empirical research materials to develop their strategic responses to the problem sets, students defaulted instead to anecdotal evidence or no evidence at all. In this way, their strategy development was narrowed to the point of contextual and evidentiary blindness, to the detriment of their final performance and, ostensibly, of their eventual job performance after graduation from the program. Our findings suggest that making space and time in the curriculum to develop students’ research skills and information literacy overall will enhance student learning outcomes.
Recommendations for further study
Because group dynamics proved to be an unexpectedly rich and deep source of data, we specifically recommend future research on the group dynamics of postgraduate and professional learners in PBL environments. For example, collaborative frameworks such as the one Kim and Lim (2018) tested may be further validated in settings such as ours, especially given their focus on discourse. Additionally, as strategic studies has proven a fruitful context for the application and assessment of PBL, researchers should explore other novel contexts in which PBL provides an appropriate and authentic assessment: professional certificate programs, teacher-education programs and skills-based education programs, as well as internationally comparative work.
Limitations
This study was carried out at a small institution and with only two of 24 seminars. Further, seminar composition was specific and directed, which may limit generalizability to other contexts. Though data collection was robust, including student, faculty and outside observer input as well as evaluation of both processes and products, several limitations impacted our ability to collect data. First, ethnographic data collection may have been influenced by an unavoidable change in observers between the fall and spring iterations. Second, the students’ low motivation and view of the task as a low-stakes activity mean these results may not be an accurate depiction of students’ strategic thinking skills. Initial framings of the exercise that raise the stakes for students should be explored to achieve greater fidelity of results.
Third, the evaluators were highly likely to rate the students as “Meets Standards” or “3” on the rubrics (59% of total scores given), despite verbal comments from evaluators during the briefing Q&A sessions that made it clear some groups performed below standards. Similarly, there were fewer evaluative comments given to groups who scored lower on the presentation, indicating potential unwillingness of evaluators to provide constructive comments in writing. Still, this work represents an ambitious inquiry into problem-based learning and the teaching and learning of strategy.
Conclusion
Overall, this pilot study represents both a novel contribution to the literature on PBL activities in strategic education contexts and a robust multi-method PBL assessment effort. This study also contributes valuable analysis regarding group dynamics to the consideration of PBL interventions for adult learners in professional and postgraduate education programs. Our findings resulted in recommendations regarding (1) curriculum optimization and the inclusion of additional formative and summative authentic assessment opportunities, to include practice in PBL tasks, (2) normative and ethical evaluation of students’ proposed solutions, (3) more robust education in causal literacy for strategic studies students, (4) explicit inclusion of multiple perspectives in cohort formation and facilitation and (5) inclusion of scholarly research in problem-solving activities.
Acknowledgements
We would like to thank Dr Richard Lacquement (COL, US Army, Ret.) for his institutional support of this research at the US Army War College.
Notes
1. However, we observed that in the spring iteration of the exercise, at least one evaluator seemed reluctant to hold students responsible for poor work. In one exchange, an evaluator criticized a group’s posited aim, which appears nowhere in extant conversations among scholars or practitioners; however, the evaluator nonetheless gave the students in that group high marks for their performance. Similarly, a group whose proposal this same evaluator criticized for its shallowness and infeasibility nevertheless received high marks on their presentation.
Conflict of interest: While the authors do not have any conflicts of interest to disclose, we are required to state that the opinions, conclusions and recommendations expressed or implied in this manuscript are solely those of the authors and do not necessarily represent the views of Marine Corps University, the United States Marine Corps, the US Army War College, the United States Army, the Department of Defense, or any other US government agency.
Table A1
Problem briefing rubrics
© Emerald Publishing Limited.
