Aim
To report the development and validation of an assessment approach for competency-based nursing education in low-income settings.
Background
Adopting a competency-based curriculum has been linked with the resource-intensive, programmatic assessment approach. However, implementing this approach in low-income contexts has challenges, and nursing education institutions in these contexts have reported difficulties implementing programmatic assessment. These challenges threaten the outcomes of the assessment innovation, highlighting the need for an assessment approach tailored to these contexts.
Design
A modified e-Delphi study.
Methods
The modified e-Delphi was used to refine and validate a proposed assessment approach, following the recommendations for Conducting and REporting of DElphi Studies. Nine health professions education experts, with qualifications or publications in educational assessment or in curriculum design and development, were purposively sampled to validate the proposed assessment approach. Consensus was set at ≥ 70 % agreement and was reached after two iterative rounds. The items included in the two rounds were drawn from the six domains describing organisational contexts in terms of people, culture and technological infrastructure.
Results
Eleven of the 47 items in round one did not reach the set ≥ 70 % consensus. These 11 items were refined and sent for round two. A total of 46 items were included in the final approach. The results report the developed assessment approach for competency-based nursing education in low-income settings.
Conclusions
A contextually relevant assessment approach should be underpinned by empirical evidence and knowledge of the context. Nursing institutions should support faculty development in assessment to enhance the implementation and sustainability of the assessment approach.
1 Introduction
Assessment strategies, approaches and models must be feasible and relevant to the context. The adoption of competency-based nursing education (CBNE), advocated for by various stakeholders including the World Health Organization (WHO, 2022), has direct implications for how nursing students are taught and assessed. Programmatic assessment (PA) is an essential component of competency-based education (CBE) (Van Melle et al., 2019) and a complex alternative to conventional assessment approaches: it increases opportunities for assessment and feedback while integrating various assessment methods aligned with learning outcomes, resulting in defensible assessment outcomes (Bok et al., 2018; Ryan et al., 2023).
Programmatic assessment is a system of assessment that cumulatively combines multiple low-stakes assessment activities to counteract the weaknesses of single assessment activities (Schut et al., 2021). It thus provides a holistic overview of a student's performance over time. The longitudinal accumulation of low-stakes assessments informs high-stakes decision-making by expert assessors for either progression or certification of the student (Van der Vleuten et al., 2018). Programmatic assessment allows for student-centredness in providing feedback, thus enhancing self-directed learning (De Jong et al., 2022). The PA approach is resource-intensive (Schut et al., 2021), rendering it unsustainable for financially constrained nurse education institutions (NEIs). According to Nyoni and Botma (2019), there is a drift towards traditional assessment methods in some NEIs in low-income settings due to the financial constraints of implementing PA. However, traditional assessment approaches fall short of fully assessing the competence of students (Lockyer et al., 2017). Hence the need to develop an alternative, feasible assessment approach.
2 Background
Successful outcomes of implementing PA are reported in high-income countries (Van der Vleuten et al., 2018; Cheptoo, 2019). However, there are limited studies from low-income countries reporting on the outcomes of implementing PA despite reports of the adoption of CBE (Roberts et al., 2022). The context where nurses are educated must be considered when implementing CBE and PA, in light of the transferability of knowledge and skills by nurse educators (Pinpathomrat et al., 2023).
The education of nurses varies across contexts. The availability of resources, the types of students, the educators, learning outcomes, disease patterns, communities and types of curricula models are among the critical factors that influence the education of nurses (Nilsen et al., 2020). Low-income settings are characterised by high student-to-nurse-educator ratios (Hill and Abhayasinghe, 2022), limited clinical placement opportunities (Gassas, 2021; Hill and Abhayasinghe, 2022), outdated educational practices (Whitehorn et al., 2021), fragmented accreditation systems (Dovdon et al., 2022) and a lack of sufficient investment in educational development (Rybarckzyk et al., 2020). These contextual elements must be considered when developing and implementing nursing education programmes in low-income settings, thus supporting the need for a plural perspective towards evidence-based educational practice (Grant, 2019). However, Woods (2015) asserts that there is evidence suggesting that transplanting educational practice from high-income settings to low-income settings is often unsustainable and may result in curriculum drift. Curriculum drift is characterised by learning experiences that are not aligned with the intended curriculum outcomes (Woods, 2015). Therefore, the adoption of CBE and PA should be tailored to align with the contextual realities of the intended setting for better educational outcomes.
The context of this study was a small low-income country in Southern Africa, with a heavy disease burden, limited resources for health care and insufficient human resources for health. The NEIs in this setting adopted PA as an assessment approach in alignment with the adopted CBE requirements (Van Melle et al., 2019). The newly adopted assessment approach resulted in the introduction of a wider variety of assessment methods, increased opportunities for assessment and feedback and the introduction of different approaches to standard setting, with a further impact on the overall administration and oversight of assessments for the NEIs in the country (Nyoni and Botma, 2019; Shawa et al., 2023).
However, the NEIs that implemented this CBNE over the years have reported challenges related to the implementation of PA (Shawa et al., 2023). The NEIs struggled with resources for student assessments, limited funding and investment, a shortage of educators for successful implementation of PA and governance issues related to assessment (Moabi and Mtshali, 2021). These PA implementation challenges have also been reported in other low-income countries (Pinpathomrat et al., 2023). Ultimately, implementing PA resulted in assessments that were inconsistent and fragmented, with the potential to negatively influence the outcome of implementing CBNE.
To strengthen the implementation of CBNE in this low-income country, we developed an assessment approach that aligns with the contextual realities of implementing CBE in such a setting. This paper describes the development process of the assessment approach, the proposed assessment approach itself and the outcomes of an e-Delphi process to validate it. We argue that validating an educational innovation by seeking the consensus of educational experts within the context is more likely to enhance the validity of the educational artefact.
2.1 Development of the assessment approach for competency-based nursing education
We developed an assessment approach for the implementation of CBNE in a low-income country in response to the need for a more feasible alternative to the assessment approach being used in the setting. Design science research (DSR) was applied to frame the multi-phase approach by integrating results from a mapping review (Mukurunge et al., 2024), which was embedded in the knowledge base domain of the DSR, and semi-structured interviews (Mukurunge et al., 2024), which were embedded in the environment domain of the DSR (Weber, 2012). Findings from these two phases preceding this study were integrated using collaborative synthesis to propose an assessment approach for CBNE in a low-income setting. Fig. 1 illustrates the DSR framework that was used in this study.
During the first phase, the characteristics of different assessment approaches, frameworks, models and methods used in undergraduate health professions education were explored using a mapping review. Studies published between the years 2000 and 2022 were retrieved and analysed, revealing that the Quarter model (Singh et al., 2012; Gupta et al., 2021) bore some resemblance to the expectations of competency-based education and was underpinned by the PA approach in its implementation (see Mukurunge et al., 2024). Therefore, some aspects of the Quarter model were adopted in the development of the new assessment approach.
The second phase described the institutional assessment contexts at NEIs in a low-income country through semi-structured interviews embedded in the DSR framework's environment domain (Weber, 2012). The following criteria of the environment domain in the DSR framework were included: ‘the people’ according to their roles and capabilities; ‘the organisation’ in terms of its strategies, structures, culture and processes; and ‘the technology’ to describe the technological infrastructure and developmental capabilities. Data were deductively analysed to yield the six domains representing NEIs' organisational contexts at micro, meso and macro levels (Braun and Clarke, 2022). The complete details of the second phase have been reported elsewhere (Mukurunge et al., 2024). The developed assessment approach is illustrated in Fig. 2.
2.2 The assessment approach for CBNE in a low-income country
The assessment approach in Fig. 2 is to be enacted at three levels of decision-making. The micro level covers the implementation of assessment on a day-to-day basis in the classroom and the clinical area. At the meso level, assessment implementation is to be monitored at the organisational level. The organisational level includes key decision-making offices that appear in the organogram of the institution, such as the Quality Assurance Office, the offices of heads of programme and the Academic Head. At the macro level, external stakeholders should be involved in monitoring the implementation of assessments in national training institutions. The external stakeholders include governing bodies like the Nursing Council, the Ministry of Health, the Academic Senate, the Institutional Board and the Council on Higher Education. The assessment approach follows a systems approach of inputs, processes and outputs (Hastuti et al., 2021), which are discussed in the following sections.
2.3 Inputs
The inputs to a system are raw materials that can be processed into outputs (Sidik, 2022). However, the raw materials can take different forms, not necessarily physical ones. A competency-based programme is set to produce competent, lifelong-learning healthcare professionals who can work towards reducing the disease burdens in their context (Rhoney et al., 2024). Therefore, in the developed assessment approach, the inputs include the health needs of the population, the competencies that health professionals should possess and the teaching and learning activities of the programme encapsulated in the competency-based curriculum (see Fig. 2). These inputs are intertwined and inform each other. The development of a relevant competency-based curriculum needs a thorough exploration of the health needs (Lopez-Medina et al., 2019). The findings of the exploration of the population's health needs determine the competencies that nurses should possess, which are incorporated into the curriculum and hence inform the teaching and learning activities.
2.4 Processes
The process part of the approach is where decisions about inputs in the production of outputs are made (Galais et al., 2020). In this assessment approach, the processes to be followed in the implementation of assessments operate at the micro, meso and macro levels. Processes involved in assessment at the three levels of decision-making are discussed below for each of the six domains that emanated from the thematic analysis of data gathered from the semi-structured interviews in Phase 2 of the study (Table 1).
In Domain 1, the curriculum, academic planning and moderation of assessments are essential in the design and quality assurance of assessment. At the macro level, the curriculum should be a competency-based, learner-centred curriculum underpinned by educational theory and principles such as constructivism (Vygotsky and Coleman, 1978; Holmes et al., 2021), constructive alignment (Biggs, 1996; Cammies et al., 2022), scaffolding (Wood et al., 1976; Margolis, 2020; Masava et al., 2023) and authenticity (Roach et al., 2018). National standards for assessment implementation should be revised to align with the standards of CBE. At the meso level, the key offices in the institution should ensure the implementation of the competency-based curriculum and the development of a clear yearly master plan of academic activities that reflects assessment schedules per module. Appropriate and cost-effective assessment methods should be used in both theory and clinical assessments. Physical resources for the implementation of assessment, such as classrooms, clinical areas and simulation laboratories, should be made available. Cybersecurity features should be put in place to secure assessments during their development and their electronic transfer between examiners and moderators. Establishing institutional assessment training workshops, facilitated by dedicated assessment experts, can yield educators who are competent assessors and moderators of assessments. In addition, the strategy should outline the competencies required in continuing professional development in assessment implementation for educators.
Faculty capacity and development in assessment is the focus of Domain 2. At the macro level, the drafting of a national strategy for faculty development in assessment is essential. This national strategy can result in the empowerment of educators and the development of a monitoring tool for faculty development in terms of the acquisition of assessment competencies. At the meso level, the key offices in the institutions should draw essential institutional assessment competencies from the national strategy on faculty development. The institutional assessment competencies should have an inherent monitoring strategy for the attainment of these competencies by educators. Faculty development in assessment can also be monitored through scheduled periodic performance appraisals of educators. At the micro level, assessment workshops should be facilitated for all members of staff by competent, dedicated faculty members. In addition, staff should be encouraged to take ownership of their learning by engaging in self-directed learning activities, which include enrolling in free Massive Open Online Courses.
The external stakeholders, in Domain 3, assume both monitoring and advisory roles in institutional assessment implementation. The Ministry of Health and the regulatory bodies can establish external committees for the uniform monitoring of assessment implementation by national institutions. Each institution should have an Academic Senate made up of professional experts capable of monitoring and advising on assessment implementation. There should be terms of reference in place for institutional boards so that assessment strategies are implemented seamlessly regardless of the individuals who occupy positions on the boards. At the meso level, student feedback on assessment can be used in the development of assessment checklists. Institutional assessment committees can be established for monitoring assessment implementation.
Domain 4, evidence-based practice, refers to institutions undertaking assessment-related research and adopting and using guidelines and practices informed by research. At the macro level, institutions should engage in partnerships with external institutions on assessment research to enhance research output and capabilities. At the meso level, institutions should have a culture of reviewing assessment practices and adopting innovative evidence-based assessment practices. At the micro level, educators should implement assessment using evidence-based practices.
Assessment methods, techniques and blueprints constitute Domain 5. To implement affordable and reliable assessments, institutions should ensure the selection of the correct assessment methods for given learning outcomes. There are various assessment methods in use for both theory and practice. At the meso level, the assessment and academic committees should develop blueprints according to current or revised taxonomies such as the Krathwohl taxonomy. The blueprints should be aligned with the learning outcomes of all modules in the curriculum. Institutions should develop assessment policies that are aligned with the curriculum to guide assessment implementation. At the micro level, educators should administer assessments using appropriate and feasible assessment methods. For example, results from the interviews in Phase 2 revealed that the implementation of Objective Structured Clinical Examinations (OSCEs) as an assessment method in a competency-based curriculum is resource-intensive. Liu (2012) introduced workplace-based assessments, arguing that OSCEs tend to deconstruct the patient-practitioner interaction, further affecting the validity of assessments. Workplace-based assessments are more authentic than OSCEs because they take place in a real clinical context (Reid et al., 2021). According to Liu (2012), assessment of competencies in the clinical area is cost-effective since there is no need to train standardised patients and the resources that the student uses in the assessment belong to the clinical area. Examples of workplace-based assessment methods that can be used are direct observation of procedural skills, mini-clinical evaluation exercises and case-based discussions (Donkin et al., 2023; Batra et al., 2022). Therefore, in the implementation of cost-effective assessments in competency-based nursing education, NEIs can consider adopting workplace-based assessments.
Domain 6 refers to the organisation and its finances. To operate at their optimum level, all departments in an institution should operate in sync. Therefore, administrative staff in institutions should be trained in budgeting processes for assessment activities in the programme. At the meso level, the administrative staff should develop budgets for assessment in terms of the capacity development of educators and technology-related infrastructure. Administrative staff should take part in institutional meetings where planning for assessment takes place. At the micro level, the administrative staff should attend capacity-building workshops on budgeting for assessment activities in the institution.
2.5 Outputs
Outputs are the results of the processing of the inputs (Galais et al., 2020). As shown in Fig. 2, the outputs are criteria for good assessment, high-quality student outcomes, professional nurses and enhanced patient outcomes. For comprehensive results, there should be interaction among the elements of the outputs, as shown in Fig. 2.
2.6 Purpose of the study
The study aimed to develop an educational assessment approach and to validate it through a Delphi process.
2.7 Materials and methods
A modified e-Delphi technique was used to validate the developed assessment approach. According to Iqbal and Pipon-Young (2009), the modified e-Delphi technique is used to reach consensus on a topic among a panel of experts in a particular field. A modified e-Delphi was used instead of a classical Delphi because there was no initial open round to generate ideas; rather, the initial idea was the developed assessment approach. The modified e-Delphi technique offered essential benefits to the validation process: it inherently engages the opinions and experiences of experts, which was essential in validating the developed assessment approach, and its anonymity allowed the education and assessment experts to participate freely in the validation process without interference from potentially dominant participants (Junger et al., 2017). Consensus was reached at ≥ 70 % agreement on the components of the developed assessment approach (Hong et al., 2019). The modified e-Delphi technique followed the Conducting and Reporting of Delphi Studies (CREDES) guidelines (Junger et al., 2017). According to these guidelines, experts with diverse experience must be included irrespective of their geographical location. The Delphi process was conducted anonymously to avoid social pressure emanating from the need to conform to the most dominant view (Williamson et al., 2021). The procedure was iterative, with two rounds; the design of the second round was informed by a summary of the responses from the first round (Junger et al., 2017). In this paper, the modified e-Delphi technique refers to the method, while the modified e-Delphi process refers to the entire consensus-building process.
The six domains that constituted the assessment approach were deconstructed into individual elements to develop the questionnaire used in the modified e-Delphi process (see Additional item 1). The assessment approach questionnaire was piloted on two local assessment experts before dissemination to the panellists.
2.8 Selection of the panel of experts
Purposive sampling was used to select 20 prospective members of the panel of experts in assessment. The literature recommends that a Delphi process requires 10–50 expert members per panel to achieve a wide range of opinions (Iqbal and Pipon-Young, 2009; Williamson et al., 2021). However, Delphi processes have been carried out with as few as seven panellists (Iqbal and Pipon-Young, 2009). The inclusion criteria for the panellists were: health professions educators from low-income countries with either qualifications or publications in education and assessment or in curriculum design and development; and proficiency in the English language. The prospective participants for the modified e-Delphi were identified during a preceding mapping review study (Mukurunge et al., 2024) and were contacted using the email addresses available in their published articles. Twenty prospective panellists were contacted via email and ten responded, showing interest in participating. The emails communicated the aim, process and possible risks and benefits of the expert consultation. Written consent was received from the ten expert panellists who agreed to participate in the modified e-Delphi process. The modified e-Delphi process was carried out from 18 September 2023 to 13 October 2023 and took two iterative rounds for the expert panellists to reach ≥ 70 % consensus. For each round, all responses were collected and analysed anonymously.
2.9 Round 1 of the Delphi
A Google form was developed containing the statements that make up the assessment approach, that is, the six domains and their application at the micro, meso and macro levels, as shown in Table 1 and Additional item 1. Accompanying the Google form was a comprehensive narration, in Word format, of the domains and their application at the micro, meso and macro levels to allow the panellists to make informed decisions. Panellists were to decide whether the statements were clear and relevant by selecting either a “yes” or “no” response and providing a reason for a “no” response. Panellists were given two weeks to compile their responses on the document. Statements that had < 70 % consensus among the panellists were discussed by the three authors and fed back into Round 2 of the e-Delphi process after the suggested amendments were made.
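To make the consensus rule concrete, the screening step in each round can be sketched as a small calculation: an item's agreement is the proportion of “yes” responses among the panellists, and items below the 70 % cut-off are carried into the next round. The sketch below is a minimal illustration of this rule only; the item labels and panel responses in it are hypothetical, not data from this study.

```python
# Illustrative sketch of the ">= 70 % agreement" consensus rule used in each
# Delphi round. Item labels and responses are hypothetical, not study data.

THRESHOLD = 0.70  # consensus cut-off: at least 70 % "yes" responses


def consensus(responses):
    """Return the proportion of 'yes' responses for a single item."""
    return responses.count("yes") / len(responses)


def split_items(items):
    """Partition items into those reaching consensus and those to refine."""
    reached, refine = [], []
    for name, responses in items.items():
        (reached if consensus(responses) >= THRESHOLD else refine).append(name)
    return reached, refine


# Hypothetical yes/no responses from a nine-member panel for two items:
panel_items = {
    "Domain 1, macro-level statement": ["yes"] * 8 + ["no"],       # 8/9 ≈ 0.89
    "Domain 5, micro-level statement": ["yes"] * 5 + ["no"] * 4,   # 5/9 ≈ 0.56
}

reached, refine = split_items(panel_items)
# The first item reaches consensus; the second falls below 70 % and would be
# refined and re-sent in the next round.
```

In this sketch, an item agreed by eight of nine panellists (≈ 89 %) is retained, while one agreed by five of nine (≈ 56 %) is flagged for refinement, mirroring how statements below the threshold were amended and fed into Round 2.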
2.10 Round 2 of the Delphi
A second Google form was created containing the revised statements that had < 70 % consensus in the first round. The Google form was sent to the panellists for them to decide whether the amendments were correct. Panellists were given a week to work on the revised statements. After round two, all statements had more than 70 % agreement; hence, there was no need for a further round.
3 Results
The results are presented according to the demographics and the assessment approach validation in the modified e-Delphi process.
3.1 Characteristics of the participants
Ten experts (n = 10) indicated interest in participating in the study. However, only nine (n = 9) participated in the modified e-Delphi process, as shown in Table 2. Most of the panellists (n = 8) had a PhD qualification, and the panellists had between 8 and 37 years of experience in education and assessment. All the participants were educators at their institutions.
3.2 Validation of the assessment approach in the e-Delphi process
The proposed assessment approach went through two rounds of validation in the e-Delphi process. Table 3 illustrates the e-Delphi process results from Rounds 1 and 2.
Of the 47 items (n = 47) included in Round 1, 36 (n = 36) reached the ≥ 70 % consensus. In the first round, Domain 5 had two missing items at the micro level; in this case, “missing” means that the particular statements for the domain were mistakenly omitted from the questionnaire. These were included in the second round, as shown in Table 3. Eleven items (n = 11) were sent to Round 2 for panellists to validate and reach consensus. Only one item (n = 1) was removed from the developed assessment approach.
4 Discussion
The measurement of the acquisition of competency by nursing students requires a robust assessment approach (Norcini et al., 2022). Notably, an assessment approach should reflect an integration of the theories and principles used in implementing assessment in an educational programme. The adoption of the PA approach has been reported to influence the validity and reliability of assessment outcomes (Van der Vleuten et al., 2018). However, reports of the success of PA implementation are skewed towards high-income countries (Timmerberg et al., 2022). This could be linked to the high costs of PA implementation. Reports from NEIs in a low-income setting that adopted CBE revealed contextual challenges in the implementation of PA (Shawa et al., 2023; Bvumbwe and Mtshali, 2018). Lubbers, Otter and Schildkamp (2022) showed in their study that institutional contexts determine the success or failure of implementing educational innovations. Therefore, on the premise that contextual characteristics have some bearing on the success of the adoption of educational innovations, a more feasible assessment approach was designed for NEIs in low-income settings.
Effective validation of the developed assessment approach necessitated the recruitment of assessment experts with knowledge of NEIs’ organisational contexts in low-income settings.
Knowledge of the organisational context, which includes the culture, the human resources, the technology and the “deep” structures of the organisation (Akala, 2021; Veugelers et al., 2020), was essential among the recruited panel of experts. This recruitment may have had a positive influence on the expert decisions made by the panellists, since they had a better understanding of the context of the low-income country.
Innovations in education have brought multiple roles for educators, including facilitation of learning, curriculum and module planning, creation of resource materials, mentoring students, programme evaluation and the use of technology in implementing assessment, amongst others (Branfield et al., 2020). However, the interpretation of the CBE curriculum in conjunction with the implementation of new assessment approaches poses challenges for educators in different contexts (Ghezir et al., 2021; Ross et al., 2021). Therefore, Iobst and Holmboe (2020) affirmed that NEIs in low-income settings need to invest in capacity building among educators by establishing training workshops and more innovative approaches. This point is supported by one study on implementing educational innovations, where results showed that support for educators is paramount to the success of educational innovation implementation (Lambriex-Schmitz et al., 2020). In addition, we propose that NEIs compile mandatory assessment competencies that educators should renew annually under the supervision of their line managers. Acquisition of these competencies by educators can lead to the incorporation of evidence-based practices in line with global trends in health, which has a ripple effect on the quality of the nurses educated in the NEIs. NEIs could then establish collaborations with other international NEIs to engage in benchmarking exercises for capacity building to improve the assessment of learning. NEIs should develop formal policies on capacity building for educators, as stated by Srinivas and Adkoli (2009). The NEIs can refer to the Global Pillars Framework, which provides guidelines for professional nursing education.
The Global Pillars Framework was developed by the Global Alliance for Leadership in Nursing Education and Science (GANES, 2019) and confirms that a curriculum used in educating nurses should be responsive to the local context and its health needs.
Panellists agreed with all statements in Domain 3 in the first round. In this domain, the issue of stakeholders in the NEIs emerged as crucial to the implementation of the assessment approach. Various stakeholders are involved in the implementation of the assessment approach according to the level of decision-making involved. Thus, for effective implementation, all stakeholders must participate fully. As per GANES (2019), regulatory bodies at the national level should establish standards for assessment implementation, along with monitoring and evaluation guidelines to ensure regular review and monitoring of implementation in NEIs. Sometimes, stakeholders fail to engage due to a lack of awareness of what is expected of them (Torre et al., 2022). Therefore, it is necessary to orient all stakeholders to the assessment approach so that they can participate fully in its implementation. In addition, NEI leadership should note that academic advisors play a vital role in assessment implementation (Rich et al., 2019). Therefore, arrangements can be made with the national regulatory bodies to engage external experts who can work with local stakeholders in developing guidelines and policies for assessment implementation.
The success of establishing and sustaining educational innovations is rooted in evidence-based practices (Woods et al., 2023). Outcomes of implementing the assessment approach, especially in low-income settings, should therefore be documented and presented in the literature. To enhance the generation of this literature, research collaborations between institutions in low-income and high-income settings should be established. Parejo, Revuelta-Dominguez, Guerra-Antequera and Ocana-Fernandez (2022) reiterated in their study that the success of implementing educational innovations lies in concurrent research to evaluate and critique the innovations. NEIs can also establish mechanisms for periodic performance reviews to hold educators accountable for their work. Finally, exceptional performance by educators should be recognised and rewarded to keep educators motivated (Mok and Yeen, 2021); motivated educators have a higher probability of excelling in their work. In addition, innovation has become a core emphasis in the educational context, with a ripple effect on the quality of education offered to students. According to a study by Faud et al. (2020), institutions that engage in curricular innovations have a higher probability of producing students with 21st-century competency skills to deal with current global trends in health.
The assessment approach emphasises the use of different types of assessment methods to evaluate the attainment of competency because no single assessment method is free of limitations (Torre et al., 2022). Domain five refers to the use of various assessment methods and the importance of blueprints in the development of assessment items. The choice of assessment methods directly influences the resources required for implementation. NEIs should therefore use their assessment policies and curricula to guide the choice of assessment methods that are both feasible and financially viable in their context. The use of blueprints ensures precision in assessment development (Kirby et al., 2021). However, some educators may struggle because they lack knowledge of how to develop blueprints and select the most appropriate assessment methods (Heeneman et al., 2021). Hence, establishing faculty development programmes is paramount to enhancing the capabilities of educators (Rich et al., 2019).
The successful implementation of an educational innovation requires strong financial backing, as reiterated in domain six of the assessment approach. In a similar study in which an educational innovation was introduced in institutes of technology, financial resources and supportive leadership were key to the sustainability of the innovation (Lubbers et al., 2022). NEIs should therefore have functional financial systems that support assessment. Finance departments in NEIs should be knowledgeable about the processes and strategies implemented in assessment; involving them in planning meetings for assessment implementation could improve monetary flows towards assessment. Beyond assessment, finance departments in NEIs are involved in many other activities. According to GANES (2019), NEIs should have the financial resources to support a functional library and internet resources so that students learn to use evidence to inform their learning. However, governments in low-income settings battle many competing challenges when financing their NEIs, including conflict, low investment, more emergent priorities such as infectious diseases, a constrained fiscus and the brain drain of academics to high-income countries (Nsengimana et al., 2024). In these contexts, such challenges demand more immediate solutions than libraries and internet coverage. Therefore, this study recommends adopting the developed assessment approach and optimising available open-access resources to support the successful implementation of assessment in educational innovations; innovative solutions such as open-access resources are necessary to mitigate the financial challenges.
4.1 Strengths and limitations
The use of the CREDES guidelines in the modified e-Delphi method ensured credible results. The study is reproducible in similar contexts because of its standard methodology and audit trail of implementation. The inclusion of international panellists with extensive experience in education and assessment helped ensure that the proposed assessment approach was contextually relevant for NEIs in the low-income country. However, the selection of panellists for the modified e-Delphi method was limited by language: only English-speaking panellists were selected, excluding non-English speakers who could have contributed their expert opinions on the assessment approach.
5 Conclusion
Inappropriate assessment approaches in competency-based nursing education motivated the development and validation of a tailor-made, contextually relevant assessment approach for low-income contexts. The developed assessment approach can be used in similar low-income settings. Notably, other low-income settings can follow the design processes used in this study to develop their own assessment approaches. The findings from this study can influence practice, as institutions can set up structures to support and fund the implementation of assessment in competency-based nursing education. Further research is recommended to test the feasibility of implementing the developed assessment approach.
Funding
This research did not receive any specific grant from funding agencies in the public, commercial or not-for-profit sectors.
Ethical statement
Before conducting the study, ethics approval was granted by the University (reference number HSD2022/0510). Twenty prospective panellists were contacted via email, and ten responded, showing interest in participating. The emails communicated the aim, process and possible risks and benefits of the expert consultation. Written consent was received from the ten expert panellists who agreed to participate in the modified e-Delphi process. All responses in the modified e-Delphi process were collected and analysed anonymously.
CRediT authorship contribution statement
Eva Mukurunge: Writing – original draft, Project administration, Investigation, Formal analysis, Conceptualization. Champion N. Nyoni: Writing – review & editing, Writing – original draft, Validation, Supervision, Methodology, Investigation, Formal analysis, Conceptualization. Lizemari Hugo: Writing – review & editing, Writing – original draft, Validation, Supervision, Methodology, Investigation, Formal analysis, Conceptualization.
Declaration of Competing Interest
The authors declare that they have no competing interests. The authors declare that the present manuscript is not being considered for publication elsewhere and has not been published before.
Acknowledgements
The authors would like to thank the panel of experts for their commitment to participating in the modified e-Delphi process until the last round, and Prof Ruth Albertyn for her invaluable critical reading of the article.
| DOMAINS | MICRO LEVEL | MESO LEVEL | MACRO LEVEL |
| 1. Curriculum, academic planning and moderation | 1.1.1 Educators as moderators; 1.1.2 Educators as assessors; 1.1.3 Educators as trainers | 1.2.1 Yearly master plan reflecting assessment schedule per module; 1.2.2 Moderation of all assessments; 1.2.3 Appropriate and cost-effective assessment method; 1.2.4 Competency-based curriculum; 1.2.5 Student-centred learning; 1.2.6 Physical resources for assessment; 1.2.7 Infrastructure for e-assessment; 1.2.8 Cybersecurity related to institutional assessments | 1.3.1 Institutional/independent assessment practice; 1.3.2 National standardised infrastructure for assessment |
| 2. Faculty capacity and development | 2.1.1 Capacity development on assessment and moderation; 2.1.2 Capacity development on evidence-based assessment and research in assessment | 2.2.1 Educator performance appraisal; 2.2.2 Educator essential assessment competencies; 2.2.3 Monitoring maintenance of educator essential assessment competencies; 2.2.4 Monitor research output in assessment | 2.3.1 National strategy for faculty development on assessment; 2.3.2 Continuing professional development in assessment; 2.3.3 National nurse educator faculty development coordination |
| 3. Stakeholders | | 3.2.1 Involve students in assessment evaluation; 3.2.2 Internal mechanism/committees for monitoring assessment | 3.3.1 Professional authorities/bodies that monitor assessment; 3.3.2 Clear terms of reference for organisation boards regarding institutional assessment strategy; 3.3.3 External mechanism/committees for monitoring assessment |
| 4. Evidence-based practice | 4.1.1 Evidence-based assessment practices | 4.2.1 Periodic review of assessment practice; 4.2.2 Adoption of innovative evidence-based assessment practices | 4.3.1 Partnership with external institutions/bodies on research in assessment |
| 5. Assessment methods, techniques, blueprints | 5.1.1 Logbooks, checklists; 5.1.2 Training of supplementary human resources on assessment | 5.2.1 Blueprint of all learning outcomes; 5.2.2 Relevant and updated taxonomy for assessment; 5.2.3 Assessment policy aligned with the curriculum model and assessment design principles | |
| 6. Organisation and finances | 6.1.1 Capacity development for administrative staff on budgeting for assessment | 6.2.1 Budget for assessments; 6.2.2 Budget for capacity development in assessment; 6.2.3 Budget for technology-related infrastructure for assessment; 6.2.4 Institutional meetings and planning for assessments | |
| Country of practice | Highest qualification | Years of experience in education and assessment |
| Malawi | PhD | 20 years |
| Kenya | PhD | 8 years |
| Thailand | Masters | 35 years |
| Rwanda | PhD | 18 years |
| South Africa | PhD | 34 years |
| South Africa | PhD | 27 years |
| Kenya | PhD | 10 years |
| South Africa | PhD | 37 years |
| Lesotho | PhD | 20 years |
| Item | Statement | Round one consensus | Round two consensus |
| 1. | Curriculum, academic planning and moderation | 62.5 % | 100 % |
| 1.1 | Micro level | | |
| 1.1.1 | Educators as moderators | 100 % | |
| 1.1.2 | Educators as assessors | 100 % | |
| 1.1.3 | Educators as trainers | 75 % | |
| 1.2 | Meso level | | |
| 1.2.1 | Yearly master plan reflecting assessment schedule per module | 100 % | |
| 1.2.2 | Moderation of all assessments | 87 % | |
| 1.2.3 | Appropriate and cost-effective assessment method | 100 % | |
| 1.2.4 | Competency-based curriculum | 100 % | |
| 1.2.5 | Student-centred learning | 100 % | |
| 1.2.6 | Physical resources for assessment | 87.5 % | |
| 1.2.7 | Infrastructure for e-assessment | 87.5 % | |
| 1.2.8 | Cybersecurity related to institutional assessments | 100 % | |
| 1.3 | Macro level | | |
| 1.3.1 | Institutional/independent assessment practice | 100 % | |
| 1.3.2 | National standardised infrastructure for assessment | Missing | 87.5 % |
| 2. | Faculty capacity and development | 75 % | |
| 2.1 | Micro level | | |
| 2.1.1 | Capacity development on assessment and moderation | Missing | 87.5 % |
| 2.1.2 | Capacity development on evidence-based assessment and research in assessment | 87.5 % | |
| 2.2 | Meso level | | |
| 2.2.1 | Educator performance appraisal | 75 % | |
| 2.2.2 | Educator essential assessment competencies | 87.5 % | |
| 2.2.3 | Monitoring maintenance of educator essential assessment competencies | 87.5 % | |
| 2.2.4 | Monitoring research output in assessment | 87.5 % | |
| 2.3 | Macro level | | |
| 2.3.1 | National strategy for faculty development on assessment | 62.5 % | 100 % |
| 2.3.2 | Continuing professional development in assessment | 75 % | |
| 2.3.3 | National nurse educator faculty development coordination | 75 % | |
| 3. | Stakeholders | 75 % | |
| 3.2 | Meso level | | |
| 3.2.1 | Involve students in assessment evaluation | 87.5 % | |
| 3.2.2 | Internal mechanism/committees for monitoring assessment | 87.5 % | |
| 3.3 | Macro level | | |
| 3.3.1 | Professional authorities/bodies that monitor assessment | 87.5 % | |
| 3.3.2 | Clear terms of reference for organisation boards regarding institutional assessment strategy | 75 % | |
| 3.3.3 | External mechanism/committees for monitoring assessment | 75 % | |
| 4. | Evidence-based practice | 87 % | |
| 4.1 | Micro level | | |
| 4.1.1 | Evidence-based assessment practices | 100 % | |
| 4.2 | Meso level | | |
| 4.2.1 | Periodic review of assessment practice | 87 % | |
| 4.2.2 | Adoption of innovative evidence-based assessment practices | 87.5 % | |
| 4.3 | Macro level | | |
| 4.3.1 | Partnership with external institutions/bodies on research in assessment | 87.5 % | |
| 5. | Assessment methods, techniques, blueprints | 62.5 % | 100 % |
| 5.1 | Micro level | | |
| 5.1.1 | Logbooks, checklists | Missing | 100 % |
| 5.1.2 | Training of supplementary human resources on assessment | Missing | 100 % |
| 5.2 | Meso level | | |
| 5.2.1 | Blueprint of all learning outcomes | 87.5 % | |
| 5.2.2 | Relevant and updated taxonomy for assessment | 75 % | |
| 5.2.3 | Assessment policy aligned with the curriculum model and assessment design principles | 0 % | Taken out |
| 6. | Organisation and finances | 87.5 % | |
| 6.1 | Micro level | | |
| 6.1.1 | Capacity development for administrative staff on budgeting for assessment | 75 % | |
| 6.2 | Meso level | | |
| 6.2.1 | Budget for assessments | 87.5 % | |
| 6.2.2 | Budget for capacity development in assessment | 62.5 % | 87.5 % |
| 6.2.3 | Budget for technology-related infrastructure for assessment | 62.5 % | 87.5 % |
| 6.2.4 | Institutional meetings and planning for assessments | 62.5 % | 75 % |
©2024. The Authors