Aim
To evaluate the early impact of the Australian National Placement Evaluation Centre (NPEC).
Background
The educational quality of nursing and midwifery student clinical placement learning varies. The National Placement Evaluation Centre has a pivotal role in quality improvement by systematically collecting student and supervisor feedback on placement experiences.
Design
Multimodal impact evaluation.
Methods
Data were collected from website analytics and a survey shared via stakeholder registration emails and social media. The Impact Management Planning and Evaluation Ladder model was used to understand impact. Data were analysed using a sequential, hybrid inductive/deductive approach. Descriptive statistics summarised participant characteristics and qualitative data were themed.
Results
Website analytics revealed Centre visitors were from 92 countries, with most reviewing placement evaluation tools. Registered NPEC members represented 37 Australian education providers and 590 placement providers, and together they accessed their data 6213 times. There were 107 survey responses. Thematic analysis revealed four themes describing user experiences, enhancements to placements and recommendations for system improvements: 1) A valued system, 2) Barriers to engagement, 3) Resulting changes and 4) It’s early days! The deductive analysis revealed far-reaching national impact.
Conclusions
The Australian National Placement Evaluation Centre’s impact is amplified by its nationally consistent reach. Data collection via validated placement evaluation tools and streamlined data reporting informs improvements to the quality of nursing and midwifery clinical placements. Increased promotion of the NPEC to students is needed. The Centre has the potential to enhance clinical placement quality across national and international healthcare professions. Future directions are reported.
1 Introduction
Worldwide, clinical placement experiences are a requirement for nursing and midwifery students to integrate classroom learning with clinical practice (Moroney et al., 2022). Yet many reported challenges affect the quality of students’ clinical learning. Research highlights that, across multiple clinical settings, nursing and midwifery students report variations in supervisor and staff support, in the nature of the clinical learning environment and in student preparation for learning (Bogossian et al., 2025b; Cant et al., 2021a; 2021b; Hyun et al., 2024; Rodríguez-García et al., 2021; Ryan et al., 2024). For more than a decade, reports from the United Kingdom (Francis, 2013; Palmer et al., 2024), Australia (Darcy Associates, 2016; Schwartz, 2022) and the World Health Organization (World Health Organization, 2021) have called for national approaches that investigate the educational quality of nursing and midwifery practice education.
To our knowledge only two national systems are available for nursing and midwifery: the Quality Management of the Practice Learning Environment (QMPLE) in Scotland and the Australian National Placement Evaluation Centre.
2 Background
In Australia in 2018, the Council of Deans of Nursing and Midwifery (Australia and New Zealand) (CDNM) commissioned the first stages of a national project that aimed to monitor and report on the educational quality of degree-level nursing and midwifery students’ placements. This work was based on a co-design project model, the Model for Improvement, and its embedded action-oriented Plan, Do, Study, Act (PDSA) approach (Associates in Process Improvement, 2024). Education and placement provider stakeholders associated with clinical placements (Deans and their staff, students, clinical supervisors, health service managers and their placement provider staff) contributed to the design of the National Placement Evaluation Centre (NPEC). These stakeholders were represented across three groups: a monthly core NPEC team meeting and quarterly meetings of the NPEC end user and advisory groups. Meetings focus on generating ideas, gathering user feedback and testing system prototypes. The NPEC uses evidence-based and validated evaluation tools to survey nursing and midwifery students’ and their supervisors’ learning experiences following clinical placements (Cooper et al., 2020a; Bogossian et al., 2025a). The Placement Evaluation Tool (PET) is an evidence-based tool that was developed and validated in 2020 (Cooper et al., 2020a; 2020b).
PET development was informed by a literature review that identified 10 existing tools, which we judged to be lacking and too lengthy (Cooper et al., 2020a). However, comparative analysis indicated that the PET had strong concurrent validity with the Clinical Learning Environment and Supervision Scale (r = .834) (Cooper et al., 2020b). Briefly, the PET comprises 19 items that assess two factors: the Clinical Learning Environment (8 items) and Learning Support (11 items). A 20th item rates overall educational satisfaction (1 = low, 10 = high), and a free-text comment box is included for additional feedback. There are three versions of the PET. PET-Nursing was validated in a trial with over 1200 nursing students across Australia, demonstrating strong validity, reliability and ease of use (Luders et al., 2021). PET-Midwifery was contextually adapted and validated using a three-phase, sequential study design involving 248 midwifery students in seven Australian universities, representing various years of entry-level midwifery courses (Bogossian et al., 2025a). There was strong concurrent validity between the PET-Midwifery and the MidSTEP CLE scale (r = 0.503, p = 0.01) (Bogossian et al., 2025a). PET-Supervisor, whilst contextually adapted, has not yet undergone further validity testing.
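For readers unfamiliar with how such subscale scores and concurrent validity coefficients are typically derived, the following is a minimal, illustrative sketch rather than the validation studies’ actual analysis code; the file name and column labels (item_1…item_19, satisfaction, cles_total) are hypothetical, and items are assumed to be summed per factor.

```python
# Illustrative sketch only: hypothetical column names, not the NPEC/PET analysis pipeline.
import pandas as pd
from scipy.stats import pearsonr

# Assume one row per student, with the 19 PET items plus a comparator scale total
# (e.g. a CLES-type total) available for a concurrent validity check.
responses = pd.read_csv("pet_responses.csv")  # hypothetical export

CLE_ITEMS = [f"item_{i}" for i in range(1, 9)]       # Clinical Learning Environment (8 items)
SUPPORT_ITEMS = [f"item_{i}" for i in range(9, 20)]  # Learning Support (11 items)

responses["cle_score"] = responses[CLE_ITEMS].sum(axis=1)
responses["support_score"] = responses[SUPPORT_ITEMS].sum(axis=1)
responses["pet_total"] = responses["cle_score"] + responses["support_score"]

# Item 20: overall educational satisfaction, rated 1 (low) to 10 (high).
print(responses[["pet_total", "satisfaction"]].describe())

# Concurrent validity: Pearson correlation between the PET total and the comparator total.
r, p = pearsonr(responses["pet_total"], responses["cles_total"])
print(f"Concurrent validity r = {r:.3f}, p = {p:.3f}")
```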
To collect placement evaluations, the core NPEC team recruited education provider staff to register with the NPEC, to disseminate the PETs and to enable secure access to evaluation data. PETs are shared via QR codes and weblinks. Placement provider staff were invited to register to access data related only to their institutions. Downloadable, system-generated automated reports and raw datasets are available (National Placement Evaluation Centre [NPEC], 2025). Recruitment was supported through a national online seminar delivered in 2022 to raise the NPEC profile, and the NPEC team purposively emailed key stakeholders, with existing registrants encouraged to share information about the Centre (personal correspondence, S. Cooper, funders project report, 2022).
By December 2024, 37 (97 %) of the 38 education providers that offer entry-to-practice nursing courses and 17 (74 %) of the 23 midwifery education providers were registered, as well as staff representing more than 590 placement providers. Three NPEC project team members, funded by Health Education Services Australia, compiled and reported the national dataset in 2023 and 2024. Further, the NPEC development and available data are reported via peer-reviewed journal articles and national and international conference presentations (National Placement Evaluation Centre [NPEC], 2025).
3 Study design
The NPEC is a co-designed quality improvement project. In 2024, it was considered timely to conduct a formal impact evaluation of the NPEC project. This evaluation included wider stakeholder experiences of the NPEC website and resources, aiming to understand how stakeholders engage with and use the PET evaluation data. Impact studies are important to identify project successes and reach and to inform improvements, replication and future expansion (Clarke et al., 2019). This study adheres to the Standards for Quality Improvement Reporting Excellence (SQUIRE) 2.0 reporting guidelines (Ogrinc et al., 2015).
4 Aim
This study aims to describe the impact of the NPEC as a national approach to collecting degree level nursing and midwifery student and supervisor feedback about the quality of clinical placements.
5 Methods
Impact and evaluation study methods can differ widely. Project impact can be measured at multiple levels: project team members (e.g., NPEC core team, end user and advisory groups), organisations involved in the project (education and placement providers) and organisations unrelated to the project or the team (Hinton, 2014).
Louder et al. (2021) suggest researchers choose appropriate methods of measurement, focusing on project contributions to knowledge and influence on policy, practice and solving stakeholder concerns. This study utilised multiple information sources to explore the impact of the NPEC as an educational project, based on the Impact Management Planning and Evaluation Ladder (IMPEL) model (Hinton, 2014). This model is a fitting choice as we seek to answer the research questions:
1. What are the experiences of Australian nursing and midwifery stakeholders of a national approach to collecting student and supervisor feedback on clinical placements?
2. How has a national approach to collecting student and supervisor feedback informed changes to nursing and midwifery clinical placements?
5.1 Survey participants, settings and recruitment
Participants were stakeholders (NPEC core team, end user and advisory groups and education and placement providers). The lead author also shared the recruitment invitation in member-only nursing and midwifery social media groups, seeking a response from wider stakeholder groups with experience of the NPEC or the PET.
The survey was offered only digitally. An emailed invitation to complete the survey included a QR code and weblink, plus participant information sheets. The invitation was emailed and posted four times to the aforementioned groups. A consent acknowledgement formed part of the survey.
5.2 Data sources
Data sources were NPEC website analytics, emails to the NPEC support inbox and a purpose-developed survey. These data enabled integration of several foci describing NPEC impact.
5.2.1 Website analytics
Details of site visits and visitor interactions with the available resources, registrant access and downloads of institutional datasets and the average time spent on the site were summarised from Jan 1st, 2023, to May 30th, 2025. E-mail messages from NPEC users to the website support inbox were reviewed for information that related to the study aims. Project publication citation metrics were extracted from Google Scholar. Key findings in the annual summary reports (freely available on the website) were reviewed.
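As an illustration only, and not a description of the NPEC’s actual analytics platform, page-level metrics of the kind summarised in Table 1 could be aggregated from a raw, visit-level export along the following lines; the file name and column names (visit_date, page, visit_id, completed, seconds_on_page, country) are hypothetical.

```python
# Illustrative sketch only: hypothetical analytics export, not the NPEC's reporting code.
import pandas as pd

visits = pd.read_csv("site_visits_export.csv", parse_dates=["visit_date"])  # hypothetical export

# Restrict to the evaluation window (January 1st 2023 to May 30th 2025).
window = visits[(visits["visit_date"] >= "2023-01-01") & (visits["visit_date"] <= "2025-05-30")]

# Summarise visits, completed visits and average time on page, per page.
summary = (
    window.groupby("page")
    .agg(
        visits=("visit_id", "count"),
        completed_visits=("completed", "sum"),
        avg_seconds=("seconds_on_page", "mean"),
    )
    .sort_values("visits", ascending=False)
)
print(summary)

# Countries of origin and the share of Australian traffic.
print(window["country"].nunique(), "countries")
print((window["country"] == "Australia").mean())
```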
5.2.2 Stakeholder experience survey
A survey housed in Qualtrics was purpose-developed by all authors, considering survey design principles: 1) participant target groups; 2) question writing and development; 3) pre-testing and pilot testing; and 4) survey administration (Dillman et al., 2014; Pagano et al., 2020). The resultant questions were designed to make sense to readers; for example, role titles were derived from the NPEC registration legend and demographic questions were drop-down options with additional free-text response options. The author team (who are expert NPEC/PET users) undertook face validity testing. Survey questions are available in supplementary data file 1.
5.3 Ethical clearance
Student and supervisor evaluations collected by the NPEC have Human Research Ethics Committee exemption as a quality improvement project (Approval number 2021/373). For this in-depth impact study, which used additional data collection methods, the lead author gained ethical approval from their institution (HREC 25313).
5.4 Data analysis
Participant characteristics were summarised using descriptive statistics. An inductive/deductive sequential approach is useful for evaluation studies, increasing rigour as participants’ comments are prioritised in the analysis rather than coding directly to pre-determined categories ( Proudfoot, 2023). Inductive analysis approaches seek to allow codes and themes to emerge from participants’ subjective experiences to illuminate deeper insights and contribute new understandings ( Thomas, 2006). In contrast, deductive analysis uses pre-determined codes and is often used in evaluation and impact research ( Delve and Limpaecher, 2024).
The first two authors independently and manually applied Thomas’ (2006) general inductive three-step approach to: 1) reveal meanings emerging from the data; 2) arrange the meanings as patterns and themes to assist in responding to the study aim; and 3) describe the themes to answer the research questions. The analysts designed a theming table with four columns: emergent themes; codes; comments; and notes/thoughts. When complete, the first author merged the two theming tables into one.
An online discussion (90 min) conducted with a third researcher finalised the theming. Four themes emerged: 1) A valued system, 2) Barriers to engagement, 3) Resulting changes and 4) It’s early days! These are supported by seven subthemes. The deductive analysis was completed during the same online meeting. All authors agreed on the final themes and the deductive coding (Tables 3 and 4).
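The thematic analysis itself was conducted manually. Purely as an illustration of the tabulation steps described above (descriptive statistics of participant characteristics and merging of the two analysts’ theming tables), the following sketch assumes simple CSV exports; the file names and column labels (professional_role, profession, emergent_theme, code) are hypothetical.

```python
# Illustrative sketch only: hypothetical exports, not the authors' actual working files.
import pandas as pd

survey = pd.read_csv("survey_export.csv")  # hypothetical Qualtrics export

# Descriptive statistics for participant characteristics (role, profession).
print(survey["professional_role"].value_counts())
print((survey["profession"].value_counts(normalize=True) * 100).round(1))

# Each analyst's theming table has four columns:
# emergent theme, code, comment, notes/thoughts.
analyst_1 = pd.read_csv("theming_table_analyst1.csv")
analyst_2 = pd.read_csv("theming_table_analyst2.csv")

# Merge the two tables into one, tagging the source analyst,
# ready for the joint discussion that finalised the themes.
merged = pd.concat(
    [analyst_1.assign(analyst="A1"), analyst_2.assign(analyst="A2")],
    ignore_index=True,
)
print(merged.groupby("emergent_theme")["code"].nunique())
```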
6 Results
6.1 Website analytics
The website analytics showed close to 190 000 visits to the website.
Visitors to the site originated from 92 unique countries with most from Australia (98.5 %). Of these, 50 countries recorded a single episode of engagement such as downloading the tool or clicking on a site page. The top ten countries of origin after Australia are shown in
6.2 Stakeholder experience survey
There were 107 responses to the survey; 86 % identified as nurses (n = 92), 13 % (n = 14) were midwives and one student responded.
Just over half of respondents (55 %, n = 59) first engaged with the NPEC during or prior to 2023, when national stakeholder onboarding commenced. Of the 107 responses received, 72 (67.3 %) provided detailed comments, with the remaining third (n = 35) completing only the personal characteristic questions.
6.3 Thematic analysis
Inductive/deductive analysis methods can be reported as integrated or standalone results (Proudfoot, 2023). We chose to report the two separately, as this can offer greater insight into user experiences (Bonner et al., 2021).
The first theme, A valued system, describes participants’ positive experiences of engaging with the NPEC website, data repository and the placement evaluation tools (PETs). Automated system email alerts, received when students rate placement experiences poorly, were considered beneficial. The responses fell into two subthemes: valued features and valued data. There were 34 responses to the survey item asking about the most useful resources. Access to downloadable datasets was by far the most frequently mentioned (n = 16); see Supplementary data file 2. Yet a common reflection was the value of the automated reports:
‘ My experience engaging with NPEC has been generally positive. The website functionality is user-friendly and I’ve found it easy to navigate and access the information I need. The automated data reports are particularly useful, as they provide clear insights that help streamline my workflow. I also appreciate the CSV download feature, as it allows for quick extraction of data for further analysis’ (Nurse, Faculty R11).
Most sentiments from both faculty and placement providers confirmed the value of the data:
‘NPEC is a valuable tool to centralise reporting and benchmark placement quality standards’ (Nurse, Faculty R12) and ‘Works well… questions are relevant, automated data reports are helpful as are the CSV downloads’ (Nurse, Clinical Supervisor R43).
Shared challenges formed the second theme, Barriers to engagement. Described under two subthemes, Response rates and Website and data functionality, this theme captures the challenges experienced and the need for future development and site improvements. One common barrier was low student and supervisor response rates:
‘ We are currently having discussions about returning to a local evaluation tool because the response rate is significantly reduced to when we had previously had a local feedback tool’ (Nurse, Placement Provider R103).
Nurses and midwives from faculty and placement providers stated that the PETs are competing with existing surveys:
‘We also collect data internally…this data supports us organisation wide, not just nursing and midwifery students’ (Nurse, Health Service Manager R77) .
‘I believe the MidStep tool is more suitable for midwifery students’ (Midwife, Faculty and Head of Programme R66), a view also confirmed by (Midwife, Faculty R3) and (Midwife, Placement Provider/Faculty joint role R67).
The following comment highlights concerns regarding enquiry response times from NPEC staff and serves as a reminder that stakeholder support is essential:
‘ Communication has been poor (from NPEC) including response time to email queries and a communication strategy to engage stakeholder groups’ (Nurse, Health Service Manager R88).
Website and data functionality were sometimes challenging. Participants’ experiences highlighted difficulties and provided direction for future system improvements:
‘ The CSV file is hard to filter and convert to reports’ (Nurse Faculty R18) and from another:
‘I find the website clunky and the CSV downloads while manageable with small amounts of data could be integrated more easily …to produce more usable results’ (Nurse, Faculty R7).
Some participants sought education resources from the website:
‘Minimal information available on site. Site does not have a lot to offer in way of education or support’ (Nurse, Clinical Supervisor R22).
There were 20 responses to the survey item asking about the least useful resources. The need for users to clean downloaded datasets was mentioned (n = 8). Some felt survey questions were too broad or did not suit organisational needs (n = 6). The remaining responses (n = 6) confirmed the subthemes reported in this theme. Supplementary data file 3 tabulates all responses.
The third and largest theme, Resulting changes, highlights the importance of the NPEC for influencing practice and procedures. Emerging trends indicating sector-wide changes fell into two subthemes: targeting students’ and supervisors’ experiences and approaching stakeholder relationships differently.
Supporting data are also reported in the deductive analysis (see Table 4). Changes to clinical supervisor education and student and supervisor support were often mentioned:
‘We have introduced an internal model for clinical placements…NPEC is a valuable way of evaluating how well our model suits and supports students’ (Nurse, Health Service Manager R72).
This nurse clinical supervisor reported actioning student feedback:
‘Where there have been concerns from students, we have tried to improve our practice to take these concerns into consideration…’ (Nurse, Clinical Supervisor R31).
Stakeholder relations are key to enhancing placement experiences. Participants appreciated the Centre for assisting in approaching stakeholder relationships differently:
‘NPEC assists us with stakeholder engagement. We find that the content relates mainly to the preparation of students and acceptance of students on the ward environments - an area we don't have much control over as an EP [education provider]’ (Nurse Faculty R10).
The fourth and final theme, It’s early days!, situates the NPEC as a developing system with potential for greater future impact. The subtheme project enhancements describes participants’ suggestions for project sustainability and wider stakeholder involvement. A common view was to consider increasing student and others’ participation:
‘ If healthcare can provide the link when the students are ending placement, I feel we would have a greater volume of responses’ (Nurse Clinical Supervisor R47).
‘… make it available for all health services … then management could require staff to strategise how they could improve their rankings/ratings’ (Midwife Faculty Placement Team R81).
Responses included widening participation to students studying for a vocational qualification such as the Enrolled Nurse qualification: ‘…any updates on ENs being included?’ (Nurse Placement Provider Team R95).
The issue of students’ survey fatigue was raised:
‘To avoid survey fatigue it would be great if the NPEC survey could link into [redacted, State based quality assurance systems] as in a 1 stop shop survey’ (Nurse Health Service Manager R80).
Several participants indicated commitment to the NPEC as summed here:
‘I look forward to NPEC continuing to grow and develop as it is an important and valuable resource’ (Dean, Faculty R56).
6.4 Deductive analysis
Table 4 presents the results of the deductive coding exercise, illustrating the NPEC’s impact and influence aligned with the IMPEL model’s pre-determined codes. The thematic analysis results are included for reference. The NPEC reach extends across all seven areas of the model, showing immediate to widespread influence.
7 Discussion
This paper is unique in that it reports nursing and midwifery stakeholder experiences of the Australian National Placement Evaluation Centre (NPEC), which offers a consistent national approach to collecting nursing and midwifery student and supervisor clinical placement evaluations. A descriptive study design using a hybrid inductive-deductive analysis answers the research questions: 1) What are the experiences of Australian stakeholders of a national approach to collecting student and supervisor feedback on clinical placements? and 2) How has a national approach to collecting student and supervisor feedback informed changes to nursing and midwifery clinical placements?
7.1 Impact on stakeholders
Widespread engagement with the NPEC is evident, with 37 of 38 nursing education providers and 17 of 23 midwifery education providers participating. Stakeholders valued the system’s consistency and the ease of accessing datasets via the website; however, challenges emerged. In some institutions, low student response rates and the requirement that seven or more completed PET responses must be received before institutional data can be accessed (to maintain student anonymity) potentially discourage participation. There were some comments relating to slow responses to stakeholder enquiries. Fostering stakeholder participation, website engagement and retention is important, and thus the NPEC staff provide continued stakeholder instruction and support to users regarding NPEC processes, data reporting and access functions (Belew and Elad, 2024). Despite some shortcomings, student and supervisor feedback has informed quality improvement initiatives. The deductive analysis confirms the NPEC’s reach amongst Australian nursing and midwifery education and placement providers, with early indications that the Centre’s systems and resources have influence across healthcare professions. Examples of changes to student and supervisor preparation for placement, including enhancing student document collection and supervisor professional development, are tabulated in Table 4, Sections 4 and 5.
7.2 Spreading the word
Two annual reports and nearly 20 peer-reviewed publications and conference presentations with multiple citations (n = 75) evidence the project’s dissemination and scholarly impact ( National Placement Evaluation Centre NPEC, 2025). PET tools are translated into six languages, supporting international accessibility. Website analytics show close to 190 000 visits with more than 45 000 completed PET evaluations and NPEC registered members making more than 6 000 visits to the data pages ( National Placement Evaluation Centre NPEC, 2025). Website visitors originated from 92 countries, suggesting early global interest, though most traffic remains Australian. To strengthen this momentum, the project team should continue leveraging scholarly outputs, translated tools and digital engagement strategies. Enhancing the visibility of success stories and supporting champions in the sector will be critical to advancing from awareness to broader systemic adoption.
7.3 Opportunistic and systemic adoption
Wider adoption may have already occurred outside of nursing and midwifery, given the enquiries for interdisciplinary and international tool access received by the Centre support inbox, as reported in Table 4. This, together with evidence that the PET is competing with state-based and discipline-specific tools, means data are siloed outside the national NPEC database. This could be a great loss, impacting positive change and future Centre expansion to improve clinical learning and teaching across nursing and midwifery and, more broadly, other healthcare professions (Clarke, 2019; Cooper et al., 2020b). Subsequent evaluations should strive to understand the differences between the PETs and available evidence-based alternative tools. Integrating the NPEC data with other national quality improvement systems was suggested. Such initiatives, if shared widely, could inform contemporary best practices and complement sector initiatives to attract students and retain registered nurses and midwives in the workforce (World Health Organization, 2021).
Results indicate that targeted recruitment of placement providers to distribute the PETs might further strengthen stakeholder partnerships and increase student and supervisor response rates. Recently, Mak et al. (2024) urged education providers to work harder to improve relations with nursing clinical placement partners, particularly when it comes to implementing quality improvement (QI) initiatives. Encouraging students to complete PETs needs more attention. Standardised approaches provided by the NPEC could be one solution to introducing students to the importance of QI generally and to participating in QI initiatives more specifically (Mak et al., 2024). Producing targeted short videos has successfully introduced clinical supervisors to new concepts and could support a standardised approach to widespread engagement with the NPEC and completion of supervisor and student PETs (Ryan et al., 2023). Increased student completion of evaluations may also increase responses to follow-up impact evaluation studies, which could prove valuable for improving tool and system features.
Well-known and persistent challenges in preparing students and clinical supervisors for successful clinical placements have long been reported (Cant et al., 2021a; Kearney et al., 2025; Ryan and McAllister, 2019). Clinical placements are a key contributor to students’ satisfaction with higher education studies and their future employment decisions (Anyango et al., 2024; Rodríguez-García et al., 2021). Regarding clinical supervision, positive student-teacher relations are a key factor in motivation and students’ satisfaction with clinical learning (Rowland and Trueman, 2024; Ryan and McAllister, 2019). Our results extend the extant literature, confirming these issues through a nationally consistent approach to evaluating clinical placements. More importantly, our results show key stakeholders are working collectively, using student and supervisor feedback to design and implement novel solutions that improve the educational quality of clinical placement learning. The theme “It’s early days!” reflects the formative stage of NPEC implementation. Conducting this impact study at an early phase provides a foundation for iterative improvements and long-term sustainability (Papageorgiou et al., 2021). To maximise the NPEC’s impact, we recommend: enhanced user education through multimedia resources; standardised stakeholder engagement strategies to improve PET response rates; streamlined administrative support to reduce engagement barriers; strategic integration with other QI systems to broaden influence; and repeat studies such as this, with continuous NPEC evaluations, to inform adaptive project development.
7.4 Strengths and limitations
As with any study, this study has strengths and limitations. The authors are nursing and midwifery researchers who formed a project advisory team, which could be both a limitation, regarding bias in data analysis, and a strength, as we have the experience and depth to interpret the data. To enhance rigour, we combined inductive and deductive coding, the latter known for returning higher inter-rater reliability through the use of preset codes. By using multiple data sources, combining quantitative survey data with qualitative feedback, the researchers provide a holistic view of the NPEC's impact and a nuanced understanding of the project’s effectiveness and gaps.
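The study did not report a formal agreement statistic. Purely as an illustration, agreement between two coders applying a preset codebook (for example, the IMPEL-aligned codes) could be quantified with Cohen’s kappa; the code labels and assignments below are hypothetical.

```python
# Illustrative only: the study did not report a kappa statistic; data are invented.
from sklearn.metrics import cohen_kappa_score

# Hypothetical deductive codes assigned by two analysts to the same ten comments.
coder_1 = ["valued", "barrier", "change", "valued", "barrier",
           "change", "valued", "early", "barrier", "change"]
coder_2 = ["valued", "barrier", "change", "barrier", "barrier",
           "change", "valued", "early", "valued", "change"]

kappa = cohen_kappa_score(coder_1, coder_2)
print(f"Cohen's kappa = {kappa:.2f}")  # 1.0 = perfect agreement, 0 = chance-level agreement
```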
There were limited responses from midwifery stakeholders and from nursing and midwifery students. The survey was concise, and personalised email invitations and reminders were sent four times, consistent with best practice in online survey design and implementation (Dillman et al., 2014). Despite these limitations, this impact evaluation comprehensively answers the research questions. Future impact study designs for this project will consider interview and/or focus group methods and greater efforts to recruit midwives, midwifery stakeholders and students.
Conclusion
This study provides valuable insights into the impact of the Australian National Placement Evaluation Centre (NPEC), a national quality improvement project aimed at collecting evaluations of nursing and midwifery clinical placements to improve overall educational quality. The NPEC is valued and welcomed for its nationally consistent approach to collecting student and supervisor feedback via the three evidence-based placement evaluation tools (PETs). Stakeholders shared innovative interventions, informed by student and supervisor feedback, initiated to enhance the educational quality of nursing and midwifery student clinical placements. Stakeholder engagement with the Centre is thus both meaningful and impactful. Future system improvements are suggested. There is a need to promote the Centre to students so they understand the value and impact of providing their feedback. Widening participation could extend the project impact to further enhance the quality of clinical placements by advising consistent national approaches to supervisor development and the preparation of students for placement learning experiences. Further, the NPEC model has the potential to advance clinical placement quality across healthcare professions, both nationally and internationally.
Funding
The NPEC is funded by Higher Education Services Australia (HESA) under the auspices of the Australian Nursing and Midwifery Accreditation Council.
CRediT authorship contribution statement
Lynda Hughes: Writing – review & editing. Fiona Bogossian: Writing – review & editing, Methodology, Conceptualization. Simon Cooper: Writing – review & editing, Methodology, Funding acquisition, Conceptualization. Colleen Ryan: Writing – review & editing, Writing – original draft, Project administration, Methodology, Investigation, Formal analysis, Data curation, Conceptualization. Robyn Cant: Writing – review & editing, Writing – original draft, Investigation, Formal analysis, Data curation, Conceptualization. Areum Hyun: Writing – review & editing, Investigation, Formal analysis, Data curation. Debbie Procter: Writing – review & editing, Writing – original draft, Formal analysis, Data curation. Louise Alexander: Writing – review & editing. Dianne Bloxsome: Writing – review & editing.
Declaration of Competing Interest
Authors have nothing to declare.
The authors acknowledge:
Mr Ruben HOPMANS, School of Nursing and Midwifery, Monash University, Melbourne, Australia for providing website data analytics.
Krysta Davis, student representative Griffith University for input into the survey development.
Higher Education Services Australia (HESA) under the auspices of the Australian Nursing and Midwifery Accreditation Council for providing financial support to the NPEC.
Appendix A Supporting information
Supplementary data associated with this article can be found in the online version at
Table 1
| | Page | Visits | Completed visits | Average time spent (min:sec) |
| 1 | PET-Nursing | 171 069 | 43 006 | 3:45 |
| 2 | PET-Midwifery | 8 524 | 1 906 | 3:01 |
| 3 | PET-Supervisor | 1 426 | 344 | 2:38 |
| 4 | Data review/access | 6 213 | 1 607 | 5:46 |
Table 2
Note: * some respondents nominated dual/multiple roles.
| Respondents’ professional roles | n |
| Clinical supervisor (placement provider) * | 30 |
| Clinical supervisor (faculty) * | 14 |
| Placement team (placement provider) | 18 |
| Placement team (faculty) * | 12 |
| Academic (faculty) * | 21 |
| Dean/Head of Discipline (faculty) | 10 |
| Health Services Manager (placement provider) | 10 |
| Head of Course/Programme (faculty) | 7 |
| Student | 1 |
| Other: nonstudent facing roles and members of NPEC advisory and user groups | 3 |
| Total | 126 |
Table 3
| A valued system | Barriers to engagement | Resulting changes | It’s early days! |
| (i) Valued features (ii) Valued data | (i) Response rates (ii) Website/data functionality | (i) Targeting student and supervisor experience (ii) Approaching stakeholder relationships differently | (i) Project enhancements |
Table 4
Note: * evidence taken May 30th, 2025, from the NPEC website; ** citation metrics as at May 30th, 2025; *** analytics summarised in Table 1.
| IMPEL model codes | Deductive results (website analytics, publications, emails, additional qualitative data) | Inductive themes |
| 1. Change to immediate project team members | • ‘On an individual level it has provided me with opportunities to work and engage in research with colleagues that I would not otherwise have been able to’ (Nurse Faculty R13). • ‘I have a better understanding of the value of early warm welcomes and messaging around expectations’ (Nurse Placement Provider R99). | Valued system; Resulting changes; It’s early days |
| 2. Influencing change to student clinical placement experiences | • 37/38 institutions offering degree level nursing and 17/23 offering midwifery courses are members.* • NPEC annual summaries report significant changes,* for example: PET-Nursing: lower educational quality (p < .001) and significantly lower satisfaction when nursing students were supervised by university employed educators (p < .001) (2024 report); specialty units return significantly higher satisfaction (2023 report). PET-Midwifery: dual degree (Bachelor of Midwifery and Bachelor of Nursing) students rated higher quality and satisfaction than other degree types (p < .001) (2024 report). | Valued system; Resulting changes; It’s early days |
| 3. Spreading the word (journal publications, PET translations, NPEC acknowledgements) | • Annual reports (n = 2).* • Publications directly related to the project and PET data (n = 9).* • Conference presentations related to the project or PET data (n = 9).* • Publication citations total (n = 75).** • PET-Nursing translations (n = 5) (Chilean, Chinese, Greek, Norwegian, Slovenian).* • PET-Midwifery translations (n = 1) (Turkish).* • Website analytics.*** | Valued system; Resulting changes; It’s early days |
| 4. Key stakeholder adoption and changes to the institutions affecting students | 1. NPEC annual summaries (n = 2).* 2. ‘most students in my cohort have no idea why they should complete the PET when they are unsure what the outcome of their contributions will be’ (Student R106). 3. ‘We now collaborate with educators and discuss methods of improving the quality of the supervision and in some instances, we have stopped using certain facilities based on feedback’ (Nurse Faculty R8). 4. ‘We now have a [central] Portal for compliance/mandatory training/orientation reducing students time [uploads to different sites] improves their experience’ (Nurse Faculty Placement Team R17). 5. ‘We are still working out how to implement changes based on NPEC data’ (Nurse Faculty Head of Programme R63). 6. ‘We have tried to implement clinical supervision workshops or basic educational qualifications and have been advised the health services provide their own education’ (Midwife, Faculty Placement Team R81). | Valued system; Barriers to engagement; Resulting changes |
| 5. Key stakeholder adoption influencing changes for all relevant students | • ‘Regular student debriefs and more consistent orientation for students’ (Midwife Faculty Head, R7). • ‘We have integrated the introduction of NPEC into our onboarding process for new health facilities’ (Nurse Faculty R6). • ‘The live nature of the data allows services to track their own areas of lower performance… led to the implementation of initiatives that could be associated with NPEC’ (Nurse Faculty R12). | Valued system; Barriers to engagement; Resulting changes; It’s early days |
| 6. Adoption beyond participating institutions influences change for the institution’s students | • Website analytics.*** • Paramedicine (correspondence to NPEC support, 2020) and Diploma of Nursing (correspondence to NPEC support, 2023, 2025) distribute modified PETs and store data in facility-designed databases. # • PET international language translations (n = 6).* | Valued system; Barriers to engagement; Resulting changes; It’s early days |
| 7. Adoption beyond participating institutions influences change for all relevant students Δ | • Site visits including international visitors.*** • PET translations (n = 6).* • Correspondence received to NPEC support email 2023–2025 from the Australian Council of Allied Health Professional Education Providers, student nurse researchers from the Philippines, New Zealand nursing and midwifery education providers and a clinical placement supervisor (medicine). • Interest from the United Kingdom regarding nursing student evaluations (personal correspondence to lead author, 2025). # • ‘…there has to be an independent role that has some governance over the health services to ensure responsibility for student experience and strategies are implemented AND monitored to show improvement’ (Midwife Faculty Placement Team R6). | Valued system; Resulting changes; It’s early days |