Background
Evaluating resident physicians’ competencies is critical in medical education to ensure high standards of patient care and professional development. Field notes are increasingly used as reflective tools in postgraduate medical education. Despite their growing use, skepticism about their effectiveness persists. This study aims to identify challenges in learner engagement with field notes and to gather suggestions for operational improvements.
Methods
A qualitative study was conducted in the Department of Family & Community Medicine at the University of Toronto. Semi-structured interviews were conducted with seven postgraduate year one and year two family medicine residents. The interviews focused on residents’ experiences and challenges with field notes. Data were analyzed using inductive thematic analysis to develop a comprehensive codebook, in alignment with Braun and Clarke’s framework.
Results
Several key challenges with the use of field notes were identified, including redundancy of feedback, sporadic utilization, and time constraints for preceptors. Residents also expressed uncertainty about the expectations for using the tool and described it as complex and cumbersome. Operational suggestions for improvement included the development of a mobile-friendly platform, streamlined functionality, a standardized and integrated feedback system, and clearer guidelines for use.
Conclusions
The study highlighted significant challenges in the use of field notes within family medicine training programs and underscored the need for technological and procedural innovations to improve their effectiveness. Addressing these challenges through user-friendly design, clear guidelines, and integrated support systems could transform field notes into a more robust tool for competency-based medical education, benefiting residents, preceptors, and the broader medical community.
Introduction
In medical education, evaluating resident physicians’ competencies is crucial for maintaining high standards of patient care and advancing professional skills. Competency-based assessments, particularly field notes, are increasingly utilized in postgraduate medical education across Canadian family medicine programs [1].
Field notes are structured documentation tools used to capture observations of clinical encounters and provide timely, specific feedback to learners [2]. In residency training, they serve as both assessment and teaching instruments. Unlike traditional summative evaluations that occur at the end of rotations, field notes are designed to be collected throughout training to document progress across various domains of competency [3]. These brief narrative assessments typically document the clinical scenario, observed strengths, and suggestions for improvement, and often include a rating scale to indicate the level of supervision required [4]. The intent is to create a longitudinal record of resident performance that demonstrates growth over time and highlights areas needing focused attention. Field notes can be initiated by either the resident or preceptor, may be completed on paper or electronically, and ideally are discussed face-to-face before being formalized in writing. They represent a cornerstone of workplace-based assessment within competency-based medical education frameworks, particularly in Canadian family medicine programs where they have been widely adopted [3].
Serving as a reflective tool, field notes enable both residents and preceptors to evaluate clinical performance. This process aids residents in making actionable improvements based on thorough feedback [5, 6]. However, despite their growing use, significant skepticism remains about their effectiveness. A study by Mathew et al. revealed that nearly half of the surveyed residents doubted the effectiveness of field notes [7]. This study aims to identify challenges in learner engagement with field notes and to gather operational suggestions to improve the tool. The findings will contribute to academic discussions and guide the ongoing implementation of field notes in postgraduate medical education programs.
Methods
Study design and setting
This qualitative study was conducted within the Department of Family & Community Medicine (DFCM) at the University of Toronto. By employing a qualitative design, we aimed to capture in-depth insights into the residents’ experiences with the field notes tool.
Data collection
Participants were recruited from the cohort of postgraduate year one (PGY-1) and two (PGY-2) family medicine residents currently enrolled in the program at the University of Toronto. Recruitment was conducted through email, with the aim of achieving diverse representation across residency year, gender, and prior experience with the field notes tool. Participation was voluntary, and recruitment continued until thematic saturation was reached. We conducted semi-structured interviews between December 2022 and August 2023, using a flexible yet targeted approach to explore residents’ views while adapting to the natural flow of conversation. The interview guide, developed to probe residents’ experiences with field notes and avenues for potential enhancements, served as a foundational script (Supplementary Material 1). In keeping with research adaptations necessitated by the COVID-19 pandemic, interviews were conducted via Zoom, which ensured the safety of all participants while maintaining the integrity of the data collection process. Interviews were audio-recorded with the consent of the participants and then de-identified to maintain confidentiality.
Data analysis
Transcripts from the interviews were analyzed using inductive thematic analysis to identify patterns and themes in the data. Two medical students at the University of Toronto (R.S.H., T.S.) independently reviewed the verbatim transcripts to pinpoint key words, phrases, and latent content. These preliminary findings were then discussed in a group setting to achieve consensus on a coding schema. Discrepancies in interpretations were addressed through constructive dialogue and, when necessary, arbitration by a third author (F.L.), who is a faculty member within the DFCM with experience supervising residents and completing field notes. A codebook was crafted to facilitate the systematic identification, organization, and reconciliation of themes both independently and as a collective. Through iterative group discussions, patterns were identified, categorized and developed into themes. To substantiate the identified themes, direct quotes from the transcripts were extracted. This process was conducted in alignment with Braun and Clarke’s framework for inductive thematic analysis (familiarization with the data, generating initial codes, searching for themes, reviewing themes, defining and naming themes, and producing the report) [8].
Ethical considerations
Ethical approval was obtained from the University of Toronto Research Ethics Board. Informed consent was acquired from all participants. This study was conducted in compliance with the Declaration of Helsinki.
Results
Participant characteristics
Seven family medicine residents participated in semi-structured interviews (Table 1).
[Table 1 omitted; see original publication]
Theme 1: residents acknowledge benefits of field notes
Residents identified several positive aspects of the field notes tool that enhanced their educational experience (Table 2). The accessibility of field notes was particularly valued, with residents appreciating how easily they could review their feedback throughout their training. One resident noted, “I liked how you have very easy access to all the field notes throughout residency. It wasn’t just like the latest one. I could very easily review any of the ones I wanted to” (P4). Participants also recognized the potential of field notes for providing regular feedback across diverse clinical scenarios, with one stating, “I like how you get regular feedback on a variety of cases using the tool” (P1).
[Table 2 omitted; see original publication]
The inclusion of standardized rating scales was another feature residents found beneficial. These scales helped them understand their performance level and supervision requirements in concrete terms. As one resident explained, “I think the rating scale is pretty standard and that’s good because you see where someone is in terms of how much supervision is required” (P3). Beyond these structural benefits, residents believed field notes contributed meaningfully to their professional development. One participant shared how field notes “helps with direction in terms of career… that was helpful in terms of deciding fellowship” (P2). Residents also valued the learning opportunities provided by field notes, appreciating how the tool created a structured approach to receiving constructive feedback. The formative nature of field notes reduced anxiety about performance, as one resident explained: “So even if it was a bad field note, like it literally doesn’t matter at the end of the day. So, there was never any embarrassment… It was always like taken as like this is a learning opportunity” (P5).
Theme 2: residents are critical of field notes, highlighting some limitations
Despite recognizing potential benefits, residents described several challenges that diminished the field note tool’s utility. Many considered it an additional administrative burden amidst their already demanding schedules. One resident remarked, “The process is just cumbersome in general. It’s like an extra thing people have to do and on top of patient care” (P5). This sense of burden was exacerbated when field notes seemed redundant with verbal feedback already received during clinical encounters. As one resident noted, “I don’t know if necessarily having them in the form of field notes actually contributed to the fact that that feedback helped me out” (P7). The sporadic use of field notes further diminished their effectiveness. Rather than serving as a consistent, ongoing feedback mechanism, residents reported that field notes were often completed in batches toward the end of rotation blocks. Time constraints emerged as a significant barrier, particularly for preceptors. As one resident observed, “Broadly speaking, time is a barrier to complete the field notes” (P1). These constraints often led to rushed feedback that lacked depth and specificity.
Uncertainty regarding expectations created additional stress for residents. One participant shared, “I think there was a little bit of stress associated with the tool, mainly because there was no concrete idea of how many field notes you are supposed to have” (P6). Technical and usability issues further complicated the process. Residents described a cumbersome experience of chasing preceptors for completion and navigating multiple software platforms. One resident explained, “The navigation of the site is kind of difficult… I get to go through multiple pages to find a field note or maybe I’m not well versed on the software” (P1). Inconsistent engagement among faculty also affected the tool’s perceived value, with one resident noting, “I found like different staff took them more seriously as compared to others” (P6).
Theme 3: residents would like an improved field notes platform
Residents provided thoughtful suggestions for enhancing the technological aspects of the field notes system (Table 3). A recurring recommendation was the development of a mobile-friendly platform that would streamline the process. One resident emphasized, “It would be easier if the system was designed like an app where you just pull it up and fill it out, boom, boom and it’s done” (P6). Another echoed this desire for simplified access, stating, “I would love it if it was something that’s done through an app that doesn’t require as many steps to complete” (P5).
[Table 3 omitted; see original publication]
Participants also advocated for more streamlined and efficient functionality to reduce the time burden. One resident suggested the need to “streamline the process a little bit too, so you can have it done quickly” (P2). Recognizing the importance of preceptor engagement, another noted, “As long as we can ease the process, you have to make it easy for preceptors to give feedback” (P3). Integration and standardization were also prioritized in residents’ suggestions. Many expressed frustration with having to navigate multiple platforms for different types of feedback and recommended centralizing these systems. As one resident explained, “It would be great if it was all the feedback was centralized, so then we can get like a summary of all the different types of feedback, in one page” (P6).
Theme 4: improvements to processes and communication may enhance the resident experience
Beyond technological enhancements, residents identified process and communication improvements that could transform their experience with field notes. Clarity of expectations emerged as a critical need. One resident highlighted, “Better if the objectives were clearly defined in terms of what you’re expecting… make them a bit more specific” (P4). Another emphasized the need for explicit guidelines, stating, “Clearly you don’t need like a certain number of field notes to graduate because I didn’t have that many and I passed. I think the DFCM just should set the expectations clearly on what exactly they need, how many need to be done” (P3). Participants offered diverse perspectives on field note initiation. Some preferred shifting responsibility toward preceptors, with one resident suggesting, “More of like preceptor-driven feedback versus the resident having to ask for it” (P1). Others focused on improving the quality and comprehensiveness of evaluations, recommending that “when preceptors are actually evaluating an encounter, making sure that they’re evaluating it from start to finish, not just snippets” (P7).
Additional support systems were suggested to enhance the field notes process. Residents valued discussion-based approaches, with one proposing, “One of the things I would change with the field note is that it should be discussed before they’re submitted” (P3). Practical aids such as navigation guides were recommended, along with structural changes like protected time for preceptors. As one resident realistically observed, “Unless there’s some sort of protected time in clinic before preceptors go home, I don’t really see this changing unless somehow the process is simplified” (P4).
Discussion
Residents identified key operational challenges that compromise the field note tool’s effectiveness. These findings contribute to the ongoing discussion on assessment tools in medical education, emphasizing the need to align assessment methods with educational goals while ensuring practicality and ease of use.
At the University of Toronto’s DFCM residency program, field notes are implemented through a web-based platform that residents and preceptors access through the program’s learning management system. Residents are expected to collect approximately 2–3 field notes per week across different clinical domains. The process typically begins with the resident initiating a request for observation, after which the preceptor observes all or part of a clinical encounter. Feedback is ideally provided verbally immediately following the encounter, and then documented electronically within the platform. The field note form includes sections for describing the clinical scenario, documenting specific feedback on strengths and areas for improvement, and rating the resident’s performance using a 5-point entrustment scale ranging from ‘close supervision required’ to ‘able to supervise others.’ Residents receive an email notification when preceptors complete a field note, allowing them to review the feedback and reflect on their performance. Program directors periodically review residents’ collected field notes to identify patterns and inform summative assessments. While there is no specific minimum number of field notes required for promotion, residents are encouraged to gather sufficient documentation to demonstrate progress across all competency domains.
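To make the workflow described above more concrete, the sketch below models a single field note record and the notification step as a minimal data type. This is a hypothetical illustration only: the FieldNote interface, its field names, and the notifyResident function are assumptions introduced for exposition and do not reflect the actual schema or code of the DFCM platform.

```typescript
// Hypothetical sketch of a field note record, based solely on the process
// described in this section; not the DFCM platform's actual data model.

// 5-point entrustment scale, from 'close supervision required' (1)
// to 'able to supervise others' (5).
type EntrustmentRating = 1 | 2 | 3 | 4 | 5;

interface FieldNote {
  residentId: string;          // resident who requested the observation
  preceptorId: string;         // preceptor who observed the encounter
  encounterDate: string;       // date of the observed clinical encounter (ISO format)
  clinicalScenario: string;    // brief description of the case observed
  strengths: string;           // specific feedback on what went well
  areasForImprovement: string; // specific, actionable suggestions
  competencyDomain: string;    // competency domain the observation maps to
  entrustment: EntrustmentRating;
  discussedVerbally: boolean;  // feedback is ideally given face-to-face before documentation
}

// Illustrative notification step: once the preceptor submits the note, the
// resident is alerted so they can review the feedback and reflect on it.
function notifyResident(note: FieldNote): string {
  return `Field note completed for the encounter on ${note.encounterDate}: ` +
    `entrustment level ${note.entrustment}/5 in ${note.competencyDomain}.`;
}
```

In practice, it is the accumulation of such records over time that allows program directors to review patterns across competency domains and inform summative assessments, as described above.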
While the benefits of field notes—such as promoting reflective practice and guiding residents through a competency-focused curriculum—are well-acknowledged [9,10,11], our findings indicate that the timing and consistency of feedback delivery often fail to meet the immediacy required by competency-based medical education. Feedback was frequently postponed until the end of clinical rotations, limiting its impact on immediate learning and adaptation. Furthermore, criticisms regarding the redundancy of field notes, their sporadic use, and real and perceived administrative burdens highlight the discrepancy between the goals of competency-based assessments and their practical execution, especially given medical residents’ demanding schedules [12]. The perception of field notes being burdensome is exacerbated by preceptors’ time constraints, which compromise feedback’s timeliness and specificity, thereby reducing the educational value of the tool [13,14,15]. These findings underscore the challenge of translating the theoretical advantages of comprehensive assessment tools into practice.
The identification of these challenges points towards a critical need for innovation in the design and deployment of assessment tools within medical education. A streamlined, user-friendly system that minimizes administrative burdens while maximizing the opportunities for regular, meaningful feedback could significantly enhance the utility and acceptance of field notes among residents and preceptors alike. Such a system would ideally integrate seamlessly into the daily workflows of medical education, ensuring that the process of documenting and reflecting on clinical experiences does not detract from the core educational objectives of learning and improvement. Moreover, the call for a more user-friendly and efficient system for field notes reflects a broader recognition of the importance of technology and design thinking in medical education [16,17,18]. By adopting principles of usability, accessibility, and integration, educational tools can be developed to support the complex process of learning and assessment in clinical settings more effectively [19,20,21,22,23]. This approach not only addresses the practical challenges identified by residents and preceptors but also aligns with the evolving landscape of medical education, which increasingly recognizes the value of adaptive, learner-centered strategies that support the continuous development of clinical competencies.
The limitations inherent in our study warrant consideration. Primarily, the relatively small sample size, confined to residents within a single department at one institution, poses a significant constraint on the generalizability of the results. Therefore, the insights garnered from our cohort may not fully encapsulate the diverse range of experiences and perceptions that exist among all medical residents.

The thematic coding was conducted primarily by two medical students at the institution, which, while offering valuable insights from learners’ perspectives, may have introduced certain biases in data interpretation. To enhance reflexivity, the team engaged in regular discussions to critically examine how their perspectives, prior experiences, and assumptions could shape data interpretation. We maintained reflexive memos documenting emerging interpretations and potential biases, challenging each other’s readings of the data and seeking alternative explanations before finalizing themes. Members with field note experience provided contextual insights during team discussions while being mindful not to impose their preconceptions. Through this iterative process of questioning our assumptions and returning to the data, we worked to ensure our themes authentically represented participants’ experiences rather than our own expectations.

Additionally, the qualitative nature of our study, while invaluable for obtaining in-depth, nuanced understandings of residents’ experiences and perceptions, also means that the perspectives captured may not be representative of all residents’ experiences. Qualitative research is inherently subjective and seeks to explore complex phenomena within specific contexts rather than to generalize findings across populations. As such, the rich, detailed insights provided by the participants in our study offer a deep dive into the specific challenges and opportunities for enhancing feedback practices within their context but may not encompass the full spectrum of opinions, experiences, and practices that exist within the wider medical education community. Furthermore, our study did not include member checking, which should be considered in future studies.
Conclusion
Our study suggests that there may be technical and policy-driven challenges that affect the utility of field notes in competency-based medical education. Participants identified potential areas for improvement, including enhanced technological platforms and clearer operational guidelines. These insights, while drawn from a limited sample, point to possible directions for optimizing field notes in medical education. Future research with larger, more diverse samples across multiple institutions would be valuable to determine the generalizability of these findings and develop evidence-based strategies for enhancing field note systems.
Data availability
The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.
Abbreviations
DFCM: Department of Family & Community Medicine
PGY-1: Postgraduate Year One
PGY-2: Postgraduate Year Two
COVID-19: Coronavirus Disease 2019
References
1. Ross S, Lawrence K, Bethune C, et al. Development, implementation, and meta-evaluation of a national approach to programmatic assessment in Canadian family medicine residency training. Acad Med. 2023;98(2):188–98. https://doi.org/10.1097/acm.0000000000004750
2. Wald HS, Davis SW, Reis SP, Monroe AD, Borkan JM. Reflecting on reflections: enhancement of medical education curriculum with structured field notes and guided feedback. Acad Med. 2009;84(7):830–7. https://doi.org/10.1097/ACM.0b013e3181a8592f
3. Lacasse M, Douville F, Desrosiers É, Côté L, Turcotte S, Légaré F. Using field notes to evaluate competencies in family medicine training: a study of predictors of intention. Can Med Educ J. 2013;4(1):e16–25.
4. Zaki N, Cavett T, Halas G. Field note use in family medicine residency training: learning needs revealed or avoided? BMC Med Educ. 2021;21(1):451. https://doi.org/10.1186/s12909-021-02883-6
5. Ross S, Poth CN, Donoff M, et al. Competency-based achievement system: using formative feedback to teach and assess family medicine residents’ skills. Can Fam Physician. 2011;57(9):e323–30.
6. Huang RS, Kam A. Humanism in Canadian medicine: from the Rockies to the Atlantic. Can Med Educ J. 2023.
7. Mathew AE, Kumar Y, Angeline RP, Christopher P, Rehman SP, Venkatesan S. Workplace-based assessment of family medicine competencies using field note tool - a pilot study. J Family Med Prim Care. 2018;7(6):1458–63. https://doi.org/10.4103/jfmpc.jfmpc_141_18
8. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101. https://doi.org/10.1191/1478088706qp063oa
9. Griffiths JM, Luhanga U, McEwen LA, Schultz K, Dalgarno N. Promoting high-quality feedback: tool for reviewing feedback given to learners by teachers. Can Fam Physician. 2016;62(7):600–2.
10. Schultz K, Griffiths J, Lacasse M. The application of entrustable professional activities to inform competency decisions in a family medicine residency program. Acad Med. 2015;90(7):888–97. https://doi.org/10.1097/acm.0000000000000671
11. Kam A, Lam T, Chang I, Huang RS, Fernandez N, Richardson D. Resident perceptions of learning challenges in concussion care education. Can Med Educ J. 2023.
12. Škrinjarić B. Competence-based approaches in organizational and individual context. Humanit Soc Sci Commun. 2022;9(1):28. https://doi.org/10.1057/s41599-022-01047-1
13. Irby DM, Wilkerson L. Teaching when time is limited. BMJ. 2008;336(7640):384–7. https://doi.org/10.1136/bmj.39456.727199.AD
14. Fitzgerald JT, Burkhardt JC, Kasten SJ, et al. Assessment challenges in competency-based education: a case study in health professions education. Med Teach. 2016;38(5):482–90. https://doi.org/10.3109/0142159x.2015.1047754
15. Chen D, Alnassar SA, Avison KE, Huang RS, Raman S. Large language model applications for health information extraction in oncology: scoping review. JMIR Cancer. 2025;11:e65984. https://doi.org/10.2196/65984
16. Guze PA. Using technology to meet the challenges of medical education. Trans Am Clin Climatol Assoc. 2015;126:260–70.
17. Huang RS, Lu KJQ, Meaney C, Kemppainen J, Punnett A, Leung FH. Assessment of resident and AI chatbot performance on the University of Toronto family medicine residency progress test: comparative study. JMIR Med Educ. 2023;9:e50514. https://doi.org/10.2196/50514
18. Huang RS, Benour A, Kemppainen J, Leung FH. The future of AI clinicians: assessing the modern standard of chatbots and their approach to diagnostic uncertainty. BMC Med Educ. 2024;24(1):1133. https://doi.org/10.1186/s12909-024-06115-5
19. Singaram VS, Bagwandeen CI, Abraham RM, Baboolal S, Sofika DNA. Use of digital technology to give and receive feedback in clinical training: a scoping review protocol. Syst Rev. 2022;11(1):268. https://doi.org/10.1186/s13643-022-02151-8
20. Huang RS, Mihalache A, Popovic MM, et al. Quantitative fluorescein angiography biomarkers in diabetic macular edema. Retina. 2025;45(6):1125–33. https://doi.org/10.1097/iae.0000000000004424
21. Jomy J, Lin KX, Huang RS, et al. Closing the gap on healthcare quality for equity-deserving groups: a scoping review of equity-focused quality improvement interventions in medicine. BMJ Qual Saf. 2025;34(2):120–9. https://doi.org/10.1136/bmjqs-2023-017022
22. Han H, Resch DS, Kovach RA. Educational technology in medical education. Teach Learn Med. 2013;25(Suppl 1):S39–43. https://doi.org/10.1080/10401334.2013.842914
23. Patil NS, Huang R, Caterine S, Varma V, Mammen T, Stubbs E. Comparison of artificial intelligence chatbots for musculoskeletal radiology procedure patient education. J Vasc Interv Radiol. 2024;35(4):625–e62726. https://doi.org/10.1016/j.jvir.2023.12.017