
Abstract

The ability to debug code is critical to being a programmer and is a distinct skill from writing code. Yet debugging is rarely explicitly taught in introductory programming and Computer Science courses. Instead, novices typically develop their own debugging habits and strategies when they encounter bugs in their code, and these are often less effective than those of expert programmers. When students do seek help with debugging, they traditionally turn to office hours conducted by Teaching Assistants (TAs) or, increasingly, to Large Language Models (LLMs) such as ChatGPT. However, both sources of assistance have limitations. TAs are often students themselves with limited teaching experience, yet little has been studied about how this inexperience manifests in actual interactions with students. General-purpose LLM assistants, meanwhile, can quickly identify and fix bugs, but they are not designed as pedagogical tools and may not help students learn effective debugging strategies.

This dissertation addresses these limitations by first filling the gap in our understanding of TA-student interactions during office hours, then developing tools designed to improve debugging assistance more broadly. We conduct observational studies that show TAs often struggle both with debugging student code and with providing pedagogically sound guidance, and we identify specific TA debugging strategies that predict tutoring quality. We develop a core algorithm that analyzes buggy student code, generates corrections that preserve the student's intended approach, and produces runtime data explaining how the fixes change program behavior. We then design and evaluate tools that present this algorithmic analysis to an intermediary (a Teaching Assistant or, in later iterations, a Large Language Model) and guide the intermediary through interpreting the analysis for the student, in a way that can help the student learn an effective debugging process. Our evaluation demonstrates that algorithm-driven assistance can significantly improve both debugging accuracy and speed for TAs, potentially freeing them to focus on providing sound guidance to the student, while LLM-mediated interfaces show promise for delivering consistent, pedagogically sound guidance directly to students.
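The record does not include the algorithm itself, so the following is only a minimal Python sketch of the kind of pipeline the abstract describes: a buggy student solution, a small correction that preserves the student's loop-and-accumulate approach, and a comparison of runtime behavior on shared test inputs. The function names, the example bug, and the comparison logic are all hypothetical illustrations, not material from the dissertation.

def student_sum_evens(numbers):
    """Student's buggy attempt: intends to sum the even numbers."""
    total = 0
    for n in numbers:
        if n % 2 == 1:  # bug: selects odd numbers instead of even
            total += n
    return total


def fixed_sum_evens(numbers):
    """Minimal correction that keeps the student's loop-and-accumulate approach."""
    total = 0
    for n in numbers:
        if n % 2 == 0:  # fix: test for evenness
            total += n
    return total


def compare_behavior(buggy, fixed, test_inputs):
    """Run both versions on the same inputs and report where behavior diverges,
    producing the kind of runtime evidence a tutor could walk through with a student."""
    report = []
    for inp in test_inputs:
        got, expected = buggy(inp), fixed(inp)
        if got != expected:
            report.append(f"input {inp!r}: student code returns {got}, fix returns {expected}")
    return report


if __name__ == "__main__":
    tests = [[1, 2, 3, 4], [2, 4, 6], [1, 3, 5], []]
    for line in compare_behavior(student_sum_evens, fixed_sum_evens, tests):
        print(line)

In this toy setup, the divergence report (rather than the corrected code itself) is what an intermediary would interpret for the student, which mirrors the abstract's emphasis on teaching a debugging process instead of simply handing over a fix.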

Details

Title
Combining Code Analysis and Pedagogical Guidance: Automated Tools for Teaching Debugging in Introductory Programming
Number of pages
274
Publication year
2025
Degree date
2025
School code
0252
Source
DAI-A 87/2(E), Dissertation Abstracts International
ISBN
9798290954714
Committee member
Bogost, Ian; Ericson, Barbara; Ottley, Alvitta; Yeoh, William
University/institution
Washington University in St. Louis
Department
Computer Science & Engineering
University location
United States -- Missouri
Degree
Ph.D.
Source type
Dissertation or Thesis
Language
English
Document type
Dissertation/Thesis
Dissertation/thesis number
32169208
ProQuest document ID
3240417413
Document URL
https://www.proquest.com/dissertations-theses/combining-code-analysis-pedagogical-guidance/docview/3240417413/se-2?accountid=208611
Copyright
Database copyright ProQuest LLC; ProQuest does not claim copyright in the individual underlying works.
Database
ProQuest One Academic