© 2018. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and conditions, you may use this content in accordance with the terms of the License.

Abstract

Learning programming has become more and more popular, and organizing introductory massive open online courses (MOOCs) on programming can be one way to bring this education to the masses. While programming MOOCs usually use automated assessment to give feedback on submitted code, a lack of understanding of certain aspects of the tasks and of the feedback given by the automated assessment system can be a persistent problem for many participants. This paper introduces troubleshooters: help systems, structured as decision trees, that give hints and examples for certain aspects of the course tasks. The goal of this paper is to give an overview of the usability (benefits and dangers) of troubleshooters and of the participants' feedback on using them. Troubleshooters have been used since 2016 in two different programming MOOCs for adults in Estonia. These MOOCs are characterized by high completion rates (50-70%), which is unusual for MOOCs. Data was gathered from the learning analytics integrated into the troubleshooters' environment, letters from the participants, questionnaires, and tasks conducted through the courses. Although using troubleshooters was not compulsory, only 19.8% of the users did not use them at all, and only 10% of the participants did not find them helpful. The main difference observed was that the number of questions about the programming tasks asked of the organizers via the helpdesk declined by about 29% during the courses.

Keywords : MOOC, open education, programming, troubleshooting system

Introduction

Teaching introductory programming courses has become an important subject matter in Estonia in connection with the need to raise awareness of, and interest in, information technology. To support learning the programming language Python, a massive open online course (MOOC) called About Programming (in Estonian, Programmeerimisest maalähedaselt) was created in Estonia in 2014. Research has shown that the average completion rate for MOOCs worldwide is approximately 15% (Jordan, 2014; Siemens, 2013), but in our case the completion rate has been consistently over 50%. This paper addresses the idea of supporting the participants with a helpdesk and reducing the number of questions from participants by creating troubleshooters for the programming tasks.

Programming MOOCs rely mostly on automated assessment, which enables participants to submit their solutions to the tasks so that the system can automatically analyze them and give feedback. Self-assessment should be used as assessment for learning instead of assessment of learning (Admiraal, Huisman, & Pilli, 2015). In programming, some mistakes in the code can be very difficult to resolve, and therefore our MOOCs offered a helpdesk email address for answering the questions that arise during the course. The instructors and the university students who lent their assistance agreed to answer helpdesk emails within 8 hours. While having people on watch all the time is not very cost effective, the helpdesk offers the instant help that beginner learners need. The questions asked via the helpdesk also give a lot of information about the problems occurring with the tasks during the course.

To reduce the number of questions sent to the helpdesk, troubleshooters were provided for every programming task, starting from 2016. The troubleshooters are collections of answers and clues for the questions that can arise when solving the course tasks.

This paper gives an overview of the creation of the troubleshooters to support the course and presents the learners' opinions about the troubleshooters. The impact of troubleshooters is discussed in the context of the resources needed for creating troubleshooters and the results of course optimization, needed to keep it automated.

Theoretical Background

This section provides a theoretical background on supporting online programming courses with helpdesk and troubleshooters by categorizing programming mistakes that beginners make.

MOOCs

Massive Open Online Courses (MOOCs) are one of the recent models in open and distributed learning (Downes, 2017). The history of MOOCs can be divided into two phases: the cMOOC (connectivist MOOC) period and the xMOOC (content-based MOOC) period (Baturay, 2015). However, there is a move away from the cMOOC/xMOOC division towards recognition of the multiplicity of MOOC designs, purposes, topics, and teaching styles (Admiraal et al., 2015).

While MOOCs have proliferated in the educational world and are hyped in the media, there are still some challenges for MOOCs to overcome (Veletsianos & Shepherdson, 2016). One of the most salient challenges is the dropout rate (Siemens, 2013), with widely cited figures of 10% completion rates (Ebben & Murphy, 2014). Researchers are trying to examine the reasons behind the low retention rates (Greene, Oswald, & Pomerantz, 2015; Hone & El Said, 2016). It has been found that a lack of incentive, insufficient prior knowledge about the topic, ambiguous assignments, and having no one to turn to for help can be possible reasons for non-completion (Hew & Cheung, 2014). MOOC content and interaction with the instructor were also shown to have a significant effect on retention (Hone & El Said, 2016).

Due to having thousands of participants per instructor, it is impossible for MOOC instructors to conduct assessments and provide individual feedback (Suen, 2014). Different models of interaction are used, such as automated feedback (Pieterse, 2013), peer support (Onah, Sinclair, & Boyatt, 2014), self-assessment (Papathoma, Blake, Clow, & Scanlon, 2015), helpdesk (Warren, Rixner, Greiner, & Wong, 2014), and scaffolding messages like troubleshooters (Vihavainen, Luukkainen, & Kurhila, 2012).

Helpdesks

As the number of questions on various course topics rises, and finding answers to them is difficult in a course with thousands of participants, we faced the challenge of keeping sufficient support available for participants to finish the course successfully. A helpdesk is one option for answering the questions and monitoring the process. Previous MOOCs that used a helpdesk were rated extremely positively (Warren et al., 2014).

A helpdesk can use different kinds of data, video, and voice support (Motsuk, 1999), but our course offered a helpdesk email staffed by the organizers of the MOOCs (faculty members and students), who had to answer any letters within 8 hours. The possibility of asking questions via the helpdesk could have been one of the key factors that helped more than 50% of the participants finish our courses (Lepp et al., 2017a).

Course participants send emails to the helpdesk address and receive answers from it; several helpdesk systems are available for managing such a workflow. A helpdesk system needs to be usable online; look simple and pleasant for users; be easy to use; and include functions such as a search engine, the option to label letters, and archiving of letter data for later analysis. Developing such a system can be too complex a task for a simple project (Washburn & El-Bayoumi, 2003). In our case, an online helpdesk system called Freshdesk (https://freshdesk.com/) was used.

Using a helpdesk has several advantages for organizers, too. One of the benefits is that engaging students in answering the helpdesk emails can have a positive influence on their studies (McRitchie, 2009) and reduce the cost of the helpdesk (Sinnett & Barr, 2004). Considering the number of people receiving help and education through MOOCs, the cost per participant can be rather low as well. Frequently asked questions can be gathered to create helpful troubleshooters for each course task.

Troubleshooters

Troubleshooters are systems, mostly used for IT services, that help users solve problems manually: the user clicks through answers to a series of questions, following a decision-tree structure, until a solution to the problem is found. A similar kind of self-assessment (exercises with built-in scaffolding messages inside the programming environment) has been tried in a programming MOOC and found to be fruitful (Vihavainen et al., 2012).

One way of identifying the problems that need to be included in troubleshooters is mining the course data (and constructing, for example, Bayesian networks; Skaanning, Jensen, & Kjærulff, 2000). This can be difficult, as many filters must be applied to get reasonable results (Wustner, Joumblatt, Teixeira, & Chandrashekar, 2012). The occurring problems can also be rather difficult to track, as the real problems can differ from those originally reported.

Creating troubleshooters can be difficult, but systematically organizing the problems that need to be solved can make it a lot easier. In labs, having course personnel present is one way to answer questions about the next problem a student encounters (Vihavainen et al., 2012). In the case of MOOCs, systematic decision trees for troubleshooters can be created by analyzing past help requests for the tasks and categorizing the questions in a way that supports the development of hints and examples guiding learners to answers to frequently asked questions.
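To make the decision-tree structure concrete, the following sketch (in Python, the language taught in the course) shows how such a tree of questions and hints might be represented and traversed. The node layout, questions, and hints are invented for illustration; they are not taken from the actual troubleshooters.

```python
# Illustrative sketch of a troubleshooter as a decision tree.
# The questions and hints below are invented examples.

from dataclasses import dataclass, field


@dataclass
class Node:
    text: str                                    # question to ask, or hint to show
    options: dict = field(default_factory=dict)  # answer -> child Node

    def is_hint(self):
        # Leaf nodes carry hints; inner nodes carry questions.
        return not self.options


# A tiny hypothetical tree for a "program prints nothing" problem.
tree = Node("Does running the program show an error message?", {
    "yes": Node("Check the line number in the traceback for a typo."),
    "no": Node("Does the program contain a print() call?", {
        "yes": Node("Make sure the print() call is actually reached."),
        "no": Node("Add a print() call to display the computed result."),
    }),
})


def walk(node, answers):
    """Follow a sequence of answers down the tree and return the final text."""
    for answer in answers:
        node = node.options[answer]
    return node.text
```

In such a structure each click narrows the problem down, so a learner who answers "no" and then "yes" ends at a hint about unreachable code, without reading hints irrelevant to their situation.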

Categorizing the Problems in Solving Programming Tasks

This paper addresses a help system for the typical problems of novice programmers. Many kinds of questions arise during programming MOOCs, from registration issues to specific nuances of particular topics; this article is limited to the questions frequently asked by participants in an introductory programming MOOC. Helping with problems in more complex courses, covering aspects such as inheritance, objects, classes, encapsulation, methods, message passing, polymorphism, and abstraction, can be much more difficult (Sanders & Thomas, 2007).

Many questions concern error messages. The Center for Computing Education and Diversity at the University of California has identified 17 categories of errors that can occur in Python programming (Marceau, Fisler, & Krishnamurthi, 2011), but for any single task only a few of them usually occur, and users often become accustomed to them when trying to resolve a mistake in the code. Error messages are, however, only part of the problem: code can be wrong even when it executes without errors. This is the case, for example, when trying to understand the changes that need to be made in the code to produce different outputs for certain inputs.
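As a minimal illustration of the latter point, the following hypothetical beginner snippet runs without any error message yet produces the wrong output, here because of operator precedence (the task and function names are invented for illustration):

```python
# Hypothetical beginner task: compute the average of two numbers.
# average() runs without any error message, yet its result is wrong,
# because / binds tighter than +, so it computes a + (b / 2).

def average(a, b):
    return a + b / 2       # buggy: no error, wrong output


def average_fixed(a, b):
    return (a + b) / 2     # intended behaviour
```

No error-message category covers this kind of mistake; a troubleshooter can still catch it by asking about the outputs produced for sample inputs.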

Garner, Haden, and Robins (2005) organized introductory programming courses and investigated the mistakes novice programmers make during practice sessions. They noticed that the more assistance weaker participants receive, the better their achievement in the course. Garner and colleagues described 27 problems that can appear in the practice sessions of a programming course for beginners. As our courses were online, we had to use helpdesk letters instead of direct feedback from practice sessions.

The problems that occur can differ between situations. In pair programming, the pairs are later able to solve more low-level syntax problems individually than in solo programming (Hanks, 2008). As the assignments in our courses are individual, we needed a system that helps more with low-level syntax problems.

As the problems appear during the process of solving certain tasks, our idea was to build on that and to examine the problems arising from the MOOC tasks via the helpdesk. Although in our case many of the problems (like errors and input-output faults) are solvable with the help of the automatic assessment tool, that tool can itself create extra problems and questions that need to be solved.
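A minimal sketch of input-output style automated assessment, assuming only the general technique (this is not the course's actual grader, and the function name and feedback format are invented), could look like:

```python
# Sketch of input-output style automated assessment: run the learner's
# function on test inputs and compare the results with expected outputs.
# grade() and its feedback strings are invented for illustration.

def grade(solution, test_cases):
    """Return one feedback string per (input args, expected output) pair."""
    feedback = []
    for args, expected in test_cases:
        got = solution(*args)
        if got == expected:
            feedback.append("OK")
        else:
            feedback.append(f"input {args}: expected {expected!r}, got {got!r}")
    return feedback
```

Feedback of the form "expected X, got Y" resolves many input-output faults directly, but it can also raise new questions for beginners, which is where a troubleshooter can step in.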

Research Problem

The purpose of this study was to develop and evaluate troubleshooters for the programming tasks to provide additional support to MOOC participants and reduce the number of learner emails with questions to organizers while maintaining a high completion rate. Figure 1 presents the research problem.

The research questions were:

1. Can troubleshooters facilitate the work of MOOC organizers?

2. How do participants perceive troubleshooters as an additional support tool?

Figure 1. The research problem.

Murelahendaja Environment for Troubleshooters

Based on previous studies (Garner et al., 2005; Vihavainen et al., 2012), our troubleshooter creation process, which was rather difficult and time consuming, includes:

1. Analyzing the questions asked via the helpdesk about the weekly programming tasks;

2. Categorizing the questions asked by creating a table of types of typical questions;

3. Creating a tree-structured hint system with examples called troubleshooters to help with questions that have been asked.
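As an illustration of step 2, the following hypothetical Python sketch tallies labelled helpdesk questions per task; the task names and question categories are invented, not taken from the actual course data:

```python
# Hypothetical tally of categorized helpdesk questions (step 2).
# The task names and question categories are invented for illustration.

from collections import Counter

# (task, category) pairs as they might be labelled from helpdesk letters.
labelled = [
    ("week1_task2", "error message"),
    ("week1_task2", "wrong output"),
    ("week1_task2", "error message"),
    ("week2_task1", "environment setup"),
]

by_task = {}
for task, category in labelled:
    by_task.setdefault(task, Counter())[category] += 1

# by_task now shows which question types dominate each task, i.e., which
# branches the troubleshooter tree for that task should cover first.
```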

Analysis of Questions and Categorization of Occurring Problems

This paper deals with an introductory programming MOOC About Programming in Estonia for adults that has been organized several times since December 2014. The Institute of Computer Science also organizes a MOOC named Introduction to Programming, which will only be touched upon briefly in this article.

A helpdesk was organized in our MOOCs to help participants with their problems and to get an overview of the questions asked about the tasks. After collecting the questions asked via the helpdesk in 2015, a table of data was compiled to categorize the problems that occurred in certain aspects of the tasks. This paper focuses on the troubleshooters created for the 2016 course to help with these problems in the programming tasks.

As our idea was to create helpful hints for the tasks of each week, it meant that each task needed to be looked at separately. The course About Programming had eight different parts in 4 weeks (2 parts per week):

Details

Title
Troubleshooters for Tasks of Introductory Programming MOOCs
Author
Lepp, Marina; Palts, Tauno; Luik, Piret; Papli, Kaspar; Suviste, Reelika; Säde, Merilin; Hollo, Kaspar; Vaherpuu, Vello; Tõnisson, Eno
Publication year
2018
Publication date
Sep 2018
Publisher
International Review of Research in Open and Distance Learning
e-ISSN
14923831
Source type
Scholarly Journal
Language of publication
English