Content area
Purpose
Many African countries grapple with school dropout, partly due to weak mechanisms for retaining children in schools, inadequate investment in education or concerns about the relevance of education to the current and future needs of children and their countries. Education Management Information Systems (EMIS) are expected to inform planning and decision-making processes for education service delivery to alleviate such problems. Yet the role and relevance of EMIS remain contested even in the best of circumstances. This paper explores the relationship among dropout, drop-in and EMIS in Eritrea, noting challenges confronting EMIS and arguing for increased focus on drop-in in EMIS.
Design/methodology/approach
The overarching methodology was influenced by interpretivism and constructionism. This research explored the following questions: What is the status of risk of dropout among middle school students in Eritrea? What else can the Eritrean EMIS identify and measure? The first author conducted a descriptive analysis of existing quantitative EMIS data and analysed interview data from students, teachers and government officials.
Findings
The key finding was the significant risk of dropout among middle school students in Eritrea. Secondly, whereas EMIS documented dropout rates annually, the system was not adept at identifying, documenting and analysing risks among students who drop in. There were tensions among diverse stakeholders about EMIS ownership, validity and reliability of the data and its management.
Research limitations/implications
The research was limited by the inability to access raw EMIS data from the Ministry or from schools.
Practical implications
Nonetheless, the research highlighted the dilemma of using existing EMIS data for reporting on dropout and drop-in, and recommended updating Eritrea’s outdated EMIS tools, which were developed in 1993.
Originality/value
Relatively little research has focused on Eritrea’s education system. Although EMIS is widely considered a standard tool, this research found its influence on improving educational access to be problematic.
Introduction
This paper seeks to demonstrate the relationship between EMIS, drop-in and dropout. It is based on the findings of the doctoral research conducted by Emmanuel Kamuli in two middle schools in Eritrea: Beles and N’Hafash (both pseudonyms). It presents a statistical layperson’s engagement with the country’s Education Management Information System (EMIS) and shares the dilemmas of working with national information systems that are prone to competing tensions. Whereas the public may assume that the compilation of EMIS is a straightforward exercise that supports objective decision making, the reality is that EMIS is a negotiated product that accommodates sometimes contradictory imperatives.
In educational planning, EMIS is expected to play a critical role in illuminating what goes on in the education space of a country. However, as we argue in this paper, that expectation is mostly a theoretical ideal. Key stakeholders at the school and community levels, and those with limited statistical expertise in ministries, NGOs and development agencies, are either excluded or play peripheral roles, and the outcomes that form EMIS are not entirely free of bias or influence.
The paper is organised as follows. First, it provides a brief overview of EMIS, then it examines the risks contributing to dropout and why they have EMIS relevance. Section two defines risk in educational contexts and the concept of drop-in as applied in this research, relating them later to EMIS. Section three presents the methodology, followed by section four on the findings of the EMIS data analysis and a discussion in relation to drop-in and dropout. Section five concludes the paper.
Education management information systems
Countries maintain databases on different aspects of education, ranging from enrolment, teachers, classrooms and scholastic materials to highly sophisticated systems encompassing a wide array of aspects of education (
According to Villanueva, “The main purpose of an EMIS is to integrate information related to the management of educational activities, and to make it available in comprehensive yet succinct ways to a variety of users” (
Arguably, a good EMIS can and should “fuel progress toward … improved student learning, increased equity, and stronger accountability relationships among policymakers, school administrators, teachers, parents, and students” (
Concerns about EMIS persist. Apart from the seven ways highlighted by
This paper seeks to demonstrate the relationship between EMIS, drop-in and dropout. It argues that paying equal attention to drop-in and dropout in EMIS will improve accountability for students in education systems.
Defining risk and drop-in
Risk: There is considerable literature on risks confronting children in and out of school. Studies by individuals (
The meta-analysis by
Drop-in: Globally, and especially in low-income countries of sub-Saharan Africa, the phenomenon of children who enrol but fail to complete a given cycle of education continues to challenge their right to education. Some scholars have attempted to explain the causes and impacts of dropout (
The OOSC Initiative, which aimed to “make a significant and sustainable reduction in the number of children who are out of school” (
The five dimensions of exclusion (5DE) extracted from the out of school children’s initiative by UNICEF and UNESCO (2015)
The 5DE model presents a modified lens for characterising children in and out-of-school from the perspective of exclusion. Thus, the model enriches knowledge about the protracted process of dropout.
Methodological approach and limitations
Originality: Relatively little research has focused on Eritrea’s education system. Although EMIS is widely considered a standard tool, this research found its influence on improving educational access to be problematic.
Approach: The study followed a two-step sequential process, beginning with a descriptive analysis of secondary quantitative EMIS data from twelve annual reports of the Eritrea Ministry of Education (2003–2015) to analyse dropout trends. The research explored the following questions:
RQ1. What is the status of risk of dropout among middle school students in Eritrea? What else can EMIS capture?

Analysed EMIS data for 2003–2015 showed that Amiche district in the Northern Region consistently had the weakest indicators of wastage, survival and transition rates. Amiche district was therefore purposively sampled as a case study. Drawing on existing literature, the research developed tools to identify individual, family, community and school factors that contributed to dropout and drop-in. Twenty-eight participants were purposively sampled to take part in a total of 13 interviews and four focus group discussions. They included six teachers who were directly involved in the pastoral care of at-risk students and 21 students identified by schools as at-risk. A ministry official participated by virtue of office.
The research was limited by the inability to access raw EMIS data from the Ministry or from schools. Resorting to secondary data meant the researchers had no control over what went into the official analysis or its output.
Whereas EMIS data were to be analysed using the (
The EMIS questionnaires of the Eritrean Ministry of Education (MOE) collected data annually on the causes of dropout. Questionnaire Number 9 (Q9) had a preselected list of 12 risk factors plus an “other” category, which schools could use to report on causes that did not fall within the list of 12 (
Pre-selected risk factors (Figure extracted by authors from Eritrea EMIS Questionnaire 9, 2014/15)
However, it was not possible for schools to explain what the “other” causes were. The quantitative data were analysed by level of education (elementary, middle school, etc.), teaching staff and internal efficiency of the system. The section on internal efficiency provided data on flow rates. The data on promotion, dropout and repetition for 12 academic years 2003/4 to 2014/15 (
Combined flow rates (percentage) for middle school students in Government of the State of Eritrea (2003/4–2014/15)
| Year | Grade | Dropout boys (%) | Dropout girls (%) | Dropout total (%) | Repetition boys (%) | Repetition girls (%) | Repetition total (%) | Promoted boys (%) | Promoted girls (%) | Promoted total (%) |
|---|---|---|---|---|---|---|---|---|---|---|
| 2003–4 | 6 | 7.3 | 6.5 | 7 | 17.7 | 16.8 | 17.3 | 75 | 76.6 | 75.7 |
| 2004–5 | 6 | 8.3 | 6.6 | 7.7 | 17.9 | 14.8 | 16.7 | 73.7 | 78.5 | 75.6 |
| 2005–6 | 6 | 16.5 | 8 | 13.1 | 17.4 | 15.8 | 16.8 | 66.1 | 76.2 | 70.1 |
| 2006–7 | 6 | 8.4 | 5.3 | 7.2 | 19.7 | 18.1 | 19.1 | 71.9 | 76.6 | 73.8 |
| 2007–8 | 6 | 7.9 | 4.9 | 6.6 | 21.2 | 18.7 | 20.2 | 70.9 | 76.4 | 73.2 |
| 2008–9 | 6 | 10.1 | 5.1 | 8 | 14.1 | 11.3 | 12.9 | 75.8 | 83.6 | 79.1 |
| 2009–10 | 6 | 7.7 | 4 | 6.1 | 16.1 | 12.1 | 14.4 | 76.2 | 83.9 | 79.6 |
| 2010–11 | 6 | 6.98 | 4.4 | 5.87 | 15.84 | 11.9 | 14.14 | 77.18 | 83.71 | 79.99 |
| 2011–12 | 6 | 7.6 | 4.8 | 6.4 | 19.5 | 14.3 | 17.3 | 72.9 | 80.9 | 76.3 |
| 2012–13 | 6 | 7.5 | 4.1 | 6.1 | 19.3 | 13 | 16.6 | 73.2 | 82.9 | 77.3 |
| 2013–14 | 6 | 4.8 | 10.1 | 7.8 | 14.1 | 20.1 | 17.5 | 81.1 | 69.8 | 74.6 |
| 2014–15 | 6 | 9.6 | 4.8 | 7.5 | 22.2 | 14.6 | 18.9 | 68.2 | 80.6 | 73.6 |
| 2003–4 | 7 | 6.5 | 5.9 | 6.3 | 12.9 | 10.3 | 11.8 | 80.6 | 83.8 | 81.9 |
| 2004–5 | 7 | 8.7 | 7 | 8 | 21.3 | 15 | 18.8 | 70 | 78 | 73.2 |
| 2005–6 | 7 | 20.3 | 8.7 | 15.8 | 12.8 | 11.3 | 12.2 | 66.9 | 80 | 71.9 |
| 2006–7 | 7 | 7.6 | 5.2 | 6.6 | 15.5 | 13.6 | 14.8 | 76.9 | 81.2 | 78.7 |
| 2007–8 | 7 | 6.3 | 4.8 | 5.7 | 12.9 | 10.6 | 12 | 80.8 | 84.5 | 82.3 |
| 2008–9 | 7 | 9.4 | 5.1 | 7.6 | 9.8 | 7.6 | 8.9 | 80.8 | 87.3 | 83.6 |
| 2009–10 | 7 | 6.6 | 4 | 5.4 | 10.5 | 8 | 9.4 | 82.9 | 88 | 85.2 |
| 2010–11 | 7 | 6.31 | 3.9 | 5.24 | 9.96 | 7.36 | 8.8 | 83.73 | 88.74 | 85.97 |
| 2011–12 | 7 | 7 | 4.3 | 5.8 | 12.4 | 9.6 | 11.2 | 80.7 | 86.2 | 83.1 |
| 2012–13 | 7 | 7 | 3.8 | 5.6 | 11.6 | 7.4 | 9.8 | 81.4 | 88.8 | 84.6 |
| 2013–14 | 7 | 4.9 | 10.5 | 8.1 | 8.9 | 13.5 | 11.5 | 86.1 | 76 | 80.5 |
| 2014–15 | 7 | 9.1 | 4.4 | 7 | 15.3 | 8.8 | 12.3 | 75.6 | 86.7 | 80.7 |
| 2003–4 | 8 | 7.9 | 6.4 | 7.3 | 16.9 | 10.7 | 14.5 | 75.2 | 82.8 | 78.1 |
| 2004–5 | 8 | 10.2 | 7.4 | 9.1 | 19.9 | 11.2 | 16.5 | 69.9 | 81.4 | 74.4 |
| 2005–6 | 8 | 27.3 | 11.1 | 20.4 | 5.8 | 5.1 | 5.5 | 66.9 | 83.8 | 74.1 |
| 2006–7 | 8 | 7 | 7.8 | 7.4 | 11.1 | 9.4 | 10.3 | 81.9 | 82.7 | 82.3 |
| 2007–8 | 8 | 5.6 | 6.2 | 5.9 | 12 | 9 | 10.7 | 82.4 | 84.8 | 83.4 |
| 2008–9 | 8 | 9.5 | 6.6 | 8.2 | 10.2 | 6.4 | 8.6 | 80.3 | 87 | 83.2 |
| 2009–10 | 8 | 6.9 | 4.2 | 5.7 | 10.7 | 6.3 | 8.7 | 82.4 | 89.5 | 85.6 |
| 2010–11 | 8 | 5.64 | 4.82 | 5.27 | 11.35 | 6.83 | 9.29 | 83.02 | 88.35 | 85.45 |
| 2011–12 | 8 | 7.1 | 5.6 | 6.4 | 15.5 | 9 | 12.4 | 77.4 | 85.5 | 81.2 |
| 2012–13 | 8 | 7 | 5.4 | 6.2 | 18.9 | 10.6 | 15.1 | 74.1 | 84 | 78.7 |
| 2013–14 | 8 | 7.1 | 12.2 | 9.8 | 11.9 | 20.8 | 16.6 | 81 | 67 | 73.5 |
| 2014–15 | 8 | 9.7 | 6.1 | 7.9 | 20.8 | 10.6 | 15.9 | 69.5 | 83.3 | 76.2 |
Source(s): MOE EMIS, Asmara (with authors’ italic)
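The grade-specific promotion and repetition rates above are the raw inputs from which wastage and survival indicators are usually derived. As a minimal sketch of the reconstructed cohort method (not the MOE’s own procedure), the calculation can be approximated in a few lines of Python; the 2014/15 middle school rates from the table are used purely for illustration.

```python
def survival_rates(promotion, repetition, grades, cohort=1000.0, horizon=25):
    """Follow a hypothetical cohort through the listed grades, assuming the
    published promotion/repetition rates stay constant; dropout is the
    residual (1 - promotion - repetition). A sketch of the reconstructed
    cohort method, not the MOE's implementation."""
    enrolled = {g: 0.0 for g in grades}
    enrolled[grades[0]] = cohort
    reached = {g: 0.0 for g in grades}      # pupils ever reaching each grade
    reached[grades[0]] = cohort
    completers = 0.0
    for _ in range(horizon):                # iterate school years until repeaters fade out
        nxt = {g: 0.0 for g in grades}
        for i, g in enumerate(grades):
            promoted = enrolled[g] * promotion[g]
            nxt[g] += enrolled[g] * repetition[g]        # repeaters stay in grade g
            if i + 1 < len(grades):
                nxt[grades[i + 1]] += promoted
                reached[grades[i + 1]] += promoted
            else:
                completers += promoted                   # promoted out of the final grade
        enrolled = nxt
    survival = {g: round(100 * reached[g] / cohort, 1) for g in grades}
    return survival, round(100 * completers / cohort, 1)

# 2014/15 middle school rates (as fractions) taken from the table above.
promotion = {6: 0.736, 7: 0.807, 8: 0.762}
repetition = {6: 0.189, 7: 0.123, 8: 0.159}
print(survival_rates(promotion, repetition, grades=[6, 7, 8]))
```

On those assumptions, roughly nine in ten pupils entering grade 6 would eventually reach grade 7 and about three quarters would complete grade 8, which is one way of reading the “wastage” implied by the published rates.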
Compiling the totals for each year yielded summary data on dropout, repetition and promotion, as shown in
Eritrea – middle school annual flow rates
| Year | Dropout boys (%) | Dropout girls (%) | Dropout total (%) | Repetition boys (%) | Repetition girls (%) | Repetition total (%) | Promotion boys (%) | Promotion girls (%) | Promotion total (%) |
|---|---|---|---|---|---|---|---|---|---|
| 2003–4 | 7.2 | 6.3 | 6.8 | 15.9 | 13.1 | 14.8 | 76.9 | 80.6 | 78.4 |
| 2004–5 | 9 | 7 | 8.2 | 19.6 | 13.8 | 17.3 | 71.4 | 79.2 | 74.5 |
| 2005–6 | 20.7 | 9.1 | 16.1 | 12.7 | 11.2 | 12.1 | 66.6 | 79.7 | 71.8 |
| 2006–7 | 7.8 | 6.1 | 7.1 | 15.9 | 14 | 15 | 76.4 | 80 | 77.9 |
| 2007–8 | 6.7 | 5.3 | 6.1 | 16 | 13.3 | 14.9 | 77.3 | 81.4 | 79 |
| 2008–9 | 9.7 | 5.5 | 7.9 | 11.6 | 8.6 | 10.4 | 78.7 | 85.8 | 81.7 |
| 2009–10 | 7.1 | 4.1 | 5.7 | 12.7 | 9 | 11.1 | 80.2 | 86.9 | 83.2 |
| 2010–11 | 6.3 | 4.4 | 5.5 | 12.6 | 8.8 | 10.9 | 81.1 | 86.9 | 83.7 |
| 2011–12 | 7.2 | 4.9 | 6.2 | 15.9 | 10.8 | 13.6 | 76.9 | 84.3 | 80.2 |
| 2012–13 | 7.2 | 4.5 | 6 | 16.9 | 10.4 | 14 | 75.9 | 85 | 80 |
| 2013–14 | 5.7 | 10.9 | 8.6 | 11.8 | 18.4 | 15.5 | 82.5 | 70.6 | 75.9 |
| 2014–15 | 9.5 | 5.1 | 7.5 | 19.7 | 11.6 | 16 | 70.8 | 83.3 | 76.5 |
Source(s): MOE EMIS, Asmara (with authors’ italic)
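The reports do not state how the grade-level figures were collapsed into these annual summaries; a simple average of the three grade totals comes close to, but does not exactly reproduce, the annual figures, which suggests some form of enrolment weighting. A minimal pandas sketch of an enrolment-weighted aggregation, using placeholder enrolment figures rather than MOE data, would look like this:

```python
import pandas as pd

# Grade-level flow rates (percent) for one year, copied from the first table;
# the enrolment column is a placeholder, NOT an MOE figure.
rates = pd.DataFrame({
    "year":       ["2003-4", "2003-4", "2003-4"],
    "grade":      [6, 7, 8],
    "dropout":    [7.0, 6.3, 7.3],
    "repetition": [17.3, 11.8, 14.5],
    "promoted":   [75.7, 81.9, 78.1],
    "enrolment":  [60_000, 52_000, 45_000],   # hypothetical weights
})

def enrolment_weighted(group: pd.DataFrame) -> pd.Series:
    """Collapse grade-level rates into one annual rate, weighting by enrolment."""
    w = group["enrolment"]
    return pd.Series({col: (group[col] * w).sum() / w.sum()
                      for col in ["dropout", "repetition", "promoted"]})

annual = (rates.groupby("year")[["dropout", "repetition", "promoted", "enrolment"]]
               .apply(enrolment_weighted))
print(annual.round(1))
```

With the actual grade enrolments substituted for the placeholders, this would test whether the published annual totals are enrolment-weighted averages; as published, they cannot be independently re-derived.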
From the analysis, it was evident that dropout and repetition were proxy indicators of risk. There was no evidence in EMIS reports that the MOE considered or analysed the 12 and “other” factors collected using Q9.
Dropout statistics
Within the period of 2003–2014, Eritrea experienced the highest dropout rate of 16.1% at middle school level in 2005/06 (see
Disaggregated data showed that boys were the most vulnerable: 20.7% of boys dropped out during the 2005/06 academic year compared with 9.1% of girls.
While this information helped in establishing the extent of the problem, it left gaps on what factors led to dropout, particularly in years such as 2005/06 and 2013/14 when spikes occurred (16.1% and 8.6% dropout respectively), or why some years had favourable rates. The data did not explain why particular grades were more prone to dropout than others (e.g. 20.4% of grade 8 students dropped out in 2005/6 compared with 13.1% of grade 6 students in the same year). Neither did the data explain the significant differences between the sexes, nor whether there were any specific strategies that enabled students to drop in and persist with schooling.
Repetition of grades
All the Eritrea EMIS reports cited grade repetition as a major contributor to wastage. Consistent with
As with dropout, boys were more likely than girls to repeat grades. In three out of the 12 years analysed, girls’ repetition rates fell below 10%; the highest ever recorded was 18.4% in 2013/14. For boys, the lowest ever recorded was 11.6% in 2008/9 and the highest was 19.7% in 2014/15. Much as the research did not deliberately apply a gender lens when analysing the data, it was evident that gender mattered. Overall, average repetition rates across the 12 years were 11.8% (12.8% for males and 10.3% for females). This was significantly higher than the sub-Saharan regional trends for 2000, 2005 and 2010 reported by
Despite the stipulation of official policy that grade repetition “is a prime concern of the MOE [and] that students should not and need not repeat classes” (MOE, 2003, p. 10), EMIS reported on the phenomenon regularly without providing any insights or qualitative explanations as to why students were repeating grades and why certain years returned significantly higher repetition rates than others. There was no reference to the MOE policy (2003) regarding repeating. Arguably, repetition was contributing to the learning crisis.
Other risks of dropout
In its questionnaire, the Eritrean EMIS acknowledged that there were a range of causes of dropout from schooling.
An examination of Q9 templates across the years noted that ever since the MOE developed the list of 12 causes in 1993 (
Findings on EMIS data analysis and discussion
The named first author of this paper was positioned as an insider/outsider and this vantage location complemented the processes of data collection and analysis (
Annual middle school dropout rates 2003/4–2014/15
| Year | Dropout boys (%) | Dropout girls (%) | Dropout total (%) |
|---|---|---|---|
| 2003–4 | 7.2 | 6.3 | 6.8 |
| 2004–5 | 9 | 7 | 8.2 |
| 2005–6 | 20.7 | 9.1 | 16.1 |
| 2006–7 | 7.8 | 6.1 | 7.1 |
| 2007–8 | 6.7 | 5.3 | 6.1 |
| 2008–9 | 9.7 | 5.5 | 7.9 |
| 2009–10 | 7.1 | 4.1 | 5.7 |
| 2010–11 | 6.3 | 4.4 | 5.5 |
| 2011–12 | 7.2 | 4.9 | 6.2 |
| 2012–13 | 7.2 | 4.5 | 6 |
| 2013–14 | 5.7 | 10.9 | 8.6 |
| 2014–15 | 9.5 | 5.1 | 7.5 |
Source(s): MOE EMIS, Asmara
What emerged was that, over successive years, the UIS figures for OOSC grew exponentially while the EMIS figures shrank. Secondly, the UIS figures were more than twice those of the MOE. If indeed the population of Eritrea was about three million, then every year at least 10% of the population was out of school. However, by 2016 UIS had ceased reporting on Eritrea altogether. Nevertheless, it is conceded that the insider/outsider status challenged the researcher’s ability to remain objective and maintain theoretical sensitivity (
Three important issues emerged about EMIS and we explore them further below.
Inconsistent population figures
The first issue was inconsistency in population figures.
Annual repetition rates for middle school level
| Year | Repetition boys (%) | Repetition girls (%) | Repetition total (%) |
|---|---|---|---|
| 2004–5 | 19.6 | 13.8 | 17.3 |
| 2014–15 | 19.7 | 11.6 | 16 |
| 2013–14 | 11.8 | 18.4 | 15.5 |
| 2006–7 | 15.9 | 14 | 15 |
| 2007–8 | 16 | 13.3 | 14.9 |
| 2003–4 | 15.9 | 13.1 | 14.8 |
| 2012–13 | 16.9 | 10.4 | 14 |
| 2011–12 | 15.9 | 10.8 | 13.6 |
| 2005–6 | 12.7 | 11.2 | 12.1 |
| 2009–10 | 12.7 | 9 | 11.1 |
| 2010–11 | 12.6 | 8.8 | 10.9 |
| 2008–9 | 11.6 | 8.6 | 10.4 |
Source(s): MOE EMIS, Asmara (with authors’ italic)
Notably, the population estimate by the NSO was the same for 2001 and 2010. An analysis of the annual EMIS reports revealed that, starting with the 2010/2011 academic year, the MOE discontinued the practice of giving population figures as the basis for EMIS calculations. The EMIS report of 2011 did not explain the discontinuation. However, from Table 46 of the annual EMIS booklet (2010/11), which gives detailed information on age-specific enrolment, it was possible to prorate the school-aged population of 2,155,951 to arrive at a population figure of 3,142,228 for that year. This was evidence of an inconsistency as described by
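The proration itself is simple arithmetic. As a sketch of the kind of back-calculation involved (the report does not state the factor it used), if the age-specific enrolment tables imply a school-aged population $S$ and the school-aged group is assumed to make up a share $s$ of the total population, then

$$\hat{P}_{\text{total}} = \frac{S}{s}, \qquad \text{where here } s = \frac{2{,}155{,}951}{3{,}142{,}228} \approx 0.686 \;(68.6\%).$$

In other words, a school-aged share of roughly 68.6% would link the two figures; that share is back-calculated purely for illustration and is not a value given in the EMIS report.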
The declining population estimates had attracted the interest of
Different population estimates by source. Compiled by the authors
What emerged from the analysis was that there was no single reliable population estimate for Eritrea. Triangulation of figures from the various sources confirmed the absence of a reliable estimate.
This was not an indictment of Eritrea’s EMIS but rather an exemplification of what
Observed mismatches for OOSC trends
The second issue about EMIS was mismatched trends. Reviewing the EMIS data revealed a change in the historical figures for out-of-school children (OOSC). As illustrated in
Extract from EMIS report showing original flow rates (with authors’ highlight)
Extract from EMIS report showing adjusted flow rates (with authors’ highlight)
Mismatch of age of children enrolled in school
The third issue was over-enrolment. The EMIS publication of 2012/13 showed that 65,916 eight-year-old children had enrolled into grade two, which was greater than the estimated total population of 65,042 eight-year-old children in the country (
Extract from EMIS report showing age 8 enrolment anomalies (authors’ highlight)
There had not been any reported influx of refugees. Barring the possibility that invisible children (
Yet, the abiding sense was that the three discrepancies were not necessarily cases of ineptitude among the processors of EMIS data. Rather, they presented a dilemma to the research, given that the data were from secondary sources.
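Of the three, the age-eight anomaly is the most directly checkable. Expressed as an age-specific enrolment ratio using the figures quoted above:

$$\text{ASER}_{8} = \frac{E_{8}}{P_{8}} \times 100 = \frac{65{,}916}{65{,}042} \times 100 \approx 101.3\%,$$

a ratio above 100%, which is arithmetically impossible unless the enrolment count or the population estimate (or both) is wrong, or unless children not captured in the population estimate are enrolling.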
Official response about anomalies and possible remedies
The three key issues were: the population figures that formed the basis for the EMIS computations by the MOE; the rationale for recalculating the historical out-of-school statistics from 1991 to 2010; and the possibility that more children enrolled in Grade Two than the entire population estimate for that age group.
Analysis of the interview data from MOE and NSO officials indicated that they considered information on population statistics in the country to be a highly sensitive issue. An official from the NSO cautioned:
No other authority than the NSO whether in the country or elsewhere has the mandate to determine the population of Eritrea. … Partners working in Eritrea should be wary of regurgitating wrong figures purported to be from so-called reliable sources (interviewed 7 December 2015).
The first author recalled the animated discussion between MOE and UNESCO officials regarding UNPD figures. However, he also concluded that the different figures cited by various government departments stemmed from the understanding that no changes could be made to the population estimates unless such changes were communicated by the NSO. If a department did not need to publish routine data the way EMIS does with education data, then it possibly did not take the initiative to seek updates from the NSO. Further analysis of the interviews with the MOE EMIS officials revealed that the MOE used different mathematical models to calculate population growth figures, and this may have contributed to the higher population estimates.
The NSO acknowledged having asked the MOE to discontinue the practice of citing their population denominator in the EMIS publications. During the interview with an official of the NSO, the first author asked whether the changes in population estimates could have been arbitrary, for example from 3.2 million to 2.9 million for 2010. The official explained, “Such changes are normal since data cleaning processes are continuous and no database is ever exact” (interviewed 7 December 2015). That explanation was consistent with literature (
Our analyses showed that the problem of differing population figures was not unique to Eritrean government data sources. United Nations agencies working in Eritrea also had different figures, even though they depended on the same UNPD for their respective population estimates. In a nutshell, the estimate of Eritrea’s national population varied with the source.
The second observation was about a reduction in the OOSC population across the years. The change had resulted in a 45% reduction in the OOSC rates. The MOE explained that the change was a result of adjusting the national intake age from seven years to six years. It was not immediately clear why the adjustment stretched back to 1991. In the interview discussion with the NSO, the officer explained that because the MOE had consistently used inappropriate mathematical models to calculate population growth figures, errors had accumulated and exaggerated the population size. That was why the EMIS unit had to revise their historical population “projections” and recalculate the flow rates. He concluded, “That was why the NSO asked the Ministry to discontinue the practice of citing their population denominator in the EMIS publications” (interviewed 7 December 2015).
In the analysis of the quantitative data, the report of 2012/2013 showed that 66,245 eight-year-old children were enrolled, whereas the total estimated population of eight-year-old children that year was 65,042 (
School-level efforts to capture qualitative information on students
A key finding related to the way schools documented student risks and how their innovative efforts were neither harnessed nor acknowledged by the EMIS. At both schools, the first author probed how qualitative information on risks was collected and stored and whether such school-level information fed into the annual EMIS reports.
The two schools had different ways of collecting the information. In N’Hafash, the principal kept an incident book in which teachers on duty reported any concerns they noted during the week; it was not confidential. In Beles, the school maintained a confidential logbook on incidents affecting students’ participation, with the principal and deputy principal as its only custodians.
Curiously, the information on risks that schools gathered was not compiled and escalated to the MOE to be woven into the narrative of EMIS. Thus, EMIS reports did not benefit from school-level insights, particularly on how teachers and students converted risk cases into drop-in and completion cases. Such insights would have contributed to analysing the “other” category of Q9: the risks students faced, what actions had been taken, the challenges encountered and the lessons drawn from dealing with the risks to foster drop-in.
Secondly, given that EMIS data are collected once a year, the absence of a feedback mechanism denies the process a potential source of comprehensive data that would enrich the analyses and conclusions of EMIS reports. A teacher at Beles asked whether the researchers had encountered such a system of school-level inputs to a national EMIS elsewhere in Africa. None of the researchers had seen such a practice, conceding that this could be a general gap in traditional EMIS systems. Our view is that teachers and school communities would benefit from MOE guidance on how to systematically capture, document, analyse, report and utilise locally generated qualitative information on the risks of dropout and, more importantly, on how schools could collate their experiences of drop-in and convert them into important lessons for the education system and its multiple stakeholders.
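As a purely hypothetical sketch of what such MOE guidance might standardise, a school-level log entry could be structured so that it aggregates cleanly into EMIS alongside the existing Q9 categories while keeping drop-in visible; none of the field names below come from any existing MOE instrument.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class RiskIncident:
    """Hypothetical school-level log entry; field names are illustrative only."""
    school_code: str                   # identifier already used in EMIS returns
    student_id: str                    # pseudonymised at school level
    grade: int
    date_recorded: date
    q9_category: Optional[int] = None  # one of the 12 preselected Q9 causes, if applicable
    other_cause: Optional[str] = None  # free text behind the "other" tick-box
    action_taken: str = ""             # e.g. home visit, counselling, peer support
    outcome: str = "at_risk"           # "at_risk" | "dropped_out" | "dropped_in"
    notes: str = ""                    # confidential detail retained at the school

def summarise(incidents: list[RiskIncident]) -> dict[str, int]:
    """Count outcomes so a school can report drop-in alongside dropout."""
    counts: dict[str, int] = {}
    for incident in incidents:
        counts[incident.outcome] = counts.get(incident.outcome, 0) + 1
    return counts
```

Only the aggregated counts and anonymised “other” causes would need to travel upward with the annual questionnaire; the confidential notes could remain at the school.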
There was a procedural omission regarding who accessed the EMIS reports once they were published. While it was laudable that the MOE published the annual EMIS reports within a year of collecting the data, circulation of the reports was limited to the MOE and regional education offices. Schools never received copies of the EMIS reports whose data they helped to compile, which inadvertently excluded them from tapping into the findings of the EMIS questionnaires. Similarly, the language in the EMIS reports was highly technical, the graphs and formulae were complex, and the underlying assumptions needed to be simplified for a wider audience of diverse stakeholders.
Conclusion
The key finding from the foregoing analysis is the dilemma of using existing EMIS data for reporting on dropout, drop-in or even learning outcomes. We believe that wastage in the school system directly relates to the way information on schools is gathered, processed, managed and disseminated through EMIS. Our analysis has contributed to a diagnosis of EMIS reporting, especially the likelihood of underreporting of OOSC. Additionally, the research identified that capturing information on drop-in has the potential to enrich government action and accountability for students. More broadly, we also note that EMIS remains a mystery to stakeholders at several levels who may not grasp the technical complexities, iterative steps, critical decisions and what informs the final product. Consumers of EMIS have a stake in, and the potential to participate in, enriching EMIS.
Lastly, whereas the burden of providing an accurate EMIS rests with the national government, non-government players also influence the final outcomes, and they can contribute the financial and technical resources needed to support strong and credible EMIS in sub-Saharan Africa. In that way, EMIS can meaningfully contribute towards mitigating the learning crisis.
References
© 2025 Khabusi Emmanuel Kamuli and Moses Oketch. Published under the Creative Commons Attribution (CC BY 4.0) licence: http://creativecommons.org/licences/by/4.0/legalcode
