
COMMENT: A case for improved education data

For this year’s International Day of the Girl, ONE launched the Toughest Places for a Girl to Get an Education index. The index aims to shed light on the countries that must be prioritised to ensure that girls worldwide are in school and learning.

Africa Check recently reviewed one of the index’s findings – that women in our top 10 countries have spent less than two years of their lives in school – and rated it as “misleading”.

I’d like to take the opportunity to further unpack the data analysis that went into our finding, as well as to position it within the broader context of education sector data and its limitations. This will illustrate the tough choices we as researchers and data analysts must sometimes make as a result.

Any index relies on subjective decision-making


Data products like indices may, on their face, seem entirely neutral and impartial to an outside observer. When looking at quantitative data – whether numbers, scores or rankings – it can be easy to assume that there was a straightforward, objective way of compiling and analysing the underlying data.

But any index – including our girls’ education index – necessarily relies on a certain degree of subjective decision-making: Should we include this indicator or that one? Should we cut off data points that are more than five years old, or ten? Should we rely on a particular source – and where it’s missing data points, should we try to fill in the gaps?

In putting together our new girls’ education index, we made every effort to ensure that the choices we inevitably had to make were well-grounded in evidence – and to be sure, we then had the methodology peer-reviewed by education data experts.

As Africa Check mentions in its review of our findings, we relied primarily on UNESCO’s Institute for Statistics database (UIS), and where there were gaps we filled those in with data from the World Inequality Database on Education (WIDE). We used WIDE data only if a country lacked both administrative data and household survey data from UIS.
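The fallback rule described above can be sketched as follows. This is an illustrative sketch, not ONE's actual pipeline; the country entries and values below are placeholders, not real UIS or WIDE figures.

```python
# Sketch of the gap-filling rule: prefer UIS data (administrative, then
# household survey), and fall back to WIDE only when a country has neither
# from UIS. Not ONE's actual code; data values are hypothetical.

def mean_years_schooling(country, uis_admin, uis_survey, wide):
    """Return (value, source) for a country, applying the fallback rule.

    Each argument after `country` is a dict mapping country -> value.
    """
    for source, table in (("UIS admin", uis_admin), ("UIS survey", uis_survey)):
        value = table.get(country)
        if value is not None:
            return value, source
    value = wide.get(country)
    if value is not None:
        return value, "WIDE"
    return None, "missing"

# Hypothetical inputs for illustration only:
uis_admin = {"Country A": 0.95}
uis_survey = {}
wide = {"Country A": 2.07, "Country B": 1.4}

print(mean_years_schooling("Country A", uis_admin, uis_survey, wide))
print(mean_years_schooling("Country B", uis_admin, uis_survey, wide))
```

Country A shows the preference for UIS data even when a WIDE value exists; Country B shows the fallback to WIDE when UIS has no data.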

Africa Check considered our finding misleading in large part because we elected to plug in WIDE data where UIS data wasn’t available, noting that UIS reflects mean years of schooling for women ages 25 and above, whereas the WIDE indicator reflects mean years of schooling for women ages 20 to 24.

Why ONE chose to fill data gaps this way


We certainly agree that these two indicators are not identical. That said, there is an explanation for why we chose to fill data gaps in this way.  

Plugging in WIDE data is likely to increase, not decrease, the mean years of schooling for women in these countries. That is because 20- to 24-year-olds are likely to be more educated than the older women, aged 25 and above, whose education levels are captured by UIS data.

This makes it likely that mean years of schooling would be even lower if UIS data were available across all 10 countries. It’s then reasonable to state that women within the toughest countries have less than two years of schooling. Our choice to include WIDE data therefore likely made the 1.99 years average, if anything, an overestimate.

We see this borne out in the data Africa Check cited in its review – UIS data puts the mean years of schooling for women aged 25 and above in Burkina Faso at 0.95, whereas WIDE puts the mean years of schooling for women aged 20-24 at 2.07. In Ethiopia, it was 1.23 versus 5.07, respectively; in Guinea, 0.73 versus 3.81; and so on.

Having to make these judgment calls is obviously not ideal, and it’s why we were emphatic in our index report about the need for improved education sector data. Of the 193 UN member states, 37% were missing data for four or more indicators for the period 2010–2016.

This left us with 122 countries, and over half of those were also missing data points. This meant that just 58 of the 193 original countries (30%) had complete data – taking into account both UIS and WIDE as sources. Among the countries with insufficient data to include in the index were Canada, France, Germany, Somalia and Syria.
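As a rough consistency check, the country counts above line up with the reported percentages (a sketch using the rounded figures given in the text):

```python
# Consistency check on the reported counts (illustrative arithmetic only).
total = 193                              # UN member states
missing_four_plus = round(total * 0.37)  # 37% missing four or more indicators
remaining = total - missing_four_plus    # countries left for the index
complete = 58                            # countries with complete data

print(remaining)                      # 122
print(round(complete / total * 100))  # 30 (% with complete data)
```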

Improvements in education sector data critical


Our analysis of data gaps underscores the need for improved education sector data, particularly gender-disaggregated data. Improvements in education sector data are critical not only for those of us relying on it to do these types of analyses, but also to get all children globally in school and learning.

How can we effectively track progress, or know which interventions are working and which aren’t, if we can’t turn to data that tells us one way or the other?

Africa Check’s review of our mean years of schooling finding mentions that we relied on a “patchwork” of data and that’s absolutely right. The truth is that many data products, including indices, often do. And they must – until data sources improve to the point where we don’t have to make tough choices to fill data gaps.

In the meantime, the best we can do is ensure that our assumptions are reliable and that our choices to fill gaps are based on logic and evidence. We look forward to working with Africa Check going forward to ensure that future ONE data products are similarly based on cogent methodologies and rooted in rigorous evidence.

Megan O’Donnell is ONE’s policy manager for gender issues.

 

Additional reading:

https://africacheck.org/reports/anti-poverty-group-misleads-with-patchwork-of-data-on-womens-education-in-africa/
