The Good, the Bad, and the Ugly of Using IATI Results Data

September 19, 2017
Taryn Davis
Results Data

It didn’t surprise me to learn that when Ministry of Finance officials conduct trainings on the Aid Management Platform for Village Chiefs, CSOs, and citizens throughout the districts of Malawi, they are almost immediately asked:

“What were the results of these projects? What were the outcomes?”

It didn’t just matter what development organizations said they would do — it also mattered what they actually did.

We’ve heard the same question echoed by a number of agriculture practitioners interviewed as part of the Initiative for Open Ag Funding. When asked what information they need to make better decisions about where and how to implement their own projects, many replied:

“We want to know — if [others] were successful — what did they do? If they weren’t successful, what shouldn’t we do?”

This interest in understanding what went right (or wrong) came not from wanting to point fingers, but from a genuine desire to learn. In considering how to publish and share data, the importance of, and interest in, learning cannot be overstated.

At MERL Tech DC earlier this month, we decided to explore the International Aid Transparency Initiative (IATI) format, currently used by organizations and governments around the world for publishing aid and results data. For this hands-on exercise, we printed out different types of projects from the D-Portal website, including any evaluation documents attached to the publication. We then asked participants to answer the following questions about each project:

  1. What were the successes of the project?
  2. What could be replicated?
  3. What are the pitfalls to be avoided?
  4. Where did it fail?

Photo: Taryn Davis leading participants through using IATI results data at MERL Tech DC

We then discussed whether participants were (or were not) able to answer these questions with the data provided. Here is the Good, the Bad, and the Ugly of what participants shared:

The Good

  1. Many were impressed that this data, particularly the evaluation documents, was shared and made public at all, not hidden behind closed doors.
  2. For those analyzing evaluation documents, the narrative was more helpful for answering our four questions than the indicators alone, which lack context.
  3. One attendee noted that this data would be helpful in planning project designs for business development purposes.

The Bad

  1. There were challenges with data quality. For example, some data were missing units, making it difficult to tell whether the number “50” was a percentage, a dollar amount, or something else (see the sketch after this list).
  2. Some found the organizations’ own evaluation formats easier to understand than what was displayed on D-Portal, while others were given evaluations in more complex formats that made it difficult to identify key takeaways. Overall, readability varied, and format matters: sometimes fewer columns are more readable. There is a fine line between too little information (missing units) and a fire hose of information (gigantic documents).
  3. Because the attachments included more content in narrative form, they were more helpful in answering our four questions than the indicators entered in the IATI standard alone.
  4. There were no visualizations offering a quick takeaway on project success. A visual aid would help users grasp “successes” and “failures” more quickly, without spending as much time digging and comparing, leaving more time to look at specific cases and focus on the narrative.
  5. Some data were missing time periods, making it hard for prospective users to know how relevant the data would be.
  6. Data were often disorganized and included spelling mistakes.
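Several of these “Bad” items are mechanically checkable. Below is a minimal sketch, using only Python’s standard library, of screening published IATI activity XML for two of the gaps participants hit: indicators with no declared measure (is “50” a percent or a dollar amount?) and result periods with no dates. The file name activities.xml is a placeholder, and the element names follow the IATI 2.x results schema; treat this as an illustration, not a full validator.

```python
# Sketch: flag IATI results data with missing units or missing time periods.
# Assumes an IATI 2.x activity file has been downloaded as "activities.xml".
import xml.etree.ElementTree as ET

tree = ET.parse("activities.xml")
for activity in tree.getroot().iter("iati-activity"):
    for indicator in activity.iter("indicator"):
        title = indicator.findtext("title/narrative", default="(untitled)")
        # The "measure" attribute says whether a value is a unit count,
        # a percentage, etc. Without it, "50" is uninterpretable.
        if indicator.get("measure") is None:
            print(f"{title}: no measure - is '50' a percent, dollars, or what?")
        for period in indicator.iter("period"):
            start = period.find("period-start")
            end = period.find("period-end")
            if (start is None or end is None
                    or not (start.get("iso-date") and end.get("iso-date"))):
                print(f"{title}: period missing start/end dates")
```

Once the data is in a standard, parseable format, each of the other items on the list is a similarly small check.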

The Ugly

  1. Reading the data “felt like reading the SAT”: challenging to comprehend.
  2. The data and documents weren’t typically forthcoming about challenges and lessons learned.
  3. Participants weren’t able to discern any real, tangible learning that could be practically applied to other projects.

Fortunately, the “Bad” elements can be addressed relatively easily. We’ve spent time reviewing organizations’ results data published in IATI and providing feedback to improve data quality and make the data cleaner and easier to understand.

However, the “Ugly” elements are really key for organizations that want to share their results data. To move beyond a “transparency gold star” and achieve shared learning and better development, organizations need to ask themselves:

“Are we publishing the right information, and are we publishing it in a usable format?”

As we noted earlier, it’s not just the indicators that data users are interested in, but how projects achieved (or didn’t achieve) their targets. Users want to engage with the “L” in Monitoring, Evaluation, Research, and Learning (MERL). For organizations, this might be as simple as reporting “Citizens weren’t interested in adding quinoa to their diet, so it didn’t sell as much as expected,” or “The Village Chief was well respected and supported the project, which really helped citizens gain trust and attend our trainings.”
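The IATI results schema already has room for this kind of narrative. As a minimal sketch (again in Python’s standard library, with invented project details), here is how a lesson like the quinoa example could travel with the indicator itself, alongside an explicit measure and time period so the numbers stay interpretable:

```python
# Sketch: an IATI-style <result> element that carries the "L" as a
# narrative comment on the actual value. Element and attribute names
# follow the IATI 2.x results schema; the project details are invented.
import xml.etree.ElementTree as ET

result = ET.Element("result", type="2")  # type="2" = outcome

indicator = ET.SubElement(result, "indicator", measure="2")  # "2" = percentage
title = ET.SubElement(indicator, "title")
ET.SubElement(title, "narrative").text = "Households regularly consuming quinoa"

period = ET.SubElement(indicator, "period")
ET.SubElement(period, "period-start", {"iso-date": "2016-01-01"})  # explicit dates
ET.SubElement(period, "period-end", {"iso-date": "2016-12-31"})
ET.SubElement(period, "target", value="75")
actual = ET.SubElement(period, "actual", value="50")

# The "L": a short narrative explaining *why* the target was missed.
comment = ET.SubElement(actual, "comment")
ET.SubElement(comment, "narrative").text = (
    "Citizens weren't interested in adding quinoa to their diet, "
    "so uptake fell short of expectations."
)

print(ET.tostring(result, encoding="unicode"))
```

The point is less the specific elements than the principle: the explanation of why a target was hit or missed lives next to the number, rather than buried in a hundred-page PDF.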

This learning is important for organizations internally, enabling them to understand and learn from their own data; it is also important for the wider development community. In hindsight, what do you wish you had known about implementing an irrigation project in rural Tanzania before you started? That’s what we should be sharing.

To do this, we must update our data publishing formats (and mindsets) so that we can answer questions like: How did this project succeed? What can be replicated? What are the pitfalls to avoid? Where did it fail? Answering these kinds of questions, and enabling actual learning, should be a key goal for all projects and programs; and it should not feel like an SAT exam every time we do so.

Image Credit: Reid Porter, InterAction
