Who benefits from development data?
At what point does more data make us less effective?
Over the past several months we have been teasing out how the quality, sharing, and use of results data can be improved to inform better development outcomes. One of the emerging themes has been a tension between the needs of our “macro” and “micro” selves: what our “macro self” (e.g., staff at development partner headquarters) wants to know in order to inform big-picture strategy, and what our “micro self” (e.g., local-level government and implementers) wants to know to make sure projects deliver at the local level.
This tension isn’t confined to Results Data Initiative work – the what, how, and for whom of data in development is an entrenched debate with no easy answers. Some examples from our recent work, however, may offer insight. Just two weeks ago I spoke with a Honduran development partner who shared that he’s grappling with this same dilemma in his organization:
We have an internal indicator problem. We want to know everything possible about a community in order to analyze and micro-target our projects. But for the community itself, the only questions that matter are: Did I eat today or not? Did my child go to school today or not? Did I experience crime or not? Period.
This point is something we should all be considering: if we can’t answer the local community’s questions, are we focusing on the right information? And who are we helping when we collect more data to answer different questions?
To be clear: we are not saying that there is no value in data reporting. As a community we need to honestly evaluate our impact, and data isn’t automatically “bad” when it cannot be used directly at the local level. Rather, the issue is one of priorities and resource investment in data. While there is no foolproof way to strike the right balance between micro- and macro-relevant indicators, the challenge is still worth problematizing – especially as we begin working towards the Sustainable Development Goals (SDGs).
Ms. Jenna Slotin of the UN Foundation wrote, quite rightly, that “If designed correctly, the [SDG] indicators should highlight real people’s experiences and whether their lives are getting better or worse.” But with 230 indicators across 169 targets – and given very real gaps in institutional capacity, resources, and methodologies – whom does this data-driven approach truly benefit? Are we appeasing our macro-level desire for more data at the expense of our micro-level efficacy? At DG, we are assembling qualitative and quantitative evidence to inform how we, as a field, invest in results indicator data moving forward.
As we continue to refine our approach to the SDGs, I hope we prioritize finding the right balance between our macro and micro obligations. Leaving no one behind means both better-planned programs and answers to the questions that matter most to the world’s most vulnerable.
Image taken inside a Sri Lankan health facility – papers used for data collection and reporting.