Last week, the Governance Data Alliance launched a new report (led by our friends at AidData) on the use of governance data, drawing on responses from 500+ development policy actors in 126 countries (in government, development agencies, and civil society). This report is an excellent capstone to 2016, which has been a big year for understanding the role of evidence in policy and programming decisions. This year at DG, we launched our Results Data Initiative (RDI) reports, based on approximately 500 in-depth interviews with development actors at all levels of government — from the President’s office to the local health clinic — in Tanzania, Ghana, and Sri Lanka. These and many other recent efforts (for a quick sample, see here, here, and here) are shedding light on key barriers to, and opportunities for, evidence-based policy.
As 2016 wraps up, I am excited to see these initial forays give way to a 2017 where these programs reach critical mass. Here are three reasons why:
- We’ve finally gotten serious about evidence on how to insert evidence into policy
As a community, we have too often failed to “take our own medicine” by collecting and analyzing evidence on when, how, and why programming on the use of evidence actually influences policy decisions. The “build it and they will come” appeal was too strong for too long. Some quick hits that resonate from research efforts like those listed above:
- Context matters. Shocking, I know, but policy actors care about not only what your evidence says, but also where that evidence comes from. Taking the time to understand why your “best practices” may/may not fit in the country/district/agency you are engaging is crucial.
- Incentives matter. Again, something we all knew but have too often disregarded. Individual incentives (recognition, advancement, autonomy) for using evidence are often limited.
- Policy space matters. It matters very little that I spend days/weeks preparing a beautiful, evidence-based argument for a new program if my department budget is set in stone. For data/evidence to matter, users have to be able to do something differently with that information.
- Joined-up data matter. We found that the gap between results and resources data makes evidence-informed planning challenging. Many potential data use cases will similarly require linking two or more data sources (e.g. linking open contracting and beneficial ownership data to enable both anticorruption and market analysis efforts).
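The joined-up data point above can be illustrated with a minimal sketch: joining a results dataset to a resources (disbursement) dataset on a shared project identifier. All field names and values here are hypothetical illustrations, not DG’s actual schema.

```python
# Sketch of "joined-up data": linking results records to resources
# (disbursement) records on a shared project identifier.
# All field names and values below are hypothetical.

results = [
    {"project_id": "P001", "indicator": "clinics_built", "value": 12},
    {"project_id": "P002", "indicator": "students_enrolled", "value": 3400},
]

resources = [
    {"project_id": "P001", "disbursed_usd": 1_200_000},
    {"project_id": "P003", "disbursed_usd": 500_000},  # no results reported
]

def join_results_resources(results, resources):
    """Inner-join the two datasets on project_id."""
    spend = {r["project_id"]: r["disbursed_usd"] for r in resources}
    return [
        {**rec, "disbursed_usd": spend[rec["project_id"]]}
        for rec in results
        if rec["project_id"] in spend
    ]

joined = join_results_resources(results, resources)
# Only P001 appears in both datasets. In practice, the records that
# drop out of the join (P002, P003) are often the finding that matters
# most to planners: results without known spending, and vice versa.
```

Real linking efforts are far messier, of course — inconsistent identifiers across systems are precisely why the results–resources gap persists.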
- Learning and adaptation are “in”
From USAID’s work with groups like Mercy Corps and Feedback Labs’ Practical Adaptation Network on adaptive programming to our friends at the Open Contracting Partnership and Global Integrity launching open learning plans, 2016 has been an exciting step forward in adaptive programming (note: expect “Learning Director/Manager/Consultant” to be the “Data Scientist” development job title of 2017). Inserting evidence into policy requires a move from thinking of governments as monolithic entities to a nuanced understanding of complex systems of interministerial politics, organizational culture, individual personalities, and positive deviants (in short, getting at How Change Happens). This seems to be a moment when the traditional top-down, evidence-based policy workstreams are colliding in helpful and exciting ways with the bottom-up, Doing Development Differently/Thinking and Working Politically/Adaptive Learning agendas.
- Funders are paying attention
Last week, we were thrilled to announce the next steps for RDI, in which we aim to put our learning and evidence into practice. Crucially, a program of this ambition requires working with a funder that “gets it”, recognizing that adaptation, learning, and open communication are a must for success. Other funders, like the Hewlett Foundation, the Arnold Foundation, the US Government, DFID, and others, are similarly making once-scarce resources more readily available to innovators in this space.
Our New Year’s Resolution
At DG, we enter 2017 energized and commit ourselves to more actively generating concrete evidence and more systematically documenting and sharing our learning as we undertake programming on exciting challenges in open contracting, linking resources and results data, SDG monitoring and management, and beyond. Let’s work together to make 2017 a banner year for evidence use to improve lives.
If an organization with an existing culture of learning and adaptation gets lucky, and an innovative funding opportunity appears, the result can be a perfect storm for changing everything. The Results Data Initiative was that perfect storm for DG. RDI confirmed that simply building technology and supplying data is not enough to ensure that data are actually used. It also allowed us to test our assumptions and develop new solutions, methodologies, and approaches to implement our work more effectively.
Fifteen years ago, AMP development was led by, and co-designed with, multiple partner country governments and international organizations. From a single implementation, AMP has grown into 25 implementations globally. Through this growth, DG has learned crucial lessons about building systems that support the use of data for decision-making.
This past March, DG launched an AMP module that helps the Ministry of Finance, Planning, and Economic Development in Uganda track aid disbursements in their existing Program Budgeting System. This blog examines DG’s technical process and the specific solutions used to overcome AMP-Program Budgeting System (PBS) integration challenges.