Interoperability as a Cornerstone of Resilient Digital Systems
How many apps sit unused on your phone, downloaded once to meet an urgent need and never opened again? If you needed them again, would you even remember they were there?
Now, imagine your phone is a government system, and those apps are digital solutions meant to address public service challenges. Over time, as more platforms, dashboards, and tools are introduced, they pile up in silos: disconnected, hard to navigate, and ultimately less effective at delivering long-term impact for the people they were designed to serve.
Governments, the private sector, and NGOs have more digital tools and data at their fingertips than ever before, yet much of it remains untapped, underutilized, or completely unused. Trapped in silos, these data cannot be effectively leveraged to drive evidence-based decision-making or improve public services. Beyond limiting the sustainable impact of these solutions, siloed systems also hinder AI readiness and reduce the value data could add to digital public infrastructure (DPI) – both of which depend on interoperability and open data flows to achieve digital transformation.
At Development Gateway: An IREX Venture (DG), interoperability is central to how we approach digital transformation and design solutions that remain useful and usable long after a program's lifecycle. In this blog, we unpack what interoperability means to us, drawing on insights from our ‘a Livestock Information Vision for Ethiopia’ (aLIVE) program, before highlighting why interoperability is needed now more than ever.
Solving Data Silos in Africa: Insights from aLIVE
DG has a long history of developing and implementing interoperable digital solutions across a wide range of sectors in Africa. Drawing on Steele and Orrell’s 2017 definition of interoperability as “the ability to join-up data from different sources in a standardized and contextualized way,” we see it as the most effective pathway to transform siloed systems into integrated ecosystems while safeguarding data sovereignty and ensuring local ownership.
But solving data silos is not just a technical challenge. As we’ve seen in Ethiopia through our aLIVE program, it requires building trust among stakeholders, aligning global best practices with local priorities, and designing systems to last beyond a program’s lifecycle. In other words, interoperability is as much about people and processes as it is about technology.
Through aLIVE, we are working with Ethiopia’s Ministry of Agriculture to strengthen the country’s livestock data ecosystem. Rather than layering yet another digital tool on top of existing platforms, the program focuses on connecting what already exists and standardizing livestock data so that systems can “talk” to one another. This makes data more usable and timely, while fostering a shift from fragmented reporting to collaborative, evidence-based planning and stronger service delivery.
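As a purely illustrative example – not aLIVE’s actual data model – the short sketch below shows what standardizing records from two hypothetical source systems onto one shared schema can look like, so that data that previously lived in separate silos can be analyzed together. All field names and values are invented for illustration.

```python
# A purely illustrative sketch - not aLIVE's actual data model - of what it means to
# standardize records from two hypothetical source systems onto one shared schema so
# that previously siloed data can be analyzed together.
from dataclasses import dataclass
from datetime import date


@dataclass
class LivestockRecord:
    """Hypothetical shared standard that every connected system maps its data onto."""
    region: str
    species: str
    head_count: int
    reported_on: date


def from_vaccination_system(row: dict) -> LivestockRecord:
    """Map a row exported by a hypothetical vaccination-campaign database."""
    return LivestockRecord(
        region=row["woreda"],
        species=row["animal_type"].lower(),
        head_count=int(row["animals_treated"]),
        reported_on=date.fromisoformat(row["date"]),
    )


def from_market_survey(row: dict) -> LivestockRecord:
    """Map a row exported by a hypothetical market-survey spreadsheet."""
    return LivestockRecord(
        region=row["Region"],
        species=row["Species"].lower(),
        head_count=int(row["Head count"]),
        reported_on=date.fromisoformat(row["Survey date"]),
    )


# Once both sources speak the same schema, their records can be combined and compared.
records = [
    from_vaccination_system({"woreda": "Bahir Dar Zuria", "animal_type": "Cattle",
                             "animals_treated": "1200", "date": "2024-05-02"}),
    from_market_survey({"Region": "Bahir Dar Zuria", "Species": "Cattle",
                        "Head count": "950", "Survey date": "2024-05-10"}),
]
```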
Some key insights from aLIVE’s work in Ethiopia’s livestock sector include:
- Build trust first: Interoperability depends on collaboration, not just between different systems but among the people involved, too. By bringing government agencies and technical experts to the same table, aLIVE has opened a space for dialogue, transparency, and shared ownership of solutions.
- Prioritize the local context: While global standards provide guidance, they only succeed when rooted in local priorities. aLIVE’s co-design process ensures the digital solutions put forward are practical, relevant, and usable within Ethiopia’s unique context.
- Design for sustainability: Technology is only as strong as the ecosystem that supports it. By investing in capacity building – training system owners and data users on how to get the most out of the standardized livestock data – aLIVE is ensuring that the improvements in livestock data will outlast the program itself.
Together, these insights show that interoperability is not only about connecting technical systems, but also about empowering the people and processes behind them to work toward achieving shared goals. By embedding trust, local ownership, and sustainability into its design, aLIVE demonstrates how data silos can be transformed into interoperable ecosystems that strengthen decision-making, improve service delivery, and prepare systems for the future.
By ensuring that systems are future-ready, interoperability also lays the foundations for AI readiness. With interoperable livestock data, Ethiopia can harness AI tools to predict disease outbreaks, improve food security, and strengthen climate resilience. And because livestock data connects to wider systems – trade, public health, environmental management, and economic planning – it becomes part of the country’s broader digital public infrastructure (DPI). In this way, aLIVE is not only transforming Ethiopia’s livestock data systems but also contributing to a stronger, more resilient digital foundation for the country as a whole.
Why Interoperability Is Needed More Than Ever
The funding freeze from USAID and other US government agencies in early 2025 sent shockwaves through the international development sector. Its immediate effects were devastating: 34,880 metric tons of emergency food aid bound for Ethiopia sat rotting in shipping containers off the coast of Djibouti, while more than 20 million people worldwide lost access to life-saving HIV treatment and services.
But beyond these visible crises, the longer-term systemic consequences may prove just as damaging. The sudden halt of humanitarian projects destabilized fragile data ecosystems, disrupted data collection efforts, and undermined many of the digital systems that governments and NGOs rely on to deliver essential services. While less headline-grabbing than the rotting of emergency food aid or interruptions to lifesaving treatments, the erosion of data for service delivery is a hidden systemic crisis – one with the potential to weaken development and humanitarian efforts for years to come.
This moment underscores why interoperability is more important than ever. As aLIVE demonstrates in Ethiopia, interoperable data systems are not just about connecting platforms, but about ensuring that countries can build resilience, harness AI, and extend the value of their data across multiple sectors. In the face of funding volatility, interoperability becomes a safeguard against systemic fragility, helping countries sustain service delivery even when donor support diminishes.
Disruption also creates opportunity. The withdrawal of funding has highlighted the need to move away from historic overreliance on external funding toward more country-led, localized, and whole-of-system approaches. As our CEO, Josh Powell, along with the CEO of Results for Development, Gina Lagomarsino, and the former CEO of Global Partnership for Sustainable Development Data, Claire Melamed, wrote in a recent blog on the data crisis following USAID’s withdrawal, interoperability can be a powerful vehicle for achieving this form of digital transformation.
At this moment of uncertainty for the international development sector, interoperability should not be an afterthought, but a cornerstone for ensuring digital resilience. By enabling open data flows, strengthening local ownership, and embedding sustainability into design, it offers a path for countries to withstand funding shocks, protect sovereignty, and drive long-term digital transformation.
***
Read our white paper titled ‘Demystifying Interoperability,’ which discusses in practical terms what goes into implementing interoperable solutions in partnership with public administrations.
Why Africa Will Define the Next Decade of Digital Public Infrastructure
A reflection by three DG colleagues working across country implementation, partnerships, AI, data governance, and strategic communications
The conversation on Digital Public Infrastructure (DPI) is moving fast, and this was clear during the three-day Global DPI Summit in Cape Town, South Africa, in November 2025. The vision was bold: a future where identity, payments, and data exchange unlock development at scale. As the CEO of EkStep, Shankar Maruwada, highlighted during one of the Summit’s side events on AI and DPI, language is powerful, and a shared vocabulary ensures alignment among technologists, policymakers, implementers, and funders. This is most visible in the DPI conversation itself, where shared language has brought stakeholders together at the global level, and where the narrative is now shifting from high-level frameworks to the tools, governance models, and partnerships that support actual service delivery.
While the energy at this Summit felt different, we also sensed a growing tension. DPI is gaining endorsement from governments, philanthropists, multilaterals, and private-sector actors, but it remains deeply complex. As colleagues based in Africa who work across countries on digital transformation, data governance, AI, interoperability, partnerships, policy, and communications, this is a space in which we have quietly been working for years. This piece reflects what resonated, what challenged us, and where we think the conversation must go next.

1. The Conversation Has Shifted: From Idealism to Implementation Reality
From the Summit, it was clear that there was a sense of urgency to move towards implementation and ensure that DPI works for countries and serves citizens. Panel discussions showcased country examples but also asked difficult questions: how do we finance and sustain DPI? How do we move fast while avoiding vendor lock-in? How do we ensure interoperability?
The 50-in-5 Campaign is growing and now spans 32 countries. We saw concrete commitments, such as the 10-year Public-Private Partnership (PPP) model in Palestine and production-grade systems replacing pilots. This signals a collective appetite for implementing DPI and making it work. The conversation on implementation also brought in the messier realities of implementing DPI, moving us away from concepts and rhetoric: political transitions, capacity (not only technical but strategic), ethics, and culture all shape what is possible. This on-the-ground reality is where African experiences matter, bringing valuable realism to the global conversation and making the trade-offs explicit.
2. Trust and Inclusion: The Backbone of DPI
If there is one theme that came up in every session, it was trust. It was highlighted as a core building block for ensuring that digital systems survive political cycles and earn public buy-in. Trust was framed through various lenses at the Summit – procedural (focusing on safeguards and regulation), operational (systems that work), and relational (citizens believing the services provided will benefit them). A panelist from Kenya gave an example that illustrates this point: his 18-year-old son completed 80% of his passport application online using eCitizen (Kenya’s portal for government information and services), only to have to repeat biometric steps he had already completed for his national ID. This raises the question: why isn’t data that has already been collected used to improve service delivery? It is a technical question about interoperability, but also a question of institutional trust and people’s comfort with risk.
In Africa, perspectives on trust are especially important. This Research ICT Africa project highlights that DPI in African countries cannot be copied from other regions and cannot be separated from issues such as uneven access. Trust building must begin with understanding these realities.
3. Interoperability is Non-Negotiable but is Still Poorly Understood
It was great to finally see interoperability discussed as a non-negotiable rather than a nice-to-have; it is what makes DPI more valuable. We heard examples of this in practice, such as the Inter-American Development Bank’s tech platform, which supports digital integration across Latin America and the Caribbean by allowing citizens to access digital services with a single account across multiple platforms and countries. Similarly, the Government of Indonesia shared insights from its Government Services Liaison System, which is designed with auto-scaling capabilities: the system automatically increases capacity during spikes in demand, such as during the rollout of large public programs. As the country’s Director of Digital Government Application, Yessi Arnaz Ferrari, noted, “This is exactly what allows us to grow from 65 agencies to 435 connected institutions, handle over 58 million data transactions this year, and still maintain about 99.9% uptime.”
Despite this, we noticed that many of the DPI initiatives shared as examples at the Summit focused on digital ID and payments, which are seen as more fundable and politically attractive. Data exchange remains the missing middle and is challenging to achieve. An example from Burundi on a data exchange deployment highlighted language barriers (French as the main platform language, with local languages lacking technical terms), limited access (only 40% of the population has a smartphone), and hesitancy among some local offices to participate in integration across registries. From DG’s experience helping governments repurpose legacy systems, share data assets, and build data exchange mechanisms fit for interoperability, we know that this invisible area is both hard to get right and essential for DPI going forward.
Where does the conversation go from here, and what must the global community learn?
As the global DPI movement moves from ‘what’ to ‘how’ and enters an operational decade, African experiences will determine what this decade looks like. As Dr James Mwangi, CEO of Equity Bank, noted at the Summit, Africa’s current 2-3% contribution to global GDP, despite the continent accounting for 18% of the world’s population, is not a deficit but evidence of massive untapped potential that private sector-led DPI can unlock. He argued that Africa’s underdevelopment makes it the world’s biggest growth opportunity and emphasized that African capital must be deployed by Africans for African infrastructure rather than relying on grants and philanthropy. Most critically, he reminded the audience that by 2050, Africa will hold 42% of the world’s labour force – 2.6 billion people with a mean age of 18 – and that, with the right DPI foundation, this workforce can serve global markets from Africa, flipping the prevailing narrative on migration and dependency.
In our internal reflections, one analogy that sparked discussion was viewing DPI as a public good – similar to roads. The comparison helped us to surface questions about who funds, builds, governs, and coordinates systems so they serve the public interest. While real-world arrangements are often far messier and differ depending on context, the analogy was useful in highlighting risks around fragmentation and misalignment when coordination is weak.
Against this backdrop, DPI will not succeed in Africa on advocacy and branding alone. The continent will test assumptions, reveal blind spots, and ensure that innovation is practical and works for the people it aims to serve.
Notably, civil society voices were largely missing from the conversation. While some Open Government Partnership colleagues were part of the discussions, there is a need for more civil society organizations and a wider range of government actors in this space. With these voices missing, the narrative of speed risks overshadowing institutional readiness and leaving DPI to deepen digital inequalities within countries, to the disadvantage of the very marginalised communities it seeks to benefit.
The campaign may be ahead of the evidence on economic value and socio-technical realities, but research projects such as this one by Research ICT Africa will help ensure that DPI in Africa is grounded in real political economies and institutional incentives and works for the people it serves. This moves the conversation to the next stage, stepping away from “how do countries adopt existing DPI models?” to “what does DPI need to succeed in African countries?” Communities of practice and informal networks, such as ImNet (hosted by Open Cities Lab), which bring together African DPI implementers, will help ensure a coordinated approach in support of inclusive economic opportunity, digital sovereignty, and improved public service delivery.
The global community has started to listen, and the continent’s realities on access, linguistic diversity, informal economies, and affordability challenges will bring clarity that global frameworks often overlook. On that front, Africa will define DPI on its own terms and show the world the way forward.
Building Useful & Usable AI: A New Tool to Curb Procurement Corruption
Public procurement accounts for one-third of government spending across the globe, totaling around 10 trillion dollars a year. Despite producing millions of pages of procurement data annually, governments make the vast majority of this information either inaccessible to the public or available in formats that make it hard to extract meaningful insights on government spending. Only 2.8% of public procurement documents are published as open data, and significantly fewer in a format that would allow journalists, civil society, or the private sector to flag potential cases of corruption.
Given the sheer volume of transactions that take place globally, large sums of money involved, complexity of the process, and the close interaction between public officials and businesses, public procurement is particularly vulnerable to malign acts. Globally, an estimated 20-25% of government spending on public contracts falls prey to corruption, with this percentage rising to 50% or higher in certain regions. One of the enablers is the lack of transparent processes and accessible data.
As such, the HackCorruption program – a collaboration between Accountability Lab and Development Gateway: An IREX Venture (DG) – has developed a new contract summary and analysis tool powered by artificial intelligence (AI) that has been submitted for registration as a Digital Public Good (DPG).
See the GitHub page below for documentation: https://devgateway.github.io/automatic-contract-summarizer-portal/
Powered by a large language model (LLM) to extract important information from lengthy government contracts, the tool enables easier flagging of corruption risks – allowing users to analyze and summarize thousands of documents in hours with minimal human intervention. By automating the data analysis process and highlighting trends in the often opaque procedures of public procurement, it provides users with the timely, accessible data required to strengthen transparency and ensure good governance in public contracting.
Identifying The Core Challenge
The idea for this AI tool came from the lessons learned and insights gained during the regional hackathons organized through the HackCorruption program. While working alongside the winning teams from HackCorruption Latin America in Colombia, we saw that many were struggling with the same challenge: how to efficiently and effectively process public procurement contracts. The need for such a tool was further highlighted during HackCorruption South-East Asia, where teams expressed the same desire to process contracts without the significant time and cost of manual review.

Manually reviewing these lengthy contracts is a slow, costly, and ineffective process. Extracting important information from contract documents – such as names, amounts, durations, a list of goods and services requested and provided, and so on – requires a significant amount of time and effort for a human to complete. However, with the help of AI technology, this process can be completed in a matter of hours or days at most.
Drawing on the experiences of these hackathon teams, the HackCorruption team began work on creating a tool that could be used to streamline the process of analyzing large numbers of contracts to ensure a more efficient method for identifying corruption risks.
Using AI to Untangle Opacity in Green Funds: A Case Study From HackCorruption Balkans
Green Funds Transparency was one of the winning hackathon teams that particularly struggled to get the contract data they needed for their tool, a website focused on two interconnected challenges:
- The lack of transparency in the allocation and use of green funds: These contracts, which include funds for climate resilience and environmental infrastructure, often pass through a complex system of procurement channels. This means the documentation is prone to becoming fragmented, inconsistent, and difficult to audit.
- The broader issue of procurement opacity: Government contracts are lengthy documents that are, for the most part, poorly structured. This makes them difficult to analyze and search through, decreasing the ability to detect irregularities or patterns that could indicate corruption or mismanagement.

As they began to develop their tool, the Green Funds Transparency team relied on the manual review of procurement documents, donor reports, and government disclosures. This was a slow, labor-intensive process and, due to inconsistent formats and limited access to source data, often resulted in an incomplete analysis. Further complicating matters was the fact that many of the contracts they needed to review were not machine-readable, making it almost impossible to cross-reference them with environmental impact metrics or financial flows.
The team believes that an AI tool that can extract structured data from unstructured contract documents would be transformative.
“The tool could help track fund flows, identify discrepancies, and flag contracts that lack environmental safeguards or performance indicators. It would also allow for real-time monitoring, pattern recognition across large datasets, and proactive identification of red flags – empowering oversight institutions, civil society, and journalists to act on credible insights rather than anecdotal suspicions,” the team adds.
While beneficial, it is vital to also take into account safety and ethical considerations that accompany the use of AI for such purposes. This includes ensuring that the methodology for training the AI is transparent and that the data it is trained on doesn’t reinforce biases in historical funding practices. According to Green Funds Transparency, these safeguards should include:
- Human oversight in interpreting outputs
- Clear documentation of training data
- Protection of sensitive data
- Mechanisms for correcting errors or misclassifications
- The responsible use of predictive analytics to avoid false positives
With the right safeguards in place, AI and the specific tools we create with it can fundamentally change the way we analyze vast amounts of data, while vastly reducing the money, time, and effort required to do so. Such tools can enable a small team or even an individual reporter to more accurately identify corruption risks in government contracts, empowering them to push for greater transparency and accountability and, ultimately, to enhance good governance in their sector.

Building useful and usable AI based on actual needs
After assessing the needs of several HackCorruption teams, such as Green Funds Transparency, plans were put in place for the creation of an AI model that could extract or summarize information in contracts of any size (from a few kilobytes to several megabytes) in either PDF or DOCX format. While initially focused on contracts written in English, the tool will support additional languages in the future. The final requirement – and the top priority for success – was that the model should be free to use and able to run either in the cloud or on local machines using consumer-grade GPUs, meaning the tool can be used without spending thousands of dollars on specialized equipment.
Designing the tool to operate within these parameters ensures that it can serve anyone who is interested – including journalists, academics, and NGOs. By helping users extract relevant information from contracts that exist only as PDF or Word documents, and whose information cannot be easily accessed from a database or in a standardized format such as the Open Contracting Data Standard (OCDS), this AI tool makes it possible for those working on anti-corruption to do their work more easily and quickly than ever before.
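To make the “consumer-grade hardware” requirement concrete, the sketch below shows one common way a free, open-weight LLM can be loaded locally with 4-bit quantization. The model name, libraries (Hugging Face transformers with bitsandbytes), and prompt are illustrative assumptions, not the tool’s actual implementation; see the GitHub documentation linked above for the real setup.

```python
# A minimal sketch - not the Contract Summarizer's actual code - of running a free,
# open-weight LLM on a consumer-grade GPU using 4-bit quantization. The model name,
# libraries, and prompt below are illustrative assumptions only.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

MODEL_ID = "mistralai/Mistral-7B-Instruct-v0.2"  # placeholder open-weight model

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    # 4-bit quantization keeps the memory footprint small enough for a consumer GPU
    quantization_config=BitsAndBytesConfig(load_in_4bit=True),
    device_map="auto",  # place layers on the available GPU, spilling to CPU if needed
)


def extract_key_fields(contract_text: str, max_new_tokens: int = 512) -> str:
    """Ask the model to pull the key fields out of a single contract."""
    prompt = (
        "Extract the contract ID, contract name, implementation dates, and the list "
        "of goods and services from the following contract:\n\n" + contract_text
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt we fed in.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```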
Training AI to understand and extract meaning from public contracts
HackCorruption’s AI model is designed to read documents from a directory one by one and generate a text file that follows a predetermined format that contains all relevant information extracted. This includes the contract ID, contract name, implementation dates, lists of goods and services, and so on. There were two different phases required to create this tool:
1. Training: The first task was to select a small randomized subset of documents and train the LLM so that it could learn to recognize and extract important information and generate the analysis in the desired output format. With a subset of around 100 contracts, we generated results that were clear, valid, and cohesive. The training methodology we used was supervised training, where each contract was paired with a human-written summary; the model used these pairs to “learn” how to process new documents with a similar format or layout and produce summaries resembling the examples provided. An illustrative version of this summary structure appears in the sketch following this list.

2. Data Extraction: Once the LLM had been trained and minimal error was detected in its analysis of the documents, it was used to extract information from all the remaining documents. While the tool can assist in extracting useful information from contracts, it is important to recognize that some cases may require additional manual checks. The source code comes with guidelines on how to set the tool up and begin using it for data extraction, and it is structured so that it can be easily upgraded with future AI models.
3. Data Quality: Once the AI tool summarizes each contract, it applies a series of post-processing steps aimed at minimizing the possibility of hallucinations.
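To make this workflow concrete, here is a minimal, hypothetical sketch of the extraction loop: it reads each PDF contract from a folder, asks a language model to fill a fixed summary layout, runs a crude post-check against hallucinated contract IDs, and writes one text file per contract. The file layout, field names, and helper functions are illustrative assumptions rather than the tool’s actual code; the full documentation lives in the GitHub repository linked above.

```python
# A hypothetical sketch of the batch workflow described above - not the tool's actual
# source code. Field names, the summary template, and the helpers are illustrative only.
from pathlib import Path

from pypdf import PdfReader  # pip install pypdf

# Illustrative output layout, based on the fields mentioned above.
SUMMARY_TEMPLATE = (
    "Contract ID: {contract_id}\n"
    "Contract name: {contract_name}\n"
    "Implementation dates: {dates}\n"
    "Goods and services: {goods_and_services}\n"
)


def read_contract(path: Path) -> str:
    """Extract the raw text of a PDF contract (DOCX handling omitted for brevity)."""
    return "\n".join(page.extract_text() or "" for page in PdfReader(path).pages)


def summarize(contract_text: str) -> str:
    """Stand-in for the LLM call (a local open-weight model or a cloud API), which
    would be prompted to return text following the SUMMARY_TEMPLATE layout."""
    raise NotImplementedError("plug in the model call from the previous sketch")


def passes_basic_checks(summary: str, contract_text: str) -> bool:
    """Toy post-processing step: reject summaries whose contract ID never appears in
    the source text - a crude guard against hallucinated values."""
    for line in summary.splitlines():
        if line.lower().startswith("contract id:"):
            value = line.split(":", 1)[1].strip()
            return bool(value) and value in contract_text
    return False


def process_directory(contracts_dir: str, output_dir: str) -> None:
    """Summarize every PDF in contracts_dir, writing one .txt summary per contract."""
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    for pdf_path in sorted(Path(contracts_dir).glob("*.pdf")):
        text = read_contract(pdf_path)
        summary = summarize(text)
        if not passes_basic_checks(summary, text):
            summary = "NEEDS MANUAL REVIEW\n" + summary  # flag for a human analyst
        (out / f"{pdf_path.stem}.txt").write_text(summary, encoding="utf-8")
```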
Lessons Learned & Recommendations
We developed the tool by listening to and learning from HackCorruption teams across different regions. And yet, that was only the beginning of the learning journey. Here are three lessons we learned while conceptualizing and creating this AI tool:
- Cloud is better, but local still works: Access to cloud resources can speed up the training process, but it is not mandatory – you can still use your PC or laptop and obtain good results.
- Flexibility and adaptability are key: AI is an evolving area with new LLMs, tools, and free libraries appearing almost every week. As such, it is paramount to keep well-informed and to ensure that the source code written for these new tools can be easily upgraded to ensure they don’t quickly become outdated.
- Learn from others: As all those involved in HackCorruption over the past three years know, collaboration is the key to success. There is a huge community of developers and researchers who offer their code examples, tools, and experience for free. Learn from them and see what you can incorporate into your own projects.

While this is an ongoing project and there will be many more lessons to learn as we progress, there are two recommendations that we have identified to date that can help potential users and developers looking to utilize AI to its fullest potential:
- Invest some time to learn the basics of how AI really works. You don’t need to read all the papers nor understand all the math or coding behind it, but, as with everything, the better you understand the basics, the more likely you are to use the tool effectively.
- It’s important to keep in mind that not all libraries are open source, especially those for processing PDF files. As such, it may take time to change the code later if a library substitution is needed.
With our application submitted for this AI Contract Summarizer tool to be registered as a Digital Public Good (DPG), it will soon be available for use by all working to curb corruption in public procurement. Keep an eye out for a follow-up blog that announces the tool’s successful registration as a DPG!
Accelerating Institutions: How DG’s 25 Years Create Unique Value for AI
As we celebrate 25 years of DG’s efforts supporting institutions to use digital and data to accelerate and amplify their impact, governments, companies, organizations, and people globally are grappling with how to invest in AI tools with this same goal in mind. Technology is evolving rapidly, the hype cycle is overwhelming, and many implementations are falling short of their intended value.
Often, as any good technologist will likely tell you, the technology is the easy part, and failure lies in a combination of design, process, capacity, and commitment. The next few years will require thoughtful and adaptive efforts to rightsize and rework AI implementations that create real value and impact in the lives of students, smallholder farmers, patients, and others who rely on them.
At DG, our engagement with AI follows the same use case-driven, complexity-respecting capabilities and methods that we have used for decades:
- We look beyond hype to identify and drive genuine value. Our processes begin with understanding people’s missions, capacity, and objectives, distilled into use cases and problem definitions that ensure that institutional solutions are both grounded in context and truly impactful for our partners.
- We have a deep understanding of working with governments and their existing technology ecosystem. Effective AI solutions rely upon high-quality, accessible, and well-governed data. We have decades of experience supporting staff across levels in government ministries – and other large institutions – in assessing, updating, and making interoperable their existing ecosystem of legacy data systems. The correct combination of technical, political, analytical, and change management skills needed for this work is invaluable (and rare) in positioning Low- and Middle-Income Countries (LMIC) public sector institutions to meaningfully adopt and use AI to become more effective.
- We are impartial and outcome-oriented. DG is a technology-agnostic organization. When appropriate, we prioritize using open-source tools that prevent vendor lock-in and preserve flexibility. We open-source our own technology and have registered multiple Digital Public Goods. Governments, organizations, and other institutions may have existing agreements with technology vendors, and we are comfortable evaluating and working with proprietary software. This flexibility allows us to effectively act as a trusted advisor for leaders choosing among a rapidly changing array of possible AI applications, rather than a vendor for any single product.
- We balance ethics, sustainability, and pragmatism. Our partners operate in complex environments with complex challenges. We do not believe that technology can solve every problem, and it must be paired with the right intentions and processes. Flexibility, compromise, and attention to long-term sustainability are key to successful digital transformation and resilience.
With these principles in mind, DG is engaging in AI through four pillars:

- Strengthening data pipelines and governance
For decades, DG has delivered systems interoperability and strengthened data pipelines, helping institutions maximize their data systems. Our Aid Management Program (AMP), for example, launched twenty years ago with the belief that standardized aid data could strengthen country ownership of the development process and align aid with national priorities.
We’ve also shared insights from transforming Ethiopia’s livestock data systems into a resilient, interoperable data ecosystem and have informed best practices in data governance across health, agriculture, and other sectors from the farm level to cross-border contexts.
Grounded in this expertise, we will continue to enable the flow of interoperable data, establish governance protocols, unlock inter-institutional data sharing, promote privacy, and center ethical considerations that bolster successful AI readiness and responsible adoption.
- Supporting Pragmatic AI Visioning
DG applies our technology-agnostic approach to define use case-driven requirements for AI and assess potential solutions. Through a co-design process, we work with stakeholders to identify use cases and existing workflows before developing technical requirements and solutions.
This same approach guides our assessment of organizational AI readiness and informs our development of AI strategies and roadmaps. Our visioning takes factors such as existing digital infrastructure, technology policies, and personnel capacity into consideration, balancing ambition and pragmatism. This approach allows for AI strategies that are context-driven and fit for the future. Through the Early Grades Education Activity (Asas for short), led by IREX in Jordan, we are supporting the development of a new AI strategy for the Ministry of Education.
- Identifying and Sandboxing the Right Tools
Landscaping – DG’s experienced software development team and history of translating global best practices into local solutions position us well to support the execution of AI visions. Once use cases for AI are established and prioritized, in turn creating a decision framework, we can systematically scan the fast-changing landscape of AI projects, software tools, and services that could potentially meet stakeholder and institutional needs.
Our evaluation framework considers not only the technical capability, interoperability, and scalability, but also sustainability factors such as licensing models, energy efficiency, portability, and policy alignment (i.e., data sovereignty regulations in country). We also assess local and regional AI efforts. This landscaping clarifies which tools merit deeper exploration for next steps and highlights the risks and limitations of potential solutions.
Sandboxing – When appropriate, DG can develop a controlled environment, known as an AI Sandbox, for safe experimentation and red teaming. Open-source solutions can be directly deployed in our sandbox, while proprietary tools will be evaluated via structured demos and reviews with prospective vendors. We capture metrics – such as potential efficiency gains, accuracy and reliability, governance alignment, user experience, and total cost of ownership – that will inform if and how specific tools can be taken further. This hands-on exploration enables stakeholders to see how tools perform against real requirements and builds a tangible foundation for decision-making for leaders in the changing AI space.
- Guiding Adoption and Scale
Institutions worldwide are grappling with how AI pilots will scale. Whether we are analyzing the results of the landscaping and sandboxing or partnering with a government that has already deployed solutions, we guide further adoption and scale.
This pillar of engagement includes technical development, accompaniment of enterprise adoption, and the recalibration of AI roadmaps into actionable plans for scale, incorporating recommendations on what should be adopted, adapted, and avoided. Recommendations can also cover software, hardware, and data standards, ethical safeguards, sustainable development, and capacity-building programs, all of which improve the chances of systems being sustainable over the long term.
Technology Changes, Institutional Use Stays (Largely) the Same
Remarkably, while AI is potentially revolutionary and transformational, our methods for engaging with it remain within DG’s standard principles and approaches. Our partnership with IREX embodies this ethos, combining DG’s data systems and governance expertise with IREX’s institutional performance leadership to help civil society and government partners responsibly unlock AI’s potential. The technology may be new, yet ensuring we are context-driven, ethically grounded, and sustainability-focused has been DG’s ethos for decades.
Excitingly, we are already engaging with AI through our partnership with IREX and in other work across sectors and borders. Looking forward, these areas of focus will guide our support to partners across our priority sectors: education, agriculture, health, and governance.
Reflecting on 3 Years of Digital Advisory Support for Agricultural Transformation
Building resilient food systems is essential to ensuring global access to nutritious, sustainable diets. Yet, despite agriculture and food production increasing globally over recent decades, acute food insecurity and malnutrition continue to rise, particularly among the world’s most vulnerable populations.
Addressing this challenge requires systemic change across the entire food ecosystem, from production and monitoring to distribution and quality assurance. It also involves equipping smallholder farmers with timely data and user-friendly digital tools that empower them to make informed decisions, improve productivity, better integrate into existing value chains, and deal with the interconnected challenges they face.
However, merely supplying data and digital agriculture tools is insufficient to enable the systemic change required to strengthen the resilience of food systems. To achieve this kind of long-term impact, advisory services on how best to use these data and tools must be provided, alongside local capacity building and training, so that such support remains available when and where it is needed most.
DAS: Improving the Efficiency of Agriculture Data Use
To support this need, Development Gateway (DG), in partnership with digital development experts, Jengalab, and training experts, TechChange, with support from the International Fund for Agricultural Development (IFAD), implemented the Digital Advisory Support Services for Accelerated Rural Transformation (DAS) program from 2022 to 2025. By providing technical support for ICT4D activities within IFAD-financed programs, DAS aimed to improve the livelihoods of rural smallholder farmers through facilitating digital transformation in agriculture. The program’s target regions were East and Southern Africa, Central and West Africa, and the Near East and North Africa.

With the DAS program concluding in March 2025, we reflect on its impact in integrating ICT4D solutions to improve rural livelihoods, the effectiveness of its knowledge transfer and capacity building for sustaining gains, and the lessons it offers for future agricultural technology programs.

Integration of ICT4D Solutions
To support IFAD – the only multilateral development institution that focuses exclusively on transforming rural economies and food systems – with its digital strategy, DAS provided on-demand advisory services to support ICT4D activities for IFAD-financed programs. By filling gaps related to technical support, DAS strengthened the ability of these programs to build, maintain, and scale their use of technology in supporting farmers.
While the full impact of the DAS support services is still unfolding and may only become apparent in the months and years to come, some impacts have already been identified, as captured in the examples below.
Nigeria
In Nigeria, the DAS program conducted an assessment of the digital agriculture ecosystem. Serving as foundational research for various project activities – including online training on the principles of digital development and interoperability, as well as in-person monitoring and evaluation (M&E) training – this assessment helped identify key digitalization challenges in the country, such as connectivity and affordability, and generated recommendations for strategic partnerships to address them.
The assessment, which identified data gaps, prioritized interventions, and fostered collaboration with stakeholders, facilitated the development of an ICT4D strategy with the Nigerian government, laying a solid foundation for local ICT-enabled projects and providing a strong case for scaling digital innovations in IFAD-supported programs.
Morocco
In Morocco, DAS provided ICT4D support to the Integrated Rural Development Project of the Mountain Areas in the Oriental Region (PADERMO), contributing to the development of specific actions within the project. These included a pilot initiative to monitor beehives digitally, the establishment of digital agricultural cooperatives, and the use of digital services to promote and market agrifood products.
In particular, support from DAS partners enhanced the integration of digital tools, such as piezometers – geotechnical sensors that measure pore water pressure and water level in soil and rock – during the design phase. To further improve impact in future interventions, the team noted that digital skills training and a deeper understanding of implementation partners’ practical capacities would be highly beneficial.
Tanzania
In Tanzania, DAS assisted with the development of the country’s first e-agriculture strategy – the Digital Agriculture Strategy (2025–2030) – by conducting a situational analysis and producing a zero draft. This enabled the Ministry of Agriculture (MoA) to better leverage digital agriculture technologies and enhance data-driven decision-making in the country. DAS further supported capacity-building activities for the government and implementing partners on how to effectively integrate digital solutions for agriculture, informed by digitalization trends and practices from other countries.
Following the development of the e-agriculture strategy, IFAD, in collaboration with the United Nations Capital Development Fund (UNCDF) and the Food and Agriculture Organization of the United Nations (FAO), formed the Joint Programme on Data for Digital Agricultural Transformation, supporting the MoA to better leverage data and strengthen interoperability to transform the country’s agriculture sector and provide more efficient service delivery. Through the technical support provided by DAS and the subsequent Joint Programme, 10 innovative agri-tech projects are expected to scale up, with additional catalytic investment provided for 3 agri-tech projects. It is estimated that this will result in more than 500,000 smallholder farmers – including 300,000 women and 100,000 youth – gaining access to enhanced digital services.

Impact of Capacity Building Trainings
By identifying specific learning areas within digital agriculture of greatest relevance to IFAD-financed programs, Development Gateway and TechChange designed a number of training courses that provided capacity building for those working in agriculture, as well as for mid-to-senior level officials and technical staff in ministries, agencies, and organizations.
A total of 19 in-person training sessions were held on a variety of topics, reaching 940 participants. When surveyed more than six months after completing their training, 92% of participants stated that much or all of the course content remained relevant to their work and that they continue to frequently support colleagues using the concepts learned.
In addition to in-person training sessions, the DAS ICT4Ag Digital Classroom Series was created, containing four IFAD-sponsored self-paced training courses and ensuring the continued impact of the initiative. Two examples of IFAD-sponsored training courses on digital agriculture are highlighted below.
Basics of Digital Agriculture Ecosystems and Interoperability
In this course, participants learned about the causes and consequences of fragmented data, as well as the tools and techniques needed to achieve digital transformation for rural agriculture. Through exploring models of enabling environments for digital agriculture, participants learned how to assess the maturity of their own agriculture ecosystem. This course further outlines use cases and best practices for interoperable data use in digital agriculture, linking data from different sources in a standardized, context-aware way.
Access the free 2-hour self-paced Basics of Digital Agriculture Ecosystems and Interoperability course.
Basics of Digital Rural Finance for Agriculture
Introducing participants to digital finance in agriculture, this course explores how digital services and tools can be applied to an agricultural project or intervention with a financial component. Completing the course enabled participants to draw insights from IFAD’s project portfolio on the relevance of digital technologies to advance financial inclusion in rural areas, as well as to utilize IFAD’s financial strategy to deliver these tools and services to smallholder farmers.
It further equipped participants with knowledge of how to accurately assess the financial needs of smallholder farmers and how best to leverage digital technologies to address them. Through the use of real examples and case studies, the course made understanding the principles underlying the design and implementation of digital financial initiatives as simple and applicable as possible.
Access the free 2-hour self-paced Basics of Digital Rural Finance for Agriculture course.

Lessons Learned & Recommendations
Over its three-year lifespan, the DAS program made significant strides in supporting the digital transformation of agricultural systems across IFAD’s portfolio in East and Southern Africa, Central and West Africa, and the Near East and North Africa. Through tailored advisory services, targeted capacity building, and the development of knowledge products, the program strengthened the ability of governments, project teams, and implementing partners to integrate digital tools in a way that is inclusive, sustainable, and aligned with local contexts. However, as with every program, there were lessons to be learned on how to improve the impact of such programs in the future.
- Deploying ICT4D solutions in resource-constrained settings with low levels of digital literacy requires a strategic, context-aware approach that grounds innovation in practicality.
- Time constraints were a recurring challenge in countries such as Tunisia and Uzbekistan, where short mission durations restricted the depth of engagement. This hindered the ability to address more complex digital needs or provide sufficient capacity-building.
- In scenarios where capacity is limited and advisory engagements are short, it can further be difficult for the Project Management Unit (PMU) staff to define their needs and internalize the steps necessary to drive change and sustainably take recommendations forward to achieve long-term success.
- Programs could have benefited from a more unified approach to digitalizing Monitoring & Evaluation (M&E) processes, as well as involving end users in the design of any tool.
- Although the DAS intervention has proved valuable in a multitude of implementing countries, room for improvement exists in tailoring the digital solutions provided to fit the local context. Feedback suggested that consultants could benefit from a deeper understanding of IFAD’s context and projects to enhance their ability to provide practical and implementable technical solutions. This is particularly important in aligning with the realities on the ground, such as the technical capacities of the implementing partners, the developmental (rather than research-oriented) nature of IFAD projects, and the characteristics of the final beneficiaries – mainly smallholder producers with relatively low digital and technical skills.

Recommendations for Future Programs
As the DAS program closes, it opens more doors for follow-up initiatives that build on the results it has achieved. In particular, the three facets of knowledge transfer, local ownership, and long-term sustainability will remain paramount for any initiative seeking to achieve digital transformation in the rural agriculture sector.
More than that, the following recommendations are suggested for future programs:
- Prioritize preliminary capacity-building activities: It is highly recommended that future digital interventions aiming to achieve similar results carry out preliminary capacity-building activities, such as a dedicated introductory phase for digital training. This will ensure that teams and PMUs are equipped with the necessary skills and knowledge to effectively implement digital solutions, efficiently adapt ICT4D tools to the local context, and maximize their ability to sustainably impact development outcomes.
- Implement M&E frameworks during the design phase: In order to accurately assess the effectiveness and impact of digital initiatives, it is paramount that clear M&E methodologies – as well as a structured approach to introducing these frameworks – be included in the program’s design. Doing so enables teams to easily track progress, effectively maintain alignment with the program’s objectives, and efficiently evaluate how well the digital components were implemented, as well as their ongoing impact on driving systemic change.
- Provide long-term ICT4D support: Finally, it is important to note that short-term advisory missions often lack the depth needed to fully implement programs that provide technical assistance. As such, it is recommended that long-term technical assistance be provided within IFAD’s projects to foster continuity, mentorship, and problem-solving beyond initial design, ensuring impact continues to evolve well after the program’s close.
Democratizing Digital or Digitizing Democracy in 2025?
In 2023, I wrote about the OGP Global Summit in Tallinn, Estonia, and its focus on digital democracy. Estonia put digital democracy on center stage, showcasing e-Estonia and the digitization of government services such as legislation, voting, education, justice, healthcare, banking, taxes, and policing. e-Estonia is an example of digitizing democracy.
Also prevalent at the 2023 Summit was the concept of democratizing digital, or ensuring that digital tools, platforms, and policies are made and governed in ways that reflect democratic values. Open government and digital democracy can improve transparency, accountability, and participation in society. However, open government mechanisms and digital democracy must be usable, accessible, and inclusive.

Former Estonian Prime Minister Kaja Kallas delivers her keynote at the 2023 OGP Summit (Image: OGP)
As we approach the 2025 OGP Summit, the Government of Spain and Cielo Magno have prioritized the themes of People, Institutions, and Technology & Data. These themes signal that democratizing digital and digitizing democracy remain cornerstones for building resilient, effective, and trusted institutions.
However, the Summit takes place against a backdrop of democratic backsliding, declining institutional trust, and the increased influence of Big Tech on policy. If we continue to apply technology to problems that require “people” solutions, distrust in institutions will only deepen. My theory remains that if democratic principles are applied to digital government, then institutions will be stronger, more effective, and more accountable.
What’s Changed Since 2023?
Three key developments have had a significant impact on digital democracy. First, governments globally are grappling with how to invest in and leverage AI – and more specifically, generative AI – for economic growth and service delivery. From racing to build data centers to appointing an AI bot as a minister in Albania, governments do not want to be left on the wrong side of the AI divide. Second, Digital Public Infrastructure (DPI) has gained traction and investment to expand access to services, improve efficiency, and foster innovation by providing the “digital highways” for identity, payments, and data. Third, global leadership shifts have fundamentally changed approaches to regulation, digital sovereignty, protections, and sustainability.
Funding cuts from USAID resulted in the loss of millions of dollars in digital and data programming, as well as forecasted programming and subsequent resources aimed at advancing digital democracy. Cuts from bilateral donors, along with a cacophony of geopolitical dynamics, technological advancements, and regulatory policies, have prompted a focus on digital sovereignty, in turn putting more governments in the driver’s seat of digital decision-making.
Promises and Risks for Digital Democracy: Making Democracy Work, digitally
For those of us who have worked in digital democracy through multiple hype cycles, we have seen technology promise and fail to solve complex problems such as corruption, inclusion, and poverty. AI will not fix government shortcomings. By promising that technology can solve problems, and then in turn failing, institutions are breaking promises to citizens and stakeholders, thereby further eroding trust.
Using Digital to solve the right problems
Advocates for digital democracy should promote tech-assisted government that seeks improvement, not replacement. When equipped with the right tools at the right time, AI can transform efficiency and effectiveness. Automated document processing (ADP) tools, for example, can manage paperwork, reducing staff workloads and errors. According to an Amazon Web Services case study, the Illinois Institute of Technology (IIT) automated its entire transcript workflow – from document intake through international grade conversions, student record updates, and customer relationship management (CRM) integration – reducing the prospective student credential evaluation process from 4-6 weeks to a single day. Transforming efficiency and reducing errors in this way has the potential to improve government delivery and trust, if done with rights-respecting processes and technology.
However, trust in AI is not universal. The 2025 Edelman Trust Barometer shows that nearly 1 in 2 are skeptical of the use of AI in business. Those with high levels of grievance have even deeper distrust of AI. Therefore, applications of AI for government, while having the potential to improve efficiency, may further erode trust. This risk is heightened when AI tools are created in a black box without transparency.
Bridge the deepening divide
AI and DPI are expanding, but their perceived and actual benefits are uneven. Without rights-driven values embedded into technology, these solutions risk reinforcing the inequities they aim to address. For communities without reliable connectivity, digital literacy, or protections, AI and DPI can deepen exclusion. Digital-only ID systems may exclude those without internet access, particularly women and girls. AI is also perceived as a driver of manipulated information, which is pervasive in democratic elections. Manipulation tools are more sophisticated, and protecting civic discourse is harder than ever. Without safeguards against misinformation, harassment, and manipulation, AI can be used to target, discredit, and exclude voices.
Reflections for moving forward
To make democracy work digitally, the Open Government community should:
- Promote hybrid digital and non-digital approaches that enable institutions to work better and more transparently, not replace them.
- Deploy digital and AI tools for specific problems, not as blanket solutions.
- Embed human rights values in design, regulations, and oversight of digital systems.
- Prioritize sustainable funding and local ownership to ensure digital democracy endures.
Digital democracy faces profound promise and risk. Using technology as a tool to deliver on government promises can strengthen institutions, and democratizing those tools ensures that systems reflect and reinforce democratic values. By democratizing digital and digitizing democracy, we can advocate for and build trusted, resilient, and inclusive institutions.
Digital Sovereignty & Open-Source: The Unlikely Duo Shaping DPI
Reflections and insights into the value of open-source data exchange infrastructure and the conditions needed to enable it.
The global digital development landscape is undergoing a profound transformation. As we navigate the mid-2020s, a shift towards national digital sovereignty is fundamentally redefining how countries approach technology, data, and international cooperation. This pivot, particularly evident in 2025, carries significant implications for sustainable development in the global majority countries, including the evolution of Digital Public Infrastructure (DPI) and the role of open-source solutions.
The Accelerating Pivot Towards Digital Sovereignty in 2025
2025 has emerged as a pivotal moment in global geopolitics. In the West, the established liberal paradigm of global governance, which has underpinned sustainable development efforts for at least twenty-five years, has evaporated. A renewed era of realpolitik is emerging to fill the void it has left, with Western countries pulling back from multilateralism, free trade, aid, and the projection of soft power.
In the digital policy sphere, one way in which this new reality is manifesting is as a push towards digital sovereignty by governments and political blocs worldwide.
In the European Union (EU), the State of the Digital Decade 2025 report explicitly calls for EU and Member State action on digital transformation and digital sovereignty. Its recommendations span crucial policy and tech stack areas, including advanced connectivity, semiconductors, secure and sovereign cloud infrastructure, Artificial Intelligence (AI), and cybersecurity.
The trend is equally strong across Africa. For instance, the African Union’s (AU) Malabo Convention on Cyber Security and Personal Data Protection (in force since 2023) and its Continental AI Strategy (2024) both advocate for sovereign data management and AI capabilities, aiming to promote African countries’ control over modern data sources and technology. Moreover, key events in 2025, including the Global AI Summit on Africa in Rwanda and the 2nd Conference on the State of Artificial Intelligence in Africa in Nairobi, have focused on the critical need for digital sovereignty alongside inclusive development tailored for the continent, emphasizing control over critical infrastructure, including digital infrastructure.
In the United States, in February 2025, the Trump administration issued a memorandum authorizing trade tariffs against companies deemed to be “hindering American companies’ global competitiveness”. This move was a direct response to the administration’s belief that European regulations and fines on US tech firms violated “American sovereignty” and compromised “American economic and national security interests.” This ongoing “tussle and negotiation” over tech firm regulation underscores the broader global trend toward greater exertion of sovereignty over digital infrastructure.
This global pivot signifies a fundamental realignment, where national control and self-determination over digital assets and infrastructure are becoming paramount modes through which sovereignty is asserted.
Digital Sovereignty and Open-Source
In our view, as Development Gateway: An IREX Venture (DG), one of the most compelling characteristics of the recent shift towards digital sovereignty, particularly from a digital development perspective, is the heightened emphasis on open source solutions as crucial enablers of digital sovereignty and political agency.
Perhaps counterintuitively, open source has become a technical cornerstone of digital sovereignty, largely because open source technologies free users from dependency on outside vendors. In short, open source allows a greater degree of independent control over digital infrastructure.
Beyond political endorsements, renewed enthusiasm for open source is evident in various initiatives – for instance, in Europe’s Gaia-X project, which is developing open source cloud and data infrastructure. Denmark has gone even further, actively exploring replacing governmental reliance on proprietary solutions like Microsoft’s Windows and Office suite with open source alternatives like Linux and LibreOffice to assert digital sovereignty over civil servants’ daily digital tools. We anticipate that in the near future, more countries will translate political statements in support of digital sovereignty into operational reality through the use of open source technology stacks.
This transition to open source, while promising for sovereignty, is not without its complexities, especially when considered through a development lens. Moving from a proprietary solution to an open-source alternative has significant implications across various operational aspects: maintenance, cybersecurity protocols, data integration strategies, governance frameworks, management practices, staff capacity requirements, and the broader tech stack. How governments approach these issues will significantly affect how successful they ultimately are at “exerting sovereignty” over their digital infrastructure. There is a real risk that poorly thought-through or incomplete approaches to adopting open source will leave critical components of national tech stacks either dependent on external providers or vulnerable to cyberattack.
Data Exchange: The Interoperable Core of Open-Sourced DPI
The current shift towards digital sovereignty will have a significant trickle-down impact on national DPI policies. With open source technologies being prioritized to achieve digital sovereignty, there will need to be a corresponding focus on developing DPI using these open, transparent, and controllable solutions.
As open-source approaches in DPI development and implementation pick up pace, the criticality of data exchanges to the functioning of digital goods and services will also increase. Data exchange, in its simplest form, refers to the process of transferring or sharing data across institutions, systems, platforms, or even individuals. It is recognized as a core pillar of DPI, with organizations like GovStack and the Digital Public Goods Alliance (DPGA) particularly emphasizing its importance in the context of open-source solutions. Data exchange is essential for linking digital identity or e-payment services to specific e-services – from pensions to agricultural subsidies – and any other financial interaction between administrative authorities and individuals.
For governments transitioning to open-source systems, achieving effective data exchange as part of their DPI investments presents pressing challenges. At its heart, data exchange is a complex series of interoperability challenges spanning technological, data, organizational, and human layers. Many government offices struggle with siloed, sector-specific digital systems that impede effective data utilization and decision-making, thereby hindering ambitious sustainable development goals.
To ensure that DPI successfully meets public needs in this new era of digital sovereignty, unlocking interoperability in data exchanges will be crucial. This is essential for ensuring that the datasets powering digital services, evidence-based decision-making, and public transparency are standardized, adhere to critical protocols, and are semantically interoperable (i.e., that machines can read the data in a manner that preserves its meaning and integrity, ensuring it is fit for purpose).
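As a minimal sketch of what adhering to a shared standard can look like in practice, the example below validates an incoming record against a hypothetical standard using Python’s jsonschema library; the schema, field names, and program codes are illustrative assumptions, not an actual GovStack or DPGA specification:

```python
# A minimal sketch of validating exchanged data against a shared standard
# before ingestion. The schema and field names are hypothetical, not an
# actual GovStack or DPGA specification.
from jsonschema import ValidationError, validate

# Hypothetical standard agreed between a subsidy registry and a payment system.
PAYMENT_RECORD_SCHEMA = {
    "type": "object",
    "required": ["beneficiary_id", "program_code", "amount", "currency", "payment_date"],
    "properties": {
        "beneficiary_id": {"type": "string"},
        "program_code": {"type": "string", "enum": ["PENSION", "AGRI_SUBSIDY"]},
        "amount": {"type": "number", "minimum": 0},
        "currency": {"type": "string"},
        "payment_date": {"type": "string"},
    },
}

incoming = {
    "beneficiary_id": "a3f9c2",  # pseudonymized identifier
    "program_code": "AGRI_SUBSIDY",
    "amount": 1500.0,
    "currency": "KES",
    "payment_date": "2025-06-30",
}

try:
    validate(instance=incoming, schema=PAYMENT_RECORD_SCHEMA)
    print("Record conforms to the shared standard; safe to ingest.")
except ValidationError as err:
    print(f"Rejected: {err.message}")
```

Because the agreed schema encodes meaning – required fields, allowed program codes, valid ranges – any conforming system can interpret the record the same way, which is the essence of semantic interoperability.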
As governments move towards integrating open-source solutions into their DPI, they need to be mindful of the technical capacity, cost, and operational needs that accompany this shift. They also need to ensure that they do not over-focus on e-payments and digital ID systems as pillars of DPI to the detriment of data exchanges, which in open-source systems are arguably the cornerstone of DPI.
Operationalizing Digital Sovereignty for Resilient Systems
In 2025, digital sovereignty and open source have become an unlikely duo. From the point of view of governments, the operationalization of digital sovereignty through open source technologies creates opportunities for more resilient data and digital systems, including DPI, to be built.
As this trend accelerates, the function of data exchanges in establishing interoperability between open-source systems is ever more important. Based on our decades of experience building open source digital infrastructure at DG, for governments adopting the open source approach to digital sovereignty, several insights and recommendations stand out:
- Governments must examine existing data governance policies and practices to proactively prevent unexpected blockers when implementing open source solutions, particularly within broader digital transformation strategies.
- It is essential to understand what already exists at a technical level in order to set priorities effectively. Rather than discarding functional systems, the focus should be on connecting them through new data exchange infrastructure, such as shared data standards or semantic interoperability layers (see the sketch after this list). This requires creating data standards and identifying champions who can advocate for and develop them across sectors and regions – a time-consuming process.
- Institutionalizing sustainability from the outset is paramount. Data exchange infrastructure initiatives must integrate human, institutional, financial, and technical components to ensure long-term scalability and viability.
- Finally, there’s a need to build data exchange capabilities now for future innovation. Investing in open source data exchange creates a stronger foundation for implementing innovative technologies, particularly AI, which relies heavily on vast amounts of data. This also necessitates fostering institutional and individual digital literacy, as well as robust data organization and categorization.
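As referenced in the second point above, a semantic interoperability layer can be as simple as a mapping that translates each existing system’s field names into one agreed standard. The sketch below is a minimal, hypothetical illustration – the system names, fields, and codes are assumptions, not any government’s actual schema:

```python
# A minimal sketch of a field-mapping interoperability layer: each legacy
# system keeps its own labels, and a thin translation step renames them to
# a shared standard. All system and field names here are hypothetical.

FIELD_MAPS = {
    "social_registry": {"hh_id": "household_id", "kebele": "admin_area", "members": "household_size"},
    "agri_subsidy_db": {"farm_household": "household_id", "district": "admin_area", "hh_size": "household_size"},
}

def to_standard(system_name: str, record: dict) -> dict:
    """Rename one source system's fields to the shared standard, keeping extras as-is."""
    mapping = FIELD_MAPS[system_name]
    return {mapping.get(key, key): value for key, value in record.items()}

print(to_standard("social_registry", {"hh_id": "H-0042", "kebele": "01-07", "members": 6}))
print(to_standard("agri_subsidy_db", {"farm_household": "H-0042", "district": "01-07", "hh_size": 6}))
```

In practice, agreeing what each field means and who maintains the standard is where the champions mentioned above – and most of the time – are needed.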
In the coming decade, DPI development in the global majority countries is likely to be characterized by a balance between procuring off-the-shelf solutions from diverse markets (including the US, EU, China, and others) and deploying locally developed or supported open source solutions. The increasing focus on data exchange as the central cog of open-sourced DPI will necessitate continuous updates to digital transformation policies, strategies, and open source tools to meet these evolving needs. This path forward requires a careful balance between practical implementation and the political realities of 2025.
The Need for Health Taxes to Raise Revenues
The social sector is operating amid a period of unprecedented uncertainty, with shifting political landscapes and priorities. As a result, there is a significant need for increased domestic resource mobilization to offset declining foreign aid. Economists estimate that, as of mid-2025, official development assistance (ODA) has fallen by between $41 billion and $60 billion from its 2023 level of $228.3 billion, and countries in sub-Saharan Africa are expected to see a 16% to 28% decline in ODA by the end of the year.
These deficits are a sudden shock to the financing that sustains essential government services – from education to health. In Côte d’Ivoire, for example, 8,600 healthcare workers providing essential services have been terminated. In Southeast Asia, Myanmar’s supply of malaria diagnostic tests and drugs has been cut off, despite a tenfold increase in cases compared to previous years (Source).
Within this context, the UN held the Fourth International Conference on Financing for Development, bringing together more than 60 heads of state, high-ranking officials from development banks and multilateral institutions around the world, and 15,000 attendees to address how countries might finance these services. The conference and its outcome document put forward a number of measures, including increased financial transparency and innovative blended finance mechanisms. Conversations on these issues will continue in mid-to-late September in New York during the UN General Assembly (UNGA), and we hope one measure in particular will receive more attention: the key role health taxes can play in closing the financing gap.
To speak to this issue, representatives from the World Health Organization (WHO), Tax Justice Network Africa (TJNA), and Development Gateway: An IREX Venture (DG) came together to discuss the critical role of health taxes in mobilizing domestic resources for health.

Development Gateway’s Experience
Development Gateway has direct experience supporting more effective tax policy by providing access to clear evidence and best practices. Through the Tobacco Control Data Initiative, an ongoing tobacco control program in seven African countries, DG aggregates and visualizes data related to tobacco legislation. In close collaboration with government ministries – including ministries of health, finance, and trade – as well as civil society organizations, academics, and local research firms, DG developed six country-specific websites that display this tobacco research.
Through developing and sharing this information, DG provided key evidence in the lead-up to tax reform in several priority countries. For example, Nigeria increased its specific tobacco taxes by 30% in 2022 after consultation with DG and other key stakeholders. In Ethiopia, the Ethiopian Food and Drug Authority and the Ministry of Finance used the tax modeling we had developed with the University of Cape Town to understand the impact of different tax scenarios. With that information, they updated the excise tax in 2020 and again in 2024: the average tax collected per pack increased from 4.12 Birr to 12.4 Birr (approximately US$0.086), and an 8 Birr specific tax was added per pack, in line with best practices. These increased tobacco taxes led directly to increased funding available for health expenditures.
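To make the logic concrete, the sketch below shows the basic arithmetic behind excise-tax scenario modeling: a higher per-pack tax raises prices, consumption falls somewhat (governed by a price elasticity), yet revenue still rises. The figures and elasticity are hypothetical placeholders, not the actual model DG built with the University of Cape Town:

```python
# An illustrative sketch of excise-tax scenario arithmetic. The elasticity,
# baseline price, and sales figures are hypothetical placeholders, not the
# actual DG / University of Cape Town model.

def tax_scenario(price, tax_per_pack, new_tax_per_pack, packs_sold, elasticity=-0.4):
    """Estimate consumption and excise revenue after a per-pack tax change."""
    new_price = price + (new_tax_per_pack - tax_per_pack)  # assume full pass-through to retail price
    price_change = (new_price - price) / price
    new_packs = packs_sold * (1 + elasticity * price_change)  # demand falls as price rises
    return {
        "old_revenue": tax_per_pack * packs_sold,
        "new_revenue": new_tax_per_pack * new_packs,
        "consumption_change_pct": round(100 * (new_packs - packs_sold) / packs_sold, 1),
    }

# Hypothetical example: raising the per-pack tax from 4 to 12 Birr.
print(tax_scenario(price=40, tax_per_pack=4, new_tax_per_pack=12, packs_sold=1_000_000))
```

Even with consumption falling, excise revenue rises sharply in this illustrative scenario – the core reason health taxes can both reduce harm and expand fiscal space.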
WHO’s Experience
The World Health Organization discussed its newly launched “3 by 35” initiative: a bold call for countries worldwide to raise the real prices of three unhealthy products – alcohol, tobacco, and sugary drinks – by at least 50% by 2035. The initiative showcases how taxes are a critical tool to raise revenue and save lives. WHO states that a “one-time tax increase sufficient to raise prices by 50% could generate up to US$3.7 trillion in new revenue globally within five years, or an average of US$740 billion per year – equivalent to 0.75% of global GDP.”
Tax Justice Network Africa’s Experience
Since 2017, TJNA has worked with national partners in seven African countries (Democratic Republic of Congo, Ghana, Kenya, Nigeria, Senegal, South Africa, Zambia) to implement the Tobacco Tax Advocacy in Africa (TTAA) project. Through targeted policy advocacy, the partners promote increases in tobacco excise tax in order to raise domestic revenue and decrease the affordability and consumption of tobacco and nicotine products. Pivotal to the process is the tax structure implemented in each country. As such, the TTAA partners advocate for specific or mixed tax structures to enhance administrative capacities and price movement.
The TTAA project has yielded a range of positive results – most recently in Ghana, with the enactment of the Excise Duty Amendment Act, 2023 (Act 1093). The Act revised the tax structure to include a specific tax component and reduced the ad valorem excise rate from 175% to 50%. A post-enactment assessment shows that retail prices of cigarettes increased for almost all brands between May 2023 and May 2024, and tobacco excise revenue rose from GHS189 million in 2022 to GHS252 million (approximately US$20 million) in the first half of 2024. Meanwhile, the Ministries of Finance in Kenya, Nigeria, South Africa, and Zambia announced above-inflation (real) increases in tobacco excise tax.

Next Steps
As nations and partners go into further discussions on financing during UNGA meetings, smart investments and policy decisions to raise needed revenue for critical services are more important than ever. Government agencies can use this moment of reduced foreign funding to review their tax structures and ensure that taxes on tobacco products, sugar-sweetened beverages, and alcohol are in line with evidence-based best practices. These include (1) tax policies that lead to a real price increase on these products and (2) a regular process of updating taxes to keep pace with inflation. As Dr. Jeremy Farrar, Assistant Director-General of WHO, put it: “Health taxes are one of the most efficient tools we have. They cut the consumption of harmful products and create revenue governments can reinvest in health care, education, and social protection.”
For African governments seeking country-specific recommendations on tobacco tax, we recommend the resources available at tobaccocontroldata.org. For other questions, government officials can reach out to the Tax Justice Network Africa or the Health Taxes Unit at the WHO for individual consultation. Now is the moment to seize these opportunities and mobilize resources for much-needed healthcare.
From Data to Impact: Why Data Visualization Matters in Agriculture
A lot has been said about the importance of data and data-driven decision-making. However, having data isn’t enough; what matters is being able to understand it and apply it to enable better decisions.
Research across disciplines provides strong evidence that visualizing data, rather than presenting numbers alone, significantly enhances how people comprehend and use information. Clear, easy-to-follow visualizations convey more information while reducing the mental effort required to interpret complex datasets.
In fields such as agriculture, where decisions can have long-term consequences, having user-friendly data tools becomes even more crucial. Decision-makers across various levels, whether they’re farmers in the field or government policymakers, need tools that not only provide data but also make it usable and actionable.
Data visualization training as an entry to data use
The Ethiopian Ministry of Agriculture (MoA), the Ministry of Trade and Regional Integration (MoTRI), and the Livestock Development Institute (LDI) manage significant volumes of livestock data covering disease, markets, and genetics. However, gaps in technical capacity hindered effective data management, analysis, and use in decision-making. To address this, the a Livestock Information Vision for Ethiopia (aLIVE) program, funded by the Gates Foundation, conducted a series of structured, hands-on trainings in data visualization techniques in November 2023 and January 2024, with the goal of increasing data use.
Federal system owners received training focused on practical application in real-world scenarios, using data generated from their own databases. Participants learned how to clean and manage datasets, apply Excel formulas and functions, use PivotTables and PivotCharts for data summarization, and develop interactive visualizations with Power BI. A critical component of the training emphasized key data quality principles – validity, integrity, precision, reliability, and timeliness – ensuring that system owners could produce reliable and accurate reports. Many trainees had prior theoretical knowledge but lacked hands-on experience, which the training addressed by having them work with live datasets from their own systems.
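For readers who want to experiment with the same kind of summarization outside Excel, the sketch below is a rough Python analogue of the PivotTable exercise, using a small invented livestock dataset (the trainings themselves used Excel and Power BI on participants’ own data):

```python
# A rough Python analogue of the PivotTable exercise from the training,
# using a small hypothetical livestock dataset. Illustrative only.
import pandas as pd

records = pd.DataFrame({
    "region":         ["Oromia", "Oromia", "Amhara", "Amhara", "Tigray"],
    "species":        ["cattle", "goat", "cattle", "goat", "cattle"],
    "reported_cases": [120, 45, 98, 30, 60],
})

# Summarize reported cases by region and species, as a PivotTable would.
summary = records.pivot_table(
    index="region", columns="species", values="reported_cases",
    aggfunc="sum", fill_value=0,
)
print(summary)
```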

Visualization improves efficiency in reporting
As participants began applying their new data skills, the practical benefits became clear. Visualizing data with tools like Power BI and Excel PivotTables, along with advanced filtering, made it far more efficient for trainees to use livestock data for reporting. Trainees reported that data tasks that previously took several days can now be completed in hours or even minutes.
Systems owners as trusted intermediaries
Systems owners play a critical role in connecting and translating information from the ground up, delivering data from farmers and local stakeholders to policymakers. To do this effectively, they must be empowered data users, capable of transforming raw datasets into clear, usable insights for decision-makers. Timely, clear, and well-targeted data visualizations created by system owners are a critical enabler of informed policymaking.
When data is presented in an accessible and actionable format, it becomes far easier for policymakers to reference and utilize it. Conversely, poor decisions often stem not just from low-quality data, but from decision-makers being unable – or unwilling – to engage with overly complex or delayed data.
Through the aLIVE training program, system owners develop the ability to deliver information in a way that enables decision-makers to confidently act upon it. Importantly, this training goes beyond technical capacity. It positions system owners as trusted intermediaries in the livestock value chain – individuals who can translate raw field-level data into national-level insights that shape policies, programs, and investments.
Crucially, the program also employs a training-of-trainers (ToT) model. System owners cascade their knowledge to regional focal points, who in turn are better equipped to interpret and communicate data to local stakeholders, including farmers. For instance, once Dr. Gashaw completed the training, he immediately trained DOVAR-II owners at the Woreda level on data management and analysis.
Additionally, regional focal points now use PivotTables for their own reporting. According to Dr. Tewodros, the Animal Disease Notification and Investigation System (ADNIS) System Owner at MoA, “The fact that aLIVE expanded its training to the regions based on our recommendation has simplified their data reporting and improved our ability to follow up. They have begun using Pivot Tables in their reports, and many are eager for similar and further training.”
This cascading approach helps ensure that data literacy does not remain concentrated only at the federal level but extends through the broader livestock system. By reinforcing accountability, transparency, and empowerment across different levels of governance, this approach protects and entrenches data use for the long term as it moves through the value chain.

Visualized data enables evidence-based decisions at every level
Beyond reporting, visualizing data lowers the barrier to evidence-based decision-making, especially when the visualizations are simple, such as bar graphs and straightforward charts. This evidence can improve decisions by introducing new insights that shift prior beliefs and reduce uncertainty, and it can be understood at every level, from federal to regional to farmer.

An improvement in data quality awareness was also noted, as described by Dr. Gashaw from the DOVAR system: “Since we are not data experts, we used to consider quality issues only in relation to outliers. The training has helped me understand data quality very well, including concepts such as integrity and validity. We are now discussing how the data that comes to our system can attain these standards.”
Improving data competency facilitates a data-driven culture
Embedding a data-driven culture requires demonstrating the value of data and building stakeholders’ capacity to use it. Data visualization training serves two purposes: enhancing the utility of data and reducing entry barriers, allowing stakeholders to generate their own visualizations. With improved capacity comes an increased willingness to integrate data into daily workflows.
Improved competency translates into a more confident mindset when working with data and a stronger willingness to engage with data tasks without relying on external support. As a result, participants are better equipped to generate insights quickly. One expert, for instance, went on to create a time series analysis to identify trends and patterns that could help policymakers plan for the future, allocate resources more effectively, and respond to emerging challenges.
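As an illustration of that kind of analysis – not the expert’s actual work – the sketch below computes a three-month rolling average on invented monthly livestock-market figures to surface the underlying trend:

```python
# A rough sketch of a time-series trend analysis on hypothetical monthly
# livestock-market data. Illustrative only; the figures are invented.
import pandas as pd

months = pd.date_range("2024-01-01", periods=12, freq="MS")
monthly_sales = pd.Series(
    [310, 295, 330, 360, 340, 355, 390, 410, 400, 425, 440, 455],
    index=months, name="animals_sold",
)

# A three-month rolling average smooths month-to-month noise and makes the
# underlying upward trend easier for decision-makers to see.
trend = monthly_sales.rolling(window=3).mean()
print(pd.DataFrame({"monthly": monthly_sales, "3_month_avg": trend.round(1)}))
```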


Connecting the Dots
In conclusion, visualization is not only a practical tool for data analysis but also a powerful trust-building mechanism: making data more accessible improves the ability of stakeholders at every level to engage with and interpret information. This demystifies complex livestock systems, empowers stakeholders to explore data independently, and reinforces both confidence in the data and its use. Dashboards and visualizations can enhance communication and coordination throughout the value chain, fostering trust and drawing in previously disengaged stakeholders who might otherwise be excluded by technical barriers.
…
This is the second blog in our series sharing lessons from the aLIVE program, which is supporting livestock decision-makers in Ethiopia by improving the accessibility and reliability of livestock data. The series examines how data can be made more accessible, actionable, and sustainable to enhance decision-making across government to farm levels.
Read the first blog here.
Introducing The HackCorruption Civic Tech Tools Repository
Curbing corruption requires collaboration. Developers creating innovative tech tools to increase transparency and flag corruption risks rely on shared knowledge and open source code, building upon one another’s work. As Development Gateway: An IREX Venture and Accountability Lab mentored HackCorruption teams – selected during regional hackathons for their innovative digital tool ideas – we noticed that these teams spent considerable time searching for existing tools they could build upon, as well as open source code they could leverage to create new tools.
A HackCorruption gathering in Nepal, August 2024
We developed the HackCorruption Civic Tech Tools Repository as a comprehensive, open-source collection of impactful and scalable digital solutions for combating corruption, to be used not only by HackCorruption teams but also by anyone interested in building or enhancing digital anti-corruption tools. The repository is designed for continuous growth through community contributions via GitHub and is intended to serve as a centralized hub for tools, source code, and resources organized across six key thematic areas:
- Foundational anti-corruption data and metadata management tools
- Beneficial ownership transparency
- Open contracting
- Budget transparency and participatory budgeting
- Climate finance transparency
- Illicit financial flows
Key Features
- 41+ curated tools across multiple categories with direct access to source code
- Open-source accessibility allowing community contributions and continuous updates
- Comprehensive documentation, including technology overviews, deployment examples, intellectual property information, and impact evidence
- Wiki integration providing background information and contributor guidelines
- Global scope with geographic tagging and categorization
We invite developers and others working on anti-corruption tech tools to contribute their open source code and tools to this repository! We hope that this resource can act as a catalyst for the collaboration and cooperation required to create change and curb corruption.
Access the repository below.
Click here to learn more about HackCorruption.
