Data for Learning: 5 Lessons to Make Evidence-based Education Quality Improvements

May 20, 2022 Education
Rebecca Ward
IREX

This piece was published by DG’s strategic partner, IREX. The original post can be found here.

Data can be a powerful resource for education reform. Without it, education leaders can be immobilized because they do not have enough information to recognize that problems exist or to make the case for change. However, data only provides value if it is used to inform decisions. Unfortunately, many education systems struggle with this step. So, how can education leaders best be supported to use data to make better decisions that improve education quality? In collaboration with local partners, IREX is supporting the development of data systems and tools to improve education systems through better, evidence-based decision-making.

We share below some of the lessons from our work, including the importance of providing clarity on what data will be used for; balancing long-term integration with rapid modeling; institutionalizing clear decision-making points for data use; modeling inclusion; and the benefits of using context data.

Improving the use of data across education systems continues to be a major focus of governments and development partners worldwide. Good data can help actors better understand system inputs, outputs, outcomes, and contextual factors so that decisions about policy and practice are responsive and evidence based. IREX has variously supported education partners to collect and use data about institutional capacity and performance, student satisfaction and learning outcomes, perceptions of the teaching profession, and teacher demand, supply, and career paths.

At this year’s Comparative and International Education Society Conference (CIES), I’ll be discussing what we have learned from developing a National Perceptions of Teaching Survey in Jordan and our development and use of a Higher Education Institutional Capacity Assessment Tool.  In case you can’t join me at these sessions, here are 5 lessons emerging from our work with partners to improve education system data.

Agree and Provide Clarity on What Data Will Be Used For

In higher education, institutional assessments can enable leaders to recognize and prioritize performance improvement needs, foster organizational learning, and support planning and strategic decision-making. They can also be used for benchmarking, accreditation, and accountability, and there is considerable tension between these functions. IREX has used its Higher Education Institutional Capacity Assessment Tool (HEICAT) with universities in Sub-Saharan Africa, Eurasia, and the Middle East for the purposes of both internal organizational learning and external accreditation. We have learned it is vital to engage partners before the assessment to agree on and provide clarity across all stakeholders about its purpose, including who will have access to the data and what it will be used for, in order to build trust and facilitate honest dialogue.

Aim for Integration but Model User-friendly Functionality to Secure Buy-in

In our work to support the Government of Jordan to collect and use data and projections about teacher supply and demand, we agreed with core partners early on that the data and projections should be housed in the Government's existing Education Management Information System (EMIS), following best practice to integrate new data initiatives within existing systems and avoid a proliferation of databases. To avoid losing stakeholder interest and buy-in during the integration process, IREX and our data partner created a fully functional "sandbox" model that has been vital to maintaining momentum. It serves dual purposes: as a resource for training Ministry staff on the model, its capabilities, and data-informed decision-making; and as a demonstration of user-friendly functionality, which has been instrumental in fostering understanding and buy-in, including the Ministry's decision to use the data to allocate teacher preparation scholarships to areas of greatest need.

Identify and Institutionalize Clear Decision-Making Points for Data Use

The introduction of new data collection regimes can stall in the absence of clear roles, responsibilities, and decision-making points. If the data user community does not have a clear plan for data use, fatigue and resistance are more likely to set in. In the West Bank, IREX adapted the HEICAT to align with national licensure and accreditation requirements, ultimately working with university partners and the Accreditation and Quality Assurance Commission to integrate the self-assessment into the Common Framework for Quality Assurance in Palestinian Higher Education. The framework includes clear roles and responsibilities, timelines, and procedures for data collection and use. In Georgia, IREX provided technical assistance to the National Center for Teacher Professional Development (TPDC) to develop and use data from a Training Management System in support of a nationwide rollout of training to over 18,000 teachers and 2,000 school principals. Use of the database was integrated into the program management cycle to assist planning for subsequent budget years, including the number of additional trainings, catch-up trainings, and trainer procurement.

Model Inclusion and Build Data Producer and User Communities

Data can be a source of power with the potential to exclude. Notably, youth are often absent in the production and use of data about education systems, despite being central to them. IREX is changing this by supporting youth and adults to generate and use data as a language to collaborate and inform decision-making on issues that affect them.

The Kisumu Issue-based Collaborative Network (ICON) is further enhancing youth work readiness by engaging higher education institutions, the public sector, and the private sector with diverse youth and youth-led/youth-serving organizations. A recent ICON data summit brought together 125 participants, providing a platform for young people to share and explore data on youth employment and work readiness with decision makers in national and county government. In response to research indicating that employers valued soft skills as much as digital skills, summit participants advocated for soft skills such as empathy and communication to be incorporated into youth digital skills programs. And upon learning that many youth do not see certification from a vocational education and training center as a catalyst to better economic opportunities, youth participants advocated for stronger bridges to employers and earning opportunities.

Don’t Forget Context Data

Educational data tends to focus on system inputs (e.g., enrollment, teachers, resources, expenditure, infrastructure), outputs (e.g., attendance, attainment), and outcomes (e.g., learning outcomes, employability). However, it is widely acknowledged that contextual factors significantly impact the performance of education systems, and these factors can provide valuable data. In Jordan, we are conducting a biennial National Perceptions of the Teaching Profession Survey to understand how Jordanians view the profession and to learn what they know about how to become a teacher. The data is being used to inform the content and targeting of annual national campaigns by the Ministry of Education to improve the profession’s public image with the goal of attracting more and better applicants to teacher training programs.
