SUMMARY

MONITORING AND EVALUATION FOR IN-SERVICE TEACHER PROFESSIONAL DEVELOPMENT PROGRAMS

Guidance on how to design, implement, use, and sustain an M&E system for teacher professional development programs. See the accompanying Technical Guidance Note.

COACH TOOLS AND RESOURCES

Coach is the World Bank's program focused on accelerating student learning by improving in-service teacher professional development around the world.

INTRODUCTION

Objective: To provide high-level guidance for Task Team Leaders (TTLs), project teams, and external players on how to design, implement, and sustain a monitoring and evaluation (M&E) system for an in-service teacher professional development (TPD) program.

Audience:
• World Bank TTLs and project teams looking for technical guidance during project identification, preparation, appraisal, and implementation (additional financing/restructuring).
• Policymakers and external players looking for direction on how to design and implement an M&E system within an in-service TPD program. The information can support project design, component descriptions, technical analyses, and Project Appraisal Document (PAD) descriptions.

SCOPE

This Guidance Note presents high-level direction on key factors to consider when designing and implementing an effective M&E system for a TPD program. Specific design features may vary depending on contextual factors, including:
• available resources
• local technical capacity
• political environment or fragility
• the exact features of the TPD program ("highly" versus "low" structured programs; school- and cluster-based versus other models)

HIGHLIGHTS

What an M&E system can do:
• Enable key actors to make evidence-based decisions to expand the program and allocate budget.
• Provide valuable data to feed into both the implementation and design of the program by offering opportunities for course correction.
• Strengthen accountability relationships among key stakeholders.

What building, maintaining, and sustaining an effective M&E system requires:
• Carefully selecting outcome indicators, establishing data systems, and getting the technical details right.
• Consistent, tight feedback loops linking data with decision-making processes to help governments and other actors improve the ongoing design and implementation of TPD programs.
• Equally important are strong political support and buy-in from stakeholders who are willing to use the system to make evidence-informed decisions.
• Finally, context matters. Low-resource and fragility, conflict, and violence (FCV) settings may need to start with an M&E system that has only a few basic indicators and requires ongoing support with technical and financial resources. In particular, FCV contexts need to ensure adaptability within the M&E system to respond quickly to changing environments.

PURPOSE OF A TPD M&E SYSTEM

To guide the TPD program toward its objectives of improved teaching practice, better quality student-teacher interactions, and, ultimately, improved student learning outcomes by:

Monitoring implementation fidelity
• Data from an M&E system can help ensure that the TPD program is implemented with fidelity.
• Implementation data (such as the frequency of observations and feedback sessions by trainers, or the number of teacher guides distributed per teacher) can help monitor implementation fidelity and enable implementers to make course corrections, as necessary.

Monitoring result outcomes
• Data from an M&E system can help determine progress toward desired outcomes, such as improvements in teaching practice, quality of student-teacher interactions, and student learning.
• Outcome data can include longer-term results, such as changes in student learning, and medium- to short-term results, such as progress (or lack thereof) in how teachers structure lessons or engage with students.

Ultimately, data should help decision-makers track progress, decide whether the goals of the program are being achieved, and enable future evidence-informed decisions.
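The fidelity measures just described (frequency of coaching visits, guides per teacher) lend themselves to simple calculations. The minimal Python sketch below shows one way such indicators could be computed from program records; it is illustrative only, and the record fields (teachers, guides_distributed, visits_completed, visits_planned) are assumptions, not a schema from the guidance note.

```python
# Illustrative sketch: two implementation-fidelity indicators computed
# from hypothetical program records. All field names are assumptions.

from dataclasses import dataclass

@dataclass
class SchoolRecord:
    school_id: str
    teachers: int              # teachers enrolled in the TPD program
    guides_distributed: int    # teacher guides delivered to the school
    visits_completed: int      # coaching/observation visits conducted
    visits_planned: int        # visits scheduled for the period

def fidelity_indicators(records: list[SchoolRecord]) -> dict[str, float]:
    """Aggregate simple fidelity indicators across schools."""
    total_teachers = sum(r.teachers for r in records)
    total_guides = sum(r.guides_distributed for r in records)
    total_done = sum(r.visits_completed for r in records)
    total_planned = sum(r.visits_planned for r in records)
    return {
        "guides_per_teacher": total_guides / total_teachers,
        "visit_completion_rate": total_done / total_planned,
    }

records = [
    SchoolRecord("SCH-001", teachers=12, guides_distributed=12,
                 visits_completed=3, visits_planned=4),
    SchoolRecord("SCH-002", teachers=8, guides_distributed=5,
                 visits_completed=4, visits_planned=4),
]
print(fidelity_indicators(records))
# {'guides_per_teacher': 0.85, 'visit_completion_rate': 0.875}
```

In practice, a low visit completion rate or guides-per-teacher ratio would flag where course corrections are needed.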
HOW DOES AN M&E SYSTEM ENSURE PROGRESS TOWARD PROGRAM OBJECTIVES?

Through tight feedback loops that iteratively direct information into decision-making processes. These tight loops can help agencies and implementers learn, innovate, and improve the design and implementation of TPD programs.

[Figure 1. Underlying an M&E System Is a Sound Results Framework for a TPD Program]

M&E IN PRACTICE: CASE STUDY OF KENYA'S TUSOME PROGRAM

Tusome, a successful national program in Kenya to improve early grade education, built in capacity to monitor outcomes and implementation fidelity in six important ways:

Frequent Monitoring
Tusome enabled frequent monitoring of key learning outcomes and embedded it as a key feature of the accountability relationship between curriculum support officers (CSOs), who supported teachers, and Ministry of Education (MoE) officials. For example, student learning data were viewable on a dashboard shared with CSOs and the MoE officials to whom the CSOs were accountable.

Tight Feedback Loops
Tusome built in tight feedback loops to monitor teachers' progress toward desired outcomes and to tailor feedback to teachers as appropriate. For example, at each visit, the CSO recorded whether the teacher employed the techniques for which the teacher had received training during the lesson and provided feedback to the teacher accordingly.

Ongoing Guidance
CSOs received ongoing guidance and feedback from implementation firm experts and county-level education officers who observed CSOs' feedback sessions with teachers. For example, the Tusome technical team often observed the CSOs' one-on-one sessions with teachers and gave feedback on the quality of the CSOs' instructional support.

Outcome and Process Indicators
To ensure implementation progress, Tusome used not only outcome indicators but also process indicators. For example, the CSOs' tablets, which were used for classroom observations, were equipped with GPS monitoring, enabling implementers to ensure that CSOs conducted their allocated visits.

Linked Data with Incentives
To strengthen accountability, Tusome linked monitoring data with incentives. For example, the data on the number of classroom visits for each school were used to determine the CSOs' travel reimbursement.

Data Shared Widely
Monitoring data were shared widely to facilitate feedback and strengthen accountability relationships among actors. For example, a dashboard showing the percentages of target visits at the county and national levels was used by MoE leadership to increase accountability for instructional support.
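The county-level dashboard described above implies a straightforward aggregation: GPS-verified visits counted against targets. As a rough illustration (not Tusome's actual pipeline, and with hypothetical visit logs and targets), the calculation could look like this:

```python
# Illustrative sketch of a dashboard feed like the one described above:
# percentage of target CSO visits achieved, rolled up by county.
# Visit logs and target numbers are hypothetical.

from collections import Counter

visit_logs = [  # one entry per GPS-verified classroom visit
    {"county": "Nairobi", "cso_id": "CSO-01"},
    {"county": "Nairobi", "cso_id": "CSO-02"},
    {"county": "Nairobi", "cso_id": "CSO-02"},
    {"county": "Kisumu",  "cso_id": "CSO-07"},
]

visit_targets = {"Nairobi": 4, "Kisumu": 2}  # target visits per county

def percent_of_target_by_county(logs, targets):
    """Share of target visits completed, per county, as a percentage."""
    completed = Counter(v["county"] for v in logs)
    return {c: 100 * completed[c] / t for c, t in targets.items()}

print(percent_of_target_by_county(visit_logs, visit_targets))
# {'Nairobi': 75.0, 'Kisumu': 50.0}
```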
KEY ELEMENTS IN TUSOME'S PLAN TO MONITOR OUTCOMES AND IMPLEMENTATION FIDELITY

Frequent monitoring • Tight feedback loops • Ongoing guidance • Outcome and process indicators • Linked data with incentives • Data shared widely

TYPES OF MONITORING INDICATORS FOR A TPD M&E SYSTEM

Process indicators to monitor implementation: Process indicators show whether activities are being implemented as planned. These are vital to track the management and implementation of programs, use of resources, and delivery of services. Nevertheless, by themselves, process indicators do not show whether outcomes have been achieved.
• For example, a TPD program could be on track to meet the target number of teacher guides produced. However, if these guides are not well designed or if teachers do not use them, the teacher guides are unlikely to improve teaching practice.

Outcome indicators to measure results: Outcome indicators include data that show whether program inputs, activities, and outputs have improved outcomes.
• For example, based on information from classroom observation tools, has student-teacher interaction (an intermediate outcome) improved as a result of using teacher guides (an input)?

MOVING BEYOND PROCESS INDICATORS TO OUTCOME INDICATORS

• M&E plans should include clear and measurable indicators that go beyond process indicators, such as those that track inputs, to those that measure outcomes.
• Outcome indicators can suggest whether the program is on track to meet intended objectives. Prioritizing outcome indicators can help draw the attention of policymakers and managers to results as opposed to process-oriented tasks.
• To incentivize progress toward results, outcome indicators also may be linked to financing.

Regardless of Type, Indicators Should Follow the SMART Principles
1. Specific: Indicators measure as closely as possible what we want to know.
2. Measurable: Indicators are specific and can be clearly measured.
3. Attributable: Indicators are logically and closely linked to the program's objectives.
4. Realistic: Data are obtainable at feasible cost with reasonable accuracy and frequency.
5. Targeted: Indicators are specific to the program's target group. In a TPD program, the target groups may include teachers, students, and trainers.

Monitoring and Evaluation: Two Sides of the Same Coin

Monitoring and evaluation play complementary roles. Monitoring data usually indicate a program's progress at any given time relative to targets and can inform ongoing course corrections. In contrast, evaluation usually attempts to address causality: the reasons that targets are or are not being achieved.

Monitoring TPD Programs
A monitoring system gives ongoing information about the direction, pace, and magnitude of change to show whether the program is moving in the right direction and whether implementation is rolling out as intended. However, monitoring data do not give a basis for causal inference, that is, why or how changes are occurring.

Evaluating TPD Programs
Many types of evaluations can help measure progress toward desired outcomes. Types range from randomized experiments, including nimble experiments; to quasi-experimental designs; to non-experimental evaluations, such as process implementation evaluations or rapid appraisals. Each evaluation type produces different strengths of evidence.
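Because monitoring reads progress relative to targets, each indicator is naturally described by a baseline, a target, and its latest measured value. The sketch below shows one possible representation in Python, with progress expressed as the share of the baseline-to-target distance covered so far; the class, its fields, and the example indicator are illustrative assumptions, not a prescribed schema.

```python
# Illustrative sketch: one way to represent a monitoring indicator with a
# baseline and target, and to express current progress toward the target.
# The class, fields, and example indicator are assumptions.

from dataclasses import dataclass

@dataclass
class Indicator:
    name: str          # what is measured (keep it Specific and Measurable)
    baseline: float    # value established before program roll-out
    target: float      # agreed value for the set time frame
    current: float     # latest measured value

    def progress(self) -> float:
        """Share of the baseline-to-target distance covered so far."""
        return (self.current - self.baseline) / (self.target - self.baseline)

lesson_structure = Indicator(
    name="Share of observed lessons with a clear structure",
    baseline=0.30, target=0.60, current=0.42,
)
print(f"{lesson_structure.progress():.0%} of the way to target")  # 40%
```

A progress value near 0 suggests little movement from the baseline, while values at or above 1 indicate the target has been met or exceeded.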
EFFECTIVE PARTNERSHIPS TO FACILITATE DATA COLLECTION AND MONITORING IMPLEMENTATION FIDELITY

When designing, implementing, and using M&E systems, governments may face financial and technical constraints. In such contexts, collaborations with private firms, development partners, or other actors can facilitate data collection and use to monitor implementation fidelity and progress toward desired outcomes. For example:

• Successful implementation of Tusome in Kenya resulted from a partnership between the Kenyan Ministry of Education; RTI (Research Triangle Institute); the United States Agency for International Development (USAID); and the United Kingdom Foreign, Commonwealth and Development Office (FCDO), which funded the program. RTI provided human and technical support to monitor implementation fidelity and improved program implementation based on the data collected.

• The World Bank formed a multi-layered partnership with the government of Punjab, Pakistan, to implement Teach, a classroom observation tool, to monitor and evaluate teacher practices. The partnership, along with a staggered rollout, helped create early opportunities to integrate lessons learned, including feedback from partners, teachers, and district leaders.

Partners also may help to…

…build systems to collect, store, maintain, and integrate data, especially in contexts in which such technical skills or systems are limited. For example, the South African Department of Basic Education partnered with a third party, DataFirst, to host and securely share de-identified administrative data. These data have provided data management benefits to the government and enabled multiple impact evaluations of various education initiatives.

…facilitate the production of new evidence of program effectiveness. For example, the government of Malawi partnered with the World Bank, the Royal Norwegian Embassy, and FCDO to set up the Malawi Longitudinal School Survey (MLSS). It collected nationally representative data on school conditions and learning outcomes and produced policy-relevant insights on students' learning trajectories and the impacts of pilot interventions.

To ensure a program's long-term sustainability, it is vital to build into partner contracts the technical assistance to strengthen the skills and capacity of the public bureaucracy and governmental actors, such as coaches and monitors.

STEP-BY-STEP PROCESS TO BUILD, MAINTAIN, AND SUSTAIN AN M&E SYSTEM

Step 1: Conduct a readiness assessment
Step 2: Agree on outcomes to monitor and evaluate
Step 3: Select key performance indicators to monitor outcomes
Step 4: Set baselines and targets; gather data on indicators
Step 5: Monitor and report results
Step 6: Use the findings
Step 7: Sustain the M&E system
STEP 1. CONDUCT A READINESS ASSESSMENT

• Confirm whether any M&E structures exist in TPD programs and can be built on.
• Identify the objectives of the TPD program.
• Identify the local capacity of the actors and agencies involved to perform their M&E roles.
• Identify needs and opportunities to build partnerships that address capacity constraints.
• Confirm whether the program development objectives are anchored in the country's education sector strategy.
• Identify key barriers to implementing an M&E system and potential ways to address them.
• Identify key incentives to design and build an M&E system, and ways to use these incentives to create buy-in for the system.

STEP 2. AGREE ON OUTCOMES TO MONITOR AND EVALUATE

• Map the results framework underlying the TPD program by laying out the causal pathway from inputs, activities, and outputs to outcomes and impacts.
• Based on the theory of change, identify key outcomes to monitor, including improved teaching practice, improved student-teacher interaction, and improved learning outcomes.
• If possible, involve key actors, including the government, teacher unions, CSOs, school leaders, NGOs, donors, and other education stakeholders, to help build consensus and create buy-in for key outcomes to monitor.

STEP 3. SELECT KEY PERFORMANCE INDICATORS TO MONITOR OUTCOMES

• Identify key performance indicators to measure improvements in outcomes linked to the effectiveness of the TPD program.
• Identify key process indicators to ensure that the program is being implemented with fidelity.
• Confirm that the selected indicators are specific, measurable, attributable, realistic, and targeted (SMART).
• Ensure that the M&E system is not overloaded with too many indicators.
• Pilot the indicators at a small scale and make changes as necessary before large-scale roll-out.

STEP 4. SET BASELINES AND TARGETS, AND GATHER DATA ON INDICATORS

• Identify existing data sources and determine whether they can be used to establish a baseline. For example: EMIS, SABER, SDI, existing classroom observation data (from Teach or Stallings), and local assessment data.
• If new data are to be collected, identify tools for data collection. For example: Tangerine and KoBoToolbox.
• When choosing a classroom observation tool, keep in mind the licensing costs, cultural relevance, and enumerator capacity required; identify whether teachers and coaches need training to use the tools.
• Establish the baseline before the program is rolled out.
• Agree on a target corresponding to each indicator over a set time frame. Targets should be realistic, contextually appropriate, time bound, and evidence based.
• Decide how frequently data will be collected. (Data should be collected at regular intervals and ideally be comparable over time to analyze trends.)
• Develop data validation processes to ensure that the data collected are accurate and reliable (see the sketch after Step 5).
• In addition to quantitative data, gather qualitative information about program results from key stakeholders, including teachers, coaches, and master trainers.

STEP 5. MONITOR AND REPORT RESULTS

• Monitor performance against established targets.
• Develop a central repository or dashboard in which data are stored and can be accessed by key actors and decision-makers.
• Identify plans for analysis and dissemination of reports and data visualizations.
• Ensure that results are reported to the intended audience in a clear and timely manner.
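As a concrete illustration of the data validation processes called for in Step 4, the sketch below runs basic checks (duplicate records, out-of-range scores, missing dates) on incoming observation records before they reach the repository or dashboard of Step 5. The field names and valid ranges are assumptions; real rules would come from the chosen data collection tools and indicator definitions.

```python
# Illustrative sketch of basic validation checks for incoming observation
# records. Field names and the 1-5 score range are assumptions.

def validate_record(rec: dict, seen_ids: set) -> list[str]:
    """Return a list of validation errors for one observation record."""
    errors = []
    if rec.get("record_id") in seen_ids:
        errors.append("duplicate record_id")
    if not 1 <= rec.get("observation_score", -1) <= 5:
        errors.append("observation_score outside valid range 1-5")
    if not rec.get("visit_date"):
        errors.append("missing visit_date")
    return errors

records = [
    {"record_id": "R1", "observation_score": 4, "visit_date": "2022-03-01"},
    {"record_id": "R1", "observation_score": 9, "visit_date": ""},
]

seen: set = set()
for rec in records:
    problems = validate_record(rec, seen)
    seen.add(rec["record_id"])
    if problems:
        print(rec["record_id"], "->", problems)
# R1 -> ['duplicate record_id', 'observation_score outside valid range 1-5',
#        'missing visit_date']
```

Flagged records can be routed back to enumerators for correction rather than silently entering the trend data.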
STEP 6. USE THE FINDINGS

• Based on data insights, identify key actors responsible for decision-making, such as ministry officials and implementation partners.
• Identify whether the appropriate decision-making bodies have the time, capacity, and autonomy to regularly review, discuss, and act on the data.
• Identify which programmatic decisions and course corrections are being made as a result of M&E information. For example, which results feed into how resources such as teacher guides are allocated, or how training is designed?
• Identify whether the program is meeting its outcome goals and use the findings to make decisions. For example, are results being used to make changes to program design, or to make decisions about budgetary allocations and scale-up?
• Identify whether the program is being implemented with fidelity. For example, are trainers conducting their target number of visits to teachers? Does every teacher have access to a guide or other inputs?

STEP 7. SUSTAIN THE M&E SYSTEM

• Lay out clear roles and responsibilities for actors in charge of managing and maintaining the M&E system.
• Ensure that teachers, school and pedagogical leaders, and other actors who collect or provide data are given appropriate time and resources to perform their tasks.
• Ensure that the M&E system is producing credible, valid, timely, and reliable information.
• Identify whether the technical and financial capacity exists to sustain the system; identify actions to enhance the capacity and performance of the agencies involved.
• Identify whether the incentives for various stakeholders are sufficiently aligned with helping sustain the M&E system.
• Identify a strong political champion for the M&E system.
• Identify and establish a process to evaluate the M&E system itself.

Additional Reading

This summary is based on the accompanying Monitoring and Evaluation for In-Service Teacher Professional Development Programs: A Technical Guidance Note. The guidance note provides details on these highlights and sets out how to navigate some of the challenges that governments and other organizations may face when designing and implementing an M&E system for an in-service TPD program.

Citation: Akmal, Maryam. 2022. "Monitoring and Evaluation for In-Service Teacher Professional Development Programs: Technical Guidance Summary." Coach Series, World Bank, Washington, DC. License: Creative Commons Attribution CC BY 4.0 IGO.

References

Bruns, Barbara, Leandro Costa, and Nina Cunha. 2018. "Through the Looking Glass: Can Classroom Observation and Coaching Improve Teacher Performance in Brazil?" Economics of Education Review 64: 214–50. https://doi.org/10.1016/j.econedurev.2018.03.003.

Castro, Juan F., Paul Glewwe, and Ricardo Montero. 2019. "Work with What You've Got: Improving Teachers' Pedagogical Skills at Scale in Rural Peru." Working Paper 158, Peruvian Economic Association, Lima, Peru. https://ideas.repec.org/p/apc/wpaper/158.html.

Haslam, M. Bruce. 2010. "Maryland Teacher Professional Development Evaluation Guide." GTL Center, Arlington, VA. https://gtlcenter.org/sites/default/files/docs/MarylandPDEvaluationGuide.pdf.

Karlan, Dean. 2017. "Nimble RCTs: A Powerful Methodology in the Program Design Toolbox." Innovations for Poverty Action (IPA), New Haven, CT. https://pubdocs.worldbank.org/en/626921495727495321/Nimble-RCTs-WorldBankMay2017-v4.pdf.

Kekahio, Wendy, Brian Lawton, Louis Cicchinelli, and Paul R. Brandon. 2014. "Logic Models: A Tool for Effective Program Planning, Collaboration, and Monitoring." US Department of Education, Washington, DC. https://ies.ed.gov/ncee/edlabs/regions/pacific/pdf/REL_2014025.pdf.
Kusek, Jody, and Ray Rist. 2004. "A Handbook for Development Practitioners: Ten Steps to a Results-Based Monitoring and Evaluation System." World Bank, Washington, DC. https://openknowledge.worldbank.org/bitstream/handle/10986/14926/296720PAPER0100steps.txt?sequence=2&isAllowed=y.

Piper, Benjamin, Joseph Destefano, Esther M. Kinyanjui, and Salome Ong'Ele. 2018. "Scaling up Successfully: Lessons from Kenya's Tusome National Literacy Program." Journal of Educational Change 19 (3): 293–321. https://doi.org/10.1007/s10833-018-9325-4.

Contact us at coach@worldbank.org and visit us at www.worldbank.org/coach