Early-Stage Evaluation of the Multiphase Programmatic Approach

© 2024 International Bank for Reconstruction and Development / The World Bank
1818 H Street NW
Washington, DC 20433
Telephone: 202-473-1000
Internet: www.worldbank.org

ATTRIBUTION
Please cite the report as: World Bank. 2024. Early-Stage Evaluation of the Multiphase Programmatic Approach. Independent Evaluation Group. Washington, DC: World Bank.

COVER PHOTO
Shutterstock/Prachak Sawang

EDITING AND PRODUCTION
Amanda O'Brien

GRAPHIC DESIGN
Rafaela Sarinho

This work is a product of the staff of The World Bank with external contributions. The findings, interpretations, and conclusions expressed in this work do not necessarily reflect the views of The World Bank, its Board of Executive Directors, or the governments they represent. The World Bank does not guarantee the accuracy of the data included in this work. The boundaries, colors, denominations, and other information shown on any map in this work do not imply any judgment on the part of The World Bank concerning the legal status of any territory or the endorsement or acceptance of such boundaries.

RIGHTS AND PERMISSIONS
The material in this work is subject to copyright. Because The World Bank encourages dissemination of its knowledge, this work may be reproduced, in whole or in part, for noncommercial purposes as long as full attribution to this work is given. Any queries on rights and licenses, including subsidiary rights, should be addressed to World Bank Publications, The World Bank Group, 1818 H Street NW, Washington, DC 20433, USA; fax: 202-522-2625; e-mail: pubrights@worldbank.org.

Early-Stage Evaluation of the Multiphase Programmatic Approach
November 20, 2024

Contents

Abbreviations
Acknowledgments
Overview
World Bank Management Response
1. Background and Evaluation Portfolio
2. Scope, Evaluation Questions, and Methodology
   Scope
   Evaluation Framework
   Methods
3. The Evaluation Questions Addressed
   To What Extent Does the Design of Multiphase Programmatic Approaches Meet Expectations?
   To What Extent Are Multiphase Programmatic Approaches Functioning in Line with Expectations?
   To What Extent Has the Enabling Environment Supported Multiphase Programmatic Approaches as Intended?
4. Findings and Implications
   Main Findings
   Implications
References

Boxes
Box 3.1. The Independent Evaluation Group's Evaluation of the World Bank's Early Support to Addressing the COVID-19 Pandemic
Box 3.2. Examples of Institutional and Technical Learning in World Bank Multiphase Programmatic Approach Projects

Figures
Figure 1.1. Types of Multiphase Programmatic Approaches
Figure 1.2. The Multiphase Programmatic Approach Portfolio, Fiscal Years 2018–24
Figure 1.3. Multiphase Programmatic Approach Portfolio by Global Practice
Figure 2.1. Theory of Action of the Multiphase Programmatic Approach
Figure 3.1. Level of Ambition in Outcomes: Multiphase Programmatic Approach versus Comparators
Figure 3.2. Reasons MPAs Are Considered More Appropriate Than Regional IPF (Horizontal MPAs), Series of Projects, or Stand-Alone Projects (Vertical MPAs)
Figure 3.3. Long-Term Institutional Development
Figure 3.4. Multiphase Programmatic Approach–Related Features Perceived to Support Coherence
Figure 3.5. Reasons for Restructuring
Figure 3.6. Factors Cited as Critical in the World Bank Authorizing Environment

Tables
Table 2.1. Hypothesized Multiphase Programmatic Approach Mechanism Associated with Development Effectiveness
Table 3.1. Comparison of Horizontal Multiphase Programmatic Approaches and Regional Investment Project Financing

Appendixes
Appendix A. Evaluation Methods
Appendix B. Portfolio Review and Data Analysis
Appendix C. Key Informant Interviews
Appendix D. Validation of Operations Policy and Country Services Findings

Abbreviations
CPF    Country Partnership Framework
FCS    fragile and conflict-affected situations
IDA    International Development Association
IPF    investment project financing
MPA    multiphase programmatic approach
OPCS   Operations Policy and Country Services
PDO    project development objective
PforR  Program-for-Results
PrDO   program development objective
TTL    task team leader

All dollar amounts are US dollars unless otherwise indicated.

Acknowledgments

This report was prepared by an Independent Evaluation Group team led by Rashmi Shankar under the guidance of Birgit Hansl (manager) and Theo David Thomas (director) and under the overall guidance of Sabine Bernabè (Director-General, Evaluation). The evaluation team included Andrea Rojas, Diana Stanescu, Andrei Wong, and Marwane Zouaidi. Rasmus Heltberg advised the team. Additional advisers to the team were Harsh Anuj (data scientist) and Munib Qasim Zia (consultant). The report was peer-reviewed by Srinivasan Venkat (evaluation specialist, Asian Development Bank) and Deborah Wetzel (consultant, former World Bank director for regional integration for Africa and the Middle East and North Africa). The team is grateful to the Operations Policy and Country Services unit for technical discussions and data sharing.

Overview

The World Bank introduced the multiphase programmatic approach (MPA) in 2017 as a means of structuring a long, large, or complex engagement as a set of shorter linked projects or phases using either investment project financing or Program-for-Results financing.1 This engagement was intended to take either a vertical form within a single country, typically over 8–10 years, or a horizontal form across several countries at the same time (or several states within a country), supporting activities that were either scalable or modular or that followed a predictable course, provided that in each case they were consistent with the program development objective.

The motivation behind the MPA, outlined in a 2017 paper from the Board of Executive Directors, was to provide continuity of engagement, allow more flexibility in responses to changed circumstances, encourage adaptive learning, and support stepwise progress toward a long-term development objective (World Bank 2017). By signaling a willingness to pursue a long-term development objective, the MPA was intended to strengthen the coherence of World Bank–financed interventions, contribute to building consensus on the client side, and diminish the likelihood of interruptions in support between phases. Compared with a single large operation, the MPA would enable more adaptation to circumstances by incorporating multiple opportunities for reflection and course corrections.
The 2017 Board paper also expected benefits to accrue operationally. By committing financing in smaller phases, rather than committing the total cost of the larger program up front, the World Bank could reduce undisbursed balances and allow borrowers to save on commitment fees, as well as reduce the processing costs of follow-up phases relative to stand-alone operations. The 2017 Board paper also noted that the MPA might provide a framework for engagement by other lenders even beyond the duration of World Bank financing. For instance, the World Bank could mobilize commercial financing for the first phase of an infrastructure program or could partner with other multilateral development banks, commercial lenders, private investors, or private companies to finance or implement subsequent phases.

Since the introduction of the MPA, there has been a steady increase in its use. At the end of December 2023, the portfolio comprised 11 horizontal and 29 vertical MPAs. Most nonemergency MPA financing occurs in International Development Association–eligible countries and targets infrastructure and sustainable development, with agriculture, energy, water, and transport being the most important users. MPAs have been used in all World Bank Regions. Three-quarters of nonemergency MPA financing has been for the Africa Region, 85 percent has been for International Development Association–eligible countries, and 20 percent has been for fragile and conflict-affected countries.

This evaluation assesses the performance of the approach against the expectations outlined in the 2017 Board paper. The youth of the MPA portfolio means that no ex post assessment of outcomes is possible. The evaluation instead asks if the design of MPAs is fulfilling the expectations outlined in the 2017 Board paper, if the specific features of MPAs are functioning as expected, and if there is an enabling environment for MPAs within the World Bank and on the client side. The evaluation also assesses the claims made in the Operations Policy and Country Services technical briefing to the Board on the processing times for MPAs relative to non-MPAs. It does not assess the uptake of MPAs, assess the suitability of the instrument where it has not been applied, ask if the policy scope of MPAs or the delegation of authority for MPAs should be expanded, or assess the improved efficiency of management of the World Bank Group's financial resources.

Specifically, the evaluation assesses whether the design and early implementation of the MPA have supported the objectives on which the approach's effectiveness depends. These objectives are as follows:

» Coherence. A coherent program fits within the broader program at the level of the country, sector, and institution. The MPA is expected to be more coherent than its alternatives because it was intended to leverage external partnerships and internal collaboration more effectively.

» Continuity. This refers to the MPA's ability to provide stable, long-term support. The vertical MPA supports continuity better than its alternatives because of its programmatic structure and the provision for overlapping phases.

» Learning. Although all operations should embed knowledge, the MPA requires an explicit learning plan, with specificity on implementation arrangements and how the knowledge is to be used.
» Adaptation. This refers to the ability of the MPA to adjust the content and timing of its phases in response to new information, evolving priorities, and changing context, made possible by a larger number of preset points for stocktaking than would be present in a single operation.

Main Findings

Overall, the evaluation finds that so far, the MPA meets expectations on learning and continuity, although it is less of a departure from business as usual than was hoped for on coherence and adaptation. There are positive indications of support for learning and continuity from the MPA, with no observable differences so far on coherence or adaptation. The ex ante objectives of MPAs are not set at a higher outcome level than comparators, nor are they more likely to measure institutional strengthening,2 which may be useful in more challenging fragility, conflict, and violence–affected environments. MPAs better support and measure climate-related objectives.

With respect to expectations from the MPA, the evaluation finds the following:

» Coherence. There is no evidence yet that MPAs are more tightly anchored in Country Partnership Frameworks than other engagements. Respondents view the longer-term horizon of vertical MPAs and the regional aspect of horizontal MPAs as more effective in motivating partnerships with other donors, but this is not yet reflected in cofinancing data. Neither vertical nor horizontal MPAs appear to support collaboration within the World Bank any better than the alternatives.

» Continuity. All those vertical MPAs that have moved beyond a first phase have done so without a break in support. Both vertical and horizontal MPAs provide flexibility in the timing of additional phases that can be tailored to country circumstances. Although risks to continuity have not materialized significantly, there is a perception that MPAs better manage these risks. Only two MPAs have been converted to stand-alone projects over the evaluation period.

» Learning. Most MPAs have adequate learning agendas. There is also evidence that lessons learned have informed follow-up phases of both horizontal and vertical MPAs and that learning is perceived as more effective under MPAs. At the same time, there is no discernible difference from other operations in how learning is financed (mainly through trust funds), and reporting on the implementation of learning plans focuses mainly on the preparation of follow-up phases.

» Adaptation. Given that the approach is relatively new, there is little evidence that the frequency of, motivations for, or content of restructuring under MPAs differ from those under non-MPAs. However, for the small number of MPAs that have progressed beyond phase 1, there is evidence that learning is informing the design of subsequent phases.

The growing use of MPAs highlights the need to consider the trade-offs among scale, speed, and complexity. The expectation for MPAs to deliver at scale and with speed is linked to the use of emergency response MPAs during the COVID-19 Strategic Preparedness and Response Plan. These emergency MPAs benefited from key fiduciary and operational flexibilities, which significantly contributed to their success, allowing for rapid disbursement and the swift achievement of project development objectives.
Although MPAs can still deliver at scale and with speed, achieving these objectives requires that project design and implementation features be intentionally geared toward them. Increased complexity could slow down implementation when speed and replicability are explicit objectives.

Implications

The evaluation identifies several areas for management to consider, as the World Bank increases its use of the MPA, that could strengthen the effectiveness of the approach. However, it notes that the portfolio is for the most part at an early stage of implementation, and all programs are still active, with only 7 out of the 40 having moved beyond phase 1 during the evaluation period. The benefits and risks of using the approach, particularly the absence of a firm commitment on either side to follow-up phases, should be better explained to clients at the outset. Among the issues for consideration are the following:

» Outcomes. It is important to ensure that MPA objectives and their targets, including outcomes and contributions to high-level outcomes, are set at an appropriate level of ambition and adjusted to reflect changing circumstances over the lifetime of the MPA and that targets are ratcheted up where possible.

» Coherence. The justification for using either type of MPA should be clearly articulated in Country Partnership Frameworks. This would include clarity on how the MPA fits into the larger World Bank Group program and, where relevant, how the program complements interventions by partners. Political consensus around a long-term objective or strategy can facilitate the implementation of vertical MPAs, while regional organizations, platforms, or specific mechanisms designed to intermediate knowledge across countries can similarly support horizontal MPAs.

» Continuity. The prioritization of MPAs in Country Partnership Frameworks should be considered, particularly for International Development Association–eligible countries subject to greater uncertainty over funding allocations, where funding trade-offs may be most acute and shocks more prevalent. The leveraging of MPAs through alternative sources of external finance (for example, climate finance or debt swaps) should also be considered. There is a strong need for dedicated training and networking opportunities to ensure that team leaders are well equipped with operational, technical, and interpersonal skills, as well as sector and country knowledge, to manage the complexities of implementing MPAs, especially those requiring coordination across sectors and countries.

» Learning. MPA learning activities need to be properly tailored to project activities, adequately resourced and monitored, and able to adapt to incorporate "learning moments" to strengthen feedback. Learning should also encompass institutional development over the program cycle. This learning may entail developing indicators that, for vertical MPAs, measure the effectiveness of long-term institutional reforms and, for horizontal MPAs, incentivize and measure the effectiveness of collaboration among participants.

1 Each phase in the multiphase programmatic approach follows the policies and procedures for the lending instrument it uses, including those for restructuring, though approval of additional financing within the program envelope is delegated to management.

2 Findings on outcome orientation and institutional development were confirmed using t tests.
Chi-squared tests of outcomes capturing institutional development suggest that comparators outperform multiphase programmatic approaches. This is mainly because Operations Policy and Country Services guidance is for multiphase programmatic approaches to have program development objective indicators at the beneficiary level.

World Bank Management Response

Management of the World Bank thanks the Independent Evaluation Group for the report Early-Stage Evaluation of the Multiphase Programmatic Approach. The evaluation assesses the performance of the multiphase programmatic approach (MPA) in accordance with the expectations outlined in the 2017 Board paper. It examines the MPA's design, early implementation, and effectiveness in supporting coherence, continuity, learning, and adaptation. Despite its limitations, this early-stage evaluation is relevant and timely, particularly given the growing use of MPAs and their potential role in supporting Global Challenge Programs.

Overall

Management welcomes the report's finding that MPAs have successfully met expectations on learning and continuity objectives, noting positive indications of support in these areas. Since the introduction of MPAs in 2017, their use has risen steadily across the World Bank, with most nonemergency MPA financing directed toward eligible countries of the International Development Association, focusing on infrastructure and sustainable development. MPAs have been used across all World Bank Regions. The report also finds that vertical MPAs that progressed beyond their initial phase maintained continuous support, and both vertical and horizontal MPAs provided flexibility in the timing of subsequent phases to align with country-specific needs. Furthermore, most MPAs have strong learning agendas, and evidence shows that lessons learned have effectively influenced follow-up phases of both horizontal and vertical MPAs, with many perceiving increased effectiveness in learning within this framework.

Management recognizes the report's finding highlighting the need to enhance the coherence and adaptation objectives of MPAs to strengthen the effectiveness of the approach; however, it notes that the evaluation is based on the expectations outlined in the 2017 Board paper and reflects an early stage of implementation. The focus of the evaluation is on assessing whether the MPAs meet the Board expectations in terms of design and functioning. Management notes that the context in which the evaluation is conducted (2024) has changed significantly since 2017. For instance, COVID-19 had a profound impact on development, and the effective use of MPAs during this period provided valuable lessons that have strengthened the approach in appropriate contexts. Furthermore, the vision and mission of the Bank Group have evolved with the implementation of the Evolution Roadmap and the establishment of six Global Challenge Programs. As a result, the Board's expectations regarding the use, impact, scale, and anticipated effectiveness of the MPA approach have shifted significantly since 2017.

Implications

Management acknowledges that the ability to deliver quickly and at scale may be limited by the complexity of evolving MPA designs. The use of MPAs during the COVID-19 Strategic Preparedness and Response Plan may have created an expectation that MPAs can always deliver with speed and scale.
These emergency MPAs benefited from key fiduciary and operational flexibilities that were critical to their success, enabling rapid disbursement and swift achievement of objectives. However, nonemergency MPAs can only deliver with speed and scale if intentionally designed with these characteristics in mind. Clear communication is essential for setting realistic expectations and helping clients understand both the opportunities and complexities of an MPA, and management is committed to communicating the benefits and risks of MPAs to clients from the outset.

Management acknowledges the report's findings on coherence, particularly as they underscore the importance of retaining operational flexibility in MPA deployment. While management concurs with the report's conclusion that political consensus around a long-term objective or strategy can facilitate the implementation of vertical MPAs, it emphasizes that this should be seen as a factor that enhances success rather than a mandatory condition. Similarly, management agrees with the report's conclusion that regional organizations or platforms can enhance the success of horizontal MPAs. In their absence, management notes that horizontal MPAs can still be pursued, provided alternative mechanisms with similar functions are incorporated into the MPA design. Regarding anchoring MPAs in Country Partnership Frameworks, management notes that the choice of approach may not yet be fully defined at the Country Partnership Framework preparation stage. It is essential to maintain flexibility—particularly when deciding whether to pursue the MPA approach and, if doing so, which type of MPA to use (vertical or horizontal).

Management agrees with the implications of the report's findings on continuity and learning, which suggest that careful task team leader (TTL) selection and specialized training are important factors in ensuring stability and long-term continuity in MPAs. Management will improve TTL selection processes; take stock of MPA learning agendas to ensure that they are adequately tailored to project activities; provide expanded guidance, training, and networking opportunities to equip TTLs with the necessary skills and knowledge; and facilitate smooth transitions between TTLs to ensure program continuity.

Management acknowledges the findings on outcomes and clarifies that MPAs were never intended to aim for higher-level outcomes than other instruments. The finding that MPAs are at the same level of outcome orientation as comparators is insightful and demonstrates the World Bank's strong focus on achieving outcomes across all its instruments. However, management wishes to clarify that the 2017 Board paper never suggested that the MPA approach would exceed other approaches in terms of outcome orientation. Instead, the Board paper positioned MPAs as an approach designed to foster an explicit long-term focus, facilitating the development of programs that could span multiple sectors, borrowers, and political cycles. This long-term focus was envisioned as a means to build consensus among stakeholders and support sustainability through transitions in political administrations and governance structures. Management is committed to continuing to ensure that MPA objectives and targets are set at an appropriate level of ambition. These objectives will be adjusted as needed throughout the MPA's lifespan, allowing for recalibration based on evolving circumstances and opportunities for greater ambition.
Management acknowledges the findings on adaptation, which show that restructuring under MPAs is not significantly different from that under non-MPAs, but cautions that the evaluation does not fully capture the phased adaptability of MPAs, making it premature to conclude that they are underperforming in this area. The evaluation found no evidence that adaptation under MPAs differs from non-MPAs, based on the limited use of project restructuring. However, this narrow focus overlooks the phased design of MPAs, where each phase builds on lessons learned from earlier ones. The phased approach allows MPAs to adapt to changing circumstances over time, a flexibility that is not fully captured by looking solely at within-phase restructuring. Additionally, adaptation involves more than just formal restructuring. Operational teams implementing MPAs regularly learn and adjust during implementation. The adaptive learning aspect of the MPA is a benefit recognized not only in Regional MPAs but also in MPAs within countries seeking to learn, adapt, and benchmark subnational governments in the implementation of this approach.

1 | Background and Evaluation Portfolio

The World Bank introduced the multiphase programmatic approach (MPA) in 2017 as a means of structuring a long, large, or complex engagement as a set of shorter linked projects or phases using either investment project financing (IPF) or Program-for-Results (PforR) financing.1 The MPA was intended to take either a vertical form within a single country, typically over 8–10 years, or a horizontal form across several countries at the same time (figure 1.1). It could also support activities that were either scalable or modular (for example, upgrading a road network in stages) or that followed a predictable course (for example, combining phased investments in clean water and sanitation with those in hygiene and nutrition education), provided that in each case those activities were consistent with the program development objective (PrDO). Another expectation was that the approach would combine complementary financing instruments (for example, an energy transition program might first finance investments in transmission and storage and then provide guarantees to stimulate private investment in generation).

The motivation behind the MPA, outlined in a 2017 paper from the Board of Executive Directors (henceforth referred to as the "2017 Board paper"), was to provide continuity of engagement, allow more flexibility in responses to changed circumstances, encourage adaptive learning, and support stepwise progress toward a long-term development objective (World Bank 2017). First, by signaling a willingness to pursue a long-term development objective, the MPA would strengthen the coherence of World Bank–financed interventions, contribute to building consensus on the client side, and diminish the likelihood of interruptions in support between phases. Second, compared with a single large operation, the MPA would enable more adaptation to circumstances by incorporating multiple opportunities for reflection and course correction. Third, the MPA would encourage structured learning and adaptation by requiring teams to articulate a forward-looking knowledge agenda.

The 2017 Board paper also expected benefits to accrue operationally. The MPA was intended to reduce processing costs and allow the World Bank to better manage its capital.
By committing financing in smaller phases, rather than committing the total cost of the larger program up front, the World Bank could reduce undisbursed balances and allow borrowers to save on commitment fees, as well as reduce the processing costs of follow-up phases relative to stand-alone operations. The 2017 Board paper also noted that the MPA might provide a framework for engagement by other lenders, even beyond the duration of the World Bank financing. For instance, the World Bank could mobilize commercial financing for the first phase of an infrastructure program or partner with other multilateral development banks, commercial lenders, private investors, or private companies to finance or implement subsequent phases.

Figure 1.1. Types of Multiphase Programmatic Approaches
[Diagram. Vertical MPA: single country, deep dive, long term (for example, 8- to 10-year duration); supports modular, scalable, or predictable activities under a common PrDO and learning agenda (example: reducing stunted growth of children through phases that scale a replicable delivery platform, expand to adolescent health and convergence with other sectors, and expand service coverage on a multisectoral platform). Horizontal MPA: multiple countries, states within a country, or subnational entities; narrow focus; short- or medium-term menu-based design (for example, 4- to 6-year duration); supports multiple borrowers to achieve a common objective, with countries joining in sequence under a common PrDO and learning agenda (examples: COVID-19, locusts, food security, trade, transport, WURI).]
Source: World Bank 2017.
Note: PrDO = program development objective; WURI = West Africa Unique Identification for Regional Integration and Inclusion.

Since the introduction of the MPA, there has been a steady increase in its use (figure 1.2). The World Bank has approved $18 billion for MPAs under its COVID-19 response and a further $28.8 billion under nonemergency MPAs. At the end of December 2023, the portfolio comprised 11 horizontal and 29 vertical MPAs. Most nonemergency MPA financing is in International Development Association (IDA)–eligible countries and targets infrastructure and sustainable development, with energy, agriculture, water, and transport the most important users (figure 1.3). MPAs have been used in all World Bank Regions. Of the $28.8 billion in nonemergency MPA financing, 75 percent has been for the Africa Region, 85 percent has been for IDA-eligible countries, and 20 percent has been for countries experiencing fragile and conflict-affected situations (FCS). Just a quarter of approved MPAs are horizontal, but they account for 55 percent of MPA financing, nearly 90 percent of which has gone to the Africa Region.

Figure 1.2. The Multiphase Programmatic Approach Portfolio, Fiscal Years 2018–24
[Chart of MPA approvals and commitments (US$, millions) and the number of MPAs by fiscal year, FY18–FY24.]
Source: Independent Evaluation Group calculations.
Note: MPA = multiphase programmatic approach. * Data as of December 31, 2023.
This evaluation is timely given the growth in the use of the MPA and the approach's potential role in supporting the eight Global Challenge Programs outlined in "Evolving the World Bank Group's Mission, Operations, and Resources: A Roadmap" (World Bank 2022a). These programs are to be supported through scalable solutions underpinned by knowledge, partnerships, and improvements in operational efficiency (for example, see World Bank 2024). According to the World Bank's Operations Policy and Country Services (OPCS), the 24-month lending pipeline included 31 MPAs with potential financing of $5.7 billion, of which 87 percent is expected to support Global Challenge Program objectives.2 Most of these MPAs are in digital development, energy and extractives, agriculture, and water.

Figure 1.3. Multiphase Programmatic Approach Portfolio by Global Practice
[Chart of MPA approval amounts (US$, millions) and the number of MPAs by Global Practice.]
Source: Independent Evaluation Group calculations.
Note: AGR = Agriculture; DD = Digital Development; EAE = Energy and Extractives; EDU = Education; FCI = Finance, Competitiveness, and Innovation; HNP = Health, Nutrition, and Population; MPA = multiphase programmatic approach; SPJ = Social Protection and Jobs; SSI = Social Sustainability and Inclusion; TR = Transport; URL = Urban, Disaster Risk Management, Resilience, and Land; WAT = Water.

The evaluation has been requested by the Committee on Development Effectiveness to inform the Board of Executive Directors' ongoing discussion with management on the MPA. The evaluation's audience is the World Bank's Board of Executive Directors, the Committee on Development Effectiveness, and World Bank Group management and staff working on MPAs.

1 Each phase in a multiphase programmatic approach follows the policies and procedures for the lending instrument it uses, including those for restructuring, though approval of additional financing within the program envelope is delegated to management.

2 Probability A or B only, as of December 31, 2023. See appendix B for definitions of probabilities.

2 | Scope, Evaluation Questions, and Methodology

Scope

The evaluation portfolio is limited to the 40 approved nonemergency MPAs as of December 31, 2023, as assessing emergency MPAs would require a distinct evaluation framework. It therefore excludes the COVID-19 Strategic Preparedness and Response Plan MPA and the Emergency Locust Response Program, the first of which has been covered by a separate evaluation (World Bank 2022b).

The evaluation assesses the performance of the MPA against the expectations outlined in the 2017 Board paper. The scope of this evaluation is largely determined by the youth of the MPA portfolio. All 40 nonemergency MPAs are under implementation, and 17 of them were approved in 2023. No ex post assessment of outcomes is therefore possible. The evaluation instead asks if MPA design is fulfilling the expectations outlined in the 2017 Board paper, if the specific features of MPAs are functioning as expected, and if there is an enabling environment for MPAs within the World Bank and on the client side.
The evaluation also assesses the claims made in the OPCS technical briefing to the Board on the processing times for MPAs relative to IPF (see appendix D).1

The evaluation does not look at the achievement of long-term outcomes, assess the uptake of MPAs or whether there have been "missed opportunities" to apply an MPA, ask if the policy scope of MPAs or the delegation of authority for MPAs should be expanded, or assess the improved efficiency of management of the Bank Group's financial resources. These issues are outside the scope of this evaluation and, given the youthful nature of the portfolio, not evaluable.

Evaluation Framework

The overarching objective of the evaluation is to assess whether the MPA is meeting Board expectations on design and functioning so far. This evaluation assumes that Board expectations were set by the 2017 Board paper. The evaluative framework for addressing this objective is underpinned by the theory of action for the evaluation (figure 2.1). In figure 2.1, column 1 contains the design features of the MPA—namely, the long-term horizon, the flexibility in the content and timing of phases, the learning requirements, and the processing efficiency—that are expected to support the MPA objectives listed in column 2: coherence, continuity, learning, and adaptation.

Figure 2.1. Theory of Action of the Multiphase Programmatic Approach
[Diagram linking MPA design features (long-term horizon; flexibility of timing and content through phasing; learning agenda; processing efficiency) to key MPA objectives (coherence, continuity, learning, adaptation) and to the direct outcome: a more effective approach to development challenges that require continuous, adaptive support, including through private capital mobilization.]
Source: Independent Evaluation Group.
Note: The Independent Evaluation Group developed this theory of action based on discussions with the Operations Policy and Country Services unit and a review of relevant World Bank documents. MPA = multiphase programmatic approach. * All MPA features feed to some extent into all key MPA objectives.

These design objectives are not unique to the MPA, but the approach was expected to enhance their delivery. All projects are expected to aim for these design objectives. However, the MPA's design features were meant to strengthen its ability to support a learning-based, adaptive, stable, and coherent program and thereby better support development effectiveness in the face of recurring and complex development challenges. These design objectives are briefly described as follows:

» Coherence. A coherent program fits within the broader program at the level of the country, sector, and institution. The MPA is expected to be more coherent than its alternatives because it was intended to leverage external partnerships and internal collaboration more effectively.

» Continuity. This refers to the MPA's ability to provide stable, long-term support, mainly for vertical MPAs. The vertical MPA supports continuity better than its alternatives because of its programmatic structure and the provision for overlapping phases.

» Learning. Although all operations should embed knowledge, the MPA requires an explicit learning plan, with specificity on implementation arrangements and how the knowledge is to be used.
» Adaptation. This refers to the ability of the MPA to adjust the content and timing of the phases in response to new information, evolving priorities, and changing context to better support the PrDO. An MPA would also have a larger number of preset points at which a stocktake could be done—for example, the Mid-Term Review of each phase—and would be better positioned to use learning to inform the design and implementation of subsequent phases. Adaptation therefore takes place both through restructuring and through learning-informed program design.

Three questions are addressed in this evaluation:

1. To what extent has the design of MPAs followed Board expectations and management guidance?
   a. To what extent have the objectives of MPA operations been oriented toward high-level impacts, including climate-related objectives and private capital mobilization?
   b. To what extent have MPAs been designed to support institutional development and learning?
   c. To what extent do MPAs conform to either the horizontal or vertical models outlined in the 2017 Board paper?

2. To what extent have the design features embedded in the MPA worked as expected to achieve design objectives?
   a. To what extent have the design features improved the coherence of interventions?
   b. To what extent have the design features supported programmatic continuity?
   c. To what extent have the design features facilitated and supported monitoring of learning within or across phases?
   d. To what extent have the design features supported adaptation to changing circumstances and priorities?

3. Under what circumstances or enabling conditions has the MPA worked as intended?
   a. To what extent have client-side conditions enabled or prevented the MPA from working as intended?
   b. To what extent have conditions within the Bank Group enabled or prevented the MPA from working as intended?

Since the analysis is largely ex ante, the evaluation relies on hypothesized mechanisms associated with the development effectiveness of the MPA. As the portfolio is still under implementation and nearly half the programs were approved in fiscal year 2023, it is not possible to evaluate outcomes or impact. The evaluation therefore assesses the extent to which the design objectives anchoring the theory of action are being achieved through specific hypothesized mechanisms, as described in table 2.1. The observable implications vary by type of MPA and are proposed based on technical discussions with OPCS and a review of project documents. More details on how achievement of the design objectives was evaluated are given in appendix A.
Table 2.1. Hypothesized Multiphase Programmatic Approach Mechanism Associated with Development Effectiveness

Coherence (World Bank 2017)
Mechanism: Agreement on long-term objectives and constraints across Global Practices and with development partners; management of risks to the program's ability to stay on track toward meeting program development objectives.
Observable implications (vertical): Articulation of long-term objectives in country strategies that strengthens consensus around them within the World Bank team; greater cross-sector collaboration on the World Bank side and the client side; greater collaboration with external partners; more private capital mobilization.
Observable implications (horizontal): Same as for vertical; evidence of additionality from the regional approach.

Continuity
Mechanism: Greater likelihood of long-term financing without interruption in engagement.
Observable implications (vertical): Overlapping phases; management of risks to continuity.
Observable implications (horizontal): Management of risks to continuity.

Learning
Mechanism: Requirement of a learning plan in the PAD backed by monitoring, implementation arrangements, and capture of lessons learned.
Observable implications (vertical): World Bank supervision more oriented toward learning than compliance; more self-evaluation by vertical MPA clients than in a single large operation.
Observable implications (horizontal): More parallel learning across World Bank teams and clients than in a set of independent operations.

Adaptation
Mechanism: Multiple points for reflection (Mid-Term Review and the end of each phase) that enable restructuring or cancellation of activities.
Observable implications (vertical): Earlier cancellation or restructuring in response to changed circumstances and lessons learned than in a single large operation; more evidence of restructuring anchored in learning; more frequent restructuring.
Observable implications (horizontal): Same as for vertical.

Source: Independent Evaluation Group.
Note: The mechanisms will be refined and expanded during the evaluation. MPA = multiphase programmatic approach; PAD = Project Appraisal Document.

Methods

The evaluation relies on a two-pronged analytical strategy, using (i) data analysis and (ii) key informant interviews across all evaluation questions. It uses a portfolio review of the 40 approved nonemergency MPAs and a set of comparators or selected non-MPA operations; a desk-based document review; and structured and semistructured interviews with key informants.

First, the evaluation relies on analysis of data from all nonemergency MPAs to assess the MPA design characteristics and mechanisms through in-depth content analysis of project documents. Then, it tests the extent to which MPAs follow a business-as-usual model by comparing the set of MPAs with a matched non-MPA comparator group comprising approximately 60 non-MPA operations. We extracted and coded several outcomes for both the MPA and comparator groups (see appendix B for a list of extracted outcomes and coding criteria). We focused on testing the observable implications of the MPA expected to materialize earlier in the project life cycle.

To construct the comparator group, two groups of comparator projects (henceforth referred to as "comparators") were selected to maximize intervention similarity while minimizing the influence of key confounders. The groups comprise (i) the most similar operations in the same country, for vertical MPAs, or in the same Region, for horizontal MPAs, and (ii) the most similar operations in a similar context, as measured by the public administration Country Policy and Institutional Assessments for fiscal years 2018–22 (as a proxy for institutional capacity). Project similarity is calculated as the cosine distance between the mean text embedding of the project description section of each candidate's Project Appraisal Document and that of the reference MPA project.
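The sketch below illustrates this matching step under stated assumptions: the report does not name the embedding model or tooling used, so the sentence-transformers model, the project IDs, and the description texts are all illustrative placeholders rather than the evaluation's actual inputs.

```python
# Minimal sketch of the comparator-matching step: embed PAD project descriptions,
# average the vectors, and rank candidates by cosine distance to a reference MPA.
# The embedding model and all inputs are hypothetical placeholders.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

def mean_embedding(description: str) -> np.ndarray:
    """Embed each sentence of a project description (naive split) and average the vectors."""
    sentences = [s.strip() for s in description.split(".") if s.strip()]
    vectors = model.encode(sentences)        # one vector per sentence
    return np.asarray(vectors).mean(axis=0)  # mean text embedding

def cosine_distance(a: np.ndarray, b: np.ndarray) -> float:
    return 1.0 - float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical reference MPA and candidate non-MPA operations (same country or Region).
reference_mpa = "Strengthen the resilience of regional food systems and agricultural value chains."
candidates = {
    "P000001": "Improve agricultural productivity and market access for smallholder farmers.",
    "P000002": "Expand urban water supply and sanitation services in secondary cities.",
}

reference_vector = mean_embedding(reference_mpa)
ranked = sorted(
    ((pid, cosine_distance(reference_vector, mean_embedding(text)))
     for pid, text in candidates.items()),
    key=lambda item: item[1],
)
print(ranked)  # smallest distance first: the most similar candidate comparators
```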
Second, the evaluation leverages structured and semistructured interviews with key informants, covering approximately 75 percent of MPAs, to validate findings, bridge gaps in evidence, understand how MPAs operate in the field, and triangulate the perspectives of various stakeholders. We conducted interviews with respondents within the World Bank (task team leaders [TTLs] of horizontal and vertical MPAs, practice managers, country directors, regional directors, directors of strategy and operations, and vice presidents) and from client countries, using a combination of purposive sampling strategies and stratification. We followed structured and semistructured interview protocols—asking all respondents within the same respondent category a set of identical questions—and then extracted and coded several items mapped to the evaluation subquestions using manual processing and NVivo (see appendix C). We mitigated potential biases inherent in this type of data (for example, selection, social desirability, confirmation) via proper selection of interviewees, projects, and interview questions (see appendix A, table A.1, and appendix C for detailed discussions of how we ensured the robustness of our analyses).

In the next chapter, we triangulate evidence from these sources to determine if MPAs align with expectations and add value through improved programmatic coherence, continuity, learning, and adaptability relative to comparators.

1 The Operations Policy and Country Services unit also noted that the multiphase programmatic approach would enable clients to achieve higher-level results faster than a set of stand-alone operations. We view this claim as unverifiable given the youth of the multiphase programmatic approach portfolio.

3 | The Evaluation Questions Addressed

To What Extent Does the Design of Multiphase Programmatic Approaches Meet Expectations?

MPAs aim at higher-level outcomes, but no more so than comparators. Although MPAs are more likely to support and monitor climate objectives than comparators, they are no more likely to track institutional development. Most MPAs have a well-developed learning agenda, with clarity on how learning is to be implemented and used. The typology of MPAs has evolved beyond expectations, with the Africa Region in particular moving toward more complex horizontal MPAs.

MPAs are no more likely to support higher-level outcomes than comparators. OPCS guidance is that all MPA PrDOs have at least beneficiary-level outcome indicators (see appendix B). In this evaluation, we compare the outcome level of MPA PrDOs, project development objectives (PDOs), and indicators with that of non-MPA operations by classifying them according to the plan set out in the Independent Evaluation Group's review Results and Performance of the World Bank Group 2020 (World Bank 2020; see appendix B for definitions and examples relating to this aspect of the evaluation). We do not examine the level of ambition at which indicator targets are set.
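As a hedged illustration of how such a distributional comparison can be tested: the overview notes that portfolio findings were confirmed with t tests and chi-squared tests, but the exact specification is not given here, so the choice of a chi-squared test of independence and the counts below are assumptions for illustration only, not figures from the evaluation.

```python
# Illustrative sketch only: a chi-squared test of independence comparing how MPA and
# comparator indicators are distributed across the four outcome levels (1-4).
# The counts are hypothetical placeholders, not data from the evaluation.
from scipy.stats import chi2_contingency

counts = [
    [20, 45, 35, 11],  # MPA indicators at outcome levels 1-4
    [50, 95, 80, 23],  # comparator indicators at outcome levels 1-4
]

chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
# A large p-value would be consistent with the finding that the outcome-level
# distributions of MPAs and comparators are statistically similar.
```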
Although all MPA PrDOs and PDOs are at least intermediate outcomes, the distribution of indicators across the four outcome categories is very similar to that for non-MPAs (figure 3.1), with vertical MPAs being slightly more outcome oriented than horizontal MPAs.1 The MPAs do, however, include a higher proportion of climate-related objectives. Almost two-thirds of MPA PrDOs measure climate change mitigation or adaptation outcomes, compared with one-third of non-MPA PrDOs. Interview evidence also suggests that the ambition of outcomes is not a notable reason for choosing to use the MPA. Rather, the main motivations for preferring the approach to either a regional IPF, a stand-alone project, or a series of projects are its longer-term horizon, its lending scale, and its flexibility (figure 3.2).

Figure 3.1. Level of Ambition in Outcomes: Multiphase Programmatic Approach versus Comparators
[Panel a: vertical MPAs versus vertical comparators; panel b: horizontal MPAs versus horizontal comparators. Each panel shows the share of MPAs or comparators (%) by outcome level, Level 1 to Level 4.]
Source: Independent Evaluation Group calculations.
Note: Level 1: output (product or service provided is within the control of the client). Level 2: immediate outcome (development of the capability of a group or organization, or initial benefit to people). Level 3: intermediate outcome (stakeholders apply a new capability to solve an issue, which causes a change in the lives of the ultimate beneficiaries). Level 4: long-term outcome (sustained change in delivery or governance or a sustained benefit to a beneficiary). (See appendix B for full definitions.) The sample size for vertical MPAs is 111, and for horizontal MPAs, it is 55; the sample size for vertical comparators is 248, and for horizontal comparators, it is 99. MPA = multiphase programmatic approach.

Figure 3.2. Reasons MPAs Are Considered More Appropriate Than Regional IPF (Horizontal MPAs), Series of Projects, or Stand-Alone Projects (Vertical MPAs)
[Bar chart of the share of respondents (%) citing each reason, shown separately for senior management, TTLs of vertical MPAs, and TTLs of horizontal MPAs.]
Source: Independent Evaluation Group calculations.
Note: Sample size for interviews with senior management is 13; for TTLs of vertical MPAs, it is 17; and for TTLs of horizontal MPAs, it is 10. Missing bars indicate 0 responses. GP = Global Practice; IPF = investment project financing; MPA = multiphase programmatic approach; TTL = task team leader.

It was also envisaged that MPAs would support private capital mobilization, including through guarantees, but this has not yet happened.2 Nine MPAs have aimed at and tracked private capital mobilized through PrDO, PDO, or intermediate indicators: seven in energy, one in agriculture, and one in digital development. None reported any capital mobilized, though comparators at a similar stage of implementation have not done so either. Only the Ethiopia Renewable Energy Guarantees Program currently uses a project-based guarantee (World Bank 2019a).
Unfortunately, the viability of the program's second phase was undermined when an unanticipated increase in foreign exchange risk meant that private developers were unable to secure financing. There have been no cases in which other financiers have taken responsibility for the design and implementation of follow-up phases. The Indonesia Geothermal Resource Risk Mitigation Project is an example of a sophisticated risk-sharing project but is being converted to a stand-alone project because of slow disbursement.

MPAs are designed to better support learning, as expected. OPCS guidance asks that all phases include a learning agenda and that follow-up phases provide an update on what has been learned. Almost all MPAs specify clearly what will be tested, how and by whom, and how the knowledge acquired will be used, although a few provide only an outline (see appendix B for examples). But the inclusion of a learning agenda is not exclusive to MPAs. Though not required, some series of projects have used the same approach. Examples are the Regional Climate Resilience Program for Eastern and Southern Africa (P180171) and the National Agriculture Development Program in the Democratic Republic of Congo (P169021).

The MPA is no more likely to monitor institutional outcomes than comparator operations overall. We define institutional outcomes as measured improvements in the functioning of organizational structures, management systems, or monitoring and evaluation systems. Several interviewees saw the MPA as a better means of building capacity across government cycles, but this perception is not fully supported by the portfolio analysis. As shown in figure 3.3, MPAs are no more likely to track institutional strengthening than comparator operations through PDO and PrDO indicators. This may be because OPCS guidance precludes defining PrDOs as institutional outcomes. Vertical MPAs support institutional strengthening through project management and other components to a greater extent than comparators, however.

Figure 3.3. Long-Term Institutional Development
[Panel a: vertical MPAs versus vertical comparators; panel b: horizontal MPAs versus horizontal comparators. Each panel shows the share of project components or indicators (%) for institutional capacity building via the project management component, institutional capacity building via other components, PrDO/PDO indicators for institutional development, and intermediate indicators for institutional development.]
Source: Independent Evaluation Group calculations.
Note: Sample size for vertical MPAs is 114 and for horizontal MPAs is 36; sample size for vertical comparators is 187 and for horizontal comparators is 39. MPA = multiphase programmatic approach; PDO = project development objective; PrDO = program development objective.

Horizontal MPAs have grown significantly more complex over time, calling for careful thought on resourcing and for realism about implementation challenges.
The relatively quick and large disbursements under the COVID-19 operation in 2020 (box 3.1) suggested to World Bank management that horizontal MPAs could be used more broadly to address regional development challenges relating to the energy transition, food security, and health emergency preparedness. Several such horizontal MPAs have since been approved, almost entirely in Africa, to address the perceived need for tackling development challenges at scale, with country participation encouraged by the flexibility to enter when ready and the ability to tailor the approach to country circumstances.3 But these MPAs are more complex than either the COVID-19 response operation or regional IPF, with a greater degree of country contextualization and variation in results frameworks (table 3.1). They also require more TTLs than regional IPF, and these should be seasoned staff members with advanced project management skills ("super-TTLs").

Box 3.1. The Independent Evaluation Group's Evaluation of the World Bank's Early Support to Addressing the COVID-19 Pandemic

The World Bank's response to the COVID-19 emergency was of unprecedented scale and speed. The Independent Evaluation Group's evaluation noted that the response was particularly swift in the most vulnerable countries. Although it was too early to observe outcomes, the evaluation pointed to promising evidence of early successes, such as the expansion of critical health and social protection capacities. The World Bank used its experience from past crises to respond quickly and effectively, and teams innovated and adapted. Operational flexibility facilitated rapid financing, and procurement was smooth. World Bank country programs also drew on existing partnerships, crisis instruments, and regional projects. Internal World Bank efforts were facilitated by already having operational support for human capital, gender, disease preparedness, data systems and partnerships, and crisis instruments included in country portfolios.

Source: World Bank 2022b.

Table 3.1. Comparison of Horizontal Multiphase Programmatic Approaches and Regional Investment Project Financing

Scope
COVID-19 SPRP: 94 operations, US$20 billion commitments. Horizontal MPA: 11 operations, US$12.4 billion commitments. Regional IPF: 145 operations, US$22.2 billion commitments (including additional financing).

Rationale
COVID-19 SPRP: global public good; rapid preparation; platform for cross-country learning; World Bank procurement support. Horizontal MPA: regional public good/connectivity; platform for cross-country learning; experimentation within country. Regional IPF: regional integration; coordination.

Structure
COVID-19 SPRP: same PrDO across operations; same theory of change but menu of activities; core plus country-specific PDO/intermediate indicators. Horizontal MPA: same PrDO across phases; flexibility in theory of change and activities; flexibility in PDO/intermediate indicators. Regional IPF: same PDO across countries; same theory of change and activities; same intermediate indicators.

Management and resources
COVID-19 SPRP: 2 TTLs per operation; US$12,200 supervision budget per country/month. Horizontal MPA: 6 TTLs per operation; US$13,600 supervision budget per country/month. Regional IPF: 4 TTLs per operation; US$14,700 supervision budget per country/month.

Source: Independent Evaluation Group staff analysis.
Note: IPF = investment project financing; MPA = multiphase programmatic approach; PDO = project development objective; PrDO = program development objective; SPRP = Strategic Preparedness and Response Plan; TTL = task team leader.

Although the MPA allows financing instruments to be combined across phases, this feature has not been used to any significant extent. The 2017 Board paper was not explicit in its expectations for PforR financing. Some MPAs have highlighted the importance of being able to combine financing instruments to address different objectives within the same program, even though the approach excludes the use of development policy financing. But in practice, most of the portfolio has used IPF. Only four operations have used PforR: the second (Tanzania) phase of the Food Systems Resilience Program for Eastern and Southern Africa, the Dominican Republic Water Sector Modernization Program, the Kenya Green and Resilient Expansion of Energy Program, and the fifth (Tanzania) phase of the Accelerating Sustainable and Clean Energy Access Transformation in Eastern and Southern Africa Program. In three of these, PforR is either the sole intended instrument or is used to support a single phase in a horizontal engagement; only in the Kenya engagement is there a planned shift from IPF to PforR from one phase to another.4

Overall, on design, MPAs do better on learning, although this finding rests only on content analysis of program documents. However, there is no observable difference relative to comparators in terms of outcome level (except with respect to climate-related indicators) or support to institutional development. The typology of MPAs has evolved somewhat, with nonemergency horizontal MPAs becoming increasingly complex, both because of the nature of the problems being tackled and because of greater country contextualization than is possible under a regional IPF.

To What Extent Are Multiphase Programmatic Approaches Functioning in Line with Expectations?

There is evidence, albeit mixed, that MPAs better support coherence, continuity, and learning. Although the age of the portfolio precludes any definitive conclusion on adaptation, early evidence suggests that there is no difference between MPAs and non-MPAs in terms of the frequency of and the underlying reasons for restructuring.

Coherence

The MPA's convening power was expected to strengthen programmatic coherence. The 2017 Board paper refers to coherence but is not explicit about underlying expectations or definitions. In evaluation, coherence is typically defined in terms of how a program fits with others in the country, sector, or institution. All projects, under an MPA or not, are expected to be coherent. However, the MPA is expected to better support programmatic coherence because it is expected to convene partnerships better, both within the Bank Group (internal coherence) and externally (external coherence). In addition, the MPA was expected to be strongly anchored in the strategic priorities described in the country strategies (strategic coherence).

There is no evidence that MPAs are any different from non-MPAs in their anchoring in country strategies. We reviewed Country Partnership Frameworks (CPFs) and asked two questions. First, are the MPAs designed to address constraints prioritized by the CPFs? Second, do CPFs clearly influence the rationale for selecting an MPA?
Although MPAs address strategic constraints prioritized in the country strategies, the rationale for selecting the approach is mostly not presented in CPFs, with a few exceptions (for example, the Fiji Tourism Development Program and Improving Nutrition Outcomes Using the Multiphase Programmatic Approach in Madagascar). This omission may occur for two reasons: (i) the youth of the portfolio and its lack of alignment with the timing of CPF preparation and (ii) pipeline projects and programs not typically being well fleshed out in country strategies.

Evidence that the MPA supports stronger complementarity with development partners comes entirely from interviews.5 The evaluation assessed whether MPAs complement, co-implement, or coordinate with the activities of development partners. Cofinancing by development partners is no higher under the approach than under comparators. However, interviewees pointed out that cofinancing picks up only one aspect of external partnership and that much collaboration takes the form of undocumented dialogue, knowledge sharing, and coordination, often around learning. In the Western Balkans Trade and Transport Facilitation and Serbia Railway Sector Modernization Programs, for example, clients and World Bank teams emphasized the role of the MPA in supporting the priorities of the European Union, with which the World Bank coordinates closely. Under the Accelerating Transport and Trade Connectivity in Eastern South Asia Program, trust-funded consultations between the two countries participating in the MPA and India, which shares borders with both, have led to greater regional dialogue and pragmatic solutions to specific border issues. Under the Horn of Africa Groundwater for Resilience Project, development partners and the World Bank shared knowledge on technical and operational solutions to problems in the borderlands that had been challenging to find.

Longer time horizons are seen to support partnerships in vertical MPAs, suggesting improved external coherence (figure 3.4). Fifty-three percent of the TTLs of vertical MPAs interviewed for this evaluation referred to complementary partner interventions in the sector or to coordination around these interventions. The longer-term horizon anchoring the program facilitates partnerships and aligns better with longer-term government strategies, including those designed to span political cycles, as in Côte d'Ivoire, the Dominican Republic, and Kenya. In fragile environments where the World Bank has not been lending regularly, the MPA is seen as signaling a long-term commitment that anchors partner expectations, as in the Central African Republic. TTLs also noted that in the early stages of implementation, the role played by the MPA in supporting alignment around complex development agendas, such as nutrition and food security, is not captured by project documents.

Figure 3.4. Multiphase Programmatic Approach–Related Features Perceived to Support Coherence
[Bar chart of the share of respondents (percent) citing MPA-related features perceived to support coherence: long-term objectives strengthen consensus, cross-sector collaboration, cross-country learning and collaboration, menu approach, identification of global public good, and other factors; shown separately for TTLs of vertical and horizontal MPAs.]
Source: Independent Evaluation Group calculations based on interview data.
Note: Sample size for interviews with TTLs of vertical MPAs is 17, and for horizontal MPAs, it is 10. Missing bars indicate zero responses. MPA = multiphase programmatic approach; TTL = task team leader.

Nearly all interviewed TTLs of horizontal MPAs highlighted several advantages of the MPA relating to partnerships. The majority of TTLs of horizontal MPAs see partnerships as the cornerstone of the MPA, for several reasons. First, the common program objective and indicators support coherence with regional objectives. Second, development partners can target funding to phases of the program that align with their own budgetary cycles. Even when not providing funding, partners in some cases saw the MPA as providing a base engagement, so that the phasing and sequencing over the long term generated more opportunities to complement the program. Third, regional platforms can mobilize funds even when partners do not have a funding relationship with individual countries in the program. Fourth, the regional platforms embedded in partner regional institutions support regular exchange of information and experiences and strengthen partnerships with and within regional economic communities.6 Clients also perceived that exchange of experiences and regular interaction within various regional platforms support programmatic coherence, as does the engagement of regional economic communities and institutions. Some clients noted that individual country objectives could change over time, or commitment to regional integration could weaken, which would put programmatic coherence at risk.

Contrary to expectation, MPAs are no better at supporting cross-sector collaboration within the Bank Group, suggesting the approach does not improve internal coherence. The expectation was that the MPA would better motivate cross-sector collaboration. This was driven by the assumption in the 2017 Board paper that the MPA would integrate elements of country dialogue that would otherwise be confined to individual operations. Both portfolio review and interview data suggest that cross–Global Practice collaboration is no different across MPAs than across comparators. Interviewees working in areas that are traditionally multisectoral in approach—for example, food security or nutrition—collaborate the same way across MPAs and non-MPAs.

Even in fragile contexts, the graduated, iterative approach of the MPA and its programmatic structure can help maintain consensus on the program and manage risks to coherence. Sixty percent of TTLs of horizontal MPAs and 53 percent of TTLs of vertical MPAs see the MPA as supportive of building consensus on the program (figure 3.4 presents a breakdown by MPA characteristic). Four of the five TTLs of vertical MPAs covering FCS noted that the long-term horizon of the MPA supported consensus building among key stakeholders. For horizontal MPAs, TTLs from four of the six programs covering FCS perceived that cross-country learning and collaboration supported consensus building. Ninety percent of TTLs of horizontal MPAs also highlighted that the menu-based approach permitted countries to contextualize the program to their own level of readiness or their own needs, even within the larger common regional framework. This contextualization supported client ownership and helped build consensus on the larger program-level goals across very different countries.
Continuity

The long-term horizon and the programmatic structure of vertical MPAs were expected to support uninterrupted engagement, without much clarity in the 2017 Board paper on what continuity meant for horizontal MPAs. The 2017 Board paper saw continuity mainly in terms of a stable engagement with the client across a longer time horizon and, therefore, as applying more to vertical MPAs. Because the portfolio is young, portfolio-based evidence on continuity comes mainly from the three vertical MPAs that have closed phase 1 successfully.

Two design features of the MPA support continuity: the longer-term horizon and overlapping phases. Overlapping phases are very common among the more advanced MPAs. The second phases of the three vertical MPAs that have moved beyond a first phase started 16, 54, and 56 months, respectively, before the planned closure of the first phase. Although series of projects share the design feature of overlapping phases, the few respondents who were able to compare noted that continuity was stronger under MPAs, provided there was a long-term strategy to anchor the program. Some pipeline MPAs were converted to series of projects at a late stage of program preparation and carried forward design features of the MPA, including overlapping phases.

Risks to program continuity—or the risk of cancellation—have not materialized to a significant extent so far, though some programs face severe challenges, and two are being converted to stand-alone engagements. There are three examples of risks to program continuity materializing because of implementation challenges. In the first case, the program design was not adequately tailored to the client context, and the client's regulations did not support the implementation of phase 1. This was compounded by a series of shocks (the pandemic and a macroeconomic crisis leading to civil unrest). In two other cases, the policy framework for successful program implementation never materialized. All three cases were characterized by very low disbursements, reportedly affecting the World Bank's commitment to the program. However, risks to program implementation are no different for MPAs than for non-MPAs.

Learning

The more advanced MPAs of both types used learning to inform subsequent phases. As noted under evaluation question 1 (see chapter 2), most MPAs do embed learning plans. Evidence from MPAs that have progressed past phase 1 suggests that MPAs are using learning as envisaged. Portfolio analysis shows that this learning is mostly institutional and used to inform the design and implementation of subsequent phases. Six of the seven MPAs that have progressed to a second phase show strong evidence of redesign based on new technical and institutional knowledge.7 Of these seven, the four horizontal MPAs refer to the importance of sharing practical knowledge across project implementation units and of working through and strengthening regional organizations (see examples in box 3.2), including for data sharing (Western Balkans Trade and Transport Facilitation, West Africa Unique Identification for Regional Integration and Inclusion Program, West Africa Food System Resilience Program, and Food Systems Resilience Program for Eastern and Southern Africa).
The three vertical MPAs also emphasize institutional learning, especially Advancing Sustainability in Performance, Infrastructure, and Reliability of the Energy Sector in the West Bank and Gaza.

Box 3.2. Examples of Institutional and Technical Learning in World Bank Multiphase Programmatic Approach Projects

Western Balkans Trade and Transport Facilitation: Given the politically sensitive nature of trade facilitation, assessing the political and institutional landscape was important. It was also critical to draw on the experience of the country team on implementation arrangements. Biweekly meetings with project implementation units helped maintain strong collaboration across first-phase countries and encouraged healthy competition. Standardizing and sharing procurement documents (terms of reference and calls for expressions of interest) improved the quality of procurement packages and sped up implementation.

West Africa Unique Identification for Regional Integration and Inclusion Program: The first phase of the multiphase programmatic approach showed that it was important to emphasize the role of identification systems in service delivery from an early stage. This required engagement with the government, the public, and service providers. It was also crucial to underscore that registration was free and that authentication methods were designed to prevent exclusion. The second phase therefore emphasized (i) engagement across government to develop national strategies for using identification credentials for service delivery, (ii) increased financing for communications and outreach, and (iii) the design of appropriate service delivery and authentication methods consistent with an all-doors-open policy.

Source: Independent Evaluation Group.

Learning is perceived as more effective under the MPA. As evidenced by interviews, 70 percent of TTLs of horizontal MPAs and 47 percent of TTLs of vertical MPAs see MPAs as embedding learning differently from non-MPAs. Almost half the respondents experienced with vertical MPAs and 90 percent of the respondents experienced with horizontal MPAs provided concrete examples of incorporating distinct activities relating to learning. Vertical MPAs have a significantly longer learning horizon than horizontal MPAs, suggesting that the benefits of investing in learning may also become visible at a later stage. Moreover, although the percentage of respondents experienced with vertical MPAs who explicitly noted differences in learning was lower, more than half did not comment on the topic. All but two clients had positive views on learning under the MPA, mostly validating TTL views; however, only 12 interviews were conducted with clients across 5 programs, so this should not be interpreted as representative of views among clients in general.

Horizontal TTLs were strongly positive on learning, highlighting its role in supporting operational standardization and replicability. The Board paper had expected learning under horizontal MPAs to strengthen these areas.8 In practice, the majority of horizontal MPAs are at early stages of implementation. However, TTLs were able to provide concrete examples of how learning is supported through regional hubs that serve as repositories for both technical and operational knowledge.
Horizontal MPAs use regional platforms, typically housed in the implementing regional partner's facility, to support learning in a manner perceived as unique to the approach. These platforms are perceived to have become learning hubs supporting operational efficiency and problem-solving across countries (for example, the Accelerating Sustainable and Clean Energy Access Transformation Program, the West Africa Food System Resilience Program, and the Horn of Africa Groundwater for Resilience Project). TTLs highlighted that because countries enter the MPA based on willingness rather than readiness, activities are typically contextualized, so entrants at earlier stages of preparation can learn from more advanced incumbents. This feature is unique to the horizontal MPA. Clients agreed that regional platforms facilitated peer-to-peer learning, helpful both for improving operational efficiency (for example, shared templates for tenders, lessons learned from mistakes in phase 1, and shared consultants across countries in the same program) and for technical learning (for example, operationalization of trade facilitation–related digital platforms, strengthened coordination, and data sharing across border agencies). The perception of strengthened support to learning under horizontal MPAs could be driven by non-MPA regional projects having less flexibility on the timing and content of phases across countries.

The TTLs of vertical MPAs mainly highlighted that the longer-term horizon and phasing of the approach allow for investment in learning that can inform design during implementation. In some cases, the MPA was designed to have significant overlap between phase 1 and phase 2, with more learning expected between phase 2 and phase 3. TTLs also highlighted that vertical MPAs allowed time for experimentation where new knowledge is being built. Clients also perceived that capacity built over the longer term was expected to be sustained beyond the life of the program. The deep, in-country engagement under the vertical MPAs also permitted time to implement institutional and service integration across agencies, identified as a challenge for complex, cross-cutting programs such as those supporting client resilience or those taking sectorwide approaches.

Learning is perceived to be better supported under MPAs, though resourcing and reporting are the same under MPAs and non-MPAs. Although TTLs and senior management note that learning is working as expected in most cases, reporting is limited, and no World Bank budget for learning was identifiable for MPAs. Learning is largely donor financed, both through trust-funded analytical and advisory work linked to the program and through multidonor trust funds managed centrally (typically by Global Practices, such as Health, Nutrition, and Population). Although this is also true for non-MPAs, the MPA was expected to incorporate and use learning more effectively, and this evaluation expected resourcing to be commensurate with these expectations. The institutional arrangements for implementing learning plans and managing knowledge are also uneven.

Adaptation

The MPA was expected to support adaptation better because of two design features—the long-term horizon and the programmatic structure—that allowed for longer-term and complex undertakings to be broken into shorter, more tractable phases.
Stability of engagement over the longer horizon would permit a larger number of preset checkpoints for self-evaluation and course correction. At the same time, the programmatic structure of the MPA allowed for the program to be informed by lessons learned in earlier phases.

However, there is little evidence that adaptation under MPAs differs from adaptation under non-MPAs, and emerging data from advanced programs are insufficient to support any broad conclusions. Overall, there is no significant difference between MPAs and non-MPAs in terms of the frequency of or underlying reasons for restructuring (figure 3.5). However, when MPAs restructure, they do so earlier than comparators. It is unclear whether this is an indication of adaptability—it may simply reflect that preparation was inadequate, necessitating restructuring during implementation, as is also the case for non-MPAs. The share of first restructurings unrelated to external or country-specific events is identical across the two groups.

Figure 3.5. Reasons for Restructuring
[Bar charts of the number of first, second, and third restructurings for MPAs and comparators, by reason: change in external environment, change in internal environment, and other.]
Source: Independent Evaluation Group calculations based on 15 MPA and 22 comparator restructurings.
Note: MPA = multiphase programmatic approach.

The underlying reasons for restructurings are mainly driven by country context. These reasons may be factors external to the program that pose a risk to continuity (for example, political instability, shifting client expectations and priorities, other shocks, or low institutional capacity) or to business as usual (for example, moving financing around components of the project or resetting targets against results indicators). These factors are corroborated by interview evidence: among the factors identified by TTLs and senior management that can hinder implementation, political context and client context dominate.9 The nature of restructuring is also similar across MPAs and non-MPAs—for example, adjustments to indicators, closing-date extensions, or reallocation of funding. The lack of uniqueness in adaptation of the MPA may be because of the age of the evaluation portfolio: only 11 operations had been restructured as of December 2023, 10 of which were vertical MPAs.

To What Extent Has the Enabling Environment Supported Multiphase Programmatic Approaches as Intended?

The authorizing environment is supportive of MPAs and recognizes the potential of the approach for recurrent development challenges requiring continuous support. However, the approach is typically used to support a complex range of activities, often under challenging implementation conditions, and calls for adequate resourcing, guidance, systems support, and patience to enable adequate design and client engagement.

This evaluation question was addressed entirely through interviews with clients, senior management, and TTLs. The enabling environment was assessed along two main avenues: (i) Has the enabling environment changed since 2018, and how? and (ii) What seem to be the emerging aspects of the enabling environment?
The respondents perceive the MPA as having the potential to deliver on complex challenges if expectations are realistic and programs are well supported. Its long-term horizon and phased, learning-based support are seen as advantages by clients, senior management, and TTLs, especially in FCS. But it is misleading to suppose that the World Bank's relatively rapid deployment of the large-scale COVID-19 response will translate to nonemergency contexts. The iterative nature of the MPA is critical to its effectiveness, and there should be no expectation that MPAs are more likely to generate large, front-loaded disbursements than non-MPAs, especially in fragile contexts.

Client Authorizing Environment

According to most interviews with clients, senior management, and TTLs, the decision to use the MPA had been taken by the World Bank and not the government. Only in the Dominican Republic and the Horn of Africa programs was the World Bank asked to support already articulated programs, whose designers recognized the need for something beyond business as usual, whether motivated by a longer-term horizon or by the flexibility to accommodate participants at different stages of readiness. However, even when the approach originated with the World Bank rather than with the government, several World Bank counterparts acknowledged its benefits. These were largely the same as those expected under the MPA theory of action (especially flexibility and cross-country learning for horizontal MPAs), though one client observed that phasing also enabled fiscally constrained countries to manage their borrowing more effectively.

Contrary to the expectations of the 2017 Board paper, shifts in government priorities have not yet affected the implementation of later phases. Previous experience with adaptable program loans had indicated that client support for long-term programs often turned out to be more vulnerable to political cycles than initially presumed (World Bank 2017). However, interviews suggest that MPA objectives have been sufficiently strategic or universal to command broad political support and that risks to continuity, whether external shocks or changing priorities, are broadly similar across MPAs and other operations and are addressed in the same way through restructuring.10

TTLs and clients note that efficiencies in preparation can be jeopardized if countries under horizontal MPAs are at different stages of readiness. In the West Africa Unique Identification for Regional Integration and Inclusion Program, for example, Benin and Togo already had the legal and institutional pillars in place before approval, but the other second-phase participants, Burkina Faso and Niger, did not. Some TTLs and clients also observed that, as with non-MPAs, internal pressure to approve operations quickly had shifted activities from preparation to implementation and delayed effectiveness, unfairly contributing to a perception of underperformance and putting subsequent phases at risk.

World Bank Authorizing Environment

The World Bank authorizing environment has shifted toward stronger support for MPAs since 2018, mainly because of an expectation that MPAs will deliver scale with speed. On the plus side, this means that staff feel supported in choosing the approach. But according to senior management, the growing enthusiasm for MPAs is largely motivated by the perception that the approach allows for lending rapidly at scale.
This perception is linked to the relative speed at which the COVID-19 Strategic Preparedness and Response Plan was disbursed. However, as discussed earlier in this chapter, the COVID-19 response was an outlier. Senior management and TTLs cautioned that although there may be operational efficiencies down the line, MPAs are for the most part slow-moving, complex programs, and the World Bank leadership's expectations of speedy results may be unfounded. The overall authorizing environment was characterized by managerial incentives, awareness and capacity, and budget (figure 3.6).

Figure 3.6. Factors Cited as Critical in the World Bank Authorizing Environment
[Bar chart of the share of respondents (percent) citing managerial incentives, awareness and capacity, and budget as critical factors, shown separately for senior management, TTLs of vertical MPAs, and TTLs of horizontal MPAs.]
Source: Independent Evaluation Group calculations.
Note: "Managerial incentives" refers to World Bank leadership's ownership of the MPA and encouragement to prepare programs under the approach; "awareness and capacity" refers to Operations Policy and Country Services guidance being adequate and consistent for teams, as well as to awareness of the MPA in practice among Country Management Units; "budget" refers to the World Bank budget for preparation and supervision (excludes trust funds). The sample size for interviews with senior management is 13; for TTLs of vertical MPAs, it is 17; and for TTLs of horizontal MPAs, it is 10. MPA = multiphase programmatic approach; TTL = task team leader.

This push for larger MPAs was not envisaged in the 2017 Board paper, which mainly focused on the approach being suitable for learning-based, iterative engagements that could address complex development challenges. The push may also conflict with the emphasis on long-term continuity. A third of senior management and TTLs highlighted that pressure to disburse and deliver results fast could erode World Bank support for follow-up phases in programs that face early challenges in implementation. Several observed that there was a push to cancel funds prematurely, sometimes as early as two years into implementation, which reflected a tension between ambition and reality.

A further threat to continuity lies in uncertainty over the programming of IDA funds. This risk was identified in the 2017 Board paper, which highlighted that the availability of and conditions attached to World Bank financing might change during the implementation of the MPA. Most client interviewees did not mention any concerns over the World Bank's commitment to future phases, contingent on performance, but at least one said they had been unaware that the World Bank was not committing to the entire program. About half of senior managers and a fifth of TTLs also pointed out that the current pipeline will lead to a bunching of demand for IDA funds in fiscal year 2026. One MPA may be converted to a stand-alone project because the Country Management Unit lacked the resources to fully fund the second phase. As the portfolio expands, maintaining the credibility of the approach may entail a crowding out of financing for other engagements and reduced flexibility of the broader country program.

Continuity of World Bank technical support is also undermined by the obligatory 4-year rotation of international TTLs, which is even more problematic in the context of a 10-year engagement than it is in a 5- or 6-year engagement.
Two clients mentioned that changes in TTLs had disrupted understandings between the World Bank and implementing agencies. It may be that sustaining MPA client relationships comes to depend more on local staff, who may require significant support and training. All senior managers and TTLs working in FCS highlighted the need to ensure management and TTL commitment to the program.

1 The mean outcome levels are 2.5 for vertical multiphase programmatic approaches (MPAs) versus 2.3 for horizontal MPAs (program development objectives), and 2.3 for vertical MPAs versus 2.2 for horizontal MPAs (indicators). These differences are statistically significant.
2 "The MPA could serve as a vehicle for crowding in funding from other sources, spurring a 'Cascade' effect" (World Bank 2017, 17).
3 The Accelerating Sustainable and Clean Energy Access Transformation Program MPA, for example, envisages 100 million beneficiary households, and the West Africa Food System Resilience Program targets 4 million farmers and agricultural firms.
4 The Dominican Republic Water Sector Modernization Program used a hybrid Program-for-Results and investment project financing approach for both phases, rather than a transition from one instrument to the other. This combination of instruments is common in the non-MPA portfolio.
5 Development partners include bilateral agencies, multilateral development banks, and regional institutions.
6 External partners include regional institutions and regional economic communities.
7 Only one does not (the Kenya Green and Resilient Expansion of Energy Program) because that MPA's second phase was designed intentionally to overlap with the first.
8 The Independent Evaluation Group's 2019 evaluation, Two to Tango: An Evaluation of the World Bank Group Support to Fostering Regional Integration, highlighted that the MPA addressed a recommendation of the 2014 Independent Evaluation Group evaluation Learning and Results in World Bank Operations: Toward a New Learning Strategy (World Bank 2014, 2019b). Two to Tango underscored that the World Bank Group had a comparative advantage in terms of its ability to intermediate global knowledge across regions, but the evaluation did not analyze how regional projects generated and managed learning.
9 Positive factors include client ownership, clarity on availability of funding, and the commitment of World Bank leadership to the program. Negative factors include shocks to the country context, electoral cycles leading to shifts in priorities, mismatches between program design and implementation capacity, and uncertainty about the availability of International Development Association or other resources.
10 That said, one MPA may be converted to a stand-alone operation because the government did not implement a tariff adjustment.

4 | Findings and Implications

Main Findings

Overall, the MPA has partly met the expectations set at its creation. This evaluation cannot assess the MPA's relative performance in helping achieve long-term outcomes because of the youth of the MPA portfolio, but the findings currently show that although there are positive indications of support for learning and continuity, so far there is little difference on coherence and adaptation.
In addition, the ex ante objectives of MPAs are not set at higher outcome levels than those of comparator projects and programs, nor are MPAs more likely to support or measure institutional strengthening, which may be critical in more challenging environments experiencing fragility and conflict. We also found the following with respect to the specific design expectations for the MPA:

1. Coherence. There is no evidence yet that MPAs are more tightly anchored in CPFs than other engagements. Respondents view the longer-term horizon of vertical MPAs and the regional aspect of horizontal MPAs as more effective in motivating partnerships with other donors, but this is not yet reflected in cofinancing data. Neither vertical nor horizontal MPAs appear to support collaboration within the World Bank any better than the alternatives.

2. Continuity. All the vertical MPAs that have moved beyond a first phase have done so without a break in support. Both vertical and horizontal MPAs provide flexibility in the timing of additional phases that can be tailored to country circumstances. Although risks to continuity have not materialized significantly, there is a perception that MPAs better manage these risks. Only two vertical MPAs are being converted to stand-alone engagements.

3. Learning. Most MPAs have adequate learning agendas. There is also evidence that lessons learned have informed follow-up phases of both horizontal and vertical MPAs and that learning is perceived as more effective under MPAs. However, there is no discernible difference from other operations in how learning is financed (mainly through trust funds), and reporting on the implementation of learning plans is mainly reflected in the preparation of follow-up phases.

4. Adaptation. Given that the approach is relatively new, there is little evidence that the frequency of, motivations for, and content of restructuring under MPAs are any different from those under non-MPAs. However, for the small number of MPAs that have progressed beyond phase 1, there is evidence that learning is informing the design of subsequent phases.

Implications

This evaluation has highlighted several areas in which MPAs' effectiveness might be strengthened, while noting that the portfolio is for the most part at an early stage of implementation and all programs are still active, with only 7 of the 40 having moved beyond phase 1 during the evaluation period. Although MPAs can still deliver with scale and speed, achieving these objectives will require that project design and implementation features be intentionally geared toward them. The benefits and risks of using the approach, particularly the absence of a firm commitment on either side to follow-up phases, should be better explained to clients at the outset. Other issues for consideration are as follows:

» Outcomes. It is important to ensure that MPA objectives and their targets, including outcomes and contributions to high-level outcomes, are set at an appropriate level of ambition; that they are adjusted to reflect changing circumstances over the lifetime of the MPA; and that targets are ratcheted up where possible.

» Coherence. The justification for using either type of MPA should be clearly articulated in CPFs. This would include clarity on how the MPA fits into the larger World Bank Group program and, where relevant, how the program complements interventions by partners.

» Continuity.
The prioritization of MPAs in CPFs should be considered, particularly for IDA-eligible countries subject to greater uncertainty over funding allocations, where funding trade-offs may be most acute and shocks more prevalent. The leveraging of MPAs through alternative sources of external finance (for example, climate finance or debt swaps) should also be considered. Attention should be given to providing dedicated training and networking opportunities to MPA TTLs, including local staff. It is critical to ensure that team leaders are well equipped with operational, technical, and interpersonal skills, as well as sector and country knowledge, to manage the complexities of implementing MPAs, especially those requiring coordination across sectors and countries. Transitions across TTLs need to be especially carefully managed.

» Learning. MPA learning activities need to be properly tailored to project activities, adequately resourced and monitored, and able to adapt to incorporate "learning moments" to strengthen feedback. Learning should also encompass institutional development over the program cycle. This learning may entail developing indicators that, for vertical MPAs, measure the effectiveness of long-term institutional reforms and, for horizontal MPAs, incentivize and measure the effectiveness of collaboration among participants.

The growing use of MPAs highlights the need to consider the trade-offs among scale, speed, and complexity. The expectation that MPAs will deliver at scale and with speed is linked to the use of emergency response MPAs during the COVID-19 Strategic Preparedness and Response Plan. These emergency MPAs benefited from key fiduciary and operational flexibilities, which significantly contributed to their success, allowing for rapid disbursement and the swift achievement of PDOs. Although MPAs can still deliver with scale and speed, achieving these objectives will require that project design and implementation features be intentionally geared toward them. Increased complexity could slow down implementation when speed and replicability are explicit objectives.

References

World Bank. 2014. Learning and Results in World Bank Operations: Toward a New Learning Strategy. Independent Evaluation Group. Washington, DC: World Bank.

World Bank. 2017. "Multiphase Programmatic Approach." Operations Policy and Country Services, World Bank, Washington, DC.

World Bank. 2019a. "Ethiopia—Renewable Energy Guarantees Program, Phase I." Program Appraisal Document 136433-ET, World Bank, Washington, DC.

World Bank. 2019b. Two to Tango: An Evaluation of the World Bank Group Support to Fostering Regional Integration. Independent Evaluation Group. Washington, DC: World Bank.

World Bank. 2020. Results and Performance of the World Bank Group 2020. Independent Evaluation Group. Washington, DC: World Bank.

World Bank. 2022a. "Evolving the World Bank Group's Mission, Operations, and Resources: A Roadmap." Report 179285, World Bank, Washington, DC.

World Bank. 2022b. The World Bank's Early Support to Addressing COVID-19: Health and Social Response. An Early-Stage Evaluation. Independent Evaluation Group. Washington, DC: World Bank.

World Bank. 2024. "Global Challenge Programs (GCP): Energy." World Bank, Washington, DC.

APPENDIXES
Appendix A. Evaluation Methods

Evaluation Purpose and Questions

The overarching goal of this evaluation is to assess whether the use and effectiveness of the multiphase programmatic approach (MPA) have met the expectations of the 2017 paper from the Board of Executive Directors (World Bank 2017). Given that the MPA is a new approach and therefore has a young portfolio—only seven MPAs are post–phase 1, and more than half the portfolio has been active since fiscal year 2023—the evaluation questions focus on elements of MPA design, early results, and observable implications within earlier phases of the MPA. As such, the evaluation's scope encompasses the following three areas of inquiry: (i) design and compliance with expectations, (ii) design features associated with achieving objectives, and (iii) enabling conditions on the World Bank side and the client side.

Table A.1, which presents the evaluation's design matrix, gives an overview of the evaluation questions and subquestions, the data sources and methods used to answer them, and the limitations and mitigations associated with each analysis. The following subsections give an overview of the analytical strategy we employed to assess each evaluation question, as well as how we triangulated the evidence.

Table A.1. Evaluation Design Matrix

EQ1. Design and compliance with expectations

Evaluation question: To what extent have the objectives of MPA operations been oriented toward high-level impacts, including climate-related objectives and private capital mobilization?
» Source or type of information: PDO indicators align with or clearly build toward PrDOs; PrDOs/PDOs/intermediate indicators to code outcome level; inclusion of climate-related objectives; and private capital mobilization.
» Methods: Portfolio analysis (document review); use of the methodology developed for gauging outcome levels in RAP 2020 (World Bank 2020); structured and semistructured interviews to validate any unusual findings.
» Limitations: We assess compliance with expectations set out in the 2017 Board paper (World Bank 2017) at the design stage only, which limits some of the conclusions we can draw with respect to elements of compliance; they may be relevant later in the MPA process.

Evaluation questions: To what extent have MPAs been designed to support institutional development and learning? To what extent do MPAs conform to either the horizontal or vertical models outlined in the 2017 Board paper?
» Source or type of information: PADs, ISRs, and data from structured interviews to interrogate the alignment of implementation, M&E, and institutional arrangements with MPA objectives; results frameworks and implementation arrangements for MPAs and comparators.
» Methods: Document review; IEG evaluation of learning from lending; portfolio analysis; budget data.

EQ2. Design features associated with achieving objectives

Evaluation question: To what extent have the design features improved the coherence of interventions?
» Source or type of information: PADs, CPFs, and PLRs should indicate a logic framework that articulates the links among country analytics, strategy, and MPAs; greater evidence of collaboration; management of risks to long-term program objectives.
» Methods: Desk review of strategic documents and selected advisory services and analytics; structured and semistructured interviews.
» Limitations: General: Most MPAs are at an early stage of implementation, with only seven having graduated to the second or third phase; thus, we cannot assess longer-term results or effectiveness, which limits some of the conclusions we can draw. To mitigate this issue and be able to draw meaningful conclusions with respect to results, we focus on observable implications within earlier phases of the MPAs. Interviews: We mitigated potential interview biases (for example, selection bias, social desirability bias, and confirmation bias) via proper selection of interviewees, projects, and interview questions. With respect to project selection, there are many variables along which MPA projects differ, some of which are confounders for effectiveness outcomes. We controlled for these as much as possible, but this was limited by the small number of MPA projects; given that small number, we cannot stratify the data based on too many variables. To mitigate this, we created a typology based on critical dimensions of variation, then selected within each cell. The short project timeline also limited the overall number of interviews we could perform. We used a points-of-saturation strategy to make sure all needed information could be obtained with a smaller number of interviews.

Evaluation question: To what extent have the design features supported programmatic continuity?
» Source or type of information: PDO and PrDO indicators, institutional arrangements for M&E, and interview data should indicate consistent, measurable progress toward long-term development objective(s) and long-term assurance of support.
» Methods: Desk review of project documents; portfolio analysis; structured and semistructured interviews.

Evaluation question: To what extent have the design features facilitated and supported monitoring of learning within or across phases?
» Source or type of information: Learning plans articulated in PADs; ISR reporting of implementation of learning plans; interview data; information on institutional arrangements for generating, managing, and communicating knowledge.
» Methods: Review of learning plans, project documents, ISRs, and aide-mémoire; structured and semistructured interviews; IEG evaluation of learning from lending.

Evaluation question: To what extent have the design features supported adaptation to changing circumstances and priorities?
» Source or type of information: Restructuring papers; interview data; other project documents.
» Methods: Document review and descriptive analysis of various metrics around restructuring extracted/coded from these documents; structured and semistructured interviews.

EQ3. Enabling conditions

Evaluation question: To what extent have client-side conditions enabled or prevented the MPA from working as intended?
» Source or type of information: Data on country and sector context; implementation-related data in ISRs; interview data.
» Methods: Document review; structured and semistructured interviews with task team leaders, senior management, and clients.
» Limitations: There may be too little variation in the country context variables to draw meaningful conclusions. The evaluation relies on more granular information obtained during interviews to mitigate some of these concerns and is careful not to extrapolate in its conclusions.

Evaluation question: To what extent have conditions within the World Bank Group enabled or prevented the MPA from working as intended?
» Source or type of information: Interview data.
» Methods: Structured and semistructured interviews with task team leaders and senior management.
» Limitations: Social desirability bias is a potential concern because we are trying to obtain information on internal conditions. To mitigate this concern, we formulated neutral interview questions that highlight "process" as opposed to "opinions."

Source: Independent Evaluation Group staff analysis.
Note: CPF = Country Partnership Framework; EQ = evaluation question; IEG = Independent Evaluation Group; ISR = Implementation Status and Results Report; M&E = monitoring and evaluation; MPA = multiphase programmatic approach; PAD = Project Appraisal Document; PDO = project development objective; PLR = Performance and Learning Review; PrDO = program development objective; RAP 2020 = Results and Performance of the World Bank Group 2020.

Evaluation Question 1: To What Extent Has the Design of MPAs Followed Board Expectations and Management Guidance?

To answer evaluation question 1, we focused on assessing three areas expected to be distinctive in the MPA: (i) the level at which program and project objectives were set, (ii) design elements intended to support institutional development, and (iii) design elements intended to support learning.

First, we used in-depth content analysis of program documents to assess the presence or absence of expected MPA characteristics across the 40 MPA projects in the sample. The team extracted and manually coded a series of outcomes, such as the outcome level of project development objective and program development objective indicators, support for institutional development (intermediate indicators; components), and the adequacy of learning agendas (see appendix B for a complete list). We then assessed whether the intent at design followed a business-as-usual model by comparing the MPA projects with a non-MPA comparator group, matched on intervention similarity and country context (see appendix B for details).

Second, we used interview evidence to validate the extent to which perceptions around MPA design aligned with findings from the data analysis. To understand the intent at design, the team systematically identified the reasons task team leaders (TTLs) and senior management highlighted for using the MPA or choosing to conduct a particular project under this approach (as opposed to alternative approaches such as a regional engagement in the case of a horizontal MPA or a series of projects in the case of a vertical MPA). We also assessed the salience of outcome levels in interview narratives. Moreover, we leveraged client interviews to understand clients' perspectives on the decision to undertake a project under the MPA, including the expected benefits and salient reasons for use of the approach.

Last, within the scope of evaluation question 1, we asked whether new MPA models that depart from the original horizontal or vertical models have emerged. To do so, the team assessed the level of complexity of each MPA project and the alignment of the projects with the original horizontal and vertical models.
Information on development objectives, results frameworks, and the identity of TTLs for nonemergency MPAs, the COVID-19 Strategic Preparedness and Response Plan, and regional investment project financing comparators was gathered from Project Appraisal Documents. Data on supervision budget allocations came from the Projects and Operations portal.

Evaluation Question 2: To What Extent Have the Design Features Embedded in the MPA Worked as Expected to Achieve Design Objectives?

To answer evaluation question 2, we focused on assessing hypothesized MPA mechanisms associated with development effectiveness by interrogating four key objectives of the MPA: (i) coherence, (ii) continuity, (iii) learning, and (iv) adaptation. To do so, we followed a similar analytical strategy to that used for evaluation question 1. First, we theoretically derived a series of observable implications associated with each of the key objectives and mechanisms of the MPA (see table 2.1), then assessed their presence or absence within the group of MPAs, using in-depth content analysis of program documents. For this analysis, the team coded several outcomes: level of cofinancing, level of integration with Country Partnership Frameworks, frequency of interruption in support, incorporation of learning into follow-up phases, restructuring or cancellation in response to external shocks, reasons for restructuring, and so on (see appendix B for a complete list). Then, the team tested for any observed differences between MPAs and the matched non-MPAs, comparing a range of selected outcomes that could be meaningfully quantified across both the MPA and comparator groups. For example, we compared adaptation between the MPA and non-MPA groups by proxying adaptation with various measures of restructuring (for example, timing, incidence, and rationale).

Second, we used interview evidence to triangulate findings around MPA key objectives, either to validate results emerging from the desk review or to dig deeper into mechanisms or uncover pathways of change where other sources of data were scarce. For this part of the analysis, we focused primarily on interviews with TTLs, under the assumption that they would be best situated to answer questions regarding implementation, challenges in the field, and early results. To enable comparability across responses, we employed a structured interview template, asking a series of identical questions relating to coherence, continuity, learning, and adaptation to all respondents in a particular category (TTLs of vertical MPAs and TTLs of horizontal MPAs), then systematically coded the responses and descriptively analyzed the resulting data (see appendix C for a detailed description). We also leveraged client interviews to understand clients' perceptions about these key objectives.

Evaluation Question 3: Under What Circumstances or Enabling Conditions Has the MPA Worked as Intended?

To answer evaluation question 3, we focused on assessing enabling or hindering factors related to client-side conditions and conditions within the World Bank. The evidence base for this question relies on interview data and therefore reflects the perceptions of the respondents. First, we systematically extracted factors relating to enabling conditions and narratives relating to risk mitigation from TTL interviews.
The team then triangulated this bottom-up perspective with the view of senior management (country directors, practice Early-Stage Evaluation of the Multiphase Programmatic Approach  Appendix A managers, regional directors, and regional vice presidents) to understand how broader decisions (including at portfolio level) were informed. Last, the team supplemented this analysis with client interviews to understand the value added by MPAs from the client perspective and to update our understanding on existing findings pertaining to client characteristics. To mitigate some of the limitations and potential biases associated with the nature of the data, we extracted this information across all categories of respondents (TTLs of horizontal and vertical MPAs, senior management, and clients) from structured interview questions about processes, context, and incentives and asked respondents for examples of the items they were highlighting. We were therefore able to get a sense of the contextual factors that were perceived to be most salient in each respondent group and to compare perceptions across groups of respondents. Triangulating the Evidence The evaluation uses data analysis and key informant interviews to inform responses to all evaluation questions. Table A.2 shows the strength of the evi- dence across all three evaluation questions and the evaluation’s primary topics. 54  valuation Questions Addressed through Triangulation Table A.2. E Evaluation PRA Type Interview Semistructured Interview Items Question Topic Evidence PRA Data Point of Analysis Evidence Question Extracted EQ1 Reasons to Mixed Extracted from Cross-MPA Yes Why did you think that the Typology use MPA PADs: Rationale for comparison horizontal MPA would be using MPA more appropriate in these cases than a regular regional IPF? (Or for vertical MPAs, more appropriate than an SOP/stand-alone project?) EQ1a Outcome Yes Outcome Comparison Yes What are the specific Typology levels levels coded for all between characteristics of the MPA that PrDOs/PDOs/ MPA and the client was interested in? indicators comparators EQ1a Private Yes Mention of private Comparison No n.a. n.a. capital investors in project between mobilization design; tracking MPA and private investment comparators in results framework EQ1b Learning Yes Learning agenda Comparison Yes Is the learning agenda Binary and lessons from between different from non-MPA classification, previous phases, MPA and projects? How? types of coded based on comparators learning OPCS guidance generated, note and narrative (continued) Independent Evaluation Group World Bank Group    55 56 Early-Stage Evaluation of the Multiphase Programmatic Approach  Appendix A Evaluation PRA Type Interview Semistructured Interview Items Question Topic Evidence PRA Data Point of Analysis Evidence Question Extracted EQ1b Institutional Yes Project manage- Comparison Yes How did MPAs build capacity Factors capacity ment component/ between and create incentives to invest building other compo- MPA and in capacity building? nents/PrDO and comparator PDO indicators/ intermediate indicators EQ1/EQ2 Climate Yes PrDOs/PDOs/ Comparison No n.a. n.a. mitigation indicators coded between for whether they MPA and address climate comparators mitigation EQ2a Coherence Mixed Cofinancing and Comparison Mixed To what extent did you Typology and relationship with between use internal and external factors CPF for MPAs MPA and collaboration to support comparators long-term objectives? EQ2a Consensus No n.a. n.a. 
» EQ2a, Consensus building. PRA evidence: no. Interview evidence: mixed. Semistructured interview question: To what extent did the MPA support consensus building compared with non-MPAs? Items extracted: binary coding and factors.

» EQ2b, Continuity. PRA evidence: yes. PRA data point: MPA phase overlap. Type of analysis: cross-MPA comparison. Interview evidence: yes. Semistructured interview question: To what extent did the MPAs support continuity compared with non-MPAs? Items extracted: binary coding and factors.

» EQ2d, Adaptation. PRA evidence: yes. PRA data point: restructurings (content and reason). Type of analysis: comparison between MPA and comparators. Interview evidence: yes. Semistructured interview questions: What are some examples of smaller course corrections? What specific mechanisms or processes are in place for adaptive management? Items extracted: binary coding and factors.

» EQ3, Enabling environment. PRA evidence: no. Interview evidence: yes. Semistructured interview question: What role do country context, managerial incentives, awareness and capacity, and budget play? Items extracted: typology, valence, and narrative.

Source: Independent Evaluation Group staff analysis.
Note: "Yes" indicates that evidence is strong and can be found both in findings from the data analysis and in the semistructured interviews. "Mixed" indicates that there is only some evidence to back the finding and that it is not entirely conclusive. "No" indicates that there is no evidence for that topic. CPF = Country Partnership Framework; EQ = evaluation question; IPF = investment project financing; MPA = multiphase programmatic approach; n.a. = not applicable; OPCS = Operations Policy and Country Services; PAD = Project Appraisal Document; PDO = project development objective; PrDO = program development objective; PRA = portfolio review and analysis; SOP = series of projects.

Limitations

The main limitation of this study, as identified in the Approach Paper and reflected in the design matrix, is the young age of the MPA portfolio (World Bank 2024). We mitigated this issue by deriving and focusing on the assessment of shorter-term observable implications. However, the nature of the portfolio also limits the conclusions that can be drawn. First, the evaluation assesses a relatively small portfolio of 40 MPAs, which restricts the type of analysis we can conduct: the team used nonparametric approaches and descriptive analysis, so the conclusions drawn are noncausal. Second, although we work with the full population of MPAs, which is therefore representative of the full portfolio, we cannot draw conclusions about the approach in sectors, regions, or countries that are not currently covered by the MPA portfolio.

Interview data represent a large portion of this evaluation's evidence base. We ensured the robustness of our approach through adequate planning, careful selection of informants, neutral and structured interview templates, systematic collection and analysis of interview data, and within-method triangulation (see appendix C). The design matrix details the various types of bias actively mitigated within the evaluation. Two limitations to the qualitative analysis remain. First, the timeline of this evaluation restricted the number and types of client interviews that could be conducted. The team was not able to perform field validation as part of this assessment, nor could it engage with the range of different stakeholders that may be relevant within the same country. This has implications for our ability to draw conclusions about clients' perspectives.
Second, in trying to understand perceptions of the value added by the MPA from the TTL, senior management, or client perspectives with respect to EQ1 and EQ2, we asked respondents to compare their MPA experience with their experience with relevant alternative approaches (for example, regional investment project financing in the case of a horizontal MPA). However, except for respondents who offered examples specifically contrasting their MPA and non-MPA experience, this approach did not allow us to assess the extent to which TTLs had actually weighed the alternatives. With respect to EQ3, which relates to factors and conditions within the World Bank and country contexts that enabled or hindered the MPA from working as intended, we did not specifically aim to provide comparative evidence on factors that differentially affect MPAs versus non-MPAs.

References

World Bank. 2017. "Multiphase Programmatic Approach." Operations Policy and Country Services, World Bank, Washington, DC.

World Bank. 2020. Results and Performance of the World Bank Group 2020. Independent Evaluation Group. Washington, DC: World Bank.

World Bank. 2024. Early-Stage Evaluation of the Multiphase Programmatic Approach. Approach Paper. Independent Evaluation Group. Washington, DC: World Bank.

Appendix B. Portfolio Review and Data Analysis

Portfolio

Portfolio data were provided by the Operations Policy and Country Services unit. The main portfolio consists of 40 nonemergency multiphase programmatic approach (MPA) projects approved between 2017 and December 31, 2023 (table B.1). The pipeline portfolio was restricted to operations with a delivery probability rating of A or B, resulting in a portfolio of 31 MPAs.1

Table B.1. Multiphase Programmatic Approach Portfolio of Projects
1. P160848: Improving Nutrition Outcomes using the Multiphase Programmatic Approach
2. P161329: West Africa Unique Identification for Regional Integration and Inclusion (WURI) Program
3. P162043: Western Balkans Trade and Transport Facilitation
4. P162607: Renewable Energy Guarantees Program
5. P160005: Climate Resilience Multi-Phase Programmatic Approach
6. P166071: Indonesia Geothermal Resource Risk Mitigation Project (GREM)
7. P167156: Nigeria Improved Child Survival Program for Human Capital MPA
8. P170928: Advancing Sustainability in Performance, Infrastructure, and Reliability of the Energy Sector in the West Bank and Gaza
9. P169880: Western Economic Corridor and Regional Enhancement Program
10. P168862: Sava and Drina Rivers Corridors Integrated Development Program
11. P164184: Guinea Commercial Agriculture Development Project
12. P173416: Liberia Electricity Sector Strengthening and Access Project (LESSAP)
13. P170868: Serbia Railway Sector Modernization
14. P174002: Sustainable Rural Economy Program
15. P171767: Niger, Improving Women's and Girls' Access to Improved Health and Nutrition Services in the Priority Areas Project—LAFIA-IYALI
16. P172769: West Africa Food System Resilience Program (FSRP)
17. P174034: Niger Accelerating Electricity Access Project (Haské)
18. P177299: Supporting an Education Reform Agenda for Improving Teaching, Assessment and Career Pathways
19. P174867: Horn of Africa—Groundwater for Resilience Project
20. P176683: CAR—Electricity Sector Strengthening and Access Project
21. P178566: Food Systems Resilience Program for Eastern and Southern Africa (Phase 3) FSRP
22. P176549: Accelerating Transport and Trade Connectivity in Eastern South Asia—Bangladesh Phase 1 Project
23. P174639: Mozambique Safer Roads for Socio-Economic Integration Program
24. P174595: Building Resilient Bridges
25. P174593: Assam Integrated River Basin Management Program
26. P177823: Dominican Republic Water Sector Modernization Program
27. P170941: Kenya Digital Economy Acceleration Project
28. P178389: Water Supply and Sanitation Access Program (PASEA) Project
29. P176780: Khyber Pakhtunkhwa Rural Investment and Institutional Support Project
30. P176698: Kenya Green and Resilient Expansion of Energy Program
31. P178694: Fiji Tourism Development Program in Vanua Levu
32. P178534: Climate Resilient Infrastructure for Urban Flood Risk Management Project
33. P178286: Kyrgyz Renewable Energy Development Project
34. P179550: Côte d'Ivoire Health, Nutrition, and Early Childhood Development Program
35. P179293: East Africa Girls' Empowerment and Resilience
36. P180127: Health Emergency Preparedness, Response and Resilience Program Using the Multiphase Programmatic Approach
37. P180547: Accelerating Sustainable and Clean Energy Access Transformation Program Using the Multi-phase Programmatic Approach
38. P179154: Tertiary Education, Science, and Technology Project (TEST)
39. P180512: Distribution Efficiency Improvement and Utility Strengthening Project
40. P179078: Health Security Program in Western and Central Africa

Source: Operations Policy and Country Services data.
Note: MPA = multiphase programmatic approach.

Comparators

To identify the comparators, two groups of projects were selected to maximize intervention similarity while minimizing the influence of key confounders.
They comprised (i) the most similar operations in the same country, for vertical MPAs, or Region, for horizontal MPAs, and (ii) the most similar operations in a similar context, as measured by the public administration Country Policy and Institutional Assessments (CPIAs) for fiscal years 2018–22 (as a proxy for institutional capacity). Although we considered using other sources of data as proxies for institutional capacity, we selected the CPIA because of its better coverage of the countries in the sample assessed in this evaluation (that is, the MPA portfolio and the set of projects from which we drew the comparators). Data for CPIA public sector management and institutions ratings and subratings for International Development Association and International Bank for Reconstruction and Development countries were obtained from Operations Policy and Country Services.

Comparators were drawn from a set of 1,308 projects approved between fiscal years 2018 and 2024 (excluding additional financing), of which 1,187 were investment project financing and 127 were Program-for-Results projects. Not all MPAs had an adequate match (an inclusion cutoff of 0.3 or less was used for the distance score, where 0 indicates the most similar interventions). There were 14 operations in the first group for which no match existed and 6 in the second group for which either no match existed or the best match was in the same Region or country (and therefore the same as the first group match). Because the CPIA is an imperfect proxy for institutional similarity, in 11 cases we substituted an alternative for the second group comparators (usually to ensure a match on International Development Association or International Bank for Reconstruction and Development eligibility). Nine of the comparators were series of projects (six in the first group and three in the second).
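The appendix does not specify how the distance score was constructed, so the following sketch should be read only as a loose illustration of nearest-neighbor matching with an inclusion cutoff of 0.3: the distance metric (a scaled CPIA gap plus a sector-mismatch penalty), the field names, and the project records are all assumptions made for the example.

```python
# Illustrative sketch of nearest-neighbor comparator selection. The distance
# metric here (scaled CPIA gap plus a sector-mismatch penalty) is an assumption
# for illustration; the evaluation's actual distance score is not specified in
# this appendix. Project records are hypothetical.

CPIA_MIN, CPIA_MAX = 1.0, 6.0  # CPIA ratings range from 1 to 6
CUTOFF = 0.3                   # inclusion cutoff: 0 indicates the most similar interventions

def distance(mpa, candidate):
    """Smaller is more similar; 0 means identical on the dimensions used here."""
    cpia_gap = abs(mpa["cpia_psm"] - candidate["cpia_psm"]) / (CPIA_MAX - CPIA_MIN)
    sector_penalty = 0.0 if mpa["sector"] == candidate["sector"] else 0.5
    return cpia_gap + sector_penalty

def best_match(mpa, candidates):
    """Return the nearest candidate within the cutoff, or None if no adequate match exists."""
    scored = sorted(candidates, key=lambda c: distance(mpa, c))
    top = scored[0] if scored else None
    if top is None or distance(mpa, top) > CUTOFF:
        return None
    return top

# Hypothetical records.
mpa = {"id": "P_MPA", "cpia_psm": 3.2, "sector": "Energy"}
pool = [
    {"id": "P_A", "cpia_psm": 3.4, "sector": "Energy"},
    {"id": "P_B", "cpia_psm": 2.1, "sector": "Energy"},
    {"id": "P_C", "cpia_psm": 3.2, "sector": "Transport"},
]
match = best_match(mpa, pool)
print(match["id"] if match else "no adequate match")  # P_A
```

In the evaluation itself, MPAs whose best candidates fell outside the cutoff were treated as having no adequate match, and alternatives were substituted in some cases, as described above.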
Variables and Analysis

Outcome levels. The team extracted and coded the program development objectives and project development objectives for all 40 MPA projects in the portfolio and for the 60 comparator operations as follows, based on the methodology from World Bank (2020):

» Level 1. Output (the product or service provided is within the control of the client). For example, signing agreements for cross-border information sharing on health security; developing a foundational identification system enabling a legal and institutional framework; creating an enabling environment for increasing access to sustainable and clean energy.

» Level 2. Immediate outcome (development of the capability of a group or organization or initial benefit to people). For example, improving access to quality education and health services in targeted rural areas; increasing the number of people and assets protected against flood risk in priority river basins; increasing the supply of and access to clean energy services.

» Level 3. Intermediate outcome (stakeholders apply a new capability to solve an issue, which causes a change in the lives of the ultimate beneficiaries). For example, improving key nutrition behaviors known to reduce stunted growth in children; improving education outcomes for primary and secondary students; reducing outages or voltage fluctuations; reducing transport costs along a project corridor.

» Level 4. Long-term outcome (a sustained change in delivery or governance or a sustained benefit to a beneficiary). For example, reducing the prevalence of stunted growth in children under two years of age in targeted regions; reducing the under-five mortality rate in program areas; reducing greenhouse gas emissions relative to a baseline; improving the incomes and resilience of beneficiaries in selected rural areas.

Learning agenda. Operations Policy and Country Services guidance is that each phase of an MPA should contain a learning agenda describing what knowledge is expected to be acquired, how it will be acquired, the cost of acquiring it, and how it will be used to improve the effectiveness of the program (World Bank 2021). All first-phase MPA project documents contained some form of learning agenda. The team coded them as either weak or adequate according to their degree of detail and specificity. For example:

» Weak. Three principles will underlie the proposed program throughout the phases. First, instead of a "one-size-fits-all" approach, the program design will build in flexibility to ensure a responsive system based on different epidemiological profiles and priorities. Second, from the outset, the program will build sustainable vertical (across different service levels) and horizontal (across different regions) mechanisms for transferable knowledge and capacity building. Third, adaptive learning (learning from operational rollout and experimentation) will continue to be at the core of the program. Various aspects of the program will benefit from this type of approach, including the implementation arrangements, which will undergo continual assessment to mitigate any risks and ensure effective arrangements.

» Adequate. The MPA will produce a great diversity of lessons and learning approaches for a more robust design of the legal and technical components of the project in the following dimensions:

» Learning at the institutional and implementation level. The project will set up mechanisms for exchanging views on implementation among the participating beneficiaries through ad hoc workshops, where participating agencies and ministries can present and discuss specific project-related activities, and a project-specific learning review, where lessons learned from one beneficiary in implementing a specific activity can be transferred to another beneficiary.

» Learning at the customs level. As a first step toward the development and implementation of a national single-window system and to facilitate an informed decision-making process, the World Bank organizes visioning workshops for trade facilitation stakeholders, including representatives of all regulatory agencies with jurisdiction over import, export, and transit operations. Major private sector stakeholders are also expected to participate in this workshop, ideally at the deputy head or department head level.

» Learning at the transport data level. Most of the data required can be obtained from computerized sources (for port, rail, and customs, for instance). However, all data relating to cargo moved by road require field surveys. A customized methodology will be used for survey data collection, under a trust fund for the regional resilience of the infrastructure and the trade strategies in the Western Balkans, managed by the same World Bank team, as well as additional intermodal data from another activity, funded under another trust fund for intermodal connectivity in the Western Balkans.
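The sketch below illustrates how coded items such as the outcome levels and learning agenda ratings described above can be tallied descriptively by group; the records, counts, and field names are hypothetical, and this is not the evaluation team's code.

```python
# Illustrative sketch only: hypothetical coded records showing how outcome
# levels (1-4) and learning agenda ratings ("weak"/"adequate") might be
# tallied by group for descriptive comparison. Not the evaluation's code.
from collections import Counter

coded_projects = [
    {"group": "MPA",        "outcome_level": 3, "learning_agenda": "adequate"},
    {"group": "MPA",        "outcome_level": 4, "learning_agenda": "weak"},
    {"group": "comparator", "outcome_level": 2, "learning_agenda": None},  # hypothetical record
    {"group": "comparator", "outcome_level": 3, "learning_agenda": None},  # hypothetical record
]

# Cross-tabulate outcome level by group.
level_by_group = Counter((p["group"], p["outcome_level"]) for p in coded_projects)
for (group, level), count in sorted(level_by_group.items()):
    print(f"{group}: level {level} -> {count}")

# Share of MPA learning agendas coded "adequate".
mpa_agendas = [p["learning_agenda"] for p in coded_projects if p["group"] == "MPA"]
adequate_share = mpa_agendas.count("adequate") / len(mpa_agendas)
print(f"Share of MPA learning agendas coded adequate: {adequate_share:.0%}")
```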
Climate-related indicators. The team coded the program and project development objective indicators for all MPA portfolio and comparator operations according to whether they supported climate change mitigation or adaptation. Examples include the following:

» Mitigation. Greenhouse gas emissions reduced; capacity of renewables generation increased; annual energy generated from solar; policy and regulatory framework for renewable energy strengthened.

» Adaptation. Climate-resilient road access improved; climate-smart agricultural technologies adopted by producers; land area protected by flood risk mitigation measures increased; regional information systems in use for decision-making related to droughts, flooding, or cyclones improved.

Long-term institutional development. The team coded the program and project development objectives and activities of MPAs and comparators according to whether they supported or measured improvements in the functioning of organizational structures, management systems, and monitoring and evaluation systems. Policy or legal changes without evidence of implementation were excluded. Examples of such indicators and the activities that contribute to their achievement include the following:

» Organizational. Agriculture Sector Coordination Committee operational; Regional Platform for Groundwater Collaboration functioning among participating countries.

» Systems. Remote sensing data used for medium-term budget planning for bridge management and maintenance; percentage of efficiency activities fully implemented as planned; corporate turnaround strategy implemented.

» Monitoring and evaluation. Number of lessons learned reports from the design and pilot of a liquidity support account and payment system; percentage of programs evaluated.

Cofinancing. This indicator captures whether the operation uses a trust fund (yes or no), the number of grants, the total amount of all grants (in US dollars), and the type of trust fund execution (World Bank, recipient, or both). Data to support this indicator were extracted for all MPAs from the World Bank's Projects and Operations portal.

Restructuring. The team used restructuring papers from MPAs and comparators to analyze (i) the frequency and timing of restructuring (months since World Bank approval); (ii) the reasons for restructuring (for example, a change in government priorities, an external shock such as COVID-19 or the Russian invasion of Ukraine, or other necessary corrections to project design); and (iii) the content of the restructuring (for example, extension of the closing date, change in the results framework, change in implementation arrangements, or reallocation of funds across components).

Supervision budget. The annual work program supervision budget allocation per country for horizontal MPAs was extracted from the World Bank's Projects and Operations portal.

Phase overlap. This overlap was calculated as the number of months between the planned end of the first phase and the start of the second phase for MPAs and series of projects.
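Both the restructuring timing measure (months since World Bank approval) and the phase overlap measure reduce to simple month arithmetic between two dates, as in the sketch below. The dates and identifiers shown are hypothetical, and the sign convention for overlap is an assumption made for illustration.

```python
# Illustrative month arithmetic for two of the measures described above:
# (i) restructuring timing = months from World Bank approval to restructuring, and
# (ii) phase overlap = months between the planned end of phase 1 and the start
# of phase 2. Dates are hypothetical.
from datetime import date

def months_between(start: date, end: date) -> int:
    """Whole calendar months from start to end (negative if end precedes start)."""
    return (end.year - start.year) * 12 + (end.month - start.month)

# Restructuring timing for a hypothetical operation.
approval = date(2019, 3, 15)
restructuring = date(2021, 6, 1)
print("Months from approval to restructuring:", months_between(approval, restructuring))  # 27

# Phase overlap for a hypothetical vertical MPA. Under this sign convention
# (an assumption), a positive value indicates overlap and a negative value a gap.
phase1_planned_end = date(2024, 12, 31)
phase2_start = date(2024, 6, 30)
print("Months between phase 2 start and planned phase 1 end:",
      months_between(phase2_start, phase1_planned_end))  # 6
```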
References

World Bank. 2020. Results and Performance of the World Bank Group 2020. Independent Evaluation Group. Washington, DC: World Bank.

World Bank. 2021. "Bank Guidance: Multiphase Programmatic Approach." World Bank, Washington, DC.

1. The project delivery ratings within the Activity Initiation Summary serve as indicators of a project's readiness and likelihood of being delivered within the fiscal year. An "A" rating is given when there is confidence that the project will be ready by the expected approval date, and for certain types of projects, specific reviews should be completed before this rating is assigned. A "B" rating indicates that the project is likely to be ready within the fiscal year but that there is uncertainty about meeting the expected approval date; a concept review should be completed before assigning this rating.

Appendix C. Key Informant Interviews

Sample Frame

The evaluation team conducted structured and semistructured interviews with key informants across three categories of respondents: task team leaders (TTLs) for vertical and horizontal multiphase programmatic approaches (MPAs), senior management (country directors, practice managers, regional directors, and regional vice presidents), and stakeholders in client countries (for example, in project implementation units, ministries of finance, or line ministries in the sectors covered by the MPA). To select respondents within each category of key informants, we used a combination of purposive sampling and stratification, with slight variation in the selection strategy by key informant group, as described in the relevant sections of this appendix. Except for the senior management selection, we conducted the sampling at the project level, first selecting the MPAs and then reaching out to the affiliated respondents.

Sampling and Selection Strategy

Interviews with Task Team Leaders

Of the 40 nonemergency MPAs, the evaluation team selected respondents for all horizontal MPAs (n = 11, of which 4 have progressed past phase 1) and for a sample of the vertical MPAs. To select vertical MPAs for interviews, we used a purposive selection strategy, as follows.

First, we created a typology of projects based on two dimensions of variation: (i) level of institutional maturity and (ii) nature of intervention. Although other dimensions of variation may confound the relationship between MPAs and their performance, given the small number of active vertical MPAs, we decided to use a model with fewer dimensions to allow more cases in each of the resulting strata.

Second, we classified all MPAs based on a qualitative assessment of their Project Appraisal Documents, as the scope and timeline of the evaluation (and the limited number of MPA projects) rendered a more complex, machine learning–based classification exercise nonoptimal. We used a country's International Development Association versus International Bank for Reconstruction and Development status as a proxy for the level of institutional maturity, grouping fragile and conflict-affected countries with those that are International Development Association eligible. The team coded the nature of the intervention into three categories: (i) infrastructure centric, (ii) systemic leaning, and (iii) human capital or service delivery focused. These categories are not fully independent, and each MPA contains elements of all three; for example, all projects include components related to institutional capacity building or the construction of some form of infrastructure. To classify MPAs, coders qualitatively assessed which category was dominant in each MPA.
Third, we purposively selected three projects from each of the resulting six strata (that is, each institutional maturity × nature of intervention cell) to maximize the chances of gathering evidence relating to the evaluation questions (a stylized sketch of this selection logic appears at the end of this subsection). For example, we included most projects that have advanced past phase 1 to ensure that we could gather information on key MPA design objectives and mechanisms such as continuity or learning. We also balanced the sample with respect to other variables, such as Region, sector, and time since the projects became active.

The evaluation team selected 19 vertical MPAs for interviews. Of these, 5 had a human capital focus, 7 were infrastructure centric, and 7 were systemic leaning. Only 7 were in International Bank for Reconstruction and Development countries, reflecting the fact that most human capital–focused MPAs are in International Development Association countries. We reached out to the TTLs of all 19 selected MPAs and received an 89 percent response rate, yielding 17 interviews for vertical MPAs. In the case of horizontal MPAs, the team received a 100 percent response rate. However, the analysis includes only 10 horizontal MPAs, as we relied on 1 horizontal MPA to scope and refine the interview questions.
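The sketch below illustrates the stratified purposive selection logic described above: vertical MPAs are grouped into the six strata, and up to three are drawn from each, prioritizing projects that have advanced past phase 1. The records and the mechanical tie-breaking rule are hypothetical; the team's actual selection also balanced Region, sector, and time since the projects became active, and relied on qualitative judgment rather than an algorithm.

```python
# Illustrative sketch of stratified purposive selection: group vertical MPAs by
# (institutional maturity x intervention type) and pick up to three per stratum,
# preferring projects that have advanced past phase 1. Records and tie-breaking
# rules are hypothetical; the evaluation's actual selection was judgment based.
from collections import defaultdict

vertical_mpas = [
    {"id": "P001", "maturity": "IDA/FCS", "intervention": "infrastructure", "past_phase1": True},
    {"id": "P002", "maturity": "IDA/FCS", "intervention": "infrastructure", "past_phase1": False},
    {"id": "P003", "maturity": "IBRD",    "intervention": "human capital",  "past_phase1": False},
    {"id": "P004", "maturity": "IBRD",    "intervention": "systemic",       "past_phase1": True},
]

# Group projects into strata defined by the two typology dimensions.
strata = defaultdict(list)
for p in vertical_mpas:
    strata[(p["maturity"], p["intervention"])].append(p)

# Select up to three per stratum, prioritizing projects past phase 1.
selected = []
for cell, projects in strata.items():
    ranked = sorted(projects, key=lambda p: p["past_phase1"], reverse=True)
    selected.extend(ranked[:3])

print([p["id"] for p in selected])
```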
Interviews with Senior Management

In interviewing senior management, the overarching goal was to understand the MPA-related strategy, incentives, and environment (as opposed to implementation experience, which was more relevant for the TTL interviews). We therefore selected respondents at multiple levels of senior management: practice managers, to understand the internal environment and uses of the MPA; country directors, to understand how the MPA fits with clients and coordination across Country Management Units; regional directors, to understand how the MPA fits with the Region, coordination across the Region, and engagement with regional partners; and regional vice presidents, to understand the high-level strategic view.

We selected senior management based on their experience with the MPA, operationalized as the number of MPAs in their portfolios rather than the volume of financing. The decision not to focus solely on the largest projects was motivated by the aim of assessing whether the approach itself works; we expected the largest projects to differ not only in available resources but also in other potential confounding factors (for example, type of monitoring and evaluation). The team selected both senior management with substantial experience with the MPA and senior management with limited experience with the MPA. We balanced the sample with respect to other variables as well, such as coverage of vertical and horizontal MPAs, regions, and sectors. In addition, we ensured the selection covered both older and newer MPAs to capture any systematic differences due to institutional learning curves.

The evaluation team reached out to 15 respondents for senior management interviews and received an 87 percent response rate. In addition, we conducted three scoping interviews with senior management to refine the interview template.

Interviews with Clients

In interviewing clients, the main goal was to understand the value added by the MPA from the client perspective, with a view toward corroborating findings and bridging gaps in evidence. For horizontal MPAs, the evaluation team sought to understand what it takes to align an MPA across multiple countries with respect to continuity and coordination. We therefore selected three horizontal MPAs and sought the perspective of multiple countries within each MPA, as well as that of some regional organizations. The selection criteria for the three operations were level of institutional maturity or institutional capacity, intervention type, and regional variation. In addition, because the team planned to ask questions about challenges in the field and about learning and adaptation over time, we picked MPAs with less recent approval years. Across the three selected horizontal MPAs, we reached out to more than 12 countries for interviews.

For vertical MPAs, we sought to understand the factors that explain better or worse design and early implementation. We therefore focused on a subset of MPAs: those that seemed particularly good on paper or from a World Bank perspective, those that were very much "run-of-the-mill" operations, and those that already seemed to be running into some issues. We selected seven vertical MPAs. For each MPA, we reached out to at least two stakeholders for interviews, based on recommendations from the Country Management Unit and the affiliated TTLs. The stakeholders included portfolio coordinators, project implementation unit staff, ministers in the ministry of finance, and ministers in the corresponding line ministry.

Data Collection: Interview Methodology and Format

The evaluation used a semistructured interview protocol. We used five interview templates, one for each type of respondent: (i) TTLs of vertical MPAs, (ii) TTLs of horizontal MPAs, (iii) senior management, (iv) client country stakeholders for vertical MPAs, and (v) client country stakeholders for horizontal MPAs. We used a set of structured core questions asked across all respondent categories, supplemented by additional structured questions by type of respondent and type of MPA. For example, the interview template for horizontal MPAs included questions on processes and mechanisms for coordination across countries, which were excluded from the template for vertical MPAs, where they were not relevant.

TTL interviews focused on three categories of questions and information: (i) the decision-making around using the MPA for the project, (ii) how the MPA has worked in practice and whether it has worked as intended, and (iii) country-level and World Bank–side factors that enabled or hindered the MPA (with respect to design or implementation). All TTL respondents were asked an identical set of questions. The interview templates for TTLs of horizontal and vertical MPAs asked different questions with respect to continuity, coordination, and coherence, as MPA objectives are expected to materialize differently depending on the type of MPA.

Senior management interviews focused on (i) the strategic view and enabling environment; (ii) perceived differences between projects under and not under the MPA in their portfolios (for example, horizontal MPAs versus regional projects); and (iii) how senior management socialized the MPA with relevant clients and stakeholders.

Client interviews focused on (i) incentives to sign onto the MPA and (ii) experience during implementation. For clients in horizontal MPAs, we also focused on questions relating to coordination and coherence across countries. For clients in vertical MPAs, we also focused on questions relating to learning and adaptation.
Interview Processing and Analysis

The evaluation team systematically extracted and coded several items mapped to the evaluation questions using manual processing and NVivo. We started by thematically tagging interview responses (at the sentence level) to predefined topics, developed ex ante based on the type of information each interview question targeted. We used this information to uncover narratives relating to each theme. We then extracted several other types of information, coding items based on either predefined categories or inductively, based on patterns observed in the data. Table C.1 shows an excerpt of the coding template for the TTL interviews. Coded items included variables indicating the absence or presence of a specific topic ({1;0}); variables indicating valence (positive or negative perception about a specific topic); variables that systematically classify responses into predefined categories (typology); and variables that extract factors associated with a specific topic (extract factors).

Table C.1. Interview Coding Template for Interviews with Task Team Leaders (Excerpt)

» MPA design. Variable: differences in preparing the MPA (compared with an SOP, stand-alone project, or regional program). Coding: {1;0} + narrative (degree).

» Reasons to use the MPA. Variable: the reason given, with categories regionality, flexibility, time efficiency, funding, long-term goals, engagement with government, multiple GPs, outcome levels, learning agenda, lending scale, long time horizon, and other. Coding: typology.

» Coordination with partners. Variables: engagement with development partners (coding: {1;0} + narrative); coordination with partners around design (coding: {1;0}); reasons for partners being interested in the approach (coding: extract factors); monitoring processes, that is, concrete mechanisms for monitoring put in place (coding: {1;0}).

» Learning agenda. Variables: learning plan under the MPA different from an SOP, stand-alone project, or regional program (coding: {1;0}); types of learning generated, that is, sharing of knowledge and processes (coding: {subject matter or technical; operational} + narrative with specific examples); specific activities targeted at learning (coding: {1;0} + extract factors).

Source: Independent Evaluation Group.
Note: GP = Global Practice; MPA = multiphase programmatic approach; SOP = series of projects.
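As an illustration of how the coding template in table C.1 translates into structured records, the sketch below represents one coded TTL response. The fields, values, and response text are hypothetical; the evaluation's actual coding was carried out manually and in NVivo rather than in code.

```python
# Illustrative sketch only: a structured representation of the kind of coded
# item described in table C.1 ({1;0} presence flags, valence, typology, and
# extracted factors). The response text and values are hypothetical; the
# evaluation's actual coding was done manually and in NVivo.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CodedItem:
    respondent_id: str                   # anonymized TTL identifier
    topic: str                           # predefined topic, e.g., "Learning agenda"
    variable: str                        # variable from the coding template
    present: Optional[int] = None        # {1;0} presence/absence flag
    valence: Optional[str] = None        # "positive" or "negative" perception
    typology: Optional[str] = None       # predefined category, e.g., "flexibility"
    factors: list[str] = field(default_factory=list)  # extracted factors
    narrative: str = ""                  # supporting quote or summary

coded = CodedItem(
    respondent_id="TTL-07",
    topic="Reasons to use the MPA",
    variable="Reason given",
    typology="long-term goals",
    narrative="Client wanted a 10-year engagement with predictable follow-on phases.",
)
print(coded.topic, "->", coded.typology)
```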
Appendix D. Validation of Operations Policy and Country Services Findings

The evaluation assessed the findings of the Operations Policy and Country Services unit using recent data (up to the end of fiscal year 2024) on the preparation time and the time between approval and first disbursement under the multiphase programmatic approach (MPA) compared with the entire nonemergency investment project financing portfolio. Phase 1 MPAs were compared with a more rigorously selected set of comparator operations. The evaluation found that (i) MPA phase 1 processing times are similar to those for the overall investment project financing portfolio but longer than for the selected comparators and (ii) MPA phase 2 and 3 processing times are shorter than for the overall investment project financing portfolio, but not by much for vertical MPAs (figure D.1).

Figure D.1. Multiphase Programmatic Approach Phase Processing Times, Average Months from Activity Initiation Summary Sign-Off. Panels: (a) IPF versus MPA (phase 1); (b) Coding comparator versus MPA (phase 1); (c) IPF versus horizontal and vertical MPA (phases 2 and 3).

Source: Independent Evaluation Group.
Note: IPF = investment project financing; MPA = multiphase programmatic approach.