Capacity Development BRIEFS
SHARING KNOWLEDGE AND LESSONS LEARNED
JUNE 2006 · NUMBER 17

THE CAPACITY TO EVALUATE: WHY COUNTRIES NEED IT

Linda Morra-Imas and Ray C. Rist [1]

[1] Linda Morra-Imas (lmorra@ifc.org) is a chief evaluation officer for the Independent Evaluation Group at the International Finance Corporation, and Ray C. Rist (rrist@worldbank.org) is a senior evaluation officer for the Independent Evaluation Group at the World Bank. As senior members of these two entities, Linda and Ray comanage the International Program for Development Evaluation Training (IPDET).

Evaluation skills are central to effective development work. Evaluation captures real results, leads to feedback and learning, and identifies areas where more capacity is needed. It is also an essential tool for making mid-course corrections in ongoing programs, developing appropriate indicators, tracking an individual's or organization's capacity to deliver on its mandate, and guiding the design of future programming. Donors now expect countries to be full partners in the development process, which means that they need to have the capacity to evaluate their own progress and to use the findings to continuously improve their performance. The evidence suggests that these changes can potentially have a transformative effect on governance and make poverty reduction efforts dramatically more effective. The World Bank, in partnership with Carleton University in Ottawa, is currently providing evaluation capacity development through its International Program for Development Evaluation Training (IPDET), which has already trained more than 850 practitioners from 100 countries.

It is no secret that in the past 25 years, both donors and developing countries have become increasingly disenchanted with development assistance: donors because of the ineffective use of resources and lack of results, and governments because the two basic tenets of the way aid is delivered--the project modality and policy conditionalities--have been ineffective in achieving sustainable progress, while prolonging aid dependency. The capacity of countries to manage and evaluate their own development progress is the key to breaking out of this stalemate.

From Clients to Development Partners

In assessing the reasons for their lack of effectiveness, aid agencies have carried out a number of studies in the past decade. These studies have consistently shown that a limit exists to what donors can do without country ownership of development initiatives. As a result, donors began in the mid-1990s to encourage "client" governments to develop their own assistance and poverty reduction strategies and began to tailor their investments to support country strategies, rather than pursuing their own development agendas. It soon became clear, however, that country ownership of policies and programs is not sufficient for achieving results, because developing countries generally lack the planning, budgeting, and performance assessment tools necessary to manage development. The emphasis of donor lending has, therefore, recently shifted from an ownership to an empowerment/partnership paradigm: donors help countries to develop the right tools to manage development, and partner (not "client") countries are responsible for evaluating and reporting on their progress and using the lessons learned from those evaluations and from stakeholder feedback to continuously improve their performance.

Mutual Accountability through Planning and Evaluation

The Paris Declaration on Aid Effectiveness (March 2, 2005), in which donors and countries agreed to be accountable to each other for development progress, articulated this new development paradigm through the use of mutually agreed standards, processes, and monitoring and evaluation frameworks. More specifically, donors affirmed their commitment to:

· Strengthen partner countries' national development strategies and associated planning, budget, and performance assessment frameworks
· Define measures and standards of performance and accountability of partner country systems in public financial management, procurement, fiduciary safeguards, and environmental assessments, in line with broadly accepted good practices
· Address weaknesses in partner countries' institutional capacities to develop and implement results-driven national development strategies
· Implement, where feasible, common arrangements at the country level for planning, funding, and disbursement
· Harmonize their monitoring and reporting requirements until such time as they can rely more extensively on partner countries' statistical, monitoring, and evaluation systems to track development progress.

At the same time, partner countries pledged to:

· Integrate specific capacity-strengthening objectives in national development strategies
· Establish results-oriented reporting and assessment frameworks that monitor progress against key dimensions of the national and sector development strategies
· Involve a broad range of development partners in assessing that progress.

All partner countries are expected to have mutual assessment systems in place. By 2010 the proportion of countries without transparent and monitorable performance assessment frameworks is expected to decrease by one-third.

Why the Emphasis on Evaluation?

Evaluation has implications far beyond mutual accountability. Evaluation skills are central to effective development work, not simply because they facilitate partnership, but because they lead to critical thinking and greater institutional and organizational capacity, both of which are necessary to address increasingly broad sets of issues. In addition, evaluation captures real results, rather than inputs and activities; leads to feedback and learning; increases understanding of the issues; and identifies areas where more capacity is needed. It is also an essential tool for making mid-course corrections in ongoing programs, developing appropriate indicators, tracking an individual's or organization's capacity to deliver on its mandate, and guiding the design of future programming.

On a more concrete level, a government's capacity to evaluate its own development progress is a precondition for donors' willingness to provide funding through budget support, rather than tying it to specific projects or programs. This is still a controversial proposition, but the preponderance of evidence suggests that budget support has a transformative effect on institutions and governance. It immediately empowers a government in its relations with donors, while increasing its accountability to political and civil society. Studies have found a strong interdependence between "big picture" democratic accountability and line-management reporting under budget support. Furthermore, by freeing governments of the need to satisfy different donors, budget support reduces transaction costs and increases the allocative efficiency of public spending. By aiming to use and strengthen government systems, rather than setting up parallel systems (such as project implementation units), this support increases the effectiveness of public administration. By focusing on government's own accountability channels, budget support also improves transparency and accountability to the country's parliamentary institutions and population. Improvements in all these areas, in turn, are highly likely to enhance government's capacity to reduce poverty by producing intermediate outcomes, such as high-quality basic services and effective regulation.

Emerging Good Practices: Results-Based Management and Managing for Development Results

Evaluation is integrated into development work through two related practices:

· Results-based management aims to improve the performance of organizations by providing the management framework and tools for strategic planning, risk management, performance monitoring, and outcome evaluation. Its main purposes are to improve organizational learning and fulfill accountability obligations through performance reporting.
· Managing for development results is a management strategy focused on development performance and on sustainable improvements in country outcomes. It provides a coherent framework for development effectiveness in which performance information is used to improve decisionmaking, and it includes practical tools for strategic planning, risk management, progress monitoring, and outcome evaluation.
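At the heart of both practices is the routine comparison of observed indicator values against agreed baselines and targets. As a minimal sketch of what such performance monitoring involves--the data structure, field names, indicator names, and figures below are invented for illustration and are not drawn from IPDET materials or any Bank system:

```python
# Minimal sketch of a results-based monitoring check: for each indicator,
# compare the latest observed value against its baseline and target.
# All indicator names and figures are invented for illustration.
from dataclasses import dataclass


@dataclass
class Indicator:
    name: str
    baseline: float  # value when the program started
    target: float    # value the program commits to reach
    latest: float    # most recent observed value

    def progress(self) -> float:
        """Share of the baseline-to-target distance covered so far."""
        span = self.target - self.baseline
        if span == 0:
            return 1.0  # target already met at baseline
        return (self.latest - self.baseline) / span


def status_report(indicators):
    """Flag each indicator as on track (>= 50% of the way) or lagging."""
    return {i.name: ("on track" if i.progress() >= 0.5 else "lagging")
            for i in indicators}


indicators = [
    Indicator("primary school completion rate (%)",
              baseline=60.0, target=90.0, latest=78.0),
    Indicator("households with safe water (%)",
              baseline=40.0, target=70.0, latest=46.0),
]
print(status_report(indicators))
# → {'primary school completion rate (%)': 'on track',
#    'households with safe water (%)': 'lagging'}
```

The point of the sketch is the shift it encodes: funding decisions and mid-course corrections are driven by distance to an agreed outcome target, not by inputs disbursed or activities completed.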
Training Developing Country Professionals in Effective Evaluation

One Bank response to the increasing demand for evaluation capacity training has been the International Program for Development Evaluation Training; since it began in 2001, this program has trained more than 850 practitioners from 100 countries in techniques for results-based management and for managing for development results. Created by the World Bank's Independent Evaluation Group (formerly the Operations Evaluation Department), in partnership with Carleton University in Ottawa, Canada, IPDET has also fostered the growth of professional and personal networks among participants, which has helped to create an active community of shared practice and interest in development evaluation across many countries and sectors.

The Skills Acquisition Process

Many program participants have intuitively been doing evaluations using informal tools. As participants in IPDET, they gain a richer understanding of the full range of evaluation issues and their importance for improved performance, better governance, and effective use of general budget support.

IPDET addresses four major aspects of evaluation capacity: (a) identification of issues, needed skills, and intuitive tools; (b) use of quantitative and qualitative data and participatory evaluation tools; (c) sharing of knowledge and experiences with peers; and (d) integration of that learning into an operational framework. Courses offered as part of IPDET include, for example:

· Qualitative methods and analysis (techniques such as rapid appraisals and cross-program synthesis evaluation)
· Results-based monitoring and evaluation (design of performance-based monitoring and evaluation systems, including steps such as readiness assessment, goal setting, indicator selection, and establishing baseline data)
· Quantitative data analysis (the use of descriptive statistics, such as means and correlations, to summarize information and determine when results are statistically significant)
· Small-scale surveys (the challenges of conducting surveys in the development context, including issues related to translation, gender, and the selection and training of data collectors)
· Use of citizen report cards, community evaluations, and civic engagement (design of a community evaluation of public services and discussion of the concepts, tools, and methodological issues behind using this approach in rural and urban settings).

Putting Evaluation Capacity into Practice

Alberto Narváez, an IPDET graduate, returned to Quito, Ecuador, where he works for the Fundación para Salud Ambiente y Desarrollo (a foundation for health and environmental development) and applied his newly acquired skills to finding out how decades of gold mining have affected the environment and health of area residents. Earlier efforts had produced limited success, but with the skills acquired through IPDET, Narváez is now fine-tuning his organization's approach to evaluating health and environmental impacts. "For us," he says, "the most important thing is to link strategies with outcomes."

Another IPDET graduate, Arturo Campaña, works for the NGO Centro de Estudios y Asesoria en Salud (Center for Study and Assessment of Health) in Quito, Ecuador, and is using his new skills to assess the impact of chemicals on farm workers. In that country's Cayambe region, farms produce flowers for export, which requires that workers use chemicals to prepare and preserve them. In earlier surveys, the use of qualitative data from interviews with farm workers posed analytical and interpretive issues that the center was not able to resolve. Using his IPDET training to leverage his understanding of agricultural chemicals, Campaña designed a learning agenda to resolve methodological problems and conduct evaluations according to international good practice.

Beyond Individual Skills Development: Building Stronger Organizations

Ninety percent of former participants interviewed for an IPDET impact assessment said they were able to handle evaluation challenges successfully in their organizations; however, assessing the impact of enhanced evaluation capacity on organizational and institutional effectiveness is a greater challenge. Although the IPDET evaluation training reportedly brought about changes in evaluation units, monitoring systems, and policies, more than a third of those interviewed said that organizational barriers such as lack of financial resources, absence of a learning culture, and political resistance prevented more profound changes. This result is not surprising: in a resource-constrained environment, the organizational culture and incentive structure are focused on gathering inputs and producing the bare minimum of outputs that the entity must deliver, and funding is generally tied to designing and delivering programs, rather than to evidence that targets were achieved. So long as funding depends on evidence of appropriate expenditures and outlays, rather than on results, managers are unlikely to invest in evaluation. When developing country professionals are themselves champions of evaluation, however, they can move their organizations incrementally toward results-based management. The development of evaluation capacity can thus help facilitate a shift toward a results-based learning culture in which monitoring and evaluation are an organic part of management and decisionmaking. The development of a learning agenda based on monitoring and evaluation results is critical to the capacity development of managers and to the success of development interventions. Development partners and governments are becoming increasingly aware that agencies and countries both benefit when learning is fostered and shared in a systematic and transparent manner.

Conclusion

Evaluation capacity is a critical factor in the success of development initiatives. It captures real results, promotes learning, creates opportunities for mid-course corrections, guides the design of future programming, and makes it possible for donors to shift from the project modality and policy conditionalities to budget support, which in turn makes governments more accountable to political and civil society. As evaluation capacity development becomes a richer, broader, and deeper area of practice, it has the potential to transform development work into a dynamic and effective partnership between countries and aid agencies--a partnership in which donors play a supportive role in assisting countries to realize their own vision for development.

References

Bolger, Joe. 2000. The Emerging Program Focus: Striving for Greater Development Impact. Capacity Development Occasional Series, vol. 1, no. 2. Canadian International Development Agency, Gatineau, Quebec.

Department for International Development (DFID). 2002. Capacity Development: Where Do We Stand Now? London (May).

Kaufmann, Daniel. 2005. Where Next? Building Local Capacity in Governance. Development Outreach, vol. 7, no. 4 (September).

Lawson, Andrew, David Booth, Alan Harding, David Hoole, and Felix Naschold. 2005. General Budget Support Evaluability Study, Phase I, Synthesis Report. Overseas Development Institute (ODI), London.

Organization for Economic Cooperation and Development (OECD). 2005. Paris Declaration on Aid Effectiveness: Ownership, Harmonisation, Alignment, Results, and Mutual Accountability. Paris (March).

Organization for Economic Cooperation and Development-Development Assistance Committee (OECD-DAC). 2006. Managing for Development Results: Principles in Action: Sourcebook on Emerging Good Practices. Paris.

------. 2006. MfDR Concepts, Tools, and Principles. In Managing for Development Results: Principles in Action: Sourcebook on Emerging Good Practices. Paris.

Scott, Alison. 2006. Emerging Practices of Results-Based Country Programming among Aid Agencies. In Managing for Development Results: Principles in Action: Sourcebook on Emerging Good Practices. OECD-DAC, Paris.

White, Elizabeth, and Rosalia Rodriguez-Garcia. 2006. Results-Oriented Country Programming: Applying the Principles of Managing for Results and Emerging Practices and Lessons. In Managing for Development Results: Principles in Action: Sourcebook on Emerging Good Practices. OECD-DAC, Paris.

World Bank. 2002. Evaluation Capacity Development: A Growing Priority. Précis Series, no. 229. Operations Evaluation Department, Washington, D.C.

------. 2006. Capacity for Development and Organizational Resources. Capacity Development Unit, World Bank Institute, Washington, D.C.

About World Bank Institute (WBI): Unleashing the Power of Knowledge to Enable a World Free of Poverty

WBI helps people, institutions, and countries to diagnose problems that keep communities poor, to make informed choices to solve those problems, and to share what they learn with others. Through traditional and distance learning methods, WBI and its partners in many countries deliver knowledge-based options to policymakers, technical experts, business and community leaders, and civil society stakeholders, fostering the analytical and networking skills that help them make sound decisions, design effective socioeconomic policies and programs, and unleash the productive potential of their societies.

WBI Contact: Mark Nelson, Program Manager, Capacity Development Resource Center. Tel: 202-458-8041; Email: mnelson1@worldbank.org

Visit our website for more information and to download electronic copies of all Capacity Development Briefs: http://www.worldbank.org/capacity