Document of The World Bank
FOR OFFICIAL USE ONLY
Report No.: 21874

PERFORMANCE AUDIT REPORT

PHILIPPINES
Engineering and Science Education Project (Loan 3435-PH)
Second Vocational Training Project (Credit 2392-PH)

February 26, 2001

Sector and Thematic Evaluations Group
Operations Evaluation Department

This document has a restricted distribution and may be used by recipients only in the performance of their official duties. Its contents may not otherwise be disclosed without World Bank authorization.

Currency Equivalents (At appraisal and closing)
Aug./Sept. 1991: US$1 = P27 (peso); P1 = US$0.03
July 6, 1992: SDR 6.76 = US$1.0
June 30, 1998: US$1 = P42.05; P1 = US$0.02
December 31, 1999: US$1 = P40.3; P1 = US$0.02

Abbreviations and Acronyms
ADB  Asian Development Bank
CHED  Commission on Higher Education
DOST  Department of Science and Technology
ECD  Evaluation capacity development
ESEP  Engineering and Science Education Project
ICR  Implementation Completion Report
JBIC  Japanese Bank for International Cooperation
M&E  Monitoring and evaluation
MIS  Management information system
NEDA  National Economic and Development Authority
NMIS  National Manpower Information System
NMYC  National Manpower and Youth Council
OED  Operations Evaluation Department
PCASTRD  Philippine Council for Advanced Science and Technology Research and Development
PRC  Professional Regulations Commission
S&T  Science and technology
TA  Technical assistance
TESDA  Technical Education and Skills Development Authority
TVET  Technical and vocational education and training
VTP II  Second Vocational Training Project

Fiscal Year (Government of the Philippines): January 1 - December 31

Director-General, Operations Evaluation: Mr. Robert Picciotto
Director, Operations Evaluation Department: Mr. Gregory K. Ingram
Manager, Sector and Thematic Evaluation: Mr. Alain Barbu
Task Manager: Ms. Linda A. Dove

The World Bank, Washington, D.C. 20433, U.S.A.
Office of the Director-General, Operations Evaluation

February 26, 2001

MEMORANDUM TO THE EXECUTIVE DIRECTORS AND THE PRESIDENT

SUBJECT: Philippines Engineering and Science Education Project (Loan 3435-PH) and Second Vocational Training Project (Credit 2392-PH)

Attached is the Performance Audit Report on the above-named projects. The Engineering and Science Education Project (ESEP) was approved for a loan of US$85 million in January 1992. The loan was closed in June 1998 after a one-year extension, and US$5.69 million was cancelled. The Second Vocational Training Project (VTP II) was approved in June 1992 for a credit of US$36 million. The credit was closed in December 1999 after two one-year extensions of the closing date.

After affirming the project ratings assigned in the Implementation Completion Reports (ICRs), this audit was undertaken for the specific purpose of examining project-level monitoring and evaluation (M&E) in the two projects. By focusing on M&E, the report intends to offer insights in support of the Bank's initiative to strengthen M&E and thereby enhance results-based project management by borrowing countries. In this context, the audit also complements the ongoing operational review of best practices in monitoring in education projects by the HDNED Quality Cluster. It also supports the initiative of the Government of the Philippines to employ systematic guidelines for monitoring and evaluating results in projects that it appraises, a process spearheaded by the National Economic and Development Authority.

Objectives.
Both projects aimed to strengthen the education and training system's capability to provide scientific and technical manpower for industry and economic development. ESEP sought to upgrade advanced-level technological and research skills. VTP II sought to expand middle-level technical skills.

Outcomes. Both ICRs assessed project outcome as satisfactory. OED has reviewed the findings of the ICRs and concurs with all their ratings. Both projects were efficacious in improving the capacities of the education and training system to supply skilled manpower. ESEP remains relevant to the government's economic and human resource development policies and VTP II is highly relevant to poverty reduction. In the operational phase, outcomes remain satisfactory. Continuing M&E activities keep the focus on results strong and steady. Institutional development impact is substantial in both projects at central management levels, but is as yet only moderate at local levels. The major weakness is that industry participation in training skilled workers remains modest. Stronger policies and incentives are needed to encourage employers to invest in human capital. Resilience to risk is substantial in both projects and sustainability is therefore likely. In ESEP, however, there is a caveat: alternative funding sources must be found in order to compensate for the discontinuation of external funding.

Main Insights on M&E

Capacity development in M&E goes hand-in-hand with institutionalizing M&E tasks as regularly expected activities. In hierarchical government bureaucracies, regular demand from managers for the products of M&E systems is critical to encouraging capacity development. To stimulate capacity development in policy evaluation, managers need to pose issue-oriented questions and ask staff to present a range of policy options based on empirical findings and informed judgment. They need to follow up by informing staff of the policy decisions eventually made.

At the design stage, all project responsibilities requiring M&E tasks need to be identified. This would help ensure that the M&E tasks are relevant to project objectives, feasible, and supportive of decision-makers striving for results. Care should be taken to avoid loading projects with M&E tasks related to broader goals that are properly the responsibility of sector and country authorities and may be beyond the scope of any single-sector project to accomplish.

M&E elements in projects are naturally diffused across various objectives and components, but they should receive detailed attention at the appraisal stage as full-fledged components. The priority need is for clear, precise M&E objectives related to overall project objectives, and for the explicit identification of users. Implementing mechanisms for achieving measurable outputs need to be related to M&E objectives. Flows of inputs and activities need to be properly sequenced. Investment and operational costs should cover all M&E clients and stakeholders. Plans should be made for how information and analysis is to be disseminated and communicated in a timely manner. M&E costs paid for with external funds need to be tailored to ensure that M&E functions can be sustained from the regular budget after project completion.
Finally, the appraisal should describe outcomes anticipated in terms of the development of a performance- and results-oriented culture and the risks associated with failure to accomplish M&E objectives.

In designing M&E components, institutional and stakeholder analysis is important. The appraisal should determine the project's formal authority to gather relevant information and to share the findings from M&E activities. The two projects illustrate the importance of establishing the legitimacy of the project to collect information from stakeholders, ensuring that incentives and the rules of the game encourage them to provide information, and securing two-way channels of communication.

The validity and effectiveness of M&E is highly dependent on the level and range of participation by the providers and users. The easy part is to generate ownership among project management and staff and host agency decision-makers. The more difficult part is to generate commitment among dispersed and diverse stakeholders. When projects involve stakeholders from multiple sectors, evaluation capacity development poses exceptional challenges. At the start, it is helpful to use project funds to defray stakeholders' costs for engaging in M&E. But the ultimate test of the usefulness and sustainability of M&E systems is that stakeholders themselves become willing to bear the costs of engagement. To achieve this, it is important for project authorities to inform all stakeholders about the purposes of M&E and to convince them that participation in developing sound practice and sharing information is beneficial to them. In this respect, M&E champions prove to be exceptionally effective because they model sound M&E practices and show how information shared can be used to advantage.

Attachment

Contents
Principal Project Ratings
Key Staff Responsible
Preface
Summary of Findings
The Projects
Monitoring and Evaluation and Evaluation Capacity Development
Main Findings
Project Outcomes
M&E Outcomes
Highlighted Finding on Evaluation Capacity Development
M&E Objectives
M&E Responsibilities
M&E-related Components and Investments
Participation in Project and Sector Monitoring
Note on Changes in Evaluation Criteria
Insights of Broad Applicability on Project-level M&E
Annex A. Engineering and Science Education Project
Key Project Data
Annex B. Second Vocational Training Project
Key Project Data
Annex C. Review of Project Experiences in M&E
Annex D. Comments from the Borrower

This report was prepared by Linda A. Dove, who audited the projects in September 2000. William Hurlbut edited, and Pilar Barquero provided administrative support.

Principal Project Ratings

Engineering & Science Education Project (ICR / Audit)
Outcome: Satisfactory / Satisfactory
Sustainability: Likely / Likely
Institutional Development: Substantial / Substantial
Bank Performance: Satisfactory / Satisfactory
Borrower Performance: Satisfactory / Satisfactory

Second Vocational Training Project (ICR / Audit)
Outcome: Satisfactory / Satisfactory
Sustainability: Likely / Likely
Institutional Development: Substantial / Substantial
Bank Performance: Satisfactory / Satisfactory
Borrower Performance: Satisfactory / Satisfactory

Key Staff Responsible

Engineering & Science Education Project (Task Manager / Division Chief / Country Director)
Appraisal: Shigeko Asher / Bradley Babson / Callisto Madavo
Midterm Review: Omporn Regel / Sven Burmester / Vinay Bhargava
Completion: Omporn Regel / Alan Ruby / Vinay Bhargava

Second Vocational Training Project (Task Manager / Division Chief / Country Director)
Appraisal: Joseph Bredie / Bradley Babson / Callisto Madavo
Midterm Review: Robert McGough / Sven Burmester / Vinay Bhargava
Completion: Omporn Regel / Alan Ruby / Vinay Bhargava

Preface

This Project Performance Audit Report, while meeting the accountability requirements expected of such reports, departs from the usual format to focus on a narrow subject area: project-level monitoring and evaluation (M&E). This "learning audit" examines the M&E components of two recently completed projects in the Philippines. The projects aimed to strengthen the capability of the education and training system to provide the advanced and mid-level technical workforce required for industry and economic development.

The Engineering and Science Education Project was supported by a World Bank loan (3435-PH) for US$85 million, approved in January 1992. The loan amount was reduced to US$61 million in October 1993 when Japan's Overseas Economic Cooperation Fund (now the Japanese Bank for International Cooperation, JBIC) offered cofinancing of US$24 million equivalent. Loan 3435 was closed in June 1998 after a one-year extension, and US$5.69 million was cancelled. The JBIC loan was disbursed by November 1998 and closed in August 2000. The Second Vocational Training Project was supported by an International Development Association credit (2392-PH) for US$36 million, approved in June 1992. The credit was closed in December 1999 after two one-year extensions of the closing date.

On the recommendations of a recent Bank M&E task force, the Bank has launched an initiative to strengthen M&E systems in Bank projects and in borrowing countries as tools for results-based management. By focusing on M&E, OED's report offers several important lessons with broad applicability for strengthening M&E components (Chapter 1). These are supported by the detailed findings on the projects' M&E experience and outcomes (Annex C). It also supports the recent innovation by the Government of the Philippines in introducing systematic guidelines for project appraisal with a results-based orientation.
Though the projects were designed in an earlier period, when measuring outputs was more prominent than a results-based orientation, the findings suggest that project managers gave increasing attention to the need to achieve results and capture outcomes through M&E.

The audit findings are based on a review of the Implementation Completion Reports (ICRs), appraisal reports, and other project documents. An OED mission visited Manila for 10 days in September 2000 to interview senior government officials and staff in line and oversight agencies, the Asian Development Bank, and JBIC. Mission information was validated through analysis of project studies and files and interviews with Bank country office staff and technical staff familiar with the projects. The author wishes to acknowledge the help and consideration of all participants in the review, especially project staff, government officials, and others who generously provided the information on which this report is based.

Following customary procedures, copies of the draft report have been sent to the relevant government officials and agencies for review and comment. Comments received have been incorporated in the text of the report and reproduced as Annex D.

Summary of Findings

The Projects

The Philippines Engineering and Science Education Project (ESEP) and the Second Vocational Training Project (VTP II) ran concurrently through the 1990s. The projects focused primarily on strengthening the capability of the education and training system to supply a skilled workforce relevant to economic development. Under the management of the Department of Science and Technology (DOST), ESEP aimed to improve training in science in secondary schools and in engineering, science, and industry research in universities and colleges (see Annex A for detailed objectives and components). Under the management of the Technical Education and Skills Development Authority since 1994, VTP II aimed to develop middle-level technical skills in formal settings, such as technical institutes, and in nonformal settings, such as trade centers and apprenticeship programs (see Annex B for detailed objectives and components). The projects sought to encourage employer participation in worker training. They both addressed problems relevant to the low quality of education and training in a context of high youth unemployment, shortages of technically skilled workers, and low levels of technological innovation, industrial productivity, and global competitiveness.

Monitoring and Evaluation and Evaluation Capacity Development

A high-performing public (or private) sector requires strong capabilities in policymaking, planning, analysis, and management. OED has argued that sound monitoring and evaluation (M&E) systems are critical for governments to achieve high performance.1 M&E should not be viewed as a mere technocratic tool. Rather, it is a vehicle for supporting a performance culture in government agencies that seek to manage by results. Capacity development in monitoring and evaluation consists of processes that help strengthen M&E functions. Some of these processes involve specialized training, but much capacity is developed on the job through learning-by-doing in carrying out M&E functions. OED's recent analysis of evaluation capacity development (ECD) processes has focused on the whole of government, while operational efforts have focused on strengthening the capacities of senior government agencies concerned with economic and fiscal management and public administration.
In the Philippines, with the support of the Bank, the government has focused on strengthening oversight agencies, including the National Economic and Development Authority, the Department of Budget and Management, and the Commission on Audit.

The investments the two education projects made in M&E were intended for institutional strengthening. In the short term, they were expected to enhance the efficiency and effectiveness of the project management units and the host agencies in carrying out project activities and achieving their outputs and objectives. In the long term, the benefits of the project investments in M&E were implicitly expected to spill over to the education and training system. The intention was that the M&E capacities developed under the projects would support sector managers in gathering information from schools and employers to ensure that the education and training system responded to the needs of modern industries for a skilled workforce. In current parlance, this was viewed as a contribution to results-based management at the sector level as well as the project level.

1. Keith Mackay. Evaluation Capacity Development: A Diagnostic Guide and Action Framework. Operations Evaluation Department. 1999.

Main Findings

The summary below provides a road map of the most significant findings of this Performance Audit Report. They should be read with reference to Annex C, which discusses the details of the projects' experience in carrying out M&E functions and in developing evaluation capacities.

Project Outcomes

The Implementation Completion Reports assessed the outcomes of both projects as satisfactory. The projects were efficacious in improving the capacities of the education and training system to supply a skilled workforce. ESEP remains relevant to the government's economic and human resource development policies and VTP II is highly relevant to poverty reduction. OED has reviewed and validated the findings and ratings of the ICRs and refers readers to the many insightful lessons provided by the borrower and the Bank (ESEP report no. 18706, November 1998; and VTP II report no. 20376, May 2000).

In the operational phase, outcomes remain satisfactory. Continuing M&E activities keep the focus on project results strong and steady in terms of equipping the project training institutions to supply skilled workers capable of contributing to economic development. The projects had only modest impact on industry's willingness to pay for the training of skilled workers. This outcome should have been predictable since the projects had few incentives to offer and lacked support from a strong economy and labor policy framework. Institutional development impact continues to be substantial in terms of the capacities of the central management agencies and leading local government and school entities but remains moderate at local levels, where capacity building will take more time.

Resilience to risk is substantial in both projects and sustainability is therefore likely. In ESEP, alternative funding sources must be tapped to compensate for the absence of external funds. DOST is able to maintain the current scholarship program from its budget and has collaborated with the sector authorities to conserve project benefits but will probably not be able to develop the project further without external funds.
Resilience to risk will be affected by the ability of the projects to remain relevant to changing government and sector policies; the establishment of a strong policy framework to encourage employers to invest their own resources in human capital formation; the retention of experienced managers and technical staff to operate the projects; and, as noted, the degree to which follow-up is likely to be supported by external funds.

M&E Outcomes

ESEP's objectives refer explicitly to M&E in two areas. Criteria were to be established for monitoring the quality and internal efficiency of science and engineering education in project schools. These included student enrollment, dropout, repetition, graduation, and research output by field of study, as well as class sizes and the research time available for faculty relative to teaching hours. Capacities for monitoring the science and technology workforce were to be strengthened through tracking the initial job placement of project scholarship-holders, the months taken to find jobs, initial pay, and employers' feedback on skills acquired. The former objective was substantially achieved by project completion. In the operational phase, the Department of Science and Technology (the host agency) has continued to work with the sector authorities and individual schools to institutionalize the use of monitoring indicators. The department continues to monitor the initial placement of graduate scholarship-holders in industry employment. It has had less success in monitoring the needs of diverse employers for skilled graduates in the science and technology field.

VTP II's objectives refer explicitly to strengthening the capabilities of the host agency in policy formulation, planning and management, and research and evaluation, specifically in the area of nonformal vocational training and employment services. These objectives were partially achieved in the latter years of implementation once the host agency was reestablished as the Technical Education and Skills Development Authority (TESDA) and subsequently restructured. The project was highly successful in establishing a project monitoring system and the building blocks within the central TESDA for planning and management functions, and partially successful in sector policy formulation. It was less successful in producing evaluation findings and analytical research of high relevance for policy formulation in nonformal vocational training and employment services. The large training and studies programs were less efficient and effective than anticipated in producing in-house capacities for policy analysis. This was in part because of the frequent turnover of project personnel, the lack of direction of the studies program, the heavy demands of internal reorganization and project management, and the time needed to establish a clear role for TESDA.2

2. For elaboration of these points about studies and feedback from TESDA, see Annex C, pp. 31-32.

Table 1 provides a summary assessment of the projects' performance in M&E based on OED's rating methodology.
Table 1. OED Assessments of M&E Outcome (ESEP M&E / VTP II M&E)
Outcome: Satisfactory / Satisfactory
Relevance: High / Substantial
Effectiveness: Moderate / Moderate
Efficiency: Moderate / Moderate
Institutional Development Impact: Moderate / Moderate
Sustainability: Uncertain / Likely
Borrower Performance: Satisfactory / Satisfactory
Bank Performance: Moderately satisfactory / Moderately satisfactory

Highlighted Finding on Evaluation Capacity Development

If the two projects reviewed are typical for the Philippines, the government has a rich stock of local experience and wisdom on which to draw in developing the country's evaluation capacities for the future. Under the projects, the effort, purposefulness, and perseverance shown in meeting larger-than-life responsibilities for M&E turn out to have been extraordinary. The willingness of people involved in the projects to reflect on experience and to articulate important issues shows that a tremendous amount of reflection and learning has occurred. The main credit for enhancing project monitoring capacities is due to managers and staff of the projects and the host agencies in establishing regular information flows and cross-checks. Recently, as the need for policy guidance has become urgent, senior managers have expressed a need for just-in-time evaluative studies to help them set out sector policy options. However, as noted, in-house analytic capacity has been slower to emerge than capacity for routine project monitoring.

M&E Objectives

At the design stage, M&E objectives were not entirely clear, outcomes were not precisely described, and the mechanisms for implementation received little attention. While the lack of prescription gave flexibility, it also led to some lack of direction, and to delays and costs that might have been avoided with more precision and detail. In hindsight, the critical questions to have asked at appraisal were: who needs the M&E output; for what purposes and when; and who is responsible for providing it?

In both projects, major problems arose, for example, in establishing the management information system (MIS) and in deciding on its main use. The technical difficulties of designing the system and procuring and installing hardware and software were vastly underestimated, even though similar experience elsewhere readily showed that these processes are time-consuming and should be carefully sequenced. Since the immediate need was to proceed efficiently towards project targets, the MIS was pressed into service for internal project management purposes, which, although a legitimate use, was not the main focus at the appraisal stage. The MIS infrastructure for both projects was operational around the project completion stage at central levels. Some problems remain with connectivity, software compatibility, and data collection and harmonization. More work is needed for VTP II to have the system operational at provincial levels. There is also some concern that the MIS infrastructure may not be ideally suited to the grand plan for M&E first developed in 1995 and recently revised and adopted.

Similar issues of precision in objectives and sequencing affected investments in personnel training in various aspects of information management and cost and policy analysis. It is noteworthy that both projects chose to contract out the major analytic studies.
This was a reasonable option since they could draw on substantial local and international evaluation expertise and had dedicated project funds to do so.

The original focus for M&E in the projects was labor market analysis, but this expectation was not clearly defined and has been only modestly fulfilled. At the design stage, the type of analysis required was not clearly specified in relation to the specific needs of potential users. During implementation, the task of collecting accurate information from "the labor market" proved time-consuming and difficult and gave way to more pressing demands. Indeed, TESDA comments that for VTP II the primary aim was to strengthen training centers, and that, while the M&E system did provide inputs to labor market analysis, this was not the main purpose. In retrospect, the projects should not have been burdened with such vague responsibilities, and it is arguable that the responsibility for labor market analysis is best left with specialized economic and social policy bodies, whether in the public or private sector. Discussions are currently underway, for example, about whether the recently established Professional Regulations Commission is the proper source of information on occupational licensing, registration, and accreditation, rather than the projects that have helped to develop such services in various trades and professions.

M&E Responsibilities

The range of M&E activities and tasks expected of the two projects turns out to have been extensive. This complexity was definitely not anticipated in the original project design but, rather, emerged during implementation. For purposes of analysis, the M&E elements identified in the projects may be related to four areas of responsibility: project management, sector management, regulation of sector training institutions, and development of sector policies. In reality, of course, these responsibilities (and related M&E tasks) are not always distinct. Box 1 provides an overview of the M&E tasks identified in the projects and a summary rating of implementation performance. Annex C discusses outcomes for the four areas of responsibility.

Box 1. Project Responsibilities and Related M&E Tasks, with Implementation Performance (ESEP / VTP II)

Project Management
* Reviewing implementation progress to plan project activities, budget needs, and spending: Highly satisfactory / Highly satisfactory
* Monitoring and reporting for compliance with Bank and borrower requirements: Moderately satisfactory / Satisfactory
* Evaluating project experience and assessing project achievements: Satisfactory / Satisfactory

Sector Management
* Providing relevant and accurate information on the performance of providers of training and related occupational services: Moderately satisfactory / Moderately unsatisfactory3
* Providing analysis of sector issues and recommendations on strategy: Moderately satisfactory / Moderately satisfactory

Regulation of Sector Training Institutions
* Developing performance standards: Satisfactory / Moderately satisfactory
* Providing quality assurance services: Satisfactory / Moderately satisfactory

Development of Sector Policies
* Evaluating project strengths and weaknesses to inform sector policy formulation: Moderately satisfactory / Satisfactory
* Analyzing sector issues and needs: Moderately satisfactory / Moderately satisfactory
* Exploring policy options and priorities: Moderately satisfactory / Satisfactory
* Recommending future projects to support sector policies: Moderately satisfactory / Satisfactory

The number of management-related M&E tasks expected of the projects went beyond those envisaged at appraisal for several reasons: failure to set out detailed implementation mechanisms and plans; increasing demands over time on managers in government agencies to demonstrate efficiency, accountability, and development outcomes; the changing scope of the host agencies' mandates as they negotiated roles with new partners in the training sectors and with diverse industry players; and the ready availability of Bank project funds for investment and operational needs in a government context where budgetary resources for M&E were scarce. Nevertheless, as the summary ratings indicate, both projects performed satisfactorily overall in coping with the various M&E tasks that they assumed.

3. TESDA comments that the project design did not include providing information on the providers of training and occupational services and that this area should not have been assessed. In fact, the original project documents clearly state that the project should develop M&E systems to monitor the technical, management and financial performance of training institutions. In practice, this proved extremely difficult given the multiplicity and diversity of providers. Under the reorganization, efforts were made, nevertheless, to produce an inventory of and baseline information on TVET providers. However, this is far from complete and requires determined effort in order to establish a sound information base for targeting and designing project interventions aimed at strengthening training institutions according to needs.

M&E-related Components and Investments

The projects established four essential building blocks for a viable results-based management system, even though an explicit plan to do so was not set out at appraisal. The M&E-related elements in both projects were not designed as one integrated component, but rather as individual inputs and outputs. Inputs included personnel training, technical assistance, computer hardware, software and programs, and routine monitoring. These inputs contributed to four outputs:

* An M&E plan covering the training sector
* An MIS to support data gathering and analysis
* Quantitative targets and indicators for monitoring project performance and achievements
* Studies to identify issues and guidelines for planning and policy.
Several important points arose during the audit mission. The building blocks took the entire implementation period to establish and the subsequent operational phase was critical for refining and developing them. Because M&E was not a component as such, the process of sequencing and integrating the building blocks was less than fully efficient. Since the building blocks were not in place during implementation, the projects were unable to assemble comprehensive data on the relevant training and industry sectors; yet these are critical for fully effective results-based management.

Even when the four building blocks were set in place, they would not alone have formed a functioning M&E system. Also needed were the attentiveness of managers, the trained M&E personnel, and the institutionalization of M&E activities in regular agency operations. The projects provided all three inputs to capacity-building fairly successfully. But a fourth, and equally critical, element is the set of incentives that would encourage providers and users of information to buy into the emerging performance culture. This is where much of the unfinished business lies.

OED's best guess is that the direct costs of M&E investments constituted 2-4 percent of total project costs. Since the projects probably had more M&E-related elements than many education projects of the era, costs do not appear excessive. Actual costs probably exceeded original expectations because of delays in installing the MIS in both projects. Personnel costs also included staff development, temporary help for data processing, survey work, and commissioned studies. In VTP II, staff development expenditures overall were substantial and many times higher than planned. Neither estimated nor actual costs can be precisely determined, however, because they were spread across several components and expenditure categories under the loan agreements. It would take enormous research to separate them out retrospectively from budgets and accounts that were not originally designed to identify them.

Participation in Project and Sector Monitoring

An important finding is the enormous range of participation in monitoring tasks. The projects are concerned with the internal efficiency and quality of the education and training system. Therefore, M&E tasks and duties are shared in one form or another at many levels: the sector departments and education authorities at central, regional, and provincial levels and in local governments; and the governors, owners, managers, staff, and students of beneficiary schools. Without this participation, essential M&E-related tasks - providing information, assembling data, preparing reports, responding to surveys, and giving advice and feedback to questionnaires - would not get done.

At central project levels, a performance culture has developed. The same is true of some, but not all, local entities and the flagship schools that have become project champions. Because monitoring activities have become part of central office processes, M&E functions and systems are fairly well institutionalized. This is a tremendous achievement in view of shortages of specialists and the severe discontinuity from staff redeployment, turnover, and reliance on short-term contractual help for inputting MIS data. At local levels of the education system much more effort and time is needed to strengthen performance cultures, especially with decentralization underway.
The challenges come from several factors: the lack of perceived usefulness of M&E, especially to those required to perform basic tasks in data processing; the burden of the M&E tasks on top of other administrative requirements for those whose positions are not fully dedicated to host agency duties; the problem of making sense of data compiled from different information systems and for different purposes; and the lack of responsive analysis for managers' use on important trends and policies. Local capacity building is a priority under the Asian Development Bank-assisted project that will follow VTP II,4 and sustainability is likely. Under ESEP, the plan is to continue to monitor training institutions, scholarship holders, and graduate placement without the support of a follow-up project. In the long term, much depends on the level of demand from industry and trainers for the evaluation products.

Since the projects are equally concerned with internal and external efficiency, their ability to assemble relevant information, and hence their effectiveness in M&E, depends critically on reaching out across several sectors, even where they have little control. Employers hold different views on the value of high-level skills in technical and scientific fields and possess uneven capacities to invest in skills. Hence, industry willingness to provide and finance training remains a big issue. The challenge imposed by the need to involve many stakeholders probably increases costs because of the broad scope of monitoring responsibilities and the need to help defray stakeholders' costs in providing information and feedback. In the event, both projects had only partial success in building industry commitment and willingness to pay for information and evaluation.

A universal constraint on establishing a widespread performance culture is the perception of M&E as threatening, destructive, and risky. In the Philippines, for reasons to do with recent political history, M&E is linked in some regions with heavy-handed central regulation. In close-knit professional, academic, and government circles, collegiality and partnership among peers and families are important. Furthermore, as elsewhere, factual data and empirical judgments may be no more valid, reliable, or influential in decision-making than social and political influence.

Contextual factors proved to be highly important in establishing M&E systems that can respond to the needs of decision-makers and clients. In a complex environment, both projects had to work hard to develop a network of providers and users of information needed to establish an effective M&E system. A volatile economic environment created problems for the projects in monitoring the needs of employers for skilled people. Changing structures of governance in education created ambiguities about which decision-makers needed to be kept informed and how best to channel information to stakeholders. As previously noted, the projects were challenged to elicit information from diverse training providers and employers dispersed across the country. In addition, they had to contend with the fact that, in both public and private sectors, these two groups have considerable discretion as to whether and how they respond to project requests and directives.

4. Technical Education and Skills Development Project and Fund for Technical Education and Skills Development, Asian Development Bank. July 2000.
As the authority directly responsible for managing the entire technical and vocational education and training system, TESDA has full formal authority to gather information, though in practice it has to work hard to win commitment from the many training providers, local governments, and business enterprises. As the government agency responsible for science and technology industry development, DOST has strong links with industry leaders but has to work hard to win commitment from the smaller employers. Moreover, although the department is represented in the governing structures of the education system, it lacks the final authority and must work through the senior sector agencies, as well as local governments, to win commitment from the more peripheral schools and colleges.

Note on Changes in Evaluation Criteria

In 1998, the National Economic and Development Authority (NEDA) assessed the projects as successful against original objectives and ESEP's achievements as highly satisfactory at completion. Nevertheless, NEDA has applied different criteria in appraising proposed follow-up projects financed with foreign assistance. This is because of changed government priorities. Because of enhanced attention to poverty-reducing policies, VTP II is firmly positioned as a social sector project. Its focus on middle-level skills means jobs and income for the less affluent. Before completion, it had redirected resources to skills training for the rural poor and women. Consequently, it retains a legitimate position in the policy agenda. Against the same policy criterion, ESEP has found itself much less firmly positioned. It focuses on advanced technological skills that in the short term directly benefit the relatively affluent. Consequently, the project is vulnerable when judged as a social sector project. In addition, it is not regarded as a core industry sector project, despite being managed by DOST, because by design it focused on the supply of skills and rather less on industry demand. In practice, both projects continue to face difficulties on the demand side, but approval of the ADB-financed follow-up project to VTP II enhances its potential impact and sustainability.

Insights of Broad Applicability on Project-level M&E

Beyond their specific relevance to the education and training sector in the Philippines, the projects yield several insights of broad applicability that are worthy of consideration in the design of M&E at the project level.

Capacity development in M&E goes hand-in-hand with institutionalizing M&E tasks as regularly expected activities. In hierarchical government bureaucracies, regular demand from managers for the products of M&E systems is critical to stimulating capacity development. To promote the development of monitoring capacities, as VTP II staff commented, managers need to consistently demand reliable information on progress and results and demonstrate that they use it in performance management. When managers regularly expect information, staff learn how to resolve technical problems and make use of feedback for issue identification and remedial action. To encourage capacity development in policy evaluation, managers need to pose issue-oriented questions, demand analysis of the facts, and ask staff to present a range of policy options based on empirical findings and informed judgment. They need to follow up by informing staff of the policy decisions eventually made.
At the design stage, all project responsibilities requiring support from M&E functions need to be identified, and expansion of M&E tasks should be avoided unless they directly relate to the achievement of project objectives. Under ESEP, the scope of the monitoring exercise was successfully limited to support project objectives and the results were successfully fed into the M&E efforts of the relevant sector authorities. Under VTP II, the scope expanded along with the overall responsibilities of the host agency. The project was burdened by the expectation that it could establish M&E across the entire vocational and technical education and training sector. Though VTP II established many of the building blocks, the objective was beyond the capacity of any one project to accomplish.

M&E elements in projects are naturally diffused across various objectives and components, but they should receive detailed attention at the appraisal stage as full-fledged components. The priority need is for clear, precise M&E objectives related to overall project objectives, and explicit identification of information use and users. Measures to encourage evaluation activity and optimal follow-up of findings must be built in. Implementing mechanisms for achieving measurable outputs need to be related to M&E objectives. Flows of inputs and activities need to be properly sequenced, though in incremental packages to allow for adjustments. Investment and operational costs should cover all M&E clients and stakeholders. Plans should be made for how information and analysis is to be disseminated and communicated in a timely manner. M&E costs paid for with external funds need to be tailored to ensure that M&E functions can be sustained from the regular budget after project completion. Finally, the appraisal should describe expectations of outcomes in terms of development of a performance culture and the risks associated with failure to accomplish M&E objectives. In the case of the two projects reviewed, such plans were lacking at appraisal, in common with most projects of the time. Effective mechanisms were developed during implementation through the perseverance of project and Bank staff, but they suffered delays that might have been avoided with more detailed planning.

In designing M&E components, institutional and stakeholder analysis is important. The appraisal should determine the project's formal authority to gather relevant information and to share the findings from M&E activities. The two projects illustrate the importance of clarifying channels of communication and legitimacy with important stakeholders. ESEP needed to establish legitimacy through partnership with sector authorities, while TESDA had the advantage of being the sector authority. Both projects needed to understand the incentives and rules of the game motivating training providers and employers to provide needed information for M&E purposes.

The validity and effectiveness of M&E is highly dependent on the level and range of participation by the providers and users. The easy part is to generate ownership among project management and staff and host agency decision-makers. The more difficult part is to generate commitment among dispersed and diverse stakeholders. When projects involve stakeholders from multiple sectors, evaluation capacity development poses exceptional challenges. At the start, it is helpful to use project funds to defray stakeholders' costs for engaging in M&E.
But the ultimate test of the usefulness and sustainability of M&E systems is that stakeholders themselves become willing to bear the costs of engagement. To achieve this, it is important for project authorities to inform all stakeholders about the purposes of M&E and to convince them that participation in developing sound practice and sharing information is beneficial to them. In this respect, M&E champions prove to be exceptionally effective because they model sound M&E practices and show how information shared can be used to advantage. Unfortunately, in the case of the projects, contextual disincentives operated against full stakeholder ownership. Training providers were aware of government intentions to downsize poor-performing programs and employers were reluctant to take over the costs of worker training from a heavily subsidized education and training system.

Annex A. Engineering and Science Education Project
(Staff Appraisal Report no. 9907, December 1991)

The overall objective was to increase the supply of well-trained science and technology (S&T) workers by strengthening S&T education in support of the government's plan to upgrade the industrial technological capability. The project aimed to strengthen selected engineering and science colleges; adjust enrollment patterns to respond to S&T workforce needs; introduce new courses and programs for management of the environment and technology; and improve the quality of instruction and laboratory practices in priority engineering and science fields.

Objectives
* To improve institutional mechanisms and criteria for funding and monitoring engineering and science education.
* To strengthen colleges of engineering and science that had met eligibility criteria by increasing their financial and resource management capacities.
* To improve science and math instruction in secondary schools to better prepare students for engineering and science colleges.
* To increase institutional capacities to plan and coordinate S&T workforce development programs.

Policy Action Plan

The plan was expected to produce criteria and directives to improve quality. It aimed to phase out over-expanded and substandard undergraduate engineering programs; establish targets to improve the balance of enrollments in various fields and levels; and improve financial management in schools selected to participate in the project. These included the 10 colleges and universities offering science education and 19 out of the 183 colleges and universities offering engineering in priority fields. The selected "flagship" institutions already offered high-quality science and engineering programs and the others were to be upgraded.

Components
* The development of programs, faculty, and laboratories for engineering education at 5 state and 14 private institutions; science education at 4 state and 6 private institutions, with environmental programs strengthened at 3 of them; and management of technology at 1 state and 2 private institutions.
* Expanded reference book and journal collections, staff development, and networking arrangements for libraries at 3 engineering and 7 science colleges.
* In-service teacher training, library books, laboratory equipment, and facilities for improved science and math education in 110 high schools and 22 teacher training institutions.
* The development of S&T workforce planning and monitoring capacity within the DOST.

With changes imminent in the governance of higher education, the project was allotted to the DOST.
Although this was the department's first Bank project, DOST was well positioned because of linkages with industry and regular contact with other departments, such as Trade and Industry and Agriculture.

Key Project Data

Philippines: Engineering and Science Education Project (Loan 3435-PH)
Executing Agency: Department of Science and Technology

Financing (US$ equivalents): appraisal estimate; actual or current estimate; actual as % of appraisal estimate
Total project costs: 125.3; 103.8
IBRD: 85.0; 56.33
OECF/JBIC: 15.9; 23.7
Private universities: 15.83

Cumulative Estimated and Actual Disbursements (US$ million), FY92-FY98
Appraisal estimate: 2.00; 13.00; 37.00; 65.00; 80.00; 85.00
Adjusted estimate: 0; 1.72; 5.17; 17.46; 51.46; 61.00
Actual: 0; 1.72; 5.17; 19.09; 33.74; 46.33; 55.53
Actual as % of estimate: 0; 100; 100; 114; 65; 76; 91
Date of final disbursement: November 10, 1998

Project Dates (original / actual)
Appraisal: March 3, 1991 / March 9, 1991
Negotiations: December 5, 1991
Board presentation: November 5, 1991 / January 28, 1992
Signing: February 5, 1992
Effectiveness: June 3, 1992
Project completion: June 30, 1997 / June 30, 1998
Loan closing: October 31, 1997 / December 31, 1998

Staff Inputs (Actual/Latest Estimate): no. staff weeks / US$ (000)
Preappraisal: 114.4 / 345.9
Appraisal: 38.2 / 109.2
Negotiations: 9.3 / 31.0
Supervision: 130.4 / 412.8
Completion: 8.0 / 24.8
Total: 300.3 / 923.7

Mission Data (stage of project cycle; date (month/year); no. of staff in field; duration of mission (# of days); specializations represented (a); implementation status (b); development objectives; types of problems (c))
Identification: Oct. 1989; 3; 14; OP/ENG/S&T
Preparation: March 1990; 5; 14; OP/S&T/ENG/FA
Appraisal: March 1991; 4; 15; OP/ENG/S&T/AR
Postappraisal: June 1991; 2; 11; ED/AR
Supervision 1: Feb. 1992; 4; 8; TM/S&T/ENG/AR; 1; 1; Budgetary
Supervision 2: July 1992; 3; 12; TM/S&T/ENG; 1; 1; Budgetary
Supervision 3: Nov. 1992; 3; 10; TM/ENG/AR; 1; 1; Budgetary
Supervision 4: March 1993; 3; 13; TM/ED/AR; 2; 1; Budgetary
Supervision 5: May-June 1994; 2; 9; S&T/PD
Mid-term Review: March-Apr. 1995; 3; 12; ED/S&T/AR; HS; HS
Supervision 6: Sept. 1995; 2; 13; S&T/AR; HS; HS
Supervision 7: Feb.-March 1996; 2; 10; ED/S&T; HS; S
Supervision 8: Feb. 1997; 2; 13; TED/ED; S; S; Financial
Supervision 9: Oct.-Nov. 1997; 3; 12; TED/ED/AR; S; S; Financial
Completion: June 1998; 2; 11; TM/ED; S; S
a. A = Architect; ENG = Engineering Education Specialist; ED = General Education Specialist; FA = Financial Analyst; PD = Procurement/Disbursement Specialist; S&T = S&T Education Specialist; TED = Technical Education Specialist; TM = Task Manager.
b. 1 = Highly satisfactory; 2 = Satisfactory.
c. Typical problems included delays in providing budget allocations and loan withdrawals for taxes.

Annex B. Second Vocational Training Project
(Staff Appraisal Report no. 10221, May 1992)

The project's aims in the medium term were to support the government in improving employment and training policies; strengthen employment services; expand nonformal, rural, basic training; and increase private sector involvement in training and cost-recovery. In the long term, it aimed to improve the internal and external efficiency of the technical and vocational education and training (TVET) system.

Objectives
* Strengthen the capabilities of the National Manpower and Youth Council (NMYC) in policy formulation, planning and management, research, and evaluation of nonformal vocational training and employment services.
* Improve training quality and cost recovery.
* Upgrade training facilities.
* Guide future improvements in formal technical education through two studies, including an identification of investment needs.

Components

A. Institutional Development
* Management training to help in restructuring the NMYC secretariat.
* Training in policy, planning, and coordination for central and regional staff.
* Development of a National Manpower Information System (NMIS) to facilitate the decentralization of management and strengthen employment services.

B. Training Quality and Cost-recovery
* Improvement and promotion of the National Skills Certification Program.
* Development of a program to train the trainer.
* Improvement of curricula and training materials.
* Strengthening of guidance and employment services.
* Development of monitoring and evaluation systems to allow NMYC to monitor the technical, management, and financial performance of training institutions.
* Development of mechanisms to increase cost sharing with industry, communities, and trainees.

C. Training Capacity Development
* Upgrading of 14 Regional Manpower and Training Centers, 13 Provincial Manpower and Training Centers, and the national Training Skills Center.
* Expansion of enterprise-based training by private sector firms, support for existing and new training programs, improvement of apprenticeship training, and support for a rural skills and livelihood training program.

D. Studies
* A study of the formal technical and vocational education sector to improve the database and develop an investment strategy to rationalize formal technical and vocational education.
* A feasibility study of the needs for the establishment of sector-specific training centers and advanced technology training centers.

Initially, NMYC became the implementing agency. When TESDA took over, NMYC staff were retained. The TESDA Act gave the agency broad responsibility for the direction, policy, and guidance of training provision and employment services; for program accreditation, skills standardization, and certification; and the upgrading of training institutions' capabilities. It was to promote private sector participation and devolve direct training provision to local governments.

Key Project Data

Philippines: Second Vocational Training Project (Credit 2392-PH)
Borrower/Executing Agency: National Manpower and Youth Council (defunct); Technical Education and Skills Development Authority

Financing (US$ equivalent): actual or current estimate; actual as % of appraisal estimate
Total project costs: 46.0; 100.7
Loan amount: 31.0; 86.1

Project Dates (original / actual)
Appraisal: April 1991
Board presentation: June 1992
Effectiveness: December 1992
Completion/Credit closing: December 1997 / December 1999

Staff Inputs (Actual/Latest Estimate): no. staff weeks / US$ (000)
Identification/Preparation: 64.9 / 156.2
Appraisal/Negotiation: 63.3 / 148.7
Supervision: 127.22 / 271.06
ICR: 10.00 / 20.00
Total: 265.42 / 595.96

Mission Data (stage of project cycle; date (month/year); no. of staff in field; specializations represented (a); implementation status; development objectives)
Identification/Preparation: Nov. 1989/Oct. 1991; 3; TES/ES/E
Appraisal: May 1991; 6; TES/ESE/E/A
Post appraisal/Negotiations: April 1992
Supervision: July 1992; 2; TES/A; HS; HS
Supervision: Nov. 1992; 2; TES/A; HS; HS
Supervision: May 1993; 2; TES/OO; S; HS
Supervision: Nov. 1993; 3; OO/PS/PA; S; HS
Supervision: June 1994; 3; TES/ESE/PS; S; S
Supervision: Sept. 1994; 3; TES/ESE/PE; U; S
Supervision: April 1995; 3; TES/ESE/PS; S; S
Supervision: Sept. 1995; 2; ES/AE; S; S
Supervision: March 1996; 3; S/PS/OO; S; S
Mid Term Review: Feb. 1997; 2; TES/PS; S; S
Supervision: May 1997; 1; TES; S; S
Supervision: Nov. 1997; 3; TES/PS/A; S; S
 | Nov. 1997 | 3 | TES/PS/A | S | S
 | June 1998 | 5 | TES/A/OO/PS/FMS | S | S
 | Sept. 1998 | 5 | TES/OO/PS/FMS/ISS | S | S
 | March 1999 | 5 | TES/OO/PS/FMS/ISS | S | S
ICR/Completion | Oct. 1999 | 5 | TES/OO/PS/FMS/ISS | S | S
a. A = Architect; ESE = Education Specialist Engineer; FMS = Financial Management Specialist; ISS = Informal Sector Specialist; TES = Technical Education Specialist; OO = Operations Officer; PA = Procurement Advisor; PS = Procurement Specialist.

Related Bank Credits
Project | Credit | US$ | Year of approval | Closing date
First Vocational Training Project | L2200-PH * | | 1983 | April 1991

Annex C. Review of Project Experiences in M&E

Context
Country Economic and Sector Environment
The projects were identified in the context of the Bank's support to the Philippines following five years of promising political, economic, and fiscal adjustments under the Aquino government. In 1990, the Country Assistance Strategy supported the government's program for economic and social development. It sought to sustain the economic recovery, strengthen public sector management, develop the private sector, alleviate poverty, and increase spending on human resource development. Despite their relevance to the policy environment, the Engineering and Science Project and the Second Vocational Training Project were prepared and implemented under difficult conditions: the oil crisis, natural disasters, inflation, budgetary deficits, economic volatility, and unemployment. New technologies and global competition challenged the knowledge and skills of the workforce. A recent Bank sector study had examined serious issues of internal and external efficiency in education,5 and, with strong support from the Ramos administration, the Congressional Education Commission recommended sweeping reforms amid highly charged public debate. Reform issues included governance and decentralization; management efficiency; the financing and cost sharing of public and private schools; and improvement in educational performance, especially basic education.

Education Sector Governance
In 1994, two years into implementation, new legislation created a "trifocalized" system of governance. The Department of Education, Culture, and Sports (DECS) dropped responsibility for higher education and vocational schools and institutes to focus on basic and general secondary education. The new Commission on Higher Education (CHED) took over higher education.6 The Technical Education and Skills Development Authority (TESDA) replaced the National Manpower and Youth Council to take over nonformal vocational and technical training as well as vocational schools and institutes. All three agencies (DECS, CHED, and TESDA) have authority for policy development, system management, and quality assurance, and report to the president and the cabinet. They work through school owners and managers, local governments, and regional and provincial offices. On the ground, policy directives and operational instructions are more or less effective depending on how local authorities and stakeholders respond. In the five to six years of project implementation remaining after 1994, the unpredictability caused by rationalization and reorganization of governance and the decentralization of education management began to lessen. The three agencies worked out their respective responsibilities and partnerships, and stakeholders learned new rules of the game. But the shifting institutional context demanded flexibility in project monitoring and management.
After delayed midterm reviews, the projects prepared for completion just as the Estrada administration promoted a new set of initiatives. Poverty reduction, along with basic and nonformal education, became overarching priorities. The Medium Term Philippine Development Plan set out investment priorities for foreign assistance. An Asian Development Bank-World Bank education sector study, Philippine Education for the 21" Century, fed into a new Presidential Commission on Educational Reform. 5. The Philippines Education Sector Study. 1988. World Bank. Washington D.C. Report no. 7473-PH. 6. Mona Dumlao Valisno (Ed.). The Reform and Development of Higher Education in the Philippines. Philippine National Commission for UNESCO Education Committee. Manila. 2000. 20 Annex C The commission aimed to consolidate recent reforms and to continue with decentralization, and quality improvement and rationalization in the school system.' The Professional Regulations Commission began work on upgrading occupational standards. Recently, the National Coordinating Council for Education was formed to ensure strong linkages between the three education subsectors under the trifocalized management. Context for M&E The education projects were identified a decade before the Bank began to view the strengthening of M&E functions as a critical investment in results-based management. The Bank had always required progress monitoring and typically included support for project training in Bank monitoring and reporting procedures. During the 1990s, capacity development in M&E began to be viewed as a means of enhancing the quality of the Bank's lending and advisory work. However, in the early 1990s, education projects invested little in developing evaluation capacities, and had little internal capacity to do so. Consequently, evaluation components rarely enjoyed much attention at the design stage or later. Though the terminology of results-based management was not used in the two projects reviewed, they turn out to have paid significant attention to monitoring and evaluation as a useful management tool. Bank Expectations for M&E Thought the projects were prepared just before M&E gained serious attention, they were challenged by changing Bank expectations throughout their implementation period. In 1993, Getting Results' placed evaluation of development outcomes at center stage. By the mid-1990s, new M&E champions in the Bank joined with OED, notably the Quality Assurance Group and the Operational Core Services network. By the midterm review in 1997, accountability and measurement of results had become the focus of Bank attention and the intensity of Bank supervision increased. The country office staff helped the projects adjust to new standards for accounting and auditing. The logical framework became the basis for new project design requiring clear and measurable objectives, sound baseline data, and indicators of outcome as well as outputs. At the completion stage, the Comprehensive Development Framework raised expectations that Bank staff would ensure collaborative effort with government in M&E, especially in monitoring progress toward the recently reinforced goal of poverty reduction. The Bank began to emphasize the importance of capacity development in M&E as one vehicle for establishing a performance culture as borrowers pursued development goals. Similar shifts occurred in the Bank's approach to M&E in education and training. 
They were too late to affect the projects' design but they did raise expectations at midterm and completion. In 7. Philippines Education for the 21st Century and the 1998 Philippines Education Sector Study, Asian Development Bank and World Bank. 1999; Presidential Commission on Educational Reform.2000. Philippine Agenda for Educational Reform. The PCER Report. Manila. 8. Edward B. Rice. Monitoring and Evaluation Plans in StaffAppraisal Reports Issued in Fiscal Year 1995. A Follow- up to OED's Report "An Overview ofMonitoring and Evaluation in the World Bank " Report no. 15222. December 29, 1995. World Bank: Operations Evaluation Department: Washington, D.C. Operations Evaluation Department. Built in Project Monitoring and Evaluation. A Second Review. (Education) Report no. 2724. November 2, 1979. World Bank: Washington, D.C. Operations Evaluation Department. An Overview of Monitoring and Evaluation in the World Bank. Report no. 13247. June 30, 1994. World Bank: Washington, D.C. 9. Getting Results: The World Bank's Agenda for Improving Development Effectiveness. World Bank, Washington D.C., July 1993. 21 Annex C 1995, Priorities and Strategies " emphasized the importance of evaluating qualitative outcomes in education and training projects, especially school effectiveness and student achievement. In 1999, the Education Sector Strategy " also emphasized the need to strengthen educational institutions, including M&E for management, planning, policy, and quality enhancement. In preparing for the completion stage, project managers, supported by Bank staff, struggled to retrofit databases and indicators to better reflect development outcomes. Country Demands on Evaluation Capacities From the mid-1990s, the Ramos administration strongly promoted reforms in public sector management and governance and emphasized accountability. The oversight agencies, including the Department of Budget and Management, the Inspectorate General, the Central Office for Auditing and the National Economic and Development Authority (NEDA), tightened monitoring procedures. In adopting the logical framework, NEDA tightened its appraisal criteria for approving foreign-assisted projects. In 1998, the agency commissioned an evaluation of the entire Bank portfolio." The draft report rated ESEP's achievements as highly satisfactory, and VTP II's progress as satisfactory.13 NEDA's Medium-Term Investment Framework set out socioeconomic indicators for poverty-reduction policies and identified social sector programs, including basic and nonformal education, as candidates for external assistance. In 1999, the Bank's Country Portfolio Performance Review and the first joint ODA Portfolio Review by the government, Asian Development Bank, Japanese Bank for International Cooperation, and the World Bank- were strong advocates of results-based management on the grounds that it promised to reduce implementation delays and inefficiencies and enhance outcomes. M&E in Project Design Both projects included M&E-related elements typical of most investment projects in education. Analysis reveals that investments in M&E were fairly substantial, even though they were not viewed as major project components at the design stage. ESEP included provision for routine monitoring, project staff training in Bank procedures, and evaluation studies (included under technical assistance). Table C1 shows categories and estimated costs for ESEP components for which M&E-related elements are identifiable. 
Together, they amount to only about 1.75 percent of total project costs. This is an underestimate, however, since the promotion of performance standards and quality assurance in science and engineering schools was an important M&E element that cannot be separated out from other expenditure categories. Project management was by far the most prominent expenditure category among the three components. The science and technology workforce planning and monitoring component included evaluation studies and much more. Only the MIS was exclusively dedicated to M&E. Project accounts do not lend themselves to identification of actual M&E costs because M&E was in part viewed as an overhead and not a prominent component.

10. World Bank. Priorities and Strategies for Education: A World Bank Review. 1995. Washington, D.C.
11. World Bank. Education Sector Strategy. 1999. Washington, D.C.
12. NEDA benefited from a PHRD grant for ECD whose main output was a commissioned impact study of the Bank portfolio in 1998. This was produced in draft in 1999 (First Annual Report: Development Impact of World Bank Assisted Projects. Draft. National Economic and Development Authority, Manila, 1999). Since implementation of the grant was slow, it was terminated early.
13. In 1999, OED's Philippines Country Assistance Review rated both projects unsatisfactory in design because they did not analyze labor market failures. OED proposed that analysis should focus more on sector developments than on projects. Gianni Zanini. Philippines: From Crisis to Opportunity. Operations Evaluation Department. 1999. The 1999 ADB-Bank sector study addressed labor market issues in some depth.

Table C1. ESEP: Estimated Costs for M&E-related Categories, 1991
S&T Workforce Planning and Monitoring | US$ million | %
1. Information System | 0.3 | 15.8
2. Project Management | 1.3 | 68.4
3. Planning and Monitoring | 0.3 | 15.8
Total Base Cost | 1.9 | 100
Total Costs incl. Contingencies | 2.3 |

In VTP II, M&E-related costs appear to be somewhat higher. Table C2 shows that the components with M&E elements constituted over 30 percent of estimated project cost. Even if M&E elements constituted only one-third of the 30 percent, they would have amounted to 4 percent of total project cost.

Table C2. VTP II: Estimated Costs for Key M&E-related Categories, 1992
Institutional Development | US$ million | %
1. Strengthen Institutional Capability | 2.1 | 20.6
2. Workforce Policy, Planning and Coordination | 4.9 | 48.0
3. National Manpower Information System | 2.0 | 19.6
4. Studies | 0.7 | 6.9
5. Project Management | 0.5 | 4.9
Total Base Cost | 10.2 | 100
Total Cost incl. Contingencies | 12.9 |

TA and operational overhead costs were especially high. The ICR shows that the total actual cost of these components amounted to more than $18 million, or 40 percent more than estimated. Factors behind the increase include a much larger than anticipated staff development program, delays in establishing the NMIS, and technical assistance difficulties. The fact that M&E investments were much heavier than under ESEP relates in part to the project's broader scope. Box C3 identifies M&E-related inputs, broadly defined, under almost all components.

Box C3. VTP II: M&E and ECD inputs
Organizational Strengthening. Training for about 300 NMYC/TESDA staff at central and local levels covering management and financial analysis; employment surveys; management of trade standards and certification; training needs surveys; and "testing and evaluation."
Development of an NMIS.
TESDA training in policy analysis; labor market analysis; computer operations; and data management.
Strengthening of Workforce Policy, Planning, and Coordination. Training in policy analysis and coordination; economics; statistics; workforce planning and training needs analysis; cost-effectiveness analysis; and impact evaluation. The training would benefit 72 overseas fellows and 350 local participants.
Improvement of Training Quality. Training in management and testing for the National Skills Certification programs.
Strengthening of the Trainer Development Program. Local technical assistance to help TESDA develop a plan for analyzing demand for trainers for center-based and firm-based training.
Strengthening of Guidance and Employment Services. Technical assistance to improve local labor market surveys and establish employment information systems; and the training of local staff in employment analysis and data management.
Monitoring and Evaluation of Training Performance. Technical assistance to develop systems to monitor and evaluate the performance of training facilities, staff, and students, including instruments to measure internal and external efficiencies.
Improvement of Cost Recovery. Technical assistance to improve financial analysis and accountability of training programs at all levels; and to conduct sample surveys and studies to determine the ability of trainees, firms, and communities to pay for training.
Evaluation of Apprenticeship Training. Support for field testing of new apprenticeship schemes and an impact evaluation of the Women in New Trades project.
Review of TVET System. A review of formal TVET programs at the secondary, post-secondary, technician, and teacher education levels in public and private sectors to determine relevance and effectiveness and develop a strategy; and a feasibility study for the establishment of technology training centers.

M&E Outcomes, Experience, and Issues
The projects' experience is discussed below under the four areas of responsibility to which M&E contributed: project management, sector management, regulation of sector training institutions, and sector policy development. While sector management and regulation are closely related, they are discussed separately because of the project's heavy emphasis on the quality enhancement and assurance objectives that pertain to the regulatory functions of sector management.

Project Management
Engineering and Science Education Project
ESEP developed tools for project monitoring that are now integrated into DOST's management capability and are used competently by unit staff responsible for project operations. As reported in the ICR, DOST managers have available a tailor-made "ESEP-MIS" as an addition to the department's existing systems. Computer-based project monitoring modules and databases are in place and are tailored to the project's management and reporting needs. The seven flagship universities are networked into the system. The six modules include data on scholarships, graduates, staff appraisal reports, and library materials procurement, a bidders' module aligned with recent procurement regulations, and a financial management system aligned with recent DBM and COA requirements. Project investments in developing the conceptual infrastructure for managing project information, and tailor-made training for staff, enabled the Project Management and Coordinating Office to establish itself as a competent unit.
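To make the modules just described more concrete, the sketch below shows one way a scholarship-monitoring record of this general kind could be structured, including a simple link between progress reporting and the release of funds. It is a minimal illustration under stated assumptions: the class names, fields, and the cleared_for_release rule are hypothetical and are not drawn from the actual ESEP-MIS design.

```python
# Illustrative sketch only: a minimal record structure of the kind the
# ESEP-MIS scholarship module might maintain. All names, fields, and the
# release rule below are hypothetical, not the actual ESEP-MIS design.

from dataclasses import dataclass, field
from datetime import date
from typing import List


@dataclass
class ProgressReport:
    period: str          # reporting period, e.g. "1998-S1" for the first semester
    submitted_on: date
    satisfactory: bool   # the school's own assessment of the scholar's progress


@dataclass
class ScholarRecord:
    scholar_id: str
    institution: str     # one of the participating schools
    program: str         # e.g. "MS Chemical Engineering"
    reports: List[ProgressReport] = field(default_factory=list)


def cleared_for_release(record: ScholarRecord, period: str) -> bool:
    """Release the next tranche only if a satisfactory report covers the period."""
    return any(r.period == period and r.satisfactory for r in record.reports)


if __name__ == "__main__":
    rec = ScholarRecord("S-0001", "UP Diliman", "MS Chemical Engineering")
    rec.reports.append(ProgressReport("1998-S1", date(1998, 7, 15), True))
    print(cleared_for_release(rec, "1998-S1"))   # True: report on file
    print(cleared_for_release(rec, "1998-S2"))   # False: no report yet
```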
A stated in the ICR, staff learned to coordinate project components effectively, overcome errors in dealing with funds, keep adequate records and produce regular progress reports. The institutionalization of project monitoring capabilities has been achieved despite DOST's lack of experience with Bank projects, delays in establishing the computerized system and the dismantling of the implementation unit at completion. Working groups have been established to ensure continuity. During the audit mission, project monitoring competencies were demonstrated. The department had prepared relevant reports on progress in the two years since project completion. 14 Discussions and interviews were issue-oriented, evaluative, and strategic. The computerized MIS was not fully installed until project closing due to changes in design specifications and problems with procurement and contractors. Consequently, project staff relied on available mechanisms to monitor and manage the project without much benefit from the new software programs. Since project completion, DOST has continued to develop the system. Even though, the agency ran a small scholarship program before the project, staff say that the development of the scholarship monitoring tools was one of the most useful project inputs. The project continues to require participating schools to report biannually on the progress of scholarship-holders and it releases scholarship money following submission of progress reports. Since the government currently favors student loans over scholarships as being more sustainable, it is not clear whether DOST will continue to develop the program in future. Second Vocational Training Project VTP II's success in establishing capacities in project monitoring and reporting, despite delays and constraints, was a substantial achievement. Staff reported to the audit mission that the monitoring process during implementation was a continual lesson in problem-solving. Successive managers faced several heavy constraints: the changeover from NMYC to TESDA; the continual redeployment of personnel at all levels; changes in procurement, financial management and auditing requirements; the mushrooming size of a diverse TVET subsector and changing demands for middle-level personnel in the face of technological progress. Despite these constraints, program managers at TESDA headquarters demonstrated to the audit mission that they use information effectively to plan, allocate, and monitor resources and produced accounts readily that met fiduciary obligations. They report that local level capacities, however, remain a major concern, especially the development of reliable databases that would facilitate decentralized management and planning. During implementation, procedural manuals were disseminated to local offices to clarify tasks and staff were trained in their use. At completion, the borrower recommended that future training programs should be tailored to more clearly defined priorities. However, interviews clearly suggest that training without a strong incentive system will not be sufficient to promote serious attention by local managers to establishing a performance culture. 14. ICR Table 5, Key Indicators for Project Implementation, is incomplete and includes inaccuracies and ambiguities. These were resolved in discussion with DOST staff. An updated and improved version is included in Annex *C. 
The ICR appears to have confused baseline indicators for 1990 and targets selected at appraisal with those revised during implementation to better reflect project's achievements. 25 Annex C The project has equipped TESDA with the computer hardware and software for an MIS that improves on the obsolete NMYC system. Once fully in operation, the system has potential to enhance the reliability and timeliness of routine monitoring and reporting. The strongest elements are the more modem computer hardware, software, and networking tools that are targeted at support to project and corporate management at TESDA headquarters. Databases include property management; human resources; procurement; financial management. During the audit mission, problems of compatibility and integration were still being resolved while additional licenses were awaited from a multinational vendor. Sector Management Engineering and Science Education Project The project was intended to contribute to education sector management but only in the limited areas of science and engineering education in the selected schools. The project established a library network among flagship schools and arranged for the schools' research and development projects to share laboratories and specialized facilities at several university centers. M&E mechanisms developed include the systems for monitoring laboratory equipment and maintenance needs, as well as scholarships. Criterion-based assessments of schools' performance also proved successful and has been further developed and adopted by CHED. In its feedback on the draft audit report, DOST states that it considers this to be a major project achievement. Success in developing up-to-date and reliable monitoring information on schools' performance has been substantial in the flagship schools but uneven in the weaker and more vulnerable schools. The project was only partially successful, however, in developing mechanisms to improve internal efficiencies since this requires schools to reorganize and downsize over- expanded programs. It is not clear that project funds have been withheld from schools that fail to meet performance criteria. The issue is difficult because these programs remain in high popular demand. They hold out prospects for the swelling numbers of students completing elementary and secondary education of a highly prized graduate degree status irrespective of the relevance of the field of study to employability or workforce needs. The successful use of performance monitoring criteria and mechanisms has encouraged DECS to develop them for wider application in science programs. This is good news since DOST would be unable to forge ahead alone. Project funds are no longer available and two proposals for follow- up projects have not been successful since they do not fully fit within the government's current priorities for poverty reduction and industry-financed human capital development. Even more importantly, DOST has no direct sector-wide jurisdiction except through its representation in the governance structures for education. Its effectiveness, therefore, depends on partnership with DECS and CHED and the schools. In higher education, the individual schools have considerable autonomy, especially the private schools. Moreover, in order to carry out the project's monitoring responsibilities locally, DOST depends heavily on local governments and education offices. A similar situation pertains to DOST's ability to monitor and intervene on the industry side. 
The original project did not include detailed mechanisms for demand-side interventions, focusing on a "supply-push" approach. Nevertheless, it explicitly sought to stimulate employers' interest in hiring graduates for science and technology development and to promote industry-led training provision and cost-sharing. The project has no direct jurisdiction, of course, over employers' hiring and training practices, which differ across the range of large and small enterprises in many industry segments. Rather, its ability to intervene successfully depends on partnership with representatives of industry, the professions, and labor organizations. In addition, failures in the market for science and technology graduates were beyond the project's control and were not fully acknowledged at the design stage. Nevertheless, despite limited means, the project performed exceptionally well in forging partnerships with industry champions who led the way in developing workplace training and research programs with flagship schools. As DOST's surveys show, however, many employers remain to be won over to the project's objectives, and industry's willingness to participate in monitoring the performance of training programs remains weak. Nevertheless, an important development to the project's credit is the increased scope and accuracy of databases on employers that tailor-made monitoring tools have facilitated. While much remains to be done to ensure comprehensiveness, prospects of sustainability are positive because the data also support DOST's broader mandates in science and technology development. Overall, through enhancing DOST's monitoring tools and improving the quality of its information on the demand and the supply side, the project has enhanced DOST's capacities to facilitate productive ties between employers and the schools.

Second Vocational Training Project
In 1994, the TESDA Act charged the agency it created with responsibility to ensure efficient management of the entire TVET system. This requires monitoring capacities and a performance culture at every level of TVET management. The project took on much of the challenge, although, clearly, it was beyond the scope of a single project. VTP II's main benefits have so far accrued to TESDA's central offices and its impact has been most evident in the two most responsive regions. Inevitably, much remains to be done to improve information and feedback mechanisms across a huge and sprawling TVET system. To be successful, continued efforts must include all stakeholders, including TESDA offices, local government units, and TVET training providers of all kinds in public and private sectors. In the ADB-financed follow-up project, local-level capacity development is a major objective. Headquarters managers say that monitoring responsibilities are increasingly included in the regular job descriptions for local managers. This is a sound step forward. They affirm, however, that continued efforts in communication and education are required to ensure that the purposes of TVET system monitoring tasks are understood and accepted. When local personnel do not see the use or benefit, monitoring and reporting are viewed as administrative tasks to support central offices. Moreover, the use of empirical information as a tool of good decision-making does not necessarily make much sense to local managers who operate in a highly politicized environment. The M&E framework for training sector management proved to be too complex.
However, a modified version has recently been approved as a basis for moving forward with TESDA's National Technical Skills Development Plan 2000-2004 (Box C2). This focuses on government goals for global competitiveness, rural development, and social integration, and places priority on nonformal training and services to the poor and women. The elements of the M&E framework to support the plan will include three subsystems: TVET databases (trainees, providers, and programs); TVET performance monitoring and evaluation (internal and external efficiency of training operations); and the NMIS (TESDA's direct activity in technical education and skills development). The monitoring systems set in motion under the project form a sound basis for TESDA to make progress in areas of weakness.

Box C2. National Technical Education and Skills Development Policies, 2000-2004
* Upgrade the quality and raise the productivity of Philippine middle-level personnel to be globally competitive.
* Rationalize the roles and functions of TESDA in the overall management of the middle-level skills development subsector.
* Maximize the roles and contributions of industry and other private partners in the planning, management, and delivery of education and training.
* Utilize the comparative advantage of middle-level skills development in the promotion of social integration and rural development.
* Elevate the prestige of middle-level skills as a viable occupational career.
* Adopt a comprehensive plan for devolving major responsibilities of training to local government units and other stakeholders.
* Emphasize the development of an entrepreneurial culture in the education, training, and employment of middle-level personnel.

Regulation of Sector Training Institutions
Engineering and Science Education Project
ESEP succeeded in developing and testing criteria and mechanisms to strengthen performance standards in project school programs and had some success in disseminating them more widely (Box C3). The project also had a positive impact on flagship schools, which began to welcome new approaches to quality assurance as professional practice.

Box C3. Improving Performance in Engineering Education: Strategies and Achievements
* The project funded exploration of accreditation systems under the National Engineering Center and the University of the Philippines and succeeded in devolving ownership of the selected scheme to engineering professionals. The "Peer Evaluation Process" was selected and later strengthened under the Foundation for Engineering Education. It was piloted in seven colleges and introduced in final form in one. The scheme is voluntary and the pace of adoption is slow, largely because of the absence of strong incentives for participation.
* ESEP had modest impact in facilitating the emergence of common standards for engineering education. In two regions, it encouraged the development of two consortia of engineering universities and colleges. The consortia strengthened ties between academe and industry, pooled resources, and achieved common standards for masters degree programs.
* The project expanded DOST's practice of working through quality control and technical panels (those in science, engineering, and agriculture) as mechanisms to support its responsibilities in evaluating the performance of project schools. As the ICR reported, the device of ensuring peer, expert, and industry representation was successful.
The project demonstrated the effectiveness of a management tool that could help promote self-regulation by educational institutions. * ESEP developed the 'flagship" mechanism for setting performance standards based on the PCASTRD prototype. This has motivated project schools to strive for and maintain performance levels and has raised the bar for others through the demonstration effect. DECS and CHED have incorporated lessons from the project in developing similar mechanisms for the entire education sector. If successful, the interventions will be powerful tools for providing quality assurance services, results- based management, and the allocation of budgetary resources to improve school effectiveness and system efficiency. 28 Annex C From the standpoint of promoting objectivity and fairness in quality assurance, the design of the flagship mechanism was flawed. DOST affirms that the flagship schools, especially in the science sector, were selected according to the accreditation standards of the Philippine Council for Advanced Science and Technology Research and Development. The appraisal report, however, neglected to make this clear and did not give the precise criteria to be used to determine the flagship and near-flagship status of project schools. The criteria were developed during implementation, but the project missed the opportunity to fully expand them, codify good practice in applying them, and disseminate them across the school system. At completion, the project chose not to publish the performance levels of the individual schools that benefited from project inputs. Currently, the question of how far such transparency would be effective or counterproductive is under discussion. Project staff express strong support for a rather less risky approach to quality promotion. Publicity has been positive for schools whose students have won international and local awards for high achievement in science education and science and technology-related research and is considered a means of stimulating healthy competition. Second Vocational Training Project A core mandate for TESDA is to encourage TVET providers to adopt voluntary self-regulation. Training providers are encouraged to adhere to recognized national and international standards for occupational skills certification. The project made progress in developing some of the basic tools for self-regulation. It did not, however, cover anywhere near the entire range of skills and occupations and could not have expected to do so. To support training providers, the project developed prototype occupational maps, skill equivalencies; competency based testing tools, skill-certification programs, and accreditation services. It provided the system with large numbers of trained personnel in these areas. For employers it developed licensing and registration schemes. TESDA found that the participation of employers in developing standards was an important factor in motivating them to adopt self-regulation. The construction industry was an early adopter and has become a champion. Overall, however, progress with providers and employers has been slow in the absence of strong incentives for self-regulation and considerable business risk. Furthermore, continuing efforts to mainstream self-regulation among training providers must address the thorny problem of how to maintain the relevance of the certification system for technical skills in rapidly changing employment markets. 
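As an illustration of the kind of structure implied by the competency-based testing and skills-certification tools discussed in this section, the sketch below maps a hypothetical occupational standard to unit test scores and a certification decision. The standard, its units, and the pass mark are invented for illustration and do not represent TESDA's actual certification scheme.

```python
# Illustrative sketch only: one way the competency-based testing and
# certification tools discussed above might organize their data. The
# standard code, units, and pass mark are invented, not TESDA's scheme.

from dataclasses import dataclass
from typing import Dict, List


@dataclass
class CompetencyStandard:
    code: str                 # hypothetical occupational standard, e.g. "WELD-2"
    required_units: List[str]
    pass_mark: float          # minimum score required on each unit (0-100 scale)


def certify(standard: CompetencyStandard, unit_scores: Dict[str, float]) -> bool:
    """Certify a candidate only if every required unit meets the pass mark."""
    return all(unit_scores.get(unit, 0.0) >= standard.pass_mark
               for unit in standard.required_units)


if __name__ == "__main__":
    weld2 = CompetencyStandard("WELD-2", ["safety", "arc welding", "blueprint reading"], 75.0)
    print(certify(weld2, {"safety": 90, "arc welding": 80, "blueprint reading": 76}))  # True
    print(certify(weld2, {"safety": 90, "arc welding": 60}))                           # False
```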
Sector Policy Development
Engineering and Science Education Project
Before the project, DOST already had a prominent voice on science and technology policy at the highest levels of the administration and government. It chairs the Science and Technology Coordinating Committee, and its representation on the new National Coordinating Council for Education places it in a pivotal position as an advisor on industry training policies. The project's contribution was to add authority to the department's advice on the issues affecting the school system. The M&E activities under ESEP produced empirical findings that allow senior managers to offer independent and impartial advice on policy options affecting the education sector. This is very important in a highly politicized school system. The project provided opportunities for staff to become knowledgeable about science and engineering training programs in the school system through working closely with the Council of Engineering Deans, school managers, scientists, and engineers. As the ICR reported, various entities representing education and industry contributed substantially, among them the project Steering Committee and the Advisory Group. The project strengthened communication and advisory channels across bureaucratic boundaries, forging partnerships with CHED, DECS, PRC, and major public and private schools. The project promoted lively policy discussion in NEDA between staff in social sector and industry units. The difficult issue of industry participation in science and technology workforce development was prominent in NEDA debates on DOST's proposals for follow-up projects. The original project provided little in the way of a policy evaluation component and included only one policy study,15 though DOST mounted several surveys of graduates and employers. Despite the lean provision for formal studies, the project's Policy Action Plan succeeded in focusing the attention of all the stakeholders on three important and difficult areas that called for intervention: the quality and relevance of advanced training in science and engineering; the mismatches between the supply and demand for skills; and the cost-efficiency and financial management capacities of the schools. As discussed above, the project was most successful in promoting higher standards of training provision and less successful in achieving internal efficiencies and restructuring. As a result of the project's policy action plan, DOST has much of the information necessary for it to engage in policy dialogue on these contentious issues. Currently, the department is collaborating with the education sector in an evaluation of the effectiveness of interventions in science teaching. The aim is to intervene in the 25 percent of secondary schools whose students performed least well in an international assessment of science learning achievement. Also, the department is currently developing an analysis of employers' demand for engineering graduates and researchers in the regional labor markets served by the various schools. Some of the themes for policy analysis and evaluation that arose during mission discussions are outlined in Box C4. The items focus on several important questions. Would the policy issues have been resolved without the project interventions? Would they have been more difficult to resolve without these interventions? In what ways have the interventions made a positive difference? How would they be designed and implemented in the light of experience?
While the completion report did not cover these larger questions, some of them have been explored in preparation for the follow-up project. Disseminated in imaginative ways, such analyses, based on the partnership that has evolved between DOST and the education system, will help sustain the project's effectiveness in harmonizing education and industry policies. Box C4. Policy Evaluation Themes arising from the ESEP Experience Strengths and weaknesses of the supply-push approach to workforce development. School-by-school analysis of achievements and constraints in restructuring the supply of engineering graduates and researchers. Impact of the training of teachers and researchers on student performance levels. Relevance of teaching and research programs to industry employers. The effect of interventions on schools' resource allocation and financial viability. Sector-wide production of graduates in various fields. Graduate unemployment and over-qualification. Graduate placement, pay, productivity, and career prospects in science and technology employment. Industry needs for science and technology workers in less-developed regions and the mobility of trained graduates. 15. The study was on the S&T industries and focused on preparation of the proposed follow-up project. Three other studies financed under the project were relevant to operational matters including the information management infrastructure. 30 Annex C Second Vocational Training Project In 1994, TESDA was charged with developing a policy agenda to reshape the TVET system for the future. Six years later, the agency is equipped with a reform program focused on achieving a "quality assured technical education and skills development system." The policy framework (Box C2) is in place primarily due to new vision and leadership in TESDA and is only partly attributable to VTP II. Like DOST, TESDA is prominently represented at the highest levels of the administration and government. During implementation, the original focus on developing policies for nonformal TVET and middle-level skills development was diverted by the many constraints faced by the project. Nevertheless, in only one or two years, the project has developed the framework for new policies that are refocused once more on original objectives. The focus on non-school based training provision is likely to have a more immediate impact on the employment and income of the rural poor and women. However, the outstanding policy issue under discussion is how the transition is to be achieved from a 19th and 20th century model of vocational and technical training to one appropriate to the 21St century. While craft and manual skills training still have a place in the less-developed regional economies, the new education policy will focus on equipping future workers with sound basic education that will enable them to succeed in modern workplaces demanding scientific and technological skills. Yet industrial modernization in the country is not yet far advanced and employers are not universally convinced that new entrants to the workforce who possess "middle-level" technical skills will make better workers than current employees. This policy issue is hotly debated and extremely complex. TESDA's senior managers have been highly visible in policy debate and will require a great deal of support from policy analysts in moving in directions on which both industry and the school system can agree. 
The project was successful in familiarizing over 6,000 TESDA and TVET managers, staff, and partners with policy and cost analysis. The institutional development component alone financed 57 bachelor degrees, 141 masters degrees, and 17 doctorates and by 2000, 40 masters degrees and 3 doctorates had been awarded. This far exceeded targets. According to the feedback routinely requested from participants, the training equipped personnel with relevant knowledge and skills-- from data processing and technology management to research and evaluation concepts and methods, including designing and managing field surveys. The expansion of the original training program occurred because of the need to upgrade ex-NMYC staff and train new staff to replace skills lost due to turnover and extended absence. In addition, repetitiveness and redundancy crept in because the policy direction was not yet established. Lacking a policy framework and a tight link between training and tasks on the job, the investment in human resource development has not developed a critical mass of analytical capacity that TESDA's decision-makers' need. The project was successful in producing 12 major studies, almost all commissioned to local or foreign consulting firms and individuals. TESDA staff say that they learned much about interpretation of data and policy analysis from teamwork with specialists. According to the ICR, the studies were of high quality and went beyond the original policy issues identified. However, as the borrower's contribution to the ICR noted, the absence of a policy framework reduced their relevance, while research findings were difficult to translate into policies, programs, and actions. While the audit confirms that the majority of the studies were of high technical quality, they mainly added to TESDA's inventory of knowledge on the characteristics and functioning of the TVET system. They were probably most helpful to Bank staff and new TESDA managers who sought to comprehend the complexities of the TVET sector but they were not particularly focused on guiding sector policy. Project experience suggests that strong competencies in selecting and supervising consultants, as well as the policy framework, are important elements of evaluation 31 Annex C capacity development. This is an important point because TESDA has chosen to rely not on in- house capacity but on independent evaluation studies done by consultants. Box C5. TESDA Comments on OED's Evaluation of the Studies Program TESDA comments that the reorganization from NYMC to TESDA meant that the focus of the studies was broadened from nonformal training to study of the entire technical and vocational sector. Though study recommendations did not immediately lead to policy decisions, they stimulated thinking, discussion and understanding of the sector and prompted the move to involve the private sector in services and delivery. TESDA points to several interventions that resulted for analysis of sector needs including the Training Contract Scheme, the Training Assistance Contract and the Trainers' Development Programs. In-house capacities in policy analysis were developed among members of the Technical Education and Skills Development Committee and other partners at regional and provincial levels and this helped to ensure that they completed their skill priorities and plans for the national plan for middle-level manpower development. Policy analysis was done mainly through the TVET forum and other consultation. 
TESDA affirms that the various studies helped TESDA get an increasing share in the appropriation from national government funds.

In the final years of implementation, evaluative work became more focused on seeking answers to policy questions from the project's experience. The prospect of an early launch of the ADB project appears to have been a powerful incentive, alongside the government's new policy thrust. In 1999, TESDA commissioned an impact evaluation that goes beyond an account of inputs and outputs. While not using sophisticated statistics, the study is an example of clear links between research findings and evaluative judgements and is concise and readable. A forward-looking study of the decentralization process has provided detailed surveys of the progress in each region and will also be directly useful. In addition, based on the lessons from the various retrospective reviews and new policy directions, the agency commissioned a detailed plan and budget for operations in the transitional months between VTP II and the launch of the ADB project. The project was based on the reasonable assumption that effective policy development and evaluation requires a repository of accessible and up-to-date information on the TVET system and employers. While the purpose of the NMIS was not fully elaborated during project preparation, it was clearly intended, according to the appraisal report, to aid in labor market analysis as a support for sector policy development. During implementation, the project put in place the conceptual architecture and the physical infrastructure for the information systems. Data from central sources are now available. It did not succeed in assembling and inputting the content of databases related to the TVET system and employers from local levels. This was mainly due to the challenges of gathering reliable data from local sources and the fact that tools for TESDA's corporate and project management were a more pressing need. TESDA staff are keen to establish a multidimensional and fully networked capability at all levels of the TVET system, an ambitious goal. Over the next five years, the ADB project will support development of an "Educational Management Information System" (EMIS), particularly at local levels. The EMIS is viewed as a support to TESDA in fulfilling all its mandates. Though details have yet to be worked out, ADB views this $1 million investment as an extension to existing MIS capacity and not as an entirely new system.

Annex D. Comments from the Borrower

Republic of the Philippines
DEPARTMENT OF SCIENCE AND TECHNOLOGY

January 24, 2001

MR. ALAIN BARBU
Manager
Sector and Thematic Evaluation Group
Operations Evaluation Department
The World Bank
Washington, D.C., USA

Dear Mr. Barbu:

Respectfully forwarded are the comments of this office on the Draft Performance Audit Report (Re: Philippines: Engineering and Science Education Project and Second Vocational Training Project).

1. Annex B, p. 17: request to clarify/qualify the "U" rating (is this uncertain or unsatisfactory?) for the Implementation Status under Supervision for September 1994;

2. Annex C, p. 28, 1st paragraph: selection of the flagship schools, particularly for the science sector, was based on the accreditation standards of the implementing council/agency, which is the Philippine Council for Advanced Science and Technology Research and Development (PCASTRD).
It should be noted that PCASTRD started its own scholarship program, called the Manpower Development Program (MDP), in 1989 and has already established its "network schools".

3. Suggested inclusions for achievements in Science:
a. Establishment of a library network among ESEP flagship schools;
b. Establishment of laboratory/shared-facility institutions, e.g. the National Chemistry Instrumentation Center (NCIC) at Ateneo de Manila University (ADMU), the Computational Science Research Center (CSRC) at the University of the Philippines in Diliman (UPD), and the Condensed Matter Physics Laboratory (CMPL), also at UPD, which cater to the needs of the R&D projects conducted by the ESEP flagship schools and other DOST network institutions.

4. Significant contribution of the ESEP: establishment of M&E for criterion-based assessment of ESEP schools' performance that has been further developed and is now adopted by CHED.

The findings on DOST's M&E capacity development are factual and fair. Overall, we find that the said Draft Performance Audit Report appropriately reviewed all components of ESEP. May you find these inputs sufficient for your consideration.

Very truly yours,
DR. ROGELIO A. PANLASIGUI
Acting Secretary and Undersecretary for R&D

Postal Address: P.O. Box 3596 Manila
Head Office: Bicutan, Taguig, Metro Manila
Cable Address: SCIENCE MANILA
Tel. Nos. 823-80-71 to 82
Telex No. (75) 66819
Fax No. (632) 823-8937

TECHNICAL EDUCATION AND SKILLS DEVELOPMENT AUTHORITY (TESDA)
South Superhighway, Taguig, Metro Manila, Philippines

FACSIMILE TRANSMITTAL SHEET
Date: January 31, 2001
FOR: ALAIN BARBU, Manager, Sector and Thematic Evaluation Group, Operations Evaluation Department, WB-IBRD/International Development Association, World Bank, Washington, DC. Fax No. (202) 522-3123
FROM: EDICIO G. DELA TORRE, Director-General. Fax No. (632) 893-21-28
RE: SECOND VOCATIONAL TRAINING PROJECT (Cr. 2392-PH), Comments on the Draft Performance Audit Report (PAR)

Dear Mr. Barbu:

This refers to your letter dated December 19, 2000, which was received only last December 28, 2000. Attached are our comments on the VTP II draft PAR for inclusion in the final report. Thank you and best regards.

EDICIO G. DELA TORRE
cc: WB Manila Office

Comments on VTP II Draft Performance Audit Report (PAR)

Reference: Page 3, Par. 1
Statement: "It was less successful in producing evaluation findings and analytical research of high relevance in policy formulation in non-formal training and employment services. The large training and studies programs were less efficient and effective than anticipated in producing in-house capacities for policy analysis."
Comment: The reorganization from NMYC to TESDA has created the imperative that the focus of the VTP II studies need not be confined to that which is only useful for non-formal training. It had to broaden the scope of its outlook, as there was a need to study the entire field of technical vocational education and training. While some of the recommendations of the various studies did not immediately result in policy decisions, they did start off the thinking and discussions, which eventually led to a greater understanding of the TVET sector and the more definitive move to involve the private sector in the various processes in TVET services and delivery and in the development of the TVET reforms.
Likewise, it is not that in-house capacities for policy analysis were not developed; rather, it was on purpose that we had to get champions and stakeholders for ownership and more effective implementation. The project was able to develop the capacities of the Technical Education and Skills Development Committee (TESDC) members at the regional and provincial level and other partners. The capability build-up greatly helped to ensure the completion of provincial and regional skills priorities and plans and the National Technical Education and Skills Development Plan (NTESDP). The NTESDP provides the overall framework and directions for the development of middle-level manpower. The TESDCs, on the other hand, are advisory bodies on policies and programs of TESDA at the sub-national level. Our policy formulation processes involve our stakeholders. Policy analysis was done largely through the TVET Forum and other consultation. The various studies have also supported TESDA in its effort to get an increasing share in the appropriation from national government funds.

Reference: Page 4, Par. 1
Statement: "The original focus for M&E in the project was labor market analysis, but this expectation was not clearly defined and has been only modestly fulfilled."
Comment: The Monitoring and Evaluation of Training Program (METP) subcomponent in VTP II was primarily aimed at developing an M&E system for TVET to strengthen training center management and operations and improve the quality of training activities. As such, while M&E system results are an input to labor market analysis, it was not designed to be for labor market analysis. We agree with the need for monitoring and evaluation for labor market analysis. While we still have to see the full functioning of the NMIS as a tool for labor market analysis, it does have features that can be used for labor market analysis.

Reference: Page 5, Box 1
Statement: Implementation Performance under Sector Management and Development of Sector Policies, both ratings, Moderately Unsatisfactory, on the areas "Providing relevant and accurate information on the performance of providers of training and related occupational services" and "Analyzing sector issues and needs," respectively.
Comment: In terms of providing accurate information on the performance of the providers of training and other occupational services, this was not part of the design of the VTP II project, and as such the project should not have been assessed in this area. There were efforts at producing an inventory of providers and their services, as part of the regular programs of TESDA that were running at the same time as VTP II. VTP II was implemented as a complement to the other efforts of the organization in monitoring and evaluation. The baseline studies conducted as part of the sector studies did produce baseline information on TVET institutions. In terms of the analysis of the sector issues and needs, we think we should at the minimum be rated satisfactory in this area. For as a result of the analysis of the sector needs and issues, the project had several intervention measures, such as the Training Contract Scheme, the Training Assistance Contract, and the Trainers' Development Programs, which were vigorously pursued. The studies conducted under the project have likewise analyzed the various issues and needs of the sector. These issues were used as inputs and discussed in the TVET Forum, from where policies were enunciated and eventually implemented.
Reference: Page 30, Annex C
Statement: "While the audit confirms that the majority of the studies were of high technical quality, they mainly added to TESDA's inventory of knowledge on the characteristics and functioning of the TVET system. They were probably most helpful to Bank staff and new TESDA managers who sought to comprehend the complexities of the TVET sector but they were not particularly focused on guiding sector policy."
Comment: The studies were basically done to provide empirical data and serve to provide management with a clear understanding of the sector so that subsequent action could be taken. Taken in this light, the studies served as inputs to guiding TESDA's decisions on action that affects the sector. Further, the studies were used in various fora, in particular the TVET Forum, from where the policies and programs were initiated/drawn.

9 February 2001

Mr. Alain Barbu
Manager
Sector and Thematic Evaluation Group, Operations Evaluation Department
World Bank

Dear Mr. Barbu:

This has reference to your letter requesting comments on the Audit Mission Report for the Engineering and Science Education Project (PH-3435) and the Second Vocational Training Project (PH-2392). We find the report comprehensive; however, we would like to suggest some points to be included in the report.

1. We suggest that a summary of major findings presented in matrix form should be attached to the report. The matrix should include lessons learned, gaps, recommendations for improvement, and areas for possible assistance. This will make the report user-friendly for any possible intervention, e.g. decision making or as a basis for subsequent projects to be designed.

2. Though the Bank emphasized its initiative on results-based management, there were no substantial findings from the two projects to support this initiative. It is worth mentioning that the current Government thrust is also on results-based performance and monitoring. With the ICC guidelines incorporating results monitoring and evaluation in proposed projects, this initiative was further emphasized.

3. There was a mention in the report that during the project design stage the M&E component had no clear objective and direction, thus delaying its implementation. Likewise, institutionalization/sustainability of M&E was rated only as likely, and there are ample findings in the report to back this up. However, no recommendations were presented to address this issue. Moreover, engendering evaluation activities and optimizing the use of M&E results are issues to be addressed in designing M&E. It is further suggested that the recommendation should include how to address these issues.

4. Monitoring/follow-up is an important component of M&E for both projects and has been an issue still being encountered by TESDA. The report likewise stressed the need for this to assess the effectiveness of the training programs. This also needs to be addressed in the recommendation.

5. We agree with the Mission's findings that there "is a rich stock of local experience and wisdom" in monitoring and evaluation. However, these have to be disseminated and translated into a more responsive M&E in the design of future projects.

6. We hope you find our comments in order. Best regards.

Very truly yours,
JOSE S. MONTERO
Officer in Charge