The World Bank
Human Development Network - Education
System Assessment and Benchmarking for Education Results (SABER)

SAINT LUCIA
Education Management Information System (EMIS)
COUNTRY REPORT

Emilio Porta, Jennifer Klein, Gustavo Arcia and Harriet Nannyonjo
February 2012

Acknowledgements

This report was prepared by a team led by Emilio Porta, Senior Education Specialist at the Human Development Network/Education at the World Bank, and consisting of Gustavo Arcia, Consultant to the Human Development Network/Education of the World Bank and Senior Economist at Analítica LLC in Miami, Florida; Jennifer Klein, Consultant to the Human Development Network/Education at the World Bank; and Harriet Nannyonjo, Senior Education Specialist, LCSHE, World Bank. The report was prepared under the guidance of Elizabeth King, Robin Horn and Chingboon Lee. The views expressed here are those of the authors and should not be attributed to the World Bank Group. All data contained in this report are the result of collaboration between the authors, the Organization of Eastern Caribbean States, and participants in the benchmarking exercise. All errors are our own.

This benchmarking study arose from an active partnership between the Education Reform Unit of the Organization of Eastern Caribbean States (OECS) and the World Bank. The benchmarking exercise was done during an OECS workshop conducted in Castries, St. Lucia, from January 23 to January 28, 2011, with the participation of government officials from Antigua & Barbuda, the Commonwealth of Dominica, Grenada, St. Kitts & Nevis, St. Lucia, and St. Vincent & the Grenadines. A delegate from Montserrat also attended as an observer. The workshop and benchmarking exercise were done under the invaluable leadership of Marcellus Albertin, Head of the Education Reform Unit (OERU) at the OECS.
His unflagging support, enthusiasm, and institutional supervision were fundamental to the cooperation of all participants and to the success of the workshop. To him we owe a great deal of gratitude. We would like to thank the OERU staff who helped us with workshop logistics, especially Emma Mc Farlane-Jouavel and Beverly Pierre. We would also like to thank the workshop participants: Doristeen Etinoff, Priscilla Nicholas, and Patricia George from Antigua & Barbuda; Ted Serrant, Robert Guiste, and Weeferly Jules from Dominica; Pauleen Finlay, Michelle Peters, and Imi Chitterman from Grenada; Gregory Julius from Montserrat; Quinton Morton, Ian Gregory, and Laurence Richards from St. Kitts & Nevis; Kendall Khodra, Nathalie Elliott, Sisera Simon, Evariste John, and Valerie Leon from St. Lucia; Dixton Findlay, Keith Thomas, and Junior Jack from St. Vincent & the Grenadines; and Darrel Montrope, Jacqueline Massiah, Sean Mathurin, and Loverly Anthony-Charles from the OECS.

Abbreviations

EMIS    Education Management Information System
MOE     Ministry of Education
OECD    Organization for Economic Cooperation and Development
OECS    Organization of Eastern Caribbean States
SABER   System Assessment and Benchmarking for Education Results
SEAT    SABER EMIS Assessment Tool
UIS     UNESCO Institute for Statistics
UNESCO  United Nations Educational, Scientific and Cultural Organization

SAINT LUCIA: ESTABLISHED

Aspect of Data Quality      Benchmark
Prerequisites of Quality    Established  ¤¤¤¢
Assurances of Integrity     Established  ¤¤¤¢
Methodological Soundness    Established  ¤¤¤¢
Accuracy and Reliability    Emerging     ¤¤¢¢
Serviceability              Established  ¤¤¤¢
Accessibility               Emerging     ¤¤¢¢

BACKGROUND

Education Data in St. Lucia

With the growing demand for timely and accurate data, the Ministry of Education (MOE) in St. Lucia embarked on a project to implement an Education Management Information System (EMIS) for all schools across the island.
Due to financial constraints, it was initially implemented in public secondary schools, but has expanded over time to include primary schools and Sir Arthur Lewis Community College.

EMIS STAFF. At the national level, no additional staff members have been hired to manage the operation of the EMIS, but each secondary school has one additional staff member who was employed and trained to coordinate EMIS activities at the school. Principals and teachers of primary schools were also trained to support the collection of education data through a series of workshops on the Intime Program.

EMIS DATA. St. Lucia collects data on primary and secondary schools annually, including:

• School background information
• Student data: enrolment, repeaters, dropouts, transfers, graduates and distribution by age
• Staff data: teachers employed, non-teaching staff, teacher training and teacher movement
• Conditions of buildings, furniture and equipment
• Revenue and expenditure

In addition, Attendance Forms are collected each month that include:

• Daily student attendance by grade and gender
• Teacher attendance and punctuality, which are measured on both an actual and an official basis

FACILITIES AND EQUIPMENT. Computers are available in both primary and secondary schools, and additional computers were made available at secondary schools for EMIS use. Network infrastructure at most secondary schools needs repair and upgrading. Electronic EMIS databases and data files reside on the senior statistician’s machine, with access privileges granted to other statisticians. Files containing confidential information are password protected. Hard copies of data are stored in filing cabinets under lock and key.

DATA COLLECTION. Data is collected from all learning institutions in St. Lucia. St. Lucia’s EMIS connects all public secondary schools with the Maplewood Software, 40 of the 75 public primary schools with the Intime Program, and the Sir Arthur Lewis Community College with the Sonis System.
Schools with EMIS facilities may submit data electronically. Annual Education Census Questionnaires are sent out in October of each year with the specification that the data submitted must reflect what exists on October 31st of that year. Attendance data is also collected on a monthly basis using standard attendance forms.

DATA PROCESSING. Data is submitted to district offices, where it is verified and validated before being forwarded to the MOE. Verification and validation are also done at the MOE, and the MOE statisticians address any errors or inconsistencies they find. After the data are verified and validated, they are aggregated and analyzed in Excel. Because of limited personnel, attendance tables are generated each term or sometimes each year.

PUBLICATIONS. Two main EMIS documents are published on an annual basis: 1) the Education Digest and 2) the Attendance Report, which is not available to the public. Copies of the Education Digest are distributed to the data providers, including schools, MOE departments, all government ministries, and local, regional and international organizations. The report can also be downloaded online through the Central Statistics Office and MOE websites.

The EMIS in St. Lucia

ESTABLISHED

[Figure 1. SABER EMIS Scores in the OECS]

In January 2011, St. Lucia’s EMIS was assessed using the SABER-EMIS Assessment Tool (SEAT) and, overall, the EMIS was categorized as ESTABLISHED (0.63). Among the six Organization of Eastern Caribbean States (OECS) countries, St. Lucia’s score was ranked third behind Dominica (0.65) and St. Kitts and Nevis (0.65). St. Lucia had the highest score of the OECS countries on Assurances of Integrity (0.64) and outperformed the OECS average on all of the SEAT’s Aspects of Quality except 1) Methodological Soundness (0.67) and 2) Accuracy and Reliability (0.58). St. Lucia’s lowest score was on Accessibility (0.56), but the score was still above the OECS average.
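The benchmark bands used throughout this report are given in the legend of Table 1 (Latent 0 – 0.3, Emerging 0.31 – 0.59, Established 0.6 – 0.79, Mature 0.8 – 1). As a rough illustration of how the scores fit together, the sketch below maps a score to its band and aggregates St. Lucia’s six Aspect-of-Quality scores with an unweighted mean. The mean is our assumption rather than a documented SEAT formula, though it does reproduce the overall score of 0.63 reported above.

```python
# Illustrative sketch only: the band cut-offs come from the Table 1 legend;
# the unweighted-mean aggregation is an assumption, not a documented SEAT rule.

def benchmark(score: float) -> str:
    """Map a SEAT score in [0, 1] to its benchmark category."""
    if score <= 0.30:
        return "Latent"
    if score <= 0.59:
        return "Emerging"
    if score <= 0.79:
        return "Established"
    return "Mature"

def aggregate(scores: list[float]) -> float:
    """Combine sub-scores into one score (assumed: unweighted mean, 2 d.p.)."""
    return round(sum(scores) / len(scores), 2)

# St. Lucia's six Aspect-of-Quality scores from Table 1:
aspects = [0.64, 0.64, 0.67, 0.58, 0.68, 0.56]
overall = aggregate(aspects)
print(overall, benchmark(overall))  # prints: 0.63 Established
```

The same mapping applies at the sub-component level: for example, the 0.00 on monitoring the quality of data processes falls in the Latent band, while the 1.00 on staff and facilities is Mature.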
The next sections of this country report analyze St. Lucia’s performance on the sub-components of each Aspect of Quality in order to present a detailed portrait of the strengths and weaknesses of St. Lucia’s EMIS, along with concrete actions that the country can take to improve education data quality.

Table 1. SABER EMIS Scores in the OECS Countries (2011)

Aspect of Quality           Dominica  Antigua  Grenada  St. Kitts  St. Vincent  St. Lucia  OECS Average
Prerequisites of Quality      0.70     0.52     0.68      0.66        0.45        0.64        0.61
Assurances of Integrity       0.58     0.53     0.61      0.44        0.50        0.64        0.55
Methodological Soundness      0.83     0.50     0.67      0.67        0.83        0.67        0.69
Accuracy and Reliability      0.70     0.48     0.58      0.75        0.53        0.58        0.60
Serviceability                0.61     0.29     0.50      0.79        0.43        0.68        0.55
Accessibility                 0.47     0.47     0.69      0.61        0.36        0.56        0.53
Overall                       0.65     0.46     0.62      0.65        0.52        0.63        0.59

Benchmark bands: Latent 0 – 0.3 | Emerging 0.31 – 0.59 | Established 0.6 – 0.79 | Mature 0.8 – 1

PREREQUISITES OF QUALITY

ESTABLISHED

[Figure 2. Prerequisites of Quality]

St. Lucia has ESTABLISHED (0.64) the Prerequisites of Quality (Figure 2) necessary to support an EMIS and was the only OECS country to have staff, facilities, computer resources, and financing commensurate with EMIS activities (Table 2, 0.5). Laws exist to protect the confidentiality of individual/personal data, and individuals are informed of their rights (0.3). Laws also exist to establish the collection and dissemination of education data, but responsibilities for these actions are not clearly defined by the laws (0.1). Institutions are legally obligated to share data with the MOE, but no penalties are established if institutions fail to report (0.4). No formal agreements exist to ensure data sharing and coordination among agencies, but agencies informally share data and collaborate (0.2). St. Lucia’s only LATENT score resulted from a lack of processes to monitor the quality of data processes (0.9): no formal reviews or external reviews are carried out, and user feedback on quality is not collected.
Quality is a main objective, but it is not enforced by management (0.8). St. Lucia could further establish the Prerequisites of Quality by formalizing EMIS responsibilities and informal agreements and by focusing more on quality.

Table 2. Prerequisites of Quality: Subcomponents

No.  | Subcomponent | Benchmark | St. Lucia | OECS Average
0.1  | Responsibility for collecting and disseminating education data is clearly specified | Established | 0.75 | 0.75
0.2  | Data sharing and coordination among different agencies are adequate | Emerging | 0.50 | 0.50
0.3  | Individual/personal data are kept confidential and used for statistical purposes only | Mature | 1.00 | 0.79
0.4  | Statistical reporting is ensured through legal mandate and/or measures to encourage response | Established | 0.75 | 0.58
0.5  | Staff, facilities, computing resources, and financing are commensurate with the activities | Mature | 1.00 | 0.63
0.6  | Processes and procedures are in place to ensure that resources are used efficiently | Established | 0.75 | 0.63
0.7  | Education statistics meet user needs and those needs are monitored continuously | Established | 0.75 | 0.75
0.8  | Processes are in place to focus on quality | Emerging | 0.50 | 0.63
0.9  | Processes are in place to monitor the quality of data processes | Latent | 0.00 | 0.33
0.10 | Processes are in place to deal with quality considerations in planning the statistical program | Emerging | 0.50 | 0.58
0.11 | Mechanisms exist for addressing new and emerging data requirements | Emerging | 0.50 | 0.54

ASSURANCES OF INTEGRITY

ESTABLISHED

[Figure 3. Assurances of Integrity in the OECS]

St. Lucia scored the highest of all the OECS countries on the Assurances of Integrity and was classified as ESTABLISHED (0.64). Choices of sources, statistical techniques and decisions on dissemination are sound (Table 3, 1.3), and advance notice of major changes in methodology, source data, and statistical techniques is usually given in publications (1.8).
Only informal mechanisms protect the professional independence of the data-producing institution, but EMIS staff are aware of established ethical practices and generally adhere to them (1.1). The terms and conditions under which statistics are collected, processed, and disseminated are difficult to find, but they are available to the public (1.5). The statistical agency also comments publicly on technical errors, provides technical explanations, and comments on major misinterpretations (1.4). The professionalism of EMIS staff is currently MATURE because of established guidelines for staff behavior that are actively enforced (1.9). Also, while staff are recruited and promoted based on professional credentials, professionalism could be further promoted by encouraging staff to publish and by establishing a peer review process (1.2).

Table 3. Assurances of Integrity: Subcomponents

No.  | Subcomponent | Benchmark | St. Lucia | OECS Average
1.1  | Statistics are produced on an impartial basis | Emerging | 0.25 | 0.38
1.2  | Professionalism of staff is actively promoted | Emerging | 0.50 | 0.42
1.3  | Choices of data sources and statistical techniques are made solely on the basis of statistical considerations | Established | 0.75 | 0.83
1.4  | Agency is entitled to comment on erroneous interpretation and misuse of statistics | Established | 0.75 | 0.58
1.5  | Terms and conditions are available to the public | Emerging | 0.50 | 0.33
1.6  | Public is aware of internal governmental access to statistics prior to their release | Emerging | 0.50 | 0.38
1.7  | Products of the education statistics agency are clearly identified | Established | 0.75 | 0.50
1.8  | Advance notice is given of major changes in methodology, source data, and statistical techniques | Established | 0.75 | 0.71
1.9  | Guidelines for staff behavior are in place and are well known to the staff | Mature | 1.00 | 0.83

METHODOLOGICAL SOUNDNESS

ESTABLISHED

[Figure 4. Methodological Soundness in the OECS countries]

In terms of Methodological Soundness, St. Lucia’s EMIS is ESTABLISHED (0.67). St. Lucia scored just below the OECS average (0.69) and had the same score as both Grenada and St. Kitts (Figure 4). St. Lucia’s overall structure, concepts and definitions have proper documentation, but definitions do not conform with the regional and international standards (Table 4, 2.1) established by the UNESCO Institute for Statistics (UIS) and the OECS Education Reform Unit (OERU). St. Lucia also follows the International Standard Classification of Education (ISCED) in all education sector data except expenditure data (2.3), which cannot currently be disaggregated by ISCED classification. Expanding the use of ISCED to expenditure data would ensure complete consistency with ISCED and improve St. Lucia’s score on this subcomponent. Currently, St. Lucia’s EMIS produces between 71 and 90 percent of UIS indicators annually, which results in an EMERGING benchmark on the scope of statistics sub-component (2.2). Expanding the scope of statistics produced to 100 percent of UIS and OECD indicators is ideal and would enable additional domestic, regional, and international education policy analysis.

Table 4. Methodological Soundness: Subcomponents

No.  | Subcomponent | Benchmark | St. Lucia | OECS Average
2.1  | Overall structure, concepts and definitions follow regionally and internationally accepted standards, guidelines, and good practices | Established | 0.75 | 0.83
2.2  | Scope is in accordance with international standards, guidelines, or good practices | Emerging | 0.50 | 0.42
2.3  | Classification systems are consistent with international standards, guidelines, or good practices | Established | 0.75 | 0.83

ACCURACY AND RELIABILITY

EMERGING

[Figure 5. Accuracy and Reliability]

The Accuracy and Reliability of St. Lucia’s EMIS data is EMERGING (0.58). St. Lucia scored slightly below the OECS average (0.60) on this Aspect of Quality, but scored higher than both St. Vincent and Antigua (Figure 5). St.
Lucia’s sub-component scores all fall into the EMERGING and ESTABLISHED range, with no LATENT or MATURE scores. This indicates that St. Lucia has a foundation for all the sub-components of Accuracy and Reliability and needs to build upon that foundation to improve. For example, St. Lucia could:

• improve procedures to update, standardize, and properly reference source data (3.2)
• ensure that education data are provided to other source providers within six months after the end of the school year (3.3)
• routinely assess other data sources and train staff to handle these data sources (3.4)
• always validate intermediate results against other information (3.7)
• improve systematic processes to investigate statistical discrepancies (3.8; 3.9)
• conduct studies of revisions (3.10)

Table 5. Accuracy and Reliability: Subcomponents

No.  | Subcomponent | Benchmark | St. Lucia | OECS Average
3.1  | Source data are obtained from comprehensive data collection that takes into account country-specific conditions | Established | 0.75 | 0.58
3.2  | Data are reasonably confined to the definitions, scope, classifications, and time of recording required | Emerging | 0.50 | 0.50
3.3  | Source data are timely (6 months after event) | Emerging | 0.50 | 0.46
3.4  | Other data sources, such as censuses, surveys, and administrative records, are routinely assessed | Emerging | 0.25 | 0.42
3.5  | Data compilation employs sound statistical techniques to deal with data sources | Established | 0.75 | 0.79
3.6  | Other statistical procedures (data editing, transformations, and analysis) employ sound statistical techniques | Established | 0.75 | 0.63
3.7  | Intermediate results are validated against other information where applicable | Emerging | 0.50 | 0.67
3.8  | Statistical discrepancies in intermediate data are assessed and investigated | Established | 0.75 | 0.92
3.9  | Statistical discrepancies and other potential indicators of problems in statistical outputs are investigated | Established | 0.75 | 0.71
3.10 | Studies and analyses of revisions are carried out routinely and used internally to inform the processes | Emerging | 0.25 | 0.33

SERVICEABILITY

ESTABLISHED

[Figure 6. Serviceability in the OECS]

The Serviceability of St. Lucia’s EMIS data is ESTABLISHED (0.68) and is far above the OECS average (0.55). St. Lucia’s lowest score on any sub-component was 0.50, which indicates that a strong foundation for Serviceability is currently in place. St. Lucia met the MATURE benchmark for Periodicity by producing an annual census of enrolments, teachers, schools, and financial data (Table 6, 4.1), but the timeliness of releasing the data could be improved. Currently, administrative census data are available six to 12 months after the initiation of the school year (4.2), when ideally these data should be released within two months. Time series are available for five to 10 years (4.4), but procedures for data revisions could be improved (4.6). Cross-checks are only done in an ad-hoc fashion (4.3), and comparison checks show that there is roughly an 11 to 20 percent difference between school-reported figures and data from other sources (4.5). St. Lucia could improve its Serviceability by publishing administrative data within two months of the initiation of the school year, increasing the availability of time series data to 10 years, and strengthening systems for revisions and cross-checks.

Table 6. Serviceability: Subcomponents
No.  | Subcomponent | Benchmark | St. Lucia | OECS Average
4.1  | Periodicity follows dissemination standards | Mature | 1.00 | 0.96
4.2  | Timeliness follows international dissemination standards | Emerging | 0.50 | 0.63
4.3  | Statistics are consistent within the dataset | Emerging | 0.50 | 0.71
4.4  | Statistics are consistent or reconcilable over a reasonable period of time | Established | 0.75 | 0.54
4.5  | Statistics are consistent or reconcilable with those obtained through other data sources and/or statistical frameworks | Emerging | 0.50 | 0.33
4.6  | Revisions follow a regular and transparent schedule | Emerging | 0.50 | 0.21
4.7  | Preliminary and/or revised data are clearly identified | Mature | 1.00 | 0.46

ACCESSIBILITY

EMERGING

[Figure 7. Accessibility in the OECS]

Accessibility was St. Lucia’s lowest score (0.56, EMERGING), but sub-component scores ranged from MATURE to LATENT. St. Lucia scored highly on data presentation: EMIS statistics are clearly presented, with disaggregation and underlying data for charts (5.1). Also, St. Lucia releases data on a pre-announced schedule (5.3) and typically releases data to all users at the same time (5.4). There are procedures in place for releasing non-published and non-confidential data upon users’ request (5.5), and metadata are available but not publicized (5.6). St. Lucia could improve the Accessibility of the EMIS by:

• disseminating data electronically (5.2)
• making metadata available to the public (5.6)
• creating a data catalog so users can request data at a level of detail that meets their needs (5.7)
• creating a catalog of publications and services (5.9)
• identifying contact points to offer assistance to users (5.8)

Accessibility is one of the key missions of an EMIS because it creates and maintains the public image of the EMIS and enables greater accountability. It is imperative for all levels of administration in St. Lucia to focus on developing a more accessible EMIS.

Table 7. Accessibility: Subcomponents

No.  | Subcomponent | Benchmark | St. Lucia | OECS Average
5.1  | Statistics are presented to facilitate proper interpretation and comparisons (layout, clarity of texts, tables, and charts) | Mature | 1.00 | 0.96
5.2  | Dissemination media and format are adequate | Emerging | 0.25 | 0.54
5.3  | Statistics are released on a pre-announced schedule | Mature | 1.00 | 0.38
5.4  | Statistics are made available to all users at the same time | Established | 0.75 | 0.79
5.5  | Statistics not routinely disseminated are made available upon request | Mature | 1.00 | 0.75
5.6  | Documentation on concepts, scope, classifications, basis of recording, data sources, and statistical techniques is available, and differences from internationally accepted standards, guidelines, or good practices are annotated | Established | 0.75 | 0.58
5.7  | Levels of detail are adapted to the needs of the intended users | Latent | 0.00 | 0.38
5.8  | Contact points for each subject field are publicized | Emerging | 0.25 | 0.38
5.9  | Catalogs of publications and other services, including information on any charges, are widely available | Latent | 0.00 | 0.00