GOVERNMENT ANALYTICS IN EUROPE
Making Public Data Count

Zahid Hasnain, Ayesha Khurshid, Timothy Lundy, and Daniel Rogger

© 2024 International Bank for Reconstruction and Development / The World Bank
1818 H Street NW, Washington, DC 20433
Telephone: 202-473-1000; Internet: www.worldbank.org

This work is a product of the staff of The World Bank with external contributions. The findings, interpretations, and conclusions expressed in this work do not necessarily reflect the views of The World Bank, its Board of Executive Directors, or the governments they represent. The World Bank does not guarantee the accuracy, completeness, or currency of the data included in this work and does not assume responsibility for any errors, omissions, or discrepancies in the information, or liability with respect to the use of or failure to use the information, methods, processes, or conclusions set forth. The boundaries, colors, denominations, and other information shown on any map in this work do not imply any judgment on the part of The World Bank concerning the legal status of any territory or the endorsement or acceptance of such boundaries.

Nothing herein shall constitute or be construed or considered to be a limitation upon or waiver of the privileges and immunities of The World Bank, all of which are specifically reserved.

Rights and Permissions

The material in this work is subject to copyright. Because The World Bank encourages dissemination of its knowledge, this work may be reproduced, in whole or in part, for noncommercial purposes as long as full attribution to this work is given. Any queries on rights and licenses, including subsidiary rights, should be addressed to World Bank Publications, The World Bank, 1818 H Street NW, Washington, DC 20433, USA; fax: 202-522-2625; e-mail: pubrights@worldbank.org.
Cover image: © Pixel Matrix / Adobe Stock. Used with the permission of Pixel Matrix / Adobe Stock. Further permission required for reuse.
Cover design: Bill Pragluski, Critical Stages, LLC.

Project Collaborators
Columbia University | The George Washington University | Universitat Pompeu Fabra | University College London | The World Bank

Columbia University: Michael Best, Assistant Professor of Economics
The George Washington University: Alessandra Fenizia, Assistant Professor of Economics
Universitat Pompeu Fabra: Gianmarco León-Ciliotta, Associate Professor at the Department of Economics and Business
University College London: Christian Schuster, Professor in Public Management
The World Bank: Zahid Hasnain, Lead Governance Specialist in the Governance Global Practice; Daniel Rogger, Research Manager in the Development Impact (DIME) Department

Contents

Acknowledgments
About This Collection
Chapter 1  The Power of Measurement and Analytics for Better Public Administration
Chapter 2  Understanding Public Administration from Administrative Data
Chapter 3  Understanding Public Administration from Surveys of Public Servants
Chapter 4  Understanding Public Administration through Impact Evaluations
Chapter 5  Understanding Public Administration from the Outside
Chapter 6  Croatia: Influencing Productivity through Improved Management
Chapter 7  Estonia: Influencing the Local Policy Environment with Data
Chapter 8  Lithuania: Confronting Bias with Survey Data
Chapter 9  Romania: Restructuring Recruitment to Improve Public Administration
Chapter 10  The Slovak Republic: Benchmarking Local Government Performance
Appendix A  Project Background
Bibliography

BOXES
1.1  Regional Variation in Government Quality in Europe
1.2  We Didn't Say This Was Going to Be Easy: Key Challenges for Government Analytics in Europe
2.1  DIME Analytics Resources
2.2  Measuring Information Use in Government
2.3  What Might Be: Making Administrative Data Work Better
3.1  Existing Survey Efforts
4.1  How Do Impact Evaluations Work?
4.2  Development Impact at the World Bank
7.1  Using Administrative Data to Improve Health Care in Estonia
10.1  Analytics Teams in Public Administration Worldwide

FIGURES
1.1  Variation in Productivity of Business License Case Processing by Individual and District Office, the Slovak Republic
1.2  Map of Public Administration
2.1  Major Sources of Administrative Data
3.1  Management Quality across Public Administration Organizations, Lithuania
3.2  Job Satisfaction by Country in Europe
B3.1.1  Leadership (Communication of Mission) by Country
4.1  Impact Evaluation Cycle
B4.2.1  DIME's Operating Model
4.2  Ministry of Health Email Sent to School Administrators in Lithuania
5.1  Citizens' Satisfaction with Local Government by Region, Estonia
5.2  Public Sector as a Share of All Workers with Tertiary Education, by Country
5.3  Public Sector Wage Premium by Country
5.4  Public Sector Wage Premium by Region over Time, 2004–18
6.1  Average Motivation Level by Central Government Organization, Croatia
A.1  Project Timeline

MAPS
2.1  Variation in Productivity of Social Security Case Processing, Italy
2.2  Variation in Productivity of Environmental Departments, the Slovak Republic
3.1  Management Quality across Local Governments, Estonia
3.2  Management Quality across District Offices, the Slovak Republic
3.3  Share of Public Servants Who Received a Performance Evaluation across Local Governments, Estonia
4.1  Variation in Schools' Uptake of Mental Health Training by Region, Lithuania
5.1  Citizens' Perceptions of Priority Alignment with Local Governments, Estonia
5.2  Public Sector Employment as a Share of Paid Employment, by Country
7.1  Level of Governance Quality by Local Government in Estonia, 2021
8.1  Share of Students Who Experienced Mental Health Challenges by Municipality, Lithuania, Fall 2020
10.1  Government Effectiveness Scores by Region in the Slovak Republic, 2021

Acknowledgments

This report is the product of the Bureaucracy Lab, a partnership between the World Bank's Governance Global Practice and Development Impact (DIME) Department. It is part of a multi-country collaboration with the European Commission (EC), funded via the Part II Europe 2020 Programmatic Single-Donor Trust Fund with the EC (TF073353), the objective of which is to empirically understand the personnel determinants of, and mechanisms influencing, productivity in public administration and service delivery units in EU Member States.

This report was produced by a core team consisting of Zahid Hasnain, Ayesha Khurshid, Timothy Lundy, and Daniel Rogger, under the overall guidance of Arianna Legovini, Roby Senderowitsch, and Fabian Seiderer.

We would like to thank current and former consultants at the Bureaucracy Lab who worked on the analysis described in this report, including Jolanta Blazaite, Mantasha Husain, Patrik Jankovic, Robert Lipinski, Dimitrie Vasile Mihes, Patricia Rose Paskov, Gailius Praninskas, Ravi Somani, and Katre Väärsi. We would also like to thank Roby Senderowitsch (Practice Manager, Governance Global Practice, Europe and Central Asia), Pedro Arizti (Senior Public Sector Specialist), Reena Badiani-Magnusson (Senior Economist, Program Leader, EECDR), and the Governance ECA Ops Team for their overall advice, guidance, and support. The research described in the report also benefited from the guidance of Michael Best (Columbia University), Alessandra Fenizia (George Washington University), Gianmarco León-Ciliotta (Universitat Pompeu Fabra), and Christian Schuster (University College London).
We would like to express gratitude to the government officials who were partners in the research described in this report. In Croatia, the team would like to thank the staff in the Ministry of Justice and Public Administration. In Estonia, the team would like to thank officials in the Ministry of Finance (MoF)—Karl Annus, Ats Aasmaa, Andreas Aljas, Andrus Jõgi, Kaie Küngas, Mari Kalma, and Piret Zahkna. In Lithuania, the team would like to thank officials in the Government Strategic Analysis Center (STRATA)—Dalia Bagdžiūnaitė, Dovilė Gaižauskienė, and Viktoras Urbis—the Ministry of Health (MoH)—Laura Masiulienė, Ignas Rubikas, and Simona Bieliūnė—the Ministry of Education, Science and Sports (MoESS)—Gražina Šeibokienė, Laima Rutkauskienė, Jolanta Navickaitė, and Bagdonaitė Jurgita—and the Ministry of Social Security and Labor (MoSSL)—Jolanta Sakalauskienė and Tania Gurova. In the Slovak Republic, the team would like to thank the Ministry of Interior (MoI) and, in particular, the Institute for Public and Security Analyses (IPSA) and its director Tomáš Černěnko, department manager Marek Mathias, and former counterparts Markéta Tomaga, Alfonz Aczel, and Veronika Ferčíková, now a part of the Ministry of Regional Affairs and Agriculture. Special thanks also go to officials from all the public organizations who supported the implementation of the surveys discussed in this report and who took the time to respond to them.

Finally, we are grateful for the close collaboration and continuous support of officials from the Directorate-General for Regional and Urban Policy (DG REGIO) and Directorate-General for Structural Reform Support (DG REFORM) of the EC, particularly Lewis Dijkstra, Philippe Monfort, and Mina Shoylekova.

About This Collection

This report is part of a collection examining how analytics using government microdata is revolutionizing public administration throughout the world.
The collection is based on The Government Analytics Handbook, a comprehensive guide to using data to understand and improve government. The reports in this collection aim to help public servants apply lessons from the Handbook to their own administrations by describing the unique opportunities and challenges for government analytics that arise in different regions. No two regions, countries, administrations, or organizations are alike—that's why using microdata to measure, understand, and improve government is so important!

The general principle of the Handbook is as follows: governments across the world make thousands of personnel management decisions, procure millions of goods and services, and execute billions of processes each day. They are data rich. And yet there is little systematic practice to date that capitalizes on these data to make public administrations work better. This means that governments are missing out on data insights to save billions in procurement expenditures, recruit better talent into government, and identify sources of corruption—to name just a few!

The Handbook seeks to change that. It presents frontier evidence and practitioner insights on how to leverage data to make governments work better. Covering a range of microdata sources—such as administrative data and public servant surveys—as well as tools and resources for undertaking analytics, it transforms the ability of governments to take a data-informed approach to diagnose and improve how public organizations work.

Throughout this report, we'll use gold callout balloons to point out chapters in The Government Analytics Handbook where you can learn more about each topic we discuss.

CHAPTER 1
The Power of Measurement and Analytics for Better Public Administration

Q: What is government analytics—and why does it matter?
Government analytics means looking at how well public administration is working—often using data you already have—and applying the evidence to make it work better!

Léon de Pas, Europa Riding the Bull, Brussels, Belgium. Photo by JLogan, CC BY-SA 3.0.

“It's not only the laws that are important: the building of institutions is also important, and so is the training of people. You may have very good laws, but if you have a bad civil service then you have obstacles to running your country well.”
– Heiki Loot, former Secretary of State of Estonia

A DATA REVOLUTION IN GOVERNMENT

The last three years have seen more data generated and stored than in all prior human history. The private sector has generated trillions of dollars in value by creating and managing those data. It has also driven economic and social advancement by discovering new ways to analyze them, like machine-learning algorithms. Data analytics has led many companies to entirely new business models—and these models have revolutionized productivity!

So why aren't these data being used to measure and improve government? If you are a public sector official, manager, or leader, the time has come to ask yourself, “What can I do to make public data count?”

Governments are largely missing out on the benefits of data analytics, despite being rich in data. It is rare to find a government unit whose purpose is to turn data into greater effectiveness. And we don't necessarily mean minute-by-minute monitoring of government activity. Many government organizations don't even assess their work environment annually through staff feedback or compare the prices they are paying for similar goods and services. Not everyone has a daily diagnostic, but an annual health checkup—like most of us have with our doctor—is a smart idea.
If you feel intimidated by the idea of using data in your work, we wrote this report to show you where to find data to measure government and how to use them to drive transformative change. The report is designed especially for public servants in Europe, where high levels of digitalization and critical government involvement in economic productivity make the need for an efficient, data-driven public administration vital (World Bank 2021). After reading this report, you'll be ready to start solving real problems in public administration by bringing the data revolution to government.

HOW TO USE THIS REPORT

This report is based on The Government Analytics Handbook, a comprehensive guide to using data to transform public administration, which brings together evidence and insights from all over the world into a one-of-a-kind resource on everything from data ethics to survey design. To make these insights more accessible, we've distilled the Handbook's essential lessons for public servants in Europe and illustrated them with relevant examples. In this report, you'll find answers to your biggest questions about government analytics in Europe and signposts pointing you directly to chapters from the Handbook (in the gold callout balloons) and other tools you need to transform your administration.

All government organizations can use the data already at their fingertips to understand themselves and make improvements. If you're worried you haven't been collecting any data to begin with, we'll show you how to repurpose the data you already encounter in your day-to-day work. As we show in chapter 2 of this report, governments are already producing more data than ever before!
You might also be wondering whether it's even possible to measure the foundational elements of effective public administration, like management quality or employee motivation. In chapter 3, we'll show you how to use surveys of public servants to look inside the public service.

Of course, it's not enough to collect and analyze data without using it to improve public administration. In chapter 4, we'll describe how impact evaluations (experiments to better understand how programs and policies work) can help decision-makers use data analytics, in real time, to make better policy choices.

We also know that government data are not always widely accessible—even to public servants. Restrictions on data access and use are real. (We discuss some restrictions within the European Union in box 1.2.) In chapter 5, we'll explain how to use data from outside government to deepen your understanding of government outcomes and the relationship of the public and private sectors.

This report on Europe is the result of a project by the World Bank's Bureaucracy Lab called EU Measuring and Evaluating Determinants of Public Administration Productivity. Conducted between 2019 and 2024, the project aimed to pilot the approaches to understanding government at the regional level outlined in the Handbook and this report. Throughout the report, we'll illustrate the successes and challenges of government analytics with examples from four European Union (EU) countries the project covered—Croatia, Estonia, Lithuania, and the Slovak Republic—as well as from related work in Romania.
After the main chapters of the report, short case studies give an overview of the Bureaucracy Lab's work in each country, so you can see how different elements of government analytics fit together to help solve real problems in different contexts within Europe. Appendix A at the end of the report gives background information about the project as a whole.

BETTER MEASUREMENT FOR BETTER PUBLIC ADMINISTRATION

Why is good measurement important in the first place? Simply put, better measurement means better public administration, leading to better outcomes. Careful measurement that informs conversations about what is and isn't working is essential for any administration to be effective. And an effective administration ensures that policies are well implemented and citizens receive high-quality services, like education and health care. Of course, there is such a thing as too much measurement—but most public organizations in Europe are far from that situation.

For example, regional variation in government effectiveness is often poorly understood. Public administration in Europe could be greatly improved through data-driven measurement of regional variation within countries. In Italy, for instance, the productivity of social security claim processing could be boosted by at least 7 percent simply by reassigning the best managers to the largest offices (Fenizia 2022). Even small, low-cost adjustments can lead to extensive improvements when they echo across the public service.

The size of public administration means that the impact of good measurement goes beyond the public service itself.
Worldwide, wages for public sector employees amount to about 10 percent of countries' gross domestic product (GDP), and public procurement accounts for roughly 12 percent (Bosio and Djankov 2020; World Bank 2019). Reassigning the best managers to the right organizations may literally change a country's GDP. Governments are also large employers: nearly 24 percent of the EU workforce is employed in the public sector. That's a lot of people to manage. And the influence of public employment policy isn't limited to the public sector. Decisions about public sector wages, spending, management, and policy affect the whole economy. When a public sector employee gets paid more, their friends and neighbors expect to see an increase in their own private sector wages.

When we measure government, we empower officials to make better decisions, making the organizations, units, and individuals in public administration more effective at implementing programs and policies that serve citizens. If we look at it this way, better measurement might just be the best approach to transforming public policy.

CHALLENGES FOR GOVERNMENT ANALYTICS

We all want to do the best with what we invest in the public sector. So why haven't governments followed the private sector in using data analytics to drive transformative change?

Government analytics poses unique demands because public administration is very different from a private sector firm. The mission of public administration is multidimensional, meaning that you can't just look at the bottom line to evaluate the quality of outcomes. Not everything that matters in public administration can be measured, so it's important to respect the limits of measurement and recognize that data analytics can't replace other kinds of knowledge.
Public organizations are also large and complex; they take a long time to change, and changing them sometimes requires navigating thorny political challenges, as well as the ethical concerns that come up when collecting data on public employees. Governments also face unique analytics challenges—see box 1.2 for a discussion specific to Europe.

These challenges might leave you feeling like you don't know where to begin using analytics to improve public administration. That's where we come in. Each chapter in this report will give you a new perspective to understand how well government is working, as well as real-world examples of how effective analytics can strengthen the quality of public administration. We can offer you the tools to become more aware of the complex environment in which public policy is made—but it's up to you to discover how it can be transformed.

For a guide to measuring what matters in government, and respectfully navigating ethical concerns, see chapters 4 and 6 of The Government Analytics Handbook.

GOVERNMENT COMPLEXITY AT THE MICRO LEVEL

To improve government functioning at scale, we need to build on existing efforts to measure government—but we need to go much deeper too! Many governments already use their data for processing or auditing government business. This is important, but process and audit data can be put to other uses as well. In fact, these data paint a picture of the life of an individual transaction: When did it begin? Who was it for? Who was involved? When did it end? Add up all these glimpses and you get perhaps the richest, most granular picture of how public administration works that has ever been produced!

More and more individual officials have already discovered that they can use these data to undertake some form of government analytics. Human resource officers have run surveys to monitor how happy and motivated employees in their organizations are and track changes in attitudes over time.
Ministry of Finance officials have begun using payroll records to predict future wage costs. We need to gather these individual efforts into a systematic approach to monitoring, diagnosing, and nudging government in the right direction.

To make analytics work across an entire government, we need data that are specific to particular questions, particular parts of public administration, and particular regions. We need microdata if we're going to understand how public administration works in all its complexity.

Microdata are exactly what they sound like: very precise data points that give you a picture of individuals—individual people, households, or organizational units. Microdata allow you to measure and analyze relationships among granular phenomena—for instance, how the management approach of an individual manager affects the productivity of a particular team. All the data we describe in this report are really microdata: information about the particular people, processes, decisions, and outcomes that make up public administration.

Why isn't it enough to know what is going on at the national level? Countries—and people—are naturally diverse, so there is no reason to expect that the government should be the same across a nation. But particularly in public administration, top-down attempts to control behavior, such as laws, often don't work to impose sameness across a country. Instead, it's the culture and management quality of particular offices and organizations within a country that determine how well policy gets translated into practice. The public service is a very diverse place, in contrast to its staid stereotype. Until we look at public administration at the suborganizational or subnational level, using microdata, we don't know what we will find. And—as box 1.1 shows—there is a huge amount of variation out there to find!
Box 1.1  Regional Variation in Government Quality in Europe

We know from experience that people are naturally diverse. But we don't always apply that understanding to government and public administration. Measurement at the national level or in the central government can be a good starting point, but it won't tell us much about how policies and programs are being put into practice throughout a country.

We know that government quality varies greatly from country to country within Europe—and so do the structure and capacity of public administration, the size of the public service, and public trust. Depending on where you live in Europe, 68 percent of the population might trust the national government (Luxembourg)—or only 15 percent (Croatia) (Mackie, Moretti, and Stimpson 2021). This variation is significant, but we'll never understand why it exists without looking more carefully at the variation within countries themselves.

When we look closer, we find that government quality can vary enormously even within a single country. In Bulgaria, the region with the highest quality of government ranks well below the lowest-scoring region in Estonia—and the highest-scoring region in Estonia ranks about the same as the lowest in Sweden (Charron, Lapuente, and Bauhr 2021). If we look only at the national level, we miss the fact that moving between regions in Estonia might change the quality of the government services you receive far more than moving to a different country.

To see how policy becomes practice, resulting in very different levels of government quality and public trust, we need to look at the way particular offices and organizations work in different regions of a country. The need for this deeper understanding is one of the essential lessons about government analytics in Europe that we lay out in this report.
Microdata build on existing systems for measuring government, giving public sector officials, managers, and leaders a clearer sense of what is happening within their own administrations. Again and again, this report will include examples of how government analytics looks different when we focus our attention on regions, organizations, and local governments within European countries, rather than on the central government or the country as a whole.

To illustrate, let's take an example from the Slovak Republic. Figure 1.1 shows how productivity in business license case processing varies across district offices and individual officers. The number of cases an officer is able to complete in a month varies immensely, from nearly 80 to just a small handful. If you're a citizen, this means that crossing an internal border, or even just working with a different officer, could mean waiting twice as long for a license! Even in a very tight space, government can look radically different depending on which office and which individual public servant you work with.
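If you have case-level records from your own administration, the calculation behind a figure like this is straightforward. The sketch below is illustrative only: the offices, officers, and case counts are made up, not drawn from the actual Cezir data set. It counts cases completed per officer per month, then summarizes how productivity varies within and across offices:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical case records: one (office, officer, month) row per completed case.
# In practice, rows like these would come from a case-management system export.
cases = [
    ("Office A", "officer_1", "2019-01"), ("Office A", "officer_1", "2019-01"),
    ("Office A", "officer_1", "2019-01"), ("Office A", "officer_2", "2019-01"),
    ("Office B", "officer_3", "2019-01"), ("Office B", "officer_3", "2019-01"),
    ("Office B", "officer_4", "2019-01"), ("Office B", "officer_4", "2019-01"),
    ("Office B", "officer_4", "2019-01"), ("Office B", "officer_4", "2019-01"),
]

# Count cases completed by each officer in each month.
per_officer_month = defaultdict(int)
for office, officer, month in cases:
    per_officer_month[(office, officer, month)] += 1

# Average monthly productivity for each officer.
officer_rates = defaultdict(list)
for (office, officer, _), n in per_officer_month.items():
    officer_rates[(office, officer)].append(n)
officer_avg = {key: mean(counts) for key, counts in officer_rates.items()}

# Office-level productivity: the mean of its officers' averages.
office_avg = defaultdict(list)
for (office, _), rate in officer_avg.items():
    office_avg[office].append(rate)

for office, rates in sorted(office_avg.items()):
    print(f"{office}: office mean {mean(rates):.1f}, "
          f"officer range {min(rates):.1f} to {max(rates):.1f}")
```

Even this toy example shows the point of the chapter: two offices with similar staffing can hide very different spreads of individual productivity, which is exactly what office-level averages alone would conceal.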
FIGURE 1.1  Variation in Productivity of Business License Case Processing by Individual and District Office, the Slovak Republic

[Figure: chart of the number of Cezir cases processed per month (0 to 80) by district office, showing individual productivity and district office productivity for each office.]

Source: Cezir (2015–19) data set, Ministry of Interior of the Slovak Republic.

FINDING GOVERNMENT MICRODATA

At this point, you're probably wondering, “Where do all these microdata for measuring government come from?” The good news is there have never been more microdata available for government analytics! Figure 1.2 illustrates just some of the different inputs, processes, and outcomes of public administration that can be studied through government microdata.

As we've discussed, governments are data rich. In Europe, many government systems are digital, meaning that every interaction, transaction, and case leaves an electronic record. Government payroll, procurement, and budget disbursement are all tracked in electronic databases. Every time a judge makes a court ruling, a firm submits a procurement bid, or a human resources department sends an employee their monthly paycheck, a microdata point is created in a digital system. Most of these microdata aren't produced with analytics in mind—they are produced to help public employees do their jobs.
But by changing the way we look at these microdata, we can use them to decode public administration. In chapter 2, we’ll show you how to do this.

Digital systems have also made it easier for governments to collect survey data. It is now possible for governments to survey all their employees online, and the number of countries that conduct this kind of survey has increased continuously over the past decade! These surveys are unparalleled for understanding public servants’ motivation, their work environments, and how well they are managed. We’ll talk more about this data source, and how to use it, in chapter 3.

Governments not only have more data at their fingertips than ever before; they have already developed the capacity to collect and analyze data about citizens, firms, and the environment to improve public policy and services. Some of these data can even be used to analyze government itself, as we’ll explain in chapter 5. The time is right for governments to take the same analytical approach to their own administrations that they do in these other areas.

FIGURE 1.2  Map of Public Administration

[Figure: diagram of public administration, linking payroll, human resources, budget, and procurement to management practices, workplace culture, and employee motivation, and onward to services and public impact.]

Source: Original figure for this publication.

FROM ANALYTICS TO TRANSFORMATIVE ACTION

The goal of government analytics isn’t just to gather evidence of what is working and what isn’t in government—it’s to put that measurement into practice to improve public administration. But it can’t do everything. Most importantly, analytics can’t replace good management, and it will never be a substitute for the extensive experience and deep knowledge of good public sector managers. If measurement can’t replace management, what can it do?
First, government analytics can provide better evidence for public officials to make better decisions, serving as a complement to their own knowledge. For example, when managers know that district productivity differences are driven by individual productivity differences, they can focus on improvements that will make the biggest impact: enhancing recruitment, improving management, and reducing turnover. Evidence enables managers to apply their knowledge and experience where they matter most.

Analytics can also go beyond measurement to test new ways of working, helping public officials improve the effectiveness of public administration. Well-designed impact evaluations—“trial-and-adopt” experiments—help decision-makers get to the heart of what works and what doesn’t, in real time. We’ll talk more about impact evaluations—and the cutting-edge government analytics they make possible—in chapter 4.

Finally, government analytics can increase the accountability of the public service to those on the outside, helping to gauge and improve good governance. The Estonian government, for example, maintains a policy dashboard that compares service delivery across 79 local governments each year. The Bureaucracy Lab developed recommendations to increase awareness of the dashboard among citizens, the media, and civil society organizations. When an analytics tool becomes an accountability mechanism, the public is empowered to keep tabs on their local government, see how it compares to others, and participate in improving it.

There is no one approach to government analytics. That’s why this report is aimed at public servants across Europe—we want to encourage you to do analytics your own way. Each chapter of this report takes a different perspective because there are as many applications of analytics as there are unique policies, programs, and individuals in government (box 1.2)!
For a deeper introduction to the power of government analytics, see chapters 1–3 of The Government Analytics Handbook.

Box 1.2  We Didn’t Say This Was Going to Be Easy: Key Challenges for Government Analytics in Europe

In this report, we highlight some successful examples of government analytics in Europe, as well as the enormous potential for unlocking greater insights from administrative and survey data. But we need to be realistic about the challenges as well.

The World Bank faced a number of different challenges in the course of its project in Europe, including data access and a lack of senior management demand for analytics. We’ll describe our work in the Slovak Republic to illustrate these challenges (you can learn more about that work in the chapter 10 case study). These challenges reflect, to differing extents, those faced throughout the entire project. Making progress in government analytics will mean taking a careful look at these challenges to learn from past experience.

Challenge 1: Measurement Context

Every administrative data set has insights to offer. But we can’t unlock these insights without being able to put these data in context. In the Slovak Republic, the World Bank’s Bureaucracy Lab wanted to estimate productivity across district offices. One of the data sets we used came from a case-management system called Fabasoft, which let us measure the number of business licensing cases undertaken by individual public officials. These numbers tell us something about productivity, but it’s challenging to interpret them accurately without a better understanding of the context in which these cases were undertaken. Different offices might use the Fabasoft system differently, and some offices might have more complex cases than others.
Individuals with different job titles might also use the system differently. Our centralized understanding of the diverse district office environments in which individuals work, and thus our holistic understanding of the data, was relatively limited.

Challenge 2: Data Integration

Integrating multiple data sets can yield much more powerful insights than a single data set alone. But integrating data sets can be challenging. In the Slovak Republic, we worked with two administrative data sets. As we mentioned above, the Fabasoft data let us understand the productivity of individual public officials, but not the context of the district offices in which they worked. Another data set, from a database called Cezir, let us see the progress of business licensing cases across district offices. Both of these data sets had information we needed, but they didn’t fit together neatly: Fabasoft told us more about individuals than offices, while Cezir told us more about offices than individuals. To get the fullest picture and deepest insights about productivity across district offices, it would have been useful if these data sets had been built on a common set of identifiers. Similarly, human resources data were costly and complicated to integrate with the two productivity data sets. The challenges to integrating the productivity and personnel data—which were never overcome—limited the insights we were able to glean from the data.

Challenge 3: General Data Protection Regulation (GDPR) Interpretation

In Europe, the GDPR governs the collection and use of data about living people. The GDPR is a crucial set of legal protections that defend individual privacy and prevent personal data from being misused. The importance of regulations like the GDPR makes negotiating the balance between privacy and data use a critical challenge in the European Union context.
In the Slovak Republic, the human resources department felt constrained in sharing its data with other members of the public service, as well as external actors such as the Bureaucracy Lab, due to its interpretation of the GDPR and the Ministry of Interior’s rules. Assessing the balance between privacy and the benefits of integrated data use, and ensuring consistent interpretation of the GDPR, are central challenges for government analytics in Europe.

Challenge 4: Demand by Senior Managers

The success of government analytics is affected by political interest in the corresponding results. In the Slovak Republic, buy-in for the public servant survey that aimed to capture information about management practices and attitudes was mediated by the engagement of managers in different district offices. This affected both the reach of the survey—through gaining contact information for public officials—and the attention it received—with only a few managers encouraging public officials to respond to the survey. When analytics is not perceived as useful for the senior decision-maker’s current priorities, the quality of any primary data it influences will be lower, and staff will be less engaged in analytics.

CHAPTER 2 Understanding Public Administration from Administrative Data

Q: I haven’t been collecting data on how well my government is working—what do I do?
A: You have! Repurpose your administrative data into analytical data.

Phlegm, Calibrating the Seeing Device, Zagreb, Croatia. Photo by Doris Baric, CC BY-SA 4.0.

“The only true voyage of discovery . . . would be not to visit strange lands but to possess other eyes.” – Marcel Proust

LOOKING AT EVERYDAY DATA WITH NEW EYES

Where does government analytics begin? The best place to start is with the microdata that are right under your nose—data you might not even notice.
By the end of this chapter, we want you to think differently about the huge number of data points you encounter in your everyday work. When you complete a form, type a number into a spreadsheet, update an information system, or even just stamp a document with today’s date, you’re not just doing your job—you’re also creating data! These data can tell a story about how you go about accomplishing tasks, about the resources you need and the obstacles you encounter, about the people who influence, build on, and depend on your work—and so much else. But they don’t tell stories on their own: we first need to repurpose them, sort them, compare them, and learn how to look at them. Looking at ordinary government data with new eyes can lead to extraordinary insights.

Let’s consider an example. Governments regularly produce case files as part of their day-to-day operations—think of the record that is created when a citizen applies for a business license or files for social security benefits. Each day, public servants use these files to move citizens’ cases forward and ensure they receive the services they need. But what happens if we take a second look at these case files—not as administrative records but as data? By looking at the time stamps on a case file, we can see how long it took to complete the case from beginning to end. If we look at a whole set of case files this way, we start to get a picture of the overall productivity of an office—how long citizens must wait for their cases to be completed. And if we compare productivity across a whole country, we can tell a story about how service quality varies in different regions. Not bad for a simple case file!

ADMINISTRATIVE DATA IN ACTION

The two figures below show what this kind of government analytics looks like in action. Map 2.1 shows how many social security cases were processed by offices across Italy.
Map 2.2 shows the mean monthly productivity in environmental departments across different districts in the Slovak Republic. In both cases, we see that where you live in the country makes a big difference to the quality of service you receive. Just moving from one region to another in Italy might mean that a citizen must wait 2.5 times as long for a social security claim to be processed! Crossing an administrative boundary shouldn’t mean receiving less efficient government services (Fenizia 2022).

Imagine the questions you could ask after looking at maps like these. In the Slovak Republic, the Bureaucracy Lab looked closer still. We discovered that the variation among individuals working in the same department was often even bigger than the variation between districts. We decided to ask deeper questions—and gather more data—about individual productivity, management quality, employee turnover, and recruitment practices to better understand what factors might make a difference to productivity. Changing how you look at case data could be the start of a deep dive into how government works—and you might just transform it in the process.

MAP 2.1  Variation in Productivity of Social Security Case Processing, Italy

[Map: index of claims processed by region, ranging from 66 to 116.]

Source: Fenizia 2022, using Italian Social Security Agency data. Rpt. in Rogger and Schuster 2023. Note that this graph was not produced using data sourced by the authors but reproduces analysis from a distinct report.

MAP 2.2  Variation in Productivity of Environmental Departments, the Slovak Republic

[Map: mean monthly productivity by district, ranging from 9 to 43.]

Source: Fabasoft (2015–21) data set, Ministry of Interior of the Slovak Republic.
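The time-stamp idea behind these maps can be sketched in a few lines of code. The offices, dates, and field layout below are invented for illustration, not drawn from the Cezir or Fabasoft data:

```python
from datetime import date

# Hypothetical case records: (office, date opened, date closed).
# A real case-management export would have many more fields.
cases = [
    ("Office A", date(2023, 1, 2), date(2023, 1, 12)),
    ("Office A", date(2023, 1, 5), date(2023, 1, 25)),
    ("Office B", date(2023, 1, 3), date(2023, 1, 8)),
    ("Office B", date(2023, 1, 10), date(2023, 1, 14)),
]

def mean_processing_days(cases):
    """Average days from opening to closing a case, per office."""
    totals = {}
    for office, opened, closed in cases:
        days = (closed - opened).days  # case duration from its time stamps
        n, s = totals.get(office, (0, 0))
        totals[office] = (n + 1, s + days)
    return {office: s / n for office, (n, s) in totals.items()}

print(mean_processing_days(cases))
# → {'Office A': 15.0, 'Office B': 4.5}
```

Comparing these per-office averages across a whole country is exactly the kind of analysis shown in the maps above.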
FINDING AND USING ADMINISTRATIVE DATA

Once you begin to see everyday information with new eyes, you’ll start finding useful administrative data everywhere you look. Digital databases are full of administrative data that can be repurposed for analytics, from payroll and human resources data to expenditure data. Governments are data rich, especially in highly digitalized countries, as in much of Europe. In figure 2.1, we summarize the sources of administrative data that exist in government, where you might find them, and a few of the questions you can use them to answer. The possibilities for using administrative data to explore questions about government are almost endless.

Of course, analytics requires skills and tools that can’t be learned overnight. To get started, we recommend looking at the World Bank’s DIME Wiki, which contains detailed articles on all aspects of data analytics. Box 2.1 lists some other freely available resources.

Naturally, when you’re working with microdata about individual people, data security, anonymity, and privacy become especially important (see box 1.2 for some thoughts on privacy in a European Union context). Balancing public servants’ dignity and privacy as individuals and their rights as employees against the needs of public service is crucial to ethical data analytics.

For an in-depth discussion of each of these kinds of administrative data, see chapters 9–13 of The Government Analytics Handbook. For an ethics framework covering issues that arise when working with personal data, see chapter 6 of The Government Analytics Handbook.

BOX 2.1  DIME Analytics Resources

The World Bank’s Development Impact (DIME) Department has created a set of resources based on the latest research to help guide you in collecting and analyzing data.
DIME Analytics is a unit dedicated to developing tools and resources to support data analysts at all levels. These resources offer guidance about good data analysis, data transparency, and the reproduction of results in accordance with international best practices. Even if you’re completely new to data analytics, you’ll find help in the free courses and tools below.

Guides and Written Resources

Development Research in Practice: The DIME Analytics Data Handbook (Bjärkefur et al. 2021) is a comprehensive guide to data analytics and empirical research, covering every step in the process from designing a research project to analyzing and publishing your findings. The book not only moves step by step through the data workflow; it contains examples from a real-life case. Feedback can be provided on its GitHub repository. The handbook is also linked to the DIME Wiki, a regularly updated guide to every phase of an impact evaluation, with over 200 pages.

Courses and Technical Trainings

DIME Analytics offers regular courses that cover both the workflows of data collection and the tools required for advanced statistical analysis. The course materials are freely available online, so you can study them at your own pace.
The courses with materials available online include the following:

● Development Research in Practice (a companion course to The DIME Analytics Data Handbook that gives an overview of empirical research)
● Manage Successful Impact Evaluation Surveys (a course covering the data collection workflow)
● R for Advanced Stata Users (an introduction to the R programming language)
● The Research Assistant Onboarding Course and other technical trainings
● DIME Continuing Education (a biweekly series of hands-on trainings with changing topics)

Technical Tools

Finally, DIME Analytics also makes technical tools useful for statistical analysis freely available on GitHub, including two Stata software packages (ietoolkit and iefieldkit) and data visualization libraries, for producing graphics from your data.

FIGURE 2.1  Major Sources of Administrative Data

● Payroll and HR: found in human resource and financial management information systems. What will future wages look like? Which organizations have high turnover?
● Expenditure: found in the financial management information system. Are organizations spending money effectively?
● Procurement: found in the electronic procurement system. What could make procurement more efficient? Is there evidence of corruption?
● Customs: found in the customs database. Where are trade goods getting stuck in the customs process?
● Case data: found in the electronic case management system. How productive is an organization? Which organizations process cases with the fewest errors or complaints?

Source: Original figure for this publication.

CUTTING-EDGE ANALYTICS WITH ADMINISTRATIVE DATA

Let’s look at an innovative example of analytics from the Bureaucracy Lab’s work in Estonia. (You can get a bigger picture from the case study in chapter 7.) The government of Estonia developed a policy dashboard showing government service levels across different local governments.
This dashboard was meant to help both citizens and public officials better understand how well local governments are aligned with citizens’ service priorities. (You can find out more about understanding how public officials use information to make decisions in box 2.2.)

Of course, an informational tool is only useful if people use it. We can use administrative data to study this question too: first, the web tracking data on the dashboard, and second, the queries officials and citizens make to the relevant government staff. However, an important question in Estonia has been whether the dashboard really affects the decisions and activities of local government officials. For this reason, the World Bank team assessed the extent to which policies and budgets changed, as well as whether public officials stated that they were more aware of their local government’s strengths and weaknesses.

The Bureaucracy Lab team that led the analysis processed one form of administrative data by using a cutting-edge analytics method: machine learning. The data source we drew on is the minutes from the meetings of local councils between local government officials and citizens. Our question was “Has the dashboard changed what public officials and citizens talk about?” It would be nearly impossible for a person to read through the minutes of all the meetings of all the local councils across Estonia to study what the conversations were about. But we can use administrative data—the meeting minutes—combined with text analysis and machine-learning methods to detect the changes in the conversations for us. By training a machine-learning algorithm to analyze the meeting minutes, it becomes possible to see how the topics of these conversations evolve—and whether they do so in relation to the policy dashboard. Our hope is that, in the future, an automated system may even be able to update the dashboard based on the policy priorities discussed in meetings.
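To make the idea concrete, here is a deliberately simple sketch of tracking a topic in council minutes over time. The minutes, dates, and keywords are all invented, and keyword counting is a stand-in for the trained machine-learning models the Bureaucracy Lab actually used:

```python
# Hypothetical council minutes as (year-month, text) pairs.
MINUTES = [
    ("2020-05", "budget for road repairs and school meals"),
    ("2021-03", "council reviewed the service dashboard rankings"),
    ("2021-09", "dashboard comparison with neighbouring municipalities"),
]

# Illustrative keywords standing in for a dashboard-related topic.
TOPIC_KEYWORDS = {"dashboard", "rankings", "comparison"}

def topic_share(minutes, keywords, since):
    """Share of meetings from `since` onward that mention the topic."""
    selected = [text for month, text in minutes if month >= since]
    if not selected:
        return 0.0
    hits = sum(any(k in text.split() for k in keywords) for text in selected)
    return hits / len(selected)

# Compare how often the topic comes up before and after a launch date.
print(topic_share(MINUTES, TOPIC_KEYWORDS, "2020-01"))  # all meetings
print(topic_share(MINUTES, TOPIC_KEYWORDS, "2021-01"))  # post-launch only
```

A real pipeline would replace the keyword test with a classifier or topic model trained on labeled minutes, but the before-and-after comparison is the same.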
“The e-government cabinet, e-health services, online voting, online pre-filled tax returns, e-mobile parking, are all examples of Estonian innovation, but far more importantly, they are examples of the transformative power of intensive and extensive use of information technology in the public sector.” – Toomas Hendrik Ilves, former President of Estonia

E PLURIBUS UNUM

Each of the administrative data sets described above has something to teach you on its own—but you can learn even more by combining them! In the Slovak Republic, the Bureaucracy Lab learned the importance of not just having access to administrative data sets but being able to integrate them with one another. In this context, we had access to two administrative data sets from two different case management databases. But we didn’t have a third data set to link the two and enable more comprehensive, in-depth analytics (see box 1.2 and the chapter 10 case study for more details about this challenge). In Estonia, we were able to combine a range of data sets—from budget data to service delivery data—painting a vivid picture of the government and its delivery of public services.

In the long run, making data analytics a part of your work is much easier if different sources of administrative data are compiled in one place where they are easy to examine, compare, and work with. Building a good data infrastructure that ensures the quality and accessibility of administrative data should be a goal for any organization that wants to unlock the power of their everyday data to transform public administration. Box 2.3 highlights a number of avenues government could take to lay the foundations for greater use of administrative data.

For a guide to building effective data infrastructure, see chapter 9 of The Government Analytics Handbook.
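A toy example shows why shared identifiers matter for combining data sets. With a common office ID, per-person payroll records and per-office case counts join cleanly; all records and field names below are invented for illustration:

```python
# Hypothetical payroll records, one row per employee, keyed by office ID.
payroll = [
    {"office_id": "D01", "employee": "A", "salary": 1400},
    {"office_id": "D01", "employee": "B", "salary": 1600},
    {"office_id": "D02", "employee": "C", "salary": 1500},
]

# Hypothetical case data: cases closed per office, same office IDs.
cases_closed = {"D01": 320, "D02": 150}

def cases_per_employee(payroll, cases_closed):
    """Join the two data sets on office_id to get cases per employee."""
    headcount = {}
    for row in payroll:
        headcount[row["office_id"]] = headcount.get(row["office_id"], 0) + 1
    return {oid: cases_closed[oid] / n for oid, n in headcount.items()}

print(cases_per_employee(payroll, cases_closed))
# → {'D01': 160.0, 'D02': 150.0}
```

Without the shared `office_id`, this two-line join becomes a costly matching exercise, which is exactly the obstacle described in the Slovak case.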
BOX 2.2  Measuring Information Use in Government

One major goal of government analytics is giving public officials and managers solid, empirical evidence that they can use to make better decisions. But just because information is available to public officials doesn’t mean it’s actually being used by public officials. If you plan to invest time and money in repurposing administrative data and building a better data infrastructure, you need to make sure that the resulting insights actually affect how decisions get made in your administration. You need to conduct analytics about government analytics itself!

“You’re telling me I need to do government analytics?” Yes! But don’t panic: we aren’t opening the door to an endless hall of mirrors. We’re just reminding you how much data you likely already have available to you for analytics. Few governments undertake broad assessments of what information public officials use, but the data you need for measuring information use in government are easily available in most management information systems. For instance, a website or dashboard can tell you how many times a public official has accessed it, how much time they spent engaging with the data, how many times data have been downloaded, and how many times information has been shared. These data can start to give you a picture of how officials are using a management information system.

Of course, these administrative data can’t tell you everything. Even if you can see that a dashboard is widely accessed, you don’t necessarily know how officials are using the data, or what impact it is having on their decisions. Public servant surveys and focus groups can help you focus your attention on how officials use data, how information affects their decision-making, and what obstacles they encounter. They can also help you pick up on whether information is being misused—for instance, to boost the goals that are being measured while ignoring those that aren’t.
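The access metrics just described can be aggregated with very little code. The log records and field names here are hypothetical; adapt them to whatever your web analytics actually captures:

```python
# Hypothetical dashboard access log, one record per visit.
log = [
    {"user": "official_1", "seconds": 120, "downloaded": True},
    {"user": "official_1", "seconds": 45, "downloaded": False},
    {"user": "official_2", "seconds": 10, "downloaded": False},
]

def usage_summary(log):
    """Per-user visit count, total time on page, and download count."""
    summary = {}
    for event in log:
        s = summary.setdefault(
            event["user"], {"visits": 0, "seconds": 0, "downloads": 0}
        )
        s["visits"] += 1
        s["seconds"] += event["seconds"]
        s["downloads"] += event["downloaded"]  # True counts as 1
    return summary

print(usage_summary(log))
```

Summaries like this can flag which officials engage with the data deeply and which merely glance at it, a natural starting point for the surveys and focus groups mentioned above.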
Understanding how public officials use—or don’t use—data can help foster a culture in which data are used to improve policies and programs. Conducting an impact evaluation can reveal how to improve information use (see chapter 4). For instance, you might find that officials lack key measurements, would benefit from comparing different measurements more intuitively, or use information more purposefully when they have more autonomy in their decision-making. Like all government processes, information use needs to be measured before it can be improved.

For a fuller discussion of how to conduct analytics about government analytics, see chapter 7 of The Government Analytics Handbook.

BOX 2.3  What Might Be: Making Administrative Data Work Better

OK. We’re going to come clean. Administrative data are never quite what the analyst expected. They’re typically not created for analytical purposes. Definitions of included data are not always clear, not everything that you’d think would be included will be there, and there are omissions in what is included. Box 1.2 laid out some of the challenges of undertaking coherent government analytics, and the rest of this report tries to showcase the benefits of overcoming those challenges. However, there are some simple steps that could be taken to make administrative data work better for analysts, wherever they come from.

First, governments frequently do not have the authority to impose a standardized measurement approach on all organizations or subnational government entities. Yet instead of simply giving in to incoherent measurement, government actors can release voluntary data standards and guidelines and encourage their uptake. Advertising the benefits of having at least some data that are comparable with others increases the likelihood of take-up. This is particularly true when government agencies already publish data online.
Many government agencies publish full listings of their employees on their websites and update them when there is a change in staffing. Together, these listings present a picture of government staffing across organizations and over time. Being able to understand which agencies have the largest turnover, and why, could be very useful in strengthening personnel practices. Yet because the data are published in disparate forms, few governments make all these staffing data publicly available in a consistent and joined-up way. If only all government agencies published their staffing in a standardized way that was archived in a common platform!

Second, for anything that might be used as data, have an organizational default of making it public. While freedom of information laws across Europe have tried to nudge government staff to do this, the culture of individual organizations will always make or break the implementation of these laws. And once you have made something public, ensure that some explanation or definition is available in a place that can easily be found from the data source itself, so that if someone is looking at budget data, they know what they are looking at. Then make clear why the omissions that exist are there.

CHAPTER 3 Understanding Public Administration from Surveys of Public Servants

Q: Bureaucracy is such a strange place, and it’s filled with phenomena that are hard to measure. Should we just give up?
A: No! Use surveys to find out what’s going on inside the public service.

Nicolaes Eliaszoon Pickenoy, The Governors of the Spinhuis.

“The hardest thing about public service is not to lose the ability to give feedback over time. That is why it is necessary to work with people who are unafraid to disagree with you, thus setting a mirror to reality and everyday life.”
– Zuzana Čaputová, President of the Slovak Republic

LOOKING INSIDE THE PUBLIC SERVICE

In the last chapter, we showed you how to look at data you already have with new eyes in order to measure how well government is working. But administrative data don’t always capture the full picture. They can help you measure the machinery of government and see where it needs to improve, but they can’t always tell you how to improve it. How can we understand what is going on inside the public service in order to make it better? Ask public servants themselves what is going on in their organizations!

Public servant surveys are used by governments around the world to better understand public administration—especially the aspects that are trickiest to measure. Think, for instance, of how important it is to understand what motivates public servants: why they do the work they do and what makes them want to improve the way government works. Surveys can also tell you how well public servants are being managed and what their work environments are like, qualities that matter greatly for government effectiveness.

For all their advantages, public servant surveys aren’t as easy to use as other data sources. They are costly, so they need to be well designed, and their data can be challenging to interpret. In this chapter, we’ll show you how to implement public servant surveys well and, most importantly, how to use the results to improve government.

If you’re curious how governments around the world use public servant surveys, see chapter 18 of The Government Analytics Handbook.

PUBLIC SERVANT SURVEYS IN ACTION

Let’s start by looking at some examples of the powerful insights public servant surveys offer. Public servant surveys show that the quality of management varies greatly within countries in Europe.
The following figures show how much management quality can vary even within a single country: map 3.1 shows variation across local governments in Estonia, map 3.2 shows variation across district offices in the Slovak Republic, and figure 3.1 shows variation across public administration organizations in Lithuania. In each country, this means that where public servants work makes an enormous difference to their work environment—and thus to how well they deliver services to citizens. For instance, moving from one local government to another in Estonia might mean the difference between every public servant receiving a formal performance evaluation and no public servant receiving one (map 3.3).

MAP 3.1  Management Quality across Local Governments, Estonia

[Map: management quality index by local government, on a scale from 2 to 4.]

Source: World Bank public servant survey in Estonia, 2022/23.

MAP 3.2  Management Quality across District Offices, the Slovak Republic

[Map: management quality index by district office, ranging from −3.2 to 1.8.]

Source: World Bank public servant survey in the Slovak Republic.

FIGURE 3.1  Management Quality across Public Administration Organizations, Lithuania

[Figure: management quality index for 45 organizations, ranked from lowest to highest, showing organization averages and health and education unit averages. N = 45.]

Source: World Bank public servant survey in Lithuania, 2021.

MAP 3.3  Share of Public Servants Who Received a Performance Evaluation across Local Governments, Estonia

[Map: share of public servants receiving a performance evaluation by local government, ranging from 0 to 100 percent.]

Source: World Bank public servant survey in Estonia, 2022/23.

So what do we do with this information?
To put public servant survey data into action, we make sure managers and decision-makers can access information from the surveys, we tailor actionable recommendations based on the survey results, and we ensure accountability by embedding the survey into an ongoing process of reform. Don't worry if those steps aren't totally clear yet—we're going to break down this process below, with examples from Lithuania, where the Bureaucracy Lab undertook a survey of about 2,000 public sector workers as part of a wider effort to improve youth mental health services. You'll find more information about this effort in the Lithuania case study in chapter 8. But first, let's talk about the foundation of high-quality data and evidence-based reform: survey design.

DESIGNING EFFECTIVE PUBLIC SERVANT SURVEYS

Designing an effective public servant survey means thinking carefully about what you need to measure, what methods will be practical and effective, and how you are going to interpret and report the data you uncover. What kinds of management practices or employee attitudes do you want to measure? Should your survey be administered in person or online? Will it target all employees or just a representative sample? And how will you turn the data you collect into actionable recommendations? These questions are too complex to address in full here, but there are many resources to guide you in designing surveys. Box 3.1 describes three major public servant survey efforts and how they work. The DIME Wiki and other resources in box 2.1 also address different aspects of survey design. And if you decide to design a public servant survey yourself, part 4 of The Government Analytics Handbook is an invaluable resource. Above all, be sure that your survey produces data that can be compared across different contexts.
Public servant survey data can be challenging to interpret because individual statistics don't mean much on their own. For example, how do you know whether 65 percent job satisfaction in a particular organization is high or low, and how to respond accordingly? Without benchmarks across organizations and countries, it's hard to tell what individual statistics mean, or how to act on them. The Global Survey of Public Servants (GSPS) is a particularly useful initiative to harmonize survey questions and methodologies across different countries, so you can see how any given measurement stacks up globally. If you look at figure 3.2, which displays GSPS data from across Europe, you'll see that 65 percent job satisfaction is relatively low in this context. (You can find out more about the GSPS in box 3.1.) In Lithuania, effective survey design led to a high response rate (82 percent) to the Bureaucracy Lab's public servant survey. The team collaborated with government partners to design the survey. We decided to administer it online due to the COVID-19 pandemic and because it targeted a highly technologically literate group of public servants. These design decisions not only helped ensure useful data; they also paved the way for an effective partnership, in which government collaborators encouraged participation in the survey and were prepared to act on its findings. For a comprehensive look at how to design public servant surveys, see chapters 19–24 of The Government Analytics Handbook.

FIGURE 3.2  Job Satisfaction by Country in Europe (share of respondents, percent, by country: Romania, Ukraine, Lithuania, Estonia, Albania, Slovakia, Croatia)
Source: GSPS. Note that this graph was not produced using data sourced by the authors but reproduces data from the GSPS.
Note: GSPS = Global Survey of Public Servants.
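This benchmarking logic is straightforward to operationalize once comparable figures are in hand. The sketch below is a minimal, hypothetical illustration: the country names and benchmark values are invented for the example and are not GSPS data.

```python
# Illustrative benchmarking sketch: where does one organization's job
# satisfaction sit relative to a set of country-level benchmark values?
# All benchmark figures below are invented for illustration only.

def percentile_rank(value, benchmarks):
    """Return the share (0-100) of benchmark values at or below `value`."""
    benchmarks = list(benchmarks)
    at_or_below = sum(1 for b in benchmarks if b <= value)
    return 100 * at_or_below / len(benchmarks)

country_benchmarks = {  # hypothetical job-satisfaction shares, in percent
    "Country A": 95, "Country B": 90, "Country C": 83,
    "Country D": 81, "Country E": 77, "Country F": 73, "Country G": 66,
}

our_score = 65  # the statistic we are trying to interpret
rank = percentile_rank(our_score, country_benchmarks.values())
print(f"{our_score}% sits at the {rank:.0f}th percentile of the benchmarks")
```

With these invented benchmarks, a 65 percent satisfaction share falls below every comparator, which is exactly the kind of context a single number lacks on its own.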
MAKING SURVEY DATA COUNT

So you have high-quality data from public servant surveys—now what? The best way to translate survey data into action is to ensure that surveys are embedded within wider, ongoing initiatives toward public service reform. This means ensuring survey data can be used to answer questions and drive real change. This may sound challenging, but, fortunately, there are many low-cost strategies for using survey data to support evidence-based reforms. First, report your survey results in a way that is tailored to different groups, identify specific strengths and areas for development, and enable managers to explore the information in ways that are useful to them. Make it as easy as possible for managers to get to the heart of the important questions in their organizations and see how they compare to other organizations. In Lithuania, for example, the Bureaucracy Lab was able to identify stigma around mental health issues as a key area for development for school employees and public servants, and management quality as a key area for school principals, both contributing to overall mental health service delivery. By breaking down results according to different organizations and people, it becomes clearer how to address what is and isn't working in public administration.

"Reform is an ongoing process, and, in all spheres of the public sector, there is always room for further upgrades and enhancements, and for the improvement of service provision to citizens."
– Ivan Malenica, Minister of Public Administration of Croatia

In addition to reporting survey results in a clear, relevant way, make sure your survey findings are accompanied by concrete recommendations for action. Tailored recommendations build managers' capacity to enact change and ensure that the data gathered in the public servant survey can be immediately translated into real improvements. Don't make managers struggle to identify concrete actions in response to challenges. In Lithuania, the Bureaucracy Lab was able to tailor different recommendations to schools—such as improving the culture around trainings to combat mental health stigma—and government ministries—such as strengthening school management by establishing a competency framework. When recommendations are addressed to each group's particular situation, they are easier to put into action right away, driving immediate change. Finally, find ways to hold individuals and organizations accountable for survey findings, to increase their motivation to take action. Accountability might take the form of central oversight, or it might mean making survey data available to the public, strengthening the transparency of government organizations. In Lithuania, the Bureaucracy Lab made several recommendations to the Prime Minister's Office to improve data collection and build data management tools to help municipal governments integrate evidence into decision-making. Whatever accountability strategies you develop, don't just consider how to motivate change but also how to measure it to create a culture of reform. The most effective public servant surveys are the product of ongoing collaboration between a technical expert who can interpret survey data and a senior manager who can drive change, as well as an organizational culture of using survey data for constant improvement. The Office of Personnel Management's Federal Employee Viewpoint Survey (FEVS) in the United States is a wonderful example of such a survey.
Public servant surveys can be challenging to implement, but when they're done well, they offer unique insights into the experiences of the people who work in the public service. They help you understand what motivates public servants, how well they're being managed, and what their work environment is like. When survey data are accessible, when decision-makers know what to do with them, and when organizations create a culture of accountability around evidence-based improvements, public administration is transformed, one organization at a time. For lessons about putting survey data into action, including the example of the FEVS in the United States, see chapters 25 and 26 of The Government Analytics Handbook.

BOX 3.1  Existing Survey Efforts

If reading this chapter has piqued your interest in public servant surveys, there are many places to look to learn more! This box will cover three examples that illustrate a range of approaches and available resources. The Global Survey of Public Servants (GSPS) is an initiative to generate survey data from public servants in government institutions around the world. Its goal is to help governments collect more and better data through public servant surveys and to enable them to make better sense of those data by comparing them to global benchmarks. You can find more information about the GSPS on its website (www.globalsurveyofpublicservants.org). In addition, you can use the data dashboard to explore further comparisons of various indicators on the performance of public administrations. The number of governments that regularly survey their employees has grown steadily over the past decade, and the GSPS can help make these surveys as useful as possible.
Most governments want to measure the same concepts with their public servant surveys—particularly concepts related to management quality, employee motivation, and workplace culture. But they measure these concepts in different ways, using different questions and methodologies. This means, for example, that a measurement of leadership in one country can't be compared to a measurement of leadership in another country. The GSPS has pooled the expertise of researchers and practitioners to help harmonize survey data and design a core module of survey questions that governments can use in their own contexts. As figure B3.1.1 shows, this means that concepts like leadership can be compared in different country contexts. When survey data can be compared easily across countries, it's easier to see what measurements actually mean—and what can be improved.

FIGURE B3.1.1  Leadership (Communication of Mission) by Country (share of respondents, percent, for Albania, Romania, and Estonia)
Source: GSPS. Note that this graph was not produced using data sourced by the authors but reproduces data from the GSPS.

The Organisation for Economic Co-operation and Development (OECD) is also leading an initiative to implement a common set of modules in public servant surveys in a range of European Union (EU) Member States. Such efforts will provide common, and more comparable, data from across Europe. Finally, highly targeted surveys of public servants can also be useful, at times. Calling on experts can help you focus your questions, decide what data will help you answer them, and identify obstacles to collecting and analyzing them.
For example, the Sustainability Transition Assessment Framework (STAF) is a tool that the World Bank and external consultants developed to help the EU assess its Member States' capacity to make transformative changes in response to climate change. STAF works by asking governments to prioritize sectors they want to understand better and to designate public sector experts to fill in and discuss surveys. The final results are then shared with the governments. The tool was piloted in Greece and Romania. In both countries, STAF helped the governments think through obstacles to climate change adaptation and disaster resilience, and it identified ways to remove these obstacles through better data collection and analytics, especially by helping different ministries and organizations coordinate their data collection efforts and harmonize their indicators. By learning what data to collect and how to collect them, Greece and Romania can be better prepared to respond to future climate disasters. STAF is a great example of how public service surveys and data analytics can complement one another as approaches to charting a course for public administration reform.

CHAPTER 4  Understanding Public Administration through Impact Evaluations

Q: What should I do next? Can government analytics really help me make my administration stronger?
A: Yes! Especially when it's combined with well-designed impact evaluations.

Joaquín Sorolla y Bastida, Research.

"In nature we never see anything isolated, but everything in connection with something else which is before it, beside it, under it, and over it."
—Goethe

LOOKING FOR CAUSES

Data, like those described in the previous chapters, make it possible to know what is happening in the public service.
Data are like the pixels on a screen or the brushstrokes in a painting. They provide the colors that highlight the differences in the world. But—as we've already emphasized—great colors are not enough for a great picture: they must be combined in ways that allow us to distill what is being illustrated. Good analysis is good illustration, but it is also the ability to tell a story about what is really happening. Purely descriptive analysis does a lot for our ability to illustrate the world around us, and when analysis is combined with evaluation techniques—which aim to show us the causes of what we see in our data—we can begin to understand why we see what we see. This chapter describes the power of implementing impact evaluations in public administration. If you've never heard of impact evaluations before, that's okay! We'll introduce you to what impact evaluations are and explain why they are a useful complement to the analysis of administrative and survey data. In contexts like Europe with extensive digital data and well-established public servant surveys, impact evaluations are the next step toward cutting-edge data analytics. Because impact evaluations are at the frontier of government analytics, it's easiest to give examples to help map this new territory. In this chapter, we'll share some examples from the Bureaucracy Lab's project in Europe that show how an impact evaluation works and how it can serve as the capstone to existing survey and data analytics efforts. And because impact evaluations generate new knowledge, we'll describe how the findings from this project provide policy guidance for the governments we worked with and lessons for the wider region.

WHAT IS AN IMPACT EVALUATION?

What is an impact evaluation?
It's the use of empirical evidence to create a conception of what would have happened if a specific policy or other intervention hadn't been put in place. The best impact evaluations also create evidence for why a policy had the impacts it did. Decision-makers need to know not only whether a given intervention is succeeding; they also need to know how and why policies work, in real time, based on solid data. Put simply, an impact evaluation tells you whether a policy made things any different, as well as which approaches to implementing the same policy worked best for whom and why. The most commonly used technique in impact evaluations is a randomized controlled trial (RCT), an experiment in which the analyst randomizes exposure to a policy, or some aspect of it, to some people (the treatment group) and not to others (the control group). "But I'm not a scientist!" you may be thinking. "How can I run an experiment in my administration—let alone one with actionable results?" Fortunately, running a successful impact evaluation doesn't require working with dangerous chemicals or putting on a lab coat, goggles, and gloves—although being willing to dig into the data and get your hands dirty won't hurt! Impact evaluations build on the other kinds of data analytics discussed in this report. They use administrative and survey data but generate new insights by taking a "trial-and-adopt" approach. This means that decision-makers experiment with different ways of implementing a policy, look at its impacts in their existing data sources, use the data to decide which approach to implementing a policy (or which policy itself) is working best, and then adopt the best version of the policy. So, for example, if you had a policy question, you would set up an experiment by implementing the policy for some members of the population you have in mind, chosen randomly, and not for others (for now).
Then, you would return to your data to understand the impact of the policy by comparing the outcomes in the treatment and control groups. Box 4.1 explains in more detail what these experiments look like and how they help you understand the causal impact of changes. Impact evaluations have many advantages. Using an impact evaluation means you can make refinements to a policy or program early on, before investing more time, money, and energy in it or implementing it on a wider scale. Impact evaluations also help you improve your skills over time. Once you know not only what works but why, you can apply the lessons from one program to the next (figure 4.1). Impact evaluations aren't one-time solutions—they're part of a continual cycle driving transformative change! Wherever you are in the journey of government analytics, conducting impact evaluations should be a goal you work toward in your existing efforts to analyze administrative and survey data. Not only can impact evaluations help decision-makers rapidly improve their interventions; the new knowledge you generate in the process can be used by others too! No matter the scale of the reforms you have in mind, you can't afford not to have impact evaluations in your toolkit. The Development Impact (DIME) Department at the World Bank has developed a highly effective model for conducting impact evaluations that enables decision-makers to draw on cutting-edge research and data analytics tools to maximize the impact of programs and policies. You can read more about that model in box 4.2.

Box 4.1  How Do Impact Evaluations Work?

What's the difference between experiments, impact evaluations, impact studies, and all those other terms you might have heard about? An impact evaluation helps us measure and truly understand the causal impact of a policy or program. When we say "causal impact," we just mean figuring out whether the changes you've implemented are the actual cause of the improvements observed for key outcomes.
An impact evaluation is not just about determining whether a change is successful or not; it is about comprehending the reasons behind its success or failure. For instance, did streamlining administrative processes actually lead to more productive employees? Understanding causal impact is crucial because you want to be certain that the changes you have made are truly responsible for the transformation you see. Decision-makers in government rely heavily on this type of information. They need it to make well-informed choices, grounded in real-time facts and data. Imagine you are a manager trying to improve the productivity of your team. You would want to know whether a specific change in workflow really makes your team more productive. Impact evaluations help you understand what truly works and, most importantly, why it works. There are three main methods for conducting an impact evaluation:

● Experimental evaluation: An experimental evaluation is essentially a well-structured science experiment. In this method, you randomly choose which group of employees will be exposed to the new changes and which will not. This way, you can be very sure that the changes are truly enhancing efficiency, rather than some other cause, since the two groups of employees will otherwise be the same. In more scientific terminology, this is known as a randomized controlled trial (RCT) because the groups are chosen randomly and because one group (the control group) isn't exposed to the changes.

● Quasi-experimental evaluation: A quasi-experimental evaluation is similar to the first method but a bit simpler. You might not always be able to select your groups randomly, but there are other fair ways to test whether the new changes are having the intended effect.
● Non-experimental evaluation: A non-experimental evaluation is a more practical approach in which you look at what happened before and after the changes to see if things improved. You might also be able to compare different groups within the government to better understand the impact of the changes.

Selecting the appropriate evaluation method depends on the specific objectives and circumstances of your administration. Each method has its strengths and considerations, making it vital to choose carefully on the basis of the desired improvements.

FIGURE 4.1  Impact Evaluation Cycle (a cycle linking policy problems, hypotheses, data, impact evaluation testing, evidence on solutions, and policy adoption)
Source: Adapted from Legovini et al. 2018.

Box 4.2  Development Impact at the World Bank

The Development Impact (DIME) Department at the World Bank has developed a co-production model for impact evaluation that joins research to practice by giving people on the ground the power to identify and ask for the knowledge they need to take action now. Development research often takes place in academic settings, far away from the practitioners who need it, and so it doesn't always serve their needs. In DIME's model, by contrast, practitioners draw on cutting-edge research and real-time data to try out policies and programs that are relevant to their own contexts, and they are empowered to make adjustments—right away—to maximize their impact. In partnership with practitioners, DIME researchers use their analytical skills to experiment with different options, guide mid-course corrections, and inform the adoption and scale-up of policies. In the process, impact evaluations generate innovative tools and scientific knowledge that are impactful far beyond their original context. Figure B4.2.1 illustrates how DIME's unique operating model works. If you'd like to learn more, DIME helped put together a handbook (Gertler et al.
2016) that takes readers through the steps to implement an impact evaluation. In addition, the DIME Wiki offers further practical resources about impact evaluations.

FIGURE B4.2.1  DIME's Operating Model (nine steps spanning identifying knowledge gaps, training government clients in impact evaluation methods, capacity building through workshops, matching teams with researchers and subject experts, review and selection of proposals, financial and technical support for implementation, analysis and results, dissemination to countries and global communities of practice, and generating actionable results to empower governments)
Source: Adapted from Legovini et al. 2018.

IMPACT EVALUATIONS IN ACTION

Since you probably still have some questions about what an impact evaluation actually looks like in practice, let's take a close look at an example from the Bureaucracy Lab's work in Lithuania. The impact evaluation we're going to describe was just one element of the Bureaucracy Lab's overall project in Lithuania, which aimed to improve the youth mental health services offered by schools. You can find an overview of the full project in the case study in chapter 8. One of the findings from public servant surveys that the Bureaucracy Lab conducted in Lithuania (which you can read more about in chapter 3) was that public sector personnel—including both school staff and public administrators—displayed high levels of stigma toward mental health and knew little about the extent of the mental health problems youth in Lithuania face. Attitudes and knowledge significantly impact how school staff respond to students with mental health problems.
Training about youth mental health can help change attitudes and strengthen knowledge. In 2021, the Ministry of Health started offering a free, self-paced mental health training module online to address this problem. But as map 4.1 shows, trainings were being underprioritized and underutilized in Lithuanian schools.

MAP 4.1  Variation in Schools' Uptake of Mental Health Training by Region, Lithuania (share participating, percent)
Source: World Bank public servant survey in Lithuania, 2021.

The Ministry of Health partnered with the Bureaucracy Lab to design an impact evaluation that would experiment with different ways to encourage school staff to participate in the online mental health training. The impact evaluation that we decided to conduct was an experimental evaluation, or an RCT (see box 4.1). Schools throughout Lithuania were randomly sorted into three groups. In the first group, school administrators were sent two emails from the Ministry of Health encouraging them to participate in the online training and share the emails with the staff at their schools. The second group of school administrators received exactly the same emails, but the messages came from the Lithuanian Student Union (a nongovernmental organization) instead. The final group (known as the control group; see box 4.1) didn't receive any emails at all. You can see an example of what one of these emails looked like in figure 4.2. The impact evaluation was designed like this so we could learn not only whether advertising the training through email would make a difference but also how this advertising could be most effective. An email message from the Ministry of Health reaches school staff from the top down, while one from the Lithuanian Student Union comes from the bottom up. Even when the emails contain the same message, this different branding might change how administrators and staff read and act on them.
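The mechanics of an experiment like this reduce to two steps: randomly assign units to arms, then compare outcome rates across arms. The sketch below is a hypothetical illustration of those two steps only; the school IDs, arm names, and simulated outcomes are invented and are not the Bureaucracy Lab's data or methods.

```python
# Hypothetical sketch of the core steps of a three-arm RCT:
# (1) randomly assign schools to arms, (2) compare outcome rates by arm.
import random

def assign_arms(units, arms, seed=42):
    """Randomly shuffle `units` and deal them evenly across `arms`."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    shuffled = units[:]
    rng.shuffle(shuffled)
    return {u: arms[i % len(arms)] for i, u in enumerate(shuffled)}

def uptake_rate(outcomes, assignment, arm):
    """Share of units in `arm` whose outcome is True."""
    in_arm = [u for u, a in assignment.items() if a == arm]
    return sum(outcomes[u] for u in in_arm) / len(in_arm)

schools = [f"school_{i}" for i in range(300)]
arms = ["ministry_email", "student_union_email", "control"]
assignment = assign_arms(schools, arms)

# Invented outcomes: whether any staff member at the school started the
# training (simulated here; in a real evaluation this comes from the data).
outcomes = {u: random.Random(u).random() < 0.3 for u in schools}

for arm in arms:
    print(arm, round(uptake_rate(outcomes, assignment, arm), 2))
```

Because assignment is random, any sizable gap in uptake rates between the email arms and the control arm can be attributed to the emails themselves rather than to pre-existing differences between schools.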
FIGURE 4.2  Ministry of Health Email Sent to School Administrators in Lithuania
Source: Ministry of Health of the Republic of Lithuania.

At the end of the experiment, we compared the three groups. We found that about three times as many staff members started the training in the group that received emails from the Ministry of Health, compared with the control group (the group that didn't receive any emails), and about twice as many finished it. Similarly, about twice as many staff members in the group that received emails from the Lithuanian Student Union started the training, and 1.3 times as many finished it. What these numbers show us is that advertising these mental health trainings by email works to increase participation. But school staff don't respond to emails in the same way. The email from the Ministry of Health was more impactful in prompting staff members to start the training and—more importantly—to complete it. What bigger lessons does this impact evaluation have for us? The Bureaucracy Lab is conducting an additional study to help clarify the causal impact of the email from the Ministry of Health—why it worked best to increase participation in the training. But one initial guess is that top-level institutions have a large influence on school staff; staff respond more strongly to messages that come from the top down. These findings can inform the strategy for future communication with school staff—not just in Lithuania but in other comparable contexts as well. Part of the power of impact evaluation is that it creates new knowledge, and public officials and managers get to be part of the process.
This means that officials not only develop a deeper body of empirical evidence to make better decisions; they also get to make discoveries that help their colleagues in other countries, and maybe even around the world! By using impact evaluations to find out what works best, governments can constantly improve.

CHAPTER 5  Understanding Public Administration from the Outside

Q: But how am I supposed to understand government if I don't have permission to use government data?
A: Don't worry! Look outside of public administration and ask citizens how government impacts their lives.

Pieter Bruegel the Elder, The Census at Bethlehem.

"People trust those leaders who show real results of their work, rather than those who just talk about the results."
– Dalia Grybauskaitė, former President of Lithuania

LOOKING AT GOVERNMENT FROM THE OUTSIDE

So far, we've covered ways to measure, understand, and improve public administration by using data that governments collect during their ordinary operations or by surveying public employees directly. But administrative data and surveys can sometimes be inaccessible. For a variety of reasons, directly collecting government data might be infeasible or impractical. Are there ways to understand public administration through data that are available to outsiders? Of course, even public administrators themselves might be constrained in their ability to obtain data from inside the public service, especially for privacy reasons (see box 1.2). So by "outsiders," we mean anyone outside of the permissible space for accessing the government's own data. What can we say about the public service using only data available to outsiders? There are many, many ways of examining government from outside. Here, we'll focus on two data sources that are readily produced and analyzed by private actors as well as governments. First, we'll discuss citizen surveys, which capture how citizens perceive government outcomes.
Then, we'll look at household surveys, which capture what life is like across the private and public sectors in a consistent way. Even though we focus on just two kinds of data, some of the lessons from this discussion are applicable to all "outsider" data: look for evidence of the outcomes of public administration, pay attention to the relationship between the public and private sectors, and use multiple different kinds of data from different sources to build up a more accurate picture of government.

USING CITIZEN SURVEYS TO MEASURE GOVERNMENT OUTCOMES

Surveys that ask citizens about what they think their government is doing well—and where they think it could improve—are one of the most widely used measures of government success. Citizen surveys are used to measure citizens' satisfaction with public services—like the tax administration, schools, and hospitals—and some countries also have longstanding surveys that measure citizens' trust in government. These surveys are usually nationally representative, making them a fantastic source of data about how well the government is doing in the eyes of those it is trying to serve. Citizen surveys aren't just a great, widely accessible data source—they might be even better for some projects than the other kinds of data we've discussed so far. Administrative data and public servant survey data shed light on what's going on inside of the government, but citizen surveys reveal the outcomes of government work—how the government affects the lives of citizens. In addition to capturing vital evidence of government effectiveness, citizen surveys also give citizens a voice, making government more transparent and accountable.
When citizens can express their priorities and see how public administration works to meet them, they are more satisfied and better able to call for improvements.

CITIZEN SURVEYS IN ACTION

How do citizen surveys strengthen measurement and increase accountability? Let’s look at Estonia for an example of citizen survey data in action. (You can learn more in the Estonia case study in chapter 7.) In Estonia, the Bureaucracy Lab used a citizen survey to understand citizens’ satisfaction with their local governments and the public services they offer. The Estonian government had put together masses of data on how well the government was delivering services. But in the end, they wanted to increase citizen satisfaction with the state—improving public services was just a way to get there. The Estonia survey found that local governments with better services have more satisfied citizens overall. But it also found that which region of Estonia you live in makes a big difference to how satisfied you are with your local government (figure 5.1) and how much you think that government decisions match your own priorities (map 5.1). These data show there is significant room for local governments across Estonia to make their work line up with what is most important to citizens—and to engage citizens with the work they’re already doing! The Estonia survey didn’t just show how satisfied citizens are; it also gave them a chance to rank how important different public services are to their overall satisfaction. This gives citizens a voice and increases the transparency of local governments, helping them better understand the outcomes of their actions and how to increase the satisfaction of the citizens they serve.

For lessons about citizen surveys from the OECD Trust Survey, see chapter 28 of The Government Analytics Handbook.
FIGURE 5.1  Citizens’ Satisfaction with Local Government by Region, Estonia

[Scatterplot: share of respondents satisfied with their local government as a place to live, 2021 (percent), plotted against the service quality score (2020 LG-level mean), with the average satisfaction share marked and municipalities grouped by region: Central Estonia, Northeastern Estonia, Northern Estonia, Southern Estonia, and Western Estonia.]

Source: World Bank citizen survey in Estonia, 2021.
Note: Size of dots represents size of LG population in 2020. Service quality score measured on 0–9 discrete scale across 16 service areas. Municipalities classified by region according to divisions employed by Eesti Statistika.

“Without free, self-respecting, and autonomous citizens, there can be no free and independent nations.” – Václav Havel

MAP 5.1  Citizens’ Perceptions of Priority Alignment with Local Governments, Estonia

[Map of Estonia showing, for each municipality, the share of respondents agreeing with the statement “the decisions of those in power at the local government reflect your own priorities,” in bands of 0–20%, 20–40%, 40–60%, 60–80%, and 80–100%, with missing values marked. ‘Don’t Know’ responses are not included in the calculations.]

Source: World Bank citizen survey in Estonia, 2021.

USING HOUSEHOLD SURVEYS TO MEASURE THE PUBLIC SECTOR

We’ve seen how citizen surveys help governments know how citizens perceive their outcomes, providing those on the outside with valuable data for understanding how government works and where it could improve. But what if you need to compare what it’s like working for the government with the private sector? Household surveys are often overlooked for government analytics—but they’re very powerful for understanding how the public and private sectors compare.
To design good policy, governments need to understand what life is like for the people they serve. For just this reason, national statistical authorities all over the world regularly conduct household surveys that capture data about demographics, consumption, and labor market participation. These are some of the most professionally conducted surveys in the world, so they’re an excellent source of high-quality data. If you want to know what life is like in the public sector—and how that compares to life in the private sector—you need survey data that will let you compare these two groups of workers accurately. Household surveys are ideal for this purpose: you can use them to see how public and private sector workers answer the same questions at the same moment in time. These data help us answer tough questions about public sector work. What level of wages will attract talented employees to the public sector without crowding out private sector jobs? Are the public and private sectors competing for workers with particular skills? Is the public sector doing a good job promoting employment for women compared with the private sector? Household survey data can help answer these questions, and more! Best of all, finding household survey data couldn’t be easier—especially within Europe. Eurostat’s European Union Statistics on Income and Living Conditions (EU-SILC) database compiles data on income, poverty, social exclusion, and living conditions, all drawn from household surveys. The Bureaucracy Lab has also used data from over 1,000 household surveys across the world, including EU-SILC, to construct the Worldwide Bureaucracy Indicators (WWBI), a cross-national data set that lets you dig even deeper by comparing the public sector in your country to the rest of the world. Household survey data have some other distinct advantages too.
Unlike administrative data and public servant surveys, which only cover public sector employees, household surveys collect data for both public and private sector workers simultaneously. These surveys are already designed with research objectives in mind, meaning you don’t need to repurpose the data the way you do with administrative data. And household surveys might even capture workers that other data sets miss, like contract workers, giving you a more complete picture of who is working in the public sector. Next, let’s look at some examples of what we can learn about government from household survey data—and how that understanding can help us improve public administration.

HOUSEHOLD SURVEYS IN ACTION

For a deeper discussion of household surveys, see chapter 27 of The Government Analytics Handbook.

What unique insights do household surveys provide? First, household survey data can tell us how big the public sector is and who works in it. Using these data, we can compare the size and importance of the public sector in different regions of a country and track changes over time. For example, the WWBI reveal wide variation by country in the share of the paid workforce employed in the public sector (map 5.2). Similarly, the public sector’s share of total employment ranges from a little over 13 percent in Romania to over 35 percent in Belgium. The WWBI also show that, in every EU Member State, women outnumber men in the public sector but are outnumbered by men in the private sector. This information helps us see how well the public sector is promoting equal employment for women in the labor market as a whole. We can also learn more about the skills and education that workers bring to the public sector, relative to the private sector.
MAP 5.2  Public Sector Employment as a Share of Paid Employment, by Country

[World map: public sector employment as a share of paid employment by country, shaded in bands of 0–10, 11–20, 21–30, 31–40, 41–50, and 51–60 percent (NA where no data are available).]

Source: World Bank Worldwide Bureaucracy Indicators (WWBI). Note that this graph was not produced using data sourced by the authors but reproduces data from the WWBI.

For example, the WWBI show that, on average, 58.9 percent of all EU public sector workers have completed tertiary education (figure 5.2). This is a much higher proportion than the global average of 48 percent, and it starkly contrasts with the private sector in the EU, where only 33.8 percent of workers hold a tertiary degree. These numbers vary from country to country: in Lithuania, 82 percent of the public sector workforce hold a tertiary degree, while in Luxembourg, just over 30 percent of public sector workers have the same level of education. This information can help us identify areas in which the government is competing most intensively for skills with private sector employers and respond accordingly. Finally, we can use household surveys to better understand how public sector workers are paid compared with their private sector counterparts or with similar workers in other countries. For example, the WWBI reveal that public sector workers worldwide enjoy a wage premium compared to similar private sector workers (figure 5.3). In EU Member States, however, this premium is less than half (7 percent) the global average (21 percent) and has been declining for the past 14 years (figure 5.4). This information can help governments make smart choices about compensation to keep employees motivated and productive. Are you ready to get started using data from outside government to conduct analyses like the ones we’ve demonstrated in this chapter?
Check out the WWBI Dashboard, which allows you not only to compare different countries but also to easily download country profiles for any country or region you are interested in. Data from outside government—including citizen survey data and household survey data—are a great option if you want to do government analytics but don’t have access to government data sources. But even if you do have access to those data sources, you can’t afford to neglect these alternative perspectives on how public administration is working. Without drawing on “outside” data, you’ll never know how citizens perceive the outcomes of public administration and whether the impact of policies is really being felt. You’ll also lack a crucial perspective on the public sector workforce if you can’t see how public employees’ experience compares to their private sector counterparts. Without looking outside government, it’s impossible to see the big picture: how public administration impacts the economy and the people it serves.

FIGURE 5.2  Public Sector as a Share of All Workers with Tertiary Education, by Country

[Bar chart: percent (0–90 scale) by country, covering EU Member States along with EU and world averages.]

Source: World Bank Worldwide Bureaucracy Indicators (WWBI). Note that this graph was not produced using data sourced by the authors but reproduces data from the WWBI.
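The wage premium statistics reported above boil down to comparing average wages between public and private sector workers observed in the same household survey. Here is a minimal sketch of that comparison; the worker records and wages are hypothetical, and this computes only a raw, unadjusted premium, whereas WWBI-style estimates additionally adjust for education, experience, and other worker characteristics:

```python
# Sketch: a raw public-private wage comparison from household survey
# microdata. All worker records are hypothetical; real estimates, like
# the WWBI's, compare workers with similar characteristics.

workers = [
    {"sector": "public", "wage": 1450},
    {"sector": "public", "wage": 1600},
    {"sector": "public", "wage": 1380},
    {"sector": "private", "wage": 1300},
    {"sector": "private", "wage": 1520},
    {"sector": "private", "wage": 1250},
]

def mean_wage(records, sector):
    """Average monthly wage for workers in the given sector."""
    wages = [r["wage"] for r in records if r["sector"] == sector]
    return sum(wages) / len(wages)

# Raw premium: how much more (or less) the average public sector worker
# earns relative to the average private sector worker.
premium = mean_wage(workers, "public") / mean_wage(workers, "private") - 1
print(f"Raw public sector wage premium: {premium:.1%}")
```

Because both groups come from one survey, the comparison is apples to apples: same questionnaire, same reference period, same country.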
FIGURE 5.3  Public Sector Wage Premium by Country

[Bar chart: public sector wage premium by country (percent, roughly –20 to 40), covering EU Member States along with EU and world averages.]

Source: WWBI. Note that this graph was not produced using data sourced by the authors but reproduces data from the WWBI.
Note: WWBI = Worldwide Bureaucracy Indicators.

For ideas about other ways to look at government from the outside, see chapters 29–30 of The Government Analytics Handbook.

FIGURE 5.4  Public Sector Wage Premium by Region over Time, 2004–18

[Line chart: public sector wage premium (percent), 2004–18, for Western Europe, Southern Europe, Central Europe and Baltics, Northern Europe, and the European Union.]

Source: WWBI. Note that this graph was not produced using data sourced by the authors but reproduces data from the WWBI.

Country Case Studies

Ambrogio Lorenzetti, Effects of Good Government in the City.

The following chapters contain brief case studies that give an overview of the successes and challenges of government analytics in five countries in Europe: Croatia, Estonia, Lithuania, Romania, and the Slovak Republic. In each of these countries, the World Bank’s Bureaucracy Lab helped governments use analytics to answer pressing, practical questions about public administration, particularly at the local and regional level:
• How can administrations motivate employees who feel underpaid?
• What makes open government tools useful to citizens and public servants?
• What can schools do to confront the youth mental health crisis?
• How can administrations select employees who care about public service and keep them motivated in the long run?
• Why are some local governments much more productive than others, and how can we improve the least productive ones?

We’ve used examples from these countries in the first five chapters of this report to illustrate the different kinds of data available to public sector managers and officials, as well as the different ways they help us understand government. We’ll point you back to those discussions in the chapters that follow. These case studies highlight how the dimensions of government analytics we’ve already outlined fit together in specific contexts. By the end of this section, you’ll be able to imagine the endless ways data can be used to measure and improve government—and hopefully think about your own country context as well. Four of these countries—Croatia, Estonia, Lithuania, and the Slovak Republic—were covered by the EU Measuring and Evaluating Determinants of Public Administration Productivity project that the Bureaucracy Lab undertook between 2019 and 2024. You can find our complete research findings in the relevant country reports, available on the World Bank website.

“Science knows no country, because knowledge belongs to humanity, and is the torch which illuminates the world.” – Louis Pasteur

CHAPTER 6 Croatia: Influencing Productivity through Improved Management

GOVERNANCE CHALLENGES IN CROATIA

Croatia scores low on measures of government effectiveness relative to the rest of the European Union. This means that the Croatian government’s capacity to deliver high-quality public services and develop and implement effective policies is limited. To improve government effectiveness, public employees must be well managed so they can perform at a high level. Motivation levels are low among civil servants in Croatia (figure 6.1). The biggest reason that civil servants give for dissatisfaction with their jobs is low compensation.
But increasing employee motivation isn’t as simple as raising wages: it requires understanding how compensation fits into the big picture of job engagement and performance management.

FIGURE 6.1  Average Motivation Level by Central Government Organization, Croatia

[Bar chart: average reported motivation level for each surveyed central government organization, relative to a baseline of 100 at entry into the civil service; values range from roughly 41 to 127, with most organizations below 100.]

Source: World Bank public servant survey in Croatia, 2023.
Note: Survey respondents were asked to imagine that their work motivation when they started in the civil service was 100 and rank their current work motivation relative to 100. Numbers above 100 indicate that respondents are more motivated than when they started; numbers below 100 indicate that respondents are less motivated. EU = European Union.

HOW CAN GOVERNMENT ANALYTICS HELP?

In 2023, the Bureaucracy Lab conducted a public servant survey targeting staff in the Croatian civil service to better understand performance management from the perspective of public employees (see chapter 3 of this report).
This work aimed to support the World Bank’s Reimbursable Advisory Services (RAS) program Reforming the Wage-Setting Mechanism. By measuring public employees’ satisfaction with how they are managed, we can better understand low levels of motivation in public administration and the role that compensation plays.

The survey targeted 28 organizations within the central state public administration. It was conducted online and received over 9,500 responses. The majority of responses came from civil servants who are not police, and we decided to focus on this group because performance management looks very different for police civil servants.

WHAT DID WE LEARN?

Across organizations, civil servants report that their motivation has declined since they joined the service. Civil servants in only three organizations are more motivated than when they started. Wages are the most popular reason civil servants give for their level of motivation, whether low or high. Concerningly, many younger civil servants report that they intend to leave the civil service within 2 years. And compared with similar countries, Croatia has a much higher overall percentage of civil servants who intend to leave: 28 percent, compared with 11 percent in Lithuania, for example. Staff turnover is a significant negative side effect of low motivation levels. It can cause disruptions to service delivery, and it comes with additional costs to replace staff members. It can also make staff members who remain feel overloaded, contributing to burnout. How much does compensation matter for motivation? Croatia has the lowest salary satisfaction rates when compared with similar countries: just 15 percent of respondents are satisfied with their salaries, compared with 86 percent in Romania. But compensation is only part of the picture. The level of job satisfaction in Croatia is 54 percent, which is much higher than the level of salary satisfaction.
What’s more, civil servants perceive that salaries in the private sector are higher than they actually are. When we looked at actual private sector salaries, we found that the gap between them and public sector salaries was smaller than survey respondents thought—and, for workers with only primary or secondary education, there was no gap at all. In fact, based on our survey, satisfaction with other benefits beyond salary is more closely associated with job satisfaction than salary satisfaction is. The main reasons civil servants gave for staying in the service include hard-to-quantify benefits like job security, work-life balance, and predictable working hours. All this suggests that basic compensation is not the only factor that matters to job satisfaction and employee motivation. Many civil servants feel that there is limited opportunity to advance within the service, leading to the desire to leave. Performance evaluations aren’t used to assess and reward performance, and evaluations are often given without feedback for improving performance. Only 1 percent of respondents reported that their evaluations were used for awarding bonuses. And 81 percent of respondents reported that no rewards or recognition had been promised to them for their performance. This leads civil servants to lack confidence that they will be promoted and to attribute promotion more to personal and political connections than to performance. These issues are closely tied to performance management. Of the managers who responded to the survey, 48 percent feel pressured to give staff members higher ratings than are justified, and only 26 percent feel that evaluations are taken seriously enough. Managers also reported feeling underprepared to address performance issues within their teams.

WHAT CAN WE DO?
To effectively motivate and retain civil servants in Croatia, compensation needs to be promoted as part of a broader employment package that restructures and improves the understanding of benefits, like job security and work-life balance, that are important to employees. Helping managers to communicate more clearly with their staff is an important starting point, and annual staff surveys—similar to the Bureaucracy Lab’s public servant survey—can help monitor employee perceptions of these additional benefits and relevant changes in them over time. Improving how performance management is undertaken in Croatia is crucial to increasing employee motivation. In particular, managers need support in using performance evaluations more effectively and in using both extrinsic and intrinsic incentives to reward performance. Managers also need support addressing staff underperformance in a fair and timely manner. If performance management becomes a key element of managers’ work, civil servants will be more satisfied with their jobs and less likely to leave them, resulting in a more effective public administration overall.

“It is evident that the form of government is best in which every man, whoever he is, can act best and live happily.” – Aristotle

CHAPTER 7 Estonia: Influencing the Local Policy Environment with Data

GOVERNANCE CHALLENGES IN ESTONIA

The quality of the services citizens receive from their local governments varies widely across Estonia, leading to dissatisfaction. One reason this problem persists is that it’s challenging for citizens and policy makers to see how different local governments compare with one another.
In 2020, through the Open Government Partnership, the Estonian central government created the Minuomavalitsus (My Municipality) dashboard, an online tool that compares levels of service delivery across 79 local governments each year (map 7.1). The dashboard uses expert assessments of everything from overall governance to the condition of public libraries to score each local government along 20 dimensions of service delivery. For the dashboard to be effective, however, citizens need to use the information it provides to engage with public officials about their priorities, and public officials need to use it to improve the services local governments deliver.

MAP 7.1  Level of Governance Quality by Local Government in Estonia, 2021

[Map of Estonia: “Valitsemine 2021” (governance, 2021), showing each local government’s service level (“Teenuse tase”) on a 0–9 scale; “Ei hinnata või andmed selgumisel” marks areas not assessed or with data pending.]

Source: Screenshot of Minuomavalitsus dashboard, Ministry of Regional Affairs and Agriculture, Estonia, https://minuomavalitsus.ee.

HOW CAN GOVERNMENT ANALYTICS HELP?

In May 2020, the Bureaucracy Lab began to work with the Ministry of Finance in Estonia to better understand how citizens and public officials were using the Minuomavalitsus dashboard—and how to make it more effective. We wanted to understand the dashboard in three ways:
● First, we wanted to understand how well the dashboard works as a measurement tool. Do the dashboard’s scores for the quality of service delivery line up with how satisfied citizens and local officials are with those services?
● Second, we wanted to understand how well the dashboard works as a diagnostic tool. Is the dashboard a useful way for officials to understand citizens’ priorities?
● Finally, we wanted to understand how well the dashboard works as a policy tool. Are citizens and local officials making good use of the dashboard to inform policy conversations? If not, why not?
To answer these questions, we needed to gather data about how citizens and public officials thought about their local governments, and how the Minuomavalitsus dashboard figured into their assessments. We used two surveys to accomplish this: a citizen survey targeting a random sample of citizens throughout the country (see chapter 5 of this report) and a public servant survey targeting a random sample of officials in 79 different local governments (see chapter 3). We also analyzed data from the dashboard website itself.

WHAT DID WE LEARN?

Estonian citizens are moderately satisfied with their local governments: 54 percent of respondents to the citizen survey reported that they were satisfied with the services their local government provides. But levels of satisfaction vary significantly across the country, and local governments that have higher scores on the dashboard also have more satisfied citizens. In other words, there is a link between the quality of local government services and citizens’ satisfaction—which means the dashboard is a useful measurement tool. On the other hand, most Estonian citizens feel that their local government only reflects their priorities in some areas. This isn’t because public officials aren’t interested in citizens’ priorities: the public servant survey showed that officials want their priorities to match those of the citizens they serve. Nevertheless, there wasn’t any alignment between the priorities public officials reported in the public servant survey and those that citizens reported in the citizen survey. All this suggests that there is a lot of room for local governments to better engage with the priorities of citizens. What’s more, in the public servant survey, officials in the same local government often gave very different assessments of the performance of different sectors. Public officials seem to lack accurate information, both about how well local governments are performing and about what citizens’ priorities are.
This suggests that the dashboard is still being underutilized. When it comes to use of the Minuomavalitsus dashboard, only 4 percent of respondents to the citizen survey were aware of its existence. But the website data show an increase in visits when the dashboard was being advertised, suggesting that citizens are responsive to conscious efforts to promote the dashboard. Among public officials, dashboard use is more established. Half of the respondents to the public servant survey had used the dashboard at least once, and the majority of those who used it did so multiple times per year. This is a real success for a relatively new initiative. Still, the adoption rate varies widely across local governments, from 100 percent of officials using the dashboard in some areas to only 9 percent in others. Finding the local governments that haven’t adopted the dashboard and convincing them that it is useful is therefore crucial, and the survey data pinpoint exactly where the central government needs to work. The survey also suggested that the work environment in different local governments makes a big difference as to whether the dashboard is being used or not. This led us to ask a new question: What leads some local governments to use the dashboard extensively and others to almost never use it? Building on the Croatia example (see chapter 6), our public servant survey found that motivation levels also matter for investing in novel sources of information. Like in Croatia, public officials’ motivation in Estonia has decreased over time in most local governments. The reasons officials gave for lower levels of motivation include inadequate compensation, low-quality management, and unsatisfactory service conditions. Building an environment in which officials feel motivated to search out new sources of information, such as the dashboard, will require addressing these broader issues.
Of course, public servant surveys rely on self-reporting by public officials. One remaining question is whether the dashboard has led to objectively measurable improvements in the quality of public services. The Bureaucracy Lab worked on an impact evaluation to study whether the dashboard has actually affected the decisions that public officials make (see chapter 4). We used machine learning to analyze the minutes from the meetings of local councils to see how the topics of these conversations evolve and whether they do so in response to the dashboard. The hope is that, in the future, an automated system may even be able to update the dashboard based on the policy priorities discussed in meetings.

WHAT CAN WE DO?

Since targeted outreach efforts have been effective so far, a comprehensive awareness campaign could help increase citizens’ use of the Minuomavalitsus dashboard. It would also be easy to implement, given Estonia’s high level of digital literacy. Partnering with civil society organizations and gathering regular feedback will be essential to getting the dashboard into the public eye. Most importantly, making the dashboard a part of public policy by actively showcasing how it influences decisions and improves outcomes can help demonstrate its value to the public. As we discuss throughout this report, it isn’t enough to give people information without creating a culture in which evidence is used to drive reform (see box 2.1 for ideas about how to measure information use in government). Management quality varies across local governments, with an impact on how effectively the dashboard is used by public officials in their day-to-day work. For instance, only 58 percent of officials reported that they had a formal performance evaluation during the previous 2 years. But the public servant survey showed a strong culture of collaboration within local governments and a strong responsiveness to trainings.
Performance management initiatives offer an opportunity to help build public officials’ capacity to use the dashboard more effectively. Trainings focused on actively using the dashboard could help public officials think about how it figures into the decisions they make on a day-to-day basis, as well as how to better engage citizens in those decisions. These trainings could be further supported by additions to the dashboard to support officials in interpreting the data, understanding their relevance, and making action plans in response (see chapter 3). And workshops could also provide the basis for a long-term plan to improve and expand the dashboard by keeping it grounded in the needs of the public officials who use it. The Bureaucracy Lab’s project in Estonia made extensive use of survey data, but Estonia’s pioneering e-government program has also endowed it with an enormous amount of high-quality administrative data that can be used for government analytics (see chapter 2). In box 7.1, we explore how Estonian health care data are being used to drive reforms.

Box 7.1  Using Administrative Data to Improve Health Care in Estonia

Though it is not an official part of the project that underlies this report, the Bureaucracy Lab’s work on health care in Estonia complements its work on the Minuomavalitsus dashboard. This work showcases the power of repurposing administrative data for improving service delivery. Many Estonian citizens, especially the elderly, face chronic conditions like hypertension and diabetes that require consistent, high-quality care from a primary care provider.
In response, the Estonia Health Insurance Fund (the single-payer national insurance fund) has implemented a program called Enhanced Care Management, which aims to improve primary care for these patients by offering sustained coaching to doctors. The World Bank’s Development Impact (DIME) Department designed an impact evaluation to assess the program’s effectiveness and improve its design. Repurposing administrative data was key to the research team’s strategy for evaluating the program. The team looked at a decade’s worth of insurance claims data to measure patients’ health care utilization, doctors’ compliance with standards for quality care, and several health outcomes, including hospitalization and mortality. By looking at these data, the team could identify changes in behaviors and outcomes that correlated with doctors’ participation in the program. Using these data, the team was able to identify that the program has had a big impact on patients’ survival probability: mild-risk patients’ mortality risk declined by 1.6 percentage points! The precision of this estimate was only possible because of the repurposing of administrative data. In the process, the research team also developed recommendations for improving the Estonia Health Insurance Fund’s performance incentive system. The former system had paid doctors a bonus if they met primary care targets, but it didn’t account for the fact that some doctors had much younger, healthier patients overall, and it didn’t reward doctors for making significant performance improvements if they scored far above or far below the targets. The research team developed a new weighting scheme for incentives to address these problems, which the Estonia Health Insurance Fund was able to put into practice right away.
By using already-existing administrative data, DIME was able to assess government effectiveness in improving health care services for citizens and support the Estonia Health Insurance Fund in making changes on the ground to immediately improve outcomes.

CHAPTER 8 Lithuania: Confronting Bias with Survey Data

GOVERNANCE CHALLENGES IN LITHUANIA

Lithuania is facing a youth mental health crisis. In 2018, 24 percent of students in grades one through nine were bullied, 40 percent experienced low levels of psychological well-being, and 24 percent thought about suicide. The COVID-19 pandemic increased the challenges young people face. But the mental health crisis looks different in different schools and in different regions of the country (map 8.1). The government of Lithuania made it a priority to address this crisis—but first they needed to be able to measure it.

MAP 8.1  Share of Students Who Experienced Mental Health Challenges by Municipality, Lithuania, Fall 2020. Share (percent). Source: World Bank public servant survey in Lithuania, 2021.

HOW CAN GOVERNMENT ANALYTICS HELP?

Beginning in May 2020, the Bureaucracy Lab set out to study the mental health service delivery chain—the public sector organizations and people at every level who contribute to providing mental health services to young people—to find out how it could be strengthened. Understanding how all parts of the service delivery chain function—with different results in different regions of the country—requires microdata specific to individual managers and organizations. We decided to focus our efforts on schools because they are where reforms can have the biggest impact on youth mental health. At school, young people interact extensively with public sector employees (such as teachers, psychologists, and administrators), and school is where mental health problems can be most easily detected and prevented.
Even in schools, the mental health service delivery chain is complex. It involves not only teachers, school staff, and principals but also managers and officers in organizations at the municipal and national levels. We started by breaking the delivery chain down into pieces that could be measured. We wanted to study whether schools were providing adequate mental health services to students—the outcome of the delivery chain—and the resources available to meet this goal. But we also wanted to understand the organizational and individual factors that affect how well schools and the organizations they work with accomplish this. We decided that these factors—including the attitudes, behaviors, and knowledge of public sector workers; their work environment; and the quality of their management—would be easiest to measure in a public servant survey (see chapter 3 of this report).

With government partners, the Bureaucracy Lab designed and implemented a survey of about 2,000 public sector workers in Lithuania, who worked in 230 schools and 83 public administration organizations involved in mental health service delivery. Overall, the survey covered 43 municipalities distributed throughout Lithuania. The team collaborated with government partners to design the survey and decide what it should ask. We decided the survey should be administered online, both in response to the COVID-19 pandemic and because the people we surveyed had a lot of experience using technology. Because of these design decisions, the survey achieved a very high response rate of 82 percent. Working on the survey with government partners also helped build relationships and investment that encouraged participation in the survey and paved the way for action based on its findings.

WHAT DID WE LEARN?

The public servant survey measured the youth mental health crisis in detail. We saw significant variation across the country in the share of students experiencing mental health challenges (map 8.1).
We also learned that most students aren't receiving help with these challenges—only 12 percent of students received help from their school. The surveys revealed that school staff and public administrators display high levels of stigma toward mental health problems and have little knowledge about the mental health challenges young people in Lithuania face. This may make it difficult for staff to recognize mental health problems in students and may limit their actions in response. Programs to reduce stigma and increase knowledge, like trainings, are underutilized or not strategically implemented.

The surveys also revealed low levels of collaboration among school staff, between staff and parents, and between staff and public administrators. All these kinds of collaboration are necessary for effectively implementing mental health programs and ensuring that students receive consistent help and appropriate referrals from teachers, psychologists, and parents.

Finally, the surveys revealed a correlation between the overall quality of management in schools and the mental health services offered to students. School principals have significant power to set the mental health agenda in schools, and better overall managers might also be better at recruiting and retaining mental health staff. Management quality varies widely across schools.

WHAT CAN WE DO?

Training regarding youth mental health is an effective way to reduce stigma and increase staff members' knowledge. In 2021, the Ministry of Health in Lithuania started offering a free, self-paced mental health training module online to address this problem, but 71 percent of school staff were not familiar with it, and only 46 percent planned to participate in it. Improving the training's content and design and incentivizing participation could help improve its uptake.
As part of its work in Lithuania, the Bureaucracy Lab designed an impact evaluation to test a low-cost way of encouraging school staff to participate in the online training. We experimented with sending emails advertising the training to school staff and found that an email from the Ministry of Health was impactful in prompting staff to complete the training. You can read more about how that impact evaluation was implemented in chapter 4 of this report.

Improving teamwork and collaboration in schools is also essential to strengthening mental health service delivery. Many respondents to the survey, for instance, did not even know whether their school had a psychologist. Collaborative budgeting or other coordination programs at the municipality level might bring together schools and community members in a region to work on mental health issues. At the national level, standardized protocols addressing mental health should be developed to guide schools more consistently.

Our evidence also suggests that strengthening the management quality in schools where it is lacking can strengthen mental health services. The Ministry of Education, Science and Sports in Lithuania has already been developing a competency framework for evaluating school principals, offering an opportunity for reform. Training and capacity-building programs, as well as better incentives, may help improve management at the schools that need it most.

Finally, improved government analytics are essential to monitoring the youth mental health crisis to see whether progress is being made. The Bureaucracy Lab used data from a public servant survey to conduct analytics in Lithuania because the available administrative data weren't granular enough—they covered schools but not individual students and staff. Collecting more granular data would enable more detailed analysis of the impact of policies.
Replicating the public servant survey in future years would also enable the government to measure change over time, and developing a data-management dashboard would help make all these data easily available to decision-makers. Finally, embedding impact evaluations in policy making—like the evaluation the Bureaucracy Lab implemented as part of this project—can help decision-makers understand why reforms are effective, maximizing their impact.

CHAPTER 9 Romania: Restructuring Recruitment to Improve Public Administration

GOVERNANCE CHALLENGES IN ROMANIA

Romania faces challenges to economic development, especially related to poor service delivery and inadequate infrastructure. For example, 40 percent of Romanian 15-year-olds are functionally illiterate, and many rural communities lack access to electricity, piped water, and sanitation. These inequities are partly the result of weak public administration. When public organizations underperform, citizens don't receive the services they need, while financial support from the European Union goes unused.

The Government of Romania has been engaged in public administration reform to professionalize and depoliticize public organizations. One challenge it has encountered is that the majority of public employees are contract employees, who are managed separately from civil servants. This makes it difficult to know even how many employees are working in the public sector, let alone how they are being managed. On top of this, vacancies in management positions are high, making management reforms difficult. Most public servants in Romania stay in their jobs for a long time, however, which means that increasing their motivation is a key challenge for improving public administration.

HOW CAN GOVERNMENT ANALYTICS HELP?
The Government of Romania needed to gather data about the reality for public employees on the ground to begin mapping human resource management practices in different public administration organizations and see how they affected employee motivation. Their need to understand what was going on inside organizations made a public servant survey an ideal tool (see chapter 3 of this report).

A World Bank team supported by the Bureaucracy Lab designed and implemented the Romania Public Administration Employee Survey, which was conducted between June 2019 and January 2020 in 81 public administration organizations in Romania. The survey was an activity under the World Bank program Reimbursable Advisory Services on Developing a Unitary Human Resources Management System within the Public Administration in Romania (P165191). Over 6,000 interviews were completed, both in person and online. The Bureaucracy Lab designed the survey on the basis of the Romanian government's priorities for reforming human resource management practices. The survey questions measured public servants' attitudes and perceptions toward two areas of management in particular: recruitment and motivation, which included issues related to promotion, performance management, and compensation.

WHAT DID WE LEARN?

There is wide variation in motivation levels across public administration organizations in Romania. Both good recruitment and effective management play a role in increasing motivation. Recruiting the right people to work in public administration is the cornerstone of a productive and effective government. While public servants need a broad array of skills to do their jobs effectively, they also need to be motivated by the mission of public service. Without this cornerstone in place, other reforms to human resource management will be less effective. Candidates for public sector jobs in Romania are screened with a written exam that tests their legal and technical knowledge.
This knowledge is important, but it doesn't guarantee that a candidate will be a good fit for a job. Interviews should be an opportunity for learning about candidates' socioemotional skills and motivation as well, but in the public servant survey, only 19 percent of respondents said that they were asked questions about all these topics during their job interviews. Even though the law requires a more balanced recruitment process, the de facto reality is that only a few skills are prioritized.

Asking public servants about recruitment also reveals that rigorous procedures to ensure meritocratic recruitment might still be falling short. Some public servants, especially those who work in local or territorial organizations, feel that jobs are not advertised widely enough to attract a competitive pool of candidates. And as many as 24 percent of respondents to the survey reported that they were not formally assessed against other candidates when they applied to their current positions, with 23 percent even admitting that exam or interview questions are leaked to preferred candidates.

Low competition for jobs not only prevents public administration organizations from recruiting the best people; it also correlates with low motivation and job satisfaction. Only 7 percent of respondents ranked public service as their top reason for choosing a career in public administration, compared to 44 percent who were attracted by job security. What's more, 88 percent of respondents said they see their own mission as either somewhat or strongly misaligned with the mission of their organization. These numbers suggest organizations aren't recruiting candidates motivated by public service, and they aren't aligning individual employees with their missions.

Other survey data suggest missed opportunities for increasing employee motivation through management.
A quarter of respondents reported that they were not involved in setting the objectives for their own jobs, representing a missed opportunity for aligning public servants with their organizations' missions. Performance reviews are also not motivating: almost 95 percent of respondents reported that they received the highest score in their review, meaning that job performance isn't being measured and used to award promotions and pay raises. When respondents were asked how to improve the performance review system, the two most popular responses were to make the performance ratings more realistic and to give more feedback on performance. Better motivation was also correlated with good interpersonal relationships in the work environment, high levels of trust among colleagues, and the quality of leadership in the organization.

WHAT CAN WE DO?

Increasing the quality of recruitment of public servants doesn't have to be expensive. In fact, in the public servant survey, salary was not a major reason respondents gave for choosing a public sector job in the first place, and higher salaries weren't associated with higher levels of motivation. Instead, public servants who are oriented toward the mission of public service are motivated by realistic feedback and performance-based career advancement, an amicable work environment, and good leadership that creates these conditions. Testing for these features in interviews and then ensuring they are part of a manager's role are clear steps for the Romanian government to take.

Change can begin right away. A more meritocratic recruitment process should screen all candidates for public service motivation. Managers can provide more and better feedback to employees, especially related to professional development, and better training in people management can support them in doing so.
Finally, managers can focus on creating a more amicable work environment and on ensuring that incentives are clearly tied to performance reviews. Moving toward an organizational culture in which public servants feel valued for their work will also move Romania's public administration toward greater productivity—as well as better services for citizens and stronger long-term development.

CHAPTER 10 The Slovak Republic: Benchmarking Local Government Performance

GOVERNANCE CHALLENGES IN THE SLOVAK REPUBLIC

The Slovak Republic scores at the lower end of the spectrum on measures of governance quality in Europe. Only 4 out of 10 citizens in the Slovak Republic consider the quality of public services to be good or very good, compared with 9 out of 10 in the Netherlands (Mackie, Moretti, and Stimpson 2021). Improving management in public administration is a key way to improve the capacity of the government to deliver services to citizens. The quality of governance varies within the Slovak Republic too, meaning that service delivery and the quality of public sector jobs are different depending on where in the country you live (map 10.1). To improve the public administration, the Government of the Slovak Republic needed to understand why some district offices across the country are more productive than others.

MAP 10.1  Government Effectiveness Scores by Region in the Slovak Republic, 2021. Source: European Quality of Government Index, 2021. Note that this map was not produced using data sourced by the authors but reproduces data from a distinct report.

HOW CAN GOVERNMENT ANALYTICS HELP?

The Slovak Republic already has high-quality, granular administrative data that can be repurposed to analyze public sector productivity and improve the management of public administration (see chapter 2 of this report).
The Bureaucracy Lab worked with the Ministry of Interior of the Slovak Republic to repurpose these administrative data in order to understand variation across the 72 district offices that coordinate and oversee local governments. These district offices play a crucial role in service delivery, so improvements to their productivity could have an immediate impact.

We focused on repurposing three key sources of administrative data for government analytics. First, we obtained a data set from a case-management system called Fabasoft that contained over 6.5 million individual cases from January 2015 to April 2021. We obtained a second data set from a database called Cezir that contained over a million business licensing cases processed between January 2015 and December 2019. These data sets had different features, but both of them let us see the progress of cases over time. Finally, we scraped job postings from June 2017 to November 2021 from the centralized public job portal in order to create a data set for understanding hiring practices.

To complement these administrative data sets, we also developed and implemented a public servant survey in collaboration with the Ministry of Interior (see chapter 3). The survey targeted 5,800 public officials working in district offices. Unfortunately, we weren't able to gather much data with this tool, due to limited response (see box 1.2).

WHAT DID WE LEARN?

Using the Fabasoft and Cezir data sets, we were able to estimate the monthly productivity of district offices by looking at the average number of cases these offices, and the officials working in them, handled. All of our estimates showed wide variation in productivity across district offices. For instance, each public official in the most productive offices completed more than twice as many cases per month as those in the least productive offices!
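The core of this kind of benchmarking is a simple aggregation: count cases closed per official per month, then average within each office. The sketch below shows the idea on invented records; the office names, field layout, and numbers are illustrative assumptions, not drawn from the Fabasoft or Cezir data (and for simplicity it ignores official-months with zero closed cases, which a real analysis would have to account for).

```python
from collections import defaultdict

# Hypothetical case records: (office, official, month the case was closed).
cases = [
    ("Office A", "o1", "2021-01"), ("Office A", "o1", "2021-01"),
    ("Office A", "o2", "2021-01"), ("Office A", "o1", "2021-02"),
    ("Office B", "o3", "2021-01"), ("Office B", "o3", "2021-02"),
]

def monthly_cases_per_official(cases):
    """Average cases closed per official per month, by office."""
    closed = defaultdict(int)  # (office, official, month) -> cases closed
    for office, official, month in cases:
        closed[(office, official, month)] += 1
    per_office = defaultdict(list)
    for (office, _official, _month), n in closed.items():
        per_office[office].append(n)
    return {office: sum(ns) / len(ns) for office, ns in per_office.items()}

productivity = monthly_cases_per_official(cases)
```

Comparing the resulting per-office averages (and the spread of the underlying official-month counts) is what reveals both the between-office and the within-office variation discussed in the text.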
This means that you could be waiting twice as long to receive a business license just depending on where in the country you live. At the same time, our analysis revealed that different districts are better at some tasks than others. It isn't enough to label some districts as "productive" and others as "unproductive"—we need to understand which tasks each district excels at and where there is room for improvement.

We expected to see productivity differences between district offices. But it surprised us to learn that there is actually more variation in productivity within district offices than between them! In any given office, there are individuals who are as productive as the most productive office and individuals as productive as the least productive office. This means that, as a citizen, the quality of service you receive from the government doesn't just depend on where in the country you live—it depends on which public official ends up handling your case. At the same time, this means that every public official has something to learn from their colleagues.

Discovering that individual public officials are key drivers of the productivity differences across district offices transforms how we think about improving productivity. Building a team of highly capable individuals in a supportive work environment is the key task for public sector managers. And this means that investing in recruitment, improving personnel management, and reducing turnover are key levers for change. For instance, we observed from the productivity data that public officials are substantially less productive in their first 6–8 months of employment than officials who have been working in the district office for longer. This implies that officials learn a great deal on the job from managers who are good mentors. It also implies that turnover comes with significant costs—if offices can't retain employees, they won't see future gains in productivity.
Identifying and learning from the managers who already do this well is a natural next step.

Since the data gathered in our public servant survey were limited, we used the data set we created from the public job portal to begin exploring what changes to the recruitment process might have the biggest impact on the makeup of district office workforces. We found that competition for jobs varies greatly across the country: the most competitive district has, on average, about seven times as many applicants for each open position as the least competitive district. Competition for managerial positions is especially limited. But crucially, we found that the productivity of district offices increases as competition for managerial positions increases. More competition means better managers, leading to increased productivity.

WHAT CAN WE DO?

Small increases in competitiveness for managerial jobs could make a big difference in increasing the quality of management in the Slovak Republic's district offices. This would lead, in turn, to better management of individual public officials, who are the key drivers of productivity in these offices. Our findings suggest that investing in higher wages for management may induce greater competition for these positions, resulting in savings from higher productivity in the long run.

Another area of investment for the government of the Slovak Republic is in data analytics. Because the government already has high-quality data, it can move toward frontier data analytics at very low cost. Some of the obstacles that we encountered in our analysis could be resolved by making more data available to a central analytics team and by integrating different data sets (anonymously) at the individual level (see box 1.2). For instance, the two data sets we worked with in our analysis had different strengths, but we weren't able to integrate them to unlock all their insights.
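Integrating data sets anonymously at the individual level typically means replacing direct identifiers with a shared pseudonymous key before joining. The sketch below illustrates one common pattern, salted one-way hashing of an identifier; every field name, identifier, and the salt here are invented for illustration, and a production system would need proper key management and privacy review.

```python
import hashlib

def pseudonym(national_id, salt="replace-with-secret-salt"):
    """One-way pseudonymous key so records can be linked across
    systems without exposing the underlying identity."""
    return hashlib.sha256((salt + national_id).encode()).hexdigest()[:16]

# Hypothetical extracts from two systems, keyed by the same person.
hr_records = {pseudonym("SK123"): {"office": "Nitra", "title": "senior officer"}}
case_counts = {pseudonym("SK123"): {"cases_per_month": 41}}

def integrate(hr, cases):
    """Inner-join the two data sets on the shared pseudonymous key."""
    return {k: {**hr[k], **cases[k]} for k in hr.keys() & cases.keys()}

merged = integrate(hr_records, case_counts)
```

Because the same salt and hash are applied in both systems, the records line up without either system ever sharing the raw identifier, which is the essence of joining data sets "built on a common set of identifiers."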
The Fabasoft data set let us understand the productivity of individual public officials but not the context of the district offices in which they worked. Because different offices might use the Fabasoft system differently, and some offices might have more complex cases than others, a centralized understanding of the different district office environments in which individuals work is essential. On the other hand, the Cezir data set let us see the progress of business licensing cases across district offices but didn't offer equally granular data about individuals. In the end, to get the fullest picture and deepest insights about productivity across district offices, we needed data sets built on a common set of identifiers, so they could be integrated.

Human resources data, which match individuals to the district offices where they work and explain their job titles and background, were costly and complicated to integrate with the two productivity data sets. Integrating human resources data and productivity data would greatly enhance the Slovak Republic's ability to provide high-quality government analytics to its managers and decision-makers. It would also enhance the work the Slovak Republic's ministry-level analytics teams are already doing (see box 10.1). There is much more to be learned!

Box 10.1  Analytics Teams in Public Administration Worldwide

Governments throughout the world have assembled teams dedicated to government analytics, whose work offers valuable lessons for building an analytics architecture. Analytics units located in the central government—such as in a ministry of finance—are especially well positioned to implement impactful, cost-effective analytics across public administration organizations. Their work can be enhanced by creating analytics units within each organization that adapt analytics tools and interpret results for that organization's particular needs.
As the following examples show, developing a strong analytics architecture is possible in every different country context.

In the Slovak Republic, analytics teams have existed in some ministries for decades, and the process of building units systematically began in 2016. Today, there are analytics units in every ministry and the Government Office of the Slovak Republic. These units collaborate to ensure that recruitment, output, and communication are held to consistently high standards—and to make their work relevant to the ministers they work with. So far, these analytics units have reviewed €22.6 billion in spending and identified millions of euros in savings. They have also driven evidence-based decision-making—for instance, the Institute for Financial Policy experimented with a letter to taxpayers who owe property tax that has increased tax revenues by €2.5 million. The Bureaucracy Lab's experience was that the Ministry of Interior's analytics unit was the most open to novel analytics of all our partners there and was a highly capable collaborator. The Slovak Republic is leading the way in creating such analytics units in Europe.

Other countries have started to build their own analytics teams focused on what we call government analytics. The United States federal government now requires that all federal organizations identify key questions related to their implementation capacity as well as approaches to answering those questions. In Mongolia, the Prime Minister's Accelerator Unit has used administrative data and surveys to better understand predictors of personnel turnover and its impacts. The unit has a team undertaking a business process reengineering activity with the Ministry of Labor and Social Protection.
Initiatives such as these are directly influencing the Prime Minister's approach to managing the public service. In Ghana, the Office of the Head of the Civil Service created a "bureaucracy lab" of its own 4 years ago with the World Bank's support. The lab surveys officials to better understand their experience of specific processes within government, such as onboarding into new jobs and performance appraisal.

Governments like these around the world are building analytics teams to turn an analytical lens on governments' own data and use it to improve public sector efficiency and effectiveness. For most such teams, the impact is immediate and sizeable.

APPENDIX A Project Background

THE BUREAUCRACY LAB

The World Bank's Bureaucracy Lab is a joint effort of the World Bank's Governance Global Practice (GGP) and the Development Impact (DIME) Department that aims to transform the evidence base for public administration reform. The Lab provides valuable insights into the operational design of the public sector by generating improved administrative and survey data on the characteristics of public officials and their organizations and using these as the basis for research and practice. With a focus on innovation, data analysis, and collaboration, the Bureaucracy Lab is at the forefront of public sector reform and improved governance worldwide.

EU MEASURING AND EVALUATING DETERMINANTS OF PUBLIC ADMINISTRATION PRODUCTIVITY

The EU Measuring and Evaluating Determinants of Public Administration Productivity project was funded through the Part II Europe 2020 Programmatic Single-Donor Trust Fund, generously supported by the European Commission (EC) (referenced under TF073353). The implementation of this project was entrusted to the Bureaucracy Lab of the World Bank, working in close collaboration with the EC.
This collaborative effort was based on an agreement signed in September 2019, which outlined the project's scope and objectives, with an implementation time frame through September 2024 (figure A.1). In July 2020, an important amendment was made to the agreement, substantially enhancing the project's reach and impact. This amendment expanded the project's activities, broadening its scope from three to five EU Member States, thus enabling a more comprehensive examination of the issues at hand. Furthermore, the amendment extended the implementation time frame and adjusted the indicative timetable for achieving outputs and results indicators, providing a more flexible and holistic approach to achieving project goals.

FIGURE A.1  Project Timeline: project commencement (September 2019); project extension (July 2020); survey instruments developed (January 2021); Lithuania country report (August 2022); Slovak Republic country report (June 2023); Estonia country report (January 2024); Croatia country report (April 2024); report on the consolidated measurement approach to the core survey. Source: Original figure for this publication.

The overarching objective of this project was twofold. First, it aimed to empirically investigate the personnel determinants and mechanisms that influence productivity within public administration and service delivery units across the five EU Member States involved. Second, it sought to produce comparative cross-country indicators on public administration, spanning all tiers of government. To achieve these objectives, the project relied on a robust methodology rooted in the analysis of micro-level administrative data. Additionally, surveys were conducted among representative samples of administrators and service delivery personnel to gain a nuanced understanding of the factors influencing productivity and identify actionable reforms.
This project represented a significant collaborative effort between the EC, the World Bank, and the participating EU Member States—Lithuania, Estonia, the Slovak Republic, and Croatia—striving to enhance the efficiency and effectiveness of public administration and service delivery within the region through robust measurement and analytics.

BIBLIOGRAPHY

Bjärkefur, Kristoffer, Luíza Cardoso de Andrade, Benjamin Daniels, and Maria Ruth Jones. 2021. Development Research in Practice: The DIME Analytics Data Handbook. Washington, DC: World Bank. https://openknowledge.worldbank.org/handle/10986/35594.

Bosio, Erica, and Simeon Djankov. 2020. "How Large Is Public Procurement?" Let's Talk Development (blog). World Bank Blogs, February 5, 2020. https://blogs.worldbank.org/developmenttalk/how-large-public-procurement.

Charron, Nicholas, Victor Lapuente, and Monika Bauhr. 2021. "Sub-national Quality of Government in EU Member States: Presenting the 2021 European Quality of Government Index and Its Relationship with COVID-19 Indicators." Working Paper 2021:4, The Quality of Government Institute, Department of Political Science, University of Gothenburg, Gothenburg, Sweden.

Fenizia, Alessandra. 2022. "Managers and Productivity in the Public Sector." Econometrica 90 (3): 1063–84. https://doi.org/10.3982/ECTA19244.

Gertler, Paul J., Sebastian Martinez, Patrick Premand, Laura B. Rawlings, and Christel M. J. Vermeersch. 2016. Impact Evaluation in Practice. 2nd ed. Washington, DC: Inter-American Development Bank and World Bank. http://hdl.handle.net/10986/25030.

Hasnain, Zahid, Emoke Anita Sobjak, Iman Kalyan Sen, and Ravi Somani. 2021. Selecting the Right Staff and Keeping Them Motivated for a High-Performing Public Administration in Romania: Key Findings from a Public Administration Employee Survey. Washington, DC: World Bank.
http://documents.worldbank.org/curated/en/099549505202223190/IDU06d6bc25a036690442e09b3209218de6ca804.

Legovini, Arianna, Guigonan Serge Adjognon, Guadalupe Bedoya Arguelles, Theophile Bougna Lonla, Kayleigh Bierman Campbell, Paul J. Christian, Aidan Coville, et al. 2018. Science for Impact: Better Evidence for Better Decisions — The DIME Experience. Washington, DC: World Bank. https://documents.worldbank.org/pt/publication/documents-reports/documentdetail/942491550779087507/science-for-impact-better-evidence-for-better-decisions-the-dime-experience.

Mackie, Iain, Claire Moretti, and Alex Stimpson. 2021. Public Administration in the EU Member States: 2020 Overview. Luxembourg: Publications Office of the European Union.

Rogger, Daniel, and Christian Schuster, eds. 2023. The Government Analytics Handbook: Leveraging Data to Strengthen Public Administration. Washington, DC: World Bank. https://www.worldbank.org/en/publication/government-analytics.

World Bank. 2019. Innovating Bureaucracy for a More Capable Government. Washington, DC: World Bank. http://hdl.handle.net/10986/31284.

World Bank. 2021. Europe and Central Asia Economic Update, Spring 2021: Data, Digitalization, and Governance. Washington, DC: World Bank. https://openknowledge.worldbank.org/handle/10986/35273.

World Bank. 2022. Governance of the Service Delivery Chain for Youth Mental Health in Lithuania: Key Findings from a Public Sector Employee Survey. Washington, DC: World Bank. doi:10.1596/38144.

QUOTATIONS

Quotation from Heiki Loot in chapter 1 is from Matt Ross, “Heiki Loot, Secretary of State, Estonian Government: Exclusive Interview.” Global Government Forum, November 21, 2018, updated April 2, 2022. https://www.globalgovernmentforum.com/heiki-loot-secretary-of-state-estonian-government-exclusive-interview/.

Quotation from Toomas Hendrik Ilves in chapter 2 is from Mike Canning, William D. Eggers, John O’Leary, and Bruce Chew. 2020. Creating the Government of the Future.
Deloitte Center for Government Insights. Boston: Deloitte. https://www.deloitte.com/cbc/en/our-thinking/insights/industry/government-public-services/government-of-the-future-evolution-change.html.

Quotation from Zuzana Čaputová in chapter 3 is from Helena Zdráhalová. 2022. “Čaputová: We Must Not Slacken in Helping Ukraine.” Forum: Charles University Magazine, November 16, 2022. https://www.ukforum.cz/en/main-categories/news/8656-caputova-we-must-not-slacken-in-helping-ukraine.

Quotation from Ivan Malenica in chapter 3 is from Ivan Malenica. 2019. “Speech of Mr Ivan Malenica, Minister of Public Administration of Croatia, at the 37th Sitting of the Congress of Local and Regional Authorities of the Council of Europe.” Strasbourg, France, October 29–31. https://rm.coe.int/speech-of-mr-ivan-malenica-minister-of-public-administration-of-croati/16809885c2.

Quotation from Dalia Grybauskaitė in chapter 5 is from Lynn Harris. 2010. “Female Heads of State: The Chosen Ones.” Glamour, November 1, 2010. https://www.glamour.com/story/female-heads-of-state.

Viktor Hulík, Čumil (Man at Work), Bratislava, the Slovak Republic. Photo by Marco Ebreo, CC BY-SA 4.0.

This report is part of a collection examining how analytics using government microdata is revolutionizing public administration throughout the world. Its focus is on government analytics in the European Union. The collection is based on The Government Analytics Handbook, a comprehensive guide to using data to understand and improve government. The reports in this collection aim to help public servants apply lessons from the Handbook to their own administrations by describing the unique opportunities and challenges for government analytics that arise in different regions. No two regions, countries, administrations, or organizations are alike—that is why using microdata to measure, understand, and improve government is so important!
SKU 33666