Policy Research Working Paper 10361

Toward Environmentally Sustainable Public Institutions: The Green Government IT Index

Michael Lokshin
Eduardo Widmar

Europe and Central Asia Region & Information and Technology Solutions
March 2023

Abstract

This paper proposes a new Green Government IT index to assess the environmentally responsible use of computers and other resources by the information technology departments of government institutions and nonprofit organizations. The methodology used in the paper relies on the established literature on index construction and the existing models for evaluating the environmental sustainability of information and communications technologies. The paper discusses the conceptual and theoretical foundations behind the new index and defines a set of verifiable, comparable, and transparent indicators for index construction. This framework allows for future index revisions as the green agenda evolves. The new index could be the first step before more resource-intensive assessments to inform an organization's long-term environmentally sustainable strategy.

This paper is a product of the Office of the Chief Economist, Europe and Central Asia Region and the Information and Technology Solutions Department. It is part of a larger effort by the World Bank to provide open access to its research and make a contribution to development policy discussions around the world. Policy Research Working Papers are also posted on the Web at http://www.worldbank.org/prwp. The authors may be contacted at mlokshin@worldbank.org and ewidmar@worldbank.org.

The Policy Research Working Paper Series disseminates the findings of work in progress to encourage the exchange of ideas about development issues. An objective of the series is to get the findings out quickly, even if the presentations are less than fully polished. The papers carry the names of the authors and should be cited accordingly.
The findings, interpretations, and conclusions expressed in this paper are entirely those of the authors. They do not necessarily represent the views of the International Bank for Reconstruction and Development/World Bank and its affiliated organizations, or those of the Executive Directors of the World Bank or the governments they represent.

Produced by the Research Support Team

Toward Environmentally Sustainable Public Institutions: The Green Government IT Index

Michael Lokshin and Eduardo Widmar*

Keywords: ICT, Government, Sustainable Development, Green agenda
JEL: C43, Q55, Q58, L86

*The authors are at the World Bank. We are grateful to James Foster for his guidance on the methodological approach used in the paper. We thank Denis Robitaille for his support of this initiative and Frank Batler, Rajan Bhardvaj, Ronald Fortin, Paulo Motta, Frederico Shinohara, and Varun Sondhi for their support of and contributions to the design of the GGIT instrument and for their general comments.

1. Introduction

Over the last several decades, the world has experienced an explosion in information and communication technologies (ICT). ICT plays a critical role in solving major environmental challenges and supporting the transition to a sustainable, greener economy by increasing the efficiency of existing production processes and services, creating new, green value chains, and intensifying the use of scarce resources. At the same time, ICT is one of the fastest-growing greenhouse gas (GHG)-emitting and energy-consuming sectors. Global ICT spending exceeded $4 trillion in 2021 and grew at a 5.5 percent rate in 2022 (Gartner 2022). ICT accounted for at least 5 percent of the world's energy demand in 2015; that demand is projected to reach 20 percent by 2030 (Andrae and Edler 2015, Gupta et al.
2021). Currently, ICT is responsible for up to 3.6 percent of the world's total GHG emissions, and the digital carbon footprint is increasing by 8 percent per year (Belkhir and Elmeligi 2018, Freitag and Berners-Lee 2020). Electronic waste (e-waste) is the world's largest waste stream, often containing materials extracted in conflict-ridden areas and toxic to human health and the environment. The world generated 53.6 million metric tons of e-waste in 2019 (Forti et al. 2020). Given the growing share of ICT in the global economy, identifying the environmental risks and opportunities of ICT is critical for achieving the goals of sustainable green growth. ICT affects the environment over its whole lifecycle in a complex and multidimensional way. Manufacturing ICT components that use rare earth metals leads to air, water, and soil pollution and contributes to half of ICT's global climate change impact (Cabernard et al. 2019). The operation of computers and servers consumes power and contributes to GHG emissions. The disposal of ICT hardware at the end of the lifecycle pollutes the environment. The wide variety of ICT products, such as mobile phones, computers, servers, printers, and electronic components integrated into other products, and the range of materials used in manufacturing these devices make measurement of the environmental impact of ICT a challenging task.1 The pressure from society and the advent of corporate social responsibility induced businesses to integrate environmental impact measures into their reporting systems. Such sustainability reports often comprise indicators measuring environmental and societal impacts like GHG emissions or employee satisfaction with their companies' actions in achieving the climate change agenda (Wallström 2006).

1 See Chen et al. (2020) for an extensive review of the recent literature on the environmental impact of ICT.
Governments and organizations could rely on various indexes measuring environmental performance, such as Environmental Performance Indicators, Key Ecological Indicators, or Green Performance Indicators (Hammond et al. 1995, Ferreira and Pernici 2014), and many others. Although the number of environmental indicators is growing rapidly, the existing sustainability frameworks like ISO 14001, the Greenhouse Gas Protocol, or the Global Reporting Initiative make little connection to Green ICT (ISO 2021, World Resources Institute 2021, GRI 2021). Indicators measuring green ICT are rare (Krumay and Brandtweiner 2016). The impact of ICT is most often measured in terms of GHG emissions or other quantifiable indicators. The main issue with such an approach is that different organizations have different goals and operational practices, resulting in different environmental impacts. Comparing the environmental performance of organizations pursuing different agendas based on their environmental impact could be misleading from the policy perspective. The fact that an organization emits more GHG does not automatically make it less green. A datacenter housing tens of thousands of servers supported by just a few staff might have a large CO2 footprint but, at the same time, be very environmentally efficient in what it does by utilizing the latest energy and waste management technologies.2 The GHG emissions of a ministry of finance's IT infrastructure could be small compared to those of a datacenter, and still that department might rely on outdated equipment and practices that do not comply with modern environmental standards. In most countries, governments are the largest employer, and government budgets represent, on average, about a third of a country's GDP. Universally, ICTs are the primary drivers of improving public sector productivity by optimizing management and organizational processes for greater societal impact (World Bank 2021).
E-government development levels have been improving rapidly across all regions and countries from all income groups (United Nations 2021). Government organizations continue to embrace remote work and hyperconnected public services and to innovate at a quicker pace by adopting new technology solutions for operational and mission-critical needs. GovTech, a holistic government approach to public service modernization, drives the increasing demand for ICT infrastructure in governments. Government IT spending is growing by more than 5 percent annually and reached $500 billion globally in 2021 (Gartner 2022).

2 For example, the server farms of leading cloud server providers, such as AWS, Google, or Microsoft, are at least three times more energy efficient than the median enterprise data center in the US (Oberhaus 2019).

Three main factors drive governments' adoption of green ICT: economic, regulatory, and ethical (Molla et al. 2009). Improving energy efficiency, for example, could reduce operational costs. Requirements for government IT departments to comply with environmental laws are examples of regulatory drivers. The Electrical and Electronic Equipment (EEE) Directive, which controls the disposal of EEE and its consignment to landfills, is an example of such regulation in the EU (e.g., Tansel 2017). Corporate social responsibility and good corporate citizenship practices are examples of ethical drivers of environmentally sustainable IT in government (e.g., Bohas and Poussing 2016). Governments, responsible for setting and enforcing environmental standards across their economies, are also the leading promoters of the green agenda. In that role, governments must demonstrate commitment to sustainable development and raise public awareness by promoting their own green government operations.
For example, the Government Greening Initiative was launched by the governments of the US and Canada in 2021 to enable countries to share best practices to improve green government operations and help meet climate change goals (US Government 2021). The development and implementation of practices that minimize the environmental impact of government ICT infrastructure is a critical element of this process. However, to our knowledge, instruments for monitoring the progress of government IT toward green goals are currently missing. In this paper, we develop a methodology for evaluating the greenness of government IT departments. This approach should increase effectiveness in selecting strategic policy priorities for transforming government IT to environmentally sustainable operations and promote more transparent decision-making processes. We propose a new Green Government IT (GGIT) index that allows assessment of the environmentally responsible use of computers and their resources by the IT departments of government institutions, multilateral development agencies, think tanks, NGOs, and other organizations. Our theoretical and empirical methodology relies on the established literature on index construction and draws on existing models for evaluating the environmental sustainability of ICT. The paper discusses the conceptual and theoretical foundations behind the new index and defines a set of verifiable, comparable, and transparent indicators for index construction. The following section summarizes the recent literature on Green IT measurement. Section 3 describes the specific features of IT systems in government institutions that justify the need for a government-specific Green IT index. Section 4 provides a theoretical framework for the new index. Section 5 discusses the index dimensions and aggregation approaches.
Section 6 presents the approach for the practical implementation of the GGIT Index and develops the survey instrument, and Section 7 concludes.

2. Measuring Green IT: Literature review

Over the last two decades, studies related to Green IT have been consolidating into a well-established research stream (Ribeiro et al. 2021). But despite the importance of ICT for the green transition agenda, the limited quantitative and longitudinal research represents a significant gap in this area (Singh and Sahu 2020). The literature on integrated indexes of Green IT is sparse, and most Green IT metrics have focused on the energy efficiency of the hardware side of IT. Only a few papers have developed frameworks for measuring the environmental impact of software products (e.g., Bozzelli et al. 2014). Several indicators and frameworks were recently proposed to measure the "greenness" of IT organizations and departments. Molla et al. (2009) introduced the Green IT Readiness (G-readiness) model that conceptualizes Green IT from the infrastructure and capability perspective. The analysis in the paper is based on a survey of CIOs of more than 2,100 private companies from the US, Australia, and New Zealand. The paper identifies four components of the Green Readiness assessment: attitudes, policies, practice, and governance. These main components comprise eight subcomponents and 32 indicators. Averaging the components produces the summary G-readiness index to identify areas needing improvement. While the G-readiness index was one of the first to measure the environmental sustainability of IT infrastructure, it is mainly based on subjective assessments of CIOs that might be biased toward presenting their organizations in a favorable way. The G-readiness index is also designed without a clear methodological framework, both in terms of selecting individual indicators and in the choice of the aggregation methodology.
Butler (2011) proposed the practice-oriented Green Information Systems (IS) framework based on five core categories: Business and IS Strategy; Energy Efficiency; Dematerialization; Waste and Recycling; and Green Operations. The paper argues that if practitioners address these five areas comprehensively, lower organizational GHG emissions will result. The framework does not propose an index to compare the green performance of different organizations but rather an approach to reduce GHG emissions. The paper also emphasizes that while a growing number of organizations realize the direct effects of Green IT, the enabling effects of Green IS are proving more elusive. A paper by Uddin and Rahman (2012) develops an energy efficiency and low-carbon-enabler green IT framework for datacenters to reduce electricity consumption and GHG emissions and thus lower the effects of global warming. The framework relies on energy-saving techniques like virtualization and cloud computing. It comprises five phases to properly implement green IT techniques and achieve green data centers. The framework divides data center components into different resource groups and applies green metrics, for example, Power Usage Effectiveness, Data Center Effectiveness, and Carbon Emission Calculator, to assess the performance of individual components and to set benchmarking values as standards to be followed. The proposed approach is specific to datacenters and might not apply to government organizations that use a broader spectrum of ICT equipment and practices. The framework also focuses on measuring specific environmental outcomes and establishing benchmarks that might be challenging to apply to organizations with different business goals. The Sustainable ICT Capability Maturity Framework (SICT-CMF) was developed by the Innovation Value Institute to systematically assess Sustainable ICT capabilities (Donnellan et al. 2011).
The SICT-CMF complements existing approaches for measuring SICT maturity, such as the G-readiness framework (Molla et al. 2009). The SICT-CMF focuses on four key actions for increasing SICT's business value: defining the scope and goal of sustainable ICT, understanding the organization's SICT capability maturity level, systematically developing and managing the SICT capability building blocks, and assessing SICT progress over time. The SICT-CMF recognizes the need for business alignment of the SICT metrics, which vary for different organizations and industries. The metrics could include, for example, total company CO2 emissions and emissions per employee, CO2 emissions from travel, the percentage of materials replaced by ICT, and others. The framework for a particular organization is developed through interviews with main stakeholders to define the key priorities and SICT drivers and to assess the performance of initiatives taken or planned. The results of these interviews are then aggregated and mapped to five standard maturity level categories (Paulk et al. 1993). The SURF Green ICT Maturity Model (SGIMM) is a maturity model on green ICT developed by the Dutch Higher Education and Research Partnership for ICT in collaboration with Vrije Universiteit Amsterdam (Hankel et al. 2015). The SGIMM was designed to help organizations assess where they stand on the Green ICT agenda and to guide actions for improvement. The model covers four domains spanning the negative and positive impacts and aspects of ICT. The "green ICT in the organization," "greening of ICT," and "greening of operations with ICT" domains are not sector-specific, while the fourth domain, covering "greening of primary processes with ICT," should be customized for the purposes of a particular organization. Several organizations in the Netherlands rely on the SGIMM framework (e.g., Hankel et al. 2019).
Given the importance of environmental issues in the current policy agenda and the large and growing share of the ICT sector in both energy consumption and GHG emissions, many models, frameworks, and tools have been developed to assess the environmental impact of ICT. Most of these models focus on specific aspects of energy efficiency. More general models that capture the entirety of ICT, for example, the Green Readiness framework, are often too abstract and impractical. The few models that offer practical approaches to applying Green ICT principles to business processes in organizations are specific to datacenters or large ICT-centered enterprises. To our knowledge, no frameworks have been developed to assess the green state of ICT departments in government and other not-for-profit organizations.

3. Why is public sector IT different?

There are inherent differences in the structure and management of the IT departments of public and private entities. While private businesses serve people as customers with the goal of profit maximization, government organizations serve people as constituents, maximizing value for the public. The public value of ICT in governments lies in the ability of ICT systems to improve government efficiency, improve services to citizens, and advance social values such as inclusion, transparency, and participation (e.g., Chircu and Lee 2005). ICT can also influence the democratic process and voting returns, and as such, the expansion of ICT in government can become a point of contention and subject to impasse and political roadblocks that slow down or outright constrain ICT adoption and implementation (Palmer and Perkins 2012). Government organizations typically lag behind private businesses in adopting modern technologies, including ICT. For-profit firms can readily justify risky investments in innovations for competitive advantage.
The public sector relies on government allocations for its funding, making budgetary constraints more challenging than for private businesses. Once established, the funding also becomes cyclical, which, in turn, induces waste. As a result, ICT projects perceived as risky are less likely to be considered in the public sector (Rocheleau and Wu 2002). Government organizations have less market exposure and, as a result, fewer incentives for productivity and efficiency improvements, but, at the same time, more legal and procurement constraints. For example, acquisitions of hardware and software by government organizations are subject to multiple bureaucratic and legal regulations, making the entire process time- and resource-consuming. Once a budget is set up, it is easier to maintain or renew it to support an established operation than to justify a new one. The private sector can usually offer higher salaries to IT professionals, contributing to high staff turnover in the IT departments of government organizations. The relatively lower wages in the public sector also make it difficult to motivate individual performance. Such a shortage of skills drives the reliance of public institutions on contract staff and outsourced IT functions. Obsolete technologies and processes are ubiquitous in the public sector. For example, three-fourths of the US IT budget supports aging technology, and the increasing costs of such support are shortchanging modernization (The US Congress 2017). The multiple layers of authority typical of the public sector lead to low implementation rates and may hamper innovation, as decisions take longer to be finalized and resourced. Compliance requirements, legacy processes, hardware and software, and policy limitations can make investments in IT difficult for governance committees. Access to IT funding often requires a strong cost-cutting focus, as public funds are usually less flexible.
Public sector IT budgets typically lack the flexibility to carry over or borrow against the future, which is critical to account for benefits that accrue during an asset's life cycle. Government jobs are more stable, and those who stay are not necessarily driving change. And contract staff are usually equipped not with the organization's vision but only with the job of the moment. Lifecycle costing helps demonstrate that even though environmentally and socially preferable goods and services are more expensive upfront, they may be cost-efficient, bringing significant savings over the product's life and at end-of-life disposal. Most government procurements require working with fixed budgets that cannot be carried forward and where the net present value cannot be accounted for. In this case, justifying the purchase of sustainable products and services might be difficult, as it is likely to increase costs compared to standard, traditional equipment. Another feature that distinguishes government IT procurement processes from those in the private sector is that in the public sector, capital and revenue budgets usually fall under the authority of different departments. Because the long-term benefits of sustainable public procurement accrue during a product's life and utilization and to various units within the organization, the departments bearing the capital costs might be hesitant to fund such green investments. These differences in IT practices and processes between the public and private sectors justify the need for a specialized tool to measure the adherence of governments, nonprofit organizations, and multilateral development agencies to the green agenda.

4. Toward a new Green Government IT Index: Theory and implementation

Our methodology for developing the GGIT index is similar to many measurement exercises applied in different fields (e.g., Cameron et al. 2021). We rely on the standard approach in constructing a policy-relevant measure.

4.1 What are we measuring?
The concept of GGIT

We define Green Government IT as the systematic application of environmental sustainability criteria to the production, sourcing, use, and disposal of the government IT technical infrastructure, as well as within the human and managerial components of the IT infrastructure, to reduce emissions and waste in IT, business, and supply chain-related processes and improve energy efficiency (e.g., Molla et al. 2009).

4.2 What is the purpose of the GGIT Index?

The index aims to assess the environmentally responsible use of computers and other ICT resources (Green Computing) by the IT departments of government institutions and other organizations. The index can facilitate comparisons between different organizations in their pursuit of environmental IT sustainability, identify areas for potential improvement, set baselines, and help interpret trends within an organization over time and across organizations or sectors.

4.3 What are the desired characteristics of the index?

Constructing a measure of Green Government IT entails many choices that can appear to be subjective and unrelated to one another.3 A set of desired characteristics for such an index provides the guiding principles that help make these choices explicit and organize them to obtain a relevant and practical measurement tool. The literature on measurement offers a standard set of such characteristics that we modify for the purposes of GGIT (e.g., Alkire et al. 2015).

Simplicity: this characteristic speaks to the simplicity of a measurement approach and the ease with which it might be communicated. Unclear or overly complicated measures discourage IT management and staff participation in constructive policy discussions. Barriers to understanding an index can lead stakeholders to question its usefulness as a policy-relevant tool. On the other hand, a well-understood index allows stakeholders to evaluate its qualities for themselves.
This index property also requires a clear link between the indicators used in an index and the underlying data. Transparency of the Green Government IT index and its components will facilitate its take-up by organizations, promote independent confirmation of its findings, and support the index's credibility.

Coherence: speaks to the authenticity of the index in capturing the phenomenon it aims to measure. One should be able to describe in plain language the underlying concept of what is being measured by the index and articulate why this concept is a credible version of what needs to be measured. In our case, we measure the capability of government IT departments to use technologies and practices to reduce their resource consumption and GHG emissions.

Motivation: this property considers the extent to which the index fulfills the purpose for which it has been constructed. The motivating objectives should guide the index design.

Rigor: this characteristic addresses the aggregation method employed and the axioms that must be satisfied by the index. For example, the aggregate index should not decrease because of improvements in its components. Below, we outline several axioms that the index should fulfill. These axioms ensure that the results derived from the index are robust to allowable changes in its components.

Implementability: this property requires that the index we design can be constructed using available data. This property imposes restrictions on the qualities of the data to be used. Both the purpose and the coherence characteristics require that the data coverage is sufficient across organizations and dimensions. Data should be homogenized to ensure that the individual variables are comparable across organizations.

3 This section draws on a discussion in Cameron et al. (2021).
Replicability: The index should allow comparisons across organizations at a given point in time, and the data requirements should be maintained through time to allow consistent estimates of progress (intertemporal comparisons). An overly volatile index might indicate replicability problems related to the data and/or the aggregation method.

The incentive compatibility criterion concerns the interaction between actions by organizations and the behavior of the index through time. The index should not fall when its components improve between time periods. Some indices are constructed so that the index value of a unit of analysis (an organization or a country) depends on the index values of other units. Then, an organization can improve some or all of the index components and still see its aggregate index value drop. Using such relative indexes might obscure the relationship between the actions of the organization and the index level, thus diminishing the accountability that the index is meant to support. Similarly, the index design can adopt a goal-oriented approach in which progress is measured as the extent of movement toward pre-defined goals. If the goals vary with external conditions, it can be unclear whether progress is due to efforts undertaken by an organization or to changes in the goals. The incentive compatibility criterion postulates that an index should use fixed or absolute targets that link changes in the index to actual progress.

4.4 Axiomatics

Compared to the desired characteristics discussed in the previous section, axioms are more formal and general properties that the index should satisfy. The measurement literature often groups axioms into three categories (e.g., Foster et al. 2013). Invariance axioms indicate what not to measure. Dominance axioms indicate what the index should measure. Subgroup axioms postulate how the index should be broken down or built up by its components and units of analysis.
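The pitfall of relative indexes raised by the incentive compatibility criterion can be illustrated with a small numerical sketch. The organizations, indicator values, and the best-performer normalization below are all hypothetical illustrations, not part of the GGIT design:

```python
def absolute_index(x):
    """Equally weighted counting index: the share of green practices in place."""
    return sum(x) / len(x)

def relative_index(org, cohort):
    """A relative variant that normalizes each score by the cohort's best score."""
    best = max(absolute_index(v) for v in cohort.values())
    return absolute_index(cohort[org]) / best

# Hypothetical binary indicators for two organizations at two points in time.
t0 = {"A": [1, 1, 0, 0], "B": [0, 1, 1, 0]}
t1 = {"A": [1, 1, 1, 0], "B": [1, 1, 1, 1]}  # both improve; B improves more

# A's absolute index rises from 0.50 to 0.75, yet its relative index falls
# from 1.00 to 0.75 because B improved faster: exactly the behavior the
# incentive compatibility criterion rules out.
```

Because the relative variant depends on other units' scores, an organization's measured progress can reverse even when every one of its own components improves, which is why the criterion calls for fixed or absolute targets.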
The GGIT Index should satisfy symmetry, monotonicity, and subgroup decomposability axioms. The symmetry or anonymity axiom falls into the invariance class of axioms and guarantees that the index value is not affected when variable levels are switched. The monotonicity axiom requires the index value to reflect improvements in individual components. This axiom ensures that the index value rises when one component or variable increases and the other components or variables do not fall. This axiom is related to the incentive compatibility criterion since it ensures that an organization is not penalized when it succeeds in improving the index components. The subgroup decomposability axiom ensures that the index can be divided into subindexes and linked to the original index for policy analysis. When the decomposability axiom is satisfied, the aggregate index can be expressed as a weighted average of the sub-indexes representing the main index components. These sub-indexes can, in turn, be expressed as weighted averages of the groups of variables and the individual variables these sub-components consist of. This axiom might help explain why one organization is doing better than another or help understand an organization's progress over time.

5. Constructing the Green Government IT Index

The desired properties and axioms described in the previous section provide a basic theoretical framework within which the GGIT Index can be calculated and compared across organizations and time. This section describes the methodology for the practical implementation of the GGIT index.

5.1 Dimensions of the index

We adopt the lifecycle matrix approach to selecting the dimensions of the GGIT Index. Our index structure is similar to the two-tier structure (categories in which several components are grouped) used in other ICT models described in Section 2 (Hankel et al. 2017). We identify three main pillars of the index: ICT equipment acquisition, ICT operation, and ICT disposal.
In addition to pillars, we introduce six cross-cutting (horizontal) themes of the index: strategy, standards, performance, key performance indicators, governance, and knowledge and outreach. The pillars and themes together form the index matrix that guides the choice of individual indicators. The acquisition pillar assesses whether an organization has processes and practices to comply with environmentally sustainable hardware and software acquisition standards; for example, whether an organization has a policy to procure only Energy Star 4-certified equipment. The operation pillar reflects the environmentally sustainable practices an organization relies upon in its day-to-day operations; for example, whether an organization monitors its energy consumption and sets targets to become more energy efficient, or whether it uses power-saving technologies like thin clients or power management software in its operations. The disposal pillar evaluates the end-of-life practices of retrofitting, reusing, recycling, and utilizing Waste from Electrical and Electronic Equipment (WEEE). Improving the collection, treatment, and recycling of electrical and electronic equipment can improve the sustainability of hardware production and consumption, increase resource efficiency, and contribute to the circular economy (EU 2012). Cross-cutting themes are present in each pillar. The strategy theme assesses whether there is a coherent strategy in each pillar; for example, whether the organization has a well-articulated strategy for green hardware acquisition. The standards theme inquires whether the actions in a particular pillar follow some pre-defined standards; for example, whether the organization integrates the life cycle costing standards (ISO 15686-5) into its procurement process. The performance theme reflects the performance dimension of a particular pillar. For example, an organization might choose to lease out hardware so that it can be utilized at a central location with the highest levels of environmental protection.
The governance theme asks, for example, whether an organization has a department or division responsible for implementing the Green IT acquisition strategy. The knowledge and outreach theme assesses whether an organization conducts training and skill development programs on best practices in sustainable IT processes.

5.2 Aggregation method and variable selection

The aggregation method for the index should satisfy the axioms and desired properties outlined above. The essential features of the aggregation method are simplicity and transparency, which ensure that the index is understood and has impact. Most indicators in our index are derived from questions about the adherence of IT departments to green strategies and guidelines. For example, in the acquisition pillar, a relevant question could be whether an organization has a strategy for acquiring environmentally sustainable hardware and software. In the operation pillar, a question could ask whether computers are switched to energy-saving mode during periods of inactivity. Such questions generate a set of binary variables x_i, i = 1, …, N, where N is the number of indicators constituting the index. Variable x_i = 1 if a particular green protocol or action is implemented and x_i = 0 if no such practice or activity is in place.4 The GGIT Index can then be represented by the vector x = (x_1, x_2, …, x_N) that summarizes the actions undertaken by an organization to achieve the environmental agenda. When the variables constituting an index are binary, the standard practice in many measurement exercises is the so-called "counting method" (Atkinson 2003). This approach assigns a positive weight to each variable, forming the vector of weights w = (w_1, w_2, …, w_N). The weights indicate the importance of each indicator in the total index. The counting index G is then defined as:

G = G(x; w) = (w · x)/(w · 1) = (w_1 x_1 + … + w_N x_N)/(w_1 + … + w_N),   1 = (1, …, 1).   (1)

The numerator in (1) represents the weighted sum of the values of the organization's achievements.
The denominator is the maximum value that could be achieved (every action and protocol is implemented). The index G is thus the share of the current total achievement in the maximum potential achievement. Index G can also be expressed as the weighted mean of the indicators x_i:

G(x; w) = (w_1/(w_1 + … + w_N)) x_1 + … + (w_N/(w_1 + … + w_N)) x_N.   (2)

4 As shown in Cameron et al. (2019), multi-value questions with more than two options can and should be dichotomized to avoid ambiguity in cardinal value interpretations.

Equation (2) demonstrates that G is bounded between 0 and 1. The next step in constructing the GGIT Index is to evaluate the relative importance (weights) of the different indicators. Here we follow the measurement literature in relying on an "equal importance" approach to the subdimensions of the index and the individual variables (Atkinson 2003, Alkire and Foster 2011). The "equal importance" principle stipulates that the index's pillars should be divided into equally important subdimensions. For the GGIT Index, we assume that the pillars and cross-cutting themes represent equally important groups of indicators. The individual indicators grouped within a dimension or subdimension should be chosen to have similar contributions within the group. Such a nested index structure helps account for the relative importance of dimensions, subdimensions, and individual indicators in a coherent way.5 The GGIT Index constructed using the counting method and satisfying the "equal importance" principle is consistent with the desired properties and axioms defined above. In particular, the index is additively decomposable by subsets of variables or subdimensions. The index can also be constructed using both binary and multi-value variables (which are dichotomized).6

5.3 Selection of indicators

Our choice of individual indicators follows the simplicity principle: indicators should be easy to measure, verify, and understand.
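The counting formula in equations (1) and (2) and the nested "equal importance" weighting can be sketched in a few lines of code. The pillar names and binary answers below are illustrative placeholders, not actual GGIT questionnaire data; the final assertion checks the subgroup decomposability property discussed above.

```python
# Minimal sketch of the counting index of equations (1)-(2).
# Pillar names and Yes/No answers are hypothetical examples only.

def counting_index(x, w):
    """Weighted share of achieved indicators: G = sum(w_i x_i) / sum(w_i)."""
    assert len(x) == len(w) and all(v in (0, 1) for v in x)
    return sum(wi * xi for wi, xi in zip(w, x)) / sum(w)

# "Equal importance": each pillar gets weight 1/3, split equally
# among the indicators inside it (the nested structure of Section 5.2).
pillars = {
    "acquisition": [1, 0, 1],      # binary answers to pillar questions
    "operation":   [1, 1, 0, 1],
    "disposal":    [0, 1],
}

weights, answers = [], []
for indicators in pillars.values():
    pillar_weight = 1 / len(pillars)
    weights += [pillar_weight / len(indicators)] * len(indicators)
    answers += indicators

G = counting_index(answers, weights)

# Subgroup decomposability: G equals the equally weighted mean
# of the per-pillar sub-indexes.
sub = {p: counting_index(v, [1] * len(v)) for p, v in pillars.items()}
G_decomposed = sum(sub.values()) / len(sub)
assert abs(G - G_decomposed) < 1e-12
```

With these example answers, the sub-indexes are 2/3, 3/4, and 1/2, and the aggregate index is their simple mean, exactly as the decomposability axiom requires.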
At the same time, the selected indicators should provide the necessary and sufficient information about the actions undertaken by an organization in its pursuit of environmentally sustainable ICT. We present the whole set of indicators for each pillar and the related cross-cutting themes in the Appendix. The questionnaire for the GGIT was designed through a series of consultations with the Procurement and IT departments of the World Bank.

5 There are different ways of determining the importance of individual indicators in an index. Some methods, such as principal component or cluster analysis, are used to "derive" relative weights from the data. However, these methods suffer from implicit subjectivity, and the derived weights depend on a particular data sample and on the model's assumptions (see, for example, Decancq and Lugo (2013)). Alternatively, other indexes rely on expert surveys to determine the relative weights and composition of indicators (for example, Krumay and Brandtweiner (2016)). This approach might suffer from the subjectivity and biases of expert assessments and might not be stable over time and across different organizations.

6 See Cameron et al. (2019) for the formal proofs of these properties.

6. GGIT Index implementation and update

6.1 Index robustness and sensitivity analysis

Although we construct the GGIT on a solid theoretical foundation, developing any composite index involves subjective decisions on the selection of indicators, data normalization, weights, and aggregation methods. The robustness of the GGIT and its underlying policy messages could be contested because of such subjectivity; robustness checks also help minimize the degree of discretion respondents have in influencing index values and rankings. The robustness of the GGIT can be assessed through sensitivity and uncertainty analysis, which can reveal the weaknesses of the proposed methodology and improve the transparency of the instrument.
The uncertainty analysis helps reveal the uncertainty associated with errors in individual indicators that propagate to the values of the GGIT, as well as the uncertainty associated with the selection of sub-indicator weights. The sensitivity analysis demonstrates how variations in the values of individual indicators affect the GGIT variance. To assess the uncertainty of the GGIT indicator, the following steps could be taken:
1. Including and excluding individual indicators.
2. Modeling data errors based on the availability of information on variance estimation.
3. Using alternative aggregation methods.
4. Using different weighting schemes (linear, geometric mean, multi-criteria ordering).
5. Using different values of weights.

6.2 Governance

The index can be implemented by an advisory board consisting of designated representatives of IT departments from major international development institutions, country governments, and the scientific community. The advisory board establishes the schedule and agrees on the methodology for index upgrades and revisions. The index is reviewed and audited by a panel of independent experts reporting to the board. The advisory board might publish reports on the state of environmental sustainability of government IT infrastructure.

6.3 Implementation

The index implementation could be organized as a series of interviews with IT management in the organization. Different questionnaires could be administered to the personnel responsible for the acquisition, operation, and disposal of the organization's IT infrastructure. An aggregate GGIT Index and sub-indexes are then derived from these survey data and supporting documentation. These indexes could be compared with those of other government organizations, and areas of improvement might be identified based on such comparisons.
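One way to operationalize step 5 of the uncertainty analysis above ("using different values of weights") is to randomly perturb the indicator weights around the equal-weight baseline and record how often the ranking of two organizations reverses. The two indicator vectors below are hypothetical, not drawn from any actual GGIT survey; the perturbation range is an illustrative assumption.

```python
# Hedged sketch of a weight-perturbation robustness check: how often does
# the ranking of two hypothetical organizations flip across weight draws?
import random

def counting_index(x, w):
    # Counting index of equation (1): weighted share of achieved indicators.
    return sum(wi * xi for wi, xi in zip(w, x)) / sum(w)

# Hypothetical binary achievements of two organizations on six indicators.
org_a = [1, 1, 0, 1, 0, 1]
org_b = [1, 0, 1, 1, 0, 0]

random.seed(0)  # reproducible draws
trials, flips = 10_000, 0
for _ in range(trials):
    # Perturb each weight uniformly around the equal-weight baseline of 1.
    w = [random.uniform(0.5, 1.5) for _ in org_a]
    if counting_index(org_a, w) < counting_index(org_b, w):
        flips += 1

print(f"Ranking reversed in {flips / trials:.1%} of random weight draws")
```

A low reversal rate suggests the ranking is robust to the subjective choice of weights; a high rate signals that policy conclusions drawn from the comparison should be treated with caution.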
While the primary purpose of the GGIT Index is to indicate areas of improvement, the GGIT methodology could also be used as a guide for improving IT processes to achieve environmentally sustainable ICT in each organization. In addition to cross-sectional comparisons, the GGIT Index could be used to monitor an organization's progress in improving the environmental sustainability of its IT departments. Year-to-year relative improvements in the aggregate index and its sub-components indicate whether the organization is committed to the green IT agenda. The GGIT Index must be periodically reviewed and updated. ICT is progressing rapidly, becoming more efficient and environmentally friendly, and the methodology and underlying indicators must be renewed for the index to remain relevant. At the same time, the index should preserve inter-temporal comparability. One approach to achieving such comparability is to maintain both the old and the updated index for some time, allowing continued monitoring of the progress of government organizations toward the environmental sustainability of their IT departments.

7. Conclusions

In this paper, we propose a new Green Government IT Index to assess the environmentally responsible use of computers and other ICT resources by the IT departments of government institutions and other organizations. Our theoretical and empirical methodology relies on the established literature on index construction and draws on existing models for evaluating the environmental sustainability of ICT. The paper discusses the conceptual and mathematical foundations behind the new index and defines a set of verifiable, comparable, and transparent indicators for index construction. The flexibility of our methodological framework allows for future index revisions as the green agenda evolves.
Using the framework, government organizations can assess the environmental capabilities of their ICT departments and systematically monitor and improve these capabilities to meet their green objectives. This new index could be seen as an initial step before more resource-intensive assessments that inform an organization's long-term environmentally sustainable strategy.

References

Alkire, S., Roche, J., Ballon, P., Foster, J., Santos, M. and S. Seth (2015). Multidimensional Poverty Measurement and Analysis. United Kingdom: Oxford University Press.
Alkire, S. and J. Foster (2011). "Counting and multidimensional poverty measurement." Journal of Public Economics, 95(7): 476-487.
Andrae, A. and T. Edler (2015). "On global electricity usage of communication technology: trends to 2030." Challenges, 6(1): 117-157.
Atkinson, A. (2003). "Multidimensional deprivation: contrasting social welfare and counting approaches." Journal of Economic Inequality, 1(1): 51-65.
Belkhir, L. and A. Elmeligi (2018). "Assessing ICT global emissions footprint: Trends to 2040 & recommendations." Journal of Cleaner Production, 177: 448-463.
Bozzelli, P., Gu, Q., and P. Lago (2014). "A Systematic Literature Review on Green Software Metrics." Mimeo, Amsterdam.
Bohas, A. and N. Poussing (2016). "An empirical exploration of the role of strategic and responsive corporate social responsibility in the adoption of different Green IT strategies." Journal of Cleaner Production, 22: 240-251.
Butler, T. (2011). "Towards a practice-oriented green IS framework." ECIS 2011 Proceedings, 102. https://aisel.aisnet.org/ecis2011/102.
Cabernard, L., Pfister, S. and S. Hellweg (2019). "A new method for analyzing sustainability performance of global supply chains and its application to material resources." Science of the Total Environment, 684: 164-177.
Cameron, G., Dang, H., Dinc, M., Foster, J., and M. Lokshin (2019). "Measuring the Statistical Capacity of Nations." World Bank Policy Research Working Paper, 8693.
Cameron, G., Dang, H., Dinc, M., Foster, J., and M. Lokshin (2021). "Measuring the Statistical Capacity of Nations." Oxford Bulletin of Economics and Statistics, 83: 870-896.
Chen, X., Despeisse, M. and B. Johansson (2020). "Environmental Sustainability of Digitalization in Manufacturing: A Review." Sustainability, 12: 10298.
Chircu, A. and D. Lee (2005). "E-government: Key success factors for value discovery and realization." Electronic Government: An International Journal, 2(1): 11-25.
Decancq, K. and M. Lugo (2013). "Weights in multidimensional indices of wellbeing: An overview." Econometric Reviews, 32(1): 7-34.
Donnellan, B., Sheridan, C. and E. Curry (2011). "A Capability Maturity Framework for Sustainable Information and Communication Technology." IEEE IT Professional, 13: 33-40.
European Union (2012). "Directive 2012/19/EU of the European Parliament and of the Council of 4 July 2012 on waste electrical and electronic equipment (WEEE)." European Parliament, Brussels.
Freitag, J. and A. Brensen-Lee (2020). "The climate impact of ICT: A review of estimates, trends and regulations." Lancaster University. Accessed at: https://arxiv.org/ftp/arxiv/papers/2102/2102.02622.pdf.
Ferreira, A. and B. Pernici (2014). "Managing the complex data center environment: an integrated energy-aware framework." Computing, 1: 1-41.
Foster, J., Suman, S., Lokshin, M., and Z. Sajaia (2013). A Unified Approach to Measuring Poverty and Inequality: Theory and Practice. Washington, DC: World Bank.
Forti, V., Baldé, C.P., Kuehr, R., and G. Bel (2020). The Global E-waste Monitor 2020: Quantities, flows and the circular economy potential. United Nations University (UNU)/United Nations Institute for Training and Research (UNITAR) co-hosted SCYCLE Programme, International Telecommunication Union (ITU) & International Solid Waste Association (ISWA), Bonn/Geneva/Rotterdam.
Gartner (2022).
"Forecast: Enterprise IT Spending for the Government and Education Markets, Worldwide, 2018-2024, 4Q20 Update." Accessed at: https://www.gartner.com/en/newsroom/press-releases/2021-02-18-gartner-forecasts-global-government-it-spending-to-grow-5-percent-in-20210.
Global Reporting Initiative (2021). GRI Standards. Amsterdam, The Netherlands.
Gupta, U., Kim, Y., Lee, S., Tse, J., Lee, H., Wei, G., Brooks, D., and C. Wu (2021). "Chasing Carbon: The Elusive Environmental Footprint of Computing." IEEE International Symposium on High-Performance Computer Architecture.
Hammond, A. and World Resources Institute (1995). "Environmental Indicators: A Systematic Approach to Measuring and Reporting on Environmental Policy Performance in the Context of Sustainable Development." World Resources Institute, Washington, DC.
Hankel, A., Oud, L., Saan, M., and P. Lago (2015). "A Maturity Model for Green ICT: The Case of the SURF Green ICT Maturity Model." In EnviroInfo 2014 - 28th International Conference on Informatics for Environmental Protection. BIS Verlag: Oldenburg, Germany: 33-40.
Hankel, A., Heimeriks, G., and P. Lago (2019). "Green ICT Adoption Using a Maturity Model." Sustainability, 11: 7163.
______ (2017). "Green ICT Assessment for Organisations." Journal of ICT Standardization, 4(2): 87-110.
International Organization for Standardization (2021). ISO 14001. Geneva, Switzerland.
Krumay, B. and R. Brandtweiner (2016). "Measuring the environmental impact of ICT hardware." International Journal of Sustainable Development and Planning, 11(6): 1064-1076.
Lautenschutz, D., España, S., Hankel, A., Overbeek, S. and P. Lago (2018). "A Comparative Analysis of Green ICT Maturity Models." EPiC Series in Computing, 52: 153-167.
Molla, A., Cooper, V. and S. Pittayachawan (2009). "IT and Eco-sustainability: Developing and Validating a Green IT Readiness Model." ICIS 2009 Proceedings, 141.
Oberhaus, D. (2019). "Amazon, Google, Microsoft: Here's Who Has the Greenest Cloud." Wired Science.
https://www.wired.com/story/amazon-google-microsoft-green-clouds-and-hyperscale-data-centers/.
Palmer, N. and D. Perkins (2012). "Technological Democratization: The Potential Role of ICT in Social and Political Transformation in China and Beyond." Perspectives on Global Development and Technology, 11(4): 456-479. https://doi.org/10.1163/15691497-12341236.
Paulk, M., Curtis, B., Chrissis, M. and C. Weber (1993). "Capability maturity model, version 1.1." IEEE Software, 10(4): 18-27.
Ribeiro, M., Tommasetti, R., Gomes, M., Castro, A. and A. Ismail (2021). "Adoption phases of Green Information Technology in enhanced sustainability: A bibliometric study." Cleaner Engineering and Technology, 3: 100095. 10.1016/j.clet.2021.100095.
Rocheleau, B. and L. Wu (2002). "Public versus private information systems: Do they differ in important ways? A review and empirical test." American Review of Public Administration, 32(4): 379-397.
Singh, M. and G. Sahu (2020). "Towards adoption of Green IS: a literature review using classification methodology." International Journal of Information Management, 54: 102147.
Tansel, B. (2017). "From electronic consumer products to e-wastes: Global outlook, waste quantities, recycling challenges." Environment International, 98: 35-45.
Uddin, M. and A. Rahman (2012). "Energy efficiency and low carbon enabler green IT framework for data centers considering green metrics." Renewable and Sustainable Energy Reviews, 16: 4078-4094.
United Nations (2021). E-Government Survey 2020. United Nations, New York.
United States Congress (2017). "Federal agencies' reliance on outdated unsupported information technology: a ticking time bomb." Committee on Oversight and Government Reform, Hearing Report No. 114-120.
United States Government (2021). Green Government Initiative. Office of the Federal Chief Sustainability Officer, Washington, DC. Accessed at: https://www.sustainability.gov/ggi/.
Wallström, M. (2006).
"Active or reactive - CSR reporting and sustainable development as tools for smart growth." In GRI Conference - Reporting Sustainability, V.P.o.t.E.C.r.f.I.R.a.C. Strategy, Amsterdam.
World Resources Institute (2021). Greenhouse Gas Protocol. Washington, DC. https://ghgprotocol.org/.
World Bank (2021). Europe and Central Asia Economic Update, Spring 2021: Data, Digitalization, and Governance. Washington, DC: World Bank. https://openknowledge.worldbank.org/handle/10986/35273.

Appendix. Green Government IT Index Questionnaire

The GGIT Index is the aggregated numeric score of several binary (Yes/No) answers to questions related to the Acquisition, Operation, and Disposal of IT goods and services. All "Yes" answers need to be substantiated by evidence contained in Standard Operating Procedures (SOPs), regular operational reports, senior management reports, audit results, rules manuals, guiding principles documents, etc. The questions are grouped under Pillars and cross-cutting Themes that further indicate their nature. Several questions are generic and do not fall under any Pillar; they are listed at the very beginning of the questionnaire. The following matrix depicts this structure; each cross-cutting theme applies within each pillar:

Cross-cutting themes        Pillars: Acquisition | Operation | Disposal
1. Strategy
2. Standards
3. Performance
4. KPI
5. Governance
6. Knowledge and outreach

1. General questions:
a) Does your Organization/Agency have a Green Agenda? A Green Agenda is defined through a series of published broad objectives, standards, policies, and requirements to be followed or complied with by the Organization.
b) Does your Organization/Agency have clearly established accountability for its Green Agenda? Accountability at the corporate level defines broad objectives, standards, policies, and requirements and ensures compliance, adjustment as needed, and evolution of the Green Agenda.
c) Does your Organization/Agency recognize ICT as a significant contributor to its overall Green Agenda? An effective Green Agenda recognizes ICT as one of its important components, deserving of a specific agenda, scrutiny, and periodic reviews.
d) Does the IT Department have a specific IT Green Agenda? An IT Green Agenda defines a series of IT-specific objectives, standards, policies, and requirements to be followed or complied with by the IT Department.
e) Is the IT Department's Green Agenda part of its (IT) strategic objectives? A successful IT Green Agenda is part of the IT Department's strategic objectives.
f) Does the IT Department have clearly established accountability for its Green Agenda? Accountability within the IT Department defines and maintains the IT Green Agenda.
g) Are the IT staff trained on the IT Department's Green Agenda? A Green Agenda can only be accomplished when staff are trained on, and follow, the established objectives, standards, policies, and requirements of the agenda.
h) Does your IT Department conduct periodic audits focusing on its Green Agenda? The IT Department needs to be reviewed and audited against its Green Agenda regularly to identify compliance gaps and/or opportunities for improvement.
i) Is the IT Department a member of any industry organization dedicated to issues of IT sustainability or environmental agendas? Membership in such organizations helps the IT Department keep up to date with trends and the evolution of Green Agenda issues across the industry.

1. Acquisition Pillar

This pillar contains indicators of actions and procedures that guide the acquisition of IT equipment and software. Traditional strategic IT acquisition planning aimed at maximizing system performance: equipment was acquired with a large performance reserve, and these resources remained unused most of the time.
Green and sustainable IT equipment acquisition pursues a different goal of maximizing IT resource utilization. This objective requires different hardware and software architectures and measurements.
1) Strategy:
a. Does your organization have a strategy for acquiring environmentally sustainable IT equipment and services? Environmentally sustainable IT equipment and services include attributes such as a reduced carbon footprint, reduced packaging, and energy efficiency.
b. Do you have a protocol for reviewing and updating this strategy? Such a protocol may include regular checkpoints and reviews by internal or external industry experts.
c. Do your IT procurement and tender processes include environmental and social screening criteria? Such criteria may include environmental sustainability areas, minority-owned businesses, etc.
d. Do you rely on a multi-year budget framework for procuring IT equipment? A multi-year budget framework allows for sustainable procurement and lifecycle management by normalizing all activities over multiple years in predictable quantities.
e. Does your organization have a program to procure refurbished energy-efficient IT equipment? Such equipment may include batteries, spare IT parts, etc.
2) Standards:
a. Do you integrate lifecycle costing into green and sustainable procurement policies? Lifecycle costing (LCC) is defined in the International Organization for Standardization standard Buildings and Constructed Assets, Service-Life Planning, Part 5: Lifecycle Costing (ISO 15686-5) as an "economic assessment considering all agreed projected significant and relevant cost flows over a period of analysis expressed in monetary value."
b. Do you use Green Agenda standards for acquiring IT equipment for your organization? For example, Energy Star or EPEAT-certified equipment.
c. Is the IT equipment you acquire certified as designed for recycling (DfR)? Design for recycling (DfR) is an eco-design strategy.
Eco-design is a systematic approach to designing more environmentally friendly products.
d. Is the IT equipment you acquire certified as designed for the environment (DfE)? Design for environment (DfE) attempts to reduce the environmental impact of a product or service. It considers the whole life cycle, going beyond just using recycled materials or proper packaging or disposal. The DfE approach includes five aspects that follow the life cycle of a product and enable companies to be more environmentally friendly in their work. These five aspects are: (1) materials and extraction; (2) production; (3) transport, distribution, and packaging; (4) use; and (5) end of life, including design for disassembly and design for recycling.
3) Performance:
a. Do you consider green alternatives in your logistics chain for IT equipment? Green alternatives may include activities such as establishing provisioning centers closer to manufacturing plants and reducing travel time for IT products from the factory to your staff.
b. Do you structure the delivery of IT equipment in cycles allowing for bulk and green-friendly schemas? Consolidated shipping spread over a period of time, for example, may reduce the carbon footprint.
c. Do you consult with industry experts to measure and minimize your carbon footprint? Regular checkpoints with industry experts to measure and optimize activities related to the carbon footprint may assist with long-term strategic objectives.
4) KPI:
a. Does your organization have indicators to monitor green procurement processes, such as cost and time overruns, carbon footprint, adherence to specifications, best practices in green management, and sustainability parameters such as EPEAT? Key performance indicators (KPIs) enable measurement of success or failure in achieving established targets.
b.
Do you regularly validate your IT equipment vendor partners for their sustainability practices throughout the contract lifecycle? Such validation may include regular reporting and checkpoints.
5) Governance:
a. Do you have a department or division in your organization responsible for implementing the Green IT acquisition strategy? This department or division will be accountable for implementing the Green IT acquisition strategy.
b. Do you have a manager or other leadership responsible for the green IT acquisition strategy? Appropriate leadership provides the needed oversight and management of various strategic initiatives.
c. Does your IT organization have professionals focused on compliance (PM teams) and work programs specific to green IT equipment acquisition? Specific skillsets include program management, logistics, and finance.
6) Knowledge and outreach:
a. Do you have a program for ongoing skill development of procurement officers to upgrade their skills and knowledge of the latest innovations in green products, services, and the design of sustainable tenders? A robust knowledge program enables constant learning to keep up with modern innovations in this space.
b. Do you participate in global events that bring together minds from various organizations focusing on green strategies in procurement? Knowledge-sharing events are critical to strategic planning, the evolution of green strategies, and collective efforts toward a greener IT.

2. Operation Pillar

2.1 Capacity Planning
This section is about how the IT Department plans for needed computing capacity considering a Green Agenda.
2.1.a) Does your IT Department employ a capacity planning methodology? IT Departments may utilize a capacity planning methodology to forecast expansion and reduction of needed computing resources.
2.1.b) Is the plan reviewed at least once a year in consultation with stakeholders?
IT Departments need to review capacity plans regularly to take advantage of optimal conditions for acquisition and operational performance.
2.2 Cloud Computing
This section is about how cloud computing is used by the IT Department and whether Green Agenda requirements are part of how these services are sourced and rendered.
2.2.a) Does your IT Department use Cloud Services (XaaS)? Cloud services can be rendered at different levels.
2.2.b) Does your IT Department have long-term, strategic contracts with Cloud Services providers? IT Departments may utilize Cloud Services based on long-term, strategic contracts or only simple, smaller contracts for on-demand, infrequent ad-hoc needs.
2.2.c) Do the long-term strategic IT contracts include the requirements of a Green Agenda? IT Departments may or may not enforce such requirements in contracts with Cloud Services providers.
2.2.d) Are the Green Agenda requirements monitored for compliance on a regular basis by your IT Department? Any requirements in a contract should be reviewed regularly to ensure objectives are met.
2.2.e) Is "Cloud-only Operations" (CoO) the vision for the future of computing resource utilization in your IT Department? IT Departments may have decided to eradicate "on-premises" operations as a source of computing resources.
2.2.f) Has your IT Department developed plans that define how long and at what pace the goal of CoO will take to achieve? The CoO goal should be substantiated by concrete plans that define milestones and their timelines.
2.2.g) Does your IT Department track a server virtualization ratio (virtual vs. physical)? Such a ratio is a useful proxy for how efficiently server operations are run, with direct implications for energy and heat consumption levels.
2.2.h) Does your IT Department employ hyper-converged technology for servers, storage, and switches?
Hyper-converged technologies are another proxy for best practices, with direct implications for energy and heat consumption levels.
2.2.i) Does your IT Department take advantage of cloud elasticity techniques? Cloud elasticity allows organizations to optimize resource utilization, with direct implications for energy and heat consumption levels.
2.2.j) Does your IT Department take advantage of moving OS instances to cheaper and better-performing server offerings on a regular basis? Newer and cheaper cloud servers are proxies for optimal operations, with direct implications for energy and heat consumption levels.
2.3 Co-Located (CoLo) Data Centers
This section is about the utilization of CoLo Data Centers by the IT Department. CoLo Data Centers operated by specialized companies are usually managed using industry best practices as a standard, which directly benefits an IT Green Agenda. This means that many indicators and strategies aimed at resource consumption, utilization, and monitoring are included in the standard Service Level Agreements (SLAs) offered by the provider.
2.3.a) Does your IT Department use CoLo Data Center space for its non-cloud operations?
2.4 On-Premises Data Centers
This section is about the utilization of On-Premises Data Centers by the IT Department and whether the IT Green Agenda requirements are part of how these services and their key elements are sourced, stood up, and rendered.
2.4.a) Does your IT Department have targets for Data Center-related environment indicators? Appropriate targets for humidity, power consumption, and cooling are some examples of indicators with direct implications for energy and heat consumption levels.
2.4.b) Does your IT Department monitor these indicators for target compliance on a regular basis?
2.4.c) Specifically, does your IT Department have a defined PUE (Power Usage Effectiveness) target? PUE is an industry-accepted indicator of Data Center power management efficiency.
2.4.d) Does your IT Department take advantage of free energy? Free energy is energy derived from cool-weather locations but can also include wind, solar, and other sources.
2.4.e) Is your IT Department's Data Center layout designed with optimal cooling as a goal? Optimal Data Center layouts can greatly benefit cooling targets through well-designed aisles, compartments, etc.
2.4.f) Does your IT Department solely use Energy Star-accredited equipment? Energy Star equipment is accredited for best energy consumption levels.
2.4.g) Does your IT Department use on-premises private clouds? Private clouds allow IT Departments to use cloud techniques (virtualization, elasticity, etc.) to drive optimal resource utilization, with direct implications for energy and heat consumption levels.
2.5 End-Point Devices
This section is about devices used by users, including how they are managed by the IT Department, considering an IT Green Agenda.
2.5.a) Are enterprise PCs set by default to go into sleep mode when not in use? Organizations can set timer defaults for PCs to go into sleep mode when not in use.
2.5.b) Do you use thin-client computers to save power consumption?
2.5.c) Do you use specialized software to manage enterprise PC power consumption?
2.5.d) Do you monitor PC utilization levels?
2.5.e) Does your IT Department solely use Energy Star-accredited equipment? Energy Star equipment is accredited for best energy consumption levels.
2.6 IT Accessories
This section is about how accessories are used by the IT Department and whether the IT Green Agenda requirements are part of how these products are sourced and utilized.
2.6.a) Has your IT Department abolished the utilization of wired IT accessories?
2.7 Printing
This section is about printing, scanning, faxing, and copying devices and/or services, including how they are sourced, deployed, and managed by the IT Department in light of an IT Green Agenda.

2.7.a) Does your organization have a “Print-Less Paper” strategy?

2.7.b) Does the “Print-Less Paper” strategy include regular campaigns, incentives (cost), and/or other initiatives aimed at lowering paper utilization?

2.7.c) Does your IT Department promote black-and-white over color printing?

2.7.d) Does your IT Department promote double-sided printing?

2.7.e) Does your IT Department promote the use of eFax?

2.7.f) Does your IT Department have a “sleep” schedule for printers?

2.7.g) Does your IT Department recycle toner cartridges?

2.7.h) Does your IT Department use Managed Print Services (MPS) to run the printing environment?

2.8 Networking
This section is about how networking services are used by the IT Department and whether green agenda requirements are part of how these services are sourced and rendered.

2.8.a) Does your IT Department have a wireless strategy for end-point access (PCs, smartphones, others)?

3. Utilization Pillar - Disposal

1) Strategy:
a. Does your organization have a Green IT e-waste strategy? E-waste is another term for Waste Electrical and Electronic Equipment (WEEE); an e-waste strategy governs the sustainable handling of e-waste.
b. Do you have a protocol for reviewing and updating this strategy? A regular review and update process maintains effectiveness and enables modernization.

2) Standards:
a. Do you adhere to R2 Certification standards as defined by Sustainable Electronics Recycling International (SERI - https://sustainableelectronics.org/)? R2 is the prevalent standard in the Americas; alternative localized standards are acceptable.

3) Performance:
a. Does your organization have a program to repurpose IT equipment? Such programs may include re-provisioning of used equipment to the workforce.
b. Do you have a contract with a third-party organization to sustainably dispose of, refurbish, and/or recycle end-of-life IT equipment? Such organizations specialize in bulk recycling; contracts help maintain expected quality and sustainability standards.
c. Do you have any in-house programs to recycle IT equipment through donations? Community donation programs are among the most sustainable means of recycling IT equipment.
d. Do you measure the sustainability competence of your disposal third parties against an established measurement standard? Standards may include a certificate of the final recycling mechanism and carbon footprint mapping.

4) KPI:
a. Does your organization have key performance indicators to assess and monitor sustainable IT e-waste processes? Indicators may include e-waste volume per quarter and carbon footprint mapping of e-waste processes.
b. Do you measure your sustainable IT e-waste processes with any industry-standard KPIs? Such KPIs enable results measurement to drive a better strategy.
c. Do you have targets set for your organization’s sustainable IT e-waste processes? Targets may include a carbon footprint KPI, a total e-waste volume KPI, and a donation program count KPI.

5) Governance:
a. Is accountability clearly established for IT e-waste processes? Accountability drives successful implementation.
b. Is this accountability part of the governance structure of either your organization or your IT department? An appropriate governance structure, such as a Project Management Office, enables rules-based and measured implementation.
c. Do you regularly engage with industry specialists to review your e-waste strategy? Regular reviews may include consultations with organizations such as Gartner and Forrester.

6) Knowledge and outreach:
a. Do you have a program that enables staff to dispose of personal end-of-life IT equipment through IT e-waste programs, reinforcing a Green IT culture? Enabling sustainable behavior within your organization will drive the overall Green IT culture.
b. Do you create incentives or opportunities for staff to participate in IT e-waste initiatives in local communities? Opportunities may include paid time off, employee performance recognition, and measurement programs.
c. Do you participate in any global forums that bring together organizations involved in the sustainable recycling of IT e-waste? Collaboration and knowledge sharing with industry experts and peers enable ideation toward Green IT program development and improvement.
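The KPI questions under item 4 become concrete as simple aggregations over a disposal log. The sketch below computes one of the example indicators named above, e-waste volume per quarter; the log entries, weights, and channel names are hypothetical, for illustration only.

```python
from collections import defaultdict
from datetime import date

# Hypothetical disposal log: (disposal date, weight in kg, channel)
disposal_log = [
    (date(2023, 1, 15), 120.0, "certified recycler"),
    (date(2023, 2, 3), 45.5, "community donation"),
    (date(2023, 5, 20), 80.0, "certified recycler"),
]

def ewaste_kg_by_quarter(log):
    """Total e-waste weight (kg) per calendar quarter."""
    totals = defaultdict(float)
    for when, kg, _channel in log:
        quarter = f"{when.year}Q{(when.month - 1) // 3 + 1}"
        totals[quarter] += kg
    return dict(totals)

print(ewaste_kg_by_quarter(disposal_log))
# {'2023Q1': 165.5, '2023Q2': 80.0}
```

The same log could be extended with fields for recycling certificates or carbon footprint estimates to support the other KPIs and the third-party measurement standards mentioned in item 3.d.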