Policy Research Working Paper 9462

Promoting Parental Involvement in Schools: Evidence from Two Randomized Experiments

Felipe Barrera-Osorio, Paul Gertler, Nozomi Nakajima, Harry Anthony Patrinos

Education Global Practice
October 2020

Abstract

Parental involvement programs aim to increase school-and-parent communication and support children's overall learning environment. This paper examines the effects of low-cost, group-based parental involvement interventions in Mexico using data from two randomized controlled trials. The first experiment provided financial resources to parent associations. The second experiment provided information to parents about how to support their children's learning. Overall, the interventions induced different types of parental engagement in schools. The information intervention changed parenting behavior at home—with large effects among indigenous parents who have historically been discriminated against and socially excluded—and improved student behavior in school. The grants did not impact parent or student behaviors. Notably, the paper does not find impacts of either intervention on educational achievement. To understand these null effects, the paper explores how social ties between parents and teachers evolved over the course of the two interventions. Parental involvement interventions led to significant changes in perceived trustworthiness between teachers and parents. The results suggest that parental involvement interventions can backfire if institutional rules are unclear about the expectations of parents and teachers as parents increase their involvement in schools.

This paper is a product of the Education Global Practice. It is part of a larger effort by the World Bank to provide open access to its research and make a contribution to development policy discussions around the world. Policy Research Working Papers are also posted on the Web at http://www.worldbank.org/prwp. The authors may be contacted at hpatrinos@worldbank.org.

The Policy Research Working Paper Series disseminates the findings of work in progress to encourage the exchange of ideas about development issues. An objective of the series is to get the findings out quickly, even if the presentations are less than fully polished. The papers carry the names of the authors and should be cited accordingly. The findings, interpretations, and conclusions expressed in this paper are entirely those of the authors. They do not necessarily represent the views of the International Bank for Reconstruction and Development/World Bank and its affiliated organizations, or those of the Executive Directors of the World Bank or the governments they represent.

Produced by the Research Support Team

Promoting Parental Involvement in Schools: Evidence from Two Randomized Experiments∗

Felipe Barrera-Osorio, Paul Gertler, Nozomi Nakajima, and Harry Anthony Patrinos†

Keywords— parents, schools, experiments, Mexico, indigenous, trust
JEL Codes— I20

∗ We thank seminar participants at 3ie, AEFP, APPAM, Columbia University, IZA Economics of Education Workshop, University of Massachusetts Boston, Vanderbilt University and the World Bank for comments and feedback on the paper. The research was supported by grants from the Spanish Impact Evaluation Fund, the World Bank Research Committee (RF-P1123327-RESE-BBRSB), and the Bank-Netherlands Partnership Program. We dedicate this paper to Eduardo Rodriguez-Oreggia who worked on the original thinking about the study and passed away in 2014.
We are thankful for excellent research assistance from Angelica Rivera-Olvera, Stefan Metzger, Diego Cardozo-Medeiros and Pedro Pablo Parra-Diaz. Manuel Felix supported the project since its inception. We are grateful to SEP for access to data. Thanks for all the support to CONAFE staff at the time, namely: Arturo Saenz Ferral, Lucero Nava Bolanos, Jose Carlos Rocha Silva, Alfonso Gonzalez Ramirez, Dolores Ramirez Vargas, Georgina Quintanilla Cerda, Rafaela Merecias Sanchez, Maria Angelica Santiago Antonio and Teresa Nateras Valdez. The authors have no financial or material interests in the results of this paper. The registration number of the trial at the AER RCT Registry is AEARCTR-0006424. The opinions expressed herein are those of the authors and not necessarily of the institutions they represent. All errors remain our own. † Barrera-Osorio: Vanderbilt University (felipe.barrera.-.osorio@vanderbilt.edu); Gertler: University of California, Berkeley and NBER (gertler@berkeley.edu); Nakajima: Harvard University (nnakajima@g.harvard.edu); Patrinos: World Bank (hpatrinos@worldbank.org) 1 Introduction Parents play an important role in their children’s educational experiences and outcomes (Cunha, Heckman, Lochner, & Masterov, 2006; Houtenville & Conway, 2008; Todd & Wolpin, 2007). How- ever, parents often face challenges when supporting their children through school. For example, parents can hold inaccurate beliefs about the returns to education (Attanasio & Kaufmann, 2014; Jensen, 2010; Nguyen, 2008) and about their own children’s academic performance and behaviors (Banerjee, Banerji, Duflo, Glennerster, & Khemani, 2010; Dizon-Ross, 2019). These biased be- liefs can lead to misallocation of educational investments. Parents may also have limited cognitive bandwidth to respond to the various tasks associated with supporting their children’s education (Mani, Mullainathan, Shafir, & Zhao, 2013; Mullainathan & Shafir, 2013). Challenges may also derive from organizational structures. Schools often assume that parents are familiar with how to engage with teachers. This assumption can lead to systematic exclusion of low-income, culturally, and linguistically diverse parents from advocating for their children’s needs and accessing school resources (J. S. Lee & Bowen, 2006). To overcome the range of challenges that parents face, parental involvement programs (also known as family engagement programs) increase school-and-parent communication to support chil- dren’s overall learning environment. In this paper, we analyze data from two field experiments to examine the impacts of parental involvement interventions on parent and teacher behavior, and educational outcomes. The experiments were conducted across 430 public schools in four states in Mexico. These states have a large indigenous population that has faced a long history of discrimina- tion. In this setting, parental involvement programs hold particularly great promise for improving school-and-parent communication and supporting the education of indigenous children. The first experiment focuses on financial grants to parent associations. Schools assigned to the treatment condition received double the typical grant amount allocated to parent associations. This additional grant money was modest, as it covered approximately 83% of the out-of-pocket educational costs spent by parents in our study setting. Schools assigned to the control group received the standard grant amount that is allocated to parent associations in Mexican public schools. 
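To see where the 83% figure comes from, a back-of-the-envelope calculation using the numbers reported in Section 2 (average out-of-pocket spending of about USD 9 per child per year, roughly 80 students per school, and an additional grant of about USD 600, taking roughly the midpoint of the extra USD 500-700) gives:

    600 / 80 ≈ 7.5 USD of additional funds per student per year, and 7.5 / 9 ≈ 0.83,

that is, about 83% of what parents report spending out of pocket.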
The second experiment focuses on information provision to parents. Parents in treatment schools attended group sessions where a community facilitator informed them about ways to become involved in school activities and decision-making processes, as well as where to access community resources to support their children's learning. Parents in control schools did not receive the information intervention.

We also leverage the design of the two experiments to estimate a non-experimental treatment effect of receiving the standard grant amount. Specifically, the control group from the first experiment is compared to the treatment group from the second experiment. To compare schools across experiments, we use the fact that selection into the experiments was based on the proportion of indigenous students in schools. We begin by trimming our data to only include schools in the region of common support in terms of indigenous student population. Then, we adopt a conditional independence strategy and adjust the treatment and comparison groups using covariates selected from the post-double selection (PDS) lasso estimator (Belloni, Chernozhukov, & Hansen, 2014). Recent studies have shown that machine-learning tools can be useful for principled variable selection (Goller, Lechner, Moczall, & Wolff, 2020; B. K. Lee, Lessler, & Stuart, 2010; Urminsky, Hansen, & Chernozhukov, 2016) and we show that our results are robust to a range of specifications, as suggested by Angrist and Frandsen (2019). We take caution in interpreting the treatment effect of the standard grant amount, given the strong unconfoundedness assumption required for identification. Nonetheless, this non-experimental comparison is of substantive interest because it uncovers the effect of providing grants to parent associations at the extensive margin (i.e., no grant versus standard grant amount), whereas the first experiment focuses on the effect of grants at the intensive margin (i.e., standard grant amount versus double the standard grant amount). Finally, we correct for multiple-hypothesis testing to account for the number of contrasts and outcomes that are explored in this paper.

We present four key results. First, the two experiments induced different types of parental involvement in schools. Through the double grant, parent associations gained a moderate increase in financial resources. This additional money allowed parent associations to temporarily "have a seat at the table" with respect to school decision-making processes. In contrast, the information intervention offered advice to parents about how to support their children through school. This information induced parent associations to organize school activities and events, and to meet with teachers to discuss student performance.

Second, the information intervention changed parenting behavior at home. Parents in schools that received the information intervention were significantly more likely to be aware of their children's school assignments and to help their children with homework. Moreover, these changes in parenting behavior were concentrated among indigenous parents. For example, 32 percent of indigenous parents in treatment schools helped their children with homework, whereas only 20 percent of indigenous parents in control schools did so. The large impacts among indigenous parents suggest the potential for improving school-to-parent communication by targeting groups that have been historically excluded.
Third, despite increases in parental involvement, the double grant and information provision did not affect student test scores on the national standardized exam. For the double grant experiment, the null results on educational outcomes are consistent with the fact that parents did not significantly change their parenting behavior at home and teachers did not significantly shift their teaching behavior. In contrast, the information intervention improved parental support for children’s learning at home. These changes in parenting behavior likely contributed to the marginal reduction in school dropout and disciplinary action, but did not translate to improvements in educational achievement. Finally, we explore how social ties between parents and teachers evolved over the course of the two interventions. A large body of theoretical and empirical research suggests that trust is a core component of social capital (Coleman, 1994; Putnam, 2001) and the absence of trust severely hampers transactions between actors (Fehr, 2009). We find that parental involvement interventions led to significant changes in perceived trustworthiness of teachers and parents. While the information intervention improved parents’ trust towards teachers, the double grant intervention diminished both parents’ trust towards teachers and teachers’ trust toward parents. The negative effect of the double grant intervention on trust suggests that parental involvement interventions can backfire if institutional rules are unclear about expectations as parents increase their involvement in schools. This study makes a few contributions to the economic literature on parental involvement in schools. First, the information intervention is closely related to experiments that provide parents with information about their children’s education. Studies have shown that providing parents with information about their children’s academic progress or performance can lead parents to update their biased beliefs, reallocate resources, improve student behavior (Avvisati, Gurgand, Guyon, & Maurin, 2014; Rogers & Feller, 2018), and raise academic performance (Barrera-Osorio, Gonzalez, 3 Lagos, & Deming, 2020; Bergman, 2016; Dizon-Ross, 2019).1 The information intervention studied in this paper has been implemented at scale by the national government. This is an important feature given that studies from efficacy field trials do not always yield similar results when implemented at scale (Al-Ubaydli, List, & Suskind, 2020; Banerjee et al., 2017). The scaling up of the information intervention was made possible by delivering information to parents through a group that exists in all schools: parent associations. Groups have been demonstrated to be efficient platforms for information delivery in other settings, such as women’s groups in developing countries (Dìaz-Martin, Gopalan, Guarnieri, & Jayachandran, 2020). Parent associations are also useful because they create opportunity for social interaction among members, resulting in positive externalities (Small & Gose, 2020). As the information intervention in the first experiment included time for group discussion, we view the social interaction in parent associations as an important feature of the intervention. Second, the double grant experiment is closely related to a broader set of school-based man- agement (SBM) reforms. Since the 1990s, SBM has been a popular policy strategy to improve the quality of education in developing countries. 
By decentralizing decision-making authority over school operations and management from the federal government to local stakeholders, SBM allows schools to respond directly to local needs and priorities. Despite the widespread adoption of SBM, the effectiveness of these programs in raising educational outcomes has been mixed. Experiments in India (Banerjee et al., 2010), the Gambia (Blimpo, Evans, & Lahire, 2011), and Niger (Beasley & Huillery, 2017) show no improvement on student learning outcomes. In contrast, experimental evidence from Indonesia (Pradhan et al., 2014) and Kenya (Duflo, Dupas, & Kremer, 2015) finds significant, positive effects on student test scores. The effective intervention in Indonesia provided information to parents about ways to become involved in their children's school along with "linkage" meetings between parent committees and the village council (Pradhan et al., 2014). In Kenya, Duflo et al. (2015) provided parents with information about how to evaluate contract teachers and involved parents in the formal review process of contract renewals for teachers.

In particular, our study builds upon the numerous evaluations of Mexico's longstanding school-based management program, which consists of a package of education reforms including infrastructure improvement, provision of school materials, teacher training, and school-based management (Garcia-Moreno, Gertler, & Patrinos, 2020; Murnane, Willet, & Cardenas, 2006; Santibanez, Abreu-Lastra, & O'Donoghue, 2014; Skoufias & Shapiro, 2006). Our study focuses specifically on the parental involvement component (Apoyo a la Gestión Escolar). A previous study of AGE used quasi-experimental methods to estimate the effect of the program (Gertler, Patrinos, & Rubio-Codina, 2012). This paper uses data from two randomized controlled trials conducted by the government of Mexico to understand the effectiveness of different program components – information, grants at the extensive margin, and grants at the intensive margin – in improving parental involvement in schools.

1 The information intervention studied in this paper is similar to a parental outreach program in France, which gave parents information about the functioning of schools and advice on how to support children with school work (Avvisati et al., 2014). The French program increased parental involvement and improved student behaviors, but did not raise student achievement.

The rest of the paper proceeds as follows. In Section 2, we present relevant background details of the study setting and the experimental design. We describe our data sources in Section 3, introduce our conceptual framework in Section 4, and present our empirical strategy in Section 5. In Section 6, we present results on the effect of each intervention on parental involvement in school, parenting behavior at home, teacher behavior in school, and children's educational outcomes. In Section 7, we explore trust between parents and teachers as a mechanism for understanding the effectiveness of parental involvement programs. Section 8 concludes.

2 Background and Study Design

In 1996, the Government of Mexico established Apoyo a la Gestión Escolar (AGE), a parental involvement program targeting parents of children enrolled in primary schools. The program has two key components: (1) financial grants to parent associations and (2) information provision to parents through parent associations. The grant is provided annually, ranging from USD 500 to 700 depending on the size of the student population.
Parent associations can decide how to use these funds for school infrastructure, supplies, and activities. These funds are not permitted to be used towards increasing teacher or principal salaries.2

2 The grants were to be used for non-wage expenditure given that over 97 percent of school spending in Mexico is allocated towards teacher and principal salaries (Santibanez et al., 2014).

The information component of AGE provides parents with guidance on how to become more involved in their children's schools and ways to support their children's education. Each school appoints a community advisor, who is responsible for disseminating information to parents and reporting the school's progress to the Mexican Secretariat of Public Education. The community advisor receives training from the Mexican Secretariat of Public Education before the information intervention and receives an allowance of approximately USD 40 at the end of the school year. In the vast majority of schools (98%), the school principal serves as the community advisor. Overall, the information component of AGE is extremely low-cost with a per-student cost of approximately USD 0.98.3

3 Cost calculations are based on administrative data from CONAFE during the expansion of information interventions in 2006.

The information component consists of five sessions, each lasting approximately one hour. Each session was facilitated by the community advisor and focused on group discussion among parents. The first session is an introduction to AGE, highlighting the importance of parental involvement in schools. The second session covers the role of parents in their children's education and ways in which parents can become involved in school activities and decision-making processes. The third session informs parents about education and health resources in their communities. The fourth session covers the key developmental milestones of children and adolescents, and introduces age-appropriate activities for parents to support their children's learning. The fifth session encourages parents to develop an action plan on how they will play a more active role in their children's schooling and learning. Appendix A1 provides additional details of the information intervention.4

4 To encourage the proper dissemination of funds and information to parents, the Mexican Secretariat of Public Education conducts an audit for a random sample of schools each year.

As the Government of Mexico gradually expanded AGE, two randomized controlled trials were conducted in four states (Chiapas, Guerrero, Puebla and Yucatan). As shown in Figure 1, these four states are home to a large indigenous population, which has faced a long history of discrimination in Mexico (Hall & Patrinos, 2004). The expansion of parental involvement programs in these states holds great promise for improving education but also presents unique challenges for engaging with families that have been historically marginalized.

Given the gradual rollout of AGE, some schools in these states were already participating in AGE while others were not. Thus, the government designed two randomized controlled trials to tease out the effect of the financial component from the information intervention of the parental involvement program. The design of the experiments is outlined in Figure 2.

The first experiment consists of 250 public schools that were already participating in AGE.
Schools were randomly assigned to either (i) a control group that received the "standard" AGE program, consisting of the information intervention with a USD 500-700 grant, or (ii) a treatment group that received the same information intervention but with double (USD 1000-1400) the standard grant amount.5 The design of the first experiment allows us to estimate the average treatment effect of doubling the grant for parent associations with the parental involvement program. Baseline data for the first experiment were collected in 2007, with follow-up data collection in 2008, 2009 and 2010.

5 The doubling of the grant was made possible through various donors including: Cinépolis, Deutsche Bank, Fundación Lazos, Fundación Televisa, Gillette Hall, JP Morgan Foundation, Panamerican Development Foundation and Western Union Foundation.

In practice, the additional grant money in treatment schools is modest. Parents in the 250 public schools in experiment 1 reported spending an average of 9 USD per year per child on school materials, prior to the intervention. With an average of 80 students in these schools, the "double grant" amounts to an additional 7.5 USD per year per student, which covers approximately 83% of the out-of-pocket cost that is typically spent by parents.

The second experiment consists of 180 public schools that had never participated in AGE. Schools were randomly assigned to either (i) no intervention or (ii) the information intervention. The design of the second experiment enables us to estimate the average treatment effect of the information intervention of the parental involvement program. The duration of the second experiment was only one year, as baseline data were collected in 2009, with follow-up data collection in 2010.

3 Data and Descriptive Statistics

Data for this study come from three sources: the school census data (Estadistica 911), the national standardized exam (ENLACE), and detailed self-reported surveys by students, parents, and teachers in grades 3-5 of the schools participating in the two experiments. The school census data and standardized exam scores were obtained through the National Council for Education Development (CONAFE). The survey data are publicly available on the World Bank Microdata Library.6 At the school-level, we merge the school census data with parent surveys, which were completed by the head of the parent association. At the student-level, we merge the national standardized exam with the student and teacher surveys using the unique population registry code (CURP).

6 Mejia and Filus (2018) have used the survey data from the double grant experiment to explore which variables predict improvement in student test scores.

Descriptive statistics of baseline characteristics are presented in Table 1 for the double grant experiment and Table 2 for the information experiment. Each table is organized as a balance test of school-level variables in Panel A and a balance test of student-level variables in Panel B.

In experiment 1, both general schools (which provide all instruction in Spanish) as well as indigenous schools (which provide instruction in both indigenous languages and Spanish) participated in the study. While indigenous schools are 10.4 percentage points more likely to be found in the treatment group, we find no systematic difference between treatment and control schools in other school- or student-level variables.
We conduct a joint F-test of the null hypothesis that there are no mean difference between treatment and control groups across all variables in each panel. The p -values suggests that our randomization provided balanced treatment and control groups at both the school level (our level of randomization) and student level. The summary statistics in Table 1 highlight two important features of our study context. First, parents have low levels of education. Of the parent association presidents, 77.6% in control schools and 81.6% in treatment schools reported primary (grades 1-6 in Mexico) as their highest level of education completed. Second, teachers are substantially more educated than parents. On average, 76.6% of teachers in control schools and 75.4% of teachers in treatment schools completed either a teaching college degree or university degree. Panel B of Table 1 shows the characteristics of students participating in the double grant exper- iment. Students in grades 3, 4 and 5 are surveyed, and nearly half of the study sample is female. The survey includes a number of questions about household assets, which are used to construct a wealth index using Principal Components Analysis (PCA). We further normalize the wealth index to be mean 0 with standard deviation 1 using the control group at baseline. Language and math test scores are from the national standardized exams (ENLACE). The test ranges from 200 to 800 points, with a national average of 500 and a standard deviation of 100. On average, students in experiment 1 score below the national average, with mean test scores ranging from 437 to 450. In our analyses below, we standardized the test scores in Spanish and Math for each grade to be mean 0 with standard deviation 1 using the control group scores at baseline. In the information experiment (Table 2), only general schools participated in the study. Overall, we do not find significant differences between treatment and control schools in school-level variables. 8 While students in grade 4 were 1.8% more likely to be sampled in control schools than in treatment schools, we do not find systematic difference between the two groups in other student-level variables. The p-values from the joint F-test are 0.477 (school) and 0.329 (student), which is consistent with successful randomization. Similar to schools in the double grant experiment, schools in the information experiment also have large differences in the educational backgrounds of parent and teachers. The majority of parent association presidents report primary school as their highest level of education, while most teachers had completed a teaching college degree or a university degree. At the student-level, the surveys for experiment 2 were administered to students in grades 3, 4 and 5, and nearly half of the study sample is female. We perform the same procedure as Table 1 to construct the wealth index. 4 Conceptual Framework & Measures Parental involvement programs (also known as family engagement programs) aim to improve school- and-parent communication to support children’s overall learning environment. These interventions target parents and caretakers, as it is widely accepted that parents play an important role in shaping children’s educational experience and outcomes (Doepke, Sorrenti, & Zilibotti, 2019). 
Parental involvement programs recognize that parents may not be able to fully engage with their children’s education because of biased beliefs (Barrera-Osorio et al., 2020; Bergman, 2016; Dizon-Ross, 2019; Rogers & Feller, 2018) and limited cognitive bandwidth, particularly for low- income parents (Mani et al., 2013; Mullainathan & Shafir, 2013). Thus, the group-based information interventions in our study are aimed at overcoming these psychological and informational barriers that often impede school-and-parent communication. Parental involvement programs also acknowledge that institutional discrimination hinders school- and-parent communication. Research suggest that schools can exclude parents whose culture or lifestyle differs from that of the dominant culture (J. S. Lee & Bowen, 2006). The power imbalance between schools and less advantaged parents makes it difficult for parents to take an active role in their children’s education. Thus, the double grant experiment in our study is aimed at giving parents power through direct influence over resource allocation in schools. The theory of action underlying parental involvement programs consists of three steps. The 9 first step is an increase in parental involvement. We examine parental involvement in schools using four measures: whether parent associations organized school activities and events, whether parent associations met with teachers to discuss children’s academic progress, whether parent associations participated in school decision meetings, and the percent of parents attending parent association meetings. In the next step, increase in parental involvement in schools should lead to changes in child inputs by parents and teachers. Parents have more information about their children’s behavior and performance in school, which allows them to adjust how they support their children at home. We measure two types of parental behavior at home: whether parents were aware of their children’s school assignments and whether parents helped with their children’s homework. Increases in parental involvement at school also mean greater oversight over teachers in how they manage their classrooms. Teachers may be induced to exert greater effort given that parents are regularly participating in school activities and events. We measure two types of teaching behavior: recorded days of teacher absences in the past month and an index of student-centered instruction. The index of student-centered instruction is the first principal components from principal component analyses of four survey items: (i) teacher explains concepts clearly, (ii) teacher reviews homework assignments, (iii) teacher does not ask students to copy from textbooks/blackboard without any explanation, and (iv) teacher gives students exercises that apply concepts learned in class. 7 The last step in the theory of action is improvements in educational outcomes. Using adminis- trative records from the school census data, we measure school-level failure, repetition, and dropout rates. We use the national standardized exam data for student-level test scores in Spanish and Math. Finally, we measure disciplinary action in schools by whether a student had been suspended, expelled, or involved in any other type of disciplinary action (i.e., referred to the principal) in the past academic year. 5 Empirical Strategy Experiments. 
For each experiment, we estimate the effect of being assigned to treatment at each year of data collection using the following model specification:

Yj = αj + βTj + ζY0j + εj   (1)

where Yj is the outcome of interest for school j, Tj is a binary variable (1 if school j was a treatment school and 0 otherwise), and Y0j is the baseline measure of the outcome of interest. β is the intent-to-treat effect of the intervention. While some of our outcomes of interest are measured at the school level (parental involvement and school progression), others are measured at the student level (parenting behavior, teaching behavior, student test scores, and student disciplinary action).8 For student-level outcomes, we estimate the following model specification:

Yij = αj + βTj + ζȲ0j + γXij + εij   (2)

where Yij is the outcome of interest for student i in school j, Tj is a binary variable (1 if school j was a treatment school and 0 otherwise), Ȳ0j is the school-average baseline measure of the outcome of interest, and Xij is a vector of child characteristics (grade, gender, household wealth). β is the intent-to-treat effect of the intervention and we estimate robust standard errors clustered at the school level.

As noted in our conceptual framework, we are interested in understanding how parental involvement programs affect several outcomes of interest and over several post-treatment years. We address multiple hypothesis testing by controlling for the familywise error rate (FWER) using the stepdown procedure proposed by Westfall, Young, and Wright (1993).

A key threat to identification of our intent-to-treat estimates is differential attrition (i.e., the non-response on outcome measures at follow-up data collection) between treatment and control schools. We address these concerns by conducting two tests. First, we compare attrition rates between treatment and control schools and find that they are similar across groups. Second, we examine if the mean of baseline observable characteristics differs across treatment and control groups, conditional on response status. Overall, we do not find evidence of differential attrition rates or evidence of selective attrition based on observables (see results for the double grant experiment in Table A1 and for the information intervention in Table A2).

7 To benchmark the effect size of student-centered instruction, we show the association between student-centered instruction and the items used to construct the index at baseline in Table A6.

8 Binary outcomes are estimated using a linear probability model for ease of interpretation of β in units of percentage points. Results estimated using logistic regression yield the same conclusion and are available upon request.

Observational data. In addition to the two experiments, we also have a non-experimental contrast between the control group from experiment 1 and the treatment group from experiment 2. By comparing these two groups, we can estimate the effect of receiving the standard grant amount associated with the parental involvement program. In other words, this non-experimental contrast reveals the effect of providing grants to parent associations at the extensive margin (i.e., no grant versus single grant) whereas the double grant experiment focuses on the effect of providing grants at the intensive margin (i.e., single grant versus double grant).
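As a concrete illustration of this estimation approach, the sketch below shows how the student-level intent-to-treat specification in equation (2) could be estimated with standard errors clustered at the school level. It is a minimal sketch, not the authors' code; the data frame and column names (outcome, treat, baseline_school_mean, grade, female, wealth, school_id) are hypothetical placeholders.

```python
# Minimal sketch of the intent-to-treat regression in equation (2), assuming a
# pandas DataFrame `df` with hypothetical columns: outcome, treat (0/1),
# baseline_school_mean, grade, female, wealth, and school_id.
import pandas as pd
import statsmodels.formula.api as smf

def itt_student_level(df: pd.DataFrame):
    cols = ["outcome", "treat", "baseline_school_mean",
            "grade", "female", "wealth", "school_id"]
    d = df.dropna(subset=cols)  # keep complete rows so the cluster ids align with the sample

    # OLS of the outcome on treatment assignment, the school-average baseline
    # outcome, and child characteristics (grade, gender, household wealth).
    model = smf.ols(
        "outcome ~ treat + baseline_school_mean + C(grade) + female + wealth",
        data=d,
    )
    # Robust standard errors clustered at the school level, the level of randomization.
    return model.fit(cov_type="cluster", cov_kwds={"groups": d["school_id"]})

# result = itt_student_level(df)
# result.params["treat"] is the ITT estimate (beta); result.bse["treat"] is its clustered SE.
```

The school-level specification in equation (1) follows the same pattern, with the baseline school outcome as the only control and ordinary robust standard errors.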
As described in Section 2 and noted in Figure 2, schools in experiment 1 were those already receiving the “standard” parental involvement package of grants and information, while schools in experiment 2 were those that had not yet received the parental involvement program. Histori- cally, the government selected schools to implement the parental involvement program based on an increasing function of indigenous student population. This means that schools in experiment 1 historically had larger proportions of indigenous students than schools in experiment 2. While we do not know the precise selection formula, we confirm in Figure 3a that the probability of be- ing assigned to experiment 1 is strongly increasing in the proportion of indigenous students. This implies that schools in experiments 1 and 2 have different proportions of indigenous students, and therefore, we cannot simply compare across the two experiments. Given our knowledge of the selection process to experiments 1 and 2, we trim our data to exclude indigenous schools – focusing only on general schools – when comparing across the two experiments to estimate the effect of the single grant. As shown in Figure 3b, dropping the indigenous schools imposes a common support restriction on the proportion of indigenous students and brings the distribution of indigenous students in treatment schools (single grant & information from experiment 1) and comparison schools (no grants & information from experiment 2) close together. Moreover, we drop one treatment school that has an indigenous student population above the 99th percentile of that of comparison schools. Figure 3c shows the distribution of treatment and comparison schools up to the 99th percentile cutoff (.94). This additional trimming procedure follows guidance from the matching literature, which suggests dropping treatment group observations with propensity scores above the 99th percentile of the propensity score in the comparison group as a way to establish 12 common support and improve the precision of estimators (Lechner & Strittmatter, 2017). For identification, we assume that the potential outcome of units in the treatment group (grant & information) and comparison group (no grant & information) are conditionally independent of the treatment assignment, given observed pre-treatment covariates x1 ... xp . Our main challenge is selecting a set of appropriate covariates. On the one hand, omitting covariates that predict the dependent variable and are correlated with treatment assignment can result in biased estimates of the average treatment effect. On the other hand, adding too many covariates can result in over- fitting the data. There is also concern of “researcher degrees of freedom” whereby authors may select covariates to generate the results they seek. To overcome the challenge of variable selection, we follow a principled approach using the double- lasso or post-double selection (PDS) (Belloni et al., 2014). The PDS uses lasso regression, which is a penalized regression that improves out-of-sample prediction by shrinking estimated regression coefficients towards zero and setting some coefficients to zero. These shrinkage properties of lasso allow it to perform variable selection. However, lasso tends to underestimate (and therefore exclude) small coefficients that are actually non-zero, which can result in omitted variable bias if directly applied as a regression that estimates Yi (outcome) on Ti (treatment) and x1 ... xp (observed covariates). 
Thus, the PDS approach aims to reduce omitted variable bias by following a three-step procedure:

1. Fit lasso regression to predict the outcome Yi from observed covariates xi,1 to xi,p:

   Yi = β1 xi,1 + β2 xi,2 + ... + βp xi,p + εi   (3)

   Covariates with non-zero coefficients from this model are A.

2. Fit lasso regression to predict the treatment assignment Ti from observed covariates xi,1 to xi,p:

   Ti = σ1 xi,1 + σ2 xi,2 + ... + σp xi,p + εi   (4)

   Covariates with non-zero coefficients from this model are B.

3. Fit a linear regression of the outcome Yi on the treatment assignment Ti and covariates wi = A ∪ B:

   Yi = αTi + wi β + εi   (5)

Our coefficient of interest is α, which is the effect of receiving the single grant amount on outcome Yi, assuming that the dependence between treatment assignment and outcomes can be removed by conditioning on observable variables. Following advice from Angrist and Frandsen (2019), we show that our results are robust to different model specifications. Our main model estimates the double lasso regressions using the 'plug-in' penalty, which selects the tuning parameter of the penalty term to be just large enough to control the noise in the data (Belloni, Chen, Chernozhukov, & Hansen, 2012). Appendix Figures A1, A2, A3, A4 summarize our sensitivity analyses, showing that our main results remain similar across different model specifications. Details of our approach can be found in Appendix A2.

6 Results

6.1 Implementation of interventions

Before turning to the ITT effects of the two interventions, we begin by examining the implementation of the interventions. For the double grant, we examine how the parent associations spent the additional funds. Figure 4 summarizes the overall spending patterns. The largest category of spending was for learning-related supplies (books, writing utensils, and writing surfaces) with 28% of the funds allocated in the first year. This amount increased over time, with 38% of total funds spent on learning supplies by year 3. Following learning supplies, the next largest category of spending was health-related supplies (first aid kits, personal hygiene products, and cleaning supplies). The amount allocated was 18% in year 1 and down to 16% by year 3. In contrast, parents chose to spend more funds towards repairs (fixing broken equipment, furniture, and space) and upgrades (purchasing new equipment and furniture) over time. Funding allocation for repairs increased from 17% in year 1 to 23% in year 3, and funding for upgrades increased from 18% to 21% between years 1 and 3. Less than 10% of funds were spent on rent and utilities, transportation, or construction.

For the information intervention, we examine whether information was actually offered to parents. Table 3 shows the results of regressing a binary outcome of whether an information session was offered to parents (where 0 = not offered and 1 = offered) on treatment status. As expected, none of the control schools offered these information sessions. Across the five separate information sessions offered to parents, between 91.0% and 94.9% of treatment schools offered the information session to parents.

6.2 Parental involvement

Next, we estimate the effect of the interventions on parental involvement in schools. Table 4 presents these results. For the double grant experiment, we show the results separately for each follow-up year.
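Before turning to the year-by-year estimates in Table 4, the post-double-selection procedure from Section 5 (equations 3-5) can be made concrete with a minimal sketch. It is illustrative only: it uses a cross-validated lasso penalty rather than the plug-in penalty of Belloni et al. (2012) used in the paper, it treats the binary treatment with a linear lasso in the selection step, and the function and variable names are hypothetical.

```python
# Minimal sketch of post-double selection (equations 3-5): select covariates that
# predict the outcome, select covariates that predict treatment, then run OLS of
# the outcome on treatment plus the union of the two selected sets.
import pandas as pd
import statsmodels.api as sm
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

def post_double_selection(y: pd.Series, t: pd.Series, X: pd.DataFrame):
    # Standardize covariates so the lasso penalty treats them symmetrically.
    Xs = pd.DataFrame(StandardScaler().fit_transform(X), columns=X.columns)

    # Step 1: lasso of the outcome on the covariates -> selected set A.
    sel_a = X.columns[LassoCV(cv=5).fit(Xs, y).coef_ != 0]
    # Step 2: lasso of treatment assignment on the covariates -> selected set B.
    sel_b = X.columns[LassoCV(cv=5).fit(Xs, t).coef_ != 0]

    # Step 3: OLS of the outcome on treatment and the union of selected covariates.
    controls = X[sorted(set(sel_a) | set(sel_b))]
    design = sm.add_constant(pd.concat([t.rename("treat"), controls], axis=1))
    return sm.OLS(y, design).fit(cov_type="HC1")

# fit = post_double_selection(df["outcome"], df["single_grant"], df[covariate_cols])
# fit.params["treat"] corresponds to alpha in equation (5).
```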
Overall, we do not find significant changes in parental involvement in school activities and events (column 1), or in meetings with teachers to discuss student performance (column 2). The coefficients in column (1) are not statistically significant after correcting for multiple hypothesis testing. However, the double grant intervention seems to have created an opportunity for parent associa- tions to “have a seat at the table” with respect to school decision making processes. In the first year of the double grant, we observe a 15.3 percentage point increase in parental involvement in school decision making. This effect is quite large, as it translates to a 18.3% increase. Notably, we do not observe these effects in subsequent years, which suggests that the double grants created tempo- rary and not necessarily meaningful changes in parental involvement in the school decision-making process. The information intervention induced parent associations to become more involved in school activities & events by 15.0 percentage points. This is equivalent to a 16.9% increase in parental involvement, from a baseline participation rate of 88.8%. It appears that the increase in parental involvement was driven by greater participation among parents who were already members of par- ent associations rather than new parents becoming involved with the parent association. Column (4) indicates that there was no change in the percent of parents who are members of the parent association. Finally, we observe no impact of the single grant on parental involvement in Table 4. The 15 null results on parental involvement in school activities and events, and meeting with teachers to discuss student performance, are both consistent with the findings for the double grant. Given the modest amount of financial resources that parent associations are given, it is not surprising that the standard amount (single grant) was not sufficient to get parents involved in decision-making at schools. 6.3 Parenting and teaching behaviors Given that both the double grant and information interventions increased parental involvement in schools, we now turn to estimating effects on parenting and teaching behaviors in Table 5. For the double grant experiment, we do not find changes in parenting behavior; either in terms of awareness of children’s school assignments (column 1) or helping children with their homework (column 2). We also do not find impacts on teaching behaviors; either in terms of teacher absences (column 3) or student-centered instruction (column 4). For the information experiment, we find significant changes in parents’ behavior towards sup- porting their children’s learning. Parents are 5.7 percentage points more likely to be aware of their children’s school assignments and 8.8 percentage points more likely to help with their children’s homework. These results suggest that the information intervention not only increased parental involvement within schools but also improved parenting behavior outside of schools to support children’s learning. As noted in the background section, schools in these two experiments were in states with a large indigenous population. Given the historical marginalization of indigenous people, the parental involvement interventions in our study offer an opportunity to improve parent and school communi- cation, particularly for indigenous parents. 
We explore treatment effect heterogeneity by interacting the ITT parameter with whether parents identified as indigenous (where 0 = not indigenous and 1 = indigenous). The interaction coefficient indicates the degree to which parental involvement effects vary across non-indigenous and indigenous parents. In the information experiment, the improve- ments in parental behavior at home are concentrated among indigenous parents (see Table A4). For example, 20.5% of indigenous parents in control schools helped their children with homework, while 31.9% of indigenous parents in treatment schools did so. In contrast, 19.6% of non-indigenous par- ents in control schools supported their children with homework, whereas 25.1% of non-indigenous 16 parents in treatment schools did so. The difference in effect sizes between indigenous and non- indigenous parents is significant at the p < 0.10 level. To contextualize the null results on teaching behavior for the double grant experiment and information experiment, it is important to note that both interventions were targeted at parents. Any changes in teaching behavior would require not only increases in parental engagement in schools but also require parents to have sufficient opportunity to demand teachers to improve their behaviors. It is also worth noting the strength of teacher unions in Mexico, which ensures considerable job security in the profession (Estrada, 2019; Santibanez, 2006). Given that the majority of teachers in public schools are unionized, there may be little incentive for teachers to directly respond to parental demands. 6.4 Educational outcomes Finally, we examine whether the interventions had impacts on educational outcomes. Overall, results presented in Table 6 show null effects of providing grants – at the intensive margin (double grant) as well as at the extensive margin (single grant) – on school and student outcomes. These null results on educational outcomes are consistent with our previous findings that the grants did not induce meaningful changes in parenting and teaching behaviors. For the information intervention, we observe a 1.2 percentage point reduction in dropout rates and a 6.0 percentage point decrease in disciplinary action in treatment schools. Given our previous results on child inputs, these improvements in educational outcomes are likely to have been driven by changes in parenting behavior at home rather than by changes in teaching behavior. Notably, the information intervention did not have any impacts on test scores. Our findings are largely consistent with findings from (Avvisati et al., 2014), which found that a parental outreach program in France increased parental involvement and improved student behaviors, but did not raise student achievement. Taken together, our analysis highlights three key results. First, the two experiments induced different types of parental involvement in schools. Through the double grant, parent associations gained a moderate increase in financial resources. This additional money allowed parents to tem- porarily “have a seat at the table” with respect to school decision making processes. In contrast, the information intervention provided parents with resources to support their children’s education 17 from both within and outside schools. This information encouraged parents to become more in- volved in school activities and events, and to establish regular meetings with teachers to discuss their children’s performance in schools. 
Second, the information intervention changed parenting behavior at home. Parents in schools that received the information intervention were significantly more likely to be aware of their chil- dren’s school assignments and to help their children with homework. Moreover, these changes in parenting behavior were concentrated among indigenous parents. For example, 32 percent of indige- nous parents in treatment schools helped their children with homework, whereas only 20 percent of indigenous parents in control schools did so. The large impacts among indigenous parents suggest the potential for improving school-to-parent communication by targeting groups that have been historically excluded. Third, despite increases in parental involvement, the double grant and information provision did not affect student test scores on the national standardized exam. For the double grant experiment, the null results on educational outcomes are consistent with the fact that parents did not significantly change their parenting behavior at home and teachers did not significantly shift their teaching behavior. In contrast, the information intervention improved parental support for children’s learning at home. These changes in parenting behavior likely contributed to the marginal reduction in school dropout and disciplinary action, but did not translate to improvements in educational achievement. 7 Mechanism: Trust between parents and teachers Our study demonstrates that group-based interventions aimed at parents can improve parental involvement in schools and encourage parents to take a more active role in supporting their children’s learning at home. However, we show that increases in parental involvement do not necessarily translate to improvements in educational outcomes. While there are many plausible explanations for why we do not observe a causal link between parental involvement and educational outcomes, we focus our attention in this section on the theoretically important concept of trust. Parental involvement programs rely on the formation of successful social ties between parents and teachers to collectively support the needs of children. What constitutes a successful parent- teacher relationship? A large body of theoretical and empirically research suggests that trust is a 18 core component of social capital (Coleman, 1994; Putnam, 2001) and the absence of trust severely hampers transactions between actors (Fehr, 2009). Trust is formed between individuals through networks and institutions (Ostrom, 2000). In net- works, the repeated nature of social interaction allows individuals to examine each other’s behaviors. If these repeated interactions send a positive (negative) signal, trust is enhanced (diminished). In institutions, rules are established to punish or reward behaviors, and a common understanding of these rules between individuals can foster trust. However, when rules are not clear in institutions, a lack of common expectations can decrease trust. We view the information treatment as an intervention aimed to enhance network formation as parents are expected to participate in group discussions with other parents and teachers, and share their views and experiences. In the information sessions, parents learn about what teachers are teaching in school and how the learning objectives align with children’s development. This means that in theory, the information sessions give parents an opportunity to receive repeated positive signals about teachers. 
Thus, we hypothesize an enhancement of trust between parents and teachers from the information intervention.

In contrast, we view the double grant treatment as an intervention aimed to strengthen rules in institutions (in this case, schools) by giving parents more financial authority over school resources. Given the flexibility in how these funds can be allocated, the double grant intervention can create "an incomplete social contract" (Ostrom, 2000), whereby parents and teachers may not share common expectations about how these funds should be distributed. Thus, we hypothesize that the institutional context surrounding the double grant intervention can lead to a decline in trust between teachers and parents.

To test these theoretical predictions, we estimate the effect of the two experiments on trust. Trust is widely measured by asking survey respondents whether they trust others.9 Following the survey literature on measuring trust in economics (Fehr, 2009; Glaeser, Laibson, Scheinkman, & Soutter, 2000), we directly asked parents and teachers about their trust in each other. Specifically, we asked parents, "do you think that most teachers can be trusted?" and asked teachers, "do you think that most parents can be trusted?"

9 The most frequently used measure of trust is based on the American General Social Survey (GSS) and the World Values Survey (WVS), which asks, "Generally speaking, would you say that most people can be trusted or that you can't be too careful in dealing with people?" Miller and Mitamura (2003) demonstrate that this wording measures both beliefs about the trustworthiness of others as well as preferences towards taking social risks. We follow the recommendation by Miller and Mitamura (2003) and remove the risk preference aspect of the question.

In addition to directly asking teachers and parents about the trustworthiness of each other, we also construct a measure of responsibility. At the core of parent-teacher relationships is the shared responsibility of educating children. Thus, we hypothesize that parent and teacher trustworthiness are likely to be closely related to how well they are perceived to be carrying out this shared responsibility of supporting children's learning. We construct a responsibility index separately for teachers and parents using the first principal components from principal component analyses of several survey items. For parental views of teacher responsibility, we use the following survey items: (i) teachers are available for meetings, (ii) teachers support extra-curricular activities, (iii) teachers handle conflict resolution between students, (iv) teachers provide additional classes for struggling students, (v) teachers care about student achievement, and (vi) teachers provide feedback on student assignments. For teacher views of parent responsibility, we use the following survey items: (i) parents are available for meetings, (ii) parents help children with school work, (iii) parents make sure that children are completing school assignments, and (iv) parents support extra-curricular activities.

For this exploratory analysis, we re-estimate equation (1) using trust and job responsibility as the outcomes of interest. The results are presented in columns (1) and (3) in Table 7. In the double grant experiment, parents' view of teacher trustworthiness significantly declined by 9.6 percentage points from a high baseline level of 91.9%.
This negative impact on teacher trustworthiness persists into years 2 and 3 of the double grant intervention. We observe a similarly sharp decrease in teachers’ view of parent trustworthiness, between 8.5 and 10.0 percentage points. These declines in trustworthiness seems to track with perceptions of responsibility, as parents are significantly less likely to believe that teachers are carrying out their duties to support their children’s education. Moreover, the diminished trust between parents and teachers may explain why the temporary increase in year 1 for parental involvement in school decision-making was not sustained in subsequent years. While speculative, the double grant intervention’s negative effect on trust suggests that parental involvement interventions can backfire if institutional rules are unclear about the expectations of parents and teachers as parents increase their involvement in schools. In the information experiment, parents’ trust towards teachers significantly improved, which is consistent with the theoretical prediction above. Parents in treatment schools are 14.8 percentage points more likely to believe that most teachers can be trusted, from a baseline level of 81.2%. 20 This also tracks with the large positive effects of the intervention on parents’ perception of teacher responsibility. While we see positive shifts on the parent-side, we do not observe any significant changes in teachers’ trust or view of responsibility. These results underscore the fact that the information intervention targeted parents, not teachers, in promoting school-to-parent communica- tion. The results also raise a question about whether information interventions may need to more formally integrate teachers in order to foster stronger social ties between parents and teacher to support children’s development. 8 Conclusion In this paper, we examined the effectiveness of group-based parental involvement interventions using two randomized controlled trials. The first experiment provided parent associations in treatment schools with a modest increase in financial resources. The second experiment provided information to parents about how to become more involved in their children’s schooling. We found that the two experiments induced different types of parental involvement in schools. Through the double grant, parent associations gained a moderate increase in financial resources. This additional money allowed parents to temporarily “have a seat at the table” with respect to school decision making processes. In contrast, the information intervention provided parents with resources to support their children’s education from both within and outside schools. This informa- tion encouraged parents to become more involved in school activities and events, and to establish regular meetings with teachers to discuss their children’s performance in schools. The information intervention was effective at changing parenting behavior at home. Parents in schools that received the information intervention were significantly more likely to be aware of their children’s school assignments and to help their children with homework. Moreover, these changes in parenting behavior were concentrated among indigenous parents. The result highlights the enormous potential for improving school-to-parent communication by targeting groups that have been historically excluded from parent associations and school committees. Yet despite increases in parental involvement, neither interventions affected educational achieve- ment. 
To better understand these null effects, we explored how trust plays a role in parent-teacher relationships. We find that the parental involvement interventions led to significant changes in perceived trustworthiness between teachers and parents. The double grant intervention led to a significant decline in relational trust between teachers and parents. Meanwhile, the information intervention resulted in improvements in teachers' trustworthiness. While these results are suggestive, we hypothesize that relational trust is important for establishing positive parent-teacher relationships and creating learning environments conducive to raising student performance.

9 Tables & Figures

Figure 1: States of Mexico
[Map of Mexico shaded by percent indigenous population (less than 10%, 10-19%, 20-29%, greater than 30%). Study states: Yucatán 65.40, Chiapas 36.15, Puebla 35.28, Guerrero 33.92.]
Note: The figure shows the percent of indigenous population in each state in Mexico. The parental involvement interventions were carried out in Guerrero, Puebla, Chiapas and Yucatán (labeled in white font). Data source: World Bank Open Data.

Figure 2: Experimental design of two parental involvement interventions
[Timeline of randomization and data collection, years 0-3. Experiment 1 (double grant): 250 schools randomized in year 0 into a control group receiving the standard grant ($500-700) & information (125 schools) and a treatment group receiving the double grant ($1,000-1,400) & information (125 schools). Experiment 2 (information): 180 schools randomized in year 2 into a control group receiving no grant & no information (100 schools) and a treatment group receiving no grant & information (80 schools), observed in year 3.]
Note: The figure summarizes the experimental design of the two randomized controlled trials in this study. Baseline survey data for experiment 1 was collected in year 0 and baseline survey data for experiment 2 was collected in year 2.

Table 1: Baseline balance of double grant experiment
Columns: Control (Grant & Info) mean (S.D.); Treatment (Double Grant & Info) mean (S.D.); Difference T-C (S.E.)
Panel A: School characteristics
Type of school
General school (1=Yes) 0.600 (0.492) 0.496 (0.502) -0.104* (0.063)
Indigenous school (1=Yes) 0.400 (0.492) 0.504 (0.502) 0.104* (0.063)
Parent association president
Highest edu. is primary (1=Yes) 0.776 (0.419) 0.816 (0.389) 0.040 (0.051)
Years as president 1.376 (1.336) 1.328 (1.148) -0.048 (0.158)
Indigenous (1=Yes) 0.400 (0.492) 0.504 (0.502) 0.104* (0.063)
Teachers
Prop. with teaching college degree 0.165 (0.315) 0.210 (0.361) 0.045 (0.043)
Prop. with university degree 0.606 (0.446) 0.544 (0.462) -0.063 (0.057)
Failure rate 0.099 (0.066) 0.097 (0.093) -0.003 (0.010)
Repetition rate 0.070 (0.060) 0.068 (0.061) -0.003 (0.008)
Dropout rate 0.022 (0.039) 0.024 (0.072) 0.002 (0.007)
Number of schools 125 125
p-value of joint F-test 0.768
Panel B: Student characteristics
Indigenous (1=Yes) 0.391 (0.488) 0.426 (0.495) 0.036 (0.064)
Female (1=Yes) 0.487 (0.500) 0.487 (0.500) 0.000 (0.012)
Household wealth index (S.D.) 0.000 (1.000) -0.027 (1.012) -0.027 (0.088)
Grade 3 (1=Yes) 0.333 (0.471) 0.338 (0.473) 0.004 (0.008)
Grade 4 (1=Yes) 0.337 (0.473) 0.341 (0.474) 0.004 (0.008)
Grade 5 (1=Yes) 0.329 (0.470) 0.321 (0.467) -0.008 (0.008)
Language score 440.623 (87.430) 437.804 (89.606) -2.819 (9.750)
Math score 450.740 (97.461) 447.853 (102.326) -2.886 (11.477)
Number of students 4796 4570
p-value of joint F-test 0.943
Notes: Robust standard errors clustered at school level in Panel B. Reported p-value is from a joint F-test of the null hypothesis that there are no mean differences between treatment and control across all variables in the panel.
All variables measured in 2007. * p<0.1, ** p<0.05, *** p<0.01

Table 2: Baseline balance of information experiment
Columns: Control (No Grant & No Info) mean (S.D.); Treatment (No Grant & Info) mean (S.D.); Difference T-C (S.E.)
Panel A: School characteristics
Type of school
General school (1=Yes) 1.000 (0.000) 1.000 (0.000) 0.000 (0.000)
Parent association president
Highest edu. is primary (1=Yes) 0.650 (0.479) 0.662 (0.476) 0.013 (0.072)
Years as president 1.590 (0.830) 1.688 (0.894) 0.098 (0.129)
Indigenous (1=Yes) 0.200 (0.402) 0.150 (0.359) -0.050 (0.058)
Teachers
Prop. with teaching college degree 0.207 (0.323) 0.246 (0.333) 0.038 (0.049)
Prop. with university degree 0.603 (0.424) 0.529 (0.419) -0.073 (0.063)
Failure rate 0.080 (0.060) 0.068 (0.063) -0.013 (0.009)
Repetition rate 0.055 (0.053) 0.047 (0.044) -0.009 (0.007)
Dropout rate 0.025 (0.042) 0.026 (0.045) 0.001 (0.007)
Number of schools 100 80
p-value of joint F-test 0.477
Panel B: Student characteristics
Indigenous (1=Yes) 0.151 (0.358) 0.115 (0.319) -0.036 (0.047)
Female (1=Yes) 0.499 (0.500) 0.492 (0.500) -0.007 (0.011)
Household wealth index (S.D.) 0.000 (1.000) -0.011 (0.916) -0.011 (0.080)
Grade 3 (1=Yes) 0.314 (0.464) 0.329 (0.470) 0.015* (0.008)
Grade 4 (1=Yes) 0.349 (0.477) 0.332 (0.471) -0.018** (0.007)
Grade 5 (1=Yes) 0.336 (0.473) 0.339 (0.473) 0.003 (0.005)
Language score 495.214 (107.029) 489.013 (102.780) -6.201 (9.327)
Math score 513.129 (120.626) 505.739 (116.290) -7.390 (10.694)
Number of students 4576 3602
p-value of joint F-test 0.284
Notes: Robust standard errors clustered at school level in Panel B. Reported p-value is from a joint F-test of the null hypothesis that there are no mean differences between treatment and control across all variables in the panel. All variables measured in 2009. * p<0.1, ** p<0.05, *** p<0.01

Figure 3: Common support for non-experimental contrast
[Panel (a): Selection into treatment by proportion of indigenous students (binned scatter; y-axis: probability of selection to treatment, x-axis: proportion of indigenous students). Panel (b): Kernel density estimates of percent indigenous students, treatment group versus comparison group. Panel (c): Histogram density of percent indigenous students, treatment group versus comparison group.]
Note: These figures assess the selection process and common support for the non-experimental contrast: the treatment group is the single grant & no information group from experiment 1 and the comparison group is the no grant & information group from experiment 2. Panel (a) is a binned scatter plot of the percent of schools assigned to treatment as a function of the proportion of indigenous students. Schools are binned into 20 equal-sized groups along the x-axis, and the mean of each group is plotted on the y-axis. Panel (b) is a kernel density of the proportion of indigenous students after the data have been trimmed to exclude indigenous schools. Panel (c) is the histogram density of the proportion of indigenous students after the data have been trimmed to exclude indigenous schools and schools outside of the 75th and 99th percentiles of the indigenous student population. The first to fourth numbers on the x-axis (0.03, 0.14, 0.66, 0.94) correspond to the 75th, 90th, 95th, and 99th percentiles.
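To make the trimming and common-support check described in the note to Figure 3 concrete, the sketch below applies an analogous percentile-based trimming rule to synthetic school-level data and compares the trimmed groups. The variable names and data are hypothetical placeholders, not the study's data or code.

```python
# Minimal sketch: trim to a common-support band on the share of indigenous
# students and compare treatment vs. comparison groups (synthetic data).
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Hypothetical school-level data: treated = single grant & no information group,
# comparison = no grant & information group (placeholder values only).
schools = pd.DataFrame({
    "treated": rng.integers(0, 2, 400),
    "prop_indigenous": rng.beta(1.5, 6.0, 400),
})

# Keep schools between the 75th and 99th percentiles of the indigenous share,
# mirroring the trimming rule described in the note to Figure 3.
lo, hi = schools["prop_indigenous"].quantile([0.75, 0.99])
trimmed = schools[schools["prop_indigenous"].between(lo, hi)]

# Binned "scatter": mean treatment probability within 20 equal-sized bins.
trimmed = trimmed.assign(bin=pd.qcut(trimmed["prop_indigenous"], 20, duplicates="drop"))
binned = trimmed.groupby("bin", observed=True)["treated"].mean()
print(binned.head())

# Quick check of overlap in the trimmed sample.
print(trimmed.groupby("treated")["prop_indigenous"].describe())
```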
Figure 4: Allocation of double grant
[Bar chart: percent of the double grant spent (0-40%) in each post-treatment year (1 year, 2 years, 3 years), by spending category: learning related supplies, health related supplies, repairs, upgrades, rent & utilities, transportation, and construction.]
Note: The figure shows the means and 95% confidence intervals of the allocation of the double grant in treatment schools. "Learning related supplies" are spending on books, writing utensils (e.g., pencils, pens, chalk, etc.), and writing surfaces (e.g., paper, chalkboard, notebooks). "Health related supplies" are spending on first aid kits, basic personal hygiene products, and cleaning supplies. "Repairs" refer to funds spent on fixing broken equipment, furniture, and space in schools. "Upgrades" refer to funds spent on purchasing new equipment and furniture. "Rent & utilities" are spending related to real estate, electricity, water, etc. "Transportation" is spending related to transportation for teachers and students. "Construction" is spending on the infrastructure of classrooms and teacher dormitories. Data source: Parent association survey.

Table 3: Delivery of information to parent association
Columns: (1) Overview; (2) Role of parents; (3) Community resources; (4) Child development; (5) Action plans
Information experiment
Treatment 0.936*** 0.910*** 0.936*** 0.949*** 0.936***
 (0.028) (0.033) (0.028) (0.025) (0.028)
Control mean 0.000 0.000 0.000 0.000 0.000
Observations 174 174 174 174 174
Notes: Robust standard errors in parentheses. Data source: parent surveys. * p<0.1, ** p<0.05, *** p<0.01

Table 4: Effect on parental involvement
Columns: (1) Organized school activities & events; (2) Met with teachers to discuss student performance; (3) Involved in school decision making; (4) Percent of parents regularly attending meetings
Double grant experiment
Treatment (1 year) 0.076* 0.034 0.153** -0.726
 (0.044) (0.042) (0.060) (2.383)
 WY p-value 0.160 0.660 0.020 0.780
Treatment (2 year) 0.084** 0.003 0.046 0.034
 (0.042) (0.042) (0.062) (1.901)
 WY p-value 0.160 1.000 0.840 1.000
Treatment (3 year) 0.067* -0.021 0.016 -1.073
 (0.040) (0.045) (0.064) (1.666)
 WY p-value 0.320 0.900 0.900 0.840
Control mean 0.824 0.840 0.584 86.776
Observations 244 244 244 244
Information experiment
Treatment (1 year) 0.150*** 0.129** 0.037 3.300
 (0.055) (0.054) (0.075) (2.078)
 WY p-value 0.020 0.080 0.560 0.280
Control mean 0.730 0.750 0.550 87.770
Observations 174 174 174 174
Single grant observation
Treatment (1 year) -0.056 -0.084 0.013 -1.839
 (0.055) (0.053) (0.085) (2.123)
 WY p-value 0.740 0.340 0.900 0.740
Control mean 0.910 0.910 0.615 91.231
Observations 151 151 151 151
Notes: Robust standard errors in parentheses. Westfall and Young (WY) p-values reported. Dependent variables in columns (1)-(3) are binary (where 1=Yes). The dependent variable in column (4) is continuous. Data source: Parent association surveys.
* p<0.1, ** p<0.05, *** p<0.01

Table 5: Effect on parenting and teaching behaviors
Columns — Parenting: (1) Aware of school assignments; (2) Helps with homework. Teaching: (3) Days absent in past month; (4) Student-centered instruction
Double grant experiment
Treatment (1 year) 0.018 0.021 -0.335 0.031
 (0.026) (0.026) (0.367) (0.055)
 WY p-value 0.790 0.790 0.790 0.790
Treatment (2 year) 0.047 0.043 0.295 0.038
 (0.034) (0.033) (0.299) (0.059)
 WY p-value 0.550 0.560 0.590 0.590
Treatment (3 year) 0.021 0.030 -0.172 0.017
 (0.033) (0.033) (0.433) (0.064)
 WY p-value 0.930 0.780 0.950 0.950
Control mean 0.437 0.414 1.597 0.076
Observations 8881 8881 8881 8881
Information experiment
Treatment (1 year) 0.045** 0.072*** -0.087 -0.002
 (0.021) (0.021) (0.466) (0.058)
 WY p-value 0.130 0.010 0.990 0.990
Control mean 0.245 0.203 1.534 -0.094
Observations 7950 7950 7950 7950
Single grant observation
Treatment (1 year) -0.038 -0.043 0.694 -0.039
 (0.036) (0.031) (0.610) (0.072)
 WY p-value 0.560 0.550 0.560 0.580
Control mean 0.306 0.270 2.591 -0.013
Observations 6288 6288 6288 6288
Notes: Robust standard errors in parentheses. Westfall and Young (WY) p-values reported. Dependent variables in columns (1) and (2) are binary (where 1=Yes). Dependent variables in columns (3) and (4) are continuous. Data source: Student surveys. * p<0.1, ** p<0.05, *** p<0.01

Table 6: Effect on educational outcomes
Columns — School-level: (1) Failure rate; (2) Dropout rate; (3) Repetition rate. Student-level: (4) Spanish test (S.D.); (5) Math test (S.D.); (6) Disciplinary action
Double grant experiment
Treatment (1 year) 0.013 0.001 0.000 0.079 0.079 0.003
 (0.009) (0.006) (0.005) (0.075) (0.082) (0.024)
 WY p-value 0.220 0.960 0.990 0.490 0.490 0.900
Treatment (2 year) 0.001 0.008 -0.001 0.063 0.005 -0.006
 (0.008) (0.006) (0.004) (0.084) (0.087) (0.026)
 WY p-value 0.940 0.420 0.880 0.770 0.950 0.950
Treatment (3 year) -0.003 0.003 -0.001 0.037 -0.002 -0.008
 (0.008) (0.007) (0.004) (0.098) (0.103) (0.027)
 WY p-value 1.000 0.970 1.000 0.930 1.000 0.930
Control mean 0.082 0.073 0.020 0.021 -0.093 0.269
Observations 250 250 250 8881 8881 8881
Information experiment
Treatment (1 year) -0.010 -0.012* -0.006 0.078 0.023 -0.061***
 (0.008) (0.006) (0.006) (0.107) (0.112) (0.021)
 WY p-value 0.280 0.250 0.280 0.610 0.850 0.000
Control mean 0.069 0.056 0.025 0.066 -0.078 0.254
Observations 180 180 180 7981 7981 7981
Single grant observation
Treatment (1 year) 0.018 0.007 0.007 0.112 0.139 0.025
 (0.011) (0.006) (0.007) (0.109) (0.118) (0.026)
 WY p-value 0.210 0.390 0.420 0.570 0.570 0.570
Control mean 0.056 0.041 0.019 0.109 -0.086 0.200
Observations 151 151 151 6288 6288 6288
Notes: Robust standard errors clustered by school in parentheses. Westfall and Young (WY) p-values reported.
Data source: Administrative data from school census (E911) and national standardized exam scores (ENLACE). * p<0.1, ** p<0.05, *** p<0.01

Table 7: Trust and responsibility between teachers and parents
Columns — Parent's perspective: (1) Most teachers can be trusted; (2) Teacher responsibility index (S.D.). Teacher's perspective: (3) Most parents can be trusted; (4) Parent responsibility index (S.D.)
Double grant experiment
Treatment (1 year) -0.096** -0.418*** -0.110** -0.096
 (0.043) (0.133) (0.048) (0.121)
 WY p-value 0.000 0.000 0.100 0.460
Treatment (2 year) -0.089*** -0.300** -0.099** -0.154
 (0.034) (0.129) (0.040) (0.129)
 WY p-value 0.000 0.000 0.060 0.140
Treatment (3 year) -0.113*** -0.217 -0.115** -0.068
 (0.040) (0.142) (0.045) (0.135)
 WY p-value 0.020 0.080 0.000 0.460
Control mean 0.920 0.000 0.803 -0.000
Observations 244 244 732 732
Information experiment
Treatment (1 year) 0.149*** 0.309** 0.027 0.294*
 (0.046) (0.152) (0.055) (0.164)
 WY p-value 0.000 0.040 0.500 0.120
Control mean 0.820 0.000 0.777 -0.000
Observations 174 174 522 522
Single grant observation
Treatment (1 year) -0.007 -0.446*** 0.027 -0.192
 (0.034) (0.170) (0.070) (0.193)
 WY p-value 0.900 0.000 0.840 0.640
Control mean 0.962 0.337 0.778 0.209
Observations 151 151 453 453
Notes: Robust standard errors clustered by school in parentheses. Westfall and Young (WY) p-values reported. Data source: parent association survey and teacher survey. * p<0.1, ** p<0.05, *** p<0.01

References

Al-Ubaydli, O., List, J. A., & Suskind, D. (2020). The science of using science: Towards an understanding of the threats to scalability. International Economic Review.
Angrist, J., & Frandsen, B. (2019). Machine labor (Tech. Rep.). National Bureau of Economic Research.
Attanasio, O. P., & Kaufmann, K. M. (2014). Education choices and returns to schooling: Mothers' and youths' subjective expectations and their role by gender. Journal of Development Economics, 109, 203–216.
Avvisati, F., Gurgand, M., Guyon, N., & Maurin, E. (2014). Getting parents involved: A field experiment in deprived schools. Review of Economic Studies, 81(1), 57–83.
Banerjee, A., Banerji, R., Berry, J., Duflo, E., Kannan, H., Mukerji, S., ... Walton, M. (2017). From proof of concept to scalable policies: Challenges and solutions, with an application. Journal of Economic Perspectives, 31(4), 73–102.
Banerjee, A., Banerji, R., Duflo, E., Glennerster, R., & Khemani, S. (2010). Pitfalls of participatory programs: Evidence from a randomized evaluation in education in India. American Economic Journal: Economic Policy, 2(1), 1.
Barrera-Osorio, F., Gonzalez, K., Lagos, F., & Deming, D. J. (2020). Providing performance information in education: An experimental evaluation in Colombia. Journal of Public Economics, 186, 104185.
Beasley, E., & Huillery, E. (2017). Willing but unable? Short-term experimental evidence on parent empowerment and school quality. The World Bank Economic Review, 31(2), 531–552.
Belloni, A., Chen, D., Chernozhukov, V., & Hansen, C. (2012). Sparse models and methods for optimal instruments with an application to eminent domain. Econometrica, 80(6), 2369–2429.
Belloni, A., Chernozhukov, V., & Hansen, C. (2014). Inference on treatment effects after selection among high-dimensional controls. The Review of Economic Studies, 81(2), 608–650.
Bergman, P. (2016). Parent-child information frictions and human capital investment: Evidence from a field experiment. Journal of Political Economy.
Blimpo, M. P., Evans, D. K., & Lahire, N. (2011). School-based management and educational outcomes: Lessons from a randomized field experiment. Unpublished manuscript.
Coleman, J. S. (1994). Foundations of social theory. Harvard University Press.
Cunha, F., Heckman, J. J., Lochner, L., & Masterov, D. V. (2006). Interpreting the evidence on life cycle skill formation. Handbook of the Economics of Education, 1, 697–812.
Díaz-Martin, L., Gopalan, A., Guarnieri, E., & Jayachandran, S. (2020). Greater than the sum of the parts? Evidence on mechanisms operating in women's groups.
Dizon-Ross, R. (2019). Parents' beliefs about their children's academic ability: Implications for educational investments. American Economic Review, 109(8), 2728–65.
Doepke, M., Sorrenti, G., & Zilibotti, F. (2019). The economics of parenting. Annual Review of Economics, 11, 55–84.
Duflo, E., Dupas, P., & Kremer, M. (2015). School governance, teacher incentives, and pupil-teacher ratios: Experimental evidence from Kenyan primary schools. Journal of Public Economics, 123, 92–110.
Estrada, R. (2019). Rules versus discretion in public service: Teacher hiring in Mexico. Journal of Labor Economics, 37(2), 545–579.
Fehr, E. (2009). On the economics and biology of trust. Journal of the European Economic Association, 7(2-3), 235–266.
Garcia-Moreno, V. A., Gertler, P., & Patrinos, H. A. (2020). School-based management and learning outcomes: Experimental evidence from Colima, Mexico. Community Participation with Schools in Developing Countries: Towards Equitable and Inclusive Basic Education for All.
Gertler, P., Patrinos, H. A., & Rubio-Codina, M. (2012). Empowering parents to improve education: Evidence from rural Mexico. Journal of Development Economics, 99(1), 68–79.
Glaeser, E. L., Laibson, D. I., Scheinkman, J. A., & Soutter, C. L. (2000). Measuring trust. The Quarterly Journal of Economics, 115(3), 811–846.
Goller, D., Lechner, M., Moczall, A., & Wolff, J. (2020). Does the estimation of the propensity score by machine learning improve matching estimation? The case of Germany's programmes for long-term unemployed. Labour Economics, 101855.
Hall, G., & Patrinos, H. A. (2004). Indigenous peoples, poverty and human development in Latin America: 1994-2004. The World Bank.
Houtenville, A. J., & Conway, K. S. (2008). Parental effort, school resources, and student achievement. Journal of Human Resources, 43(2), 437–453.
Jensen, R. (2010). The (perceived) returns to education and the demand for schooling. The Quarterly Journal of Economics, 125(2), 515–548.
Lechner, M., & Strittmatter, A. (2017). Practical procedures to deal with common support problems in matching estimation. Econometric Reviews, 38(2), 193–207.
Lee, B. K., Lessler, J., & Stuart, E. A. (2010). Improving propensity score weighting using machine learning. Statistics in Medicine, 29(3), 337–346.
Lee, J. S., & Bowen, N. K. (2006). Parent involvement, cultural capital, and the achievement gap among elementary school children. American Educational Research Journal, 43(2), 193–218.
Mani, A., Mullainathan, S., Shafir, E., & Zhao, J. (2013). Poverty impedes cognitive function. Science, 341(6149), 976–980.
Mejia, A., & Filus, A. (2018). Exploring predictors of impact of school-based management in rural Mexico: Do student engagement, teacher attitudes and parent involvement predict better academic outcomes? International Journal of Educational Research, 88, 95–108.
Miller, A. S., & Mitamura, T. (2003). Are surveys on trust trustworthy? Social Psychology Quarterly, 62–70.
Mullainathan, S., & Shafir, E. (2013). Scarcity: Why having too little means so much. Macmillan.
Murnane, R. J., Willet, J. B., & Cardenas, S. (2006). Did participation of schools in Programa Escuelas de Calidad influence student outcomes? Working Paper.
Nguyen, T. (2008). Information, role models and perceived returns to education: Experimental evidence from Madagascar. Unpublished manuscript, 6.
Ostrom, E. (2000). Social capital: A fad or a fundamental concept. Social Capital: A Multifaceted Perspective, 172(173), 195–98.
Pradhan, M., Suryadarma, D., Beatty, A., Wong, M., Gaduh, A., Alisjahbana, A., & Artha, R. P. (2014). Improving educational quality through enhancing community participation: Results from a randomized field experiment in Indonesia. American Economic Journal: Applied Economics, 6(2), 105–26.
Putnam, R. (2001). Social capital: Measurement and consequences. Canadian Journal of Policy Research, 2(1), 41–51.
Rogers, T., & Feller, A. (2018). Reducing student absences at scale by targeting parents' misbeliefs. Nature Human Behaviour, 2(5), 335.
Santibanez, L. (2006). Why we should care if teachers get A's: Teacher test scores and student achievement in Mexico. Economics of Education Review, 25(5), 510–520.
Santibanez, L., Abreu-Lastra, R., & O'Donoghue, J. L. (2014). School based management effects: Resources or governance change? Evidence from Mexico. Economics of Education Review, 39, 97–109.
Skoufias, E., & Shapiro, J. (2006). The pitfalls of evaluating a school grants program using non-experimental data. Policy Research Working Paper, 4036.
Small, M. L., & Gose, L. E. (2020). How do low-income people form survival networks? Routine organizations as brokers. The ANNALS of the American Academy of Political and Social Science, 689(1), 89–109.
Todd, P. E., & Wolpin, K. I. (2007). The production of cognitive achievement in children: Home, school, and racial test score gaps. Journal of Human Capital, 1(1), 91–136.
Urminsky, O., Hansen, C., & Chernozhukov, V. (2016). Using double-lasso regression for principled variable selection. Working Paper.
Westfall, P. H., Young, S. S., & Wright, S. P. (1993). On adjusting p-values for multiplicity. Biometrics, 49(3), 941–945.
Zou, H. (2006). The adaptive lasso and its oracle properties. Journal of the American Statistical Association, 101(476), 1418–1429.

A Appendix

A1 Details of the information intervention

Information was provided to parents in five sessions. Each session was facilitated by the community advisor and focused on group discussion. Details of the sessions are described below.

Overview: The first session provided an overview of the importance of parental involvement in schools. The community advisor and parents introduced themselves. This session was intended to establish community norms, with the community advisor encouraging parents to share their perspectives and raise questions. The logistics of future sessions were planned.

Role of parents: The second session focused on how parents can support their children's learning both within and outside schools. Parents formed groups of 3-7 to discuss recent events and issues in their community that have affected their children's education. The community advisor facilitated discussion about how parents can work together with teachers in the school to tackle these issues.

Community resources: In the third session, the community advisor provided information about educational and health resources in the community.
Parents received a detailed map of where to access these resources. Parents formed groups of 3-7 to share their views about additional resources that are needed in the community.

Child development: The fourth session focused on children's development. The community advisor explained the key learning objectives and materials covered in each grade of primary school. Parents formed groups of 3-7 to discuss concretely what parents can do to support their children's learning at home.

Action plans: The last session focused on making concrete action plans. The action plans covered ideas or initiatives that individual parents can pursue to support their children's education, as well as what the parent association can do to increase parental involvement in schools.

A2 Details of post-double selection lasso

To compare groups across experiments, we assume that the potential outcomes of units in the treatment group (grant & information) and comparison group (no grant & information) are conditionally independent of treatment assignment, given observed pre-treatment covariates x_1, ..., x_p. We select covariates using the double-lasso, or post-double selection (PDS), procedure (Belloni et al., 2014). PDS uses lasso regression, a penalized regression that minimizes

\frac{1}{n}\sum_{i=1}^{n}\left(y_i - x_i\beta\right)^2 + \lambda\sum_{j=1}^{p}\left|\beta_j\right| \qquad (A1)

where n is the sample size, y is the outcome, x contains the p potential covariates, \beta is the vector of coefficients on x, and \beta_j is the j-th element of \beta. The first term is the least-squares fit measure and the second term is the penalty term. The tuning parameter \lambda determines the degree of shrinkage of the estimated coefficients.

In our sensitivity analyses, we select the penalty level \lambda using four different model specifications. The first and main model uses the 'plug-in' penalty, which chooses a value of \lambda large enough to control the noise in the data but small enough to limit shrinkage bias (Belloni et al., 2012). The second specification uses cross-validation: we partition the data into 10 folds, treat each fold in turn as a validation (testing) data set while the remaining folds are used as training data, and select the value of \lambda with the best out-of-sample predictive performance across the folds. The third model is a variant of cross-validation that selects the largest penalty such that the mean-squared error (MSE) is within one standard deviation of the minimum cross-validated MSE. Our last specification uses the adaptive lasso, a two-step procedure in which the first step is cross-validation and the second step performs cross-validation among the covariates selected in the first step; this tends to exclude covariates with small coefficients that should have been omitted (Zou, 2006). While the plug-in penalty tends to produce models with few covariates, cross-validation tends to select many covariates (Belloni et al., 2012). Given that our identification relies on conditional independence, using cross-validation (and variants of it) to select \lambda allows us to examine the sensitivity of our main results to the inclusion of additional covariates. A minimal illustrative sketch of the PDS procedure follows; the candidate covariates for each of our outcome variables are then summarized in the table below.
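The sketch below illustrates the post-double selection idea using synthetic data and a generic lasso implementation. The variable names are hypothetical placeholders, and the paper's actual specifications (the plug-in penalty, school-level clustering, and the full covariate sets) are not reproduced here.

```python
# Minimal sketch of post-double selection (PDS): select controls with a lasso of
# the outcome on the covariates and a lasso of the treatment on the covariates,
# then run OLS of the outcome on the treatment plus the union of selected controls.
# Synthetic data; the penalty is chosen by cross-validation rather than the plug-in rule.
import numpy as np
import statsmodels.api as sm
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p = 500, 50
X = rng.normal(size=(n, p))                             # candidate covariates
d = (X[:, 0] + rng.normal(size=n) > 0).astype(float)    # treatment depends on X[:, 0]
y = 1.0 * d + 0.5 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n)

# Step 1: lasso of y on X.  Step 2: lasso of d on X.
sel_y = np.flatnonzero(LassoCV(cv=10).fit(X, y).coef_)
sel_d = np.flatnonzero(LassoCV(cv=10).fit(X, d).coef_)
selected = np.union1d(sel_y, sel_d)

# Step 3: OLS of y on the treatment and the union of selected covariates.
Z = sm.add_constant(np.column_stack([d, X[:, selected]]))
fit = sm.OLS(y, Z).fit(cov_type="HC1")
print(f"selected {selected.size} covariates; treatment effect = {fit.params[1]:.3f}")
```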
38 Number of Category Outcomes Description of covariates covariates Head of parent association’s gender, indigenous status, family’s 121 indigenous status, education level, selection to position (election/appointed), Involved in school activities & events 121 years in position; community advisor is principal; lagged outcome variables Meeting with teachers to discuss student performance Parental involvement measured prior year; educational outcomes aggregated at school level in Involved in school-decision making 121 prior year; teachers’ educational level aggregated at school level in prior year; Percent of parents regularly attending meetings 121 average annual school expenditure in prior year; two-way interactions of covariates; missing indicators of covariates. Student’s gender, grade level, indigenous status, physical disabilities, learning disabilities, age of school entry, number of siblings, mother’s education level, Parenting: Aware of school assignments 300 household wealth index; teacher’s gender, indigenous status, holds non- Parenting & teaching Parenting: Helps with homework 300 teaching jobs, years of teaching experience, experience teaching same grade, behaviors Teaching: Days absent in past month 300 educational attainment level; lagged outcome variable aggregated at school Teaching: Student-centered instruction 300 level in prior year; two-way interactions of covariates; missing indicators of covariates. Teachers’ educational attainment level; average annual school expenditure in prior year; lagged outcome variable in prior year; educational outcomes Failure rate 60 aggregated at school level in prior year; head of parent association’s gender, Dropout rate 60 Educational outcomes indigenous status, family’s indigenous status, education level, selection to Repetition rate 60 position (election/appointed), years in position; community advisor is 39 principal; two-way interactions of covariates; missing indicators of covariates. Student’s gender, grade level, indigenous status, physical disabilities, learning disabilities, age of school entry, number of siblings, mother’s education level, Spanish test 299 household wealth index; teacher’s gender, indigenous status, holds non- Math test 299 teaching jobs, years of teaching experience, experience teaching same grade, Disciplinary action 299 educational attainment level; lagged outcome variable aggregated at school level in prior year; two-way interactions of covariates; missing indicators of covariates. Head of parent association’s gender, indigenous status, family’s indigenous status, education level, selection to position (election/appointed), years in position; community advisor is principal; lagged outcome variables Parent’s perspective: Most teachers can be trusted 119 measured prior year; educational outcomes aggregated at school level in Mechanisms Parent’s perspective: Teacher responsibility index 119 prior year; teachers’ educational level aggregated at school level in prior year; average annual school expenditure in prior year; two-way interactions of covariates; missing indicators of covariates. 
Teacher’s grade, gender, indigenous status, holds non- teaching jobs, years of teaching experience, experience teaching same grade, educational attainment Teacher’s perspective: Most parents can be trusted 58 level; lagged outcome variables measured prior year; educational outcomes Teacher’s perspective: Parent responsibility index 58 aggregated at school level in prior year; teachers’ educational level aggregated at school level in prior year; average annual school expenditure in prior year; two-way interactions of covariates; missing indicators of covariates. Table A1: Balance test conditional on response status for double grant experiment Control: Treatment: Difference: Grant & Info Double Grant & Info Mean (S.D.) Mean (S.D.) T-C (S.E.) Attrition rate Attrition (1=Yes) 0.032 (0.177) 0.016 (0.126) -0.016 (0.019) Number of schools 125 125 Panel A: School characteristics Type of school General school (1 = Yes) 0.603 (0.491) 0.496 (0.502) -0.107 * (0.064) Indigenous school (1=Yes) 0.397 (0.491) 0.504 (0.502) 0.107 * (0.064) Parent association president Highest edu. is primary (1=Yes) 0.777 (0.418) 0.813 (0.391) 0.036 (0.052) Years as president 1.388 (1.350) 1.309 (1.146) -0.079 (0.160) Teachers Prop. with teaching college degree 0.159 (0.310) 0.213 (0.363) 0.054 (0.043) 40 Prop. with university degree 0.613 (0.446) 0.544 (0.461) -0.068 (0.058) Failure rate 0.099 (0.066) 0.091 (0.075) -0.008 (0.009) Repetition rate 0.072 (0.060) 0.069 (0.061) -0.003 (0.008) Dropout rate 0.021 (0.035) 0.018 (0.036) -0.003 (0.005) Number of schools 121 123 p-value of joint F-test 0.738 Panel B: Student characteristics Female (1=Yes) 0.489 (0.500) 0.488 (0.500) -0.001 (0.012) Household wealth index (S.D.) 0.000 (1.001) -0.025 (1.015) -0.025 (0.090) Grade 3 (1=Yes) 0.334 (0.472) 0.337 (0.473) 0.003 (0.008) Grade 4 (1=Yes) 0.336 (0.472) 0.340 (0.474) 0.004 (0.008) Grade 5 (1=Yes) 0.330 (0.470) 0.323 (0.468) -0.007 (0.008) Language score 441.526 (87.272) 437.876 (89.871) -3.650 (9.959) Math score 451.370 (97.565) 448.122 (102.610) -3.248 (11.745) Number of students 4547 4524 p-value of joint F-test 0.978 Notes: Robust standard errors clustered at school level in Panel B. Reported p-value is from a joint F-test of the null hypoth- esis that there are no mean differences between treatment and control across all variables in the panel. All variables measured in 2007. * p<0.1, ** p<0.05, *** p<0.01 Table A2: Balance test conditional on response status for information experiment Control: Treatment: Difference No Grant & No Info No Grant & Info Mean (S.D.) Mean (S.D.) T-C (S.E.) Attrition rate Attrition (1=Yes) 0.040 (0.197) 0.025 (0.157) -0.015 (0.027) Number of schools 100 80 Panel A: School characteristics Type of school General school (1 = Yes) 1.000 (0.000) 1.000 (0.000) 0.000 (0.000) Parent association president Highest edu. is primary (1=Yes) 0.646 (0.481) 0.654 (0.479) 0.008 (0.073) Years as president 1.583 (0.842) 1.679 (0.904) 0.096 (0.133) Teachers Prop. with teaching college degree 0.209 (0.324) 0.252 (0.335) 0.043 (0.050) Prop. with university degree 0.614 (0.421) 0.530 (0.417) -0.084 (0.064) 41 Failure rate 0.079 (0.055) 0.069 (0.063) -0.010 (0.009) Repetition rate 0.054 (0.047) 0.048 (0.044) -0.007 (0.007) Dropout rate 0.023 (0.031) 0.027 (0.046) 0.004 (0.006) Number of schools 96 78 p-value of joint F-test 0.378 Panel B: Student characteristics Female (1=Yes) 0.498 (0.500) 0.492 (0.500) -0.006 (0.011) Household wealth index (S.D.) 
-0.002 (1.002) -0.012 (0.916) -0.010 (0.081) Grade 3 (1=Yes) 0.314 (0.464) 0.329 (0.470) 0.015 * (0.008) Grade 4 (1=Yes) 0.349 (0.477) 0.332 (0.471) -0.017 ** (0.007) Grade 5 (1=Yes) 0.337 (0.473) 0.340 (0.474) 0.002 (0.005) Language score 495.201 (106.948) 489.164 (102.784) -6.037 (9.433) Math score 512.919 (120.681) 505.635 (116.178) -7.284 (10.844) Number of students 4544 3595 p-value of joint F-test 0.298 Notes: Robust standard errors clustered at school level in Panel B. Reported p-value is from a joint F-test of the null hypothesis that there are no mean differences between treatment and control across all variables in the panel. All variables measured in 2009. * p<0.1, ** p<0.05, *** p<0.01 Table A3: Variation of treatment effect on parental involvement by indigenous status of head of parent association Organized Met with Involved in Percent of parents school activities teachers to discuss school regularly attending & events student performance decision making meetings (1) (2) (3) (4) Double grant experiment (1 year) Treatment x Indigenous 0.039 0.103 0.067 -0.294 (0.091) (0.088) (0.122) (4.729) (2 year) Treatment x Indigenous -0.023 -0.010 -0.134 1.368 (0.087) (0.087) (0.125) (3.921) 42 (3 year) Treatment x Indigenous 0.008 -0.050 0.223* -1.031 (0.082) (0.092) (0.130) (3.513) Information experiment (1 year) Treatment x Indigenous 0.067 0.122 -0.035 4.491 (0.113) (0.152) (0.204) (5.628) Single grant observation (1 year) Treatment x Indigenous -0.079 -0.072 -0.041 0.514 (0.123) (0.172) (0.236) (4.937) Notes: Robust standard errors. * p<0.1, ** p<0.05, *** p<0.01 Table A4: Variation of treatment effect on inputs by indigenous status of students Parenting Teaching Aware of Student- school Helps with Days absent centered assignments homework in past month instruction (1) (2) (3) (4) Double grant experiment (1 year) Treatment x Indigenous 0.013 -0.022 0.614 -0.138 (0.013) (-0.022) (0.614) (-0.138) (2 year) Treatment x Indigenous 0.059 0.037 -0.562 0.023 43 (0.059) (0.037) (-0.562) (0.023) (3 year) Treatment x Indigenous 0.032 0.042 0.058 0.091 (0.032) (0.042) (0.058) (0.091) Information experiment (1 year) Treatment x Indigenous 0.007 -0.042 1.564 0.137 (0.054) (0.057) (1.345) (0.188) Single grant observation (1 year) Treatment x Indigenous -0.069 -0.059 -0.535 -0.004 (0.051) (0.045) (1.263) (0.182) Notes: Robust standard errors clustered at school level. * p<0.1, ** p<0.05, *** p<0.01 Table A5: Variation of treatment effect on educational outcomes by indigenous status of students (1) (2) (3) Spanish Math Disciplinary test (S.D.) test (S.D.) action Double grant experiment (1 year) Treatment x Indigenous -0.022 0.127 -0.006 (-0.022) (0.127) (-0.006) (2 year) Treatment x Indigenous -0.245* -0.205 -0.058 (-0.245) (-0.205) (-0.058) 44 (3 year) Treatment x Indigenous -0.061 -0.224 0.026 (-0.061) (-0.224) (0.026) Information experiment (1 year) Treatment x Indigenous -0.071 -0.194 -0.100 (-0.071) (-0.194) (-0.100) Single grant observation (1 year) Treatment x Indigenous 0.045 0.046 0.047 (0.197) (0.175) (0.071) Notes: Robust standard errors clustered at the school level. * p<0.1, ** p<0.05, *** p<0.01 Table A6: Association of index variables with related covariates at baseline Student Teacher Parent Centered Responsibility Responsibility Instruction (S.D.) Index (S.D.) Index (S.D.) 
(1) (2) (3)
Teacher explains concepts clearly 0.202*** (0.004)
Teacher reviews homework assignment 0.204*** (0.004)
Teacher does not make students simply copy from textbooks/blackboard 0.232*** (0.005)
Teacher gives students exercises that apply concepts learned in class 0.190*** (0.004)
Teachers are available for meetings 0.236*** (0.023)
Teachers support extra-curricular activities 0.292*** (0.026)
Teachers handle conflict resolution between students 0.233*** (0.023)
Teachers provide additional classes for struggling students 0.307*** (0.023)
Teachers care about student achievement 0.235*** (0.020)
Teachers provide feedback on student assignments 0.287*** (0.020)
Parents are available for meetings 0.291*** (0.022)
Parents help children with school work 0.297*** (0.022)
Parents make sure that children are completing school assignments 0.260*** (0.023)
Parents support extra-curricular activities 0.138*** (0.095)
Notes: Robust standard errors clustered at the school level. Each cell is the result of a separate regression where the outcome is the variable in the far left column and the predictor is the variable in the top row. Sample pools together the control group of both experiments at baseline. * p<0.1, ** p<0.05, *** p<0.01

Figure A1: Robustness check for post-double selection lasso - Parental involvement outcomes
[Panels: (a) Organized school activities & events; (b) Met with teachers to discuss student performance; (c) Involved in school decision making; (d) Percent of parents regularly attending meetings. Each panel plots the treatment effect estimate and 95% confidence interval under the plug-in, cross-validation, one-standard-error, and adaptive lasso penalty choices.]
Note: The figure shows the treatment effect estimate and 95% confidence interval for each model specification. Parentheses indicate the number of covariates selected by each model. The models are post-double selection lasso using (i) the plug-in penalty, (ii) the cross-validation penalty, (iii) the largest penalty such that the cross-validated MSE is within one standard deviation of the minimum, and (iv) the adaptive lasso.

Figure A2: Robustness check for post-double selection lasso - Child inputs
[Panels: (a) Parenting: Aware of school assignments; (b) Parenting: Helps with homework; (c) Teaching: Days absent in past month; (d) Teaching: Student-centered instruction. Each panel plots the treatment effect estimate and 95% confidence interval under the plug-in, cross-validation, one-standard-error, and adaptive lasso penalty choices.]
Note: The figure shows the treatment effect estimate and 95% confidence interval for each model specification. Parentheses indicate the number of covariates selected by each model. The models are post-double selection lasso using (i) the plug-in penalty, (ii) the cross-validation penalty, (iii) the largest penalty such that the cross-validated MSE is within one standard deviation of the minimum, and (iv) the adaptive lasso.
Figure A3: Robustness check for post-double selection lasso - Educational outcomes
[Panels: (a) Failure rate; (b) Dropout rate; (c) Repetition rate; (d) Spanish test; (e) Math test; (f) Disciplinary action. Each panel plots the treatment effect estimate and 95% confidence interval under the plug-in, cross-validation, one-standard-error, and adaptive lasso penalty choices.]
Note: The figure shows the treatment effect estimate and 95% confidence interval for each model specification. Parentheses indicate the number of covariates selected by each model. The models are post-double selection lasso using (i) the plug-in penalty, (ii) the cross-validation penalty, (iii) the largest penalty such that the cross-validated MSE is within one standard deviation of the minimum, and (iv) the adaptive lasso.

Figure A4: Robustness check for post-double selection lasso - Mechanisms
[Panels: (a) Parent's perspective: Most teachers can be trusted; (b) Parent's perspective: Teacher responsibility index; (c) Teacher's perspective: Most parents can be trusted; (d) Teacher's perspective: Parent responsibility index. Each panel plots the treatment effect estimate and 95% confidence interval under the plug-in, cross-validation, one-standard-error, and adaptive lasso penalty choices.]
Note: The figure shows the treatment effect estimate and 95% confidence interval for each model specification. Parentheses indicate the number of covariates selected by each model. The models are post-double selection lasso using (i) the plug-in penalty, (ii) the cross-validation penalty, (iii) the largest penalty such that the cross-validated MSE is within one standard deviation of the minimum, and (iv) the adaptive lasso.

Figure A5: Difference in treatment effect across interventions - Parental involvement outcomes
[Panels: (a) Organized school activities & events; (b) Met with teachers to discuss student performance; (c) Involved in school decision making; (d) Percent of parents regularly attending meetings. Each panel plots the pairwise differences in treatment effects (double grant - information, information - single grant, single grant - double grant) with 95% confidence intervals.]
Note: The figure shows the difference in the treatment effect estimate and 95% confidence interval between each of the three interventions (double grant, information, single grant). Standard errors for the difference are calculated using cluster bootstrap.
Figure A6: Difference in treatment effect across interventions - Parenting and teacher behaviors
[Panels: (a) Parents: Aware of school assignments; (b) Parents: Helps with homework; (c) Teachers: Days absent in past month; (d) Teachers: Student-centered instruction. Each panel plots the pairwise differences in treatment effects (double grant - information, information - single grant, single grant - double grant) with 95% confidence intervals.]
Note: The figure shows the difference in the treatment effect estimate and 95% confidence interval between each of the three interventions (double grant, information, single grant). Standard errors for the difference are calculated using cluster bootstrap.

Figure A7: Difference in treatment effect across interventions - Educational outcomes
[Panels: (a) Failure rate; (b) Dropout rate; (c) Repetition rate; (d) Spanish test; (e) Math test; (f) Disciplinary action. Each panel plots the pairwise differences in treatment effects (double grant - information, information - single grant, single grant - double grant) with 95% confidence intervals.]
Note: The figure shows the difference in the treatment effect estimate and 95% confidence interval between each of the three interventions (double grant, information, single grant). Standard errors for the difference are calculated using cluster bootstrap.
Figure A8: Difference in treatment effect across interventions - Trust and Responsibility
[Panels: (a) Parent's perspective: Most teachers can be trusted; (b) Parent's perspective: Teacher responsibility index; (c) Teacher's perspective: Most parents can be trusted; (d) Teacher's perspective: Parent responsibility index. Each panel plots the pairwise differences in treatment effects (double grant - information, information - single grant, single grant - double grant) with 95% confidence intervals.]
Note: The figure shows the difference in the treatment effect estimate and 95% confidence interval between each of the three interventions (double grant, information, single grant). Standard errors for the difference are calculated using cluster bootstrap.
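The notes to Figures A5-A8 refer to cluster bootstrap standard errors for the difference in treatment effects across interventions. The sketch below illustrates one common way such a school-clustered bootstrap can be set up, using synthetic data, hypothetical variable names, and a simple difference in means in place of the paper's regression specification; it is not the paper's code, and the actual implementation may differ.

```python
# Minimal sketch of a school-clustered bootstrap for the difference between two
# treatment effects (e.g., double grant vs. information). Synthetic data only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)

# Hypothetical school-level data from two experiments.
df = pd.DataFrame({
    "school": np.arange(400),
    "experiment": np.repeat(["double_grant", "information"], 200),
    "treated": np.tile(np.repeat([0, 1], 100), 2),
    "outcome": rng.normal(size=400),
})

def effect_difference(d: pd.DataFrame) -> float:
    """Difference between the two experiments' simple treatment-control contrasts."""
    means = d.groupby(["experiment", "treated"])["outcome"].mean().unstack()
    contrast = means[1] - means[0]
    return contrast["double_grant"] - contrast["information"]

point_estimate = effect_difference(df)

# Resample schools (the clusters) with replacement and recompute the difference;
# 200 replications keep this sketch fast, more would be used in practice.
schools = df["school"].unique()
draws = []
for _ in range(200):
    sampled = rng.choice(schools, size=schools.size, replace=True)
    boot = pd.concat([df[df["school"] == s] for s in sampled], ignore_index=True)
    draws.append(effect_difference(boot))

print(f"difference = {point_estimate:.3f}, bootstrap SE = {np.std(draws, ddof=1):.3f}")
```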