Parents respond to information by increasing their educational investment, but this initial response decays unless report cards are sustained

Poor information is one of the key sources of underinvestment in education, especially amongst low-income families. Providing accurate information on children’s progress in school, as well as the returns to education, can affect enrolment decisions, parents’ interaction with schools, and parents’ support of their children’s education. For these reasons, a lack of information (or incorrect information) may lead parents and children to underinvest in education. 

In our study (Barrera-Osorio, Gonzalez, Lagos, and Deming 2020), we evaluate the impact of an experimental pilot implemented by the local Luker Foundation in the city of Manizales, Colombia, that provided parents with information on their children’s maths and language performance in grades four through six. The aim of the intervention was to solve this information problem and, by doing so, induce an optimal level of parental investment in education and improve parents’ relationship with the school, thereby strengthening the school-household link. 

The intervention: Providing households with their child’s performance report card 

In association with the Luker Foundation and the Secretary of Education in Manizales, we first collected baseline data on maths and language performance of students in grades four through six using the Early Grade Reading and Early Grade Math Assessments (EGRA and EGMA). In April 2014, a group of researchers – in collaboration with the Luker Foundation and the local government – collected the EGRA and EGMA test scores of 3,026 students in grades four and five (45% of the students in these grades), distributed across 31 public schools in the city. We also visited the households of all students in the sample to collect household socioeconomic information, as well as information on parents’ beliefs about their children’s performance on the EGRA and EGMA.

We randomly assigned the families of the 3,026 students to either an information treatment group or to a control group. In the treatment group, the intervention provided families with a one-page report card at the end of their household visit that showed their child’s maths and language performance on the EGRA and EGMA, as well as their child’s position relative to the average performance of students in the same grade and school. The information was presented in a format similar to a percentile rank, introduced in a way that is highly salient to all families. Families in the control group did not receive any information.

After randomly assigning the families of the 3,026 students to one of these two groups, the team conducted the pilot version of the intervention. The team randomly selected a subsample of 2,100 of the 3,026 students (1,600 in the treatment group and 400 in the control group) for whom we conducted home visits and, for treatment families, provided information. To implement the intervention, the team visited parents and guardians at their homes in October 2014. These visits, which were scheduled in advance by phone, were divided into three parts: 

  1. The team member explained the objective of the study and provided the consent materials.
  2. The agent administered the questionnaire to the parents, including asking the parents to state their beliefs about their student’s performance. 
  3. After the interview was concluded, the agent gave treatment-group families their child’s report card and explained its meaning. For control-group families, the interview was the end of the visit.

After the home visit, the team contacted families by telephone and administered short surveys, which asked about various forms of parental investment in child learning. In December, we administered the same exam that was given in April at all of the 31 participating schools; we subsequently linked each child’s performance to our home visitation and survey data.

The team conducted the full intervention in October 2015. The Luker Foundation and the Secretary of Education in Manizales collected an additional round of EGRA and EGMA scores for the 3,026 students who participated in the pilot and for an additional 1,345 students. Families who participated in the pilot remained in their original random assignment group; families of the 1,345 new students were randomly assigned to either the information treatment or the control group. The research team then conducted a second round of home visits, in which families in the treatment group again received information on their students’ maths and language performance on the EGRA and EGMA. For the full intervention, however, this information was also bundled with specific suggestions for how parents and guardians could engage in their children’s education. After the second home visit, we administered the same exam four more times: in December 2015, June 2016, December 2016, and June 2017. We followed students through grade six. Students’ scores on these assessments captured the impact of the intervention on their maths and language learning.

Parents increase educational investment in the short run

We document that, on average, parents underestimated their children’s reading performance by approximately 0.5 standard deviations. However, we see a very different pattern for maths. On average, parents overestimated their children’s performance on the subtractions assessment, our primary measure of student maths performance, by nearly one standard deviation.

The intervention produced small but statistically significant short-run effects (0.09 to 0.10 standard deviations, through the June 2016 assessment) following the receipt of information on student performance and the menu of suggestions to support parental investment (see Figure 1). This suggests that parents respond to information by increasing their educational investment, but that this initial response decays unless new reports on their students’ performance are made available to them. 

Figure 1 Estimated treatment impact across each follow-up wave, by grade

Notes: Points indicate treatment impact estimates; bars indicate standard errors. Each result is from a regression of composite test scores on a single indicator for treatment 1 or treatment 2, controlling for age, gender, and baseline test scores. Models are estimated separately for each grade and follow-up wave.
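The specification behind these estimates, as described in the figure notes, can be sketched as follows. The notation here is ours for illustration, not the authors’ exact equation:

```latex
% Treatment-effect regression, estimated separately for each grade and
% each follow-up wave (illustrative notation; controls follow the figure
% notes: age, gender, and baseline test score)
y_{iw} = \alpha + \beta \, T_i + \gamma_1 \, \mathrm{Age}_i
       + \gamma_2 \, \mathrm{Female}_i + \gamma_3 \, y_{i0} + \varepsilon_{iw}
```

where \(y_{iw}\) is student \(i\)’s composite test score at follow-up wave \(w\), \(y_{i0}\) is the baseline score, \(T_i\) is an indicator for assignment to treatment, and \(\beta\) is the treatment impact plotted in the figure.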

Our second important finding is that the impacts of the intervention were larger for students with low baseline scores; for these students, impacts were as large as 0.28 standard deviations. This is consistent with these families having less accurate information about their students’ performance, or with greater parent-student information frictions in these households. Still, we detect the same dynamic pattern for this population, with student performance sliding back towards the baseline score. 

Our results suggest that the main channel of these effects was through the school-parent relationship. We show increases in the number of parent-teacher meetings: parents in the treatment group were 9 percentage points more likely to report consistently (always) attending meetings with teachers relative to the control group. We also find that information leads parents to update their beliefs about their students’ performance. In contrast, there is no evidence of any effect on parental investment within the household. 

Scaling up

Considering the results of the study, the Secretary of Education in Manizales and the Luker Foundation implemented a scaled-up version of the intervention. Building on the finding that information was most effective at improving the maths and language performance of lower-performing students, the scaled-up intervention targeted these students. Because the evaluation also found that the impacts of information provision fade out in the absence of new information, the expanded programme delivered information (including recommended actions on how parents can engage with their children’s education) on a more frequent, ongoing basis through bi-weekly mobile text messages. The results of this intervention will provide additional insight into how information provision can be used as a low-cost mechanism to support student learning at scale.


Barrera-Osorio, F, K Gonzalez, F Lagos and D Deming (2020), “Providing Performance Information in Education: An Experimental Evaluation in Colombia”, Journal of Public Economics 186: 104216.
