Evaluation of Challenge the Gap

An evaluation of the Challenge the Gap (CtG) programme for the Education Endowment Foundation found no evidence that the programme increased average achievement for either primary or secondary pupils overall.

Challenge the Gap is a two-year school improvement programme that aims to help schools improve the achievement of their disadvantaged pupils through a professional development programme for staff. The evaluation, conducted by The University of Manchester, involved 21,041 pupils from 104 schools (64 primary schools and 39 secondary schools). Around 24% of pupils in the primary schools and 16% in the secondary schools were eligible for free school meals. The evaluation assessed the impact on all participating schools using 2015 Key Stage 2 or Key Stage 4 results. CtG schools were compared to schools with a similar socio-demographic profile.

No evidence was found that CtG increased average achievement for either primary or secondary school pupils overall. For children eligible for free school meals (FSM), those in CtG primary schools made two months’ additional progress (average effect size = +0.10) compared to similar children in non-CtG schools. In CtG secondary schools, FSM-eligible pupils made two months’ less progress than similar pupils in non-CtG secondary schools (average effect size = -0.10). The smaller number of FSM-eligible pupils in the trial means that these results are less secure than the overall findings.

Source: Challenge the Gap: Evaluation report and executive summary (July 2017), Education Endowment Foundation

Science professional development and pupil achievement: A cluster-randomised trial

Joseph Taylor of Abt Associates and colleagues conducted a rigorous study of the Science Teachers Learning Through Lesson Analysis (STeLLA) professional development (PD) programme.

STeLLA is designed to increase elementary (primary) teachers’ science knowledge. Instead of the standard practice of teaching pupils to memorise science concepts and then perform activities that prove these concepts, STeLLA teachers lead pupils to discover science concepts through experience and experimentation. One of STeLLA’s main tenets is to have pupils think through science problems aloud so that teachers can respond to pupils’ ideas and guide them towards scientific conclusions and specific learning goals. Its other distinguishing feature is that, over the course of a year, groups of 5–10 teachers led by a PD coach watch and critique videos of experienced science teachers’ lessons, later moving on to videos of their own and their colleagues’ lessons, analysing them in terms of science content, teaching and learning. In addition, STeLLA teachers receive instruction from university-level science teachers in the summer prior to implementation to give them greater science content knowledge, a process called “content deepening”.

In the current study, researchers used a cluster-randomised design to compare STeLLA to The Content Deepening Program, a PD programme that, like STeLLA, deepens teachers’ science knowledge through university faculty-led science teaching, but without STeLLA’s analysis-of-practice component. Seventy-seven schools in Colorado, with 144 teachers and 2,823 fourth and fifth grade pupils (Years 5 and 6), were randomly assigned either to STeLLA (n=42 schools) or to The Content Deepening Program (n=35 schools) in two cohorts, the first in 2011–12 and the second in 2012–13. Teachers in both conditions experienced 88 hours of PD and had the same learning goals for their pupils. Pupils were pre- and post-tested on a science measure based on established assessments. Although the control group demonstrated a slight achievement advantage at baseline, pupils in STeLLA classes scored higher at post-test (effect size = +0.55) than pupils in classes whose teachers had been through The Content Deepening Program.

Source: The effect of an analysis-of-practice, videocase-based, teacher professional development program on elementary students’ science achievement (2017), Journal of Research on Educational Effectiveness, 10(2).

Professional development programme unsuccessful in improving maths achievement

Developing Mathematical Ideas (DMI) is a professional development programme designed to increase teachers’ knowledge of fourth grade (Year 5) fractions and rational numbers, with the ultimate goal of improving their pupils’ maths achievement.

A study conducted in the 2014–15 school year, prepared for the Institute of Education Sciences by Madhavi Jayanthi and colleagues at Instructional Research Group and REL Southeast, investigated the effects of DMI on teacher content knowledge and their pupils’ subsequent achievement in fractions. A total of 264 fourth grade (Year 5) teachers in 84 elementary (primary) schools in Florida, Georgia and South Carolina in the US were randomly assigned by school to receive either DMI (n=42 schools, 129 teachers) or their usual professional development programme (n=42 schools, 135 teachers). The 84 schools were matched on grade four enrolment, number of pupils who exceeded fourth grade maths standards, percentage of African American and Hispanic pupils, and percentage of pupils eligible for free or reduced-price lunches. In autumn 2014, DMI teachers received eight three-hour training sessions conducted over four days, followed by homework and concluding with a test on fractions. For the 4,204 participating fourth grade pupils (2,091 intervention, 2,113 control), baseline scores on third grade standardised tests served as the pre-test, since most third graders know little about fractions. The Test for Understanding of Fractions was used as the post-test at the end of the academic year to measure pupils’ knowledge gain after their teachers had completed DMI.

Results showed no significant differences between DMI and non-DMI teachers’ knowledge of fractions, nor between their pupils’ proficiency in fractions.

Source: Impact of the Developing Mathematical Ideas professional development program on grade 4 students’ and teachers’ understanding of fractions (March 2017), US Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southeast.

Examining the impact of content-intensive teacher professional development

A new evaluation report by Michael S Garet and colleagues, published by the US Institute of Education Sciences, examines the impact of providing elementary (primary) school teachers with content-focused maths continuing professional development (CPD) on their knowledge, teaching, and students’ achievement.

The study’s CPD had three components totalling 93 hours. The core of the CPD was Intel Math, an 80-hour workshop delivered in the summer of 2013 that focused on deepening teachers’ knowledge of grades K–8 mathematics (Years 1 to 9). Two additional CPD components totalling 13 hours were delivered during the 2013–14 school year.

The study’s sample included grade 4 (Year 5) teachers from 94 schools in six US school districts across five states. Teachers were randomly assigned within schools either to a treatment group that received the study CPD or to a control group that did not.
Key findings were as follows:

  • The CPD had a positive impact on teacher knowledge. On average, treatment teachers’ maths knowledge scores on a study-administered maths assessment were 21 percentile points higher than control teachers’ scores in spring 2014, after the CPD was completed.
  • The CPD had a positive impact on some aspects of teaching practice, particularly Richness of Mathematics, which emphasises the conceptual aspects of maths, such as the use and quality of mathematical explanations.
  • Despite the CPD’s generally positive impact on teacher outcomes, the CPD did not have a positive impact on student achievement. On average, treatment teachers’ students scored 2 percentile points lower than control teachers’ students in spring 2014 on both a study-administered maths assessment aligned with the content of the CPD and the state maths assessment. This difference was statistically significant for the state maths assessment but not for the study-administered assessment.

Source: Focusing on Mathematical Knowledge: The Impact of Content-Intensive Teacher Professional Development (2016), Institute of Education Sciences

Is professional development better than being dismissed?

The last issue of Best Evidence in Brief reported on a study in which low-performing teachers were dismissed. A new working paper from the National Bureau of Economic Research reports on an experiment where low-performing teachers were provided with coaching from higher-performing peers.

The experiment took place in Tennessee in 14 elementary and middle schools. Tennessee teachers are observed in the classroom many times each year, and scored on 19 specific skills (eg, questioning, lesson structure and pacing, and managing student behaviour). Schools were randomly assigned to a treatment condition or business-as-usual control group. In the treatment schools, low-performing “target” teachers were matched with high-performing teachers, based on the outcomes of their classroom observations. The high-performing teachers were chosen based on their high scores in skills for which the low-performing teachers had received a low score. The pairs were encouraged to work together on these skills, as well as more generally on observing each other’s teaching, discussing strategies for improvement, and following up on each other’s commitments throughout the year.

After a year, students in treatment schools (whether taught by target or non-target teachers) showed a small improvement (effect size = +0.06) on maths and English tests, when compared with students in control schools. Gains by students taught by target teachers were higher (effect size = +0.12). These improvements persisted and grew. In the following year, the effect for target teachers was a marginally significant +0.25.

Source: Learning Job Skills from Colleagues at Work: Evidence from a Field Experiment Using Teacher Performance Data (2016), The National Bureau of Economic Research.

Preschool maths and science professional development fails to improve learning

A new article published in the Journal of Educational Psychology describes a study into the impact of professional development on maths and science learning in early childhood education.

For the study, 65 staff from 34 varied early childhood settings in Ohio were randomly assigned to experience 10.5 days (64 hours) of training on maths and science or an alternative topic (art and creativity). The maths and science training was adapted from the Core Knowledge Preschool Sequence, which provides a developmental progression based on early childhood research and theory.

The study looked at both maths and science learning opportunities, and the maths and science learning gains of the children (n=385). In terms of opportunities, the authors found that the professional development significantly increased the provision of science learning opportunities, but not maths. However, no impact on children’s learning gains was observed in either subject.

The authors suggest a number of factors that may have contributed to this outcome. These include the fact that, although educators were provided with hands-on opportunities to try new maths and science activities during training, there was no systematic means of ensuring they had regular opportunities to apply these in their classrooms. They also note that changes in practice may be difficult to achieve, as the emphasis on these subject areas is relatively new in early childhood education.

Source: Professional Development for Early Childhood Educators: Efforts to Improve Math and Science Learning Opportunities in Early Childhood Classrooms (2015), Journal of Educational Psychology, 107(2).