The Education Endowment Foundation recently published a report on an effectiveness trial of the Success for All (SFA) programme, evaluating its impact on the literacy outcomes of Reception pupils. SFA is a whole-school approach to improving literacy in primary schools. All teachers and senior leaders are involved, with the school receiving a total of 16 training and support days. Teachers receive pedagogical training – for example on effective phonics teaching – and teaching materials such as structured lesson plans. For the school leadership team, there is support in areas such as data management, ability grouping and parental engagement.
Fifty-four schools took part in this effectiveness trial, which was evaluated by Sarah Miller and colleagues from Queen’s University Belfast. Although the intervention was delivered on a whole-school basis, the evaluation focused only on the outcomes of 1,767 pupils starting in Reception, and followed them through to the end of Year 1.
The main analysis found that Reception pupils in SFA schools made more progress than pupils in control schools after two years (effect size = +0.07). The effect was slightly larger for pupils eligible for free school meals (FSM), compared to FSM pupils in control schools after two years (effect size = +0.12). In both cases, the effects were smaller than those found in previous evaluations (this is the third RCT of SFA to be conducted, and the first independent trial of the programme in England). Trials in the US reported effect sizes between +0.15 and +0.30. The report suggests that one possible reason for this was that some schools struggled to implement the programme as intended.
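The effect sizes quoted throughout this digest are standardised mean differences – the gap between the intervention and control group means, expressed in units of the pooled standard deviation. As a rough illustration only (the scores below are invented and are not from the SFA evaluation), this is how such a figure is computed:

```python
# Illustrative sketch of a standardised mean difference (Cohen's d).
# The scores are made up and do not come from any trial in this digest.
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Difference in group means divided by the pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Hypothetical literacy scores for two small groups
sfa_scores = [101, 104, 98, 107, 103, 99]
control_scores = [100, 102, 97, 105, 101, 96]
print(round(cohens_d(sfa_scores, control_scores), 2))
```

On this toy data the intervention group scores about half a standard deviation higher – far larger than the +0.07 reported for the SFA trial, which underlines how small that effect is in practical terms.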
The project delivery team was from the University of York. Robert Slavin, director of the US Center for Research and Reform in Education, is Co-founder and Chairman of the Board of the Success for All Foundation.
Source: Success for All: Evaluation report and executive summary (July 2017), Education Endowment Foundation
The Laura and John Arnold Foundation (LJAF), which sponsors the US version of Best Evidence in Brief, has launched a new initiative called Straight Talk on Evidence.
The purpose of the initiative is to “distinguish credible findings of programme effectiveness from the many others that claim to be, through an easy-to-read, no-spin digest of recent programme evaluation findings.”
For example, the site presents highlights of a report on preventing youth crime. LJAF reviewed a randomised controlled trial (RCT) of Reading for Life, a mentoring and character development programme for young offenders in the US. The review found this to be a well-conducted RCT, showing that the programme reduced the rate of subsequent re-arrests. The study’s main limitation is that it was conducted in a single town in Indiana.
Source: Promising new evidence in the effort to prevent youth crime (August 2017), Straight Talk on Evidence
A study published by the Institute of Education Sciences in the US evaluates the impact of the Retired Mentors for New Teachers programme – a two-year programme in which recently retired teachers provide tailored mentoring to new teachers – on pupil achievement, teacher retention and teacher evaluation ratings. The new teachers meet with their mentors weekly on a one-to-one basis and monthly in school-level groups over the course of the two years.
Dale DeCesare and colleagues conducted a randomised controlled trial involving 77 teachers at 11 primary schools in Aurora, Colorado. Within each school, half of the new teachers were randomly assigned to a control group to receive the district’s business-as-usual mentoring support, while the other half received the intervention as well as business-as-usual mentoring support.
The study found that at the end of the first year, pupils who were taught by teachers in the programme group scored 1.4 points higher on the spring Measures of Academic Progress maths assessment than those taught by teachers in the control group (effect size = +0.064), and this difference was statistically significant. Reading achievement was also higher among pupils taught by teachers in the programme group; however, the difference was not statistically significant (effect size = +0.014 at the end of the first year and +0.07 at the end of the second year). The effect of the programme on teacher evaluation ratings and teacher retention was not significant, although more teachers in the programme group left after two years than in the control group.
Source: Impacts of the retired mentors for new teachers program (REL 2017–225) (March 2017), US Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Central.
An NBER Working Paper examines the impact of implementing management training for head teachers on pupil achievement. The management training focused on lesson planning, data-driven teaching and teacher observation and coaching (approximately 300 hours over two years). In a school-level randomised experiment, 58 schools in Houston, Texas, were randomly assigned either to receive the training intervention or to serve as a business-as-usual control group.
The study found that offering management training to head teachers led to increased test scores across low-stakes tests in a range of subjects in year one (effect size = +0.19). For high-stakes test scores in maths and reading, the effect size was lower (+0.10). However, the training intervention had no impact on high-stakes tests in year two.
The training was most beneficial for head teachers who were less experienced, had better maths skills, had a more internal locus of control, had higher levels of “grit” and remained in the school for both years of the study.
The intervention showed the most impact on teachers who were more experienced and more educated. Among pupils, it showed the most impact on those who were new to the school, white or Hispanic, and economically well-off.
Source: Management and student achievement: Evidence from a randomized field experiment (May 2017), NBER Working Paper No. 23437, National Bureau of Economic Research
A new study published in Child Development has found positive results for an intervention designed to improve early language and pre-literacy skills in children in Denmark.
Dorthe Bleses and colleagues conducted a randomised controlled trial of three variations of SPELL (Structured Preschool Efforts in Language and Literacy), to evaluate to what extent the intervention increased children’s language and pre-literacy skills compared to business as usual. SPELL, a Danish version of an existing programme, Read it Again-PreK!, is a 20-week storybook-based intervention for children aged three to six. Twice a week, children receive 30-minute small-group lessons, delivered via an iPad-based digital learning technology, which include before-, during- and after-reading activities addressing the lesson’s objectives.
For the trial, 6,483 children from 144 childcare settings were randomly assigned to one of three variations of SPELL, or to continue with business as usual. Pre- to post-test comparisons showed an impact of all three interventions on literacy skills (effect sizes = +0.21 to +0.27) but not on language skills (+0.04 to +0.16), with little difference among the three variations.
Source: The effectiveness of a large-scale language and preliteracy intervention: The SPELL randomized controlled trial in Denmark (2017), Child Development doi:10.1111/cdev.12859
The National Foundation for Educational Research (NFER) has published the results of a randomised controlled trial and process evaluation of Code Clubs – a UK network of after-school clubs where children aged 9–11 learn to program by making games, animations, websites and applications. Code Club UK produces material and projects that support the teaching of Scratch, HTML/CSS and Python. The clubs, which are supported by volunteers, usually run for one hour a week after school during term time.
The evaluation, conducted by Suzanne Straw and colleagues, assessed the impact of Code Clubs on Year 5 pupils’ computational thinking, programming skills and attitudes towards computers and coding. Twenty-one schools in the UK took part in the trial, which used a pupil-randomised design to compare pupil outcomes in the intervention and control groups. Intervention group pupils attended Code Club during the 2015/16 academic year, while control group pupils continued as they would do normally.
The results of the evaluation showed that attending Code Club for a year did not improve pupils’ computational thinking any more than would have occurred anyway, but did significantly improve their coding skills in Scratch, HTML/CSS and Python. This was true even when control children learned Scratch as part of the computing curriculum in school. Code Club pupils reported increased usage of all three programming languages – and of computers more generally. However, the evaluation data suggests that attending Code Club for a year does not affect how pupils view their abilities in a range of transferable skills, such as following instructions, problem solving, learning about new things and working with others.
Source: Randomised controlled trial and process evaluation of code clubs (March 2017), National Foundation for Educational Research (NFER)