French pupils from disadvantaged
areas demonstrate lower achievement than their more affluent peers. In an
effort to close this achievement gap, the French government issued a policy in
2017 reducing Year 2 class size in high-priority educational areas to no more
than 12 pupils, extending to Year 3 classes and priority educational areas in
2018. To provide evidence on the likely effects of such a policy,
researchers used data from a 2003 first-grade class-size-reduction policy in
France to examine whether its benefits carried over into the second grade.
The 2003 study involved assigning classrooms
to either small (12 pupils/class, n=100 classes) or large (20–25 pupils/class,
n=100 classes) class sizes. At the start of the 2002–03 school year, children
were pre-tested on pre-reading skills and matched. In post-tests at the end of
the school year, results favoured the small-class-size group on word reading
(ES = +0.14) and word spelling (ES = +0.22). These effects are small in
light of the cost of halving class size.
The new study examined these pupils’ reading achievement at the end of Year 3, by which time the pupils formerly placed in smaller classrooms had returned to full-sized classes. Subjects were 1,264 pupils (663 in the intervention group and 601 in the control group) who had both received the initial testing in Year 2 and had test scores at the end of Year 3. The two groups were equivalent at the start of Year 2, and by the end of that year the small-class-size group showed greater academic achievement than the control group. However, this gain diminished over the summer break and had disappeared completely by the end of Year 3. That is, one year of reduced class size had no long-term impact.
Source: Reducing the number of pupils in French first-grade classes: Is there evidence of contemporaneous and carryover effects? (November 2018), International Journal of Educational Research, Volume 96
A new report by Rachel Rosen and colleagues at MDRC reviews the available research evidence supporting various types of career and technical education (CTE) programmes, examining both the amount of evidence available in each area and its level of rigour. The report details several CTE programme types (eg, instruction and training, apprenticeships and readiness skills training) and provides a literature review of the available evidence to support each programme type.
Key findings were as follows:
The most evidence exists for CTE course work and training. In that area, there are multiple studies suggesting that participation in CTE can improve pupils’ outcomes. In addition, multiple studies found that career-related certificates and associate’s degrees are linked to increased wages.
Several career pathway models, particularly career academies and early college high schools, are also supported by strong, rigorous studies that provide evidence of positive benefits for pupils.
The evidence for other models and for individual programme components is weaker. The authors suggest that these models and components probably need to be evaluated further.
Source: Career and technical education current policy, prominent programs, and evidence (September 2018), MDRC
The Laura and John Arnold Foundation (LJAF), which sponsors the US version of Best Evidence in Brief, has launched a new initiative called Straight Talk on Evidence.
The purpose of the initiative is to “distinguish credible findings of programme effectiveness from the many others that claim to be, through an easy-to-read, no-spin digest of recent programme evaluation findings.”
For example, the site presents highlights of a report on preventing youth crime. LJAF reviewed a randomised controlled trial (RCT) of Reading for Life, a mentoring and character development programme for young offenders in the US. The review found this to be a well-conducted RCT, showing that the programme reduced the rate of subsequent re-arrests. The study’s main limitation is that it was conducted in a single town in Indiana.
Source: Promising new evidence in the effort to prevent youth crime (August 2017), Straight Talk on Evidence
The Education Endowment Foundation has reported on two studies that looked at using education research to improve teaching practice.
Research into Practice was a pilot intervention aimed at supporting teachers to use evidence-based teaching and learning strategies to improve student progress. The project ran for a year in ten primary schools in Rochdale (north-west England). It involved professional development sessions and direct consultant support to help teachers:
Have more positive views about the use of research for improving teaching and learning;
Apply education research findings in the classroom and at a strategic level; and
Establish a stronger culture of evidence-based inquiry and practice.
There were some positive changes in teachers’ attitudes toward research. However, there was no evidence that teachers were more likely to use research evidence to inform their teaching practice.

The Research Champions project used a senior teacher based at one of five schools to work with research leads, other teachers, and senior leaders to promote engagement with research evidence. The project included “audits” of school research needs, research symposia for teachers, periodic research and development forums, and personalised support. However, there was no evidence that teachers’ attitudes toward research, or their use of research, changed during the intervention.
Source: Research into Practice – Evidence-informed CPD in Rochdale and Research Champions (2016), Education Endowment Foundation.
A new systematic review from the EPPI-Centre at the Institute of Education looks at what works to increase research use by decision-makers. The review included 23 reviews whose relevance and methodological quality were judged appropriate.
There was reliable evidence that the following were effective:
Interventions facilitating access to research evidence, for example through communications strategies and evidence repositories, provided that the intervention design also sought to enhance decision-makers’ opportunity and motivation to use evidence.
Interventions building decision-makers’ skills to access and make sense of evidence (such as critical appraisal training programmes), provided that the intervention design also sought to enhance both their capability and motivation to use research evidence.
There was limited evidence that evidence use is enhanced by interventions that change decision-making structures and processes by formalising and embedding one or more of the other mechanisms of change within them (such as evidence-on-demand services that integrate push, user-pull, and exchange approaches).
There is reliable evidence that some intense and complex interventions lead to an increase in evidence use. Overall though, simpler and more defined interventions appear to have a better likelihood of success.
Source: The Science of Using Science: Researching the Use of Research Evidence in Decision-Making (2016), EPPI-Centre
An OECD report, Education Policy Outlook 2015: Making Reforms Happen, calls for a coherent framework of assessment and analysis of the effectiveness of education reforms. The report looks at the implementation of educational policy reforms through national and international comparisons.
Pasi Sahlberg, visiting Professor of Practice at Harvard University, critiques the report in a recent article. He highlights one of the report’s conclusions that “once new policies are adopted, there is little follow-up” and that only one in ten of the policies considered in the OECD report has been evaluated for its impact. Professor Sahlberg also praises increases in research and policy analysis in some countries, naming the UK and USA as examples.
Source: Education Policy Outlook 2015: Making Reforms Happen (2015), OECD.