The Ministry of Education in France introduced a policy in 2002 that reduced class size to no more than 12 pupils in areas determined to have social difficulties and high proportions of at-risk pupils, called Zones d’Education Prioritaire (ZEP). To evaluate the effectiveness of this policy, researcher Jean Ecalle and colleagues in France examined the effects of the policy-mandated class size reduction on the reading achievement of first grade (Year 2) pupils (Study 1), and compared them to the effects of an evidence-based literacy intervention on the reading achievement of at-risk children in normal-sized classes of 20 pupils (Study 2).
Study 1, reducing class size, involved assigning classrooms to either small (12 pupils/class, n=100 classes) or large (20–25 pupils/class, n=100 classes) class sizes, with the support of the Ministry. At the start of the 2002–03 school year, 1,095 children were pre-tested on pre-reading skills and matched at pre-test. At the end of the school year, children were post-tested, with results favouring the small-class-size group on word reading (effect size=+0.14) and word spelling (effect size=+0.22).
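The effect sizes reported in both studies are standardised mean differences. The summary does not spell out the exact formula the researchers used, but a common variant (Cohen's d with a pooled standard deviation) can be sketched as follows; the numbers below are purely illustrative, not the study's data.

```python
import math

def cohens_d(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
    """Standardised mean difference between two groups,
    using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2)
                          / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

# Illustrative values only -- not taken from the study.
d = cohens_d(mean_a=52.0, sd_a=10.0, n_a=550,
             mean_b=50.6, sd_b=10.0, n_b=545)
print(round(d, 2))  # prints 0.14
```

On this scale, the effects reported above (+0.12 to +0.22) are conventionally read as small but educationally meaningful.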
In Study 2, researchers separated 2,803 first grade (Year 2)
pupils in ZEP areas into an experimental group who received an evidence-based
reading intervention, and a control group who did not. The intervention was a protocol developed by the Association Agir pour l’Ecole (Act for School), which set out a hierarchy of teaching reading based on evidence-based methods of learning to read, progressing from training phonological skills to learning letter sounds, decoding, and fluency. Act for School monitored compliance with the protocol weekly. Class size for both groups was 20 pupils. Teachers in the experimental group received one day of training, and provided 30 minutes of teaching a day to average or high readers in groups of 10 to 12, and one hour a day for
lower readers in groups of four to six. Again, children were pre-tested on
reading skills and matched between groups. Post-tests favoured the experimental group in all areas, with significant effects in word reading (effect size=+0.13) and word spelling (effect size=+0.12).
Researchers stated that based on the results of both studies, the
optimal recommendation to improve literacy skills for at-risk pupils would be a
double intervention, combining evidence-based practices within small classes.
Source: Effects of policy and educational interventions intended to reduce difficulties in literacy skills in grade 1 (June 2019), Studies in Educational Evaluation, Volume 61
In a review of important 2017 releases, MDRC recently referenced a memo to policymakers with recommendations for increasing research use and applying evidence to all policy decisions, both educational and otherwise.
- Programmes and policies should be independently evaluated. To ensure high quality, evaluations should be directly relevant to policy, free of political or other influence, and credible to subjects and consumers.
- The government should provide incentives for programmes to apply evidence results to improve their performance.
- A tiered evidence strategy, such as that used in the Every Student Succeeds Act, should be utilised to set clear evidence standards.
- Existing funding sources should be applied to generate evidence. A 1% set-aside was recommended.
- Federal and state agencies should be allowed to access and share their data for evaluation purposes.
Source: Putting evidence at the heart of making policy (February 2017), MDRC
A new report from Child Trends reviews the literature on conditions under which US policy-makers are most likely to use research, including the presentation formats that best facilitate their use. The authors, Elizabeth Jordan and P Mae Cooper, offer several insights based on their review of the evidence, including:
- Policy-makers prefer a personal connection or conversation to a written report. One reason the authors cite is that reports are undigested information, meaning they require some expertise to pull out the information that is most relevant to the situation at hand.
- While personal connections are usually best, no legislator can build and maintain relationships with experts in every field. The authors say that usually it is legislative staffers who fill this gap. Reports that summarise findings from a body of research are particularly useful to staffers, as they cover a variety of topics at one time.
- For research to be useful to policy-makers and their staff, it must be relevant. The authors note that the information must relate to current policy debates, show an impact on “real people”, present information that is useful across states or localities, and be easy to read.
- There are some formatting decisions that can help improve a written report’s accessibility. The authors suggest bulleted lists, highlighted text, charts, and graphs to help a policy-maker or staffer quickly absorb the main points of the research.
The report also provides several real-life examples of how research has informed public policy. For instance, the authors describe how rigorous evidence of the short- and long-term positive outcomes for children and families who participated in early childhood home visiting led the Obama Administration to create a new federal home visiting programme.
Source: Building bridges: How to share research about children and youth with policymakers (2016), Child Trends
An OECD report, Education Policy Outlook 2015: Making Reforms Happen, calls for a coherent framework of assessment and analysis of the effectiveness of education reforms. The report looks at the implementation of educational policy reforms through national and international comparisons.
Pasi Sahlberg, visiting Professor of Practice at Harvard University, critiques the report in a recent article. He highlights one of the report’s conclusions that “once new policies are adopted, there is little follow-up” and that only one in ten of the policies considered in the OECD report has been evaluated for its impact. Professor Sahlberg also praises increases in research and policy analysis in some countries, naming the UK and USA as examples.
Source: Education Policy Outlook 2015: Making Reforms Happen (2015), OECD.
The Center for American Progress has released a new report that examines the productivity of US school districts, and the conclusion is that productivity could be improved.
The authors used the results of 2010–11 state reading and maths assessments in elementary, middle, and high schools. They also used three productivity ratings that looked at the academic achievement of districts for each dollar spent, taking into account factors such as cost-of-living differences and concentrations of pupils with English as an Additional Language or with special educational needs.
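The underlying idea of achievement per dollar, with spending deflated for local costs and achievement weighted for pupil need, can be sketched in miniature. The function name, adjustment factors, and figures below are illustrative assumptions, not CAP's actual methodology.

```python
def productivity_index(achievement_score, per_pupil_spending,
                       cost_of_living_factor=1.0, need_factor=1.0):
    """Illustrative achievement-per-dollar ratio: spending is
    deflated for local cost of living, and achievement is
    weighted up for higher concentrations of EAL/SEN pupils."""
    adjusted_spending = per_pupil_spending / cost_of_living_factor
    adjusted_achievement = achievement_score * need_factor
    return adjusted_achievement / adjusted_spending

# Two hypothetical districts with identical raw achievement and
# spending, but district A operates in a higher-cost area.
a = productivity_index(80, 10_000, cost_of_living_factor=1.25)
b = productivity_index(80, 10_000)
print(a > b)  # prints True: same results for less adjusted spend
```

The point of such adjustments is to avoid penalising districts that simply face higher costs or needier intakes, rather than spending less efficiently.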
The report argues that low educational productivity remains a pressing issue, with billions of dollars lost in low-capacity districts. Problems include inconsistent spending priorities (eg, some districts in Texas spend more than 10% of their unadjusted per-pupil operating expenditures on athletics); only a few states taking a weighted approach and distributing money to schools based on pupil need; funding disparity between different school districts within states; and inconsistent budget practices between different states.
The authors conclude that school productivity has not become part of the reform conversation, despite education leaders facing increasingly challenging budget choices. They recommend that:
- States should build capacity for productivity gains through targeted grants, assistance teams, and performance metrics;
- Education leaders should improve accounting procedures to make them more transparent and actionable, and create a multi-state initiative that will focus on building more robust education budgets;
- Educators should come together to improve the quality of fiscal data across states; and
- States and districts should encourage smarter, fairer approaches to school funding, such as pupil-based funding policies.
Source: Return on Educational Investment: 2014. A District-by-District Evaluation of U.S. Educational Productivity (2014), Center for American Progress.
This policy brief from the RAND Corporation examines the impact of child-targeted interventions in early childhood education and care (ECEC) as well as initiatives to widen access to higher education in Europe, and their impact on social mobility in later years. It provides an overview of research on the topic, discusses various policies, and describes a number of case studies on different programmes and practices.
One example presented is the UK Aim Higher initiative, which focused on children from lower socio-economic backgrounds living in areas characterised by low participation in higher education. The aim of the initiative was two-fold: first, to raise the aspirations of potential candidates, and second, to develop the abilities of under-represented groups so they could apply to college. According to the brief, research suggests that the programme appears to have delivered some improvements in exam results, retention, and progression to higher education. However, there appears to be little evidence that it was successful in influencing participants’ attitudes towards higher education.
Overall, key conclusions of the brief include:
- In the context of economic uncertainty, investing in high-quality ECEC appears to be an effective evidence-based social policy tool, although it should not be considered a panacea.
- The level of ECEC provision is very unequal across the EU: to be effective, it needs to be of high quality.
- One way to break the cycle of disadvantage would be to develop ambitious indicators and policy goals that link ECEC provision for under-represented groups to access to higher education.
Source: Breaking the Cycle of Disadvantage: Early Childhood Interventions and Progression to Higher Education in Europe (2014), RAND Corporation.