Reviewing the evidence on career and technical education

A new report by Rachel Rosen and colleagues at MDRC reviews the research evidence for various types of career and technical education (CTE) programmes, examining both the amount of evidence available in each area and its level of rigour. The report describes several CTE programme types (eg, instruction and training, apprenticeships, and readiness skills training) and reviews the literature supporting each one.

Key findings were as follows:

  • The most evidence exists for CTE coursework and training. In that area, multiple studies suggest that participation in CTE can improve pupils’ outcomes. In addition, multiple studies have found that career-related certificates and associate’s degrees are linked to increased wages.
  • Several career pathway models, particularly career academies and early college high schools, are also supported by strong, rigorous studies that provide evidence of positive benefits for pupils.
  • The evidence for other models and for individual programme components is weaker. The authors suggest that these models and components need further evaluation.

Source: Career and technical education: current policy, prominent programs, and evidence (September 2018), MDRC

Straight talk on evidence

The Laura and John Arnold Foundation (LJAF), which sponsors the US version of Best Evidence in Brief, has launched a new initiative called Straight Talk on Evidence.

The purpose of the initiative is to “distinguish credible findings of programme effectiveness from the many others that claim to be, through an easy-to-read, no-spin digest of recent programme evaluation findings.”

For example, the site presents highlights of a report on preventing youth crime. LJAF reviewed a randomised controlled trial (RCT) of Reading for Life, a mentoring and character development programme for young offenders in the US. The review found this to be a well-conducted RCT, showing that the programme reduced the rate of subsequent arrests. The study’s main limitation is that it was conducted in a single town in Indiana.

Source: Promising new evidence in the effort to prevent youth crime (August 2017), Straight Talk on Evidence

Using research to improve teaching practice

The Education Endowment Foundation has reported on two studies that looked at using education research to improve teaching practice.

Research into Practice was a pilot intervention aimed at helping teachers use evidence-based teaching and learning strategies to improve pupil progress. The project ran for a year in ten primary schools in Rochdale (north-west England). It involved professional development sessions and direct consultant support to help teachers:

  • Have more positive views about the use of research for improving teaching and learning;
  • Apply education research findings in the classroom and at a strategic level; and
  • Establish a stronger culture of evidence-based inquiry and practice.

There were some positive changes in teachers’ attitudes toward research. However, there was no evidence that teachers were more likely to use research evidence to inform their teaching practice.

The Research Champions project used a senior teacher based at one of five schools to work with research leads, other teachers, and senior leaders to promote engagement with research evidence. There were “audits” of school research needs, research symposia for teachers, periodic research and development forums, and personalised support. However, there was no evidence that teachers’ attitudes toward research, or their use of research, changed during the intervention.

Source: Research into Practice – Evidence-informed CPD in Rochdale and Research Champions (2016), Education Endowment Foundation

What works to increase research use?

A new systematic review from the EPPI-Centre at the Institute of Education looks at what works to increase research use by decision-makers. It synthesises 23 existing reviews judged to be relevant and of adequate methodological quality.

There was reliable evidence that the following were effective:

  • Interventions that facilitate access to research evidence (for example, through communications strategies and evidence repositories), provided the intervention also seeks to enhance decision-makers’ opportunity and motivation to use evidence.
  • Interventions that build decision-makers’ skills in accessing and making sense of evidence (such as critical appraisal training programmes), provided the intervention also seeks to enhance both their capability and their motivation to use research evidence.

There was limited evidence that evidence use is enhanced by interventions that change decision-making structures and processes by formalising and embedding one or more of the other mechanisms of change within them (such as evidence-on-demand services that integrate push, user-pull, and exchange approaches).

There was also reliable evidence that some intense and complex interventions lead to an increase in evidence use. Overall, though, simpler and more clearly defined interventions appear to have a better likelihood of success.

Source: The Science of Using Science: Researching the Use of Research Evidence in Decision-Making (2016), EPPI-Centre

Implement, educate, evaluate

An OECD report, Education Policy Outlook 2015: Making Reforms Happen, calls for a coherent framework of assessment and analysis of the effectiveness of education reforms. The report looks at the implementation of educational policy reforms through national and international comparisons.

Pasi Sahlberg, Visiting Professor of Practice at Harvard University, critiques the report in a recent article. He highlights the report’s conclusion that “once new policies are adopted, there is little follow-up”, noting that only one in ten of the policies considered in the OECD report has been evaluated for its impact. Professor Sahlberg also praises increases in research and policy analysis in some countries, naming the UK and USA as examples.

Source: Education Policy Outlook 2015: Making Reforms Happen (2015), OECD

Success in evidence-based reform: The importance of failure

The latest blog post from Robert Slavin, a professor at the IEE and director of the Center for Research and Reform in Education, considers the large number of randomised experiments evaluating educational programmes that find few achievement effects. This problem will take on increasing significance as results from the first cohort of the US Investing in Innovation (i3) grants are released.

At the same time, the Education Endowment Foundation in the UK, which runs a grant programme much like i3, will also begin to report outcomes. It’s possible that the majority of these projects will fail to produce significant positive effects in rigorous, well-conducted evaluations. However, there is much to be learned in the process. For example, the i3 process is producing a great deal of information about what works and what does not, what gets implemented and what does not, and the match between schools’ needs and programmes’ approaches.