Do physically active lessons improve pupil engagement?

A study published in Health Education & Behavior looks at the effects of introducing physically active lessons into primary school classes. Emma Norris and colleagues used the Virtual Traveller (VT) intervention to evaluate whether physically active lessons had any effect on pupil engagement, physical activity and on-task behaviour.

Virtual Traveller is a programme of pre-prepared physically active lesson sessions delivered using classroom interactive whiteboards during regular lessons. A total of 219 children aged 8 to 9 years from 10 schools in Greater London took part in the cluster-randomised controlled trial. Children in the intervention schools received 10-minute VT sessions three times a week, for six weeks, during maths and English lessons. To assess the effectiveness of VT, pupils’ physical activity levels, on-task behaviour and engagement were measured at baseline (T0), at weeks two (T1) and four (T2) of the six-week intervention, and at one week (T3) and three months (T4) post-intervention.

Pupils in the intervention group showed more on-task behaviour than those in the control group at T1 and T2, but this was not maintained post-intervention. No difference in pupil engagement between the control and intervention groups was observed at any time point. VT was found to increase physical activity, but only during lesson time.

Source: Physically active lessons improve lesson activity and on-task behavior: a cluster-randomized controlled trial of the “Virtual Traveller” intervention (March 2018), Health Education & Behavior DOI: 10.1177/1090198118762106

Is the pen mightier than the mouse?

Ben Backes and James Cowan from the National Center for Analysis of Longitudinal Data in Education Research (CALDER) have published a working paper on the differences between computer-based and paper-based tests.

In 2015, Massachusetts introduced the new PARCC assessment. School districts could choose whether to use the computer or paper versions of the test, and in 2015 and 2016, districts were divided fairly evenly between the two. The authors use this division to compare results for pupils in Grades 3–8 (Years 4–9).

Pupils who took the online version of PARCC scored about 0.10 standard deviations lower in maths and about 0.25 standard deviations lower in English than pupils taking the paper version of the test. When pupils took computer tests again the following year, these differences reduced by about a third for maths and by half for English.
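
As a rough illustration of what those reductions imply (using only the approximate figures quoted above, not the paper’s underlying data), the second-year gaps work out to roughly 0.07 standard deviations in maths and 0.12 in English:

```python
# Illustrative arithmetic only, based on the approximate figures quoted above;
# not a reanalysis of the paper's data.
first_year = {"maths": -0.10, "english": -0.25}  # online vs paper gap, in SD units
shrinkage = {"maths": 1 / 3, "english": 1 / 2}   # approximate second-year reduction

second_year = {subject: round(gap * (1 - shrinkage[subject]), 3)
               for subject, gap in first_year.items()}
print(second_year)  # {'maths': -0.067, 'english': -0.125}
```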

The study also looked at whether the change to computer tests affected some pupils disproportionately. There were no differences for maths, but for English there was more of an effect on pupils at the bottom of the achievement distribution, pupils with English as an additional language and special education pupils.

The authors point out that these differences have consequences not only for individual pupils, but also for other decisions based on the data, including teacher and school performance measures and the analysis of schoolwide programmes.

Source: Is the pen mightier than the keyboard? The effect of online testing on measured student achievement (April 2018), National Center for Analysis of Longitudinal Data in Education Research, Working Paper 190

Maths anxiety, working memory and self-concept

A study conducted by researchers at the University of Jaén, Spain, and published in the British Journal of Educational Psychology looks at the relationship between maths anxiety and maths performance in primary school children, and also the possible mediating role of working memory and maths self-concept.

A total of 167 pupils in grades 3 and 5 (aged 8–12) took part in the study. Each pupil completed a set of questionnaires to assess maths anxiety and self-concept, as well as their mathematical performance. Working memory was assessed using two backward span tasks. Teachers were also asked to rate each pupil’s maths achievement.

As expected, results showed that pupils who demonstrated higher levels of anxiety about maths tended to have lower scores on maths outcomes such as ability, problem‐solving and teacher‐rated maths achievement. However, this relationship was lessened once the effects of working memory and self-concept were considered. The researchers suggest, therefore, that it is worth taking into consideration working memory and self-concept when designing interventions aimed at helping pupils with maths anxiety.

Source: Math anxiety and math performance in children: The mediating roles of working memory and math self‐concept (May 2017), British Journal of Educational Psychology, Volume 87, Issue 4

New evidence on maths teaching

A new review of evidence, commissioned by the EEF and the Nuffield Foundation, analyses the best available international research on teaching maths to children aged 9–14 to find out what the evidence says about effective maths teaching. It highlights some areas of maths teaching – such as feedback, collaborative learning and different types of textbooks – and considers what the evidence says, and how much evidence there is.

One area where there is strong evidence is using calculators to support learning. The report suggests that, contrary to previous concerns, using calculators may not harm pupils’ maths skills. In fact, using them in maths lessons can boost pupils’ calculation and problem-solving skills if they are used in a thoughtful and considered way.

Other findings include:

  • Maths homework tends to benefit older pupils, but not those in primary school
  • Teacher subject knowledge is crucial for realising the potential of maths resources and interventions to raise attainment
  • High-quality feedback tends to have a large effect on learning, but it should be used sparingly and mainly for more complex tasks

Source: Evidence for review of mathematics teaching: Improving mathematics in Key Stages two and three: Evidence review (March 2018), Education Endowment Foundation

A thirty-year look at studies on computer-assisted maths

During the past 30 years, thousands of articles have been written about technology’s effects on pupil achievement. In order to quantify technology’s effects on maths achievement, Jamaal Young at the University of Texas conducted a meta-analysis of all of the meta-analyses on the topic from the last three decades. His second-order meta-analysis comprised 19 meta-analyses representing 663 primary studies, more than 141,000 pupils and 1,263 effect sizes. Each meta-analysis that was included had to address the use of technology as a supplement to instruction, use pupil maths achievement as an outcome measure, report an effect size or enough data to calculate one, have been published after 1985 and be accessible to the public.

The author found that all technology enhancements positively affected pupil achievement, regardless of the technology’s purpose. However, technology that helped pupils perform computational functions had the greatest effects on pupil achievement, while combinations of enhancements had the smallest effects. The author found that study quality and the type of technology used in the classroom were the main influences on effect sizes. The highest-quality studies had the lowest effect sizes, which he attributes to their more rigorous analysis procedures. The high-quality reviews gave an overall effect size for the use of technology of +0.16 (compared with +0.38 for low- and +0.46 for medium-quality reviews).
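
For readers unfamiliar with second-order meta-analysis, one common way to combine the mean effects reported by individual meta-analyses is an inverse-variance weighted average. The sketch below is purely illustrative, using hypothetical effect sizes and variances; it is not the weighting model used in the paper.

```python
# Minimal sketch of inverse-variance weighted pooling across meta-analyses.
# The effect sizes and variances below are hypothetical, not the paper's data.

def pooled_effect(effects, variances):
    """Inverse-variance weighted mean of the supplied effect sizes."""
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

effects = [0.46, 0.38, 0.16]        # mean effect size from each meta-analysis
variances = [0.004, 0.003, 0.001]   # variance of each mean effect
print(round(pooled_effect(effects, variances), 2))  # pooled estimate, weighted towards the most precise review
```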

Source: Technology-enhanced mathematics instruction: A second-order meta-analysis of 30 years of research (November 2017), Educational Research Review, Volume 22

Do teacher observations make any difference to pupil performance?

An evaluation published by the Education Endowment Foundation (EEF) has found that introducing more frequent and structured lesson observations – where teachers observe their colleagues and give them feedback – made no difference to pupils’ GCSE maths and English results.

A randomised controlled trial of the whole-school intervention Teacher Observation was conducted in 82 secondary schools in England, which had high proportions of pupils who had ever been eligible for free school meals. In total, the study involved 14,163 pupils – 7,507 pupils (41 schools) in the intervention, and 6,656 pupils (41 schools) in the control.

Maths and English teachers in the intervention schools were asked to take part in at least six structured 20-minute peer observations over a two-year period (with between 12 and 24 suggested). Teachers rated each other on specific elements of a lesson, such as how well they managed behaviour, engaged pupils in learning, or used discussion techniques.

The evaluation, which was conducted by a team from the National Foundation for Educational Research (NFER), found that Teacher Observation had no impact on pupils’ GCSE English and maths scores compared to those of pupils in control schools (effect size = -0.01).
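
An effect size here is a standardised mean difference: the gap between the intervention and control group means divided by a pooled standard deviation. The sketch below uses hypothetical score summaries simply to show why a value of about -0.01 is read as no detectable impact; it is not NFER’s actual analysis.

```python
import math

# Standardised mean difference (Cohen's d) with a pooled standard deviation.
# The group summaries below are hypothetical, chosen only to give a value
# near the reported -0.01; they are not the trial's data.

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Difference in group means divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

print(round(cohens_d(4.99, 1.5, 7507, 5.005, 1.5, 6656), 2))  # -0.01
```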

Source: Teacher Observation: Evaluation report and executive summary (November 2017), Education Endowment Foundation