AmeriCorps is a US organisation that trains volunteers to serve the community in various civically minded ways. A recent evaluation examined the effects on pupils’ maths achievement of training AmeriCorps volunteers to teach maths strategies to struggling pupils in grades 4–8 (Years 5–9). The volunteers used scripted protocols to teach three strategies, each of which prior research had shown to have positive effects on achievement: concrete-representational-abstract, which uses concrete objects to teach concepts; cover-copy-compare, which teaches steps for computation and provides practice; and cognitive-strategy instruction, which teaches pupils to use procedures and reasoning to solve word problems.
AmeriCorps volunteers had to agree to a year-long, full-time commitment and received four days of training before starting the intervention, with additional training one and two months later. Each school received at least one AmeriCorps volunteer, who was mentored by a school staff member fully trained in the programme.
Subjects were 489 pupils in 150 Minnesota schools who were randomly assigned either to receive the intervention at the start of the school year (n=310) or to a control group that would receive it a few months later (n=179). All pupils had scored below proficient on the prior year’s state maths assessment. During the intervention, pairs of pupils with similar maths scores were to receive maths support for 90 minutes a week for a term. Post-tests using STAR Math were analysed in two ways: the intent-to-treat analysis, which included all pupils assigned to the intervention regardless of how much of it they received, showed significant positive effects compared to the control group (effect size = +0.17); the optimal-dosage analysis, which included only pupils who received the targeted 12 weeks of intervention for at least an hour a week, showed the effect size increasing to +0.24.
Source: … of a math intervention program implemented with community support (May 2019), Journal of Research on Educational Effectiveness, DOI: 10.1080/19345747.2019.1571653
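The effect sizes quoted above are standardised mean differences. As a minimal sketch of how such a figure is computed, the following assumes Cohen’s d with a pooled standard deviation (the evaluators may have used a different estimator, such as Hedges’ g), and the STAR Math scores are hypothetical, chosen purely for illustration:

```python
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardised mean difference (Cohen's d) between a treatment
    and a control group, using the pooled standard deviation."""
    pooled_sd = math.sqrt(
        ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
    )
    return (mean_t - mean_c) / pooled_sd

# Hypothetical post-test means/SDs; group sizes match the study (310 vs 179).
print(round(cohens_d(610, 60, 310, 600, 58, 179), 2))  # prints 0.17
```

With these invented inputs the sketch reproduces an effect size of +0.17, the intent-to-treat result reported above; the actual means and standard deviations in the study were not given here.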
A report published by the Nuffield Foundation finds that computer use in schools does not on its own boost pupils’ digital literacy or prepare them for the workplace.
The report, written by Angela McFarlane, examines how digital
technologies are used in schools to enhance learning, and identifies research
questions to inform better practice and policy. It examines ten years of
existing evidence on the effect the use of digital technology has on learning
and finds that:
- Putting computers into schools is no guarantee that there will be a positive impact on learning outcomes as measured in high-stakes assessments, or on the development of digital literacy.
- How digital technologies are used is as important as whether they are used.
- There is no shared picture of what effective digital skills teaching looks like.
- Teachers may not have opportunities to develop the skills they need to make effective use of technology.
- The current use and knowledge of computer-based technology in schools and at home is leaving many young people unprepared for the world of work.
Source: Growing up digital: What do we really need to know about educating the digital generation? (July 2019), Nuffield Foundation
The Education Endowment Foundation (EEF) has published
findings from a large trial of an approach to “growth mindsets”, which aims to
encourage in pupils the belief that intelligence can be developed through
effort and dedication.
A total of 5,018 pupils from 101 schools in the UK took part
in the trial of Changing Mindsets, a programme designed to improve
maths and literacy grades by teaching Year 6 pupils that their brain potential
is not a fixed entity but can grow and change through effort exerted.
Teachers received professional development training on
approaches to developing a growth mindset, together with lesson plans,
interactive resources and practical classroom tips, before then delivering
sessions to pupils over eight weeks. Teachers were encouraged to embed aspects
of the “growth mindsets” approach throughout their teaching – for example, when
giving feedback outside the sessions.
The independent evaluation, by a team from the National Institute for Economic and Social Research (NIESR), found no evidence that the pupils who took part in the programme made any additional progress in literacy or numeracy – as measured by standardised tests in reading, grammar, punctuation and spelling, and maths – compared to pupils in the control group.
The EEF commentary advises that teachers should be cautious
about using the approach as a standalone method of improving pupil achievement.
The Nuffield Foundation has published a systematic review by researchers at Ulster University that analyses the outcomes of classroom-based mathematical interventions.
The systematic review included studies that assessed the
outcomes of interventions aimed at improving maths achievement in primary
school children. Forty-five randomised controlled trials were included along
with thirty-five quasi-experimental studies. The studies were published between
2000 and 2017, and were mostly conducted in the US and Europe.
The results of the review suggest that there are effective strategies teachers can use to help children learn maths and become fluent with mathematical facts. It also found that there are many ways teachers can support children to build a wide bank of strategies for completing mathematical problems, and to know when it is best to apply them. Technology in the classroom can also be helpful, as long as the tools have been developed with a clear understanding of how children learn.
The report concludes that the evidence base on mathematical interventions is weak, and recommends that researchers test the effectiveness of mathematical interventions in order to support teachers in improving mathematical achievement among primary school-aged children.
Source: … A systematic review (June 2019), Nuffield Foundation
Test anxiety can have negative impacts on pupils’ performance and psychological health. This study published in PLoS One examined whether expressive writing could be beneficial to alleviate test anxiety. Lujun Shen and colleagues conducted a randomised controlled trial among high school pupils in China who were facing The National Higher Education Entrance Examination (Gaokao), which is considered a crucial exam.
The study randomly selected 200 pupils (aged 16–17) from three high schools in Xinxiang city. Pupils were first screened for eligibility, and a sample of 75 pupils with high levels of test anxiety was recruited into the study. Of these, 38 pupils were allocated to an expressive writing group and 37 to a control writing group. Pupils in the expressive writing group were instructed to write for 20 minutes each day, for 30 consecutive days, about the positive emotions they had experienced that day. Pupils in the control writing group were instructed to write about their daily activities for the same period.
Pupils were assessed using the Test Anxiety Scale (TAS) during recruitment (late April) and after the writing period ended (early June). The study also analysed summaries of the writing manuscripts of the 38 expressive writing group pupils for qualitative data. The findings were as follows:
- The expressive writing group scored significantly lower than the control writing group on the Test Anxiety Scale.
- There were no significant gender differences in the post-test TAS scores.
- Qualitative analysis of the writing found more elements of positive emotion in the last ten days of expressive writing compared to the first ten days among the expressive writing group.
The authors suggest that expressive writing is an easy, inexpensive and convenient way to cope with test anxiety, because it requires neither a psychological counsellor nor a specific location.
Source: … of expressive writing in reducing test anxiety: A randomized controlled trial in Chinese samples (February 2018), PLoS One
A meta-analysis in the Journal of Research in Reading has synthesised the findings of studies comparing print and digital text regarding time required to read, reading comprehension and readers’ perceptions of their comprehension. Researcher Virginia Clinton performed a systematic literature review, only including studies using random assignment and that were published between 2008 and 2018, yielding 29 reports of 33 studies for analysis. She found that readers require equal amounts of time to read print and digital text, although screen reading negatively impacted reading comprehension (effect size = -0.25). Readers were more accurately able to judge their comprehension on paper (effect size = +0.20) than on screen.
The negative effect on performance of reading text from screens rather than paper did not differ between adult and child (under 18) readers. However, the author suggests this finding should be interpreted with caution because there were more studies with adult participants (26) than with child participants (7).
Best Evidence in Brief reported on an earlier meta-analysis solely examining reading comprehension, whose results also favoured printed text.
Source: Reading from paper compared to screens: A systematic review and meta‐analysis (May 2019), Journal of Research in Reading, volume 42, issue 2