A report published by the Nuffield Foundation finds that computer use in schools does not on its own boost pupils’ digital literacy or prepare them for the workplace.
The report, written by Angela McFarlane, examines how digital
technologies are used in schools to enhance learning, and identifies research
questions to inform better practice and policy. Reviewing ten years of
existing evidence on the effect of digital technology use on learning,
it finds that:
- Putting computers into schools is no guarantee
that there will be a positive impact on learning outcomes as measured in
high-stakes assessments or on the development of digital literacy.
- How digital technologies are used is as
important as whether they are used.
- There is no shared picture of what effective
digital skills teaching looks like.
- Teachers may not have opportunities to develop
the skills they need to make effective use of technology.
- The current use and knowledge of computer-based
technology in schools and at home is leaving many young people unprepared for
the world of work.
Source: Growing up digital: What do we really need to know about educating the digital generation? (July 2019), Nuffield Foundation
Juanjuan Chen and colleagues recently performed a meta-analysis on the effects of computer-supported collaborative learning (CSCL).
Using 425 empirical studies (all with a controlled experimental or quasi-experimental design) published between 2000 and 2016, the researchers examined four main factors: the effects of the collaboration itself; the effects of computer use during collaboration; the effects of extra technology-related learning tools used in CSCL, such as videoconferencing and sharing visuals with team partners; and the effects of supporting strategies such as role assignment and peer feedback.
Collaborative learning itself positively affected:
- Knowledge gain (+0.42)
- Skill acquisition (+0.62)
- Pupil perceptions of the experience (+0.38)
The use of computers, when combined with collaborative learning, positively affected:
- Knowledge gain (+0.45)
- Skill acquisition (+0.53)
- Pupil perceptions (+0.51)
- Group task performance (+0.89)
- Social interaction (+0.57)
Lastly, extra technology-related learning tools during CSCL positively affected knowledge gain (+0.55), as did the use of strategies (+0.38).
Source: The role of collaboration, computer use, learning environments, and supporting strategies in CSCL: A meta-analysis (December 2018), Review of Educational Research, 88(6).
An evaluation published in Educational Evaluation and Policy Analysis examines the impact of the Digital Conversion Initiative on pupil outcomes in one US school district in North Carolina.
The initiative provided laptop computers to every pupil from the fourth grade (Year 5) upwards, while also providing teachers with training on how to best use the technology in their lesson plans.
Marie Hull and Katherine Duch used administrative school data from 2005 to 2013 to determine the programme’s impact on maths and reading achievement for pupils in grades 4 to 8 (Years 5 to 9), as well as the impact of the programme on pupil behaviour. They compared the district’s data from before and after implementation, as well as data from neighbouring school districts without one-to-one programmes to determine the short- and medium-term effects.
Their results suggest there is potential for one-to-one laptop programmes to help improve pupil outcomes. They found that:
- Maths scores improved by 0.11 standard deviations in the short term and 0.13 standard deviations in the medium term.
- Reading scores showed no significant change in the short term, with mixed evidence of improvement in the medium term.
- Time spent on homework stayed constant.
- Pupils spent more of their homework time using a computer.
Source: One-to-one technology and student outcomes: Evidence from Mooresville’s Digital Conversion Initiative (September 2018), Educational Evaluation and Policy Analysis
The National Foundation for Education Research (NFER) has published the results of a randomised controlled trial and process evaluation of Code Clubs – a UK network of after-school clubs where children aged 9–11 learn to program by making games, animations, websites and applications. Code Club UK produces material and projects that support the teaching of Scratch, HTML/CSS and Python. The clubs, which are supported by volunteers, usually run for one hour a week after school during term time.
The evaluation, conducted by Suzanne Straw and colleagues, assessed the impact of Code Clubs on Year 5 pupils’ computational thinking, programming skills and attitudes towards computers and coding. Twenty-one schools in the UK took part in the trial, which used a pupil-randomised design to compare pupil outcomes in the intervention and control groups. Intervention group pupils attended Code Club during the 2015/16 academic year, while control group pupils continued as normal.
The results of the evaluation showed that attending Code Club for a year did not impact on pupils’ computational thinking any more than might have occurred anyway, but did significantly improve their coding skills in Scratch, HTML/CSS and Python. This was true even when control children learned Scratch as part of the computing curriculum in school. Code Club pupils reported increased usage of all three programming languages – and of computers more generally. However, the evaluation data suggests that attending Code Club for a year does not affect how pupils view their abilities in a range of transferable skills, such as following instructions, problem solving, learning about new things and working with others.
Source: Randomised controlled trial and process evaluation of code clubs (March 2017), National Foundation for Educational Research (NFER)
A working paper from the National Bureau of Economic Research reports the findings from a large-scale randomised controlled trial that explores whether owning a home computer has a negative effect on children’s social development.
The study included 1,123 students in grades 6-10 (Years 7-11) in 15 different schools across California. Students were eligible to take part in the trial only if they did not already have a computer at home. Half were then randomly selected to receive free computers, while the other half served as the control group. Surveys were conducted with the students and schools at the start of the school year to collect data on child and household characteristics and school participation. Follow-up surveys were then administered at the end of the school year, and the two sets of data were compared to establish any causal effects.
As predicted, Robert W Fairlie and Ariel Kalil found that having computers at home did increase the amount of time that children spent on social networking sites and email, as well as on games and other entertainment. However, rather than being socially isolating, children in the treatment group communicated with 1.57 more friends per week than children in the control group, and spent 0.72 more hours with their friends in person. The researchers also found no evidence that the children who received a computer were less likely to participate in sports teams or after-school clubs, or spent any less time on these activities.
Source: The effects of computers on children’s social development and school participation: evidence from a randomized control experiment (December 2016), NBER Working Paper No. 22907, The National Bureau of Economic Research
The National Forum on Education Statistics in the US has created a guide to elementary and secondary virtual education data. In the US, the term “virtual education” includes, but is not limited to, digital learning, distributed learning, open learning, online learning, computer-based learning, distance learning, blended learning, and other similar terms. In the document, a Virtual Education Working Group provides recommendations for collecting accurate, comparable, and useful data about virtual education. The guide also provides real-world examples and common practices implemented by state departments, local districts, and schools to modify their data systems and add elements that better reflect the needs unique to virtual education.
The guide was developed to assist state and local education agencies and other education stakeholders, such as policymakers and researchers, as they:
- consider the impact of virtual education on established data elements and methods of data collection; and
- address the scope of changes, the rapid pace of new technology development, and the proliferation of resources in virtual education.
Source: Forum Guide to Elementary/Secondary Virtual Education Data (2016), National Forum on Education Statistics.