As evidence-based reform becomes increasingly important in educational policy, it is essential to understand how research design can influence the effect sizes reported in experiments evaluating educational programmes. Educational Researcher has recently published an article examining how methodological features such as publication type, sample size, and research design affect effect sizes in such experiments.
A total of 645 studies from 12 recent reviews of evaluations of reading, mathematics, and science programmes were analysed. The findings suggest that effect sizes are roughly twice as large for published articles, small-scale trials, and experimenter-made measures as for unpublished documents, large-scale studies, and independent measures, respectively. In addition, effect sizes are significantly higher in quasi-experiments than in randomised experiments.
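To make the comparison concrete, an effect size here is a standardised mean difference such as Cohen's d: the treatment-control gap divided by the pooled standard deviation. The sketch below is illustrative and not from the article; all numbers are hypothetical. It shows how the same raw score gain yields a d twice as large when measured on a narrow test with a small spread (as an experimenter-made measure might be) than on a broader test with a larger spread.

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardised mean difference (Cohen's d) using the pooled SD."""
    pooled_sd = math.sqrt(
        ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
    )
    return (mean_t - mean_c) / pooled_sd

# Hypothetical data: the same 5-point gain, 50 pupils per group.
# Narrow test (SD = 10, e.g. closely aligned with the programme):
d_narrow = cohens_d(105, 100, 10, 10, 50, 50)   # -> 0.50
# Broad test (SD = 20, e.g. an independent standardised measure):
d_broad = cohens_d(105, 100, 20, 20, 50, 50)    # -> 0.25

print(d_narrow, d_broad)
```

The identical raw gain appears twice as effective on the narrower measure, which is one mechanism (among several discussed in the article) by which measure choice alone can inflate reported effect sizes.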
Explanations for the effects of methodological features on effect sizes are discussed, as are implications for evidence-based policy.
Source: How Methodological Features Affect Effect Sizes in Education (2016), Educational Researcher