Evaluation of Education Programmes and Policies

A special issue of Education Sciences (ISSN 2227-7102).

Deadline for manuscript submissions: 30 April 2024

Special Issue Editors

Prof. Dr. Beng Huat See
Guest Editor
School of Education, Durham University, Durham DH1 1TA, UK
Interests: teacher supply; teacher development; teacher effectiveness; education policy; parental involvement; critical thinking; arts education; evaluation of education programmes; research methods

Loraine Hitt
Guest Editor Assistant
School of Education, Durham University, Durham DH1 1TA, UK
Interests: research synthesis; metacognition and self-regulated learning; teacher supply; evaluation of education programmes; STEM learning; English as a second language (ESL)

Special Issue Information

Dear Colleagues,

The last decade has been an exciting era for education research. It has seen a burgeoning of rigorous evaluations of education programmes and policies, not only in the UK, US and Europe, but also in Latin America, Africa, Asia and elsewhere. New and more robust approaches have been developed, tried and applied to the evaluation of education programmes.

More organisations and countries are recognising the importance of evidence-based practice in education. The Education Endowment Foundation (EEF) in the UK and the Institute of Education Sciences (IES) in the US are just two examples of organisations committed to evaluating education programmes and policies so that those used in schools are based on the best available evidence.

International assessments, such as PISA, TIMSS, PIRLS and TALIS, provide valuable data on the performance of students and education systems across different countries and regions, which can be used to inform education policies and practices. Access to administrative datasets has also made it possible to evaluate the effectiveness of education policies on a large scale. The use of robust research methods and the availability of data have greatly enhanced our ability to evaluate education policies and programmes and make evidence-based decisions. This Special Issue aims to curate a collection of such evaluations.

Why do education programmes and policies need to be evaluated?

Whilst many education programmes have been evaluated, there remains an abundance of classroom practices that have not been rigorously tested. For example, there is still no clear evidence on whether class size reduction, streaming or tracking, academic selection, academisation of schools (in England) or grade retention are beneficial, for which phase of education and in what context. Further, the proliferation of education technology and online teaching resources, especially during the lockdowns of the recent COVID-19 pandemic, has seen schools adopt many of these tools. Most have not been tested, and some may, indeed, be detrimental to learning.

Evaluating education policies and practices is crucial to ensure that our children receive the best possible education. It is important to recognise that different contexts and different groups of children may require different approaches to education. What works well in one school or community may not work as well in another, and what works for one group of students may not work for another. Evaluating education programmes and policies allows us to understand these nuances and tailor our approach to education to best meet the needs of all students.

In this Special Issue, we would like to invite papers that evaluate education policies, practices and programmes across all phases of education, from the early years to higher education. We are particularly interested in rigorous research in the evaluation of education programmes and policies: systematic reviews, meta-analyses and experimental studies, such as randomised controlled trials, as well as quasi-experimental designs, e.g., regression discontinuity and difference-in-differences approaches. We also welcome articles that focus on methodological issues, conceptual pieces on policy evaluation and replication studies.
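As a minimal illustration of one of these quasi-experimental designs, the difference-in-differences logic can be sketched with simulated, hypothetical data (the group means and effect size below are invented for illustration, not drawn from any study):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # pupils per group

# Hypothetical outcomes: a common time trend of +2 points for everyone,
# and a true programme effect of +5 points for the treated group post-intervention.
pre_control = rng.normal(50, 10, n)
post_control = rng.normal(52, 10, n)           # trend only
pre_treated = rng.normal(48, 10, n)            # baseline difference from control
post_treated = rng.normal(48 + 2 + 5, 10, n)   # trend + programme effect

# Difference-in-differences: change in the treated group minus change in
# the control group, which nets out the shared trend and baseline gap.
did = (post_treated.mean() - pre_treated.mean()) - \
      (post_control.mean() - pre_control.mean())
print(f"Estimated programme effect: {did:.2f}")  # should be near the true +5
```

The same estimate is usually obtained by regressing the outcome on group, period and their interaction, which also yields standard errors; the arithmetic above shows only the identifying comparison.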

If you would like to contribute to this Special Issue, please submit an abstract of between 250 and 300 words.

Prof. Dr. Beng Huat See
Guest Editor

Loraine Hitt
Guest Editor Assistant

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a double-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Education Sciences is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • programme evaluation
  • policy evaluation
  • systematic reviews
  • education interventions
  • experimental designs
  • process or implementation evaluations

Published Papers (1 paper)


Research

12 pages, 794 KiB  
Article
An Investigation of the Cross-Language Transfer of Reading Skills: Evidence from a Study in Nigerian Government Primary Schools
by Steve Humble, Pauline Dixon, Louise Gittins and Chris Counihan
Educ. Sci. 2024, 14(3), 274; https://doi.org/10.3390/educsci14030274 - 06 Mar 2024
Abstract
This paper investigates the linguistic interdependence of Grade 3 children studying in government primary schools in northern Nigeria who are learning to read in Hausa (L1) and English (L2) simultaneously. Few studies in the African context consider linguistic interdependence and the bidirectional influences of literacy skills in multilingual settings. A total of 2328 Grade 3 children were tested on their Hausa and English letter sound knowledge (phonemes) and reading decoding skills (word) after participating in a two-year English structured reading intervention programme as part of their school day. In Grade 4, these children will become English immersion learners, with English becoming the medium of instruction. Using bivariate correlations, we find a large, strongly positive and significant correlation between L1 and L2 test scores. Concerning bidirectionality, a feedback path model illustrates that the L1 word score predicts the L2 word score and vice versa. Multi-level modelling is then used to examine the variation in test scores: almost two thirds of the variation in word scores is attributable to the pupil level and one third to the school level. The Hausa word score is significantly predicted by the Hausa sound and English word scores; the English word score is significantly predicted by the Hausa word and English sound scores. The findings have implications for language policy and classroom instruction, showing the importance of cross-language transfer between reading skills. The overall results support bidirectionality and linguistic interdependence.
(This article belongs to the Special Issue Evaluation of Education Programmes and Policies)
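The variance partition reported in the abstract above (roughly two thirds of the variation at pupil level, one third at school level) is what a multi-level model's intraclass correlation (ICC) captures. A minimal sketch with simulated data — not the study's data; the school and pupil standard deviations are invented so that roughly a third of the variance sits at the school level — using the classic one-way ANOVA estimator:

```python
import numpy as np

rng = np.random.default_rng(1)
n_schools, pupils_per_school = 40, 30

# Hypothetical two-level data: variances 25 (school) and 49 (pupil)
# give a true ICC of 25 / 74, about 0.34.
school_sd, pupil_sd = 5.0, 7.0
school_effects = rng.normal(0, school_sd, n_schools)
scores = school_effects[:, None] + rng.normal(0, pupil_sd, (n_schools, pupils_per_school))

# One-way ANOVA estimator of the variance components.
grand_mean = scores.mean()
school_means = scores.mean(axis=1)
ms_between = pupils_per_school * ((school_means - grand_mean) ** 2).sum() / (n_schools - 1)
ms_within = ((scores - school_means[:, None]) ** 2).sum() / (n_schools * (pupils_per_school - 1))

school_var = (ms_between - ms_within) / pupils_per_school
icc = school_var / (school_var + ms_within)
print(f"Estimated share of variance at school level (ICC): {icc:.2f}")
```

In practice a fitted mixed-effects model (e.g. with a random school intercept) would be used rather than this hand-rolled estimator, but the decomposition of variance into between-school and within-school components is the same.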

Planned Papers

The list below represents only planned manuscripts. Some of these manuscripts have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.

Title: Advancing evaluation of local system change: learning from evaluating the Family Hubs initiative

Abstract: Whilst significant advances have been made in the evaluation of educational interventions, for example through the work of the Education Endowment Foundation in the UK, progress in the evaluation of complex local system change is slower. In this paper we reflect on recent evaluations focussed on England's Family Hubs place-based initiative, which aims to join up services for families of young children in local areas. The evaluations consisted of a deep dive into the operation of one Local Authority's Family Hubs approach, and a study of the application of behavioural science approaches to understand engagement with Family Hubs. The article draws out a set of key challenges grouped around theorisation, design, implementation and backdrop (building on Stame's (2010) work on evaluation failure and Rog's (2012) categorisations of context), identifying methodological solutions and the gaps in these that require further work.

Title: Where are the costs? The application of cost methodology for a region-wide evaluation of a formative assessment programme

Abstract: Education systems are moving towards a more evidence-informed paradigm, with a focus at all levels on using evidence to support practice and policy making. Good-quality research can help decision makers identify more promising approaches to improve learner outcomes, but there are gaps in the evaluative framework (Levin, 2017). Understanding the economic impact of choices when determining education provision is an important consideration, but there are currently very few examples of this described in education (Kraft, 2020). This paper evaluates the Formative Assessment Implementation Project (FAIP), a regional school improvement project designed to improve teachers’ understanding and use of a range of formative assessment principles and strategies to improve learner outcomes. It describes the use of a mixed-methods quasi-experimental design to explore the impact on a range of learner outcomes, including measures of wellbeing, health utility and attainment, alongside an economic impact evaluation. Using methodology commonly employed in health economics to evaluate economic costs, the paper explores the utility of these evaluation tools in education. A cost-consequence analysis (CCA) was performed, and we present an evaluation of learner outcomes considered alongside the cost of the intervention. The function of CCA is to allow decision makers to decide which outcomes are most relevant to their context whilst still being able to understand the cost of the project. The paper discusses the difficulties of evaluating large-scale universal interventions in education (Greenberg & Abenavoli, 2017) and considers the tools needed to improve the quality and robustness of economic evaluations in education research. It concludes with recommendations for evaluators to routinely include cost methodology in future research studies.

Title: Evaluating an active multi-component dissemination approach to improve teachers’ use of research evidence in practice
Author: Erkan
Highlights:
  • Getting evidence into use is not a straightforward process.
  • Asking teachers to judge the quality of evidence may unintentionally lead them to develop more sceptical attitudes towards all evidence.
  • Perhaps teachers should be given evidence whose quality has already been judged, rather than being asked to assess it themselves.
  • Educators need more advanced approaches to disseminating evidence.
