Article

No Budge for any Nudge: Information Provision and Higher Education Application Outcomes

1 The Faculty of Education, University of Cambridge, Cambridge CB2 1TN, UK
2 Behavioural Insights Team, Centre for Transforming Access and Student Outcomes (TASO), London SW1H 9EA, UK
* Author to whom correspondence should be addressed.
Educ. Sci. 2022, 12(10), 701; https://doi.org/10.3390/educsci12100701
Submission received: 16 August 2022 / Revised: 27 September 2022 / Accepted: 6 October 2022 / Published: 13 October 2022
(This article belongs to the Special Issue Transition to Higher Education: Challenges and Opportunities)

Abstract:
Despite increasing efforts to widen access, students facing socio-economic disadvantages are still underrepresented in UK higher education. In this paper, we study whether behavioural nudging with information provision through text messages, embedded within a larger programme of widening participation activities, can be effective at increasing higher education application rates. We conducted two randomised control trials in which final year students in schools and further education colleges in areas of the East of England region with low higher education participation rates received a series of text messages that prompted thinking and/or action regarding the process of applying to higher education. We find null, statistically insignificant effects on application outcomes, suggesting that behavioural nudging is not effective at increasing higher education application rates in a setting where it is implemented as part of a more intensive widening participation programme. These results add to recent evidence regarding the potential impact of nudging in education by studying such an intervention within a busy intervention space.

1. Introduction

The need for policy interventions to raise higher education access and participation rates among people who face socio-economic disadvantages is widely acknowledged in the academic literature. Existing evidence from England, the context of our study, suggests that the socio-economic gap in higher education participation can be explained to a great extent by differences in prior attainment during secondary school, rather than by barriers arising at the point of entry to higher education [1]. At the same time, when people from less advantaged socio-economic backgrounds do participate in higher education, they are less likely to make optimal decisions in terms of choice of institution or course in comparison to their similarly achieving but more advantaged peers [2]. Both the higher education application process and the decision-making around it are recognised as complex [3,4]. Such evidence highlights the need for policy interventions that can effectively support the attainment-raising efforts of schools and colleges, potentially simplify processes and the information space around them, and provide gentle nudges that help individuals make the best possible use of such information, whichever post-compulsory education pathway they choose [5,6].
In England, the recent period has seen the implementation of a variety of outreach and widening participation programmes and interventions in higher education. These interventions have attracted substantial governmental, institutional, and third-sector funding. With outreach and widening participation programming growing and diversifying, causal evidence of impact is increasingly important for improving funders’ decisions regarding which programmes to prioritise, providers’ decisions about the format and content of such programmes, and young people’s decisions about which programmes to attend. This evidence, however, is often lacking [7] or faces the challenge of so-called ‘black box’ programmes. There is some evidence [7] that these ‘black box’ programmes, often encompassing a variety of interventions, are effective, especially when combining, for instance, the simplification of application processes with individual support [8], but disentangling their constituent parts is often difficult. Further evidence [5,9,10] suggests that particular types of interventions, specifically nudging programmes which provide participants with relevant information and seek to spark certain behaviours, are effective at improving both application rates and take-up of places in higher education. Even as very recent evidence finds small and inconsistent effects from nudge interventions [11], the provision of information is an established form of higher education outreach in the English context, and there is some causal evidence [12] that, alongside further support, the provision of accurate information increases application rates, particularly to selective universities.
It is therefore essential to understand the impact of discrete, stand-alone interventions on higher education applications and access, especially when they are delivered in an otherwise busy intervention space. In this paper, we present results from an experimental study evaluating how a behavioural nudging information-provision intervention affected higher education application rates when deployed within a larger programme of higher education outreach in the East of England region.

1.1. Aims of the Paper

We report results from two randomised control trials (RCTs) designed to evaluate the effect of a light-touch information-providing (via text message) nudge intervention embedded within a larger programme of widening participation activities. The aim of the specific nudge intervention was to increase the probability of students applying to higher education. The trials, using the same experimental design in each of the two iterations of trialling, involved a total of 935 students in the final year of compulsory-age education, enrolled in 58 schools and colleges in the East of England region during the academic years 2017–2018 and 2018–2019.
The programme within which this intervention was embedded is the ongoing Take Your Place programme (hereafter, the programme) undertaken since 2017 by the Network for East Anglian Collaborative Outreach (neaco). The programme targets students living in socio-economically deprived geographic areas and delivers outreach activities in schools and further education colleges with high proportions of such students. The aim of the programme is to improve access to broadly defined higher education, by helping students explore their options and academic potentials. We return to a fuller description of the programme after discussing the relevant evidence background.

1.2. Evidence Background

The paper makes several contributions to the emerging literature that employs field experiments to examine nudging as a potential high-benefit and low-cost approach to improving educational outcomes. Although nudging interventions have become increasingly popular in the field of behavioural economics, there is mixed evidence on their overall effectiveness [11]. Field experiments that provide information or reminders to students about the college application process and financial aid availability and eligibility, without the accompanying offer of professional assistance, typically have not led to higher rates of college access or success [13,14,15]. Using US data, Phillips and Reber [16] found no improvement in low-income students’ higher education enrolment rates when they were provided with the information and support that higher-income students typically have. Similarly, Carrell and Sacerdote [17] found that providing students with information on the benefits of attending college had no impact on their attendance and persistence. More recently, Avery et al. [18] found null and negative effects of text message-based outreach on improving US students’ college choices and outcomes at a national level, in contrast to positive and significant effects identified from the same intervention when delivered in specific school districts in Texas. In the German context, a separate study [19] found that information nudges increased higher education enrolment for students from a non-academic family background while decreasing (at least in the short term) the enrolment intentions of students from academic backgrounds.
Overall, studies on nudging interventions in education appear to provide mixed evidence of the effectiveness of such approaches for higher education access and participation outcomes. A recent and comprehensive review of the nudging literature in education [6] suggests that nudging interventions can have broad and long-term effects on overall student outcomes but are not effective in all contexts or for all students. A key conclusion of that review highlights the importance of understanding the underlying behavioural mechanisms potentially resulting in application and access, and how closely interventions designed to impact these behavioural mechanisms match them. Furthermore, the broader evidence also suggests the importance of clarifying the conditions under which such interventions can facilitate behaviour change [20], including in terms of the wider context in which they are delivered.
Our focus on the effectiveness of a nudging intervention (delivered through text messages) on increasing university application rates, when the intervention is embedded within a wider range of widening participation activities, is a non-trivial contribution to this literature. Producing such evidence is important for informing education policy design and for understanding individual decision making. We suggest that by varying one aspect of a programme’s provision, it is possible to capture the effectiveness and efficacy of specific programme components. This may contribute to the development of evidence-based practices for widening participation and outreach practitioners. In addition, we verify the robustness of our empirical findings by implementing the trial in two consecutive academic years and finding identical results.
We further contribute to the above debates by providing evidence within a context in which a lack of information, advice, and guidance may be a major driver of socio-economic inequalities in higher education applications. This is relevant both in terms of participation and in terms of access to selective institutions [2,21]. It is also relevant because credit constraints [22] and geographic isolation [23] have previously been identified as equally important drivers of inequalities in higher education access among students from low socio-economic backgrounds in England. Such factors may be mitigated by an income-contingent loan system that covers the entirety of students’ tuition fees, and separately by increasing patterns of localisation, whereby students travel relatively short distances to reach a higher education institution [2]. Against this backdrop, generating evidence around the effectiveness of an outreach intervention with a clear mechanism, which may be delivered straightforwardly and at relatively low cost, is important.
Finally, we present evidence on the differential impact of the intervention by pre-determined observable student characteristics, complementing previous findings which suggest that such interventions may be effective mostly for particular sub-groups of students.

2. The Intervention

The nudging intervention we explore in this paper has been delivered as part of one of the several regional partnerships under Uni Connect, a large-scale government-supported initiative. We provide context to Uni Connect and the relevant evidence about its effectiveness below, before describing the specific regional Uni Connect partnership and its outreach programme. We then provide comprehensive information about the nudging intervention.

2.1. The National Context in England

Despite the increasing number of young people accessing higher education, young people facing socio-economic deprivation are still less likely to progress to higher education in England [24]. Factors which are associated with lower progression primarily focus on attainment at school [1], but also include being the first in the family to potentially attend higher education (an aspect associated with relatively less available knowledge of the higher education system) [25], and the economic circumstances of the household [26]. In addition to this, changing labour market conditions [27] and perceptions of their individual potential experiences in higher education [28] also contribute to changing intentions in relation to higher education applications [29].
To tackle these enduring inequalities, a large range of widening participation, fair access, and outreach programmes have been implemented in England. A relatively recent national programme is the government-funded Uni Connect initiative, which seeks to increase participation across all types of higher education provision (from university to vocational routes) by taking a place-based approach and working regionally with universities, further education colleges, and schools to deliver tailored programmes of outreach activity. Recent evidence [30,31] suggests that the impact of Uni Connect mirrors its complex nature. This large-scale evaluation work [30,31] covers the full national programme and finds that both the range of interventions delivered and their relative impact vary by geography. It also finds that the overall impact of the programme is either negligible or has not been possible to attribute causally to the existence of the initiative. This should be read against a backdrop of recent disruptions and negative impacts from the pandemic and the associated public health crisis. While a further national-level evaluation is still underway, existing findings already suggest a need to disaggregate the constituent parts of Uni Connect activity, much as our present study does. Similarly, the current shift in direction for the next several years of Uni Connect towards attainment-raising interventions means that the earlier stages of the programme offered a unique opportunity to explore an intervention aimed at increasing application rates, rather than any other aspect of the higher education access process. Our study capitalises on this opportunity.

2.2. The Network for East Anglian Collaborative Outreach

The Network for East Anglian Collaborative Outreach (neaco) is one of the 29 regional partnerships under Uni Connect. The partnership has operated since 2017 in the East of England region with support from all universities and further education colleges in the region and delivers Take Your Place, its flagship outreach programme, in areas where the higher education participation of young people is low, and much lower than expected given average attainment at age 16 and socio-economic composition. Students from these areas are classified as “target students” and represent the specific group whose progression to higher education is the key focus of this programme. A total of 106 schools and eight further education colleges were involved in the programme for the period under investigation in this paper.
The programme is special in that the overall approach adopted in the delivery of activities is based on a progressive framework that seeks repeated interactions with students. This is a key feature of the wider, national, and government-supported Uni Connect programme that for the past five years has dominated the higher education outreach landscape in England, alongside higher education provider- and third sector-driven activity. In participating schools and colleges, this translates into Take Your Place being delivered in a way that varies in each school or college, adapting to the needs of each educational setting, their environment, and the available resources.
There are two central foci for the outreach activity delivered by Take Your Place. The first prioritises the development of students’ understanding and preparedness by focusing on the specific requirements, means, and option choices through which students can realise their aspirations for transitions between the key stages of the English educational system and into higher education. The second strand of activities focuses on passion and ambition, enabling students to explore, identify, and articulate their passions and aspirations, and giving positive incentives for choosing post-16 and post-18 pathways. The activities delivered by the so-called Higher Education Champions (HECs), outreach specialists usually embedded in schools, and by their college-based counterparts, range from information sessions and university campus visits to summer schools and community engagement opportunities.
At the time of the delivery of the first iteration of the behavioural nudging intervention explored in this paper (2017–2018) and the first trial, the delivery of Take Your Place was in its relatively early stages. By 2018–2019, the time of the second iteration of the intervention and the second associated trial, Take Your Place was far more established, both in terms of the scope and the range of activities being delivered. As a recent report for the programme illustrates [32], there continues to be substantial variation in the range of activities that different schools and colleges engage in as part of Take Your Place, with levels of individual engagement with the programme monitored by the programme team and the target of separate analysis elsewhere. This is an important point as it relates to the potential of the nudging intervention to effect change in an increasingly busy intervention space, an issue we return to in discussing our design of the trial and the implications of our findings.
In addition to the in-school and in-college outreach activities provided by Take Your Place, in its first two years the programme also included a light-touch information-provision element.
This light-touch behavioural nudging intervention is the focus of the randomised control trials reported in this paper and is described below.

2.3. The Behavioural Nudging Intervention

In addition to progressive and sustained provision detailed above, the Take Your Place programme included a light-touch information-provision component. The objective of this behavioural nudging intervention was aligned with the programme’s overarching aim, which is to improve the higher education application rates of participating students. The intervention aimed to do this through the provision of easily understandable information that students could act upon. A secondary aim of delivering this intervention was to enable the exploration of this type of information-provision nudging intervention in terms of its effectiveness.
The intervention was delivered in two consecutive school years (2017–2018 and 2018–2019), with only minor differences between the two years, all relating to the accuracy of the information provided via text messages to individual participants: the specific dates and deadlines were updated, and the links to any online material shared with students were updated. Otherwise, the intervention was materially the same.
The content of the information related principally to the process of applying to higher education through the Universities and Colleges Admissions Service (UCAS). UCAS is the centralised national admissions service, covering all universities and a number of other higher education providers. Individuals wanting to apply to university (or to the other available types of higher education providers) make one single application through UCAS, to a total of up to five separate degree courses in each year. The intervention studied here provided participants with information about preparatory steps (e.g., drafting a personal statement, identifying appropriate degree courses), practical issues (e.g., navigating the UCAS website and application portal, finding the required information and deadlines), and places where students could go to find more information about any of the above aspects (such as signposting to teachers and to the staff of Take Your Place, or providing links to relevant information web pages or videos hosted online).
Up to 14 text messages were sent to participants in the intervention condition. To recognise that participants may have applied to higher education prior to the deadline, and to avoid irrelevant information being sent to them, two of these text messages invited a response, containing a simple yes/no question regarding whether the individual student had already applied to higher education. For all those responding positively, the text messages stopped, and the individual participant’s outcome was recorded as having applied to higher education. Appendix A contains all the text messages that were sent to participants, excluding any links which are no longer available.
The timing of the delivery of the intervention was important, as it needed to align with the application window and relevant deadlines. It was administered starting the last week of October of each respective school year (2017–2018 for the first trial; 2018–2019 for the second trial) and ended in mid-January of the same school year (in the next calendar year), immediately after the application deadline, which regularly falls in the middle of January each year. As Appendix A indicates, the last text message was sent after the passing of the application deadline, signposting students to relevant information in terms of options available to them if they still wanted to apply to higher education for the relevant year.
Importantly, the text messages were personalised with the names of the individual participants, using a direct address (“Hi, [student name]!”). This followed evidence according to which personalisation was important in the provision of information in higher education outreach [12], and sought to create rapport with participants, which was hypothesised to increase the likelihood of action following the reception of the text messages.
A large team contributed to the development of the intervention, including staff of the neaco partnership and their institutional partners. The lead researcher was also involved in the set-up of the intervention through the provision of evidence in relation to various decisions (e.g., around personalisation).

3. Trial Design

To estimate the causal impact of the above nudging intervention on higher education application outcomes, two randomised control trials were implemented, one in each school year in which the intervention was delivered (2017–2018 and 2018–2019). Each trial underwent the ethical approval process at the Faculty of Education, University of Cambridge. The first of the two trials was jointly undertaken with researchers from the Behavioural Insights Team (BIT) and was registered by them (trial number 2017136). That team undertook a separate analysis of data pertaining to the first trial, was only briefly involved in the second trial, and did not undertake the full analysis of data as reported in this paper. We acknowledge their contributions to the first trial and thank them for their insights.
While the two trials were undertaken independently of each other, the testing of the intervention (materially the same across the two implementation years) allows us to pool the data across the two trials. This has implications for the power calculations (reported below), but we also explore the potentially different impacts of the intervention in each respective trial cohort in our later analysis. This is particularly relevant given the embedding of this intervention in the wider Take Your Place programme, which was at different stages of development in the two intervention years.

3.1. Outcome Measure

The outcome measure of interest is whether students applied to higher education via UCAS. This outcome measure was coded as binary, taking the value 1 if students had applied, and the value 0 if they had not. The outcome measure was collected through two procedures: first, from self-reported responses on whether participants had completed their application before the relevant deadline of the respective academic year; and second, with the help of on-the-ground staff, who obtained this information directly from the participants’ schools and further education colleges. While there is evidence that student responses to this type of question are highly predictive of actual application [29], the addition of the staff-provided data meant that the outcome measure could be collected for a high proportion of initial trial participants, contributing to very low attrition, as outlined later in this paper.

3.2. Trial Hypotheses

Each of the two trials operated under the same overall research hypothesis, according to which the text-based information-provision nudging intervention may encourage students in their final year of secondary education to apply to higher education via the standard UCAS route. We used a two-tailed test of the non-directional null hypothesis that the rate of application to higher education for students randomly allocated to receive the intervention was no different from that for students randomly allocated to the control condition.

3.3. Trial Characteristics

Both trials were individually randomised, balanced, two-arm (intervention and control) trials, run under an intention-to-treat approach. The intention-to-treat approach means that all participants randomly allocated to each of the trial conditions remained in that respective condition for the purpose of analysis (barring any missing data), regardless of their (unknowable) level of engagement with the intervention: that is, students randomly assigned to the intervention condition were considered as part of this intervention condition even if they did not engage with any of the text messages. It was impossible to monitor engagement with, and immediate actions as a result of, the text messages because the participants’ school and home lives were not monitored as part of these trials. They may have engaged in the behaviours suggested by the text messages immediately after receiving them, at a later point, or not at all; or they may have sought information or advice from their school or college. While clearly a limitation, this is consistent with the intention-to-treat approach (analysing data according to the initial allocation) and means we minimise the risk of over-stating our results.
We now outline the full experimental set-up and procedure. This applies to both trials.

3.3.1. Participant Recruitment

In the period under consideration for this study, the Take Your Place programme administered two large-scale surveys to students in schools and colleges participating in the wider programme. A separate section in each of these surveys invited final year students (those eligible to apply to higher education) to take part in the randomised control trial.
Detailed but simple information was provided to students as part of this recruitment process. Students were asked for fully informed consent to participate in a trial, with different students reached in the two consecutive years of the trials’ implementation.
The information provided to students during this recruitment process included the trial aims, the randomisation procedure (explained as a 50–50 chance of receiving the text messages if taking part in the trial), and information about what the intervention would entail. The participants who consented to taking part were then invited to provide their phone numbers for the purposes of the text messages delivery.
The inclusion/exclusion criterion for the presentation of the recruitment information related to the participants’ self-reported likelihood of applying to higher education. As part of the development of the text messages, it was decided that students expressing a very low likelihood of applying to higher education would not benefit from the text messages.
The students’ likelihood of applying to higher education was gauged during the survey with a stand-alone 6-point Likert response scale question asking them to rate the likelihood of application at age 18/19 (the relevant age for the vast majority of students in the participating schools and further education colleges). The students were also asked if they had already applied to higher education. Based on the above questions, two formal exclusion criteria were used: first, students who expressed that they had no intention to apply to higher education were excluded from the sample eligible to take part in the trial. Second, all students who indicated that they had already applied to higher education were also excluded.
For the first trial, the survey was undertaken between September and early November 2017. A total of just over 21,300 students responded, with just over 4000 final year students invited to take part in the trial. A total of 531 students signed up.
For the second trial, the survey was undertaken between September and late October 2018. A similar number of total respondents was reached, and a total of 439 students signed up to the second trial.
There are two potential implications of this recruitment process. First, the external validity of the trials may be relatively low as participating schools (in the overall Take Your Place programme, and therefore also in the trials) were selected based on specific characteristics of the areas wherein the students lived. The second implication is that we are only able to estimate the impact of digital nudging among students who were willing to receive text messages, with findings not necessarily generalisable to the wider population of Take Your Place students. While this latter issue is important, it is also unavoidable from the perspective of ethical conduct of research and of trials, with prospective participants only recruited into the trial on the basis of full informed consent. To address this concern, we explored responses to a series of relevant learner survey questions (the same survey used for recruitment purposes) including self-reported knowledge of (higher) education options, knowledge of specific education or employment options, and knowledge of where to seek information about such topics, comparing responses between trial participants and trial non-participants in the relevant year group. While this full analysis is beyond the scope of this paper and will be reported elsewhere, we found no statistically significant differences between these two groups on the above variables. This suggests that the self-selected trial participants were not, at least for these observed variables, meaningfully different to the non-participants. We return to issues of external validity when we discuss the results of the trials in relation to the intervention set-up as part of the wider outreach programme.

3.3.2. Randomisation Procedure

Randomisation occurred after the participants had signed up to each respective trial as per the procedure above, and it was carried out at the individual level. Randomisation was stratified by target student status (students living in areas where the rate of higher education progression was lower than expected given average attainment at age 16) and by student self-reported gender. This was done to ensure that any differences in higher education application likelihood by these two characteristics would not represent a bias in the trial.
Randomisation was carried out in statistical software (Stata) using a random number generator with a randomly chosen seed number, and it saw 50% of participants allocated to the intervention condition and 50% of participants allocated to the control condition, separately for each respective trial.
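As an illustration of this procedure, the sketch below shows one way to implement stratified 50/50 allocation. It is a minimal sketch only: the trials themselves used Stata, and the column names (gender, target) and function name are hypothetical stand-ins for the two stratification factors described above.

```python
# Minimal sketch of stratified 50/50 randomisation (illustrative only; the
# trials used Stata with a random number generator and a chosen seed).
# Column names "gender" and "target" are hypothetical.
import numpy as np
import pandas as pd

def randomise(students: pd.DataFrame, seed: int) -> pd.DataFrame:
    rng = np.random.default_rng(seed)
    out = students.copy()
    out["treat"] = 0
    # Within each gender x target-status stratum, allocate half to treatment.
    for _, row_labels in out.groupby(["gender", "target"]).groups.items():
        shuffled = rng.permutation(np.asarray(row_labels))
        out.loc[shuffled[: len(shuffled) // 2], "treat"] = 1
    return out
```

Permuting within each stratum before assignment mirrors the stated design: any systematic differences in application likelihood by gender or target status are balanced across arms by construction.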
This randomisation approach generated an intervention and a control condition in each trial. While we were not able to monitor participant compliance with allocation, the distribution of text messages was carefully monitored, and no contamination errors at the distribution point were noted. It remains possible, though not highly probable, that the individuals in the intervention condition may have shared text messages, or information therein, with control group counterparts. However, as outlined above, the intervention was designed so that the text messages would build upon each other and follow a progressive and time-specific pattern. Therefore, unless participants in the intervention condition had ‘leaked’ all the messages and information to participants in the control condition, the intervention would not have been able to be engaged with as designed.

3.3.3. Attrition after Randomisation

A total of 970 eligible participants were recruited into the two trials. Data on the outcome measure (outlined above) were not available for a small number of these participants (3%), with 940 of the 970 (97%) participants across both trials presenting full data for analysis.
For the first trial, 515 participants of the 531 initially recruited were retained in the analysis. Attrition was similar for the control and intervention arms of this trial, at 3% each. For the second trial, 425 participants were retained in the analysis from the recruited total of 439, again with a balanced attrition per arm, at 3% each.
While attrition is always a concern in trials, due to its implications for the internal validity of the analysis, at 3% the attrition rate for this trial is very low [33]. As such, we did not carry out any imputation checks; however, we did carry out a robustness test, as we detail in the Results section later in this paper.

3.3.4. Balance at Baseline

We examine whether our randomisation created balanced groups at baseline according to the observable characteristics of students. Table 1 below presents the descriptive statistics of the originally randomised sample and the magnitude of the differences between the intervention and control groups (column 3 in Table 1) for the pooled data and across both trials. The observed differences between the two groups are nearly equal to 0. We do not provide tests of statistical significance related to these mean comparisons because to do so would violate the logic of randomisation.
We then move on to empirically examine the balance across the intervention and control groups for the analytical sample (after attrition, as outlined above). In Table 2, we show balance across the intervention and control groups for the sample with non-missing outcome data.
For the sample of students for whom we have outcome data and non-missing information on all other covariates, we observe no imbalance between trial arms across gender and target status, suggesting that the randomisation was balanced on these observable characteristics. This applies both to the initial baseline and to the post-attrition analytical sample.

3.3.5. Power Calculations

As part of the set-up of the trials, power analyses were conducted to judge the feasibility of detecting an effect of the intervention, considering the likely response rate from the students. Given the lack of directly relevant evidence regarding the effect of such an intervention on university application rates at the time of the development of the trials, we calculated the sample sizes using a theorised minimum detectable effect size of 0.2. We assumed a conventional 80% statistical power (i.e., at least an 80% chance of detecting the main effect), and we also assumed that we could explain approximately 50% of the variance in the main outcome with the baseline variables we included, namely demographic characteristics (including ‘target’ student status and gender). The power calculation assumed a two-tailed test: although the hypotheses are directional, it is important to be able to statistically detect a negative effect. These parameters resulted in a required sample of 395 participants, half in the control group and half in the intervention group. Were we not to meet this sample size requirement, a sample of 300 would yield a minimum detectable effect size of 0.229, and a sample of 200 one of 0.282, holding all other assumptions constant. All power calculations were performed in PowerUp! [34].
At the recruitment stage, keeping all other parameters the same as above, the achieved sample yielded a minimum detectable effect size of 0.172 for the first trial and a minimum detectable effect size of 0.190 for the second trial. When pooled, the minimum detectable effect size was 0.127, which is very good for education trials in England, many of which are (under-) powered for a 0.2 effect size [33].
At the analysis stage, we re-calculated the minimum detectable effect sizes. We used the same parameters as above, but instead of estimating the proportion explained variance from the covariates, we obtained this from a simple analysis, which put it at 13%. Together with the slight reduction in sample size, the at-analysis minimum detectable effect size was 0.231 for the first trial, 0.254 for the second trial, and 0.171 for the pooled sample.
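For transparency, the sketch below reproduces these minimum detectable effect sizes using the standard formula for individual-level random assignment implemented in PowerUp!. It is an illustrative re-computation under the stated assumptions (the function name and the choice of k = 2 baseline covariates are ours), not the authors’ original calculation.

```python
# Illustrative re-computation of the MDES figures quoted above, using the
# individual-RCT formula implemented in PowerUp!:
#   MDES = M_(n-k-1) * sqrt((1 - R^2) / (P * (1 - P) * n)),
# where M is the sum of the t-quantiles for alpha/2 and power, P is the
# proportion treated, and k = 2 baseline covariates is our assumption.
from scipy.stats import t

def mdes(n, r2, p=0.5, alpha=0.05, power=0.80, k=2):
    df = n - k - 1                                   # residual degrees of freedom
    m = t.ppf(1 - alpha / 2, df) + t.ppf(power, df)  # two-tailed multiplier
    return m * ((1 - r2) / (p * (1 - p) * n)) ** 0.5

print(round(mdes(395, 0.50), 3))  # ~0.200 (design stage, R^2 = 0.50)
print(round(mdes(531, 0.50), 3))  # ~0.172 (first trial, at recruitment)
print(round(mdes(439, 0.50), 3))  # ~0.190 (second trial, at recruitment)
print(round(mdes(940, 0.13), 3))  # ~0.171 (pooled analytical sample, R^2 = 0.13)
```

Under these assumptions, the computed values match the figures reported above at both the recruitment and analysis stages.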

3.4. Analytical Strategy

To estimate the causal effect of the information-provision nudging intervention on student outcomes, we compared post-intervention higher education applications by trial condition (intervention status) using the following OLS regression model for student i, in institution s, in year t:
$$Y^{Post}_{i,s,t} = \alpha + \beta_0 Y^{Pre}_{i,s,t} + \beta\, Treat_{i,s,t} + \delta X_i' + \eta_s + \tau_t + \epsilon_{i,s,t}, \qquad Y^{Pre}_{i,s,t} \geq 0$$
where:
  • $Y^{Post}_{i,s,t}$ is a post-intervention binary measure of higher education application;
  • $Y^{Pre}_{i,s,t}$ is a pre-intervention self-reported measure of intentions to apply to higher education;
  • $Treat_{i,s,t}$ is a binary variable indicating whether the student was in the intervention or control group (0 = control; 1 = intervention);
  • $X_i'$ is a vector of individual characteristics (the stratification factors) at baseline;
  • $\eta_s$ are the institution fixed effects;
  • $\tau_t$ is a dummy indicator of academic year (2017–2018 or 2018–2019);
  • $\epsilon_{i,s,t}$ is the error term; standard errors are robust and clustered at the institution level.
It is important to note that participation in the trials was limited to students who expressed at least a mild intention to apply to higher education, that is, when $Y^{Pre}_{i,s,t} \geq 0$. To account for the fact that the wider outreach programme is an institution-level intervention and there is a clustering of students within institutions, we include institution fixed effects and cluster all reported standard errors at the institution level. The coefficient of interest is $\beta$, which shows the impact of the individual-level random assignment to the nudging intervention on the probability of having applied to higher education before the deadline.
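A minimal sketch of this estimating equation in Python with statsmodels is shown below; the column names (applied, pre_intent, treat, female, target, school, year) are hypothetical stand-ins for the variables defined above, and the original analysis may well have been run in other software.

```python
# Minimal sketch of the main specification: a linear probability model with
# institution fixed effects and standard errors clustered by institution.
# All column names are hypothetical stand-ins for the variables defined above.
import pandas as pd
import statsmodels.formula.api as smf

def estimate(df: pd.DataFrame):
    model = smf.ols(
        "applied ~ pre_intent + treat + female + target + C(school) + C(year)",
        data=df,
    )
    # Cluster-robust standard errors at the institution level.
    return model.fit(cov_type="cluster", cov_kwds={"groups": df["school"]})

# Example use: res = estimate(df); print(res.params["treat"], res.bse["treat"])
```

The C(school) terms absorb institution-level differences in application rates, so the coefficient on treat reflects within-institution variation induced by the individual-level randomisation.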

4. Results

First, we present the descriptive results for the outcome measure and the baseline measure of interest. Table 3 shows the rate of higher education applications for the intervention and control conditions for both trials, separately and pooled. In terms of the outcome measure of applications to higher education, and pooled across the two trials, 60% of the participants in the intervention group applied, compared to 59% of the control group. Additionally, for the pooled sample, the baseline intentions to apply (captured on a 6-point scale and used to recruit participants in the trials, with only those with at least a slight intention to apply to higher education being eligible) were also very similar across the two arms.
We observed a very similar pattern when looking at the disaggregated data for the two trials, with the proportions of students applying to higher education in each of the respective intervention and control conditions across the two trials being very similar to each other.
In relation to the baseline intentions to apply (also in Table 3), these were fairly high across the board, and balanced by the intervention and control conditions. This mirrors evidence from the national sample of students engaged in Uni Connect: in the national evaluator’s analysis relating to the relevant stage of the programme (by 2019), only 11% of learners reported that they were unlikely to apply to higher education [35].
We then applied the analytical strategy as outlined above. The results of the application of this strategy to the pooled trial data indicate that there is a very small but not statistically significant effect of the nudging intervention on higher education applications of students within schools and colleges participating in the wider outreach programme under consideration here.
Table 4 presents the estimates of the impact of the intervention on higher education applications. These results refer to the pooled sample of students participating in the two trials, presented in a sequential manner. In the first column (1), we show the raw effect of the nudging intervention. In column two (2), we then add controls for individual-level characteristics. Finally, we add school-fixed effects for the results presented in column three (3).
This third column represents the analysis as specified above and offers the main trial results. Table 5 and Table 6 illustrate the results of the same analysis separately for the two trials. The estimated intervention effect is positive, yet very small and statistically insignificant, with an almost identical figure across all specifications (pooled, and separately for both trials, as seen in Table 4, Table 5 and Table 6). Target student status remains a statistically non-significant explanatory variable for higher education applications across all specifications of both the separate and the pooled analysis. For the first trial and for the pooled model, gender is statistically significantly (and positively) associated with higher education applications (holding intervention and target student status constant), but only in the analytical specification without institution-fixed effects (column (2) in Table 4 and Table 5 below). This is likely a result of school/college-based variation in the overall rate of higher education application by gender, which the institution-fixed effects capture (column (3)).
The outcome of the trial is therefore clear and consistent, showing no effect of the text messaging intervention on higher education applications.

4.1. Robustness Checks

We undertake two robustness checks to investigate how sensitive our estimates are to different specifications. First, we consider whether selective attrition between the treated and the control group students may bias our results, despite the fact that we observe very little variation in the overall rate of attrition by trial arm across the two trials. In this first robustness check, we tested whether our results were similar when we replaced missing observations by assuming that all students with missing data in the intervention group applied to higher education, and that all students with missing data in the control group did not. This analysis allowed us to examine whether, had we collected data for the full randomised sample, we might have observed a significant effect of the intervention under the most optimistic assumptions about the missing data. The results are reported in Table 7 below, indicating that even under our most optimistic assumptions about missing data, we do not see an effect of the intervention on higher education applications.
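A minimal sketch of this bounding exercise, continuing with the hypothetical column names used in the earlier sketches, might be:

```python
# Sketch of the 'most optimistic' attrition check described above: missing
# outcomes are set to 1 (applied) in the intervention arm and to 0 (did not
# apply) in the control arm, and the main model is then re-estimated.
import pandas as pd

def optimistic_impute(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    missing = out["applied"].isna()
    out.loc[missing & (out["treat"] == 1), "applied"] = 1
    out.loc[missing & (out["treat"] == 0), "applied"] = 0
    return out

# Example use, reusing estimate() from the earlier sketch, on the full
# randomised sample: res_robust = estimate(optimistic_impute(df_full))
```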
For the second robustness check, we repeated our main estimations using conditional logistic regression to account for the dichotomous nature of our dependent variable (instead of the linear probability model used in the main analytical specification above). Table 8 presents the marginal effects from this analysis. Inevitably, the conditional logistic regression in column three (3), that is, when school-fixed effects are included in the estimation, results in a reduction in the sample size, because observations are dropped where no variation in higher education applications was observed within schools. Even with that caveat, which further supports our choice of the OLS specification, we find no difference from the results generated by our main analysis above.
As a result, the main findings of the trial remain unchanged, either under the alternative analytical specification, or when testing a best-case scenario attrition situation as in the first robustness check. This increases the confidence in our results.

4.2. Effect Heterogeneity

Finally, we explore whether the effect of the intervention may have been different for either of the two stratification factors, gender and target student status. In Table 9, we show the results from the application of the main analytical strategy to disaggregated samples: target and non-target students and, respectively, girls and boys, all using the pooled data.
The results suggest no evidence of heterogeneous effects of the intervention on higher education application rates across the two groupings (target/non-target student status, gender) under consideration in our experimental study. Taken together, these results support the robustness of our analysis and its precise null results.
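As an illustration, this subgroup analysis can be sketched as below, again with the hypothetical column names from the earlier sketches; splitting the sample on one stratifier and retaining the other as a control is our assumption about the set-up.

```python
# Sketch of the heterogeneity analysis: the main specification re-estimated
# separately within each subgroup, omitting the stratifier that defines the
# split and keeping the other as a control.
import statsmodels.formula.api as smf

def subgroup_effects(df):
    subgroups = {
        "target":     (df["target"] == 1, "female"),
        "non-target": (df["target"] == 0, "female"),
        "girls":      (df["female"] == 1, "target"),
        "boys":       (df["female"] == 0, "target"),
    }
    for name, (mask, control) in subgroups.items():
        sub = df[mask]
        res = smf.ols(
            f"applied ~ pre_intent + treat + {control} + C(school) + C(year)",
            data=sub,
        ).fit(cov_type="cluster", cov_kwds={"groups": sub["school"]})
        print(name, round(res.params["treat"], 3), round(res.bse["treat"], 3))
```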

5. Discussion and Conclusions

In this paper, we have reported the results from two randomised control trials testing the effects of a light-touch behavioural nudging information-provision intervention on higher education applications in the English context. Given the existing evidence [13] on the use of behavioural nudging in the context of providing relevant educational information to (prospective) students, we hypothesised that the intervention, designed to work alongside the Universities and Colleges Admissions Service (UCAS) higher education application process in England, may encourage students who had at baseline expressed at least a mild intention to apply to higher education to realise this intention and apply to higher education.
We implemented two randomised control trials of the same intervention in consecutive school years, using individual-level random allocation to one of two experimental conditions in each trial: an intervention condition, receiving the intervention, and a control condition, not receiving the intervention.
The intervention was delivered as part of a wider programme of outreach and widening participation in the East of England region, which saw schools and further education colleges deliver, via staff employed by the programme, a wide-ranging set of outreach activities. From an ethical perspective, this means that students in the control group were not unfairly treated in relation to their opportunities to participate in potentially impactful outreach activities. From the perspective of the trials we have implemented, however, it means that what we were able to estimate amounts to the additional effect of the nudging intervention. In that sense, the randomisation procedure, which as outlined above resulted in balanced samples, may in principle also have ensured a balanced distribution of potential participation in these in-school activities; however, the business-as-usual condition of both experimental arms may include a substantial amount of outreach intervention. While this represents a clear limitation of the trial, it also reflects the only possible real-world scenario for the delivery and testing of an outreach intervention: the English policy and activity landscape around outreach that we have outlined above means that many schools and colleges routinely host many and diverse outreach and widening participation activities. Testing the nudging intervention in this context is a way to increase translational validity, even if it may work to minimise the effect size of the intervention. Further research and evaluation around the Take Your Place programme will explore how variation by school/college, as well as by individual student, shapes later higher education outcomes for programme participants, and will look to understand the changes to self-reported knowledge, expectations, and intentions around higher education that may have occurred due to participation (in various amounts) in Take Your Place.
As such, our main trial result of positive, very small, but statistically non-significant estimates, essentially null results, is not necessarily surprising. This finding was robust both to the choice of statistical specification and to tests for the impact of (the very small) trial attrition, and it was consistent across each of the two trials separately as well as for the pooled data. The trial protocol was robustly implemented, attrition was low, and the statistical power of the trial was good compared to other educational trials, which offers confidence that the null result is indeed a valid picture of the impact of this intervention, as delivered in the context of the wider outreach programme.
Although embedding the nudging intervention within an existing widening participation programme allowed for robust data collection, a high response rate, and a low attrition rate, nesting the intervention within the larger programme may also explain the lack of significant results.
This finding is particularly relevant given prior evidence [7] around so-called ‘black box’ interventions, where a variety of potential mechanisms for change may be at play at any one time, making it difficult to disentangle them. In that sense, our experimental study provides specific robust evidence regarding the impact (or rather, lack thereof) of a particular aspect of the wider outreach programme being delivered in the East of England region.
Moreover, our findings align with recent evidence that challenges the hypothesis that nudging may result in large effects [11] and offer further support to suggestions [36] that intensive guidance might be needed to change higher education application and enrolment behaviours. This is precisely what the wider programme, the focus of a larger-scale quasi-experimental evaluation currently underway, may have provided to some of the students participating in these two trials, potentially minimising the likely effect of the nudging intervention.
We are unable to provide definitive evidence regarding the interplay between this intervention and the wider programme in terms of their potential impact on higher education applications. However, the fact that each trial concluded with the same result, while being run at different stages of the wider programme (in 2017–2018 in its first full year of implementation and therefore at an incipient stage; in 2018–2019 already embedded), may suggest that the level of other activity happening in the participating schools and colleges was not, overall, a factor affecting the potential effectiveness of the intervention. Our future research relating to Take Your Place will be able to explore this variation by school and college in greater depth.
We acknowledge two further limitations of our study, particularly in relation to issues of internal and external validity. First, the intention-to-treat approach to both trials meant that we did not consider whether students had actually read, engaged with, and acted upon the information provided to them via the text messages they had received. We were also not able to measure any ‘leakage’ or contamination from the intervention to the control group. While it is possible that students in the intervention group may have communicated with those in the control group (thereby minimising any intervention effect we may have been able to detect), this would have also meant that the recipients of the intervention had given the information at least some minimal thought and that the information would have prompted action, potentially cancelling out these two aspects. Future trials could make use of existing technology to measure actual levels of engagement (e.g., link clicks) with the information provided by the intervention. Future trials could also explore alternative forms of communication for this type of intervention, with social media currently being a powerful vehicle for the communication of relevant information amongst young people.
From an external validity perspective, the recruitment into the trial of students who had expressed a non-negative (at least ‘slightly likely’) intention to apply to higher education means that the results are not immediately generalisable to the wider population of higher education-ready students in England. This is a common challenge of trials in education [33], but one that future trials may address by using rich administrative data present in England alongside national outreach and widening participation programmes which may offer the opportunity to generate representative samples and therefore more readily generalisable evidence.
The above limitations notwithstanding, the evidence we have generated with our experimental study is relevant for local policy-making purposes, including within the wider outreach programme within which the intervention was initially embedded. The null results informed the decision of the implementation team leading the outreach programme not to continue the intervention’s deployment and to focus instead on intensive in-school outreach activity.
We also contend that the understanding of the intervention is useful for wider policy-making purposes, especially in a context of limited resources but continuing efforts to improve equity and fairness in higher education applications, access, and participation.

Author Contributions

Conceptualization (trial design): S.I. and E.K.; Formal analysis: All authors. Writing—original: S.I., K.M. and A.B.; Writing-revision: S.I. and K.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Network for East Anglian Collaborative Outreach (neaco), which is funded by the Office for Students in England.

Institutional Review Board Statement

The studies received ethical approval from the Faculty of Education, University of Cambridge.

Informed Consent Statement

Informed consent was obtained from all participants involved in the study.

Data Availability Statement

The data presented in this study cannot be shared and are not publicly available due to ethical considerations and the privacy and data protection restrictions governing their use.

Acknowledgments

This paper includes analysis from two randomised control trials conducted as part of a wider evaluation of the higher education outreach programme run by the Network for East Anglian Collaborative Outreach (neaco) in the East of England region. We thank members of the neaco team. We also thank members of the Behavioural Insights Team who supported the initial development of the first trial and provided separate analysis for that trial.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Content of the information provided via text messages to participants randomly allocated to the intervention condition of each respective trial. Dates are indicative of when the text messages were sent and were consistent across the two trials.
Table A1. Text messages sent to intervention group participants (links removed from the below, as they differed slightly between the 2017–2018 and 2018–2019 iterations of the trial, as a result of updates to the websites to which participants were directed).
Date | Text in the Text Message
(0) Introduction, 31 October–2 November | Hi [student name], you signed up for the ‘Take your place’ project at [school/college name] and you have been selected! We will send you around 1 text each week with tips about what you can do next year. Reply STOP if you don’t want to receive this extra support. Thanks!
(1) 31 October–2 November | Hi [student name], have you thought about applying to university? Now is a good time to research your options; speak to your teacher about how to apply! University can boost your career and you’ll meet loads of new people—find out more here: [link]
(2) 7 November | Hi [student name], do you know what courses you can study at uni? There are so many options—make a plan to research 3 possible courses this week. Here is a useful link to get you started: [link]
(3) 14 November | Hi [student name], are you worried about the cost of going to uni? Help is at hand! Check out this video for info on the support available: [link]
(4) 21 November | Hi [student name], it’s great that you’re thinking more about your future! If you haven’t already, set aside some time this week to register online to apply via UCAS: [link]
(5) 21 November (week of) | Hi [student name], we hope these messages are helpful. If you’ve already applied to higher education, reply YES.
(6) 28 November | Hi [student], have you found a course you want to apply for, and 5 different universities/colleges where it’s offered? Why not make a list of what you need to do next—stick it on your fridge so you don’t forget! If you are still choosing your courses there is more advice here: [link]
(7) 5 December | Hi [student name], now is a good time to write your personal statement. Remember to get straight into why you’re interested in the subject, and really focused on the course—you can find top tips for different subjects at [link].
(8) 12 December | Hi [student name], just 4 weeks until applications close. Re-read your personal statement and ask yourself: did you begin with your strongest reason for wanting to study your subject of choice? Check these 14 common mistakes to avoid: [link]
(9) 19 December | Hello [student name] it’s nearly time for Christmas! You’ve done some great work on your personal statement so far, well done. Remember to keep any description of extra-curricular activities short and explain what skill you gained from each. You should limit this part of your personal statement to one paragraph at the end.
(10) 2 January (after) | Happy New Year [student name]! Not long before applications close. If you’ve already applied to higher education, reply YES.
(11) 2 January | Hi [student name], if you’re in the middle of filling out your UCAS application, check out their handy step-by-step guide [link] This week, make a plan to sit down and check you have ticked all the boxes!
(12) 9 January | Next week is the final deadline for your UCAS application. Make sure you have 5 choices in total and that you submit your application before 6 pm on the 15 of January. You can find useful guidance on filling out your application here: [link]
(13) 14 January—go out in the morning | Hello [student name], tomorrow at 6 pm is the UCAS 2018 deadline to apply to higher education. If you have any questions about your application, ask your teachers tomorrow!
(14) 16 January | Did you miss the application deadline? If you’re not yet sure that higher education is the route for you, don’t worry, you can still apply through Clearing later in the year. Contact the university or college directly and ask for advice.

References

  1. Chowdry, H.; Crawford, C.; Dearden, L.; Goodman, A.; Vignoles, A. Widening participation in higher education: Analysis using linked administrative data. J. R. Stat. Soc. Ser. A (Stat. Soc.) 2013, 176, 431–457.
  2. Campbell, S.; Macmillan, L.; Murphy, R.; Wyness, G. Matching in the dark? Inequalities in student to degree match. J. Labor Econ. 2022, 40, 807–850.
  3. Dynarski, S.; Wiederspan, M. Student aid simplification: Looking back and looking ahead. Natl. Tax J. 2012, 65, 211–234.
  4. Budd, R. Undergraduate orientations towards higher education in Germany and England: Problematizing the notion of ‘student as customer’. High. Educ. 2017, 73, 23–37.
  5. Sanders, M.; Chande, R.; Selley, E.; Behavioural Insights Team. Encouraging People into University; Department for Education: London, UK, 2017. Available online: https://www.bl.uk/britishlibrary/~/media/bl/global/social-welfare/pdfs/non-secure/d/f/e/dfesc-encouraging-people-into-university-2017.pdf (accessed on 11 July 2022).
  6. Damgaard, M.T.; Nielsen, H.S. Nudging in education. Econ. Educ. Rev. 2018, 64, 313–342.
  7. Younger, K.; Gascoine, L.; Menzies, V.; Torgerson, C. A systematic review of evidence on the effectiveness of interventions and strategies for widening participation in higher education. J. Furth. High. Educ. 2019, 43, 742–773.
  8. Herbaut, E.; Geven, K. What works to reduce inequalities in higher education? A systematic review of the (quasi-)experimental literature on outreach and financial aid. Res. Soc. Stratif. Mobil. 2020, 65, 100442.
  9. Castleman, B.L.; Page, L.C. Summer nudging: Can personalized text messages and peer mentor outreach increase college going among low-income high school graduates? J. Econ. Behav. Organ. 2015, 115, 144–160.
  10. Castleman, B.L.; Page, L.C. Parental influences on postsecondary decision making: Evidence from a text messaging experiment. Educ. Eval. Policy Anal. 2017, 39, 361–377.
  11. Szaszi, B.; Higney, A.; Charlton, A.; Gelman, A.; Ziano, I.; Aczel, B.; Goldstein, D.G.; Yeager, D.S.; Tipton, E. No reason to expect large and consistent effects of nudge interventions. Proc. Natl. Acad. Sci. USA 2022, 119, e2200732119.
  12. Sanders, M.; Burgess, S.; Chande, R.; Dilnot, C.; Kozman, E.; Macmillan, L. Role models, mentoring and university applications – evidence from a crossover randomised controlled trial in the United Kingdom. Widening Particip. Lifelong Learn. 2018, 20, 57–80.
  13. Bergman, P. Nudging technology use: Descriptive and experimental evidence from school information systems. Educ. Financ. Policy 2020, 15, 623–647.
  14. Bettinger, E.P.; Long, B.T.; Oreopoulos, P.; Sanbonmatsu, L. The role of application assistance and information in college decisions: Results from the H&R Block FAFSA experiment. Q. J. Econ. 2012, 127, 1205–1242.
  15. Gurantz, O.; Pender, M.; Mabel, Z.; Larson, C.; Bettinger, E. Virtual advising for high-achieving high school students. Econ. Educ. Rev. 2020, 75, 101974.
  16. Phillips, M.; Reber, S. When “Low Touch” is Not Enough: Evidence from a Random Assignment College Access Field Experiment; UCLA CCPR Population Working Papers, 2018. Available online: http://papers.ccpr.ucla.edu/index.php/pwp/article/view/1213/596 (accessed on 6 September 2022).
  17. Carrell, S.; Sacerdote, B. Why do college-going interventions work? Am. Econ. J. Appl. Econ. 2017, 9, 124–151.
  18. Avery, C.; Castleman, B.L.; Hurwitz, M.; Long, B.T.; Page, L.C. Digital messaging to improve college enrolment and success. Econ. Educ. Rev. 2021, 84, 102170.
  19. Peter, F.H.; Zambre, V. Intended college enrolment and educational inequality: Do students lack information? Econ. Educ. Rev. 2017, 60, 125–141.
  20. Mertens, S.; Herberz, M.; Hahnel, U.J.; Brosch, T. The effectiveness of nudging: A meta-analysis of choice architecture interventions across behavioral domains. Proc. Natl. Acad. Sci. USA 2022, 119, e2107346118.
  21. Hoxby, C.M.; Avery, C. The Missing “One-Offs”: The Hidden Supply of High-Achieving, Low-Income Students (No. w18586); National Bureau of Economic Research: Cambridge, MA, USA, 2012. Available online: https://www.nber.org/system/files/working_papers/w18586/w18586.pdf (accessed on 6 September 2022).
  22. Callender, C.; Jackson, J. Does the fear of debt constrain choice of university and subject of study? Stud. High. Educ. 2008, 33, 405–429.
  23. Gibbons, S.; Vignoles, A. Geography, choice and participation in higher education in England. Reg. Sci. Urban Econ. 2012, 42, 98–113.
  24. Boliver, V.; Gorard, S.; Siddiqui, N. Using contextual data to widen access to higher education. Perspect. Policy Pract. High. Educ. 2021, 25, 7–13.
  25. Adamecz-Völgyi, A.; Henderson, M.; Shure, N. Is ‘first in family’ a good indicator for widening university participation? Econ. Educ. Rev. 2020, 78, 102038.
  26. Anders, J. The influence of socioeconomic status on changes in young people’s expectations of applying to university. Oxf. Rev. Educ. 2017, 43, 381–401.
  27. Donald, W.E.; Ashleigh, M.J.; Baruch, Y. Students’ perceptions of education and employability: Facilitating career transition from higher education into the labor market. Career Dev. Int. 2018, 23, 513–540.
  28. Harrison, N.; Waller, R. Challenging discourses of aspiration: The role of expectations and attainment in access to higher education. Br. Educ. Res. J. 2018, 44, 914–938.
  29. Anders, J.; Micklewright, J. Teenagers’ expectations of applying to university: How do they change? Educ. Sci. 2015, 5, 281–305.
  30. Harding, S.; Bowes, L. Fourth Independent Review of Impact Evaluation Evidence Submitted by Uni Connect Partnerships; 2022. Available online: https://www.officeforstudents.org.uk/media/c304f005-89a1-4a5b-9468-b98eb7475ad4/cfe-review-of-impact-evidence-from-uni-connect-partnerships.pdf (accessed on 13 June 2022).
  31. Office for Students. Uni Connect National Evaluation; Research Report OfS 2022.26; 2022. Available online: https://www.officeforstudents.org.uk/media/ebdc4bcd-148d-4d96-be5d-22a7d8660c51/uni-connect-evaluation-report-finalforweb.pdf (accessed on 13 June 2022).
  32. Network for East Anglian Collaborative Outreach. Take Your Place Annual Report 2021. Available online: https://www.takeyourplace.ac.uk/media/1369/neaco-annual-report-2021.pdf (accessed on 8 September 2022).
  33. Connolly, P.; Keenan, C.; Urbanska, K. The trials of evidence-based practice in education: A systematic review of randomised controlled trials in education research 1980–2016. Educ. Res. 2018, 60, 276–291.
  34. Dong, N.; Maynard, R. PowerUp!: A tool for calculating minimum detectable effect sizes and minimum required sample sizes for experimental and quasi-experimental design studies. J. Res. Educ. Eff. 2013, 6, 24–67.
  35. Bowes, L.; Tazzyman, S.; Steer, R.; Birkin, G.; Telhaj, S. An Independent Evaluation of Uni Connect’s Impact on Intermediate Outcomes for Learners; Office for Students Report, 2021. Available online: https://www.officeforstudents.org.uk/media/931324a7-ef78-442d-bfc5-9d3c6bb42062/uc_wave-2-survey-findings_final_for_web.pdf (accessed on 9 September 2022).
  36. Oreopoulos, P.; Petronijevic, U. The Remarkable Unresponsiveness of College Students to Nudging and What We Can Learn from It (No. w26059); National Bureau of Economic Research: Cambridge, MA, USA, 2019. Available online: https://www.nber.org/system/files/working_papers/w26059/w26059.pdf (accessed on 9 September 2022).
Table 1. Descriptive statistics and mean differences across intervention and control groups, stratification variables at baseline, and pooled data.

                              Intervention (1)         Control (2)              Difference (1)–(2)
                              Mean (sd)      N         Mean (sd)      N         Mean (se)
Girl                          0.58 (0.49)    484       0.58 (0.49)    486       0.00 (0.03)
Target student                0.33 (0.47)    484       0.33 (0.47)    486       0.00 (0.03)
Total N                       970
Note: The numbers presented in columns (1) and (2) are the mean values for the intervention and control groups. The numbers in parentheses are the standard deviations. The last column presents the mean differences (and the standard errors in parentheses) between the intervention and control groups for each variable.
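For readers who wish to reproduce this type of baseline balance check, a minimal sketch follows. It is an illustration only, assuming a hypothetical pooled dataset (pooled_trials.csv) with illustrative column names (intervention, girl, target_student); these are not the authors' actual data or code.

```python
# Minimal sketch of a baseline balance check in the spirit of Table 1.
# The file and column names are hypothetical placeholders.
import numpy as np
import pandas as pd

df = pd.read_csv("pooled_trials.csv")  # hypothetical pooled trial data

for var in ["girl", "target_student"]:
    treat = df.loc[df["intervention"] == 1, var].dropna()
    ctrl = df.loc[df["intervention"] == 0, var].dropna()
    diff = treat.mean() - ctrl.mean()
    # Standard error of the difference in means (unequal variances)
    se = np.sqrt(treat.var(ddof=1) / len(treat) + ctrl.var(ddof=1) / len(ctrl))
    print(f"{var}: intervention {treat.mean():.2f}, control {ctrl.mean():.2f}, "
          f"difference {diff:.2f} (se {se:.2f})")
```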
Table 2. Descriptive statistics and mean differences across intervention and control groups, stratification variables at analysis stage, and pooled data.

                              Intervention (1)         Control (2)              Difference (1)–(2)
                              Mean (sd)      N         Mean (sd)      N         Mean (se)
Girl                          0.59 (0.49)    468       0.58 (0.49)    472       0.01 (0.04)
Target student                0.32 (0.47)    468       0.33 (0.47)    472       0.01 (0.04)
Total N                       940
Note: The numbers presented in columns (1) and (2) are the mean values for the intervention and control groups. The numbers in parentheses are the standard deviations. The last column presents the mean differences (and the standard errors in parentheses) between the intervention and control groups for each variable.
Table 3. Descriptive statistics for the higher education application outcome and baseline intentions to apply to higher education; pooled sample.

                                   Intervention (1)         Control (2)
                                   Mean (sd)      N         Mean (sd)      N
Pooled data
Applied to HE                      0.60 (0.49)    468       0.59 (0.49)    472
Baseline intentions to apply       5.84 (1.77)    468       5.91 (1.81)    472
Trial 1
Applied to HE                      0.61 (0.49)    258       0.59 (0.49)    257
Baseline intentions to apply       5.34 (1.53)    258       5.24 (1.66)    257
Trial 2
Applied to HE                      0.59 (0.49)    210       0.58 (0.49)    215
Baseline intentions to apply       6.47 (1.84)    210       6.70 (1.69)    215
Note: The numbers presented in columns (1) and (2) are the mean values for the intervention and control groups. The numbers in parentheses are the standard deviations.
Table 4. The impact of the text message nudging intervention on higher education application outcomes.

HE Application                     (1)                 (2)                 (3)
                                   Coefficient (se)    Coefficient (se)    Coefficient (se)
Intervention                       0.01 (0.04)         0.01 (0.03)         0.01 (0.03)
Girl                                                   0.11 ** (0.04)      0.08 (0.05)
Target student                                         0.02 (0.03)         −0.02 (0.03)
Constant                           0.62 *** (0.04)     0.55 *** (0.09)     0.27 (0.23)
N                                  940                 940                 940
Number of clusters (schools)       57                  57                  57
Academic year
Institution-fixed effects
Notes: Pooled sample from trials 1 and 2. Standard errors clustered at institution level and reported in parentheses. Significance levels: * p < 0.10, ** p < 0.05, *** p < 0.01.
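The estimates in Tables 4–6 correspond to a linear probability model of the application outcome on treatment assignment, with the stratification covariates added in columns (2) and (3) and standard errors clustered at the institution level. The sketch below illustrates such a specification in statsmodels, under the same hypothetical file and column names as the earlier sketch; the authors' own estimation code and software may differ.

```python
# Sketch of a linear probability model with institution-clustered
# standard errors, in the spirit of Table 4, column (2).
# File and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pooled_trials.csv")  # hypothetical pooled trial data
df = df.dropna(subset=["applied_he", "school_id"])  # complete cases only

lpm = smf.ols(
    "applied_he ~ intervention + girl + target_student", data=df
).fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
print(lpm.summary())
```

Column (3)'s institution fixed effects would correspond to adding C(school_id) to the formula above.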
Table 5. The impact of the text message nudging intervention on higher education application outcomes, first trial (2017–2018).

HE Application                     (1)                 (2)                 (3)
                                   Coefficient (se)    Coefficient (se)    Coefficient (se)
Intervention                       0.02 (0.05)         0.02 (0.04)         0.02 (0.05)
Girl                                                   0.16 ** (0.06)      0.08 (0.07)
Target student                                         0.02 (0.05)         −0.00 (0.05)
Constant                           0.59 *** (0.05)     0.50 *** (0.06)     0.40 *** (0.04)
N                                  515                 515                 515
Number of clusters (schools)       57                  57                  57
Institution-fixed effects
Notes: Sample from trial 1 only. Standard errors clustered at institution level and reported in parentheses. Significance levels: * p < 0.10, ** p < 0.05, *** p < 0.01.
Table 6. The impact of the text message nudging intervention on higher education application outcomes, second trial (2018–2019).

HE Application                     (1)                 (2)                 (3)
                                   Coefficient (se)    Coefficient (se)    Coefficient (se)
Intervention                       0.01 (0.05)         0.01 (0.05)         0.02 (0.06)
Girl                                                   0.06 (0.07)         0.04 (0.07)
Target student                                         0.01 (0.05)         −0.04 (0.05)
Constant                           0.58 *** (0.07)     0.54 *** (0.08)     0.47 *** (0.06)
N                                  425                 425                 425
Number of clusters (schools)       29                  29                  29
Institution-fixed effects
Notes: Sample from trial 2 only. Standard errors clustered at institution level and reported in parentheses. Significance levels: * p < 0.10, ** p < 0.05, *** p < 0.01.
Table 7. Robustness check for main trial result: ‘most optimistic scenario’ of all missing observations in intervention group applying to HE and all missing observations in the control group not applying to HE; pooled sample.

HE Application                     (1)                 (2)                 (3)
                                   Coefficient (se)    Coefficient (se)    Coefficient (se)
Intervention                       0.04 (0.03)         0.04 (0.03)         0.04 (0.03)
Girl                                                   0.11 ** (0.04)      0.07 (0.05)
Target student                                         0.02 (0.03)         −0.01 (0.03)
Constant                           0.59 *** (0.09)     0.53 *** (0.09)     0.43 *** (0.13)
N                                  970                 970                 970
Number of clusters (schools)       58                  58                  58
Institution-fixed effects
Notes: Pooled sample from trials 1 and 2. Standard errors clustered at institution level and reported in parentheses. Significance levels: * p < 0.10, ** p < 0.05, *** p < 0.01. Missing dummy included for students with no data on attended institution.
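The ‘most optimistic scenario’ in Table 7 is a bounding exercise: students with missing application outcomes are assumed to have applied if assigned to the intervention and not to have applied if assigned to control, and the model is then re-estimated on the full randomised sample. A sketch of the imputation step, under the same hypothetical names as above, is:

```python
# Sketch of the 'most optimistic scenario' imputation behind Table 7:
# missing outcomes become 1 in the intervention group and 0 in control.
# File and column names are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("pooled_trials.csv")  # hypothetical pooled trial data

df["applied_he_bound"] = df["applied_he"].where(
    df["applied_he"].notna(),               # keep observed outcomes
    (df["intervention"] == 1).astype(int),  # optimistic fill-in if missing
)
```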
Table 8. Robustness check for main trial result: marginal effects (conditional) from logistic regression; pooled sample.

HE Application                     (1)                 (2)                 (3)
                                   Marg. Eff. (se)     Marg. Eff. (se)     Marg. Eff. (se)
Intervention                       0.06 (0.15)         0.06 (0.14)         0.05 (0.16)
Girl                                                   0.46 ** (0.18)      0.36 (0.23)
Target student                                         0.06 (0.15)         −0.09 (0.17)
Constant                           0.48 (0.40)         0.21 (0.38)         −1.13 (1.14)
N                                  940                 940                 890
Number of clusters (schools)       57                  57                  44
Institution-fixed effects
Notes: Pooled sample from trials 1 and 2. Standard errors clustered at institution level and reported in parentheses. Significance levels: * p < 0.10, ** p < 0.05, *** p < 0.01. Missing dummy included for students with no data on attended institution.
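Table 8 re-estimates the specification as a logistic regression. As a rough illustration under the same hypothetical names, the sketch below fits a logit with institution-clustered standard errors and prints average marginal effects via get_margeff(); note that the table reports conditional marginal effects, a related but distinct quantity, so this shows the general approach rather than reproducing the table.

```python
# Sketch of the logistic-regression robustness check in Table 8.
# Names are hypothetical; get_margeff() returns *average* marginal
# effects, whereas the table reports conditional ones.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pooled_trials.csv")  # hypothetical pooled trial data
df = df.dropna(subset=["applied_he", "school_id"])  # complete cases only

logit = smf.logit(
    "applied_he ~ intervention + girl + target_student", data=df
).fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
print(logit.get_margeff().summary())
```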
Table 9. Robustness check for main trial result: subgroup analysis disaggregated by target/non-target students and by girls/boys; pooled sample.

HE Application                     Target Student      Non-Target Student      Girls               Boys
                                   (1)                 (2)                     (3)                 (4)
                                   Coefficient (se)    Coefficient (se)        Coefficient (se)    Coefficient (se)
Intervention                       −0.07 (0.06)        0.06 (0.05)             0.03 (0.04)         −0.01 (0.05)
Constant                           0.11 (0.35)         0.46 * (0.25)           0.51 * (0.29)       −0.37 (0.30)
N                                  307                 633                     549                 391
Number of clusters (schools)       50                  49                      54                  50
Covariates
Trial dummy
Institution-fixed effects
Notes: Pooled sample from trials 1 and 2. Sample disaggregated by Target/Non-Target Students; Girls/Boys. Standard errors clustered at institution level and reported in parentheses. Significance levels: * p < 0.10, ** p < 0.05, *** p < 0.01. Missing dummy included for students with no data on attended institution.
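Finally, the subgroup results in Table 9 amount to re-estimating the treatment-effect model within each subsample. A stripped-down sketch, omitting the covariates and trial dummy listed in the table and using the same hypothetical names, is:

```python
# Sketch of the subgroup analysis in Table 9: the treatment effect
# re-estimated separately for each subsample. Covariates and the trial
# dummy are omitted for brevity; names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pooled_trials.csv")  # hypothetical pooled trial data
df = df.dropna(subset=["applied_he", "school_id"])  # complete cases only

subgroups = {
    "Target student": df["target_student"] == 1,
    "Non-target student": df["target_student"] == 0,
    "Girls": df["girl"] == 1,
    "Boys": df["girl"] == 0,
}
for label, mask in subgroups.items():
    sub = df.loc[mask]
    fit = smf.ols("applied_he ~ intervention", data=sub).fit(
        cov_type="cluster", cov_kwds={"groups": sub["school_id"]}
    )
    print(f"{label}: {fit.params['intervention']:.2f} "
          f"(se {fit.bse['intervention']:.2f})")
```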
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
